1440p OLED 240+ refresh rate seems like ideal for all types of games until we all have 4090 or better hardware in 5 years. But yea if you don’t play competitive shooters then I could see 4k being ideal.
Even the 4090 can't run Fortnite at 4K up to that fps, even on low settings. In ideal circumstances it runs 140fps, on low settings. In other words, the guy in this video is talking out of his ass, because we're still a couple of generations away from affordable and REALLY functional 4K.
@@thames21 No one buys a 4090 to play Fortnite lol. That's a GPU you buy if you want to play games like Cyberpunk 2077 with Raytracing turned on. You can still get 100fps + in Fortnite at 4k max settings with ray tracing turned on with a 4090, I literally just watched someone do it. A 4090 also has DLSS 3.5 Frame Generation. Reference : "Fortnite Season 4 - RTX 4090 Ultra Settings (4K + Ray Tracing)" uploaded to UA-cam by RTX GamePlays.
I only play competitive/esports games. Gpus are now powerful enough and cheap enough to push 1440p240Hz in many of those games. 1440p240Hz monitors have also come down enough in price, while still maintaining low total input lag. That's why I'm going with 1440p
@@maxypad3379 there was an acer one that hit $250 recently. But it's not a great model. Next cheapest is the hp omen 27qs which has a super sale price of $300. It was at $300 for like 2 weeks back in July. And it hit that price again a few days ago for early Black Friday. But it sold out at that price and is back way up at $430
It's not just the monitor. For example, I have a desk setup with my GPU, USB-C hub, cables, KVMs, etc. 1440p monitors work just fine with USB 3.2 hubs and HDMI 2.0. In order to have a fully functional 4K setup, you need a 4K monitor, which costs twice as much as a 1440p monitor, a USB4 or Thunderbolt dock, which is almost double the price of a USB 3.2 hub, and an HDMI 2.1 or DisplayPort interface, which also reduces your options. Also, bigger monitors require bigger desks, stronger monitor arms and more expensive GPUs. DLSS and FSR have taken big leaps, but you can't yet DLSS your way up to 4K. Considering all of this, jumping from 1080p to 1440p costs like 55% more, while jumping from 1440p to 4K costs almost 200% more.
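To put rough numbers on the cable point above, here's a back-of-the-envelope sketch (my own arithmetic, not from the video) of uncompressed video bandwidth at 8-bit color, ignoring blanking overhead. It shows why 1440p high refresh fits HDMI 2.0-class links while 4K at high refresh generally needs HDMI 2.1, or DisplayPort with DSC.

```python
# Rough uncompressed video bandwidth (8-bit RGB = 24 bits/pixel; blanking
# intervals ignored, and 10-bit HDR adds roughly 25% on top of these figures).
def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "1440p @ 144Hz": (2560, 1440, 144),
    "4K @ 60Hz":     (3840, 2160, 60),
    "4K @ 144Hz":    (3840, 2160, 144),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{bandwidth_gbps(w, h, hz):.1f} Gbps")

# 1440p @ 144Hz: ~12.7 Gbps -> fits HDMI 2.0 (about 14.4 Gbps of usable payload)
# 4K @ 60Hz:     ~11.9 Gbps -> also fine over HDMI 2.0
# 4K @ 144Hz:    ~28.7 Gbps -> beyond HDMI 2.0, hence HDMI 2.1 or DP 1.4 with DSC
```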
1440p IS the sweet spot for the vast majority of people. We get it, you tried 4k and can't go back. For the rest of us who can't afford 4k displays, or even run it to begin with, we don't care.
@@evaone4286 Yeah sure, the display is more affordable now than it has been in the past, but you still need a top of the line PC to get the fps needed on newer AAA titles.
4K is overkill for PC monitors, which are usually 24-27 inches (more recently 32). 1440p on a screen that size makes more sense because most people literally couldn't tell the difference, and it nets you more fps.
You need 4k at 32" or bigger. Had a 35" UltraWide with the UltraWide version of 1440p and it was awful. Ended up doing 32" 4K with 144Hz. Having too few pixels on a large display makes it look bad.
I've used a 32" 4K LG monitor since 2018, and it's not all that great. The picture is good, but sitting 2ft away from a screen that big kinda sucks imo, and I had to go up on the scaling to even read stuff on the screen because 100% is just too tiny at 32".
I disagree. 1440p at 240Hz is way better than 4K at 144 or whatever. I have a 4090 and I'd much rather have something with an actual decent frame rate. Trust me, if there was 4K at 360Hz or something, your boy would be buying it
@@BourbonBiscuit. It seems to me that too many players consider themselves "competitive". Just because they play Valorant or CSGO. Even if they don't make any money out of it.
Even though I am not a "professional gamer" I got a 1440p 240Hz display for the sole reason that it makes games noticeably more fluid and enjoyable, especially very fast paced games where the image changes a lot. And yes, it's a very noticeable upgrade from 144Hz, not as much as from 60Hz but still very nice. @@BourbonBiscuit.
There is no GPU that can run 4K at 360 fps minimum. Not even 5090 will be able to do that and not even 6090 will be able to do that. 7090 maybe but that's probably 2028/2029, same time PS6 will come out. Also, the visual benefits of 4K are so much greater than 1440p that not even higher refresh rate of 1440p is able to level up the playing field. The difference in 144hz vs 240hz is pretty small but the difference with 1440p vs 4K is huge.
That's only if you use a monitor bigger than 27 though. Because (I used to be a computer monitor salesman for years), you really can't see the pixels in a 1440p vs 4K 27 inch unless you're pixel peeping, and pausing still images on your system. So, if you do get one, make sure it's 32 inches at the minimum.
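For anyone curious about the pixel-density claim in this thread, the standard formula is PPI = diagonal resolution in pixels divided by diagonal size in inches. A quick sketch (plain geometry, nothing monitor-specific assumed):

```python
import math

# PPI = diagonal resolution in pixels divided by diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h, d) in {
    '27" 1440p': (2560, 1440, 27),
    '27" 4K':    (3840, 2160, 27),
    '32" 4K':    (3840, 2160, 32),
}.items():
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")

# 27" 1440p: ~109 PPI
# 27" 4K:    ~163 PPI
# 32" 4K:    ~138 PPI
```

Whether roughly 109 PPI vs 163 PPI is actually visible depends on viewing distance and eyesight, which is really what the disagreement above comes down to.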
I would honestly need to do a test drive of a 1440p and 4K monitor I plan on buying to see, with my hardware, what each game I enjoy playing can look like with different settings/upscaling, and what fps I can get with them. With it being SO dependent on each person's rig, preferred games, desired bias of fidelity/framerate, etc., it's IMPOSSIBLE to suggest one is superior to the other at this development stage. I've been a 32" 1440p user for close to 4 years now, and the pixel density of 1440p at even 32" is enough to make games look crispy, albeit less than 4K (duh.) Basically, you can't miss what you don't have, and I've enjoyed gaming at 1440p so far. HOWEVER, with this 32" monitor in question just recently kicking the can, I'm in the market for something, and decided if I stick with 1440p, I'm absolutely going to a 27" to recover some clarity from increased pixel density over a 32", or going to 4K. So, to that end, it seems I agree with this video: I do desire more clarity. But this conversation cannot be had without bringing up panel technology. 4K is still *mostly* restricted to traditional IPS and VA, while 1440p panels are readily available in far superior rapid/fast IPS and OLED panels. What good are all the pixels of 4K if the color, contrast, and ghosting detract from the experience? Traditional IPS at 4K would be the best option, and probably a great experience for those choosing to go to 4K. But how many people might take the recommendation of this video, make the jump to 4K by getting a VA panel, and then be disappointed by not only their reduced framerate, but color banding and ghosting as well? This all goes back to what kind of things you're looking for out of a monitor and your rig, and even specifically what kind of games you play. It's no surprise that if you're an FPS sweat, you'll likely stick with 1440p (or even 1080p if you're really sweaty) monitors to take advantage of fast refresh rates and lightning fast pixels. Whereas, if you're mostly a single player game enjoyer, or Excel enthusiast, 4K could be the route for you.
Basically me 5 years ago. I upscaled everything just to see, and every program looks bad due to upscaling, from websites to basic applications, with overlapping or oversized pixels that make the image look blurry. Games ran at best 60 fps even with an $800 GPU, and dipped down to 50 fps with the same GPU... Sold it at a loss and don't regret any of it.
Same. I have a Predator monitor that requires TWO DisplayPort cables to work at 4K @ 144Hz, but then it only works in that exact mode. So if I want to play at a lower resolution to get more FPS, I have to disable the "4K 144Hz" mode with the buttons on the monitor itself. Unless you buy the newest high end GPU every generation, you'll not be able to max out the graphics and play at 4K with high frames anyway. It's seriously the worst investment I've ever made.
I am on a very good 1440p IPS monitor atm. My next monitor will be 2160p, but I want it to be an upgrade in all aspects. As it is now there are always downgrades to some aspects in new monitors. Sure I might get 2160p OLED with awesome HDR and 240Hz, but then I have to deal with low brightness, wonky sub-pixel layouts, burn-in and a host of other things that are not currently an issue with my monitor. Or I could get a 2160p IPS with pseudo HDR, but then I go down to a lower refresh rate of 120-144Hz. Not to mention that they are still using DP1.4 and running DSC at maximum compression to be able to handle the bandwidth. It is hard to justify spending $1000 on a monitor just for a bunch of extra pixels with other downsides.
With you on that. I just upgraded my computer with a 4090, new CPU, all of that. I really want a 4K monitor, but I also really want an OLED panel; I can't go back after getting used to the quality of the blacks and overall colors. There's nothing 4K and 32 inches that fits the bill. I can get 27 inch, but the pixel density of 4K at 27 inches is so high that it seems a bit overkill. Other stuff is all mini LED or IPS but with sacrifices in one area or another. I ended up getting the ultrawide Alienware DWF, which runs 1440 in ultrawide, and I've been really loving it thus far.
This guy told us not to buy 1080p / 1440p / ultrawide monitors / QD-OLED monitors. In conclusion, this channel is pretty useless unless you want to hear rambling over every single monitor out there. He even counter-argued himself and decided to sell off his monitor and replace it with a TV lmao..
One of the dumbest videos I've ever heard. Imagine saying "It looks blurry to me regardless of screen size". It's not about the resolution, it's about the PPI, thus screen size matters.
You are partially correct, 4K immersion and visual quality is unmatched. However, due to the massive 4K tax, 1440/240hz makes the most sense right now and for the foreseeable future. When u go 4K you need the best gpu every generation to keep up the latest AAA releases. The 5090 will cost $2K, that 4K mountain will only continue to get more impractical for the next gen.
Even a 4090 is struggling to push a 240Hz minimum in 1440p games. If I had to choose, I would rather choose 4K 144Hz than 1440p 240Hz. Huge increase in visual quality and only a minuscule downgrade in refresh rate. 4K 240Hz is a story on its own - it's basically 2 monitors in 1. You have the high refresh rate that you can use with a lower resolution for competitive FPS titles, and when you wanna enjoy and watch movies or play AAA SP games, you have 4K for that. But pushing a minimum of 240fps at 4K is impossible. Even the 5090 won't be able to do that. Maybe the 6090..
@@evaone4286 1440p240 is even more expensive :) Because not only do you need the best GPU to get as close as possible to 240 fps, you also need the best CPU to avoid being CPU-bottlenecked. For 4K you don't have to buy the best CPU because you will be GPU-limited most of the time.
Don't... 1440p is still the sweet spot. Hell, 1080p for most, I'm willing to bet. If you can't shell out for a 4080 or more, keep it at whatever you need. 1440p will be here for a while and it'll arguably be sufficient for a long time.
I have both 4K and 1440p and I couldn't notice any difference at all, or in better words, it's not worth it if all you do is gaming. 4K is only worth it if you are a content creator; other than that I will go for 1440p high refresh rate. Rn I'm using a G9 5120x1440 as my main monitor paired with an RX 7900 XTX. But again, nothing right and nothing wrong: if you have the money go for the 4K, but make sure you pair it with a high-end GPU like an RX 7900 XTX or RTX 4090, or at least an RX 7900 XT or RTX 4080; other than that there's no point in using 4K with lower/mid-end GPUs.
Since motion blur is incredibly expensive to fake in a game engine, people prioritize hertz first (every title varies): hertz > native resolution > image quality. Balancing this equation perfectly is expensive because technology evolves. So there is no bad resolution.
I do not regret upgrading from a 1440p 165hz IPS to a 1440p 240hz OLED However when the 50 series comes out I will be looking into upgrading to 4k 32in 240hz OLED / mini LED for the higher PPI
Watching this video hurts. 4K is stupid for gaming. If you want smooth edges, just turn on MSAA. Stay on 1080p, max 1440p, kids, both your wallet and GPU will thank you. Myself, I'm on 1440p and sometimes I tell myself how I should have stayed on 1080p.
The upscalers that you're talking about use a lot more VRAM at 4k res. 1440p fsr quality mode actually looks and runs better than native 1080p, but at 4k, the upscaler uses about 1GB more of VRAM, which significantly drops the performance
Yea man my RX 580 (which is 7 years old) can DEFINITELY run any game at 4k 75fps with 0 hitches and 0 stuttering and also I DEFINITELY have the money to buy a 4k 144hz monitor.
Yeah, everybody has $4,000-5,000 lying around to spend on a PC to get a 4090 and an OLED 4K, the best of the best, meanwhile the rest of us plebs are just gutter trash.
The biggest issue I have with 4K is ui scaling. Nothing annoys me more about 4k than ui elements being so tiny. I recently returned a very nice OLED 4K monitor simply because certain programs I rely on will not scale and are impossible to read without jamming my nose to the screen. Many things scale just fine which is great, but using a 4K monitor for the past few weeks showed me that overall, we're just not quite there yet. While I do miss the pitch black of OLED, 1080p has never looked so good!
I never felt like it's worth it to leave 1080p. I strongly prefer a smoother experience over looking super crisp. But then again I mainly play online shooters.
In which case you would do better with a high refresh rate 1440p. There is zero advantage to using a 1080p monitor if you have even a half decent gpu nowadays. You can pick up a good 1440p monitor for about $200 in the US.
Would a 27” 1440P OLED look as good as a 32” 4K mini LED? I just returned a defective 49” OLED G9 and considering the newer 59” Neo G9. I have a 4090 7800x3d pc and mostly play racing and flight sims
I'd rather devs move forward with tech like path tracing etc than optimising for huge resolutions. it will be a while before path tracing is even viable at 240p, let alone 1440p but it will be cool when it is.
I just bought a 27” 1440p 144hz monitor for console gaming and I couldn’t have been happier! Absolutely perfect! Now the next generation consoles? I’ll def upgrade then.
Hey bro, what console are you gaming on? PS5 or Xbox? Also, what monitor are you using? I've been trying to research whether I should get a 1440p or 4K monitor for console gaming.
@@raheemafg30Neither. Consoles are meant to be played on a TV. You don't need a monitor for a console. There's a reason why consoles never go above 60fps.
I have a 7900XTX. At 1440p with no upscaling in Cyberpunk I get 115-120fps. At 4K with no upscaling, I get 60. When I turn on upscaling I can boost my FPS but the image quality starts to look really ass. Why would I do that to myself? I was bitten by the 4K bug a couple years ago. Bought an expensive 4k HDR monitor. But even with a $1000 graphics card it still can't give me high frame rates at 4K. So yesterday I bought a nice 1440p monitor. Going back is the best move I ever made for my gaming PC.
Personally I think that 1440p is still good, because if you don't have good internet and a very good GPU, you won't really be able to utilize 4K that much: you won't be able to watch 4K videos or play 4K games with reasonable performance on only mid-end specs. 1440p these days doesn't need super intensive or high end components, and most hardware should be able to run a 1440p video nearly as well as a 1080p one.
4K is great to look at, but the beauty of 1440p is having the best of both worlds of 1080p & 4K, and that's why they call 1440p 165Hz the sweet spot for gaming.
There is a reason why so many people disliked the video. This is all a bunch of BS. Low 4K looking better than higher settings at 1440p? Hard disagree. Can you achieve hundreds of FPS at 4K on a GPU that isn't top of the line? Not really. We chose 1440p because it's the perfect balance of high resolution and performance. Many games I run at 100-144fps at 1440p high on my Radeon 6800 wouldn't even manage the same performance at 4K, even on low settings.
I upgraded from a 32 inch 4K 60Hz to a 32 inch 1440p 165Hz monitor, and I definitely prefer 1440p. My 6800 GPU almost doubles the fps at 1440p compared to 4K. Sharpness is slightly better on 4K, but there's not that much difference when you are moving. If it's not a gaming PC, then I can agree with you.
I am looking at the new Samsung 57" Neo G9 super wide screen. I know it is overkill, but I enjoy flight sims, and the 1000R curve and 57" size make it more immersive for those flight sims that do not support VR.
Sorry but 4k using an upscaler is just not better than 1440p max settings native. Its better to hit that middle ground of image quality and great refresh rates. Especially 1440p QD-OLED, the image quality is nothing to turn your nose up at.
Unfortunately downscaling is nowhere near as good as a higher native resolution and I actually don’t recommend doing it at all. You get less aliased edges, but a softer image.
You know, in laptops Asus has this screen tech where it can switch between 4K 120Hz and 1080p 240Hz on the fly. Why can't we have that in gaming monitors SMH?
Thank you to Ruipro for sponsoring this video!
Buy the Ruipro HDMI 2.1 Certified Fiber Optic Cable: amzn.to/432NDGS
Get access to all my ICC profiles & Discord: patreon.com/TheDisplayGuy
With techniques like DLDSR, or just DSR, it cleans up the image really nicely. However, I have preordered a 4K monitor to get the full impact.
One of the specs that needs to be on the monitors is HDR.
If only they knew what a dumb video they sponsored.
OP try to share his bullshit, ok understood.
What a bad take this video is
Is it just me or is this a really bad take?
Not just you lol
It's an awful take based off the assumptions that everyone can afford the hardware to drive 4k and get lower fps
all this guy posts is bad takes for clickbait lol
This guy always gives bad takes, I am of the opinion he does it on purpose lol
This guy doesn't even realize 1080p is still the most common monitor lmao. It's not just a bad take. This guy is so detached from reality than a mental asylum couldn't land him back in it.
The science is settled, all the experts agree 4k at 12fps is a dream compared to 1440 at 60fps
I mean, if a person runs 12 fps at 4K they won't run much better at 1440p, because even a 4070, or even a 7800 XT, can run better than that.
4K is more than twice as demanding because it has more than twice the pixels of 1440p. @@XeqtrM1
lol this guy
If you get 60fps at 1440p you should get ~30fps at 4K; if the number is much lower, that means your GPU memory left the chat.
@@4ikibrikivdamk3 hmmm, maybe he just made up some numbers to prove the point, no?
The reason he made this video is so all his opponents run 7 fps
so smart
Even RX 480 can do better than that, so no
😂
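The numbers being thrown around in this thread come down to pixel counts. A quick sketch of the arithmetic (the naive assumption here is that fps scales inversely with pixel count, which real games only roughly follow):

```python
# 4K vs 1440p pixel counts and the naive inverse-scaling fps estimate.
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400

ratio = pixels_4k / pixels_1440p   # 2.25x the pixels
print(f"4K has {ratio:.2f}x the pixels of 1440p")
print(f"Naive estimate: 60 fps at 1440p -> ~{60 / ratio:.0f} fps at 4K")

# 4K has 2.25x the pixels of 1440p
# Naive estimate: 60 fps at 1440p -> ~27 fps at 4K
```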
"4k lower settings looks better than 1440p max settings in most games"
you have worms in your brain dawg
He is right. For example, Alan Wake 2. It looks bad in native 2K with all settings maxed. But when I use DLDSR 4K it becomes SHARP and clean. And the same with blurry Control, Quantum Break, Forbidden West, Horizon Zero Dawn...
As someone who has a 1080p 240Hz, a 2K ultrawide at 240Hz, a 4K 144Hz monitor and a 4K 70" 120Hz gaming TV, he's absolutely right. Unless you're speaking from experience you don't know, and it's impossible to just assume.
@@mrf1213 Maybe he is right. But his arguments don't make sense on a price level. Getting a good 1440p monitor at 240Hz or 165Hz below $350 is very possible. 4K monitors are not only so much more expensive, they're not worth it if you lean more toward fps gaming. I've seen what it looks like on a 4K monitor, and it LOOKS great! But for the price, I would choose a 240Hz+ 1080p monitor or a 1440p 170Hz like I currently have, all day. Under $300 new and very satisfied.
@@mrf1213 but what if you care more about actually playing the game than jerking yourself to the visuals? lmao
@@dessso4463
The reason many of these games look blurry in 2K or 1080p is the TAA image smoothing and blurry filters that look like real shit; in some of these titles, like Alan Wake 2, you can greatly improve the image quality by modifying the game's internal files. I think in the end this is pure marketing to force people to 4K and spend more money. A 2K image without filters, rescaling or shitty anti-aliasing techniques is extremely sharp, and even 1080p is. We are living in a time that leaves people powerless to see this bigger picture.
"I even ran 4K games reasonably well on a mobile RTX 4050 with upscaling and reduced settings" That part made me think, we are done here that's just dumb...
I was coming down to time stamp this exact quote lmao. To each their own I suppose. My 175hz 1440p oled ultrawide is wayyyy better than any 4k monitor because I can actually run 175hz
@@fuzzywinkle8310 Going to be picking up my Samsung G8 in a couple of days. Got it open box with a 4yr warranty for only $715. If there was a 4K version of the monitor, I think even a 4090 would struggle more with it than a 4070 Ti Super does with the G8.
Exactly who tf has a mobile 4050 that shit sounds slow as hell
Yeah...a 6gb 4k laptop runs 4k reasonably well? ok...maybe low settings, frame gen and DLSS ultra performance upscaling 4k from 720p lol. Display companies need to make higher ppi 2k monitors though.
Was he running Minecraft ? Wtf kind of advice is this .
As someone that uses both 4k, and QHD I agree to an extent but the main reason gamers want QHD is due to being able to achieve higher FPS than 4k while still looking at least better than FHD.
This 100%. I have a 4K monitor and used to have just QHD and the graphical upgrade is nice. But even on a 7900xtx running games at 4K 144 (my refresh rate) can be challenging; especially with how horribly optimized modern games can be. If the difference is even just 4K @ 100 vs 1440p @ 144, there are games where I will downscale to reach that fps. 4K monitors are great for production and movies/UA-cam and stuff, but the motion clarity and better overall experience available at higher FPS to me beats out the benefits of a slightly better image.
I love 4K but ngl the video sounds kind of elitist. You can get the gigabyte 1440p 240hz from $400-450 and that’s generally the highest refresh rate even competitive gamers will go. Starter, non HDR, 120hz 4K monitors usually start at like $500. It’s way more expensive and harder to drive.
Same here. But the main thing that makes 1440p more like 4k is getting a good quantum dot 10bit ips display
@@Chuckychargeblade 4K monitors at 500$ are crap displays made on tech from 10 years ago.
You need to spend at least 7-800$ on a good 4K display. And some good ones are at 1000$ or more.
Most gpus can play at 4k but who wants to play at low settings just for a sharper image
@@shazzi1626 people with 4090's are playing at 4k, well over 100 fps and and at ultra settings. You're just poor af bro.
this guy really has the most shit takes on monitors
This video has been sponsored by 4K lobbyists
😆
No thanks. I'm good at 1440p.
🤝
Yes
4K video content will look better on a 1080p monitor than on a 1440p monitor, due to simpler scaling factor.
With a 1080p monitor, each pixel of the 4K content can be represented by a block of exactly 4 pixels, while with a 1440p it doesnt scale down to a whole number.
Same
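The scaling-factor claim above is simple division; whether the result actually looks better on a 1080p panel is subjective, but this is the arithmetic behind it (a small sketch, my own numbers):

```python
# Downscaling 2160-line (4K) video onto different panel heights.
# A whole-number ratio means every panel pixel averages an exact block of
# source pixels; a fractional ratio forces resampling across pixel boundaries.
source_h = 2160
for panel_h in (1080, 1440):
    ratio = source_h / panel_h
    kind = "integer (clean 2x2 averaging)" if ratio.is_integer() else "fractional (resampled)"
    print(f"2160p -> {panel_h}p: {ratio:.2f} source lines per panel line, {kind}")

# 2160p -> 1080p: 2.00 source lines per panel line, integer (clean 2x2 averaging)
# 2160p -> 1440p: 1.50 source lines per panel line, fractional (resampled)
```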
1440p 144Hz and a 1000R curved monitor is the sweet spot for me. Odyssey G5 forever!!
Sure bro, I will spend 700+ dollars on some 4K monitor just to play at upscaled 1080p because performance is sht
I wish theyd cost 700 haha. Sadly they cost around 1.3k 🥲😫
@@imlegend8108 you from Venezuela tf? 4k 144hz IPS cost 400€/$
I mean hard disagree with 4K at low settings looking better than 1440p at high settings, especially when you factor in things like raytracing. Meaning a 1440p game with good raytracing and high settings will look worlds better than a game rendered at 4K with low settings.
True. 1440p still rules.
big facts.
He he’s just wrong high 1440p will look better
i would choose low setting higher resolution every day of the week, especially for multiplayer games, tbh i dont even care to play story games at 1080p.
Yeah I've played on 4k its nice but I much prefer high settings over low settings, I can ignore the faults of a lower resolution pretty easily and like to focus on the game's visuals.
This guy thinks all gamers are rich. Bro you need to be in touch again with reality
bro talked for 5 minutes and said nothing
Thanks for saving me 5 minutes
1440p at 240hz is a whole 400$. 4k at anything close to that is 800+. Monitor prices are so damn high
Why would you need 240Hz? What games you can play at 240+ FPS that would benefit this type of a monitor?
@@smthsmth stuff like overwatch and fortnite. I don’t game too often anymore but I definitely notice a difference when A/B testing. I have the new alienware 4k oled and it’s amazing at 240 fps. Hard to hit on much rn but even moving my cursor I notice
@@Submersed24 how much did your monitor cost ?
Monitor prices have gotten really good, you're just whining that you can't get a Lambo for the price of a Corolla.
And how much was the video card that can push out that many pixels?
Simple: Price
On a 1440p monitor you don't have to spend much to get a 1440p 60fps experience with high settings.
This means your GPU and CPU last far longer before you're looking at an upgrade.
The key is to game at 1080p on a 4K monitor while using the 4K resolution for work or movies. That way you can run your 4K monitor on a weaker PC while still enjoying 4K when not gaming. 1080p is still fine but I wouldn't buy a 1080p monitor because it sucks for everything other than gaming.
@@ucrjedi No, that is not the key. A 1080p window on a 4K display is tiny. And Running a game at a non native resolution when stretched out to full screen is also not ideal.
1440p remains the sweet spot alternative to 4K. The running cost of 4K is still too high for gaming.
@@RicochetForce 4k isn't that expensive, if the current gen consoles can do it, so can a mid range PC. Yes, I know the consoles use FSR to upscale a lot of games to 4k, but that same FSR tech is available on PC as well, so there's not really an excuse for not getting a 4k monitor. If your PC is good enough to run a game at 1440p then you can run it at 4k using FSR, just like a PS5 does.
@@acurisur
#1 4K is actually quite a bit more expensive across the board. The monitors carry a premium price over 1080p and 1440p monitors. The graphics cards needed to run the same graphics settings at 60fps someone was used to at 1080p or 1440p are also much more expensive.
#2 FSR is inferior to DLSS across the board, and even DLSS is inferior to native resolution in terms of image quality. in general you do not want to be feeding fixed pixel displays resolutions that aren't their native resolutions.
The consoles having to run at 1440p and upscale to 4K is a knock on 4K being a realistic, affordable option. People should save the money that would've been spent on a 4K monitor and the graphics card needed and instead invest in a higher tier card for high refresh gaming at 1440p.
@RicochetForce Agree to disagree. 1440p looks bad compared to 4k. Also I never said FSR was better than DLSS, I was talking about how the PS5 and Xbox Series X can do 4k, they don't have access to DLSS as they're both using AMD hardware.
I use DLSS on my PC and it's better but FSR is still good in most games, specifically FSR 2.1.
Completely disagree on feeding fixed pixel displays resolutions that aren't their native resolutions, as DLSS often produces a better image than native.
DLSS isn't the same as FSR as it's not upscaling the image, it completely reconstructs it using tensor core a.i., producing an image that's often superior to TAA. Many tech channels like Digital Foundry have done comparisons of native 4k vs DLSS and often DLSS is superior.
My guy did you really just complain about blurry 1440p only to pitch upscaling to switch to 4k...?
My exact thoughts here. Cognitive dissonance is strong with this one.
It makes sense! Many games look blurry in native 2K, but look sharp and clean with DLDSR 4K. Just try Alan Wake 2 and you will be amazed how it changes all the graphics.
@@dessso4463 that may be because you sit 2 cm from your monitor, but normal people won't notice it
@@GioKuster-s8m no, it's easy to notice, it's because of TAA
@@macho7409 for the average person its not
i thought he was gonna say stick with 1080p or something this is way worse than i thought
The affiliate link to a 120$ HDMI cable should be a tip off that this guy is not on the level.
1440p is honestly still great when you consider integer scaling with retro games and emulation. I.E: 240p or 480i/p retro games with clean upscaling and some scanline shaders.
At 4K, you’re going to have some portion of 480p content cut off, while at 1440p, it’s going to be a clean integer scale.
you sound well beyond this dingus guy who made the video lol
This is a good point. My 720p video files still look best on my 4:3 CRT monitor, for color and aspect ratio, and the ability for it to have any native resolution i choose
Yeah but 4k scales better with 1080p, which might make it worth it for console owners
@@minty_x The PS5 and Xbox Series consoles support 1440p. Unless you’re explicitly talking about the Switch or older game consoles, I don’t really see the point nowadays.
Yeah older game consoles like the ps4. All ps4 (non pro) games on the ps5 run at 1080p, so 4k/1080 is still better so it scales properly. @@KingKrouch
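Worked numbers for the integer-scaling points in this thread (a quick sketch; whether you get borders or cropping depends on how the emulator or scaler handles the leftover lines):

```python
# Largest whole-number scale factor that fits a retro vertical resolution on a
# panel, plus how many lines are left over as borders.
def integer_fit(source_h, panel_h):
    factor = panel_h // source_h            # biggest integer scale that fits
    leftover = panel_h - factor * source_h  # unused lines (black borders)
    return factor, leftover

for source_h in (240, 480, 1080):
    for panel_h in (1440, 2160):
        factor, leftover = integer_fit(source_h, panel_h)
        note = "exact fit" if leftover == 0 else f"{leftover} lines of border (or crop if pushed one step higher)"
        print(f"{source_h}p on {panel_h}p: {factor}x, {note}")

# 240p  on 1440p: 6x, exact fit          240p  on 2160p: 9x, exact fit
# 480p  on 1440p: 3x, exact fit          480p  on 2160p: 4x, 240 lines of border
# 1080p on 1440p: 1x, 360 lines of border 1080p on 2160p: 2x, exact fit
```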
3 words: refresh rate, money
This video isn't relevant unless you have a 4080/4090. I have a 3090 Ti, and while it's a 4K-targeted card, higher frame rates at 1440p just look and feel cleaner. Any forum will recommend 1440p 240Hz at this moment in time for 90% of PC gamers.
Even with a 4090 4K isn't worth it.
Yeah, I agree. Even a top end PC with a 7800X3D and a 4090 will become a 1440p mid range machine a lot quicker with these new photorealistic games coming out in the not too distant future.
I 'downgraded' from a Dell G3223Q to an Aorus FO27Q3. I love the better motion clarity, and the picture is better overall because it's an OLED screen.
The bottom line...if you don't have a fairly new GPU, you can't run 4K, period. I am in that category. I continue with 1440.
I have had two different 4K screens and this video is rage bait. I went back to 1440p for gaming mostly because some games scale poorly unless your 4K screen is 42”, which limits your ability to have multiple monitors without harming your neck.
4K video content will look better on a 1080p monitor than on a 1440p monitor, due to simpler scaling factor.
With a 1080p monitor, each pixel of the 4K content can be represented by a block of exactly 4 pixels, while with a 1440p it doesnt scale down to a whole number.
@@ts8960 I don't watch much content on my PC, I use it for productivity and games. 4K TVs are relatively cheap now for watching content.
This guy sounds like the person who counts every thread of his bed sheets. As someone who owns a 4K OLED monitor, I'm still playing at 1080p without super resolution or DLSS and it honestly looks great. (I do use integer scaling, but all that really does is ensure an exact pixel mapping for a sharper image.)
He was so bewilderingly close to making a real point about integer scaling when he compared those resolutions in terms of "2.25x vs 4x." But it turned out he just had a personal distaste for the number 2.25 compared to 4. Good stuff.
I wish I lived in a country where this video made sense, but I'd have to drop a full 2 months salary for a good 4k monitor, and that's not even considering that I'd still need a high end gpu to run 4k games with a good framerate (a 4080 costs like 6 times the minimum wage here in Brazil). I get where you're coming from but 4K gaming just isn't really an option for the average worker outside USA and similar countries, so to have some type of upgrade over 1080p we'll settle for 1440p.
Nah it’s not even a US thing, he just lives in a dream land. Not everyone got the trust fund money.
I second this hitman's comment. I make a pretty average wage, but with each component costing over 1k USD to get the optimal experience, and the fact that most titles aren't even really optimized for 4K and ray tracing has mostly been an afterthought, the value really isn't there yet. For 1/3 the price you can get a pretty damn nice 2K. And don't get me started on 8K: most people wouldn't be able to tell the difference between 8K and 4K, and any subsequent improvement would be even less noticeable. And as far as refresh rates go, 240 is like a professional jet fighter pilot's limit, so never listen to anybody trying to tell you that you need that high of a refresh rate unless you're a paid FPS player.
I mean inside USA its no different if not worse lmao
Typical Brazilian thinking every American is rich and Gaming on fully decked out setup 😂
You guys all wrong, I just love my 1080p on a 15.6 inch screen at 144 hz.
Nothing wrong with that. But we're not wrong. We're all right.
I don't think you should be giving advice on electronics, let alone anything else...
what a horrible take
I prefer 1440p at 120fps over 4K at 60fps. I can't get used to low refresh rates anymore.
Unless you're limited by outdated 8GB VRAM and a super slow GPU bus, 4K with DLSS Quality has about the same fps as 1440p native. So, if you have 120 fps at 1440p, you will have 120 fps at 4K with DLSS Quality on that GPU. And 4K with DLSS Performance has the same fps as 1080p native, and 4K with DLSS Performance looks 20 times better than 1080p native, and still much better than 1440p native on a 1440p monitor.
8gb vram is outdated? Then what the hell is 2gb? God help me get out of this cave
@@AAAAA-re6qh Even 12GB of VRAM is outdated, 16GB is really needed in 2024.
@@kotboyarkin5032 Ok, but have you seen 4K 144Hz prices? They're double the price of 1440p. For the same money you can only get 4K 60Hz, and 60Hz is pain, and that's what OP means! While even a 2080 Ti can handle optimized settings with DLSS at around 60fps at 4K, 60 fps on a 60Hz monitor feels like crap after 60 on 165Hz, and we're at least a few years away from affordable gaming 4K monitors.
Same
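On the DLSS point a few comments up: the reason "4K DLSS Quality performs like native 1440p" is the internal render resolution. A small sketch using the commonly cited DLSS 2 preset scale factors (roughly 0.67 / 0.58 / 0.50 / 0.33 per axis; individual games can override these, so treat the exact values as an assumption):

```python
# Internal render resolution for a 4K output under typical DLSS preset scale
# factors (per axis). Commonly cited DLSS 2 defaults; games may override them.
PRESETS = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160
for name, scale in PRESETS.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"4K DLSS {name}: renders internally at {w}x{h}")

# 4K DLSS Quality:           2560x1440 (same pixel load as native 1440p)
# 4K DLSS Balanced:          2227x1253
# 4K DLSS Performance:       1920x1080 (same pixel load as native 1080p)
# 4K DLSS Ultra Performance: 1280x720
```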
I'm actually downgrading from a 4K 60Hz 42 inch to a 1440p 165Hz 27 inch monitor arriving tmrw, R.I.P.
I'll buy a 4K 32 inch monitor in 5 years, or if I see a deal I can't ignore.
I think thats a very good choice
I downgraded back from 4K to 1080p xD My eyes are thanking me every day. They got strained by the 4K monitor, and my computer also didn't like it that much, the fans would blow really loud. Sure, 1080p isn't as sharp, but it's way more relaxing on my eyes (both are 27 inch monitors), the PC isn't struggling anymore, and my energy bill is a lot lower as well.
Oh, and I can run Cyberpunk in all its shiny glory again with ray tracing on; at 4K I had to pick a much lower resolution and turn all the shinies off or it would stutter like crazy.
I'm gonna assume the 4k is a TV.
I have an RTX 4090 and an AMD Ryzen 9 7950X3D. I literally can't choose the monitor: should it be 2K or 4K? I play CS2, and also RDR2 and Bannerlord. What is the best choice? Can you guys help me objectively?
@@cemdagl4522 2k (1440p) monitor
1080p is still fine for my tastes. Cheap, and probably fine for the vast majority of people, not to mention it massively extends the useful life of hardware. I would've likely had to upgrade from a 1070 by now otherwise.
As someone who still plays at 1080p but with a 4090, I gotta say, 1080p is sht. You notice that when you render games at 1620p via DLDSR and games look a hell of a lot better even on a 1080p monitor. 1080p is sooo blurry. I have no idea why games rendered natively at 1080p are blurry. But, anyway, 1080p is bad. Try DLDSR or buying a 1440p monitor and you'll see what you're missing.
@@CeceliPS3 My GPU never supported DLDSR, so I never got to experience that myself. Maybe if I do get a GPU upgrade in the future, it might be worth it? For now I'm sticking with 1080p because my GPU is currently barely adequate for that.
Of course. With a 1070 you wouldn't be able to render at higher resolutions even if they made DLDSR available for it. I was just sharing my experience with you so you know what lies ahead. I'm not even telling you to go for 4K or 1440p. I'm just saying 1080p is not all that once you experience higher resolutions, with the caveat that it may not even be the higher resolution per se (because of DLDSR on a 1080p monitor), but the fact that games look a hell of a lot better when rendered at higher resolutions. 1080p native is sooo bad once you see what's out there. But until you experience it, your 1080p experience will be ok. After seeing the difference, you'll never want to go back. @@NovemberJoy
@@CeceliPS3 my brother in christ you have a 2000 Dollar GPU. Do yourself a favour and get a better monitor.
haha don't worry. It's in the works. @@Smorfar
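For reference on the "1620p via DLDSR" figure in this thread: Nvidia lists DLDSR factors of 1.78x and 2.25x total pixels, which work out to 4/3 and 3/2 per axis (a small sketch of that arithmetic; the factor labels are as I understand Nvidia's control panel, not something stated in the video):

```python
from fractions import Fraction

# DLDSR renders above native resolution and downsamples back to the panel.
# The exposed factors, 1.78x and 2.25x total pixels, are 4/3 and 3/2 per axis.
native_w, native_h = 1920, 1080
for label, per_axis in (("1.78x", Fraction(4, 3)), ("2.25x", Fraction(3, 2))):
    w, h = int(native_w * per_axis), int(native_h * per_axis)
    print(f"DLDSR {label} on a 1080p panel: renders at {w}x{h}")

# DLDSR 1.78x on a 1080p panel: renders at 2560x1440
# DLDSR 2.25x on a 1080p panel: renders at 2880x1620  <- the "1620p" mentioned above
```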
Y'all, it's not that hard, just get a 4K 500Hz monitor and two 4090s, it's not that hard
Lmaoo
😏😂
I love seeing the dislikes using an extension. You got wrecked on this video, lol, 4.5K dislikes. No multiplayer game runs well upscaled when it comes to FPS. 1440p saves you lots of headroom for competitive FPS while looking as clean as 4K. Derp.
Most of us don't have a couple grand lying around for a top of the line video card to get a 4K monitor
He's advertising to push his 4K monitor stocks
You are misleading viewers. Resolution sweet spots depend on screen size. No need to go higher than 1440p if you have a 27" monitor, no need to go higher than 1080p with a 24". Only if you plan on a really big monitor should 4K be an option.
And please stop the 8K garbage
This is not true.
That channel guy told you 100% like it is, and I fully AGREE all the way. You need to start having RESPECT for other people's monitor entertainment preferences. So stop forcing your opinion narratives onto these channel viewers, because that's disrespectful, not helpful. Everybody doesn't care about 4K just because you do. So get over it.
4K on low settings looking better than 1440p on high is certainly one of the takes of all time. I definitely think 4K and 1440p each have their place; it depends on what you prioritize. The 27" OLEDs at native 1440p may not be inexpensive monitors, but they really are damn gorgeous while allowing nice native framerates. I don't denigrate 4K at all, my LG G1 is a gorgeous TV, and I completely get why some would be into it even if PC gaming is their only use case, but this video did absolutely nothing to put a damper on 1440p for me. I've seen a 4090/i9 13th gen in 4K, and it's stunning, but my 4070 Ti in 1440p will do me just fine.
He didn't say that
4K video content will look better on a 1080p monitor than on a 1440p monitor, due to the simpler scaling factor.
On a 1080p monitor, each screen pixel covers an exact 2×2 block of four 4K source pixels, while with 1440p it doesn't scale down to a whole number.
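(For anyone who wants to check the scaling claim, here's a quick sketch of the arithmetic — assuming the standard 3840×2160, 2560×1440, and 1920×1080 resolutions:)

```python
# Per-axis factor when scaling 4K (3840x2160) content down to common monitor
# resolutions. An integer factor means every block of source pixels maps
# cleanly onto one screen pixel; a fractional factor forces interpolation
# across pixel boundaries, which softens the image.
source_width = 3840
displays = {"1080p": 1920, "1440p": 2560, "4K": 3840}

for name, width in displays.items():
    factor = source_width / width  # the vertical ratio is identical
    kind = "integer" if factor.is_integer() else "fractional"
    print(f"{name}: scale factor {factor:g} ({kind})")
# Prints: 1080p -> 2 (integer), 1440p -> 1.5 (fractional), 4K -> 1 (integer)
```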
He didn't say 4K on low, he said 4K on lowER settings than max. Likely meaning turning down some of the super demanding graphical effects like ray tracing a bit but still keeping things mostly at high/ultra.
@@ts8960 Video players have renderers like madVR that use high end algorithms that make the picture look good at any resolution. The only real advantage of 4k is in text rendering.
lol my 3080 is barely keeping up with 60-90fps with today's games at 1440p, mixed settings of medium/high. I try to aim for 120fps and even that isn't easily attainable with my current gpu. I'd rather not spend $1k+ for 4k 120fps
Same. With my 3080 I still use 1080p. I tried DSR at both 1440p and 4K. Even with DLSS, an image upscaled from 1080p to 4K performs worse than native 1080p with DLAA.
What games are you playing? I have a 3080 12GB and so far I play all my games on a 1440p monitor at ultra with no problems, and I also game on my 4K TV. With DLSS Quality and playing with settings I can get 70-100 fps even in games like RDR2?!
I just bought a 1440p monitor for my 4070S; especially with the graphical updates I want to run games like FFXIV at max settings at 1440p, and I should have no problems running it at the max refresh rate. I can completely max out Cyberpunk at 1080p, but it still feels strangely pixelated. Not immediately buying a higher res monitor with a $1K GPU was probably dumb.
Me not going 4K is because everything larger than 27" is just too big for me. I have tried 32" and switched back to 27", and a 27" 1440p monitor is super fine. I will switch to a 27" 4K monitor in a year or two when we get stronger hardware to run games at more fps at 4K.
You sound like me; then I bought the G9 ultrawide curved and that's what let me get the big screen. Now your only problem is whether to get the G9 1440p 50 inch or the 4K G9 57 inch. I went with the 1440p because it seems all the tech for 4K gaming isn't there yet on the PC or the monitor side. There's no DisplayPort 2.1 on NVIDIA yet, or anything that could even really utilize it fully; the monitors for gaming at 4K are still newish, and it all just doesn't seem like it's come together just yet.
So I went 1440p and will upgrade if I get a 5090 with DisplayPort 2.1 some day, as it just doesn't feel like the standard yet. Maybe in another gen or two it will get there without having to pay $2K for a card and another $2K for a monitor just to be able to play, like, two games where you would even notice the difference. For now, to me anyway, it just feels like meh, not worth it.
Anyway, moving to the ultrawide curved format is what finally let me feel comfortable gaming on big monitors, and I've got to say, it feels really nice.
These 32" 4K OLEDs can't come soon enough (I won't be able to afford them)
1440p OLED at a 240+ refresh rate seems ideal for all types of games until we all have 4090-or-better hardware in 5 years. But yeah, if you don't play competitive shooters then I could see 4K being ideal.
Exactly. 1440p at 120+ FPS in multiplayer shooters and 4K in single-player games.
Even the 4090 can't run Fortnite at 4K up to that fps, even on low settings. In ideal circumstances it runs 140fps, on low settings. In other words, the guy in this video is talking out of his ass, because we're still a couple of generations away from affordable and REAL functional 4K.
@@thames21 No one buys a 4090 to play Fortnite lol. That's a GPU you buy if you want to play games like Cyberpunk 2077 with Raytracing turned on.
You can still get 100fps + in Fortnite at 4k max settings with ray tracing turned on with a 4090, I literally just watched someone do it. A 4090 also has DLSS 3.5 Frame Generation.
Reference: "Fortnite Season 4 - RTX 4090 Ultra Settings (4K + Ray Tracing)", uploaded to YouTube by RTX GamePlays.
@@acurisur a lot of ppl buy the best cpus and gpus to play fortnite valorant overwatch r6s
@@zatchbell366 Source?
I only play competitive/esports games. Gpus are now powerful enough and cheap enough to push 1440p240Hz in many of those games. 1440p240Hz monitors have also come down enough in price, while still maintaining low total input lag. That's why I'm going with 1440p
I can only find 1440p at 180Hz for under $250; are there any 240Hz 1440p monitors at 200-250 dollars?
@@maxypad3379 there was an acer one that hit $250 recently. But it's not a great model. Next cheapest is the hp omen 27qs which has a super sale price of $300. It was at $300 for like 2 weeks back in July. And it hit that price again a few days ago for early Black Friday. But it sold out at that price and is back way up at $430
What is a CHEAP gpu that can push 1440p240Hz?!? 😳
@@MaximusAdonicus 6700xt would be sufficient. Or even lower if you only play very easy to run competitive games
@@shoobadoo123 Welp, it's not exactly cheap, but moderately priced compared to others on the market... Speed-wise it's on the higher end though...
Well I know not to take any of your videos seriously now.
Literally thought this was like a joke and the end was going to be a rickroll.
It's not just the monitor. For example, I have a desk setup with my GPU, USB-C hub, cables, KVMs, etc. 1440p monitors work just fine with USB 3.2 hubs and HDMI 2.0.
In order to have a fully functional 4K setup, you need a 4K monitor, which costs twice as much as a 1440p monitor, a USB4 or Thunderbolt dock, which is almost double the price of a USB 3.2 hub, and an HDMI 2.1 or DisplayPort interface, which also reduces your options.
Also, bigger monitors require bigger desks, stronger monitor arms and more expensive GPUs. DLSS and FSR have taken big leaps, but you just can't DLSS your way up to 4K yet.
Considering all of this, jumping from 1080p to 1440p costs like 55% more, while jumping from 1440p to 4K costs almost 200%.
1440p IS the sweet spot for the vast majority of people. We get it, you tried 4k and can't go back. For the rest of us who can't afford 4k displays, or even run it to begin with, we don't care.
4K monitors have gotten way more affordable nowadays
@@evaone4286 Yeah, sure, the display is more affordable now than it has been in the past, but you still need a top-of-the-line PC to get the fps needed on newer AAA titles.
4K is overkill for PC monitors, which are usually 24-27 inches (more recently 32). 1440p on that size screen makes more sense because most people literally couldn't tell the difference, and it nets you more fps.
You need 4k at 32" or bigger. Had a 35" UltraWide with the UltraWide version of 1440p and it was awful. Ended up doing 32" 4K with 144Hz. Having too few pixels on a large display makes it look bad.
I've used a 32" 4K LG monitor since 2018; it's not all that great. The picture is good, but sitting 2 ft away from a screen that big kinda sucks imo, and I had to go up on the scaling to even read stuff on the screen because 100% is just too tiny at 32".
I disagree. 1440p at 240hz is way better than 4K at 144 or whatever
I have a 4090 and I’d much rather something with an actual decent frame rate
Trust me if there was 4K at 360hz or something your boy would be buying it
240hz is for professional gamers don't kid yourself
@@BourbonBiscuit.
It seems to me that too many players consider themselves "competitive". Just because they play Valorant or CSGO.
Even if they don't make any money out of it.
Even though I am not a "professional gamer" I got a 1440p 240Hz display for the sole reason that it makes games noticeably more fluid and enjoyable, especially very fast paced games where the image changes a lot. And yes, it's a very noticeable upgrade from 144Hz, not as much as from 60Hz but still very nice. @@BourbonBiscuit.
There is no GPU that can run 4K at a 360 fps minimum. Not even the 5090 will be able to do that, and not even the 6090. The 7090 maybe, but that's probably 2028/2029, around the same time the PS6 will come out. Also, the visual benefits of 4K are so much greater than 1440p that not even the higher refresh rate of 1440p is able to level the playing field. The difference between 144Hz and 240Hz is pretty small, but the difference between 1440p and 4K is huge.
@@BourbonBiscuit. It still looks better; it doesn't have to be for professionals.
Idk I’ve downgraded from a 4K monitor to a 1440p and really don’t feel like I’m missing much
That's only if you use a monitor bigger than 27 though.
Because (I used to be a computer monitor salesman for years), you really can't see the pixels in a 1440p vs 4K 27 inch unless you're pixel peeping, and pausing still images on your system.
So, if you do get one, make sure it's 32 inches at the minimum.
Either you don't spend much time playing games or you have blurry vision. Even using DLDSR 2.25x on a 1440p 27" I can see a much sharper image.
@@kiburi2903 Would you recommend 4K 27" monitor? I can't go bigger than 27 inches
I would honestly need to do a test drive of the 1440p and 4K monitors I plan on buying to see, with my hardware, what each game I enjoy playing can look like with different settings/upscaling, and what fps I can get with them. With it being SO dependent on each person's rig, preferred games, desired balance of fidelity/framerate, etc., it's IMPOSSIBLE to suggest one is superior to the other at this stage. I've been a 32" 1440p user for close to 4 years now, and the pixel density of 1440p at even 32" is enough to make games look crispy, albeit less so than 4K (duh). Basically, you can't miss what you don't have, and I've enjoyed gaming at 1440p so far.
HOWEVER, with this 32" monitor in question just recently kicking the can, I'm in the market for something, and decided that if I stick with 1440p, I'm absolutely going to a 27" to recover some clarity from the increased pixel density over a 32", or going to 4K. So, to that end, it seems I agree with this video: I do desire more clarity. But this conversation cannot be had without bringing up panel technology. 4K is still *mostly* restricted to traditional IPS and VA, while 1440p is readily available in far superior rapid/fast IPS and OLED panels. What good are all the pixels of 4K if the color, contrast, and ghosting detract from the experience? Traditional IPS at 4K would be the best option, and probably a great experience for those choosing to go 4K. But how many people might take the recommendation of this video, make the jump to 4K by getting a VA panel, and then be disappointed by not only their reduced framerate, but color banding and ghosting as well? This all goes back to what you're looking for out of a monitor and your rig, and even specifically what kind of games you play. It's no surprise that if you're an FPS sufferer, you'll likely stick with 1440p (or even 1080p if you're really sweaty) monitors to take advantage of fast refresh rates and lightning-fast pixels. Whereas, if you're mostly a single-player game enjoyer, or Excel enthusiast, 4K could be the route for you.
This is so cringe... I got a 4K monitor with my build and it was probably one of the worst tech decisions I've made.
Why
@@zeldars it just doesn't run well at 4K lol
Basically me 5 years ago. I upscaled everything just to see, and every program looked bad due to the scaling, from websites to basic applications, with overlapping or oversized pixels that make the image look blurry. Games ran at best 60 fps even with an $800 GPU, and dipped down to 50 fps with the same GPU... Sold it at a loss and don't regret any of it.
Same. I have a Predator monitor that requires TWO DisplayPort cables to work at 4K @ 144Hz.
But then it only works in that exact mode. So if I want to play at a lower resolution to get more FPS, I have to disable the "4K 144Hz" mode with the buttons on the monitor itself.
Unless you buy the newest high-end GPU every generation, you'll not be able to max out the graphics and play at 4K with high frames anyway.
It's seriously the worst investment I've ever made.
4K just isn't there yet. Maybe in a couple more years, but price/performance/display quality isn't in the right place yet.
in 5 years we can have games at 4k with 240 fps+, just take the time and wait xD
I am on a very good 1440p IPS monitor atm. My next monitor will be 2160p, but I want it to be an upgrade in all aspects. As it is now there are always downgrades to some aspects in new monitors. Sure I might get 2160p OLED with awesome HDR and 240Hz, but then I have to deal with low brightness, wonky sub-pixel layouts, burn-in and a host of other things that are not currently an issue with my monitor. Or I could get a 2160p IPS with pseudo HDR, but then I go down to a lower refresh rate of 120-144Hz. Not to mention that they are still using DP1.4 and running DSC at maximum compression to be able to handle the bandwidth. It is hard to justify spending $1000 on a monitor just for a bunch of extra pixels with other downsides.
With you on that. I just upgraded my computer with a 4090, new CPU, all of that. I really want a 4K monitor, but I also really want an OLED panel; I can't go back after getting used to the quality of the blacks and overall colors. There's nothing 4K and 32 inches that fits the bill. I can get 27 inch, but the pixel density at 27 inches for 4K is so high that it seems a bit overkill. Other stuff is all mini-LED or IPS but with sacrifices in one area or another. I ended up getting the ultrawide Alienware DWF, which runs 1440p ultrawide, and I've been really loving it thus far.
DSC can do 4K 160Hz at 12-bit color with no chroma subsampling, using visually lossless compression.
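(Rough numbers behind that claim, for anyone curious — this is only a back-of-the-envelope sketch that ignores blanking overhead and assumes DP 1.4's ~25.9 Gbit/s payload and DSC's typical ~3:1 compression of 12-bit RGB:)

```python
# Back-of-the-envelope bandwidth check for 4K 160Hz 12-bit RGB, no chroma
# subsampling. Figures are approximate: real links also carry blanking.
width, height, refresh = 3840, 2160, 160
bpp_uncompressed = 36      # 12 bits per channel x RGB
bpp_dsc = 12               # a typical DSC target bitrate (~3:1 compression)
dp14_payload_gbps = 25.92  # DP 1.4 HBR3 after 8b/10b encoding

pixels_per_second = width * height * refresh
raw_gbps = pixels_per_second * bpp_uncompressed / 1e9   # ~47.8 Gbit/s
dsc_gbps = pixels_per_second * bpp_dsc / 1e9            # ~15.9 Gbit/s

print(f"uncompressed: {raw_gbps:.1f} Gbit/s (doesn't fit DP 1.4's {dp14_payload_gbps} Gbit/s)")
print(f"with DSC:     {dsc_gbps:.1f} Gbit/s (fits)")
```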
I'm curious how anyone runs stuff in 4K. I've got the latest gaming PC and I can't even do that.
Just bought 1440p and I'm super happy, fuck 4k!
This guy has told us not to buy 1080p, 1440p, ultrawide, or QD-OLED monitors.
In conclusion, this channel is pretty useless unless you want to hear rambling about every single monitor out there. He even counter-argued himself and decided to sell off his monitor and replace it with a TV lmao.
Soon Braille displays will be his go to.
A decent 2160p gaming monitor and a GPU costs ~300% what a decent 1440p gaming monitor and a GPU costs, so no wonder people prefer to aim for 1440p.
One of the dumbest videos I've ever heard. Imagine saying "it looks blurry to me regardless of screen size." It's not just about the resolution, it's about the PPI, so screen size matters.
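(The PPI point is easy to quantify; a small sketch using the standard pixels-per-inch formula for a few common size/resolution combos:)

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

combos = [
    ('24" 1080p', 1920, 1080, 24),
    ('27" 1440p', 2560, 1440, 27),
    ('32" 1440p', 2560, 1440, 32),
    ('27" 4K',    3840, 2160, 27),
    ('32" 4K',    3840, 2160, 32),
]
for name, w, h, d in combos:
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")
# ~92, ~109, ~92, ~163, ~138 PPI: a 32" 1440p panel is about as coarse as a
# 24" 1080p one, which is why screen size matters as much as resolution.
```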
You are partially correct; 4K immersion and visual quality is unmatched. However, due to the massive 4K tax, 1440p/240Hz makes the most sense right now and for the foreseeable future. When you go 4K you need the best GPU every generation to keep up with the latest AAA releases. The 5090 will cost $2K; that 4K mountain will only continue to get more impractical next gen.
Running all your games at 1440p 240Hz also hits your wallet the same way 4K does.
240Hz is too minor an improvement over 144Hz, I would guess, as someone who has never seen 240Hz. I think I'd take 4K 144Hz over QHD 240Hz.
no
Even a 4090 struggles to push a 240fps minimum in 1440p games. If I had to choose, I would rather pick 4K 144Hz than 1440p 240Hz: a huge increase in visual quality and only a minuscule downgrade in refresh rate. 4K 240Hz is a story of its own; it's basically 2 monitors in 1. You have the high refresh rate that you can use at lower resolutions for competitive FPS titles, and when you want to watch movies or play AAA single-player games, you have 4K for that. But pushing a minimum of 240fps at 4K is impossible. Even the 5090 won't be able to do that. Maybe the 6090..
@@evaone4286 1440p 240Hz is even more expensive :) Because not only do you need the best GPU to get as close as possible to 240 fps, but also the best CPU so you're not bottlenecked by the CPU. For 4K you don't have to buy the best CPU because you will be GPU-limited most of the time.
"Ok hear me out before you yell at me."
Ok this is fair.
*Proceeds to have the dumbest take*
Me running games in windowed mode at 1080p on my 1440p monitor 🙃
😂 That must be so un-immersive
Why?
Does bro think we can afford expensive ahh GPUs to run 4K 60?? My PC barely runs 1080p 60.
I'm honestly just waiting for more OLED 4K monitors to hit the market before I make the move from 1440p
I’m waiting for the ASUS 4K QD OLED 240hz 32” monitor with glossy screen
They say it’s coming out 2024 Q1
@@Shadowsmoke11 Not sure about ASUS but Alienware will launch such monitors in early January of 2024.
Get an LG C series I don't think there's anything better. Of course if you're fine with the huge size and your desk is adequately deep
Is this a joke video? No gamer in their right mind would care about 4k if they can't get a consistent 60fps!
120 fps*
I'm still rocking my 1080p monitor my dude. If it ain't broke don't fix it.
Here in Brazil most can't even afford a 1080p monitor; 1440p is already a dream for us.
Don't... 1440p is still the sweet spot. Hell, 1080p for most, I'm willing to bet. If you can't shell out for a 4080 or better, keep it at whatever you need. 1440p will be here for a while and it'll arguably be sufficient for a long time.
I have both 4K and 1440p and I couldn't notice any difference at all; or in better words, it's not worth it if all you do is gaming. 4K is only worth it if you are a content creator; other than that I will go for 1440p high refresh rate. Right now I'm using a G9 5120x1440 as my main monitor paired with an RX 7900 XTX. But again, nothing right and nothing wrong: if you have the money, go for 4K, but make sure you pair it with a high-end GPU like an RX 7900 XTX or RTX 4090, or at least an RX 7900 XT or RTX 4080; otherwise there's no point in using 4K with lower/mid-end GPUs.
Since motion blur is incredibly expensive to fake in a game engine, people prioritize hertz first (it varies by title): hertz > native resolution > image quality. Balancing this equation perfectly is expensive because technology evolves. So there is no bad resolution.
I can't take anyone serious who says "goofy ahh".
I do not regret upgrading from a 1440p 165hz IPS to a 1440p 240hz OLED
However when the 50 series comes out I will be looking into upgrading to 4k 32in 240hz OLED / mini LED for the higher PPI
Watching this video hurts. 4K is stupid for gaming; if you want smooth edges, just turn on MSAA. Stay on 1080p, max 1440p, kids; both your wallet and GPU will thank you. Myself, I'm on 1440p and sometimes I tell myself I should have stayed on 1080p.
Lmao this man is a cyborg I guess, he said 8K is almost perfect 😂😂
4k is great if you play 5+ year old games. playing modern games usually requires a modern GPU and we all know how highly priced they are
The upscalers that you're talking about use a lot more VRAM at 4k res. 1440p fsr quality mode actually looks and runs better than native 1080p, but at 4k, the upscaler uses about 1GB more of VRAM, which significantly drops the performance
talks about resolution, ignores display size - opinion discarded.
Yea man my RX 580 (which is 7 years old) can DEFINITELY run any game at 4k 75fps with 0 hitches and 0 stuttering and also I DEFINITELY have the money to buy a 4k 144hz monitor.
Sir, until we have GPUs that can play everything in 4K at 60+ without DLSS/FSR, 4K is just not viable...
Yeah, everybody has four or five thousand dollars lying around to spend on a PC to get a 4090 and an OLED 4K. The best of the best, meanwhile the rest of us plebs are just gutter trash.
The biggest issue I have with 4K is ui scaling. Nothing annoys me more about 4k than ui elements being so tiny. I recently returned a very nice OLED 4K monitor simply because certain programs I rely on will not scale and are impossible to read without jamming my nose to the screen. Many things scale just fine which is great, but using a 4K monitor for the past few weeks showed me that overall, we're just not quite there yet. While I do miss the pitch black of OLED, 1080p has never looked so good!
I never felt like it's worth it to leave 1080p. I strongly prefer a smoother experience over looking super crisp. But then again I mainly play online shooters.
In which case you would do better with a high refresh rate 1440p. There is zero advantage to using a 1080p monitor if you have even a half decent gpu nowadays. You can pick up a good 1440p monitor for about $200 in the US.
Dude, 4K monitors cost almost $1K. My whole PC is $1.5K 💀
Would a 27" 1440p OLED look as good as a 32" 4K mini-LED? I just returned a defective 49" OLED G9 and am considering the newer 57" Neo G9. I have a 4090/7800X3D PC and mostly play racing and flight sims.
I have an RTX 4090 and an AMD Ryzen 9 7950X3D. I literally can't choose a monitor: should it be 2K or 4K? I play CS2, and also RDR2 and Bannerlord. What is the best choice? Can you guys help me objectively?
@@cemdagl4522 did you guys make your decisions?
It purely depends on viewing distance. I'm using a 42-inch LG C3 and it's super crisp, but I'm also not sitting right next to it. PPI is irrelevant.
video is invalid, guy uses the P key for storm arrow? that's some psycho shit.
I'd rather devs move forward with tech like path tracing etc. than optimize for huge resolutions. It will be a while before path tracing is even viable at 240p, let alone 1440p, but it will be cool when it is.
Lol four times the cost for a few inches more of screen space? I'm looking at the center of the screen anyways, idgaf if my monitor is 720p even
I just bought a 27” 1440p 144hz monitor for console gaming and I couldn’t have been happier! Absolutely perfect! Now the next generation consoles? I’ll def upgrade then.
Hey bro, what console are you gaming on? PS5 or Xbox? Also, what monitor are you using? I've been trying to research whether I should get a 1440p or 4K monitor for console gaming.
@@raheemafg30 Neither. Consoles are meant to be played on a TV. You don't need a monitor for a console. There's a reason why consoles never go above 60fps.
@@username8644 PS5 and Series X literally go up to 4K 120.
@@username8644 Where does it say consoles are meant to be played on a TV? Current gen can go well over 60Hz.
@@username8644 wtf are you talking about?
I have a 7900XTX. At 1440p with no upscaling in Cyberpunk I get 115-120fps. At 4K with no upscaling, I get 60. When I turn on upscaling I can boost my FPS but the image quality starts to look really ass. Why would I do that to myself?
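(Those framerates roughly track the pixel math; a quick sketch, assuming a fully GPU-bound game where performance scales close to linearly with pixel count, which is only an approximation:)

```python
# 4K pushes 2.25x the pixels of 1440p, so roughly halving the framerate
# when GPU-bound is about what you'd expect.
px_1440p = 2560 * 1440        # 3,686,400 pixels
px_4k = 3840 * 2160           # 8,294,400 pixels
ratio = px_4k / px_1440p      # 2.25

fps_1440p = 118               # the ~115-120 fps figure above
print(f"pixel ratio: {ratio}")
print(f"naive 4K estimate: {fps_1440p / ratio:.0f} fps (observed: ~60)")
```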
I was bitten by the 4K bug a couple years ago. Bought an expensive 4k HDR monitor. But even with a $1000 graphics card it still can't give me high frame rates at 4K. So yesterday I bought a nice 1440p monitor. Going back is the best move I ever made for my gaming PC.
Personally I think 1440p is still good, because if you don't have good internet and a very good GPU, you won't really be able to utilize 4K that much: you won't be able to watch 4K videos or play 4K games with reasonable performance on only mid-range specs.
1440p these days doesn't need super intensive or high-end components, and a 1440p video should run nearly as well as a 1080p one.
4K is great to look at, but the appeal of 1440p is having the best of both worlds between 1080p and 4K; that's why they call 1440p 165Hz the sweet spot for gaming.
interesting take
me: watching this on a 1080p monitor
Stop doing drugs.
there is a reason why so many people disliked the video. This is all a bunch of BS.
Low settings at 4K looking better than higher settings at 1440p? Hard disagree. Can you achieve hundreds of FPS at 4K on a GPU that isn't top of the line? Not really.
We chose 1440p because it's the perfect balance of high resolution and performance. Many games I run at 100-144fps at 1440p high on my Radeon 6800 wouldn't even manage the same performance at 4K, even on low settings.
I have best of both worlds: 1440p high refresh for competitive games/productivity & 4k glossy tv for story/viewing.
I upgraded from a 32-inch 4K 60Hz to a 32-inch 1440p 165Hz monitor, and I definitely prefer 1440p. My 6800 GPU almost doubles the fps at 1440p compared to 4K; 4K sharpness is slightly better, but there's not that much difference when you are moving. If it's not a gaming PC, then I can agree with you.
I am looking at the new Samsung 57" Neo G9 super-wide screen. I know it is overkill, but I enjoy flight sims, and the 1000R curve and 57" size make it more immersive for those flight sims that do not support VR.
I just reviewed it 👏👏👏
Sorry, but 4K using an upscaler is just not better than 1440p max settings native. It's better to hit that middle ground of image quality and great refresh rates, especially with 1440p QD-OLED; the image quality is nothing to turn your nose up at.
It goes the other way around too: you can DLDSR from 1440p to 4K and then use DLSS, and it will look much cleaner than just DLSS on a 4K panel ;)
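(A small sketch of the resolutions involved, assuming DLDSR's 2.25x factor and DLSS Quality's usual 2/3-per-axis render scale — both are assumptions here, treat the exact figures as approximate:)

```python
# DLDSR 2.25x on a 1440p panel targets a 4K-sized output image, and DLSS
# Quality then renders internally at 2/3 of that per axis -- which lands
# right back at native 1440p before reconstruction and downscaling.
panel = (2560, 1440)
dldsr_pixel_factor = 2.25        # total pixel-count multiplier
dlss_quality_axis = 2 / 3        # per-axis render scale for Quality mode

axis = dldsr_pixel_factor ** 0.5                 # 1.5x per axis
output = (round(panel[0] * axis), round(panel[1] * axis))
internal = (round(output[0] * dlss_quality_axis),
            round(output[1] * dlss_quality_axis))

print(f"DLDSR output target:  {output[0]}x{output[1]}")      # 3840x2160
print(f"DLSS internal render: {internal[0]}x{internal[1]}")  # 2560x1440
```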
DLDSR for the win!!!!
Unfortunately downscaling is nowhere near as good as a higher native resolution and I actually don’t recommend doing it at all.
You get less aliased edges, but a softer image.
@@thedisplayguy With DLDSR? I disagree, especially if you tweak the "blur" factor in the NVCP, and then use DLSS Quality and adjust the sharpness.
1080p pixelated? blurry?
delusional consumerism
This man got ratioed so bad lol
You know, in laptops ASUS has this screen tech where it can switch between 4K 120Hz and 1080p 240Hz on the fly. Why can't we have that in gaming monitors SMH?