Rocking a 32" 4K 144Hz monitor for my gaming PC and my PS5. 1440p with 60fps+ is where it's at. I'm not really planning on gaming in 4K unless it's an older title or a fighting game/RPG where 60fps is really all I need.
To be honest, the big jump is when you go from 1080p to 1440p; it's a very considerable increase in image quality. From 1440p to 4K, honestly, there's not as much of a jump, or at least it's less noticeable.
1440p output on a 4K monitor is about the same as 1440p native. Anything past that, like DLSS/FSR, 80% render scale, or even native, is nothing but upside, and on top of that you also have a much better desktop experience. Budget is really the only justification for 1440p IMHO.
I have both a 28” 4K 144Hz and a 27” 1440p 144Hz. 1440p looks OK, but once you look at a high-PPI 4K screen, 1440p looks blurry. The only time I play on the 1440p monitor is when I can't get the fps I want on the 4K monitor, which isn't often (I have a 4070 Ti Super).
@@yc_030 yeah, that one didn't make much sense to me either. Maybe it's a per-person type of thing, and some people can't really tell a difference once you go over a certain threshold? Like people with less than average eyesight? Either way I can definitely appreciate the difference.
@@kapler8550 I actually ended up looking it up, and the PPI of a 27in 4K monitor is 163 while my phone is 460 PPI, so not sure what he was talking about there.
1440P is the great new "workhorse" resolution for me. Perfect balance between performance and quality. Cheaper monitors as well. Use DLSS quality mode in many games and it looks great.
I think it also depends on the use case, as in whether you also use the monitor for more than just gaming. I use mine for office use 80% of the time, reading a lot of papers, and there I can definitely tell the difference between 4K and 1440p on a 27 inch monitor.
Yep, my 32-inch 4K monitor gets mostly used for work these days🤣🤣🤣🤦 Couldn't go back to 1440p just for that reason. DLSS modes from Performance up to Quality look fine at 4K, so 🤷. You do need a 7900XTX or 4080 class card though IMHO, especially if a higher refresh rate is wanted.
Got to be honest, I have been on 1080p for years, still on my 144Hz LCD monitor, though I am tempted to get a high-frame-rate IPS or some other panel with better color and contrast.
I had used my brother's old 32'' 1080p TV as a gaming monitor for a few years lol. Last year I bought a 4K one (though only 60Hz; having ~120Hz would be a great improvement), and 1080p vs 2160p is like having bad eyesight vs hawk eyes. The detail in everything that you see is just super worth it.
The required resolution depends a lot on whether you try to take in the whole screen or concentrate on just parts of it. If your game makes you watch the complete screen, I am still fine with 1080p. If you need to look around and concentrate on small portions, 4K is the way to go. With refresh rates there is something similar: if the game does not blur motion and you have fast movements of small objects, refresh rates up to 120Hz can make sense. If the motion is blurred and the game is not about small objects moving fast, even 30Hz can be sufficient. The next question is whether you are playing competitively or just for fun. Competitively, you will typically reduce the rendering quality, because all the shadows, reflections, and effects make it harder to see what you actually need. If you are playing for fun, however, the beauty and the emotions carried are better with more detail. There is never a "one size fits all" answer.
In tons of games (Cyberpunk, TLOU, RDR2, etc.), 4K DLSS Performance (1080p internal) will look A LOT better than 1440p native, because those engines use lower LODs when the output resolution is below 4K. Unreal Engine doesn't seem to suffer from this, nor do any of the Nixxes ports.
If you had to use 4K DLSS Performance (1080p) to play the game, what makes you think you can play it at 1440p native? Seems like kinda weird logic here. If we're comparing 4K DLSS Quality (1440p) with native 1440p, maybe that makes more sense. Then again, if you were willing to play with upscaling in the first place, why not go with the higher resolution display? Especially for older games, you can play at much higher fidelity than you would get otherwise.
@@_ch1pset DLSS and other advanced upscaling techniques are not "free"; they do have a compute cost, and in my experience, in terms of average FPS, performance at 4K DLSS Performance mode is largely comparable to 1440p, and slightly more demanding in VRAM-limited scenarios.
@@leomariano2735 you're right, it's not "free". Everything you said still reinforces what I said. What makes you think you can run 1440p native if you can't even or struggle to run dlss performance to 4k? If they are the same load on gpu and/or vram, you'll struggle with both. Logic still holds up there. Idk what your point is.
Likely because UE has been pushing for upscaling for a very long time, with TAAU and then TSR in UE5. So they decoupled their LOD range from render resolution a while ago.
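For what it's worth, the LOD behaviour described above can be pictured with a tiny sketch. This is purely hypothetical pseudologic, not any engine's actual code; the tier cutoffs are made up for illustration:

```python
# Hypothetical sketch: some engines reportedly pick mesh/texture LOD from the
# OUTPUT resolution, not the internal render resolution. If so, 4K + DLSS
# Performance (1080p internal) still gets the 4K-tier LODs, while native 1440p
# gets a lower tier despite rendering MORE pixels internally.
def lod_tier(output_height: int) -> int:
    """Return a detail tier (0 = highest) from the output height in pixels."""
    if output_height >= 2160:
        return 0
    if output_height >= 1440:
        return 1
    return 2

print(lod_tier(2160))  # 4K output, even upscaled from 1080p internal -> tier 0
print(lod_tier(1440))  # native 1440p output -> tier 1
```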
Made the switch to 1440p years ago when red dead 2 came out and the TAA blur was horrendous, didn’t cost too much performance wise for a very noticeable boost in image quality.
RDR2 has issues with DLSS making the image way too sharp. There's a mod on Nexus Mods to fix it, and it makes the game look 1000 times better. It's a must-have; even DLSS Balanced on a 4K display looks astonishing when you use it.
Keep in mind, sometimes 2K monitors will have better colors and contrast vs 4K, and 4K isn't necessary until 32 inches or above, especially if all you're doing is gaming. I also would choose visual fidelity over anything above 240Hz. You can tell the difference between 60Hz and 120Hz, but you can't really see much difference between 120 and 240, and even less above that; sometimes the higher-Hz monitors do have better response times, though.
Probably the same panel my Asus uses. I already had a high end IPS monitor, but the switch to OLED really was a drastic visual improvement. It's amazing just how much you lose out on when you have a backlight destroying the details on anything with contrast. I'd recommend a switch to OLED over a GPU upgrade for most people if they had to choose which to do first.
I want a panel that can last me 10 years or so, so OLED is a no-go. I bought an expensive plasma TV a long time ago and it really soured me on the experience. No peace of mind, annoying image retention, worries about long-term burn-in, high power/heat, etc. Don't want to go through any of that again.
@@maynardburger go read up on the Rtings OLED test. They have a YouTube channel simulating 1,000s of hours on modern OLEDs, including monitors. The tech has moved on, and the panels are a lot better at mitigating burn-in. 10 years is a hell of a long time though; I can't say I expect any device I own to last 10 years, or be relevant that long. Maybe my laptop, but even then I'd probably choose to upgrade around year 6-7. I think 5 years for an OLED with good mitigation is actually doable though. Expecting 10 years of top-end reliability seems delusional to me.
@@WildfireX So you wanna pay $800+ for a tech that lasts only 5 years? And 10 years is not a hell of a long time; my Samsung FHD TV is 12 years old and still running. That's why greedy corporations LOVE people like you: to sell you something every 5 years 🤣
I'm trying to decide what monitor to go for as I'm planning to build my first PC, and I want to get a Mac Studio as well. I want to game, stream, do video editing/YT, watch content, and everyday usage. I've been using a 1080p AIO "PC" since 2015; it's a 27 inch curved Samsung, and honestly I think it's okay even to this day… I also have a 5K iMac 2020 and it looks gorgeous. The PC build I've decided on is a 13600K w/ 4070 Ti Super. I think that can run 4K 240Hz, and the Mac Studio can also do 4K 240Hz with HDMI 2.1… Should I get a 4K 240Hz or a 1440p 240Hz instead?
1440p is the safe bet. The sweet spot. But I definitely think 4k is a bit demonised at the moment. It requires a bit more compromise, but you can get surprisingly decent performance in 4k with less than 4080 or 4090 level GPUs.
all about screen size and how far away you sit. 1440p is perfect up to 27 inch monitors imo, at 32 inches it just looks a little too pixelated at normal desk viewing distance. you get significantly better performance for not much clarity loss. 4K basically scales perfectly up to even 88-in TVs considering you sit so much further away from them.
If you have a strong GPU, use the PS5 on the monitor and the PC on the TV because the higher PPI will make PS5 look sharper (since it doesn't reach 4K on 98% of games) and you will be able to use your PC's full potential on the TV.
With modern game poly counts, 1440p is a sweet spot. In the future, as poly counts get higher, I can see 4K gaining more popularity, but there's not much point going 4K in a situation where even the smallest polygons are like 10 pixels wide.
This is complete nonsense. I have no idea where you're getting this notion from. lol 4K is always better, simple as that. Polygon count is not going to change any of that. Aliasing, even in micro-detail or sub-pixel detail, is better resolved at 4K, and there's already plenty of that. You don't seem to have any idea that distance is a thing. You're not viewing every object at arm's length; lots of smaller objects in the distance and whatnot are already in the 'subpixel' region.
@@maynardburger It isn't complete nonsense, it's just that you are not able to comprehend what I'm saying. Would be better if you tried harder tho. 1. I never said 1440p == 4K. I said 1440p is the sweet spot. Sure, 4K contains minor extra detail, but not so much as to be justified for most people, not with current polygon densities. Poly density has a direct effect on perceived resolution quality: look at a flat white image and you won't notice if it is 240p or 8K, because there are only 2 polygons in a flat white image. 2. Aliasing artifacts that are resolved by rendering at 4K will also be resolved for 1440p. Just use DSR 2.25x AI mode on 1440p.
1440p on a 16" mini LED 240Hz 4080 laptop works wonderfully for me. 1440p on a 16" display looks so sharp. You can't see the individual pixels, so you get the look and feel of 4K, with the performance of 1440p. I love it.
I have two monitors. One is 1440p, and one is 4K. I initially didn't think 4K was worth it even for my 4090 until... I discovered integer scaling. 1080p is a perfect integer scale for 4K, which means great clarity on a 4K monitor at awesome performance. And apps like Lossless Scaling make 4K monitors very versatile.
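A quick sketch of why that works, assuming nothing beyond the arithmetic: 1080p divides evenly into 4K on both axes, while 1440p lands on a fractional 1.5x factor that forces interpolation:

```python
def integer_scale_factor(src: tuple[int, int], dst: tuple[int, int]):
    """Return the integer scale factor if dst is an exact multiple of src on both axes, else None."""
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2 -> each pixel becomes a sharp 2x2 block
print(integer_scale_factor((2560, 1440), (3840, 2160)))  # None -> 1.5x, needs blurry interpolation
```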
I can tell the difference between native 4K and 1440p if I'm close to the screen. I can't, however, tell the difference between upscaling to 4K and native 1440p even if I'm close to the screen. Granted, things that are far away in-game might look pixelated, but it's always going to look like that no matter what the resolution.
I went from playing on a 4K LG OLED TV to the new MSI 1440p OLED monitor. I currently have a 4070 Ti Super as my GPU, and honestly speaking the 4K TV does look better, but the 1440p monitor is no slouch; the OLED really gives you beautiful colors and a really clear image that is close to 4K. But the added advantage of being able to run everything on higher settings with way higher fps output in 1440p really sold me. The monitor is 360Hz and is smooth as butter.
On a 4K 65" Samsung QN90B, I'm playing Dragon's Dogma 2 at 4K with DLSS3 and FG on an RTX 4080 between 70-130fps, at max visual detail. Yes, that does include some frame drops and stutter because something's up with their engine, but most modern titles I'm able to keep a steady framerate and squeeze towards 120fps on this TV. If you plan on doing PC gaming on a TV in a living room setup, 4K is the way to go, and it's totally worth getting a good GPU to drive it! The difference in details like small particles and textures is staggering with this kind of setup. With 27" monitors, 4K isn't going to get you much.
@@ThePgR777 that's what DLSS 3 does... so you can play most of the latest games at 4K with 120fps, and the majority don't even notice the difference between native res and Performance mode DLSS.
@@ThePgR777 Well yes, that's the whole point. If I had a choice I'd play at native resolution without upscaling, if games were actually optimized these days. Devs use upscaling as a crutch to be lazy, but we make do with what we've got.
27 inch 4k looks a lot better than 27 inch 1440p even with dlss. You have to use dlaa at 1440p to get close. I'm running 4k off of a 4070 at the moment and it has been great. I still have my 1440p monitor as a secondary.
@@Mene0 4k monitors are incredibly affordable nowadays. Have y'all not looked at monitor prices in a long time or something? Heck, we're even starting to see 4k/144hz monitors for like $400-450 now. A perfectly decent 4k/60hz IPS display can be had for like $200.
No point in buying a 4K monitor if you can't play at that resolution (even with DLSS). People keep talking like 4K is the norm, but the vast majority of gamers on PC are still playing at 1080p. That's fine. I have an RTX 4070 and a 1440p monitor, and more often than not, I have to use DLSS for modern games (Quality if I'm lucky, lower if there is some path tracing involved). I can barely play at my native resolution already. 4K is great if you have a 4090, but that's pretty much it.
I would have to disagree with Alex. I have a 165Hz 1440p and a 4K 60Hz monitor in my setup, and like 90% of the time, I prefer to game on the 4K. I feel like the only use case for the 1440p, especially since just about every game now relies on upscaling, is games where fast motion is preferred, such as FPSes, but even then I prefer the 4K setup for most non-esports games.
I love 1440p. The perfect balance between picture quality and frame rate (and to me, frame rate takes priority). I went from the OMEN 27i to this LG OLED 1440p ultrawide and it's amazing! Some say that the pixel count is low and it's possible to see individual pixels, but I can only see them with my face planted in the screen... which isn't my normal viewing distance. Rocking a 6950XT, and it's nice to turn on high settings and still get pretty high frames. Seems like you would have to pick and choose between quality and frame rate more often with 4K, in my opinion, but that's just my opinion.
I think we are kind of missing another point: it also matters what kind of games you play. For example, if you play competitive first-person shooters, higher frame rates to keep up with the speed of the action matter more than beautiful textures, so solid 1440p with high frames matters more than 4K with lower frames, especially if you turn on ray tracing with everything maxed and only get 50-60 fps. However, if you're playing solo/RPG story games in non-online settings, like Dragon's Dogma, Baldur's Gate, God of War, etc., I think you would want 4K to maximize the experience and immersion with ray tracing turned on. The action won't be so fast that you can't respond or be competitive, so fps doesn't matter as long as it is at a playable frame rate.
Haha, funny that you made this video; a few days ago I switched to my 4K monitor from 1440p (both LG IPS 27 inch), and even though the 4K is 60Hz it's much better, very noticeable. And I have a 4070, and even Dragon's Dogma 2 gets to an easy 60fps, so...
4070 is very capable at 4k. I'm still able to play a lot of modern titles over 100 FPS with dlss quality which looks great at 4k. Can run most 5+ year old titles at native with high FPS
If I'm not playing a competitive FPS, yes. 4K is not only resolution clarity; it looks like everything on the screen is right there. So yeah, I was playing Horizon and DD2 on both displays and I much prefer 4K 60fps. @@crazy-nephew_5857
Correct, it's very capable, and I'm not even using frame gen. I was watching some guy doing a benchmark of Horizon FW on a 4090 and he was getting only 20-25fps more at 4K. @@mojojojo6292
Probably next year we're gonna get a 4070-class GPU capable of running 4K at 100+ fps, or at least something similar to the 4080 Super. But for that you have to buy a 4K TV or monitor capable of a high refresh rate.
Depends on your budget. I've got a 4090/i9 build and am currently rocking the 1st-generation QD-OLED AW3423DW from Dell: a 1440p monitor, an amazing display, and for its size it has good PPI. The density is good enough for me, but I am considering the Asus ROG PG32UCDM, a 4K QD-OLED, later this year when it becomes available in my country and when/if the 5090 arrives. The PPI is just not high enough to fully satisfy me, although this particular monitor strikes a great balance. So again, the answer depends on your budget and wants/needs. If you check my case, you might be in the same boat; however, let's face it, we are a minority, and for most people 1440p is an absolutely great resolution. Screen size x viewing distance x PPI x your HW. I think if you have at least a 4080, you can go with 4K no problem, but if you are like me and want the best possible experience and are willing to pay for it, then I think it's gonna be the 5090 enabling 4K with truly high frame rates, and if you pair it with OLED you get it all.
My situation is the same. I have a 7800X3D with a 4090 and I play on an Alienware 3423DWF. I'm considering changing to a 32" 4K 240Hz, but I'm not sure about ditching the ultrawide for 16:9.
What about aliasing? Is it still noticeable at 1440p? It's what bothers me the most on my current 1080p monitor, especially how lots of games look blurry whenever I move the camera.
If you're talking about motion blur, it's the technique game developers use to compensate for a stuttering image (due to an insufficient refresh rate of the screen) when you move the camera around quickly. You can disable blur in most games. Having a monitor with a higher refresh rate is what you need then. Refresh rate (Hz) and resolution (1080p, 1440p, 2160p) are two different things.
Just bought an LG 32 inch UltraGear 4K monitor, coming from a 27 inch 1440p. I'm on an RTX 3080 and Ryzen 9 3900X; I can't hit 4K 60Hz in every game, but I can't deny the clarity of the picture is beautiful. Luckily most of the games I play aren't that demanding, or the demanding ones have DLSS. Loving my new monitor.
I personally decided to get a 4K monitor over a 1440p one not because of gaming but because of watching movies and TV shows in 4K. Most of the gaming that I do never hits 4K and is 1080-1440p, but that doesn't mean you should rule out a 4K monitor. It's also nice to future-proof yourself.
I have an RTX 4080 super and recently upgraded from 1440p to 4K monitor. I can say it's definitely worth it. With DLSS and Frame gen tech, you can easily hit above 60 fps in 4K max settings.
I have a 27" 4k monitor, and from my personal experience, there's no difference between the two at native resolutions, playing with upscaling technology however you can definitely see the difference.
You might want to consider getting glasses if you can't see a difference between the two. Granted, I don't think it's as noticeable as the jump from 1080p to 1440p, but it's still pretty big.
I'm still 1080 and fine with it. Had a 4K TV for a while and was quite underwhelmed by it. I was expecting something amazing but it didn't blow me away at all. Sold the TV and just gone back to a 1080 monitor with my PC. I have no desire to upgrade at all.
Well, then go to the Nvidia Control Panel if you have an Nvidia GPU and enable DLDSR for 1440p & 1620p; it will make your image look a lot sharper and will remove 80% of your jaggies. It doesn't look as good as native 4K, but it is a huge jump from native 1080p on a 1080p screen. Just don't use it on the desktop, only inside games, or your image will become blurry. Also check that you have scaling set to off, or your image will look zoomed in in some games.
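For context, those two resolutions fall out of DLDSR's 1.78x and 2.25x multipliers, which apply to total pixel count. A minimal sketch of the arithmetic (the helper name is mine):

```python
import math

# DLDSR exposes two factors, 1.78x and 2.25x, applied to TOTAL pixels,
# so each axis scales by the square root of the factor.
def dldsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    s = math.sqrt(factor)
    return round(width * s), round(height * s)

print(dldsr_resolution(1920, 1080, 2.25))  # (2880, 1620) -> the "1620p" option on a 1080p panel
print(dldsr_resolution(1920, 1080, 1.78))  # ~(2562, 1441), listed as 2560x1440 -> the "1440p" option
```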
I'm actually thinking of doing both: a 1440p screen for shooters and a 4K TV for games like God of War, Jedi: Survivor, Assassin's Creed... My guess is a 4080 Super could probably handle that load.
I have said before that 1440p or 4K will look absolutely glorious on its own; either resolution will be just amazing on any OLED. BUT once you have both displays next to each other, then of course you will see that 4K is vastly superior. The key thing here is having them next to each other, because honestly your eyes will get used to whatever resolution and monitor size after a couple of hours. If you're strictly only gaming, then I would settle for 1440p; however, in my case I also watch a lot of content and do a lot of productivity work, so the better image quality and crisp text are a must for me. I don't mind using upscaling if needed.
I'm still gaming on my old 1080p TV because I'm a console gamer and most games are either 1440p 30fps or 1080p 60fps, so there's no reason for me to upgrade.
I have a 3440x1440 PG35V and I prefer to play my games at 5K; the latency increase is the only asterisk for FPS games. I am looking forward to the future 4K OLED monitors that have higher frame rates (especially at 1080p).
I have the same dilemma, although I am leaning heavily to 4K 144hz+ now for several reasons, even considering I am a midrange buyer: 1. I have had a 1440p 144hz monitor since 2016, and I don't really see the need for 240 hz being primarily a single player gamer (I probably would have stayed on this monitor if not for getting tired of being locked into G-Sync only due to it being a pre G-Sync Compatible era monitor and the aging color quality vs new models). 2. Upscaling being quite good even in Balanced and Performance modes if the target is 4K. 3. If worst comes to pass (like Alan Wake 2) there is the option of integer scaling from 1080p while retaining all the eye candy, even full RT. 4. Better compatibility with the PS5. 5. Nicer looking text is a treat for me as a Software Engineer looking 8h a day at text.
Depends on your monitor size tbh. I have a 27" 1440p screen, and some games look crap in comparison when in 1080p, and the difference between 1440p and 4K is bigger than 1440p vs 1080p. But if you're on a 5" smartphone, the difference between 720p and 1080p is hardly noticeable at all. At some point the pixel density just becomes pointless, and when it comes to text, it can be straight up detrimental. Screen size, viewing distance and settings make a big difference, so if I'm gaming on a monitor 2 feet from my face, I'm not using the same settings as when I'm gaming on a TV that's 7 feet away. It all varies based on settings and circumstance.
Was going to say the same thing; size is such a big factor. I use my 75" 4K TV as my monitor. 1440p is not bad on it, but does 4K look much better for most games? Yeah.
@@dece870717 I did upscaling multiple times in PC games and in emulators, and it was always a medium-sized jump at most. Even videos show it's not a big jump, and pics show this too; I watched vids on my 4K TV and the jump isn't big, like I said, it's medium. People on forums also say the jump isn't that big in games, PC or console. Text is where it shows a big difference from 1440p to 4K. Unoptimised games struggle to run native 4K even with monster cards like a 3080-4090; that problem doesn't always happen with 1440p. I'm thinking of switching back to 1440p. I have been using a 4K screen since last year; 1440p I had used for like 6 years. 4K has been frustrating with unoptimized/demanding games, even using high settings, with stutter/dropped fps.
@@4evahodlingdoge226 You're not wrong, but with DLSS and different resolutions you get better results at 4K. In some cases it's tough to distinguish 2160p vs 1440p.
I made a custom resolution of 2140x1200 for my 55 inch TV and set sharpening to 20%. It looks perfect to me, and the performance is pretty much the same as gaming at 1080p.
Personally, I prefer max settings at a high refresh rate on ultrawide 1440p over 4K/60. 144+ fps ain't always easy these days at 4K without fiddling with settings.
In the market for a new monitor after a new build of a 7800X3D paired with an Aorus Master 4080 Super, I was going to go with the Asus PG34WCDM 1440p ultrawide OLED, but that's nearly £400 more than the Alienware AW3225QF 4K QD-OLED. For me, it's a no-brainer given the cost saving.
@@andrewmorris3479 Agreed, but I mainly play story-driven games rather than multiplayer, so for me the bigger screen makes more sense, even at 27". If multiplayer/competitive gaming is your day-to-day gaming experience, then you'll be wanting frames over anything, so 1440p would probably be where you'd want to be.
To me, visually stunning titles like Cyberpunk, Alan Wake 2, Avatar, and the like are really great at 4K if you have a GPU that can handle them. I have a 4080, so I'm often going back and forth between using DLSS to get above 60fps or just playing at 30fps natively. Neither situation bothers me because the experience is enjoyable regardless. However, HDR10 should be a factor in choosing a display, because it gives games a pop of color that makes them more vibrant.
HDR is still generally too expensive in the monitor space. At least to get any kind of decent HDR experience. Still requires so much specialty technology to do right. If you're in the
@@maynardburger actually it's not as expensive as you think. If you get a VA, you could save some money. In January, I bought a refurbished LG 32" 4K 60hz Monitor (VA) with HDR10 for $184 including tax and a 1 year protection plan. I know that IPS is superior in the monitor space, but games look gorgeous on there. That's why I always advocate for HDR10 as opposed to HDR400 which is the most common for monitors. Also, it would allow you to use current Gen systems without a dropoff in quality. I'm not suggesting to get a 4K monitor because I do believe that 1440p is the sweet spot. I'm just saying that I would highly consider getting a display that has HDR10 regardless of resolution.
I feel like 4K 60fps ultra settings for non-xx90-tier GPUs is still a generation or two away. When I got my 3080, I was torn between 4K and 1440p and almost went with 4K, because in my head I was like "I have one of the most powerful GPUs of this generation," but I'm actually really glad I went with 1440p. I can crank settings and get over 60 fps in most any modern AAA game. Throw in DLSS and I'm getting close to high-refresh experiences with maxed settings. I just don't think that would be possible on a 3080, and maybe not even a 4080. The 4090 cruises through it, of course, but it's harder to justify such an expensive part. Maybe the 5080 will offer near-4090 performance at a price tier below.
If my card is hitting 90-100+ @4K (which it does), then I'm not gaming in 1440p just to say I'm getting 240fps. That's crazy for anyone that's not playing competitive CS or something like that. Not only that, but I have streaming boxes and things hooked up to said 4K monitors that benefit from 4K, HDR, Dolby Vision, etc. For those uninterested in those things, 1440p is a better bargain.
I've played GTA V on the Legion Go on ultra at native resolution, and the sharpness and clarity are crazy; I've never experienced that quality before and I like it, but I will still go back to 720p.
1440p vs 1080p: the advantage used to actually be the wider screen real estate, so you could see more in FPS games. I'm not really sure if the same still applies. It could actually be worse again because of the screen aspect ratio?
Pixel density does wonders for how sharp an image looks. For example, my 1080p 15 inch laptop looks impressively sharp because it's so pixel dense; a 24 inch 1080p monitor looks like crap in comparison because of the lower density. For a 4K screen to match the pixel density of a 27 inch 1440p panel, you'd need over 40 inches at a 16:9 ratio, so anything under that gains sharpness. So a 32 inch 4K screen will be very, very noticeably clearer and give you a bigger monitor overall. DLSS Performance at 4K is also over 100p higher in internal resolution than 1440p Quality. With that said, you need like a 4080 minimum to be sure you can run every single existing game well. You can push it a bit with a 4070 Ti Super, but you'll be cranking down settings in some of the harshest games like AW2. I'll also say that if you're looking to use Ray Reconstruction in any game, it is unbelievably smeary at 1440p DLSS Quality; you need at least a 4K output res for Ray Reconstruction to look good atm.
Or you just buy a high-end GPU🤷. The tiering of GPUs is really clear now: a 4070 Ti/7900XT is the most you need for 1440p for a really great experience. However, at 32 inches and up 4K is a better res for the monitor size, which means a 7900XTX/4080 or 4090 is needed.
Of course it makes a difference. It makes a massive difference when the viewing distance is so small. Even 4K is unfortunately too low-res for 27 or 32 inches. Apple got it right with 5K for 27 inches, which is over 200 PPI with an effective resolution of 1440p, and 6K for 32 inches. However, we don't really have the standards to transfer that amount of data with higher refresh rates + HDR.
Unfortunately most people seem to be blind these days, because they think 1440p isn't a blurry mess. Imagine if we hadn't had people like DF pushing 1440p all those years ago: would we have had reasonably priced 4K monitors and GPUs to run them? I'm sure Nvidia was thanking God for this hive mind, seeing as now they could make so much more money.
@@OG-Jakey🤦 You can't cheat physics. 4K is a lot of pixels to render. We've only just got to the point, with a $1600 GPU, where you can genuinely have native high-refresh-rate 4K rasterised content... until you turn on RT or PT🤣🤣🤣, then it's DLSS time.
@@OG-Jakey Is it that most people are blind, or that your eyes are too good? 4K is great for 27 inches, even though most would disagree; however, this guy is saying it's too low of a res for that size lmao. Maybe for text clarity 5-6K is necessary, but for gaming you can forget about it. The 4090 is starting to struggle in certain games at 4K maxed out with RT/PT. Also, viewing distance matters.
I got a 4090 and a 1440p monitor. It allows me to get great performance even in AAA games. If I want 4K, I just plug it into my 4K 120Hz TV. Best of both worlds.
It absolutely does lol. DLSS at 4K produces much better quality than 1440p at basically any level except Ultra Performance. 1440p makes no sense anymore.
@@omarcomming722 Agreed. I'm using a 4070 Ti Super at 4K, and DLSS Quality or Balanced basically looks like native, and Performance looks really good too. The higher res you go, the bigger benefit you get from upscaling tech and the better it looks. If you have a decent Nvidia GPU and plan to use DLSS a lot, I think it makes more sense to go 4K vs 1440p nowadays.
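To put rough numbers on the 4K-vs-1440p DLSS comparison, here's a small sketch using the commonly cited DLSS render-scale factors (individual games can override these, so treat the outputs as approximations):

```python
# Commonly cited DLSS per-axis render scales: Quality ~66.7%, Balanced ~58%,
# Performance 50%, Ultra Performance ~33.3%.
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5, "ultra_performance": 1 / 3}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(2560, 1440, "quality"))      # (1707, 960)
# 4K Performance renders 120 more lines internally than 1440p Quality,
# and then reconstructs to a 4K output instead of a 1440p one.
```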
I want to buy a 4K monitor because I want to watch 4K movies on my PC, and I want to play games at 1440p as my graphics card is an RTX 3080. So will 1440p gaming on a 4K monitor look noticeably bad in comparison to a native 1440p monitor?
4K is superior to 1440p; PC components just need to catch up in price for gamers to make the jump en masse. But I doubt the jump from 4K to 8K will happen, though; the human eye doesn't upgrade along with the pixel count, and most people struggle to spot the difference between 4K and 8K on a PC monitor.
After 4K I want smoothness. I'd much rather have 240Hz 4K than 60Hz 8K. 2880p is the next jump in resolution for me, I think, but I'm not making that jump until after 4K 360Hz plus. I'm gaming on the AW3225QF with a 4080 Super, and tons of titles already run at 100+ frames. Older titles and esports titles do actually hit 4K 240Hz right now. I want a 5090 on release to really push this monitor. I'm hoping 4K 360Hz or higher monitors come out around the time the 6090 does.
It's a huge step in price from 1440p to 2160p if you are a native-resolution, zero-frame-gen guy like me. I'd rather have a high framerate with no distortion of the picture.
A lot depends on the anti-aliasing. Some games with really great anti-aliasing can look great on a 1440p monitor. Others still let you see individual pixels and jaggies at 4K. Running those games at 8K and downscaling to 4K gives a big improvement in image quality, so I'd expect to easily see the difference if those games were run on an 8K monitor. A better solution, though, is for developers to give their games decent anti-aliasing! Running Rocket League at 8K rather than 4K with my 4080 drops my fps from 500 to sub-100, so running at 8K at the same frame rate as 4K today needs a multi-generation increase in GPU power. Better anti-aliasing would be a more sensible solution.
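The framerate hit lines up roughly with raw pixel counts; a naive sketch that ignores bandwidth and fixed per-frame costs:

```python
# Pixel-count comparison for the supersampling cost (fill-rate-style approximation).
resolutions = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
base = resolutions["4K"][0] * resolutions["4K"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 4K")
# 8K pushes 4x the pixels of 4K, so a 500 fps -> sub-100 fps drop is in the
# expected ballpark once other per-pixel costs stack on top.
```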
In Cyberpunk at 4K, you see a lot more detail. The problem is that 4K currently requires significantly more performance. So, 1440p or 4K? I mean: 4090 = 4K, 4080 = 4K or 1440p.
In my experience, 1440p provides a nice balance between quality and performance.
Samsung Display QD-OLED at 26.5” 1440P 360Hz is glorious!
You know resolution can be set internally? Set it to 1080p on a 4K display, so you get the best of both worlds whenever you want fidelity or performance. It scales perfectly.
1440p is the sweet spot for me; games look fantastic and perform great.
I simply chuckle at this whole debate and people asking what’s worth it.
It's one's personal taste and usage scenarios that are going to decide what something is worth.
People can present you with information about things, but it's never going to be that accurate, because except for maybe scientific hardware-focused reviews, everyone gives information with a bias.
Is 4K worth it?
For some, 1 billion percent yes.
For others, absolutely not.
For example, I have a friend that asked me if I think he should go to 4K from his current 240Hz 1440p monitor.
And I told him, bro…
When you bought your PC you kept telling me you were mind-blown by how fluid games were and how amazing the graphics looked. And when I came to your house I realized that, first, you were running low settings in some games because you never even tried to open the graphics settings menu to see what settings you had.
And second, you have been playing multiplayer shooters with me at 60Hz for this whole year, because you never checked and your monitor was set to 60Hz.
I don't think you'd be able to tell the difference going to 4K; you really aren't demanding.
I, on the other hand, had a secondary 1440p display that I used as a side monitor while my main media consumption, work, etc. was on the 4K one.
And I had to sell that 1440p and go 4K for the secondary too, because I just can't bear 1440p; it looks blurry to me.
I'm used to the crispness of 4K and I can't stand 1440p.
So 100% worth it for me.
@@lawyerlawyer1215 I assume for general media you need a big monitor right? Space is a factor often omitted, really depends on the room and how far you'll sit.
1440p 165Hz with my 4080 is a wonderful setup.
Overkill for your monitor
@@veilmontTV Not with games nowadays being unoptimized trash.
@markedfodeath that's why you don't play new releases on launch unless you're willing to deal with it. Still overkill for 1440p
@@veilmontTV Nope.
@markedfodeath You playing Alan Wake 2, or using ultra settings including ray tracing? I'd imagine not. What game are you playing that makes that card not a waste?
I just play competitive FPS games on my 27 inch 1440p monitor and story games on my 4K 65 inch C2 OLED.
Is your wallet ok 😂
@@crazysaturn0803 Could get a C2 OLED for $1k at Costco at one point. It's not that much money bro. lol
nice combo
@@crazysaturn0803 my wallet is getting light but I want a 1440p 240hz oled monitor lol
@@chaboinas yes sir
The nice thing with 4K, I find, is that each time you upgrade your GPU a generation, it opens up a wider range of games that your GPU can push at 4K.
It's very demanding to run the latest games at 4K max settings, but you can run them on modest GPUs with medium settings, and older games, 3+ years old, you can typically get away with 4K max settings.
4K is also great for emulation and such things. It's awesome upscaling old PS1, PS2, Dreamcast, and Xbox games to 4K.
In the end, it will come down to the individual. Do you want fidelity, or response time? Not everyone cares about online shooters. Someone may want to sit back with a coffee, and play Baldur's Gate 3, with the absolute best possible image quality they can get, at 60 fps, or even 30 fps.
Myself, I am on a 4K OLED 120Hz, and I wouldn't give it up for 240Hz at 1440p, but that's just me. Others would, and that's perfectly understandable.
Which monitor are you using? I'm trying to find a 4K OLED with no higher than 165Hz; competitive gaming is just not for me.
@@Daniel-x2d9k I am using an LG C4 TV actually, and it works quite well as a monitor.
If you don't use game mode, which I don't, the picture is very vibrant and the colours are quite rich.
Game mode dulls the image substantially, though; that's not exclusive to this screen or anything, as all OLEDs do that in game mode.
If you use a custom picture setup, it looks very, very nice in HDR, and in SDR as well.
I game on a 1440p 170Hz monitor and I love it. I replaced my old Samsung TV with a 43in 4K 144Hz monitor in my living room with my Series X and Shield, and it's nice too. Out of curiosity, I did lug my PC out there to try 4K, and it was nice, though I'd say that was more because I had proper HDR on that monitor, in addition to the bigger size and resolution. I still like the sharpness of my 1440p close-range monitor though.
Always 4K for me. You don't have to play at 4K just because you've got a 4K monitor. I love and appreciate the clarity of a 4K monitor for every other non-gaming task. I can't go back to a lower resolution monitor.
And now imagine that Apple users dislike 4K because it's blurry. It is. Retina is the only perfect monitor, and one with 120Hz+ doesn't exist yet.
@@WybremGaming what are you on about
@@JustMikeBro They are saying Apple destroys the whole market with ACTUAL pixel density and visual clarity. They are also destroying the market with their new silicon.
@@JustMikeBro They probably just learned about macOS's inability to display correctly at 4K and are confusing that with pixel density.
@@WybremGaming"Retina" is a pixel density measure. You can 100% have a 4K monitor within Retina's specs.
I'm glad the main Digital Foundry guy ("Rich," I think) mentioned pixel density, because that is the real relevant factor. How pixel dense is it, and how close are you putting it to your eyeballs?
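For anyone who wants to put numbers on "how pixel dense", the standard diagonal-PPI formula does the job; a small sketch with some common sizes as examples:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h, d) in {
    "27-inch 1440p": (2560, 1440, 27),
    "27-inch 4K":    (3840, 2160, 27),
    "32-inch 4K":    (3840, 2160, 32),
    "27-inch 5K":    (5120, 2880, 27),
}.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
# 27-inch 1440p: 109 | 27-inch 4K: 163 | 32-inch 4K: 138 | 27-inch 5K: 218
```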
I play at 4K DLSS Quality (1440p internal) on a 55 inch QLED whenever I can, and custom resolutions like 1620p and 1800p too if reconstruction is not possible... Real 4K is rare for newer games, but the detail of 1440p is enough, and the HDR and level of detail that I get out of a big high-end TV close to me is outstanding.
I like to use 3840x1600 for the 24:10 ultrawide aspect ratio. 3840x1920 is also good, it's 2:1. I like going letterboxed if I really need that extra bit of performance. I use a 42" OLED though. But I am planning on downsizing to 1440p with a newer 240hz OLED monitor. Honestly, my desk is just a bit too small, and I have to have my second monitor mounted very high and it can be annoying looking up. I could also just buy a better chair with a head rest, but for the price a new chair would cost, I could get at least one of those monitors, maybe 2 depending on the chair.
Does your display have VRR?
@@randyrrs7028 Yes, it's a QN90B.
@@randyrrs7028 All modern midrange+ TVs have VRR.
I have a 3440x1440 monitor at 120Hz. I am completely satisfied with this setup. I only plan to go 4K with the next generation of cards. I currently have a 5800X3D and a 6800XT. It runs most games I wanna play perfectly at 120 fps.
I might try nvidia next generation due to their superior RT and upscaling technology.
Don't be a traitor now 😜😂😂
make sure you get an Nvidia card with enough Vram to support 4k
I think TAA makes 1440p not feel as good anymore. Games just sort of look blurry. On my 1440p monitor, whenever I use DLDSR and DLSS to output a 4K image, TAA issues are significantly reduced. The image is much more crisp, and this is only on a 1440p display. Native 4K should look a fair bit better than even that.
Yea, you're right, and it's because TAA is not a good AA. It def helps jaggies, but the amount of blur that happens is unacceptable. Even if you use TAA at 4K, sure, naturally it looks better, but it's still blurring more than it should, which takes away from the crispness of 4K to some extent.
@@anthonyrizzo9043 Too many PC gamers are obsessed with 'sharpness'. TAA was only 'bad' in the odd example (especially earlier on), but was mostly hugely worth it, and the hit to clarity when using 4K is negligible given how much clarity there still ultimately is.
@maynardburger I mean, if you have a 2000 to 3000 dollar PC and a screen that's 1000 dollars and you run games in 4K, it really should be sharp.
@@anthonyrizzo9043 God forbid gamers don't want a vaseline filter over their entire screen. Having to go up to 4K just to get clarity levels that we had at 1080p is insane. I shouldn't have to spend huge money on a setup so I can get the bare minimum of what I got in games a decade ago. TAA is a blight.
There's a fix for the blurriness: install ReShade, find the LumaSharpen and Clarity injectors to make the image sharper and clearer, and done. The other two injectors I use are Lightroom and Curves. Lightroom is a toned-down version of Photoshop. You use Curves to darken the shadows and brighten the light, then use Lightroom to tweak the colors, vibrance, and saturation, and fine-tune the shadows and light with shadows, midrange, and highlights, and you've got an HDR look without HDR. It's really a shame that so many gamers don't know about ReShade.
Before getting a 4K monitor, it's worth checking GPU reviews to see how the flagship ones are doing at 4K after a gen or two.
People need to keep in mind that ANY demanding game nowadays will offer reconstruction options. You don't need to run at native 4K to get good use from a 4K monitor. That's old thinking.
@@maynardburger People who've been playing games for the last few years have noticed the existence of upscalers ;) Or at least the group of them that tends to tweak graphics settings.
maxed settings and dlss quality mode at 4k! 🤤
This is the way
this is the way indeed.. after going 4K, I can't go back
This guy gets it!!
@@littellgamer same
Even a 4090 struggles in Alan Wake 2 with that combination.
Went 4K in 2017 for PC gaming and never looked back. Once you exceed 27" then 4K is necessary unless you like looking at the pixels.
You must have the eyes of a hawk; I certainly don't. I have a 1440p 32 inch screen and can't see individual pixels 🤓
Nah, the difference between 1440p and 4K is very, very noticeable, as is the impact on framerate unless you run at least a 4080. @@youngwt1
@@youngwt1 Assuming you both have 20/20 vision, the thing that would make the difference is how far you're sitting from your monitor. Quite a lot of people seem to sit farther than 30" from their monitor. With my desk setup, I'm sitting about 16" from my screen due to limitations with my room space, so anything lower than ~100 PPI and the pixels are very visible.
@@epiccontrolzx2291 This.
With everyone going back and forth about if 1440p or 4k is the better choice... the viewing distance from the screen is the most important part.
1080p looks a bit different when you're looking at it on your phone compared to a flat screen TV lol.
@@justinp9170
This is the truth really.
I use my pc from the same distance you would use a console from, and in the same way really.
I can't be using a small 1440p screen.
I just pipe it through my 4k tv and enjoy all the extra pixels on a 50" screen and actually decent HDR.
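Viewing distance folds into the same math as angular pixel density; a sketch where the ~60 pixels-per-degree mark is the commonly quoted 20/20-vision threshold, not a hard rule:

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """How many pixels span one degree of your field of view at a given distance."""
    return ppi * distance_in * math.tan(math.radians(1))

# The same ~109 PPI 27-inch 1440p panel at two desk distances:
print(f"{pixels_per_degree(109, 16):.0f} ppd at 16 in")  # ~30 ppd -> pixel structure visible
print(f"{pixels_per_degree(109, 30):.0f} ppd at 30 in")  # ~57 ppd -> close to the ~60 ppd threshold
```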
I think the question is more along the lines of "is the jump worth the hassle?", because nobody even bothers pondering whether 1080p is better than 720p, or 1440p better than 1080p; no one would go backwards in those two cases. But 4K, even 10 years after going kinda mainstream, isn't regarded that way, and as you can see it's pretty normal to "downgrade" to 1440p, or to not mind having either, unless it's for a big productivity/multitasking monitor.
I was on 4K for about 2 years and went back, but that was before stuff like DLSS and FSR. Now I want to go 4K again, because it's sooo good and you can get 240Hz monitors; when I had mine, it was only 60Hz.
Literally the only thing that made me go with 1440p is the non-existence of a good-quality high-refresh 4K ultrawide (or 5K2K, apparently; god I hate marketing). Once a 4K UW can do everything my current 1440p UW can do (that is, 160Hz, OLED, HDR1000, etc.), I will buy one.
Performance impact of 4K is just too high on PC, especially on the relatively small PC screens (27"). And since 27" is a perfectly acceptable size for PC users, 1440p becomes a perfectly acceptable resolution.
This is one of the unspoken benefits of PCs over consoles. Consoles, being connected to much larger screens, need to target higher resolutions.
@@fcukugimmeausername That was before stuff like DLSS and FSR came out; now you can upscale your games to 4K from 1440p and it looks almost the same as native 4K 😅 Plus 32" monitors exist, and they are awesome.
@@adi6293 I agree... but 1440p upscaled to 4k is still 1440p.
Going from 1440p to 4K was a much bigger upgrade than 1080p to 1440p for me. It's the first time I can't see the pixels or dot pitch; I can't go back to lower resolutions now. You have to use upscaling, but it actually looks good, unlike at 1440p.
Depends on the display's PPI…
Ever wonder why smartphones have the cleanest displays? Higher PPI than the overpriced stuff on the market.
@@christophervanzetta Yes: 25" to 27" to 28", so each time I'd go to a noticeably higher PPI. That's the idea.
I switched my PS5 from automatic to 1440p with HDR, plus ray tracing mode, and it looks identical to 4K in my opinion. I'm playing on a 4K 65 inch.
@@LosoBanks_332 Depends on your viewing distance; as you are sitting further back from a larger screen, the difference won't be a lot. Also, 2K on a 2K monitor will look better than 2K displayed on a 4K monitor.
1440 looks great though
I'm very happy with 1440p gaming, even though my PC would easily support 4K gaming. I would always favour better-looking games over higher resolution.
Same mate.
There is literally no reason you can't have both. That's one of the whole points of PC gaming. Sticking to 1440p is not making games 'look better'; developers are not designing visuals around 1440p or anything like that. :/
I have a 5800X3D paired with a 7800 XT. My current plan: I already have a 1440p 144Hz monitor, so that one I use for FPS or competitive PC games. When I play a story game, for example GOW 2018 right now, I take a 10m HDMI 2.1 cable and plug it into my 55 inch 4K 60Hz TV, because honestly I won't get much more than 60 fps at this resolution in most games anyway. I also have a PS5, but it's honestly just for the exclusive Sony games.
GOW I could push to around 80 FPS at 4K if I want, but it's older, and I don't have the money to upgrade the TV.
I have those same specs with 32GB of DDR4 RAM. I started with a 6650 XT and a 5600X3D and noticed that the 8GB card was kind of holding me back in gaming.
I'm the same, but with a 2070 Super and a 3-meter HDMI cable to a 4K 120Hz VRR OLED TV. Debating a 4070 Ti Super, especially with RTX Video and HDR.
@AnomieDomine I wanted to go with an Nvidia build, but since I already had an AMD card and I'm not too keen on ray tracing, I went with the 7800 XT. That, and the fact my TV supports FSR.
@@DarkJustin87 how does a TV support FSR?
I’ve been rocking a 26.5” 1440P 360Hz QD-OLED for a few months now and it’s been heaven on earth.
I was recently looking to upgrade from a 27” BenQ 2560x1440 QHD monitor to a 32” 3840x2160 4K UHD one to be driven by my RX 6950 XT. However, during my search I came across a 34” Samsung Odyssey OLED G8 with a res of 3440x1440 WQHD, and it is by far the best monitor purchase I've ever made. Not only does WQHD look great, but the 21:9 aspect ratio provides an immersive cinematic experience with a sharp image, while the GPU is not being overwhelmed like it would be pushing 4K. And the OLED display takes colors to another level. From my own experience, I believe 1440p is the best way to go for monitors. It's certainly the sweet spot, providing a sharp image without severely compromising performance. With most GPUs supporting AI upscaling techniques, performance should get even better. Bottom line: no one needs a 4090 to enjoy high-quality gaming.
What monitor should I get with a 4080 Super + 7800X3D? I need a screen bigger than 32 inches.
How much bigger we are talking and why? For you, an LG C3 at 42" would be a good fit, if you are really looking at bigger than 32 inches.
Rocking a 32" 4K 144Hz monitor for my gaming PC and my PS5. 1440p at 60fps+ is where it's at. I'm not really planning on gaming at 4K unless it's an older title, or a fighting game/RPG where 60fps is all I need.
To be honest, the big jump is when you go from 1080p to 1440p; it's a very considerable increase in image quality. From 1440p to 4K there honestly isn't as big a jump, or at least it's less noticeable.
1440p output on a 4K monitor is about the same as 1440p native, and anything past that (DLSS/FSR, 80% render scale, or even native) is nothing but upside; on top of that, you also get a much better desktop experience. Budget is really the only justification for 1440p, IMHO.
Which is silly, because 4K monitors are super affordable nowadays.
I have both a 28” 4k144hz and a 27” 1440p144hz.
1440p looks OK, but once you've looked at a high-PPI 4K screen, 1440p looks blurry.
The only time I play on the 1440p monitor is when I can't get the fps I want on the 4K monitor, which isn't often (I have a 4070 Ti Super).
Any idea why Rich would say 4k 27in pixel density is too high? Didn't think there was such a thing and was looking at one for my next monitor
@@yc_030 Yeah, that one didn't make much sense to me either.
Maybe it's a per-person type of thing, and some people can't really tell a difference once you go over a certain threshold? Like people with less-than-average eyesight?
Either way, I can definitely appreciate the difference.
@@kapler8550 I actually ended up looking it up: the PPI of a 27in 4K monitor is 163 and my phone is 460 PPI, so I'm not sure what he was talking about there.
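Those figures check out: PPI is just the pixel diagonal divided by the physical diagonal. A quick sketch (the phone panel numbers are an assumption for illustration; ~460 PPI is typical of a recent ~6-inch flagship):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel diagonal over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))   # 163 -- 27" 4K monitor
print(round(ppi(2560, 1440, 27)))   # 109 -- 27" 1440p monitor
print(round(ppi(2556, 1179, 6.1)))  # ~461 -- a ~6" phone panel
```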
1440P is the great new "workhorse" resolution for me. Perfect balance between performance and quality. Cheaper monitors as well. Use DLSS quality mode in many games and it looks great.
I think it also depends on the use case, as in whether you also use the monitor for more than just gaming. I use mine for office use 80% of the time, reading a lot of papers. And there I can definitely tell the difference of 4k vs 1440p on a 27 inch monitor.
Yep, my 32 inch 4K monitor gets mostly used for work these days 🤣🤣🤣🤦 Couldn't go back to 1440p just for that reason. DLSS modes from Performance up to Quality look fine at 4K, so 🤷. You do need a 7900 XTX or 4080-class card though, IMHO, especially if a higher refresh rate is wanted.
I upgraded from an RTX 3080 with an LG UltraGear 1440p @ 180Hz to an Asus 4090 OC with an Asus PG32UCDM OLED 4K @ 240Hz.
No regrets.
I'm gonna play single-player titles on a 4070. Should I play at 1440p or 4K? Please answer.
Got to be honest, I have been on 1080p for years, still on my 144Hz LCD monitor, though I am tempted to get a high-refresh IPS or some other panel with better color and contrast.
I had used my brother's old 32'' 1080p TV as a gaming monitor for a few years lol. Last year I bought a 4K one (though only 60Hz; ~120Hz would be a great improvement), and 1080p vs 2160p is like bad eyesight vs hawk eyes. The detail in everything you see is just super worth it.
The required resolution depends a lot on whether you take in the whole screen or concentrate on just parts of it.
If your game makes you watch the complete screen, I am still fine with 1080p.
If you need to look around and concentrate on small portions, 4K is the way to go.
With refresh rates there is something similar. If the game does not blur motion and you have fast movement of small objects, refresh rates up to 120Hz can make sense.
If the motion is blurred and the game is not about small objects moving fast, even 30Hz can be sufficient.
The next question is whether you are playing competitively or just for fun. Playing competitively, you will typically reduce rendering quality, because all the shadows, reflections, and effects make it harder to see what you actually need. If you are playing for fun, however, the beauty and the emotions carried are better with more detail.
There is never a "one size fits all" answer.
The C2 42" OLED 4K is the chef's kiss for me.
C4 now has 144hz.
In tons of games (Cyberpunk, TLOU, RDR2, etc.), 4K DLSS Performance (1080p internal) will look a LOT better than 1440p native, because those engines use lower LODs when the output resolution is below 4K. Unreal Engine doesn't seem to suffer from this, nor do the Nixxes ports.
If you had to use 4K DLSS Performance (1080p internal) to play the game, what makes you think you can play it at 1440p native? Seems like weird logic. If we're comparing 4K DLSS Quality (1440p internal) with native 1440p, maybe that makes more sense. Then again, if you were willing to play with upscaling in the first place, why not go with the higher-resolution display? Especially for older games, you can play at much higher fidelity than you would get otherwise.
@@_ch1pset DLSS and other advanced upscaling techniques are not "free"; they have a compute cost. In my experience, in terms of average FPS, 4K DLSS Performance is largely comparable to 1440p native, and slightly more demanding in VRAM-limited scenarios.
@@leomariano2735 You're right, it's not "free", but everything you said reinforces what I said. What makes you think you can run 1440p native if you struggle to run DLSS Performance to 4K? If they are the same load on the GPU and/or VRAM, you'll struggle with both. The logic still holds; idk what your point is.
Likely because UE has been pushing upscaling for a very long time, with TAAU and now TSR in UE5. So they decoupled render resolution from LOD range a while ago.
@@_ch1pset I have a 4060 laptop; in my experience, 1440p native fps equals 4K DLSS Performance fps in most games. In some games 1440p native gives me more fps.
Made the switch to 1440p years ago when Red Dead 2 came out and the TAA blur was horrendous; it didn't cost too much performance-wise for a very noticeable boost in image quality.
RDR2 has issues with DLSS making the image way too sharp. There's a mod on Nexus Mods to fix it, and it makes the game look 1000 times better. It's a must-have; even DLSS Balanced on a 4K display looks astonishing when you use it.
Keep in mind, sometimes 2K monitors will have better colors and contrast than 4K ones, and 4K isn't necessary until 32 inches or above, especially if all you're doing is gaming. I also would choose visual fidelity over any refresh rate above 240Hz. You can tell the difference between 60Hz and 120Hz, but you can't really see much difference between 120 and 240, and even less above that; sometimes, though, the higher-Hz monitors also have better response times.
Switching to OLED was the best decision I made, more so than any resolution. I went for the LG oled 240hz for pc and ps5, it’s been incredible 😊
Probably the same panel my Asus uses. I already had a high end IPS monitor, but the switch to OLED really was a drastic visual improvement. It's amazing just how much you lose out on when you have a backlight destroying the details on anything with contrast. I'd recommend a switch to OLED over a GPU upgrade for most people if they had to choose which to do first.
I want a panel that can last me 10 years or so, so OLED is a no-go. I bought an expensive plasma TV a long time ago and it really soured me on the experience: no peace of mind, annoying image retention, worries about long-term burn-in, high power/heat, etc. Don't want to go through any of that again.
@@maynardburger Go read up on the Rtings OLED test; they have a YouTube channel simulating thousands of hours on modern OLEDs, including monitors.
The tech has moved on, and the panels are a lot better at mitigating burn-in.
10 years is a hell of a long time though, I can't say I expect any device I own to last 10 years, or be relevant that long. Maybe my laptop, but even then I'd probably choose to upgrade around year 6-7.
I think 5 years for an OLED with good mitigation is actually doable though. Expecting 10 years of top end reliability seems delusional to me.
@@WildfireX So you wanna pay $800+ for tech that lasts only 5 years? And 10 years is not a hell of a long time; my Samsung FHD TV is 12 years old and still running. That's why greedy corporations LOVE people like you: they get to sell you something every 5 years 🤣
I’m tryna decide what monitor to go for as I’m planning to build my first PC and I want to get a Mac Studio as well.
I want to game, stream, video editing/YT, watching content and everyday usage. I’ve been using a 1080p AIO “PC” since 2015, it’s a 27 inch curved Samsung and honestly I think it’s okay even to this day… I also have a 5K iMac 2020 and it looks gorgeous.
My PC build I’ve decided on is 13600K w/ 4070 TI Super. I think that can run 4K 240hz and the Mac Studio can also do 4K 240hz with HDMI 2.1…
Should I get a 4K 240hz or a 1440p 240hz instead?
If you have the budget for an OLED, the 27 inch LG UltraGear is a pretty solid option. Alternatively, a brand called AOC has a new OLED out.
1440p is the safe bet. The sweet spot. But I definitely think 4k is a bit demonised at the moment. It requires a bit more compromise, but you can get surprisingly decent performance in 4k with less than 4080 or 4090 level GPUs.
I sim race and play games at 4K (3840x2160). Games look beautiful. 55'' OLED monitor.
Any 4K screen looks good.
It's all about screen size and how far away you sit. 1440p is perfect up to 27 inch monitors imo; at 32 inches it just looks a little too pixelated at normal desk viewing distance. You get significantly better performance for not much clarity loss.
4K basically scales perfectly up to even 88-in TVs considering you sit so much further away from them.
4k also makes a clear difference on projectors.
I have a 4090 and just switched the monitor from a 27" 1440p 144Hz to a 32" 4K 144Hz, and I am not going back to 2K. It is absolutely worth it.
4k is actually 2k lol
I went with 1440p monitor for my new PC, but for my PS5 I use a 4K TV. 1440p just made sense for PC, like a sweetspot between perf and quality.
Plug your PC into the TV; it will look better if your GPU is powerful enough.
If you have a strong GPU, use the PS5 on the monitor and the PC on the TV because the higher PPI will make PS5 look sharper (since it doesn't reach 4K on 98% of games) and you will be able to use your PC's full potential on the TV.
What monitor should I get with a 7800X3D & 7900XTX build
The one you like the most!
With modern game poly counts, 1440p is a sweet spot. In the future, as poly counts get higher, I can see 4K gaining more popularity, but there's not much point going 4K in a situation where even the smallest polygons are like 10 pixels wide.
This is complete nonsense. I have no idea where you're getting this notion from, lol. 4K is always better, simple as that, and polygon count is not going to change it. Aliasing, even in micro-detail or sub-pixel detail, is better resolved at 4K, and there's already plenty of that. You don't seem to realize that distance is a thing: you're not viewing every object at arm's length, and lots of smaller objects in the distance are already in the 'subpixel' region.
@@maynardburger It isn't complete nonsense; it's just that you are not able to comprehend what I'm saying. It would be better if you tried harder, though.
1. I never said 1440p == 4K. I said 1440p is the sweet spot. Sure, 4K contains minor extra detail, but not enough to justify it for most people, not at current polygon densities. Poly density has a direct effect on perceived resolution quality: look at a flat white image and you won't notice whether it is 240p or 8K, because there are only 2 polygons in a flat white image.
2. Aliasing artifacts that are resolved by rendering at 4K can also be resolved at 1440p. Just use DSR 2.25x AI mode on 1440p.
1440p on a 16" mini LED 240Hz 4080 laptop works wonderfully for me. 1440p on a 16" display looks so sharp. You can't see the individual pixels, so you get the look and feel of 4K, with the performance of 1440p. I love it.
I have two monitors. One is 1440p, and one is 4K. I initially didn't think 4K was worth it even for my 4090 until… I discovered integer scaling. 1080p is a perfect integer fit for 4K, which means great clarity on a 4K monitor at awesome performance. And apps like Lossless Scaling make 4K monitors very versatile.
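Integer scaling works because UHD is exactly twice 1080p on each axis, so every source pixel maps to a clean 2x2 block with no interpolation blur. A toy sketch of the mapping (what the GPU's scaler does in principle, not its actual implementation):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale: each source pixel becomes a
    factor x factor block, so edges stay perfectly crisp."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy frame
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3) -- 1080p fits 4K exactly
```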
Like many who have tried 21:9 monitors, I now only consider those. It greatly enhances your gaming experience.
I can tell the difference between native 4K and 1440p if I'm close to the screen. I can't, however, tell the difference between upscaling to 4K and native 1440p even up close; things that are far away in-game might look pixelated, but they're going to look like that no matter the resolution.
dlss quality @4K looks great
I went from playing on a 4K LG OLED TV to the new MSI 1440p OLED monitor. I currently have a 4070 Ti Super as my GPU, and honestly speaking the 4K TV does look better, but the 1440p monitor is no slouch; the OLED really gives you beautiful colors and a clear image that is close to 4K. What sold me was the added advantage of running everything on higher settings with much higher fps output at 1440p. The monitor is 360Hz and smooth as butter.
On a 4K 65" Samsung QN90B, I'm playing Dragon's Dogma 2 at 4K with DLSS3 and FG on an RTX 4080 between 70-130fps, at max visual detail. Yes, that does include some frame drops and stutter because something's up with their engine, but most modern titles I'm able to keep a steady framerate and squeeze towards 120fps on this TV.
If you plan on doing PC gaming on a TV in a living room setup, 4K is the way to go, and it's totally worth getting a good GPU to drive it! The difference in details like small particles and textures is staggering with this kind of setup. With 27" monitors, 4K isn't going to get you much.
Lmao you saying 4K is the way to go while using DLSS3
@@ThePgR777 That's what DLSS 3 does... so you can play most of the latest games at 4K at 120fps, and the majority don't even notice the difference between native res and DLSS Performance mode.
@@ThePgR777 Well yes, that's the whole point. If I had a choice I'd play at native resolution without upscaling, if games were actually optimized these days. Devs use upscaling as a crutch to be lazy, but we make do with what we've got.
27 inch 4k looks a lot better than 27 inch 1440p even with dlss. You have to use dlaa at 1440p to get close. I'm running 4k off of a 4070 at the moment and it has been great. I still have my 1440p monitor as a secondary.
This is my dream monitor/TV. I almost bought it, but it was snapped up right in front of my nose and I'm still pissed.
Meanwhile me going to buy a 1080p monitor because that’s in my budget 😅
Me too, man. 4K is far, far away 😂
@@Mene0 4K monitors are incredibly affordable nowadays. Have y'all not looked at monitor prices in a long time or something? Heck, we're even starting to see 4K/144Hz monitors for like $400-450 now. A perfectly decent 4K/60Hz IPS display can be had for like $200.
4k gaming was a lie, glad im still gaming on my old 1080ptv. 4k tv owners punching the air right now
No point in buying a 4K monitor if you can't play at that resolution (even with DLSS).
People keep talking like 4K is the norm but the vast majority of gamers on PC are still playing at 1080p. That's fine.
I have an RTX 4070 and a 1440p monitor, and more often than not I have to use DLSS for modern games (Quality if I'm lucky, lower if there is some path tracing involved). I can barely play at my native resolution already.
4K is great if you have a 4090, but that's pretty much it.
I would have to disagree with Alex. I have a 165Hz 1440p and a 4K 60Hz monitor in my setup, and like 90% of the time I prefer to game on the 4K. I feel like the only use case for the 1440p, especially since just about every game now relies on upscaling, is games where fast motion is preferred, such as FPSes, but even then I prefer the 4K setup for most non-esports games.
It’s all about size. 31.5” 1440P is not great, but a 26.5” 1440P monitor really is.
Screen size and viewing distance are everything. The conversation is meaningless without factoring in both variables in every case.
There aren't gonna be many situations with modern monitors where 4k is 'too much'.
I love 1440p
The perfect balance between picture quality and frame rate (and to me, frame rate takes priority). I went from the OMEN 27i to this LG OLED 1440p ultrawide and it's amazing! Some say the pixel count is low and it's possible to see individual pixels, but I can only see them with my face planted in the screen... which isn't my normal viewing distance. Rocking a 6950 XT, and it's nice to turn on high settings and still get pretty high frames.
It seems like you'd have to pick and choose between quality and frame rate more often with 4K, but that's just my opinion.
I think we are missing another point: it also matters what kind of games you play. For example, in competitive first-person shooters, the higher frame rates needed to keep up with the speed of the action matter more than beautiful textures, so solid 1440p with high frames matters more than 4K with lower frames, especially if you're turning ray tracing on with everything maxed and only getting 50-60 fps. However, if you're playing solo/story RPGs offline, like Dragon's Dogma, Baldur's Gate, or God of War, I think you'd want 4K to maximize the experience and immersion with ray tracing turned on. The action won't be so fast that you can't respond or be competitive, so fps doesn't matter as long as it's playable.
Haha, funny that you made this video; a few days ago I switched from my 1440p monitor to 4K (both LG IPS 27 inch), and even though the 4K is 60Hz, it's much better, very noticeable. And I have a 4070, and even Dragon's Dogma 2 easily reaches 60fps, so...
The 4070 is very capable at 4K. I'm still able to play a lot of modern titles over 100 FPS with DLSS Quality, which looks great at 4K, and can run most 5+ year old titles at native with high FPS.
Really? You're saying 4K on a 27 inch display is so noticeable that it made you choose 4K at 60fps over 1440p at 100+ fps?
@@crazy-nephew_5857 If I'm not playing competitive FPS, yes. 4K is not only resolution clarity; it looks like everything on the screen is right there. I was playing Horizon and DD2 on both displays and I much prefer 4K 60fps.
@@mojojojo6292 Correct, it's very capable, and I'm not even using frame gen. I was watching a guy benchmark Horizon FW on a 4090 and he was only getting 20-25 more fps at 4K.
Probably next year we'll get a 4070-class GPU capable of running 4K at 100+ fps, or at least something similar to the 4080 Super. But for that you have to buy a 4K TV or monitor capable of a high refresh rate.
1440p DLSS Performance has the same base render resolution as 1080p DLSS Quality, and looks significantly better. It's a nice sweet spot and what I built mine around.
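That claim checks out if you run the numbers with the commonly published DLSS per-axis scale factors (individual games can override these, so treat them as nominal):

```python
# Nominal per-axis render scales for common DLSS modes (games may override).
DLSS_SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(w: int, h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALES[mode]
    return round(w * s), round(h * s)

print(internal_res(2560, 1440, "performance"))  # (1280, 720)
print(internal_res(1920, 1080, "quality"))      # (1280, 720) -- same base res
print(internal_res(3840, 2160, "performance"))  # (1920, 1080)
print(internal_res(2560, 1440, "quality"))      # (1707, 960)
```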
Depends on your budget. I've got a 4090/i9 build and am currently rocking a first-generation QD-OLED, the AW3423DW from Dell. It's a 1440p monitor, an amazing display, and for the size it has good PPI; the density is good enough for me, but I am considering the Asus ROG PG32UCDM, a 4K QD-OLED, later this year when it becomes available in my country and when/if the 5090 arrives. The PPI is just not high enough to fully satisfy me, although this particular monitor strikes a great balance. So again, the answer depends on your budget and wants/needs. If your case matches mine, you might be in the same boat; however, let's face it, we are a minority, and for most people 1440p is an absolutely great resolution. Screen size x viewing distance x PPI x your hardware. I think if you have at least a 4080 you can go 4K no problem, but if you are like me and want the best possible experience and are willing to pay for it, then it's going to be the 5090 enabling 4K with truly high frame rates, and if you pair it with OLED you get it all.
My situation is the same: I have a 7800X3D with a 4090 and I play on an Alienware 3423DWF. I'm considering changing to a 32" 4K 240, but I'm not sure about ditching the ultrawide for 16:9.
24", 1080p, 144hz free sync.
Cheap, looks good and fast. Also my TV is 1080p and I frequently use it with my PC.
What about aliasing? Is it still noticeable at 1440p? It's what bothers me the most on my current 1080p monitor, especially how lots of games look blurry whenever I move the camera.
If you're talking about motion blur, it's a technique game developers use to compensate for a stuttering image (due to an insufficient refresh rate) when you move the camera around quickly. You can disable blur in most games. A monitor with a higher refresh rate is what you need then. Refresh rate (Hz) and resolution (1080p, 1440p, 2160p) are two different things.
Just bought an LG 32 inch UltraGear 4K monitor, coming from a 27 inch 1440p. I'm on an RTX 3080 and a Ryzen 9 3900X; I can't hit 4K 60 in every game, but I can't deny the clarity of the picture is beautiful. Luckily most of the games I play aren't that demanding, or the demanding ones have DLSS. Loving my new monitor.
I personally decided to get a 4K monitor over a 1440p one not because of gaming but to watch movies and TV shows in 4K. Most of my gaming never hits 4K and sits at 1080-1440p, but that doesn't mean you should rule out a 4K monitor. It's also nice to future-proof yourself.
I have an RTX 4080 super and recently upgraded from 1440p to 4K monitor. I can say it's definitely worth it. With DLSS and Frame gen tech, you can easily hit above 60 fps in 4K max settings.
I have a 27" 4k monitor, and from my personal experience, there's no difference between the two at native resolutions, playing with upscaling technology however you can definitely see the difference.
You might want to consider getting glasses if you can't see a difference between the two. Granted, I don't think it's as noticeable as the jump from 1080p to 1440p, but it's still pretty big.
Because 4K brings out more detail at 32”…
It's cause 27" is too small to tell.
@@via_negativa6183 Why is 27" too small to tell? A 27" monitor 2 feet from your face on a desk is huge!!
I'm still on 1080p and fine with it. Had a 4K TV for a while and was quite underwhelmed; I was expecting something amazing, but it didn't blow me away at all. Sold the TV and went back to a 1080p monitor with my PC. I have no desire to upgrade at all.
You really should at least jump to 1440p it's a nice visual upgrade without the massive performance hit.
Well, then go to the Nvidia Control Panel if you have an Nvidia GPU and enable DLDSR for 1440p and 1620p; it will make your image look a lot sharper and will remove 80% of your jaggies. It doesn't look as good as native, but it is a huge jump from native 1080p on a 1080p screen. Just don't use it on the desktop, only inside games, or your image will become blurry. Also check that you have scaling set to off, or your image will look zoomed in in some games.
A 1080p monitor will blow away a 4K TV, because the TV image is being blown up in size. Get a 4K monitor and it will be loads better than 1080p and 1440p.
@@ChrisDaytrader things the 'PC master race' will say 😂😂😂.
I'm actually thinking of doing both: a 1440p screen for shooters and a 4K TV for games like God of War, Jedi Survivor, Assassin's Creed... My guess is a 4080 Super could probably handle that load.
massive difference.
I have said before that 1440p or 4K will look absolutely glorious on its own; either resolution will be amazing on any OLED. BUT put both displays next to each other and of course you will see that 4K is vastly superior. The key thing is having them side by side, because honestly your eyes will get used to whatever resolution and monitor size after a couple of hours.
If you're strictly gaming, I would settle for 1440p. In my case, however, I also watch a lot of content and do a lot of productivity, so the better image quality and crisp text are a must for me. I don't mind using upscaling if needed.
I'm still gaming on my old 1080p TV because I'm a console gamer and most games are either 1440p 30fps or 1080p 60fps, so there's no reason for me to upgrade.
I have a 3440x1440 PG35V and I prefer to play my games at 5K; the latency increase is the only asterisk, for FPS games. I am looking forward to the future 4K OLED monitors with higher frame rates (especially at 1080p).
I have the same dilemma, although I am leaning heavily to 4K 144hz+ now for several reasons, even considering I am a midrange buyer:
1. I have had a 1440p 144hz monitor since 2016, and I don't really see the need for 240 hz being primarily a single player gamer (I probably would have stayed on this monitor if not for getting tired of being locked into G-Sync only due to it being a pre G-Sync Compatible era monitor and the aging color quality vs new models).
2. Upscaling being quite good even in Balanced and Performance modes if the target is 4K.
3. If worst comes to worst (like Alan Wake 2), there is the option of integer scaling from 1080p while retaining all the eye candy, even full RT.
4. Better compatibility with the PS5.
5. Nicer looking text is a treat for me as a Software Engineer looking 8h a day at text.
DLSS Performance at 4K renders at 1080p internally. Better than integer scaling, if the game supports it.
My next display is going to be a 4K OLED 360Hz HDR10 Ultra. So I hope the prices on GPUs go down soon, because I'm going to need a monster GPU to carry the load.
Depends on your monitor size tbh, I have a 27" 1440p screen and some games look crap in comparison when in 1080p, and the difference between 1440p and 4K is bigger than 1440p VS 1080p
But if you`re on a 5" smartphone, the difference between 720p and 1080p is hardly noticeable at all.
At some point the pixel density just becomes pointless, and when it comes to text, it can be straight up detrimental.
Screen size, viewing distance and settings make a big difference, so if I'm gaming on a monitor 2 feet from my face, I'm not using the same settings as when I'm gaming on a TV that's 7 feet away.
It all varies based on settings and circumstance.
Was going to say the same thing, size is such a big factor. I use my 75" 4K TV as my monitor. 1440p is not bad on it, but does 4K look much better for most games? Yeah.
@@dece870717 Wrong, it doesn't; it's a medium difference.
27" 1440p
32" 4k. That's how i would approach it. 32" 1440p but you do lose ppi
@@dece870717 I did upscaling multiple times in PC games and in emulators, and it was always the same: a medium jump.
Even videos show it's not a big jump, and pics show this too. I watched videos on my 4K TV, and like I said, the jump isn't big; it's medium.
People on forums also say the jump isn't that big in games, PC or console.
Text is where it shows a big difference from 1440p to 4K.
Unoptimised games struggle to run native 4K even with monster cards like a 3080-4090; that problem doesn't always happen at 1440p. I'm thinking of switching back to 1440p. I have been using a 4K screen since last year.
1440p I had used for about 6 years.
4K has been a frustration with unoptimised/demanding games: stutter and dropped fps even on high settings.
I use a 4K OLED LG 65" TV hooked up to my EVGA3090 and its the most beautiful images ive ever seen in gaming.
It's the oled not the 4k resolution.
@@4evahodlingdoge226 you’re not wrong but with DLSS and different resolutions you get better results in 4K. In some cases it’s tough to distinguish 2160vs1440.
I made a custom resolution of 2140x1200 for my 55 inch TV and set sharpening to 20%.
It looks perfect to me, and the performance is pretty much the same as gaming at 1080p.
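Rasterisation cost scales roughly with pixel count, which is why a custom resolution like that behaves almost like 1080p. A quick sanity check (the proportional-cost assumption is a rough approximation; shading and CPU costs don't all scale with resolution):

```python
def pixel_ratio(a: tuple[int, int], b: tuple[int, int]) -> float:
    """Ratio of total pixels between two resolutions."""
    return (a[0] * a[1]) / (b[0] * b[1])

print(f"{pixel_ratio((2140, 1200), (1920, 1080)):.2f}")  # ~1.24x 1080p's pixels
print(f"{pixel_ratio((3840, 2160), (2140, 1200)):.2f}")  # 4K is ~3.23x more
```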
Personally, I prefer max settings at high-refresh ultrawide 1440p over 4K/60. 144+ fps ain't always easy these days at 4K without fiddling with settings.
Had a 32in 1440p and went to 32in 4K; yes, it's definitely a step up. The real lifesaver is DLSS. 4K and DLSS is a marriage made in heaven.
In the market for a new monitor after a new build (7800X3D paired with an Aorus Master 4080 Super), I was going to go with the Asus PG34WCDM 1440p ultrawide OLED, but that's nearly £400 more than the Alienware AW3225QF 4K QD-OLED. For me it's a no-brainer given the cost saving.
Don’t forget about the Alienware AW2725DF either. It’s amazing!
@@andrewmorris3479 Agreed, but I mainly play story-driven games rather than multiplayer, so for me the bigger screen makes more sense. Even at 27", if multiplayer/competitive gaming is your day-to-day gaming experience, you'll want frames over anything, so 1440p is probably where you'd want to be.
To me, visually stunning titles like Cyberpunk, Alan Wake 2, Avatar, and the like are really great at 4K if you have a GPU that can handle them. I have a 4080 so I'm often going back to and forth between using DLSS to get above 60fps or just playing at 30fps natively. Neither situation bothers me because the experience is enjoyable regardless. However, HDR10 should be a factor in choosing a display monitor because it gives the game a pop of color to make them more vibrant.
HDR is still generally too expensive in the monitor space, at least to get any kind of decent HDR experience; it still requires a lot of specialty technology to do right. If you're in the…
@@maynardburger actually it's not as expensive as you think. If you get a VA, you could save some money. In January, I bought a refurbished LG 32" 4K 60hz Monitor (VA) with HDR10 for $184 including tax and a 1 year protection plan. I know that IPS is superior in the monitor space, but games look gorgeous on there. That's why I always advocate for HDR10 as opposed to HDR400 which is the most common for monitors. Also, it would allow you to use current Gen systems without a dropoff in quality. I'm not suggesting to get a 4K monitor because I do believe that 1440p is the sweet spot. I'm just saying that I would highly consider getting a display that has HDR10 regardless of resolution.
I have a LG C2 and a 4090. It's stunning
I feel like 4K 60fps ultra settings for non-xx90-tier GPUs is still a generation or two away. When I got my 3080, I was torn between 4K and 1440p and almost went with 4K because, in my head, I had one of the most powerful GPUs of the generation, but I'm actually really glad I went with 1440p. I can crank settings and get over 60 fps in almost any modern AAA game. Throw in DLSS and I'm getting close to high-refresh experiences with maxed settings. I just don't think that would be possible on a 3080, and maybe not even a 4080. The 4090 cruises through it, of course, but it's harder to justify such an expensive part. Maybe the 5080 will offer near-4090 performance at a price tier below.
If my card is hitting 90-100+ @4K (which it does), then I'm not gaming in 1440p just to say I'm getting 240fps. That's crazy for anyone that's not playing competitive CS or something like that. Not only that, but I have streaming boxes and things hooked up to said 4K monitors that benefit from 4K, HDR, Dolby Vision, etc.
For those uninterested in those things, 1440p is a better bargain.
I've played GTA V on the Legion Go on ultra at native resolution, and the sharpness and clarity are crazy; I've never experienced that quality before and I like it, but I will still go back to 720p.
I'm perfectly happy with 1080p at 144Hz. I prefer high FPS over pixel count.
1440p vs 1080p, the advantage used to be the extra screen real estate, so you could see more in FPS games. I'm not really sure the same still applies; it could actually be worse now because of the screen aspect ratio?
Pixel density does wonders for how sharp an image looks. For example, my 1080p 15 inch laptop looks impressively sharp because it's so pixel-dense; a 24 inch 1080p monitor looks like crap in comparison because of the lower density.
For a 4K screen to match the pixel density of a 27 inch 1440p one, you'd need over 40 inches at a 16:9 ratio, so anything under that gains sharpness. A 32 inch 4K screen will be very, very noticeably clearer and give you a bigger monitor overall.
DLSS Performance at 4K also has an internal resolution over 100p higher than 1440p Quality. That said, you need like a 4080 minimum to be sure you can run every single existing game well. You can push it a bit with a 4070 Ti Super, but you'll be cranking down settings in the harshest games, like AW2.
I'll also say that if you're looking to use Ray Reconstruction in any game, it is unbelievably smeary at 1440p DLSS Quality. You need at least 4K output res for Ray Reconstruction to look good atm.
1440p seems to be the sweetspot. However most gamers use 1080p especially for competitive shooters where high fps is critical.
What should I get with a 7900xt? 1440p or 4k? The XT doesn’t have DLSS, sadly… :/
It just matters what you do with it.
"That what she said".
If doing lots of gaming, go with 1080p or 1440p depending on pc specs. Go for 4k if not gaming much or just watching other media like movies.
Or you just buy a high-end GPU 🤷. The tiering of GPUs is really clear now: a 4070 Ti/7900 XT is the most you need for a really great 1440p experience. At 32 inches and up, however, 4K is the better res for the monitor size, which means a 7900 XTX, 4080, or 4090 is needed.
Of course it makes a difference. It makes a massive difference when the viewing distance is so small. Even 4K is unfortunately too low-res for 27 or 32. Apple got it right with 5K for 27 which is over 200 PPI with an effective resolution of 1440p and 6K for 32. However, we don't really have the standards to transfer that amount of data with higher refresh rates + HDR.
Unfortunately most people seem to be blind these days, because they think 1440p isn't a blurry mess. Imagine if we hadn't had people like DF pushing 1440p all those years ago; would we have had reasonably priced 4K monitors and GPUs to run them? I'm sure Nvidia was thanking God for this hive mind, seeing as now they can make so much more money.
@@OG-Jakey 🤦 You can't cheat physics. 4K is a lot of pixels to render. We've only just got to the point, with a $1600 GPU, where you can genuinely run native high-refresh 4K rasterised content... until you turn on RT or PT 🤣🤣🤣 then it's DLSS time.
@@OG-Jakey Is it that most people are blind, or that your eyes are too good? 4K is great for 27 inches, even though most would disagree; this guy, however, is saying it's too low a res for that size lmao.
Maybe for text clarity 5-6K is necessary, but for gaming you can forget about it. The 4090 is starting to struggle in certain games at 4K maxed out with RT/PT.
Also, viewing distance matters.
Easy. As long as your display is more than 28 inches: 1440p.
Anything more than 32 inches: 4K.
Anything more than 42 inches: 8K.
I got a 4090 and a 1440p monitor. It gives me great performance even in AAA games, and if I want 4K I just plug it into my 4K 120Hz TV. Best of both worlds.
Because of DLSS and FSR, it doesn't really matter whether you pick a 1440p or a 4K monitor nowadays.
That's what it was meant for... all this upscaling is for resolutions above 1080p, as long as you have at least 60 frames.
It absolutely does, lol. DLSS at 4K produces much better quality than at 1440p at basically every level except Ultra Performance. 1440p makes no sense anymore.
@@omarcomming722 Not all games support DLSS.
@@omarcomming722 exactly...that's what it was designed to do. Lol
@@omarcomming722 Agreed I'm using a 4070Ti Super at 4K and using DLSS Quality or Balanced basically looks like native and Performance looks really good too. Higher res you go the bigger benefit you get from upscaling tech and the better it looks. If you have a decent Nvidia GPU and plan to use DLSS a lot I think it makes more sense to go 4K vs 1440p nowadays.
I want to buy a 4K monitor because I want to watch 4K movies on my PC, and I want to play games at 1440p since my graphics card is an RTX 3080. Will 1440p gaming on a 4K monitor look noticeably bad compared to a native 1440p monitor?
4K is superior to 1440p. PC components just need to catch up in price for gamers to make the jump en masse.
But I doubt the jump from 4K to 8K will happen: the human eye doesn't upgrade along with the pixel count, and most people struggle to see the difference between 4K and 8K on a PC monitor.
After 4k I want smoothness. I'd much rather have 240hz 4k than 60hz 8k. 2880p is the next jump in resolution for me I think. Not making that jump until after 4k 360hz plus though. I'm gaming on the aw3225qf with a 4080 super and tons of titles are already in the 100+ frames. Older titles and esports titles do actually hit 4k 240hz right now. I want a 5090 on release to really push this monitor. I'm hoping that 4k 360hz or higher monitors are coming out around the time the 6090 is coming out.
Then it just depends on monitor size. If people are sitting up close to a 55 inch 4K monitor, then the jump to 8K would be noticeable.
@@Geekosification might as well stick your head in the Pc case then
It's a huge step in price from 1440p to 2160p if you are a native-resolution, zero-frame-gen guy like me. I'd rather have a high framerate with no distortion of the picture.
A lot depends on the anti-aliasing. Some games with really great anti-aliasing can look great on a 1440p monitor; others still let you see individual pixels and jaggies at 4K. Running those games at 8K and downscaling to 4K gives a big improvement in image quality, so I'd expect to easily see the difference if they ran on an 8K monitor. A better solution, though, is for developers to give their games decent anti-aliasing. Running Rocket League at 8K rather than 4K on my 4080 drops my fps from 500 to sub-100, so running 8K at the same frame rate as 4K today needs a multi-generation increase in GPU power. Better anti-aliasing would be a more sensible solution.
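Downscaling from a higher render resolution is supersampling: several rendered samples are averaged into each displayed pixel, which is what smooths the jaggies. A toy sketch using a plain box filter (drivers typically use smarter filters, e.g. DSR's Gaussian, so this is the idea rather than the exact method):

```python
import numpy as np

def box_downscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of samples into one output
    pixel -- the simplest form of supersampling anti-aliasing."""
    h, w, c = frame.shape
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3)).astype(frame.dtype)

render_8k = np.random.randint(0, 256, (4320, 7680, 3), dtype=np.uint8)
display_4k = box_downscale(render_8k, 2)
print(display_4k.shape)  # (2160, 3840, 3)
```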
Even with a 4090, 1440p is still the GOAT. You can run 1440p, ultra settings, and high frames (200+).
All the games I like to play work fine at 4K with a mid-tier GPU: Last Epoch, Diablo 4, Path of Exile.
Upscaling tech is fine but I'll never consider it more than a stopgap solution. In the end it'll have to be native 4k, no strings attached.
In Cyberpunk at 4K you see a lot more detail. The problem is that 4K currently requires significantly more performance.
1440p or 4K? I mean: 4090 = 4K; 4080 = 4K or 1440p.