I’m currently gaming on a 1440p ips monitor and I have the chance to get either the 1440p Msi oled or the msi 32” 4k oled. The 1440 I can get for £630 and the 32” 4k for around £960. I game in the Xbox series x so so Hz is really a factor as I’m maxed at 120hz. Thoughts?
Nonsense….most people that aren’t dumping thousands of dollars on video cards and monitors, should absolutely go with 1440 until 4k is the norm and you are able to find them in 27 in sizes too. I can’t sit at a desk staring at a TV size monitor inches away from my face while gaming.
@@ryanbrowning5586 true, who wants to see their ugly face staring right back at them right? And it was easy for the mirror to change from glossy into matte as I never ever clean it 🤓
I have a 32” 1440p display and I’m actually looking to downsize. It’s too much to handle unless your desk has more depth to where the monitor can sit further away from your eyes.
Personally went with dual monitor setup, 48" 4K/120 LG Oled for single player titles/content consumption + 27" 1440p/360 QD-OLED for kbm and faster paced titles + general workstation stuff. Total cost is only slightly higher than a single 32" 4K 240Hz monitor currently, if you can find a deal on 48" Oled. Best of both worlds
4k just makes the most sense. Fringing is less obvious when doing anything productive, DLSS that can satisfactorily fake higher resolution from less powerful cards and even then, with a modern gpu, you can play a huge library of older games comfortably in 4k. 240hz is plenty fast for the average person, and realistically the only reason to want 360hz is if winning at esports is your job. So a 27" 4k 240hz glossy OLED (with HDMI 2.1) would be an 'end game' monitor for a large majority of people.
@@mastersgamers747 downscaled to 2k its pretty much the same as a 2k display. I use the LG 4k 32in with 480hz 1080p but I usually run 4k even on a 6800xt...
forgot one HUGE diffrence and it drives me away from 4k being my main monitor. it's the size. 27" is the largest I could go with comp games so 32" would be a disadvantage at that point. I would have considered the LG "Dual-mode" if not for it's matte display
Why do you consider size to affect gameplay? I’ve heard some claim it takes up too much vision, but would you not push it back further? Perhaps your desk won’t allow this? Curious to know.
@@thedisplayguyits the same for me If the monitor is too big there is just to much space my eye has to cover and i cant go that much further back from my desk so i would love a 24 inch oled as crazy as it Sounds
@thedisplayguy too much information for the eyes fps shooters being close while still seeing the mini map without big eye twitchs is huge. Plus sitting further back surely won't help with eye tracking. Most of the very best players in the world want 24.5" as is but settle on 26.5 to gain 1440p.
Do you draw a paycheck from competitive gaming on a weekly basis? No? Then just get the 4K monitor. I built my first computer in 2013. 1440p was the standard back then. It's crazy that over a decade later and we're still at 1440p as the standard.
thinking of 1k hz is just insane. No triple a title will run that much of fps. We have years of advancements to achieve. We need something like crt for that clarity
A think the same as you. 4K 240, all the way. The only way i can possible think on 1440p monitor, 1) If your hardware cant do 4k 60 fps in "medium" presets of the games you play. 2) You ONLY play multiplayer games like fornite/apex/cs etc, fps games that have advantage of that 360hz and NEVER touching a AAA single player game. Rest of the cases, if your hardware can do more than 60 fps in "medium quality" in 4k, or you at least play 1 singleplayer AAA game a moth, thats it, you should go to 4k. Not only games are better in 4k, youtube videos in 4k look better, movies, overall operation system. Productivity work is better in 4k too. There are too much things better in 4k that 120hz of difference just cant match. Even p* videos are better in 4k and get no advantage of the 360 hz 🤣🤣 rofl
Don't forget that 32 inch is too big for competitive games. And also, what's the PPI of the 4k 32 inch vs the 1440p of a 27 inch? Edit: 108 PPI vs 137.68 PPI on 4k. 27.4% more PPI
32 inches is not “too big” for competitive games. Move the monitor slightly further back and it will have the exact same appearance in your field of view as a 24 inch. This is very very basic
@@Deifiable First, if you move the monitor so far, you won't notice the 4k difference vs 1440p. Second, there are 1680 players on prosettings's website. Not a single one of them uses anything above 27. 97% use a 24-25 Inch. Explain why?
@@Vandelay666 so your first argument is not even trying to say its worse, you’re just saying it’s not better. That’s a very different argument. I was never trying to say it was better because it was 4k. I was saying it wasn’t worse because it’s larger. Separate claims. Second, because all of the best features comes to the smaller monitors. The 540 and 360hz IPS panels with Dyac2 and ULMB2 are all on 24” monitors. Pros want those features, so of course they buy those. Again, this is Because of this nonsensical idea that 24” is somehow better than 32” pushed further back. Which is inaccurate. And you’ve provided no argument otherwise, except saying “32” isn’t noticeably higher res”, which doesn’t factor in at all.
I sit in front of 32" and play Quake Champions perfectly fine with it 🤷♂ And Quake is very competitive game. I couldnt even imagine going smaller than 30". 27" or even 24" would be very tiny lol. You get used to a bigger monitor pretty fast
@@Deifiable IPS is so slow with pixel response that they need overdrive modes where they then have even more ghosting. I personally would prefer OLED 240hz with 0.1 response time and thus perfect motion clarity and true 240hz without any ghosting over an IPS any day
Quantum Dot (QD) and White OLED (W-OLED) are two distinct display technologies used in modern TVs and monitors, each with its own strengths and weaknesses. Here are the key differences between them: ### Quantum Dot (QD) Technology: 1. **Color Reproduction**: - QD displays use a layer of quantum dots that emit pure colors (red, green, blue) when stimulated by a light source (usually a blue LED). - This results in very vibrant and accurate colors with a wide color gamut. 2. **Brightness**: - QD displays can achieve higher brightness levels compared to W-OLEDs. This makes them well-suited for HDR content. 3. **Efficiency**: - QD technology is generally more energy-efficient than OLED because it uses LEDs as the light source. 4. **Lifespan**: - QD displays tend to have a longer lifespan as they do not suffer from burn-in issues as OLEDs do. ### White OLED (W-OLED) Technology: 1. **Color Reproduction**: - W-OLEDs use white OLEDs combined with color filters to produce red, green, and blue subpixels. - While they offer excellent color reproduction, they may not achieve the same level of color accuracy and vibrancy as QD displays. 2. **Contrast and Black Levels**: - W-OLEDs have perfect black levels because each pixel can be turned off completely, providing infinite contrast ratios. - This results in deeper blacks and a more immersive viewing experience, especially in dark scenes. 3. **Viewing Angles**: - OLED displays generally offer better viewing angles compared to QD displays, maintaining color accuracy and contrast even when viewed from the side. 4. **Response Time**: - OLEDs have faster response times, making them ideal for fast-moving content like sports and gaming. ### Text Clarity: - **Text Clarity**: - W-OLED displays are often considered better for text clarity due to their ability to control individual pixels precisely, reducing issues like color fringing and providing smoother edges on text. ### Summary: - **QD Technology**: Better for vibrant colors, higher brightness, energy efficiency, and longer lifespan. - **W-OLED Technology**: Superior in contrast, black levels, viewing angles, response time, and text clarity. Your choice between QD and W-OLED should depend on your specific needs, such as the type of content you watch, the environment you watch in (bright vs. dark room), and your sensitivity to issues like burn-in and text clarity.
@GustavoRabelo93 Great answer! Thank you. But if you dont mind me asking, did you use ChatGPT for this answer or some sort of AI. Regardless the answer checks out and it's well explained thank you once again.
Exactly but somehow most people ignore that. For example I do not need 240hz watching youtube videos, or creativity stuff. 4k might look nice but even a 4090 (i do own one) it does NOT run most games @ 144 fps. Average is around 70-80 fps. You might get around 100-120 with DLSS & Frame Generation but thats it. It might run E-Sport titles/FPS games well but when it comes to for example rpg/open world games - big No. I rather have consistent 144 - 200 fps in WQHD than having the feeling that the game is lagging.
@Roecky nice info brother! May I ask! Does your 4090 run Fortnite at 240fps 1440p? Or any game for the matter? Also, if you were to lower all textures and image quality how many fps can you get with your rig?
Y'all don't seem to realize that if someone is spending over $1000 on a monitor, odds are they already have a beast of a PC. Just because it is expensive for some people doesn't mean it has no point. It absolutely has a point, there are plenty of people out there who have 4090's, 4080 Supers, 7800XTX's...etc. You not having the ability to properly leverage a product doesn't make it pointless for everyone else. Also it's incredibly short sighted to buy a monitor which you can already max out the framerate on. It gives you the ability to have more headroom when you upgrade your components in your PC. And their are still games you can run at well over 240hz. If a game feels like it's "lagging" at 70-80 FPS then that isn't due to the framerate, it's due to it having poor frame pacing. Which is generally either a result of being CPU limited or having a poorly optimized game. That being said, there are still a huge number of AAA games that will run at well over 120+ FPS on a 4090. Assuming you aren't bottlenecking it with a underpowered CPU. @Roecky also doesn't seem to realize how DLSS actually works, and ignores the fact that basically all of those big AAA games include it. DLSS is literally upscaling from a lower resolution and using AI to enable that upscaling to work with at worst a very minor hit to image quality. When used in quality mode at 4K, the difference between it at native resolution is so incredibly minor that it's a no-brainer to use it the vast majority of the time. In some games it actually provides a higher detail image than native. You don't even really need to use frame gen. So even in games which aren't able to reach those framerates running natively at 4K, using DLSS to upscale from 66.7% of 4K (DLSS Quality Mode) or 50% of 4K (DLSS Balanced mode) - which is also just a 1440P render resolution. Provides a far better overall image than playing natively at 1440P. Playing at 4K also allows you to take advantage of 4K texture packs which improve the graphics in many games as well. There are some games that just straight up weren't designed to be able to hit those kind of framerates regardless of the hardware you have. In these games you generally run into CPU/optimization bottlenecks. Not GPU performance bottlenecks. In these type of games you can't get higher framerates past a certain point regardless of how low you drop the resolution. These are all things you would know if you actually had a 4090 and had ever tried to do this with those types of games. The average framerate with a 4090 is also not 70-80FPS, it's more like 100-140 FPS. The games that run at 70-80FPS are outliers, and most of those games only run like that if you just turn every single setting all the way up to max and make no attempt to optimize your settings for the best framerate/image quality. This would be like trying to run Cyberpunk 2077 with full pathtracing on at 4K native resolution. A lot of these games build in future facing features/graphical settings so that in the future when we have higher performance hardware the game can age better and offer an even better experience. Turning up every single setting to the absolute maximum is just throwing away huge amount of potential performance for relatively marginal improvements to image quality. If you even do the slightest amount of optimization of the settings you will be more than fine and still have a better experience and performance than you would have at 1440p native. TL;DR: - You don't buy a monitor to immediately be able to cap out the framerate on the monitor. That's just a poorly thought out purchasing decision. - Buying a 4K monitor doesn't mean you have to run every single game at native 4K - You don't need to play a game at 1440p native resolution to get higher framerates - Any AAA game that's hardware intensive has DLSS which enables it to run at whatever framerate you would have been getting at 1440p native anyways+. While also looking much better. Any game that doesn't have DLSS will run at 120+ FPS - There are games which are CPU bottlenecked or have engine bottlenecks which prevent higher framerates regardless of resolution - Anyone who actually had a 4090 and a CPU that wasn't bottleneck it like crazy would know that using it with anything other than a 4K high refresh rate monitor is, at best not the move. And at worst a massive waste. - Buying an OLED monitor has so many more benefits than simply the framerate. Even at the same framerate an OLED monitor will feel and respond better than an LCD monitor - There are massive diminishing returns when it comes to framerate, as the difference in individual frame times gets exponentially smaller as framerate increases. You will have 30-40x more impact on latency as a result of using a wireless controller than you would by going from 240 FPS to 360 FPS. -If you have a 4090 and are getting between 70-80 FPS as the average framerate in games, and it feels like it's "lagging" to you. You must be; not optimizing your settings at all, or massively CPU bottlenecked, not actually have a 4090, have an extremely thermally/power limited laptop 4090, or your computer is infected with some kind of CPU or GPU bitcoin mining virus that is sapping your performance.
it really isn't. You think it would be, but not really. Especially because the bezels on the monitor are quite small. It's just about the largest I would go. At a normal distance away from you on a desk the entire monitor is still in view without having to move your head or anything. Even if you are someone who leans in close to your monitor it's not an issue. Anything larger than it would probably become an issue. Like 42in is definitely way too big. When it comes to the overall footprint of the monitor on a desk, it's not too large either. The footprint of the monitor stand itself is slightly larger than I would want it to be, but that issue is easily resolved by just putting it on a monitor arm.
the milisecond differences kinda stack with other latency reductions like polling and the like. alone it makes no dif, but enough small reductions leads to a big reduction. you probably wont notice though if you arent playing at the peak of skill of the game you play. theres an old game ive been playing for over 15 years and one of the top players of the mode i play, and can notice every tiny insignificant latency reduction... in just that game lol edit - also, at 2k res, i stop noticing the dif in resolution. 4k and 2k i cant tell the dif. ill always go with refresh rate. i can notice 2k to 4k on my 65 inch tv though
While QD OLEDS do have a better color saturation, will their color accuracy not be worse than WOLED because of the pink/magenta tint if there is some ambient light which mixes with the colors on the display? I also wonder if color accuracy test tools are trustworthy if the tools placed on the monitor blocking ambient light from mixing with the shown content. The tools can be consistent but it might not be a realistic representation for the user experience who ends up with tested & calibrated pure white being shown as pink for the eyes with ambient light. Im currently thinking about purchasing my first OLED monitor. But playing in a room with plenty of ambient natural light makes me believe that QD OLED would be a bad choice for me. While 4K resolution would require beefy GPU for gaming, it's a much better standardized resolution for watching content than 1440p. It also much better for productivity as it allows window snapping multiple windows into more than 4 and still be able to read text in each. I wonder if I should wait a bit more.
@@GreyDireWolf You are a complete moron if you buy an OLED to play in the daytime, this makes QD-OLED the clear winner, because anyone buying an OLED TV or Monitor should aim to use it in a dim or dark room for the best experience.
I have the Alienware 360hz 1440p oled with the glossy display. My brother has the KTC 240hz oled matte monitor and honestly bro it's fine to get the matte display too. shit still looks phenomenal compared to an LCD panel
i have both, WOLD and QD oled both 2K and honestly lg matte was wayt better in the black levels and great sharpness, i hate why we listen to those people, QD oled is still better from other aspects but my WOLED was just perfect as well, both are great
@@محمدالفقيه-و3عjust picked my Samsung 27 inch up yesterday and the display is beautiful. If anything having a matte finish benefits most who can’t control lighting.
I did upscaling multiple times on PC games, and on EMU, it was always the same as a medium jump. Even on vids shows it's not a big jump and pics show this too. I watched vids on my 4K TV and the jump isn't big like I said it's med. People even on forums say the jump isn't that big in games, PC, or console. Text is where it shows a big difference from 1440 to 4k, Unoptimised games struggle to run native 4K even with monster cards like a 3080-4090. That problem doesn't always happen with 1440p. I'm thinking of switching back to 1440. I have been using a 4K screen since last year. 1440 I had used for like 6 year's 4k has been frustration with unoptimized/demanding games even using high settings, stutter/dropped fps
This is a great video for me, as next March I'm buying a new PC, likely based on a 5070ti. According to VideoCardz all the new 50 series cards ship with DisplayPort 2.1, (Nobody is talking about this) yet there aren't that many 4k DP 2.1 capable monitors out there and they're expensive. I'd love to make the jump from 1080p (on my GTX 1070) to 4k whilst I upgrade my PC. Would love to hear your thoughts on the availability of DP2.1 monitors in 2025, thank you.
Nah our GPUs are not that strong. By going 1440p you extend the life of your graphics card by a couple of years. You're already sub 60fps at AAA games in 4K
Ehh I don't have a 4090 so I went with the 1440p 360hZ option as that's actually easier to achieve than 4K 240 and my 4080 can hit 300fps in quite a few titles
You could just upscale to 4K with DLSS. And then get both the higher resolution as well as the high framerate. Also, whether it's easier to achieve the maximum framerate of the monitor isn't the question that matters. The question you should be asking is; what is the difference in achievable framerate between 1440p and 4K, and what is the actual value of that difference in framerate? Then, how does that difference in framerate compare to the value of the difference in image quality? Just because you can achieve a higher framerate doesn't mean that relative difference in achievable framerate has any real value. Especially once you get into higher framerates. There's only like a 1ms difference in actual frame times between 240hz and 360hz. While the difference in image quality between the two is substantially higher. There are very few scenarios where the difference between 140FPS and 200+ FPS is worth having a 40-50% reduction in image quality. And that's before factoring in that with DLSS you can essentially have the best of both worlds when it comes to framerate and image quality. Since that gives you the framerates of the lower resolution while having essentially the image quality of the higher resolution. Especially at higher resolutions like 4K, the difference in image quality between 4K native resolution and DLSS (in balanced or quality mode) upscaled to 4K is at worst, maybe a 2-3% difference in quality. If it's noticeable at all. The vast majority of the time, when playing the a game it's not something you think about at all. The higher the internal render resolution, the better DLSS works, as it has far more visual information to start with. The smaller the difference between the internal render resolution, and the resolution you're upscaling to the better it works as well, as there is less new information DLSS has to fill in. So upscaling to 4K from 1440p leads to better results and better image quality than using DLSS to go from 1080p to 1440p. Even though 1440p to 2160p (4K) is actually a bigger jump in resolution than 1080p to 1440p You also don't really need to max out the framerate on your monitor in every single game. There is rapidly diminishing returns when it comes to actual frame times as framerate increases. You are almost always better off trying to get extremely consistent frame times as opposed to trying to chase the maximum peak framerate. As a inconsistent, highly variable framerate will still feel worse than a slightly lower but rock solid framerate. Any game where you could get 300FPS at 1440p, you could still likely get well over 200FPS at 4K. And unless you are playing competitive FPS games and are at a high enough skill level where you are making money off of it. The difference in frame times is so small that it's basically a non-factor. There are likely many other aspects in your overall system which are introducing far more latency than the slightly more than 1ms difference in frame time between those two framerates. Average human reaction time is about 200ms, and for those who play lots of competitive videogames, their reaction time is more like around 150ms on average even among the best of the best. Your ping, mouse, and keyboard also have a far bigger impact that slight difference in frame time. If you're actually using a controller, especially a wireless one, just that decision is adding around another 30-40ms of latency. Unless you've drilled down on all of those other areas and optimized them already, making your monitor choice based on a 1ms difference in potential latency is focusing on the wrong things. This is all assuming that you are even playing latency sensitive competitive multiplayer games in the first place. If you are just playing single-player or PVE games, then focusing on latency to this degree is completely unnecessary. In that case, the image quality benefits of 4K will have wayyyyyyyyyyy more impact on your overall gaming experience than even the difference between 140 FPS and 300 FPS. Playing a AAA open world game with at 4K and a locked 120FPS will look and play substantially better than playing at 1440P with 200+ FPS. And if you still really wanted to get that 200 FPS that you would get from playing at 1440p, well then you could just use DLSS set to balanced and render the game at a internal resolution of 1440p but also upscaled to a 4K output resolution. Then you would get to play the game at 4K 200+ FPS while also having better image quality than you would have gotten by playing the game at native 1440p. For the vast majority of scenarios, the 4K monitor is the objectively better choice. It offers more versatility, better image quality, and will remain viable for much longer than the 1440p option as well. Since as hardware and games continue to improve, you will be able to scale image quality and performance far higher than you would be able to with the 1440p option. It gives you almost everything you would get with the 1440p option, in addition to many other things that the 1440p monitor simply can't do. There are only really two benefits that the 1440p monitor has which the 4K one doesn't. - It costs less than the 4K monitor, and it's capable of having ~1.5ms faster frame times. A difference in latency that's 100 times smaller than the average reaction time of even the quickest humans in the world. It's a difference in latency that's incredibly minor, and there are a multitude of factors people can alter which will provide far more impactful improvements to their total system latency. Which leaves only one significant advantage for the 1440p option. Price. While it's totally understandable if that's what motivated someones purchasing decision, if it really was based on the difference in framerate you may want to reconsider.
I was thinking about maybe buying the 4K Option and be Anke to Play those Single player AAA games on 4K. What if I wanted to Switch to 1440p for other games tho? I read somewhere that Running 1440p on a 4K Monitor Looks Bad. Is that true and if so why and is it bad enough to drop the 4K option for other games entitrely?
I still don't know what to pick. I have both the Alienware OLED rn on my desk, the 27inch 1440p and 32inch 4k. My head says keep the 1440p because I only have a 7900XT right now (planning to upgrade though), and it gives me plenty headroom if I want to ever install tons of mods etc. Even the 4090 can be swiftly bought to its knees on some titles, and I'm not spending 4090 prices on GPUs, But my heart prefers the 4k monitor, which is also cool for consuming content too. I can pull 100+ fps on high/ultra at Native 4k in slightly older games like DayZ which is one of my go-to games, or Kingdom Come which I'm playing through again before the sequel. But I don't mind heavy FSR or even altering the entire res to play games like Cyberpunk (which should apparently look bad but it's fine to me?). The 360hz over 240 isn't really a factor as I don't play competitive titles these days like CS etc. I've got like a week to decide until my no questions asked return policy is up and I'm still no closer.
@@likris6607 Yup, I kept the 4k monitor and I've got someone coming to collect the 1440p one today for return! The 32 inch 4k one is just a much better upgrade, especially as I already have a 27 inch 1440p monitor (it's just IPS and not OLED so still would be an upgrade, but meh) FSR/Xess isn't bad and the Super Resolution feature is actually really effective, playing through Stalker 2 perfect rn, and the 7900XT is fine native 4k for older games that I play a lot like DayZ/Arma/Paradox Strategy or Total War games (playing Kingdom Come prepping for the sequel in Feb and getting like 100 FPS native 4k basically at ultra), so it's good enough for now until the next Nvidia cards drop (or if the 8800XT is good). Happy with the choice.
Holy shit he has a preference but qualified it with “this won’t always be the best choice for everyone”. A little bit of nuance. I love to see it. If only he would learn to do that with the Matte vs. Glossy sticking point he’s been completely incapable of getting past.
You can use Nvidia Control Panel on 1440p monitors to upscale to 4K. Save hundreds on a 2K monitor and use technology to make it look more expensive while also saving performance instead of wasting it. It also vibrantly changes that look of a native 2K monitor in my experience but again, my experience.
That's not how things work. All you are doing there is super-sampling the image which basically just giving you better anti-aliasing. You may be setting your GPU to output at a higher resolution, but the monitor is still a 2K monitor. It still has the same number of pixels, regardless of what resolution you set your display resolution to. It's not an equivalent experience to 4K output on an actual 4K monitor, where there is a 1 to 1 match between the number of pixels in the image on the screen and the number of pixels in on the monitor itself. With a 4K monitor you can also use DLSS to render at a lower resolution like either 1440p or ~67% percent of 4K and then have it upscaled to 4K. Enabling you to have the performance benefits of a lower render resolution while also getting the image quality benefits of 4K.
I sit about 2.5 three feet from my screen, should I get a 32 or 28 inch 4K monitor. I mainly play world of warcraft and was wondering if it would be too big or might be better to fit more UI inside. Also wondering which one would be ideal don't want to spend that much but could be persuaded if it meant more. Thanks
32 would probably be perfect if you’re sitting up to 3 feet away. I would say in your case it’s just a preference thing, you’d be satisfied with a 28 or 32. I would go 32 if I was you.
@@bibblebabl I realize the video was about oled monitors, but the question was in general; not really worried about getting oled but I could understand why someone without critical thinking might assume that :)
4k doesnt matter in regard performance because majority of modern games have "render scale" or even better DLSS and boom the game renders at 1440p without having to reduce the monitor resolution. So if you have a game which you want 1440p performance at: Just set render scale in the game to 75% and boom you have it. With DLSS even with almost same quality as native 4k.
Seeing QD-Oleds in store was a massive turn off since the blacks truly looks like tn-panels with some ambient light. Rather wait some longer then have such a glaring issue.
Ur stupid if you are buying an OLED to use in a bright room to begin with. OLED shines in a dim or dark environment if you don't play like this, just buy a LCD.
They don't look like that in real world usage. The issue is made 100x worse because of the super bright florescent lights in those kind of stores. It's basically the worst case scenario. When you actually have it at home it's a complete non-issue when the monitor is actually on and you are using it. The only time it's really even noticeable is if I have the monitor off, and have a light on directly in front of it. Unless you use your computer with really bright lights on right behind you on the ceiling, I wouldn't worry about it. Frankly, in the situations where it would be a problem, the glare and reflection of the light due to having a glossy panel would probably be the bigger, more noticeable issue.
These new panels don't have DP 2.1 and some are lacking HDMI 2.1 for that matter as well. And with the rumors of the 5080 not even matching the 4090 for performance it seems silly to even buy a 4k 240hz monitor as a 4090 only averages around 144FPS at 4k at Ultra settings without DLSS and Frame Generation. I guess if money isn't an object and you'll be getting a 5090 it might be worth it but just hope DP 2.1 isn't required. Id personally wait on the tech to get better for the 4k OLEDS.
That is assuming you wont need DP 2.1 to reap those new GPUs benefits. These new panels don't have DP 2.1 and some are lacking HDMI 2.1 for that matter as well. And with the rumors of the 5080 not even matching the 4090 for performance it seems silly to even buy a 4k 240hz monitor as a 4090 only averages around 144FPS at 4k at Ultra settings without DLSS and Frame Generation. I guess if money isn't an object and you'll be getting a 5090 it might be worth it but just hope DP 2.1 isn't required. Id personally wait on the tech to get better for the 4k oleds.
@@BillyMOV It's not an assumption. You won't need DP 2.1 for any Blackwell GPUs. They will likely support DP2.1, but it won't be necessary. That would be a terrible business decision and make no sense for any of the parties involved. Any basic research into the whole DP 2.1 thing would have told you that. You don't think that Nvidia's biggest GPU AIB (ASUS) knows whether DP2.1 would be necessary for Blackwell? And if it was, that they would have put it in their flagship gaming monitors that are launching shortly before Blackwell and are targeted at the same people who are buying those top end Blackwell GPUs? If it was every new monitor would already have it. But they don't because that would be stupid and the DP standard is backwards compatible. Oh and every single new 32in OLED 4k240hz monitor does have HDMI 2.1 (although that only caps out at 4K 120Hz/144Hz anyways). There is a 32in 4K 240hz panel with DP2.1 as well, but it was only put in there to attempt to differentiate it from other monitors using the same panel, and capitalize on those who don't understand why DP2.1 isn't really necessary yet. In the monitor that does have it, It also provides zero noticeable benefit and comes with additional problems due to how new the DP2.1 standard is and the lack of clarity around its various sub-standards. You incorrectly assume you understand the purchasing behavior of the people who are actually buying these products. If you think people only buy monitors like this, or even monitors in general to try and cap out every single game at the maximum refresh rate of the monitor, and expect to do so from the moment they get it. You are very very wrong. Your comment reads like someone who doesn't actually have any of the hardware you are referring to and therefore can only attempt to analyze hardware based on the numbers on the spec sheet, without actually knowing what the real world experience of using them is like, or how people who have them, actually use them. And even then, you haven't even done enough research to actually know the things that you can actually know without owning the hardware. You are completely wrong about DP2.1. You don't understand the first thing about OLED panels or their benefits. Which is wild considering the video you are commenting on. You don't understand DLSS or more specifically, what it's actually looks like when using it IRL at 4K and different quality levels. Or what the experience is like playing different games with or without frame gen at various framerates. You don't even seem to really understand framerate in the first place. The advantages that comes from switching to one of these new 32in 4K 240Hz OLED monitors go far so far beyond simply the increased refresh rate. That's not even the primary reason the vast majority of purchasers are buying them. There's also a huge difference between "money being no object" and being able to afford buying a few thousand dollars of PC hardware every couple years. There was absolutely a point in time where I couldn't afford to either, but was still really into PC hardware and researched things constantly to think up hypothetical builds, what I would get if I could, and follow the evolution of the various industries. But come on, if your gonna make comments like the one you just did, at least be certain you actually have a clue what you are talking about before commenting. You're doing others a disservice by spreading incorrect information.
Now do a qdoled tv at 120hz vs a monitor at 360hz. Obviously the monitor will be better for competition but I'm talking about outside of that. Monitors are just to dim imo
Anyone commenting on the matte screen. You obviously havent seen them all side by side in perosn. Its not even close. The matte coating takes away from the sharpness and details. Enjoy having your 4k oled screen being held back.
You’re right. It’s not even close. I’ve had both, and my glossy QD Oled loses its deep blacks and has crazy reflections the second I turn lights on in my room or open the window. The W-Oled has a grain that is really only noticeable on bright white backgrounds, and it only looks worse than QD-oled when my room is pitch black. So yeah, glossy is better if you game like a cave troll. Which is terrible for your eyes with gigantic amounts of constant strain on your focusing muscles. Enjoy having your 4k oled screen being held back by you slowly losing the ability to see in the first place.
@Deifiable for one glossy screens have been around for how many years now??? They are on a thing called a TELEVISJON. People have been gaming on them for how many years and never had complaints. The matte screen looks like a cheap 4k monitor with the way it blurs the detail of the pixels. 4k is meant to have a 3d look to it while gaming and the matte screen prevents that. I posted pics on reddit just to prove it. Enjoy your blurred screen and grayish blacks. Less detail and less sharpness. There's a reason glossy oleds are sold out everywhere and you can find Samsungs and LGs at anytime of the day.
@jefftruitt1812 lmao sure. I've had 3 and can tell you without a doubt the MSI and Aorus were more detailed and clearner than then samsung. For a 100% fact. I went with the LG G4 and it blows away all of them.
I might be on hard drugs or something but ive heard the difference after 240 hz is minimal, I have been using a 540 hz monitor to see if its any different and boy does 240 hz after switching back look bad it looks like how 60 hz did after I used 240. Now im wondering if a higher resolution than my normal would be better than higher hz to get better image clarity.
240 to 360hz isn't that much of a difference, still a difference but not big. 240 to 540hz is actually a big difference. Ofcourse not comparable to the 60-144 jump but it probably is just as big as the 144-360 jump lol😅
Just returned the 360hz 1440p Alienware AW2725DF, gaming was awesome but everything else looked worse than 1080p to me. Jagged, disorientating text etc. Probs an OLED thing but yeah guess I'll be going for a 4k 240 with 1080p 480hz hybrid.
@@Ratbane I can't speak for all monitors obviously but I have owned a zowie 360hz and 240hz and see 1080p corporate pc displays often that look better to my eyes. The zowies are tn and look really bad on desktop also
1440p i can max setttings 4k i need to use max performance FSR or DLLS to reach desire fps and some low settings to use or med settings..4k is good if u play old games For new games 1440p. That is why pro gamers go for 1440p monitors with a reason and 1080p.... if you want to have fun and to be fast don't go 4k system latency is huge.
@@drunkhusband6257 Not true at all. Doom Eternal looks dramatically better on HDR if its HDR settings calibrated correctly to your monitor. God of War Ragnarök on the PS5 looks insanely better on HDR too. And so will the PC version
Of course everythong in this video depends on your rig. But yes I agree 4k is just way too gorgeous to go with 1440p espe ially if you can get 120 fps from your gpu the frames after that unless you have insane eyes won't matter as much (I can notice the difference between 120 and 240 but not nearly as gross as 60 and 120 fps)
Why cant i just have my damn cake and eat it to? Why do i have to pick pros and cons? Just make a damn monitor that has it all so everyone can be happy
I have a 4090 / 7800x3d build. You will spend more money on your pc and your monitor, it will heat up your room pretty bad causing you to turn down the temp in the ac, the house will be at a crisp 67 Fahrenheit and your room at 74 f. If you live in an area with high electricity cost you will notice it on your power bill if you game 4+ hrs a day. All that for a few extra pixels. Luckily I have solar panels though. If I had to do it all over again, I’d get a 4070ti super 7800x3d and 1440p panel. Or a 7900xt but the game I play likes NVIDIA more.
@@disco4553 The only reason to get the 27 over 32 inch one is if you are a very competitive gamer. If you mainly play SP games, MMORPGS or racing games def go for 32 inch 4k. The experience on those types of games and text clarity is that much better on the 4K oled displays. I mainly play FPS games so I actually traded my 32 inch 4k in for the 27 inch 1440p one.
got same Pc specs and im ordering FO27Q3 360hz as im writtin this, i ll keep my current Dell Alienware aw2723df 27" 1440p 280hz Ips as my secondary monitor
I havent tried 4k yet so the only reason im going for 32 inch 4k 240hz oled. The gpu isnt going to cut it but 5000 series are around the corner and i'll get maybe a 5070 or when the refreshes hit, 5070 super or 5070ti super. Something in 800-1000 eur range max. Should be able to do at least 60fps maxed at 4k. If not, i'm not too fussy about lowering settings to get a playable framerate. Satan knows i'm going to run minimum settings with dlss for most of my games til then. Except esport titles which will be cpu limited even at 4k. Got a very modest 5700x3d and 4060ti. Skipped the 9800x3d upgrade in favor of new monitor. It's way more than i'd usually spend on a single piece of hardware but it's long overdue for a monitor upgrade. 240hz tn i use is 6+ years old now. Got it at the time when cpus and gpus we'rent even capable of hitting that framerate outside of some really easy to run games. Thanks to x3d chips the games easily hit the monitor refresh rate in esport titles. Gpu is the bottleneck now.
Imho, in gaming, DLDSR of 1440p to 4K on these 27" monitors adequately bridges the gap to the picture clarity of native 4K on the 32" monitors. Fight me.
Sort of. It’s never quite going to be 1:1, as you can’t really match the amount of pixels on the screen. There’s also the issue of the distinctly AI processed look DLDSR tends to add, especially when you apply higher sharpening. It also tends to have a higher performance hit than native 4K (varies from totally negligible to pretty significant), tends to be finicky with some games, doesn’t support RTX HDR, and isn’t supported in some of these OLEDs. DLDSR is great, but only if you already have a 1440p monitor. If you’re in the market for a new monitor, and you’re debating between 1440p and rendering at 4K with DLDSR and 4K native, it just makes more sense to go with the 4K one. If you’re going to be rendering games in both at 4K, just get the 4K native one. The only case I can see for 1440p OLEDs is competitive gaming.
@@thedisplayguy I’ve got a 1440p 27” oled and a 4k 27” mini led sitting side by side and when using dldsr on the 1440p display, they look eerily similar when gaming (most games… FFVII and Forza: no. Every other game, yes; I also have 4k 55”, 65” and 77” oleds which I also game on fwiw). Give it a try (gaming, not desktop use). 2.25x dldsr and 100% smoothing.
@@noidsuper I agree with you almost 100%. I use no sharpening (100% smoothing) which completely gets rid of the ai processed look to my eye. And on a 4090 I’ve never found the performance hit significant, even in aaa games (I also have a 4k panel sitting next to my 1440p one). Problem with going 4k native is price and size. There are no 4k 27” oled’s for those of us who prefer that size. Also, I’d argue that the asus xg27aqdm is the best 27” oled out right now and it gets brighter than the 32” oled’s with less abl while being quite a bit cheaper. It supports dldsr but not having rtx hdr with dldsr (for the time being) is a bummer.
My only concerned about 4K is the latency problems like many other UA-camrs have mentioned which means it's not really great for competitive play can I typically like to sit close to the monitor which I feel 4K is better suited for bigger monitor screens better positioned away from people but on the other hand going to 4K would mean less of a CPU bottleneck scenario with my 5800 ex 3D
I don't know why I watch so many monitor reviews that talk about refresh rates so much. I don't even play competitive games anymore and it won't matter in my everyday usage at all...
I would love for somebody to objectively quantify how many more kills or wins they get on a 27" monitor compared to a 32". I bet it's negligible at best. Same thing with 360 Hz compared to 240 Hz.
Thank you to Ruipro for sponsoring this video!
Buy the Ruipro HDMI 2.1 Certified Fiber Optic Cable (6FT): amzn.to/3wvHB7j
Buy the Ruipro HDMI 2.1 Certified Fiber Optic Cable (25FT): amzn.to/432NDGS
Buy Ruipro Cables: ruipro.store/collections/all
Best Monitor Settings Here (Guides & Discord): patreon.com/TheDisplayGuy
My Favorite HDR Displays (For Now). Affiliate links below. I earn commission on purchases.
Best 4K OLED Monitors
ASUS 32" (4K 240Hz OLED) PG32UCDM: amzn.to/3RXG01U
Gigabyte 32" (4K 240Hz OLED) FO32U2: amzn.to/3W42Do0
MSI MPG 32" (4K 240Hz OLED) 321URX: amzn.to/4ek1Oyg
Gigabyte 32" (4K 240Hz OLED) FO32U2P: amzn.to/3LzLufJ
MSI MAG 32" (4K 240Hz OLED) 321UPX: amzn.to/3VGknWE
Alienware 32” (4K 240Hz OLED) AW3225QF: howl.me/clPs75aw3DU
Best 1440p OLED Monitors
Gigabyte 27" (1440p 360Hz OLED) FO27Q3: amzn.to/4awu8u1
MSI MPG 27" (1440p 360Hz OLED) 271QRX: amzn.to/4f5ODkL
Alienware 27" (1440p 360Hz OLED) AW2725DF: howl.me/clV5XkZbvs1
I’m currently gaming on a 1440p ips monitor and I have the chance to get either the 1440p Msi oled or the msi 32” 4k oled. The 1440 I can get for £630 and the 32” 4k for around £960. I game in the Xbox series x so so Hz is really a factor as I’m maxed at 120hz. Thoughts?
I swear this guy would reject his Uber driver if he pulls up in a matte finished car...
nah he rejects tinted windows
I would too
lmao
So would I
Hilarious.
Nonsense….most people that aren’t dumping thousands of dollars on video cards and monitors, should absolutely go with 1440 until 4k is the norm and you are able to find them in 27 in sizes too. I can’t sit at a desk staring at a TV size monitor inches away from my face while gaming.
broke boy talk
@@Velly2g Sycophant boy talk.
@@Velly2gok broke boy
@@Arkangel88Mrpoor kid 😂
@@76a3c3 Wtf r u talking about. I am old enough to be your dad punk.
My bathroom mirror is matte because I don't like seeing my own reflection.
😂😂😂😂😂😂😂😂😂😂
And obviously don't want to see yourself in 4k resolution with details and sharpness
@@ryanbrowning5586 true, who wants to see their ugly face staring right back at them right? And it was easy for the mirror to change from glossy into matte as I never ever clean it 🤓
@@Paul_Rich hahaha
I really wanna upgrade to 4k, but a 32 inch screen is too big imo..
I have the same thought, and also, gpus aren’t ready for 4k high fps, so I’m going with a 3440x1440 oled in the meanwhile.
I have a 4k 32" screen and it's perfect for me. I wouldn't go any bigger.
I have a 32” 1440p display and I’m actually looking to downsize. It’s too much to handle unless your desk has more depth to where the monitor can sit further away from your eyes.
I wish that they made 4k 29 inch monitors
@@Stuke51this is exactly what I’m looking for for my racing sim thanks for the recommendation lol
Personally went with dual monitor setup, 48" 4K/120 LG Oled for single player titles/content consumption + 27" 1440p/360 QD-OLED for kbm and faster paced titles + general workstation stuff. Total cost is only slightly higher than a single 32" 4K 240Hz monitor currently, if you can find a deal on 48" Oled. Best of both worlds
Literally did the same 48" 4k 120hz oled C3 and 27" 1440p 360hz Samsung g60sd.
4k just makes the most sense. Fringing is less obvious when doing anything productive, DLSS that can satisfactorily fake higher resolution from less powerful cards and even then, with a modern gpu, you can play a huge library of older games comfortably in 4k.
240hz is plenty fast for the average person, and realistically the only reason to want 360hz is if winning at esports is your job. So a 27" 4k 240hz glossy OLED (with HDMI 2.1) would be an 'end game' monitor for a large majority of people.
Sadly 27’’ 4k OLEDs don’t exist as far as I’m aware. Forcing many people who don’t want 32’’ for fps gaming reason to go 1440p instead of 4K.
Got mine PG32UCDM, jumping from 1080p 240hz IPS. Don’t go 360hz. Running on 4080, no problem at all in all games.
Have you downscaled to 2k?
How does it look on a 4K monitor bro?
@@mastersgamers747 downscaled to 2k its pretty much the same as a 2k display. I use the LG 4k 32in with 480hz 1080p but I usually run 4k even on a 6800xt...
@@mastersgamers747 no, 4K runs smooth. Looks good
forgot one HUGE diffrence and it drives me away from 4k being my main monitor. it's the size. 27" is the largest I could go with comp games so 32" would be a disadvantage at that point.
I would have considered the LG "Dual-mode" if not for it's matte display
Big factsssss
Why do you consider size to affect gameplay? I’ve heard some claim it takes up too much vision, but would you not push it back further?
Perhaps your desk won’t allow this? Curious to know.
@@thedisplayguyits the same for me
If the monitor is too big there is just to much space my eye has to cover and i cant go that much further back from my desk so i would love a 24 inch oled as crazy as it Sounds
Too much head movement I guess.
@thedisplayguy too much information for the eyes fps shooters being close while still seeing the mini map without big eye twitchs is huge. Plus sitting further back surely won't help with eye tracking. Most of the very best players in the world want 24.5" as is but settle on 26.5 to gain 1440p.
I've been leaning toward a 1440p 360Hz OLED, thumbs up!
Do you draw a paycheck from competitive gaming on a weekly basis? No? Then just get the 4K monitor. I built my first computer in 2013. 1440p was the standard back then. It's crazy that over a decade later and we're still at 1440p as the standard.
thinking of 1k hz is just insane. No triple a title will run that much of fps. We have years of advancements to achieve. We need something like crt for that clarity
Not to mention you can literally downscale the 4k monitor for certain games until you end up upgrading gpu in the future?
A think the same as you. 4K 240, all the way.
The only way i can possible think on 1440p monitor,
1) If your hardware cant do 4k 60 fps in "medium" presets of the games you play.
2) You ONLY play multiplayer games like fornite/apex/cs etc, fps games that have advantage of that 360hz and NEVER touching a AAA single player game.
Rest of the cases, if your hardware can do more than 60 fps in "medium quality" in 4k, or you at least play 1 singleplayer AAA game a moth, thats it, you should go to 4k.
Not only games are better in 4k, youtube videos in 4k look better, movies, overall operation system. Productivity work is better in 4k too. There are too much things better in 4k that 120hz of difference just cant match.
Even p* videos are better in 4k and get no advantage of the 360 hz 🤣🤣 rofl
Never goon, my guy
Don't forget that 32 inch is too big for competitive games. And also, what's the PPI of the 4k 32 inch vs the 1440p of a 27 inch?
Edit: 108 PPI vs 137.68 PPI on 4k. 27.4% more PPI
32 inches is not “too big” for competitive games. Move the monitor slightly further back and it will have the exact same appearance in your field of view as a 24 inch. This is very very basic
@@Deifiable First, if you move the monitor so far, you won't notice the 4k difference vs 1440p. Second, there are 1680 players on prosettings's website. Not a single one of them uses anything above 27. 97% use a 24-25 Inch. Explain why?
@@Vandelay666 so your first argument is not even trying to say its worse, you’re just saying it’s not better. That’s a very different argument. I was never trying to say it was better because it was 4k. I was saying it wasn’t worse because it’s larger. Separate claims.
Second, because all of the best features comes to the smaller monitors. The 540 and 360hz IPS panels with Dyac2 and ULMB2 are all on 24” monitors. Pros want those features, so of course they buy those.
Again, this is Because of this nonsensical idea that 24” is somehow better than 32” pushed further back. Which is inaccurate. And you’ve provided no argument otherwise, except saying “32” isn’t noticeably higher res”, which doesn’t factor in at all.
I sit in front of 32" and play Quake Champions perfectly fine with it 🤷♂ And Quake is very competitive game. I couldnt even imagine going smaller than 30". 27" or even 24" would be very tiny lol.
You get used to a bigger monitor pretty fast
@@Deifiable IPS is so slow with pixel response that they need overdrive modes where they then have even more ghosting. I personally would prefer OLED 240hz with 0.1 response time and thus perfect motion clarity and true 240hz without any ghosting over an IPS any day
Whats the big difference between QD and W OLED? I heard W better text clarity ?
Quantum Dot (QD) and White OLED (W-OLED) are two distinct display technologies used in modern TVs and monitors, each with its own strengths and weaknesses. Here are the key differences between them:
### Quantum Dot (QD) Technology:
1. **Color Reproduction**:
- QD displays use a layer of quantum dots that emit pure colors (red, green, blue) when stimulated by a light source (usually a blue LED).
- This results in very vibrant and accurate colors with a wide color gamut.
2. **Brightness**:
- QD displays can achieve higher brightness levels compared to W-OLEDs. This makes them well-suited for HDR content.
3. **Efficiency**:
- QD technology is generally more energy-efficient than OLED because it uses LEDs as the light source.
4. **Lifespan**:
- QD displays tend to have a longer lifespan as they do not suffer from burn-in issues as OLEDs do.
### White OLED (W-OLED) Technology:
1. **Color Reproduction**:
- W-OLEDs use white OLEDs combined with color filters to produce red, green, and blue subpixels.
- While they offer excellent color reproduction, they may not achieve the same level of color accuracy and vibrancy as QD displays.
2. **Contrast and Black Levels**:
- W-OLEDs have perfect black levels because each pixel can be turned off completely, providing infinite contrast ratios.
- This results in deeper blacks and a more immersive viewing experience, especially in dark scenes.
3. **Viewing Angles**:
- OLED displays generally offer better viewing angles compared to QD displays, maintaining color accuracy and contrast even when viewed from the side.
4. **Response Time**:
- OLEDs have faster response times, making them ideal for fast-moving content like sports and gaming.
### Text Clarity:
- **Text Clarity**:
- W-OLED displays are often considered better for text clarity due to their ability to control individual pixels precisely, reducing issues like color fringing and providing smoother edges on text.
### Summary:
- **QD Technology**: Better for vibrant colors, higher brightness, energy efficiency, and longer lifespan.
- **W-OLED Technology**: Superior in contrast, black levels, viewing angles, response time, and text clarity.
Your choice between QD and W-OLED should depend on your specific needs, such as the type of content you watch, the environment you watch in (bright vs. dark room), and your sensitivity to issues like burn-in and text clarity.
@GustavoRabelo93 Great answer! Thank you. But if you dont mind me asking, did you use ChatGPT for this answer or some sort of AI. Regardless the answer checks out and it's well explained thank you once again.
4k monitors are better but you need to have a beast of a pc to run the 4k max setting at 100fps or higher so I don’t see the point of 4k 240
Exactly but somehow most people ignore that. For example I do not need 240hz watching youtube videos, or creativity stuff. 4k might look nice but even a 4090 (i do own one) it does NOT run most games @ 144 fps. Average is around 70-80 fps. You might get around 100-120 with DLSS & Frame Generation but thats it. It might run E-Sport titles/FPS games well but when it comes to for example rpg/open world games - big No. I rather have consistent 144 - 200 fps in WQHD than having the feeling that the game is lagging.
u can also play older games not necessarily the newest ;
@Roecky nice info brother! May I ask! Does your 4090 run Fortnite at 240fps 1440p? Or any game for the matter? Also, if you were to lower all textures and image quality how many fps can you get with your rig?
Y'all don't seem to realize that if someone is spending over $1000 on a monitor, odds are they already have a beast of a PC. Just because it is expensive for some people doesn't mean it has no point. It absolutely has a point, there are plenty of people out there who have 4090's, 4080 Supers, 7800XTX's...etc. You not having the ability to properly leverage a product doesn't make it pointless for everyone else. Also it's incredibly short sighted to buy a monitor which you can already max out the framerate on.
It gives you the ability to have more headroom when you upgrade your components in your PC. And their are still games you can run at well over 240hz.
If a game feels like it's "lagging" at 70-80 FPS then that isn't due to the framerate, it's due to it having poor frame pacing. Which is generally either a result of being CPU limited or having a poorly optimized game. That being said, there are still a huge number of AAA games that will run at well over 120+ FPS on a 4090. Assuming you aren't bottlenecking it with a underpowered CPU.
@Roecky also doesn't seem to realize how DLSS actually works, and ignores the fact that basically all of those big AAA games include it. DLSS is literally upscaling from a lower resolution and using AI to enable that upscaling to work with at worst a very minor hit to image quality. When used in quality mode at 4K, the difference between it at native resolution is so incredibly minor that it's a no-brainer to use it the vast majority of the time. In some games it actually provides a higher detail image than native. You don't even really need to use frame gen.
So even in games which aren't able to reach those framerates running natively at 4K, using DLSS to upscale from 66.7% of 4K (DLSS Quality Mode) or 50% of 4K (DLSS Balanced mode) - which is also just a 1440P render resolution. Provides a far better overall image than playing natively at 1440P. Playing at 4K also allows you to take advantage of 4K texture packs which improve the graphics in many games as well.
There are some games that just straight up weren't designed to be able to hit those kind of framerates regardless of the hardware you have. In these games you generally run into CPU/optimization bottlenecks. Not GPU performance bottlenecks. In these type of games you can't get higher framerates past a certain point regardless of how low you drop the resolution.
These are all things you would know if you actually had a 4090 and had ever tried to do this with those types of games. The average framerate with a 4090 is also not 70-80FPS, it's more like 100-140 FPS. The games that run at 70-80FPS are outliers, and most of those games only run like that if you just turn every single setting all the way up to max and make no attempt to optimize your settings for the best framerate/image quality. This would be like trying to run Cyberpunk 2077 with full pathtracing on at 4K native resolution. A lot of these games build in future facing features/graphical settings so that in the future when we have higher performance hardware the game can age better and offer an even better experience. Turning up every single setting to the absolute maximum is just throwing away huge amount of potential performance for relatively marginal improvements to image quality. If you even do the slightest amount of optimization of the settings you will be more than fine and still have a better experience and performance than you would have at 1440p native.
TL;DR:
- You don't buy a monitor to immediately be able to cap out the framerate on the monitor. That's just a poorly thought out purchasing decision.
- Buying a 4K monitor doesn't mean you have to run every single game at native 4K
- You don't need to play a game at 1440p native resolution to get higher framerates
- Any AAA game that's hardware intensive has DLSS which enables it to run at whatever framerate you would have been getting at 1440p native anyways+. While also looking much better. Any game that doesn't have DLSS will run at 120+ FPS
- There are games which are CPU bottlenecked or have engine bottlenecks which prevent higher framerates regardless of resolution
- Anyone who actually had a 4090 and a CPU that wasn't bottleneck it like crazy would know that using it with anything other than a 4K high refresh rate monitor is, at best not the move. And at worst a massive waste.
- Buying an OLED monitor has so many more benefits than simply the framerate. Even at the same framerate an OLED monitor will feel and respond better than an LCD monitor
- There are massive diminishing returns when it comes to framerate, as the difference in individual frame times gets exponentially smaller as framerate increases. You will have 30-40x more impact on latency as a result of using a wireless controller than you would by going from 240 FPS to 360 FPS.
-If you have a 4090 and are getting between 70-80 FPS as the average framerate in games, and it feels like it's "lagging" to you. You must be; not optimizing your settings at all, or massively CPU bottlenecked, not actually have a 4090, have an extremely thermally/power limited laptop 4090, or your computer is infected with some kind of CPU or GPU bitcoin mining virus that is sapping your performance.
@@e2rqeynobody is reading that shit, calm down
Wish we could get everything with the 4k at 27inchs. 32 is just to big on a desk imo
ya
it really isn't. You think it would be, but not really. Especially because the bezels on the monitor are quite small. It's just about the largest I would go. At a normal distance away from you on a desk the entire monitor is still in view without having to move your head or anything. Even if you are someone who leans in close to your monitor it's not an issue. Anything larger than it would probably become an issue. Like 42in is definitely way too big. When it comes to the overall footprint of the monitor on a desk, it's not too large either. The footprint of the monitor stand itself is slightly larger than I would want it to be, but that issue is easily resolved by just putting it on a monitor arm.
the milisecond differences kinda stack with other latency reductions like polling and the like. alone it makes no dif, but enough small reductions leads to a big reduction. you probably wont notice though if you arent playing at the peak of skill of the game you play. theres an old game ive been playing for over 15 years and one of the top players of the mode i play, and can notice every tiny insignificant latency reduction... in just that game lol
edit - also, at 2k res, i stop noticing the dif in resolution. 4k and 2k i cant tell the dif. ill always go with refresh rate. i can notice 2k to 4k on my 65 inch tv though
screen size is everything...
While QD OLEDS do have a better color saturation, will their color accuracy not be worse than WOLED because of the pink/magenta tint if there is some ambient light which mixes with the colors on the display?
I also wonder if color accuracy test tools are trustworthy if the tools placed on the monitor blocking ambient light from mixing with the shown content. The tools can be consistent but it might not be a realistic representation for the user experience who ends up with tested & calibrated pure white being shown as pink for the eyes with ambient light.
Im currently thinking about purchasing my first OLED monitor. But playing in a room with plenty of ambient natural light makes me believe that QD OLED would be a bad choice for me. While 4K resolution would require beefy GPU for gaming, it's a much better standardized resolution for watching content than 1440p. It also much better for productivity as it allows window snapping multiple windows into more than 4 and still be able to read text in each. I wonder if I should wait a bit more.
Correct qd oleds will look inaccurate with ambient light, buy a woled if u dont game in a batcave
@@GreyDireWolf You are a complete moron if you buy an OLED to play in the daytime, this makes QD-OLED the clear winner, because anyone buying an OLED TV or Monitor should aim to use it in a dim or dark room for the best experience.
I have the Alienware 360hz 1440p oled with the glossy display. My brother has the KTC 240hz oled matte monitor and honestly bro it's fine to get the matte display too. shit still looks phenomenal compared to an LCD panel
i have both, WOLD and QD oled both 2K and honestly lg matte was wayt better in the black levels and great sharpness, i hate why we listen to those people, QD oled is still better from other aspects but my WOLED was just perfect as well, both are great
@@محمدالفقيه-و3عjust picked my Samsung 27 inch up yesterday and the display is beautiful. If anything having a matte finish benefits most who can’t control lighting.
QD better in black levels in dim rooms which is the only way i play.
I did upscaling multiple times on PC games, and on EMU, it was always the same as a medium jump.
Even on vids shows it's not a big jump and pics show this too. I watched vids on my 4K TV and the jump isn't big like I said it's med.
People even on forums say the jump isn't that big in games, PC, or console.
Text is where it shows a big difference from 1440 to 4k,
Unoptimised games struggle to run native 4K even with monster cards like a 3080-4090. That problem doesn't always happen with 1440p. I'm thinking of switching back to 1440. I have been using a 4K screen since last year.
1440 I had used for like 6 year's
4k has been frustration with unoptimized/demanding games even using high settings, stutter/dropped fps
This is a great video for me, as next March I'm buying a new PC, likely based on a 5070ti. According to VideoCardz all the new 50 series cards ship with DisplayPort 2.1, (Nobody is talking about this) yet there aren't that many 4k DP 2.1 capable monitors out there and they're expensive. I'd love to make the jump from 1080p (on my GTX 1070) to 4k whilst I upgrade my PC. Would love to hear your thoughts on the availability of DP2.1 monitors in 2025, thank you.
Nah our GPUs are not that strong. By going 1440p you extend the life of your graphics card by a couple of years. You're already sub 60fps at AAA games in 4K
what do recommand Samsung Odyssey OLED G6 or MSI Gaming MAG 271QPX ?
Seen as top shelf has barely been available online, I'm just gonna wait. What am I waiting? Any ideas? I'm just running a 4090, no console.
Ehh I don't have a 4090 so I went with the 1440p 360hZ option as that's actually easier to achieve than 4K 240 and my 4080 can hit 300fps in quite a few titles
Same, now I just need a new cpu to push my 4080
What monitor did you get?
@@AxellCPT I went 240hz 1440p qd oled with the MSI mag. Saved a lot of cash that way.
Any of you guys can run Fortnite or any titles at 360fps 1440p easily ? If so, what cpu/gpu is necessary?
You could just upscale to 4K with DLSS. And then get both the higher resolution as well as the high framerate. Also, whether it's easier to achieve the maximum framerate of the monitor isn't the question that matters. The question you should be asking is; what is the difference in achievable framerate between 1440p and 4K, and what is the actual value of that difference in framerate? Then, how does that difference in framerate compare to the value of the difference in image quality?
Just because you can achieve a higher framerate doesn't mean that relative difference in achievable framerate has any real value. Especially once you get into higher framerates. There's only like a 1ms difference in actual frame times between 240hz and 360hz. While the difference in image quality between the two is substantially higher. There are very few scenarios where the difference between 140FPS and 200+ FPS is worth having a 40-50% reduction in image quality. And that's before factoring in that with DLSS you can essentially have the best of both worlds when it comes to framerate and image quality. Since that gives you the framerates of the lower resolution while having essentially the image quality of the higher resolution. Especially at higher resolutions like 4K, the difference in image quality between 4K native resolution and DLSS (in balanced or quality mode) upscaled to 4K is at worst, maybe a 2-3% difference in quality. If it's noticeable at all. The vast majority of the time, when playing the a game it's not something you think about at all. The higher the internal render resolution, the better DLSS works, as it has far more visual information to start with. The smaller the difference between the internal render resolution, and the resolution you're upscaling to the better it works as well, as there is less new information DLSS has to fill in. So upscaling to 4K from 1440p leads to better results and better image quality than using DLSS to go from 1080p to 1440p. Even though 1440p to 2160p (4K) is actually a bigger jump in resolution than 1080p to 1440p
You also don't really need to max out the framerate on your monitor in every single game. There is rapidly diminishing returns when it comes to actual frame times as framerate increases. You are almost always better off trying to get extremely consistent frame times as opposed to trying to chase the maximum peak framerate. As a inconsistent, highly variable framerate will still feel worse than a slightly lower but rock solid framerate. Any game where you could get 300FPS at 1440p, you could still likely get well over 200FPS at 4K. And unless you are playing competitive FPS games and are at a high enough skill level where you are making money off of it. The difference in frame times is so small that it's basically a non-factor.
There are likely many other aspects in your overall system which are introducing far more latency than the slightly more than 1ms difference in frame time between those two framerates. Average human reaction time is about 200ms, and for those who play lots of competitive videogames, their reaction time is more like around 150ms on average even among the best of the best. Your ping, mouse, and keyboard also have a far bigger impact that slight difference in frame time.
If you're actually using a controller, especially a wireless one, just that decision is adding around another 30-40ms of latency. Unless you've drilled down on all of those other areas and optimized them already, making your monitor choice based on a 1ms difference in potential latency is focusing on the wrong things.
This is all assuming that you are even playing latency sensitive competitive multiplayer games in the first place. If you are just playing single-player or PVE games, then focusing on latency to this degree is completely unnecessary. In that case, the image quality benefits of 4K will have wayyyyyyyyyyy more impact on your overall gaming experience than even the difference between 140 FPS and 300 FPS. Playing a AAA open world game with at 4K and a locked 120FPS will look and play substantially better than playing at 1440P with 200+ FPS. And if you still really wanted to get that 200 FPS that you would get from playing at 1440p, well then you could just use DLSS set to balanced and render the game at a internal resolution of 1440p but also upscaled to a 4K output resolution. Then you would get to play the game at 4K 200+ FPS while also having better image quality than you would have gotten by playing the game at native 1440p.
For the vast majority of scenarios, the 4K monitor is the objectively better choice. It offers more versatility, better image quality, and will remain viable for much longer than the 1440p option as well. Since as hardware and games continue to improve, you will be able to scale image quality and performance far higher than you would be able to with the 1440p option.
It gives you almost everything you would get with the 1440p option, in addition to many other things that the 1440p monitor simply can't do. There are only really two benefits that the 1440p monitor has which the 4K one doesn't. - It costs less than the 4K monitor, and it's capable of having ~1.5ms faster frame times. A difference in latency that's 100 times smaller than the average reaction time of even the quickest humans in the world. It's a difference in latency that's incredibly minor, and there are a multitude of factors people can alter which will provide far more impactful improvements to their total system latency.
Which leaves only one significant advantage for the 1440p option. Price. While it's totally understandable if that's what motivated someones purchasing decision, if it really was based on the difference in framerate you may want to reconsider.
What would you do if you had a 4080 as a PC gamer?
So I’m using a 240hz mini led va panel and I like it but my pc is running cod 360fps so should a 1440p IPs 360hz or stay where I’m at thank you!
Both is the only way boys....start saving
I was thinking about maybe buying the 4K Option and be Anke to Play those Single player AAA games on 4K. What if I wanted to Switch to 1440p for other games tho? I read somewhere that Running 1440p on a 4K Monitor Looks Bad. Is that true and if so why and is it bad enough to drop the 4K option for other games entitrely?
I put a matte coating on my sunglasses so people won't see their reflection in them.
I still don't know what to pick. I have both the Alienware OLED rn on my desk, the 27inch 1440p and 32inch 4k. My head says keep the 1440p because I only have a 7900XT right now (planning to upgrade though), and it gives me plenty headroom if I want to ever install tons of mods etc. Even the 4090 can be swiftly bought to its knees on some titles, and I'm not spending 4090 prices on GPUs,
But my heart prefers the 4k monitor, which is also cool for consuming content too. I can pull 100+ fps on high/ultra at Native 4k in slightly older games like DayZ which is one of my go-to games, or Kingdom Come which I'm playing through again before the sequel. But I don't mind heavy FSR or even altering the entire res to play games like Cyberpunk (which should apparently look bad but it's fine to me?). The 360hz over 240 isn't really a factor as I don't play competitive titles these days like CS etc.
I've got like a week to decide until my no questions asked return policy is up and I'm still no closer.
I did the same and keep the 4k one. No regrets, once used 4k monitor, it is hard to go back to 1440p
@@likris6607 Yup, I kept the 4k monitor and I've got someone coming to collect the 1440p one today for return! The 32 inch 4k one is just a much better upgrade, especially as I already have a 27 inch 1440p monitor (it's just IPS and not OLED so still would be an upgrade, but meh)
FSR/Xess isn't bad and the Super Resolution feature is actually really effective, playing through Stalker 2 perfect rn, and the 7900XT is fine native 4k for older games that I play a lot like DayZ/Arma/Paradox Strategy or Total War games (playing Kingdom Come prepping for the sequel in Feb and getting like 100 FPS native 4k basically at ultra), so it's good enough for now until the next Nvidia cards drop (or if the 8800XT is good).
Happy with the choice.
Holy shit he has a preference but qualified it with “this won’t always be the best choice for everyone”.
A little bit of nuance. I love to see it. If only he would learn to do that with the Matte vs. Glossy sticking point he’s been completely incapable of getting past.
What should i buy: LG 27GR95QE or Samsung Monitor Gaming Odyssey OLED G6 or Samsung Monitor Gaming Odyssey OLED G8?
G6
You can use Nvidia Control Panel on 1440p monitors to upscale to 4K.
Save hundreds on a 2K monitor and use technology to make it look more expensive while also saving performance instead of wasting it.
It also vibrantly changes that look of a native 2K monitor in my experience but again, my experience.
That's not how things work. All you are doing there is super-sampling the image which basically just giving you better anti-aliasing. You may be setting your GPU to output at a higher resolution, but the monitor is still a 2K monitor. It still has the same number of pixels, regardless of what resolution you set your display resolution to. It's not an equivalent experience to 4K output on an actual 4K monitor, where there is a 1 to 1 match between the number of pixels in the image on the screen and the number of pixels in on the monitor itself.
With a 4K monitor you can also use DLSS to render at a lower resolution like either 1440p or ~67% percent of 4K and then have it upscaled to 4K. Enabling you to have the performance benefits of a lower render resolution while also getting the image quality benefits of 4K.
I sit about 2.5 three feet from my screen, should I get a 32 or 28 inch 4K monitor. I mainly play world of warcraft and was wondering if it would be too big or might be better to fit more UI inside. Also wondering which one would be ideal don't want to spend that much but could be persuaded if it meant more. Thanks
32 would probably be perfect if you’re sitting up to 3 feet away. I would say in your case it’s just a preference thing, you’d be satisfied with a 28 or 32. I would go 32 if I was you.
@@itsDenhy Thanks for the insight :)
Is there any 4k 28” oled on market though? 😮
@@bibblebabl I realize the video was about oled monitors, but the question was in general; not really worried about getting oled but I could understand why someone without critical thinking might assume that :)
why does no one talk about the 1440p mode on a 4k display... windows 10 has a full scaling menu in the display section like cmon now...
Wdym shave delay of within nvidia control panel
4k doesnt matter in regard performance because majority of modern games have "render scale" or even better DLSS and boom the game renders at 1440p without having to reduce the monitor resolution. So if you have a game which you want 1440p performance at: Just set render scale in the game to 75% and boom you have it. With DLSS even with almost same quality as native 4k.
just wanted to ask... have you tested out this monitor? this monitor looks sick! what are your thoughts ? " Samsung 27" Odyssey OLED G6 QHD 360Hz "
@@abyssnightcore just bought it and it’s wonderful!!!!
Ultrawide 1440p is probably my favorite
Agree the immersive is dope
And perfect for 4K UHD movies as ultrawide fits Cinema
Seeing QD-Oleds in store was a massive turn off since the blacks truly looks like tn-panels with some ambient light. Rather wait some longer then have such a glaring issue.
Ur stupid if you are buying an OLED to use in a bright room to begin with. OLED shines in a dim or dark environment if you don't play like this, just buy a LCD.
They don't look like that in real world usage. The issue is made 100x worse because of the super bright florescent lights in those kind of stores. It's basically the worst case scenario.
When you actually have it at home it's a complete non-issue when the monitor is actually on and you are using it. The only time it's really even noticeable is if I have the monitor off, and have a light on directly in front of it.
Unless you use your computer with really bright lights on right behind you on the ceiling, I wouldn't worry about it. Frankly, in the situations where it would be a problem, the glare and reflection of the light due to having a glossy panel would probably be the bigger, more noticeable issue.
These new panels don't have DP 2.1 and some are lacking HDMI 2.1 for that matter as well. And with the rumors of the 5080 not even matching the 4090 for performance it seems silly to even buy a 4k 240hz monitor as a 4090 only averages around 144FPS at 4k at Ultra settings without DLSS and Frame Generation. I guess if money isn't an object and you'll be getting a 5090 it might be worth it but just hope DP 2.1 isn't required. Id personally wait on the tech to get better for the 4k OLEDS.
4K 240hz is absolutely the way to go. Especially since were getting a new gen of GPUs pretty soon.
If you have the money to get a PG32UCDM, do it.
That assumes that you are going to upgrade your GPU the next generation
That is assuming you wont need DP 2.1 to reap those new GPUs benefits. These new panels don't have DP 2.1 and some are lacking HDMI 2.1 for that matter as well. And with the rumors of the 5080 not even matching the 4090 for performance it seems silly to even buy a 4k 240hz monitor as a 4090 only averages around 144FPS at 4k at Ultra settings without DLSS and Frame Generation. I guess if money isn't an object and you'll be getting a 5090 it might be worth it but just hope DP 2.1 isn't required. Id personally wait on the tech to get better for the 4k oleds.
@@BillyMOV It's not an assumption. You won't need DP 2.1 for any Blackwell GPUs. They will likely support DP2.1, but it won't be necessary. That would be a terrible business decision and make no sense for any of the parties involved. Any basic research into the whole DP 2.1 thing would have told you that. You don't think that Nvidia's biggest GPU AIB (ASUS) knows whether DP2.1 would be necessary for Blackwell? And if it was, that they would have put it in their flagship gaming monitors that are launching shortly before Blackwell and are targeted at the same people who are buying those top end Blackwell GPUs? If it was every new monitor would already have it. But they don't because that would be stupid and the DP standard is backwards compatible. Oh and every single new 32in OLED 4k240hz monitor does have HDMI 2.1 (although that only caps out at 4K 120Hz/144Hz anyways). There is a 32in 4K 240hz panel with DP2.1 as well, but it was only put in there to attempt to differentiate it from other monitors using the same panel, and capitalize on those who don't understand why DP2.1 isn't really necessary yet. In the monitor that does have it, It also provides zero noticeable benefit and comes with additional problems due to how new the DP2.1 standard is and the lack of clarity around its various sub-standards.
You incorrectly assume you understand the purchasing behavior of the people who are actually buying these products. If you think people only buy monitors like this, or even monitors in general to try and cap out every single game at the maximum refresh rate of the monitor, and expect to do so from the moment they get it. You are very very wrong.
Your comment reads like someone who doesn't actually have any of the hardware you are referring to and therefore can only attempt to analyze hardware based on the numbers on the spec sheet, without actually knowing what the real world experience of using them is like, or how people who have them, actually use them. And even then, you haven't even done enough research to actually know the things that you can actually know without owning the hardware.
You are completely wrong about DP2.1.
You don't understand the first thing about OLED panels or their benefits. Which is wild considering the video you are commenting on.
You don't understand DLSS or more specifically, what it's actually looks like when using it IRL at 4K and different quality levels. Or what the experience is like playing different games with or without frame gen at various framerates. You don't even seem to really understand framerate in the first place.
The advantages that comes from switching to one of these new 32in 4K 240Hz OLED monitors go far so far beyond simply the increased refresh rate. That's not even the primary reason the vast majority of purchasers are buying them.
There's also a huge difference between "money being no object" and being able to afford buying a few thousand dollars of PC hardware every couple years. There was absolutely a point in time where I couldn't afford to either, but was still really into PC hardware and researched things constantly to think up hypothetical builds, what I would get if I could, and follow the evolution of the various industries.
But come on, if your gonna make comments like the one you just did, at least be certain you actually have a clue what you are talking about before commenting. You're doing others a disservice by spreading incorrect information.
When are next gen gpu coming? And what specs are we pushing?
@@BillyMOV GIGABYTE AORUS FO32U2P with dp2.1 left the chat....
Now do a qdoled tv at 120hz vs a monitor at 360hz. Obviously the monitor will be better for competition but I'm talking about outside of that. Monitors are just to dim imo
Anyone commenting on the matte screen. You obviously havent seen them all side by side in perosn. Its not even close. The matte coating takes away from the sharpness and details. Enjoy having your 4k oled screen being held back.
You’re right. It’s not even close. I’ve had both, and my glossy QD Oled loses its deep blacks and has crazy reflections the second I turn lights on in my room or open the window. The W-Oled has a grain that is really only noticeable on bright white backgrounds, and it only looks worse than QD-oled when my room is pitch black.
So yeah, glossy is better if you game like a cave troll. Which is terrible for your eyes with gigantic amounts of constant strain on your focusing muscles.
Enjoy having your 4k oled screen being held back by you slowly losing the ability to see in the first place.
@Deifiable for one glossy screens have been around for how many years now??? They are on a thing called a TELEVISJON. People have been gaming on them for how many years and never had complaints. The matte screen looks like a cheap 4k monitor with the way it blurs the detail of the pixels. 4k is meant to have a 3d look to it while gaming and the matte screen prevents that. I posted pics on reddit just to prove it. Enjoy your blurred screen and grayish blacks. Less detail and less sharpness. There's a reason glossy oleds are sold out everywhere and you can find Samsungs and LGs at anytime of the day.
@@ryanbrowning5586they came out almost 2 years ago ofc you can find them
You must not know how to adjust settings bc my matte Samsung blows the glossy oled I just had out of the water.
@jefftruitt1812 lmao sure. I've had 3 and can tell you without a doubt the MSI and Aorus were more detailed and clearner than then samsung. For a 100% fact. I went with the LG G4 and it blows away all of them.
I might be on hard drugs or something but ive heard the difference after 240 hz is minimal, I have been using a 540 hz monitor to see if its any different and boy does 240 hz after switching back look bad it looks like how 60 hz did after I used 240. Now im wondering if a higher resolution than my normal would be better than higher hz to get better image clarity.
240 to 360hz isn't that much of a difference, still a difference but not big. 240 to 540hz is actually a big difference. Ofcourse not comparable to the 60-144 jump but it probably is just as big as the 144-360 jump lol😅
Just returned the 360hz 1440p Alienware AW2725DF, gaming was awesome but everything else looked worse than 1080p to me. Jagged, disorientating text etc. Probs an OLED thing but yeah guess I'll be going for a 4k 240 with 1080p 480hz hybrid.
Could have been windows scaling. If it's not set right, it makes everything except video and gaming look horrible.
That's weird. The 27 inch 1440p has higher pixel density than a 24 inch 1080p monitor.
@@Ratbane I can't speak for all monitors obviously but I have owned a zowie 360hz and 240hz and see 1080p corporate pc displays often that look better to my eyes. The zowies are tn and look really bad on desktop also
@@Ratbane It's not weird at all, It's supposed to be that way. The more u increase the size and res. the more ppi u have.
That has to be the weirdest take ever, Maybe you didn't have the right settings on, or u had it on full hd res on a 2k monitor
You need ridiculous gpu power to run 240 Hz at 4k so 360Hz is the better option for me because buying a 5090 isn’t on my list right now …
Since now. The Asus xg27aqdmg is the Best. Glossy. 27inch and no. Crap Fan like LG or dell
1440p i can max setttings 4k i need to use max performance FSR or DLLS to reach desire fps and some low settings to use or med settings..4k is good if u play old games For new games 1440p. That is why pro gamers go for 1440p monitors with a reason and 1080p.... if you want to have fun and to be fast don't go 4k system latency is huge.
I'm going to keep crying for 1080P OLED at 24 inches and 480Hz
never going to happen. oled is a premium product. 1080p isn't a premium resolution.
@@veilmontTV
I wouldn't say never. They had $700 - $800 1080p high refresh LCD monitors.
Oled on a 1080p is a waste of resources 💀
ive seen 1440p monitors now at 24 inches I doubt anyone would make a 1080p version especially oled at this point
@@504Trey charging $700 for a 24" 1080P screen is a waste of what resources? 💀
FUACK! JUST GET BOTH 😵💫🥴
Hi What is the game of 0:26 ?
I think it’s the finals
1440p 480... Still waiting
Bro you need to update your PG32UCDM HDR SETTINGS on your patreon
HDR is junk, just leave it off....
@@drunkhusband6257 Not true at all. Doom Eternal looks dramatically better on HDR if its HDR settings calibrated correctly to your monitor.
God of War Ragnarök on the PS5 looks insanely better on HDR too. And so will the PC version
@@De-M-oN Ok continue to ruin a good image with junk hdr thinking it's better because you don't know how to adjust your sdr settings right.
Nah 4k will give too much input lag in comp games so 1440p 360hz all the way
Cant really tell the clarity difference in my opinion if anything minimal difference oled is a huge hop don standard 2k 4k ips monitors
Of course everythong in this video depends on your rig. But yes I agree 4k is just way too gorgeous to go with 1440p espe ially if you can get 120 fps from your gpu the frames after that unless you have insane eyes won't matter as much (I can notice the difference between 120 and 240 but not nearly as gross as 60 and 120 fps)
60hz feels absolutely miserable
Why cant i just have my damn cake and eat it to? Why do i have to pick pros and cons? Just make a damn monitor that has it all so everyone can be happy
lol .. i'm with you.
4k really is that mush better and in most games 4k is easier to run that 360fps. even 240hz is hard to do.
3:35 Don't think it works quite like that
Thank you for talking about reso. I’m waiting for 5120 by 2160.
i have the aw3223qf but an ultrawide with 4k pixel density is what i'm waiting on.
@@veilmontTV isn’t there already a bunch of ultra wide oled 4ks on the market? I’ve seen a ton of them but nothing with higher resolutions.
@@GregKealey There is literally only 1 4k ultrawide on the market at all. The super ultrawide Samsung Odyssey 57", that's it.
Hi! So i started playing geforce now, it looks great but how will it do with an QD Oled 4k 240hz monitor since its a streaming a 4080 card?
Your doing my head in 4k 240hz lg am better as it's cleaner looking it's a big jump from 120
Brasil eu comprei uma 360hz estou querendo o 4k só que minha gpu é uma 4070super alguém pode me ajudar por favor?
I have a 4090 / 7800x3d build. You will spend more money on your pc and your monitor, it will heat up your room pretty bad causing you to turn down the temp in the ac, the house will be at a crisp 67 Fahrenheit and your room at 74 f. If you live in an area with high electricity cost you will notice it on your power bill if you game 4+ hrs a day. All that for a few extra pixels.
Luckily I have solar panels though.
If I had to do it all over again, I’d get a 4070ti super 7800x3d and 1440p panel. Or a 7900xt but the game I play likes NVIDIA more.
I have the exact same cpu/gpu. I'm really torn between the 27 or the 32. I can't make up my mind.
Same here,got the fo27q3. 4K isn’t even optimized well on games these days it’s so useless
@@disco4553 The only reason to get the 27 over 32 inch one is if you are a very competitive gamer. If you mainly play SP games, MMORPGS or racing games def go for 32 inch 4k.
The experience on those types of games and text clarity is that much better on the 4K oled displays.
I mainly play FPS games so I actually traded my 32 inch 4k in for the 27 inch 1440p one.
got same Pc specs and im ordering FO27Q3 360hz as im writtin this, i ll keep my current Dell Alienware aw2723df 27" 1440p 280hz Ips as my secondary monitor
I havent tried 4k yet so the only reason im going for 32 inch 4k 240hz oled. The gpu isnt going to cut it but 5000 series are around the corner and i'll get maybe a 5070 or when the refreshes hit, 5070 super or 5070ti super. Something in 800-1000 eur range max. Should be able to do at least 60fps maxed at 4k. If not, i'm not too fussy about lowering settings to get a playable framerate. Satan knows i'm going to run minimum settings with dlss for most of my games til then. Except esport titles which will be cpu limited even at 4k. Got a very modest 5700x3d and 4060ti. Skipped the 9800x3d upgrade in favor of new monitor. It's way more than i'd usually spend on a single piece of hardware but it's long overdue for a monitor upgrade. 240hz tn i use is 6+ years old now. Got it at the time when cpus and gpus we'rent even capable of hitting that framerate outside of some really easy to run games. Thanks to x3d chips the games easily hit the monitor refresh rate in esport titles. Gpu is the bottleneck now.
Bro eyes on left and on the right haha
@3:50 what is is GPU bounded or CPU 🤤.
Saying if you're GPU bounded. And telling seconds later I'm CPU bounded
Imho, in gaming, DLDSR of 1440p to 4K on these 27" monitors adequately bridges the gap to the picture clarity of native 4K on the 32" monitors. Fight me.
It may reduce aliasing, but it looks nothing like native 4K.
If you do this often I recommend your next purchase to be 4K.
Sort of. It’s never quite going to be 1:1, as you can’t really match the amount of pixels on the screen. There’s also the issue of the distinctly AI processed look DLDSR tends to add, especially when you apply higher sharpening. It also tends to have a higher performance hit than native 4K (varies from totally negligible to pretty significant), tends to be finicky with some games, doesn’t support RTX HDR, and isn’t supported in some of these OLEDs.
DLDSR is great, but only if you already have a 1440p monitor. If you’re in the market for a new monitor, and you’re debating between 1440p and rendering at 4K with DLDSR and 4K native, it just makes more sense to go with the 4K one. If you’re going to be rendering games in both at 4K, just get the 4K native one. The only case I can see for 1440p OLEDs is competitive gaming.
@@thedisplayguy I’ve got a 1440p 27” oled and a 4k 27” mini led sitting side by side and when using dldsr on the 1440p display, they look eerily similar when gaming (most games… FFVII and Forza: no. Every other game, yes; I also have 4k 55”, 65” and 77” oleds which I also game on fwiw).
Give it a try (gaming, not desktop use). 2.25x dldsr and 100% smoothing.
24 monitors are just small for me. I'll always choose 32+
@@noidsuper I agree with you almost 100%. I use no sharpening (100% smoothing) which completely gets rid of the ai processed look to my eye. And on a 4090 I’ve never found the performance hit significant, even in aaa games (I also have a 4k panel sitting next to my 1440p one).
Problem with going 4k native is price and size. There are no 4k 27” oled’s for those of us who prefer that size. Also, I’d argue that the asus xg27aqdm is the best 27” oled out right now and it gets brighter than the 32” oled’s with less abl while being quite a bit cheaper. It supports dldsr but not having rtx hdr with dldsr (for the time being) is a bummer.
My only concerned about 4K is the latency problems like many other UA-camrs have mentioned which means it's not really great for competitive play can I typically like to sit close to the monitor which I feel 4K is better suited for bigger monitor screens better positioned away from people but on the other hand going to 4K would mean less of a CPU bottleneck scenario with my 5800 ex 3D
what latency problems?
@@Ziyoblader...?
I don't know why I watch so many monitor reviews that talk about refresh rates so much. I don't even play competitive games anymore and it won't matter in my everyday usage at all...
None of these just get the 2k 480hz 😂
I'm waiting 4 a list of 4k oled for video editing
pg32ucdm
I would love for somebody to objectively quantify how many more kills or wins they get on a 27" monitor compared to a 32". I bet it's negligible at best. Same thing with 360 Hz compared to 240 Hz.
The OLED burn-in issue is not satisfactorily solved. Until it is, I'm not going there.
Not a problem with my 6090.
I get like 300+ fps in the finals with an Intel processor at 1440p
This guy is losing his credibility so fast lol
why is so?
All to expensive.. Looking at MSI mag 274upf 4k 144hz for about 540 USd😋