Thank you to Ruipro for sponsoring this video!
Buy the Ruipro HDMI 2.1 Certified Fiber Optic Cable (6FT): amzn.to/3wvHB7j
Buy the Ruipro HDMI 2.1 Certified Fiber Optic Cable (25FT): amzn.to/432NDGS
Buy Ruipro Cables: ruipro.store/displayguy
Best Monitor Settings Here (Guides & Discord): patreon.com/TheDisplayGuy
At a desktop viewing distance, you do not need the eye-searing brightness levels of an OLED TV, because you are not sitting several metres away; you are sitting inches away. A bigger number is not always better. If you are going to spend 3-4 hours a day gaming, trust me, you do not want 1000-nit highlights bombarding your retina all day.
Even the low 40%-window brightness of 380 nits on current 4K OLEDs is too bright. I had my RTX HDR set up wrong in Warcraft and a body of water was glowing so hard I squinted... in a dark room, that is.
My thoughts exactly.
We want the OPTION to not have the EOTF of our videos destroyed. Is that insane to ask?
Cope. You're missing color brightness and contrast; 1200+ nit OLED is incredible and life-changing.
True. I have the 42" LG C4 OLED, and it's so bright that I always keep it on dark mode level 2.
It always bothered me how smartphones and TVs are able to achieve such high peak brightness but PC displays can't. Hopefully this tech gets better over time.
WDYM? PC displays can achieve high brightness... You just need to buy one with high brightness; there are plenty of QLED options with high brightness around.
@@marcosvictor4935 And those come with all the downsides of LCD displays (poor black levels, or blooming in the case of Mini-LEDs). The point he's making is that phones have super bright AMOLED displays, which have the advantages of OLED without the poor brightness of OLED monitors.
It's for sure the reason the 32" 4K OLEDs didn't end up at $1500 or more. Like, bro, give me a brighter OLED monitor! I will pay $1500 for it. Put a fan in it, I don't care, just give me the option at least!
Name one @@marcosvictor4935
@@crowntotheundergroud They don't even need fans. They just need better heat sinks, and the manufacturers are determined to hold the technology back so they can milk you when they're ready to put out 1000-nit-plus screens.
The OLED community has extreme buyer's insecurity. You can't criticize any aspect of their expensive TV/monitor without them having a tantrum. It's laughable that people defend ABL with arguments like it protects your eyes, or that a full screen of 250 nits is so eye-searing that they have to turn the brightness down. I say this as someone who's owned two OLEDs. On balance they're my favorite panel type but they're a long way off from perfect.
@burchmore5000132 You can criticize, and we are allowed to disagree respectfully, since this is a discussion. It's not buyer's insecurity but more like personal preference, as people like me don't like bright screens. I even had to return my OLED due to how hard the glossy screen was to clean and how easily it scratched. So yeah, they are imperfect, but brightness is not what I was missing.
@@technonecroplays Dude, we're talking about HDR brightness specifically. The video even has HDR in the title. There's a reason the HDR in monitors is a joke: monitors can't get bright enough in HDR. And guess what? HDR is a selling point for OLED.
ABL? What are you on, 2023? ABL is old news. You do realize that after it got negative reviews back in 2023, new 2024 models and onward have disabled ABL.
I'm an OLED fan, and before judging you need to have the latest monitors to make a critique or a proper comparison.
I bought the 4K 240Hz Alienware QD-OLED for 1K USD and the brightness SUCKS compared to my cheap AOC VA panel.
OLED gaming monitors are unfinished tech, and it's a complete joke that any company thinks they're in a suitable state to put on the market. Most content creators who release videos on OLED monitors are just hyping them up to farm affiliate-link money.
@@seannightwal1765 You have absolutely no idea what you are talking about. Every single OLED monitor and TV has severe ABL. It is a fundamental limitation of OLED technology. Mini-LEDs have ABL too, but to a much lesser extent. Even $30k reference monitors have some degree of ABL. Unless you have one of those $100k Micro-LED TVs, your panel has ABL too.
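For readers wondering why ABL is considered fundamental: the panel has a roughly fixed power and heat budget, so the more of the screen that lights up, the less luminance each area can sustain. Here is a minimal sketch of that idea as a constant-power model; the peak and full-field numbers are illustrative assumptions, and real firmware curves are more complex and panel-specific.

```python
# Toy ABL model: a fixed power budget means sustained luminance falls
# as the lit fraction of the screen grows. Numbers are illustrative.
PEAK_NITS = 1000.0        # assumed small-window peak
FULL_FIELD_NITS = 250.0   # assumed sustained 100%-white level

def abl_limited_nits(window_fraction: float) -> float:
    """Sustainable luminance for a white window covering the given
    fraction of the screen, under a constant-power budget."""
    if not 0.0 < window_fraction <= 1.0:
        raise ValueError("window_fraction must be in (0, 1]")
    budget = FULL_FIELD_NITS * 1.0  # nits x area, normalized to full screen
    return min(PEAK_NITS, budget / window_fraction)

for w in (0.02, 0.10, 0.25, 0.50, 1.00):
    print(f"{w:>4.0%} window -> {abl_limited_nits(w):6.1f} nits")
```

This is also why reviewers quote 2%, 10%, and 100% window numbers separately: under a power budget, small windows hit the panel's peak while full-field sits far lower.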
Honestly, for gaming I wouldn’t pick anything other than OLED, even if HDR performance isn’t amazing.
I have yet to see a good motion performance display with good enough FALD that blooming isn’t a major issue.
Sure, large TVs can have insane zone counts at 4K resolution and pretty much eliminate blooming on most scenarios except the absolute toughest.
But at a display size I would actually use (27”) at a resolution that is reasonable to run (1440p), I can’t see any realistic monitor option other than OLED.
OLED is king still.
@@thedisplayguy was ^^
HDR performance on OLED IS amazing, and black quality, micro-contrast, and viewing angles are a lot more important than peak brightness.
Amen brother! 27” is the biggest size you can go for a normal desk. 4K and 1440p are both ideal on 27”
Also, mini-LED doesn't force you to live in a cave; it stays usable even with direct sunlight hitting the screen.
So your ideal OLED is an iPad Pro OLED.
The M4 iPad Pro and Dell XPS have the best OLED tech to date.
Tandem OLED is an innovation that is less prone to burn-in and reaches high brightness like mini-LED, because it stacks two emitting layers. Future MacBooks and Pro Display XDRs should get it, maybe by 2026-2027.
I have no idea what monitor brightness should be; double the brightness of current OLED seems like a good idea to me. Just make sure monitors get that brightness wheel back, that was sweet back in the day.
I also remember when 60fps was fine, then 144fps, and now 240fps. My point is that when I see a better thing I know I want it, but I cannot imagine it right now. Blissfully ignorant, you might say.
But I do think power draw could be the limiting factor, and there might be physics limits to how bright a pixel can get. I know more efficient OLEDs are coming, because without that we are stuck at 1000 nits on a 2% window.
What are your thoughts, guys, on the Samsung Odyssey Neo G8 4K 240Hz mini-LED? I will buy a new monitor in the near future; what do you suggest?
I have one and it's pretty solid
Yeah I kind of regret purchasing my OLED monitor whenever I open anything with HDR.
Calibrate it properly and it looks amazing.
@@cheeemzy6651 No it doesn't. OLEDs have their clear advantages; I've used them for years. But I purchased an 85-inch Sony Bravia 9 mini-LED, and my Samsung S95C QD-OLED gets totally blown away in HDR. Micro-contrast is of course superior on the OLED, but if you really want that HDR "pop", no OLED can deliver it.
My Bravia 9 reaches almost 3500 nits on a 10% window and 900 nits full field. My S95C barely gets above 250 nits full screen and 1350 nits at 10% (which dims in a matter of seconds down to 600-700 nits...), and it clearly shows. Infinite contrast can't make up for all the deficits OLED has. ABL and SABL are ridiculous in games and certain scenes. You don't know it because the OLED shills and paid reviewers seldom mention it.
If you use HDR, you don't deserve a good display.
@@drunkhusband6257 LOL...HDR IS the single biggest advance in TV tech in many years.
@@Cenkolino You must be blind and dumb, hdr is horrendous
I bought the 32-inch curved Alienware OLED and returned that sucker so quick. The only thing it had going for it was the response times. HDR was a joke.
OLED monitors are terrible at peak brightness right now. That's why I went with an S95C for gaming and content consumption
S95C? Which model is that? Please share the name and model 🙏🏻
@@yakimchuk It's a Samsung TV.
Yeah, I just got an S90D for under a grand; best display I have.
If you had a brighter display you'd get headaches. I already get headaches after long gaming sessions on my AW3225QF.
450 nits on a 10% window is kind of trash. This weak-ass brightness is for sure the reason they couldn't charge over $1500 for these monitors. But give us more options and charge that $1500!
Honestly, I understand how the benchmarks were done, but my AORUS FO32U2P has KILLER HDR, and all that brightness burns my eyes.
How bright before these screens actually do start to hurt our eyes like our parents warned us about? I mean, we really don't want to actually damage our retinas here. I don't need my screen to look like the sun or a welding torch or one of those ridiculous super bright flood lights or a laser dot or something.
Until it's brighter than the outside world, which it isn't.
@@aubrynobicop6924 Even half that could be a problem if the image was dark and suddenly flared up to full brightness. After all, our eyes adjust to let in more light indoors. I don't want a scene in a pitch-dark cave that suddenly cuts to a flashlight beam to make me wince and hold a hand in front of my face to block my screen's light.
@Dark_Jaguar I just don't believe you have ever walked outside if a monitor in a dark room hurts your eyes even before adjusting. I have woken up in pitch dark, opened my phone, and never had to shield my eyes from the brightness, and my phone gets pretty bright.
A peak brightness of 2500 nits is what my phone gets, too.
@@aubrynobicop6924 I'm talking about the trend now and where it could lead, not something I've encountered, so cut it out with the casual insults.
@@Dark_Jaguar no one is forcing you to buy one
This is why I chose a mini-LED 4K monitor: awesome HDR plus no risk of burn-in. I don't want to spend all that money just to babysit a monitor in the end.
This is why I'm keeping my mini-LED even though I have one of the new OLEDs... Mini-LED HDR brightness is so good it's like crack. It blows OLED out of the water in anything HDR.
@@theendurance Yeah, I have both also, and the mini-LED brightness just takes the cake for me. Blacks also look totally black and satisfying.
I don't get HDR. I turn it off... I can't see. My Samsung QLED is turned down below 15% brightness for normal use.
Idk if I agree, I usually play in a dark gamer cave so I rarely need more than a few hundred nits. HDR1000 already looks punchy enough imo, I just wish they'd fix the bright scenes dimming on QD OLEDs, that's the only major problem that bothers me. But I guess having an even brighter HDR mode would be useful for those who like to game in brighter rooms or whatever, dunno.
I prefer the monitors in this regard. The difference the TVs are exhibiting between full screen and a 10 percent window should not be so extreme; that kind of ABL has its own weird look. If the full-screen ability is only 280 nits or so, the peaks should not be too much brighter. 400 to 500 is about right.

The problem is that when the peak brightness of highlights gets brighter, it becomes harder for bright scenes to look bright enough. You can't do a daylight scene where most of the screen is limited to 270 nits while highlights get up well over 1000 nits. That just doesn't work perceptually, and I see it every time I look at OLED TVs, or any TV with a lot of ABL. If a highlight can get bright, the whole screen needs to get reasonably close to that peak brightness for bright daylight scenes not to look eerily dim.

I think 1000 nits full screen is enough, and highlights don't need to be any brighter than that either. Zero ABL is ideal. There are real visual advantages to compressing realistic brightnesses into easier viewing dynamics, so I see no need to go to 4000 or 10000 nits, not that it would be bad; it's just a waste of energy. Much of real-world lighting is excessively bright and contrasty and hard on the eyes. Compression of brightness and contrast is an effective artistic effect, and scenes can look gorgeous with 1000 nits of brightness. A lot of people wear dark shades on sunny days, and they don't get the benefit of a tailored tone curve to make the scene look its best to the human eye.
If you think 1500 nits looks 3x as bright as 500 nits, then bro needs to do more research on how the luminance scale works.
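For a concrete sense of the scale: perceived brightness grows roughly with the cube root of luminance (Stevens' power law; the exact exponent depends on the stimulus and viewing conditions, so treat this as an approximation, not a measurement):

```python
# Rough perceived-brightness comparison via Stevens' power law
# (exponent ~1/3 for extended luminous fields; an approximation only).
def perceived(nits: float, exponent: float = 1 / 3) -> float:
    return nits ** exponent

ratio = perceived(1500) / perceived(500)
print(f"1500 vs 500 nits: ~{ratio:.2f}x apparent brightness")  # ~1.44x
```

So a 3x jump in nits reads as well under 1.5x brighter.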
2000 nits at 1% and 600-800 nits at 100% would be perfection in a monitor.
I think OLED tech may only be able to get so bright, just like mini-LED can only have so many zones.
TrueBlack 400 is more than enough for me on a 32" 4K monitor. I usually play in the dark, but it's enough even in daytime! I really can't understand why people need thousands more nits...
Agreed. My experience has been that brighter would be more "stunning" but also not comfortable for actual use.
Take a look at the Asus ROG Strix OLED XG27AQDMG, for example.
1440p 240Hz WOLED with MLA, 8xx nits at 10% and around 12xx nits at 2% when set up for highest brightness, and all that for around 600-700 bucks.
For me it's the best of both worlds: perfect blacks again, but highlights can still make me squint like my mini-LED does.
Got the same monitor and have the same opinion of it.
43QN90D or 43QN90C??
I recently purchased a mini-LED to upgrade from a regular IPS LCD with entry-level HDR 400, which was fine for what it was and offered; it was a good monitor, all things considered. And I could not be happier now with that change. I have the Xiaomi Mini LED Gaming Monitor G Pro 27i, and it is not a perfect monitor by any means, but the value and specs in general are just insane. I do plan to upgrade in a few years to an ultrawide OLED monitor, and waiting will give OLED tech more time to be polished and improved, and prices more time to come down.
No data on the peak 1000-nit mode? I've been wondering at which window sizes modern OLED monitors would get past 400 nits... Not going past 400 nits in HDR400 mode is just... I mean, it's in the name, what did you expect? Do you plan on making a separate video showing that?
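If you want to probe this yourself, one approach is to display centered white windows of various sizes and read each with a luminance meter. A minimal sketch with Pillow follows; the resolution and window sizes are assumptions, and note that PNGs like these are SDR, so a proper HDR measurement needs an HDR-capable pattern generator or app.

```python
# Generate centered white-window test patterns for luminance testing.
from PIL import Image

W, H = 3840, 2160  # assumed panel resolution
for pct in (1, 2, 5, 10, 25, 50, 100):
    img = Image.new("RGB", (W, H), "black")
    # An "N% window" covers N% of screen AREA, so scale each side by sqrt.
    scale = (pct / 100) ** 0.5
    w, h = round(W * scale), round(H * scale)
    img.paste(Image.new("RGB", (w, h), "white"), ((W - w) // 2, (H - h) // 2))
    img.save(f"window_{pct:03d}pct.png")
```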
Can't they just make a monitor that's 2x or 3x bigger and add a fan inside, air-cooled or even water-cooled for that matter, to carry the heat away? Anyway, people, I need help. I want a new future-proof screen, hopefully one with HDMI 2.1 and DP 2.1. But my RTX 4080 GPU has an HDMI 2.1 port and DisplayPort 1.4a. Which should I choose? HDMI 2.1 is more bandwidth, but everyone wants the DisplayPort.
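On the HDMI 2.1 vs. DP 1.4a question, the raw numbers help: HDMI 2.1 FRL signals at up to 48 Gbit/s (roughly 42.7 Gbit/s of payload), while DP 1.4a HBR3 signals at 32.4 Gbit/s (roughly 25.9 Gbit/s of payload), though both support DSC compression. A back-of-envelope check, ignoring blanking overhead (real links need somewhat more):

```python
# Back-of-envelope uncompressed video bandwidth (blanking ignored).
def video_gbps(w: int, h: int, hz: int, bpp: int = 30) -> float:
    """bpp=30 corresponds to 10-bit RGB."""
    return w * h * hz * bpp / 1e9

print(f"4K 240Hz 10-bit: ~{video_gbps(3840, 2160, 240):.0f} Gbit/s")  # ~60
# HDMI 2.1 payload ~42.7 Gbit/s, DP 1.4a ~25.9 Gbit/s: neither carries
# 4K 240Hz 10-bit uncompressed, so such monitors lean on DSC.
```

So on an RTX 4080, HDMI 2.1 has the bandwidth edge, and DP 1.4a reaches high-refresh 4K only with DSC.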
The only real advantages of mini-LED are high brightness and, for some, no background worry about burn-in.
Disadvantages: slow response time (the backlight grid needs time to dim and brighten), low refresh rates, contrast that still loses out to OLED, FALD that is annoying in desktop mode (so you need to switch modes depending on what you do), and heat build-up. And for the future: increasing FALD zone counts increases the price exponentially.
3:08
Bro, not even the HX3110 can do 4000 nits full field 😭😭 The ideal display sounds cool, though. I think 1000 nits full field and 4000 nits at 10% is the ideal display, for now I guess. Still, a TV that can do 1000 nits full field and has the contrast of OLED is going to be perfect for nearly everything.
You also have to understand that although the outside world can be 30000 nits, everything out there is bright as well. HDR material is meant to be watched in the dark, so 30000 nits in a dark room is going to be like a flashbang, you know? It's not an ideal comparison.
I still think we should have the ABILITY to choose brighter screens and then TURN THEM DOWN in a setting; that way we get both super-high brightness for people who want to destroy their eyes and high brightness for people who don't.
I have to disagree with your sentiment about OLED monitors having terrible HDR. I have the ASUS PG32UCDM, which you have reviewed, and it gets plenty bright for 95% of users out there. For reference, I also have a Bravia 9 75", which most definitely gets brighter. But... I don't keep the brightness cranked on the Bravia 9 either. There's no point, IMO. The current brightness of the UCDM is great. I'd be ok with maybe 20% more brightness, but after that, I'd end up turning it down.
I upgraded to the UCDM from a 43-inch 1000-nit Gigabyte VA panel, and the brightness and contrast blew me away. I always used to crank the white point in games to 100%, but now I don't need to get remotely close before the brightness becomes painful. Anyone who says the HDR on the PG32UCDM is awful has strange taste.
HDR looks like shit anyways, disable it
@@drunkhusband6257 You picked the weirdest thing to troll people over.
imagine casually opening a web browser @4000+ nits in light mode without bracing yourself.
I actually prefer the VESA 450-nit mode that most QD-OLED PC monitors come with. It's kind of bad for HDR, to be honest, compared to an OLED TV.
But it's very consistent and uniformly bright, without the ABL you're more likely to notice when sitting so close to an OLED monitor.
I think now is the time to stop crying about brighter OLED monitors... They're bright enough already. When you sit in front of a monitor with the screen literally in your face, you cannot have 1000+ nits... If I play on my TV from the couch, that's a different story. When I play at night I need to lower the brightness on my OLED monitor, otherwise my eyes will burn out, and we're crying here for 1000+ nits? Just ridiculous.
The Alienware AW3225QF is capable of 1100 nits peak brightness in HDR. My TrueBlack 400 color profile hits 460 nits. I can't complain at all, to be honest. Everything looks great in HDR or SDR. I don't experience issues viewing during the day, and I'm facing two windows, to put things into perspective, but OLED really shines in darker environments.
Turning off ASBL on my Samsung G8 in the service menu fixed all of the dim HDR performance problems I was having. I know that's not an ideal solution for everyone, but it did work for me. Just be aware and don't leave static images up on your screen, but I think that's OLED ownership 101 stuff.
This is why I don't buy into all the over-hyped HDR this-and-that.
I'd honestly say HDR is underhyped. If you haven't seen good HDR, you really don't know what you're missing. For me, I can't ever go back.
@Fetchdafish That's the problem: so many fake and bad HDR implementations.
@@Fetchdafish HDR looks like ass. I hate it. Worst tech to come out in years
@@drunkhusband6257 If HDR is the worst, then things must be pretty good in the tech industry.
@@Fetchdafish There is literally no advantage to HDR over properly adjusted SDR, absolutely nothing.
Meanwhile in phone land: 3000 nits fullscreen at 5x the pixel density, fully glossy with almost no bad reflections.
BTW, why aren't ASUS & co. selling water-cooled OLEDs already?
This is why I held off on buying an OLED monitor. They're doing fine at 2% and 100%, but the arguably most common real-world use case at 10% is lagging far behind TVs. My 2022 OLED TV with heatsink would make a 2024 monitor look disappointing. Currently using a cheap 1440p 240Hz VA mini-LED while waiting for OLED monitor panels to improve their 10% brightness.
Owners of OLED monitors who care about motion perf above all and play dark games (in a dark room if QD-OLED) still have a good case for their purchase, though. Just ain't me; I share my room and prefer it softly lit anyway, and I don't play FPSs.
I got the 65-inch G4 in Black Friday month for $1999. It's a day-and-night difference between the CX and the G4. I love it!
OLED is still king for gaming. For movies, I think it's still a competition.
HDR = TRASH!
lol you’ve never seen good hdr
@@Fetchdafish I just don't like that companies focused their marketing on HDR, because there are much more important features to improve! WHO TF needs 1000 nits of brightness? A BLIND PERSON?
@@Fetchdafish No such thing as good hdr it looks like shit
@@drunkhusband6257 someone has a poopy monitor/tv :)
@@kajzar I have 3 top end displays. HDR is shit, and always will be.
Sometimes I feel like the only person who doesn't care about HDR. I literally have my TN monitor set to 25%, and that's already bright enough for my eyes.
The PG32UCDM has the best HDR I've ever seen, even better than the VA panel it replaced at a similar price. Honestly, more than 1k nits in your face really does seem insane to me; I already have to turn my brightness down in games. Also, why the insistence on HDR 400 True Black? That seems like gimping the brightness artificially, all for color and contrast accuracy many people won't notice.
3:03 That chart isn't really accurate. Because of how our eyes work, there's a bigger difference between 100 and 1,000 nits than between 1,000 and 2,000 nits. So that chart should actually be squished at the top, making QD-OLED look a lot better. It's still bad, of course, but it's not that bad.
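The squish this commenter describes falls out of the PQ (SMPTE ST 2084) curve that HDR signals are encoded with; here is a short sketch using the standard PQ constants to show how much of the signal range each step covers:

```python
# PQ (SMPTE ST 2084) inverse EOTF: maps absolute nits to a 0..1 signal.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 1000, 2000):
    print(f"{nits:>4} nits -> PQ signal {pq_encode(nits):.3f}")
# ~0.51, ~0.75, ~0.83: the 100->1000 step spans roughly three times as
# much of the signal range as the 1000->2000 step, matching the comment.
```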
Sooooo...
I got an MSI 32UPX - I just watched this on it, because YOU told me to get it. But now it sucks and I should replace it with a 42" C4!?? HELP :D
That aside - it is a serious question, asked in a goofy way.
I got a 77" S90C last year - was my first OLED... and it pretty much ruined me, since this year I went S90D 65" for my other place, as well as QD OLED Monitor.
I hate to admit that I just love it, even though that "Magenta Tint" Debate gave me a FOMO related headache for 2 weeks (Its not as bad as people make it seem, thats my take), I digz me those QD OLED Panels, since they hit different.
The Samsung TVs get stupid bright - like sure, my eyes could adjust to 10k Nits specular highlights, but when I look at those TVs, I am not going "Ah dang, that should be brighter, since my eyes arent hurt enough already.".
Coming to the Monitor, it doesn´t get the "same" brightness, but I am waaaaaaaaaaay closer than I would ever be to my TV, so the HDR still is subjectively very bright.
Since OLEDS come at the cost of having to somewhat control light, in order to get the best performance from the Panel (WOLEDs brighten up too in light) I wonder how bright they really NEED to be - since they have to compete against what!? 20-100Lux of light, with 100 ambient being too bright to really enjoy a movie or game properly.
So yeah, do you think that viewing distance and light-controlled environments kinda make up for the lower light intensity?
Also... should I keep that MSI or go LG TV?
Also also... when are those nasty 4,000-nit OLEDs ALLEGEDLY supposed to drop, rumor-wise? Are we talking 2025/2026-ish, or further down the line?
Also also also: keep doing the lord's work, and never get rid of that lovely Borat-Mercury moustache; it's pinnacle swaggery.
Cheers
I have OLED TVs for watching movies and such, and the picture is noticeably better than my QD-OLED and W-OLED monitors. I'm OK with that for now, just like I'm OK with having a matte screen. Intense image brightness is not good long-term when you sit 3 feet or less away from your display for hours at a time. I learned that the hard way after months of eye strain and occasional headaches using a glossy QD-OLED monitor since March. Everyone's mileage may vary, but I'm having a way better experience with the 32GS95UE people like to take a dump on, lol. No reflections and no raised blacks, no eye fatigue, and no headaches. Yesterday I played for hours with the sun shining through the window, and it was so relaxing instead of turning my office into a dungeon. WOLED wins quality of life for me, while QD-OLED wins overall image quality in perfect conditions, but I will say that side by side the difference was marginal.
That is why I am holding on to the Odyssey Neo G9 57" for its HDR brightness, and with DisplayPort 2.1 on upcoming NVIDIA cards I can finally run the monitor at 7680x2160 @ 240Hz.
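For context on why DP 2.1 matters for that panel, a rough bandwidth estimate (10-bit color assumed, blanking overhead ignored):

```python
# 7680x2160 @ 240 Hz, 10-bit RGB (30 bits/pixel), blanking ignored.
gbps = 7680 * 2160 * 240 * 30 / 1e9
print(f"~{gbps:.0f} Gbit/s uncompressed")  # ~119 Gbit/s
# Even DP 2.1 UHBR20 (~77 Gbit/s payload) can't carry that uncompressed,
# so the link still relies on DSC; DP 1.4a (~26 Gbit/s) is far short.
```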
Mini-LED for the win. If I just tweak the NVIDIA settings for color, saturation, and brightness versus contrast, I can get to 90% of an OLED, with none of the side effects, and I'm able to watch it during the day.
That's just coping... mini-LED is weak af.
Yeah, mini-LED all day... A usable screen no matter the light levels in the room.
The HDR on the Samsung OLED 27 is really good. I've tried several HDR OLED monitors; so far this one has been the best.
Truly Hanzo and Moira gameplay.
Display Guy criticizes even the smallest thing. Does that actually help anyone make a purchasing decision? I say no. It just confuses people even more. 24 years ago there weren't these YouTube misanthropists who badmouthed everything, but people were still happy with their purchases.
It's crazy to me that people think they need more brightness than what the current flagship OLEDs offer. They're literally eye-searingly bright unless you're trying to use one outdoors. Every day I see people complaining about brightness while simultaneously complaining about headaches and eye strain because their screens are already too bright.
We absolutely need more fullscreen brightness. A high APL image on an OLED looks pathetic next to a mini-LED showing the same. Not even remotely "eye searing".
@@burchmore5000132 It's true, mini-LED and standard LED can get WAY brighter, but there's a certain point where it just doesn't matter, and we're already there.
OLED HDR really isn’t bright at all
OLEDs are not bright compared to most LEDs and Mini LEDs lol
@@ThreshyNeonz bright enough though
No, I don't want that much brightness arms length from my face in a dark room
I just need a 27-inch 4K QD-OLED monitor with the same HDR brightness as my 42-inch LG C3.
Max brightness as a measure of better HDR is total marketing c**p.
I have a mini-LED with 1400 nits max brightness. It's good for me because I also use it for work and I fear OLED burn-in, but OLED is still a lot better for image quality, viewing angles, blacks, micro-contrast, and the overall smoothness of the experience.
It also depends on the game. In Outlast Trials and some other games, the HDR is amazing!
Finally someone says it. AOC Q27G3XMN superiority
OMG thank you! OLED monitors are too dim! I’m hoping for brighter panels for 2025
I have a mini-LED with 1400 nits max brightness and use it with HDR off.
It simply looks better.
HDR is becoming the new 3D:
overly hyped and almost never implemented properly.
Did you seriously just equate nits to "performance"? That is the most moronic take I have heard on the topic of displays, period.
who tf is using this level of brightness 💀
Don't believe a word the Display Guy says, he's lying like a dog. He's only interested in clickbait.
Numbers and more numbers. Mini-LED having better contrast than OLED? WTF...
Don't hate me.
After using a Samsung QLED for a few years and upgrading to the LG C1, I think the LG C1 is just about perfect. ABL dropping full-screen white to 130 nits is a tiny bit excessive, but 200 nits full screen is probably as much as I would ever really want, and I know for a fact my 500-nit full-screen Samsung QLED is too much brightness, so I'd take 300 nits full screen at the absolute maximum.
With it being so close to your face, you do not need that level of brightness. It's damn near blinding. I have a PG27AQN that I keep at 60% brightness because of how bright it is. I plan on upgrading to the PG27AQDP, which is 450 nits. I game in a dimly lit room, and 450 is perfectly fine. The HDR on the AQDP is around 1300 nits peak brightness. I think a good standard brightness for OLED monitors would be 600-700 nits max.
Yeah, they are too dark; that's why I play on an S95B ✌️
The Asus XG27ACDNG QD-OLED has excellent HDR.
Nah, my MSI 321UP strains my eyes in bright scenes. It's good enough.
LOL! I played with 1000 nits and it's by far too bright for your eyes! The Display Guy is sometimes the Badplay Guy xD
Still better than mini-LED haloing, lol.
A max of 250 nits full-screen brightness is enough to burn my eyes to hell and back.
We just need mini-LED displays with 40k local dimming zones to beat OLEDs... I suspect nobody has the processing power to pull that off in a consumer product.
Only micro-LED can... mini-LEDs are nowhere near OLEDs.
I honestly find my AW3225QF gets bright enough that in some games I have to turn the brightness down. Who's craving more brightness?
I have the same monitor and I agree. I upgraded from a mini-LED Neo G8; it was brighter, but unnecessarily bright. The HDR looks better on the AW3225QF by far. The detail and the blacks are insane.
Woke people and dropshipping gurus who use their computers at a standing desk with the sun right beside them.
@GeeDeeDee Tell me you've fallen too far down the right-wing pipeline without telling me you've fallen too far down the right-wing pipeline. Not everything you don't like is "woke".
@@OrlandoCaba dude just needs a bullshit reason to make a video. Contrast matters far more than full peak brightness. He's comparing them at their absolute worst in a manner that doesn't mean shit
@@veilmontTV I'm not even American. Don't drag me into the cesspool you belong in.
OLED fanboys really coping hard with the "I don't need eye-searing brightness, I use my monitor in a dark room!!!11!!!". Just say you have no idea how HDR works and go. HDR is mastered on 1000- or 4000-nit displays, and therefore requires a 1000+ nit display to show what the scene actually looks like as the creator mastered it. There are ZERO OLED monitors or TVs that do more than 300-400 nits at full screen or even a 50% window. The new OLED iPad is the largest-brightest OLED display, and it's still only 13". OLED monitors and TVs are probably still 5-10 years away from catching up with mini-LED for HDR. Currently, only mini-LED monitors and TVs have real HDR capability.
As someone who owns both a Bravia 9 Mini-LED (the most advanced on the market, btw), and an ASUS PG32UCDM, mastering means very little unless you're viewing the content in FILMMAKER mode or equivalent; modern TVs are going to tone map based on what the TV and user prefers. OLEDs have the advantage of being able to push point highlights far higher than mini-LED, while mini-LED can push full scene brightness. A mastering monitor can do both thanks to dual-layer LCD tech, but current consumer TVs can't. So either you pick full screen brightness, or point highlights; you're not going to get both.
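For readers wondering what that tone mapping does mechanically: when content is mastered brighter than a display's peak, the display passes most of the range through and rolls highlights off toward its own peak instead of clipping. A minimal sketch of that idea; the knee position and roll-off curve here are illustrative, not any specific TV's algorithm:

```python
# Minimal highlight roll-off tone map: pass-through below a knee, then a
# smooth compression of the mastering peak into the display peak.
# Real TVs use more elaborate, often scene-adaptive curves.
def tone_map(nits: float, display_peak: float = 450.0,
             mastering_peak: float = 1000.0, knee: float = 0.75) -> float:
    start = knee * display_peak           # below this, show 1:1
    if nits <= start:
        return nits
    t = min((nits - start) / (mastering_peak - start), 1.0)
    return start + (display_peak - start) * (t * (2 - t))  # ease-out

for n in (200, 450, 700, 1000):
    print(f"{n:>4} nits mastered -> {tone_map(n):5.1f} nits shown")
```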
The point is that there is a lot more to image quality than peak brightness, and aside from brightness, mini-LED loses to OLED on every feature.
There's no one "coping" except you. OLED displays can achieve significantly higher peak brightness (up to 1,000-1,500 nits) in smaller highlight areas (e.g., a 10% window), which is crucial for HDR performance. So mini-LED certainly does not show what the creators intended either.
What is with this bizarre hostility and anger? This is panel preference not politics. You’re also just wrong, HDR is more than just brightness, the whole point is contrast which OLED does better than anything. My current OLED monitor has the same 1k nits peak brightness as my last monitor, a VA panel. But on the OLED I actually have to turn the brightness down in games which I never had to do before. The contrast and clarity is so much better on the new monitor at the same resolution and almost double the frame rate, and the contrast makes it literally the best HDR I’ve ever seen, not much worse than my OLED tv in the same room. How many people need to say “My personal experience is different” before you realize not everyone who disagrees with you is coping?
Glad I didn't go for OLED. I am still rocking my ASUS mini-LED. I just don't like the halo effect in dark areas and the IPS ghosting.
OLEDs are better.
@@NexGenTek They are definitely not better. They suck for HDR and only last 2-3 years before getting burn-in. Mini-LED is overall better for most use cases.
@ Actually, HDR on a QD-OLED looks better, because mini-LED monitors have haloing, light bleed, terrible uniformity, and slow response times. It's not even close.
Yeah, I agree. I have a new-gen 32" 240Hz and the HDR is super weak; it doesn't even get close to my LG CX TV from 2020.
My PG32UCDM seems noticeably brighter than my 17k hour LG CX 55".
@@madpistol It's not even close. Do an HDR calibration on both and see what peak nits you get; you'll see.
So glad I didn't pull the trigger. Though blacks turning purple is a deal-breaker for me, I still considered going for the Alienware ultrawide because of HDR, instead of the ASUS XG27AQDMG.
Watching this on my new Lenovo Legion Y34wz-30 mini-LED monitor!
I had a 2000-nit Samsung Odyssey G9 mini-LED; now I have the Alienware QD-OLED. The Alienware looks way better in both HDR and SDR, and the specular highlights look way better and brighter on the OLED, even though on paper the stats say otherwise.