Love these updates. I really hope they encourage companies to support and improve their products. Some of these updates really change the value proposition and market competitiveness of the products.
Nvidia sadly isn't known for encouraging user-friendly backwards compatibility with tech, especially when it's licensed software.
Monitor firmware should be updateable via a USB port baked into the monitor, with an option in the OSD to update from USB. There is no excuse for making it possible only via a certain GPU and Windows.
Exactly, that's how it's mostly done anyway. This is so strange.
Nvidia helped them with it, so I'm not surprised it's locked to Nvidia.
But why would you buy a monitor that's $200 more expensive with the G-Sync module without having an Nvidia GPU... this monitor is built for Nvidia users, so yeah, it kinda is excusable.
Agreed, I can't even get the update to work off of my gaming laptop because I have to use a USB-C to DP custom cable that isn't recognized by the firmware updater since my laptop doesn't have DisplayPort built in.
Yep! That's what I said a week earlier on this channel. We should be able to plug in a USB thumb drive and use the OSD to launch a firmware update. Having to depend on a certain OS and hardware for compatibility is not acceptable.
This channel is so underrated. 🤯
I'm happy to see some (although, not enough) monitors seeing continued support this year; something generally unheard of. And large updates like this today set a performance standard for future monitors to meet. I'm glad to finally see coverage of this tech!
I've been waiting for this one :) I'm looking forward to using ULMB 2 as a CRT enthusiast
2:24 Owner of this monitor here! :D
I just recently bought it in April 2024, and I noticed that my unit already came with ULMB 2. I think this is because the monitor has been on the market for more than a year, so it now ships with the firmware updated. Therefore, don't worry: there's a pretty high chance that if you buy this monitor today or in the future it will come updated, making an Nvidia GPU unnecessary for the AMD or Intel users out there. 👀😉👌
How’s your experience with it so far? Getting mine this week :)
Love the high standards of testing! I hope y'all review the HP Omen 27qs and MSI G274QPX. They're newer budget 1440p 240Hz monitors that I'd really like to see the response times and ghosting values for :)
The next step for backlight strobing seems like it will be to implement a scanning backlight, to sync the strobe pulse up with the scanning output of the GPU across all portions of the frame. Presumably this would be possible with modern FALD backlight monitors (especially high zone count ones).
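To make the timing idea concrete, here's a toy sketch of how a scanning backlight could stagger per-zone strobes behind the scanout (all numbers here are hypothetical, purely to illustrate the concept, not any vendor's actual algorithm):

```python
# A toy timing model: each horizontal backlight zone strobes only after
# the scanout has reached it and the pixels have had time to settle.

REFRESH_HZ = 360
SCANOUT_MS = 1000 / REFRESH_HZ   # ~2.78 ms for scanout to sweep top to bottom
ZONES = 16                       # hypothetical horizontal zone count
SETTLE_MS = 1.0                  # assumed GtG settle time before strobing a zone

for zone in range(ZONES):
    reach_ms = SCANOUT_MS * zone / ZONES  # when scanout reaches this zone
    fire_ms = reach_ms + SETTLE_MS        # strobe once pixels have settled
    print(f"zone {zone:2d}: strobe at {fire_ms:.2f} ms into the frame")
```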
That was a cool ad from Steve. I'm not in the market for a webcam, but the demo made it interesting and I didn't skip it.
It's crazy that Nvidia and others invest so much in the best possible motion clarity, and then random games come along with completely busted forced TAA and undo so much of that progress.
2042 🤮
@@yoked391 Halo Infinite 🤮
@@existentialselkath1264 You're a bit misinformed on the topic. Most games that come with forced TAA do so because the assets and textures used don't work well with aliasing. Some games look great with aliased corners, some just don't. That's why the devs force it sometimes. And I hate TAA too, btw.
@@luxemier I don't know how you can think I'm misinformed when I never said anything to the contrary; I already know all this.
Game engines use TAA to smooth over sub resolution effects, other optimisations, and dense sub pixel geometry, but that doesn't mean it can't be turned off anyway.
Take Unreal Engine 5: turning off TAA is as simple as a single command. It looks shimmery and low res, but supersampling clears it right up (if you have the power, a lower-res monitor, or just a future PC).
I wouldn't want to play most modern games without some form of TAA, though I'll take DLAA over whatever most modern games are using.
It won't be long before hardware is more than powerful enough to supersample for perfect image quality and motion clarity, but many games will never have that option.
Well NVIDIA has a solution for that too with DLSS. I'd almost think it was a conspiracy if I was less familiar with the tools.
The best monitor reviewer posted again, let's gooo
This channel is extremely underrated. Thank you guys for such amazing content.
It's completely changed the monitor. I can't believe how good it is. It's absolutely unreal.
Do you own it? If yes, can you tell me if it's bright enough with ULMB 2 enabled?
I have this monitor; playing at 360Hz is so buttery smooth it's unreal. Monitor of the year so far.
Hope it lasts more than 2 years though!
Had 2 ROG Swifts die and never again.
@@ShaneMcGrath. Had the first ROG Swift to come out and it's still working fine (though as a secondary monitor now)
Yeah, this firmware update took care of any complaints I had, it's truly god tier.
I find motion smoothness tops out at 120-144 fps for me. Motion clarity, on the other hand, goes way up with higher fps, especially with OLED. ULMB can get you even better motion clarity without the need for high refresh rates.
@@ShaneMcGrath. What in the world are you doing to your monitors Shane? I've had a ROG for 6 years no issue and another for 4, also no issue.
Thank you for the unbiased in-depth review! Bless you :)
Good to know the ASUS monitor division at least still has some competent people left.
It helps that it's priced $1200+.
@@Aggnog $1700+ here in Aus :(. It's about as expensive as some OLEDs....
@@Aggnog I know it's expensive, but the fact that they're actually doing something that's objectively an improvement, even though it looks worse on a superficial spec sheet, is a very nice step in the right direction for a market that is dominated by poorly optimized screens that deliver worse image quality at the expense of useless marketing numbers.
Nvidia's marketing is spot on this time when it comes to the "1000Hz-like" claim. This monitor's motion clarity is actually 1440Hz-like, or 0.7ms of persistence, based on the Blur Busters Law (1ms of persistence equals 1 pixel of motion blur when moving at 1000 pixels per second). This monitor can look better in motion than a 1000Hz sample-and-hold display.
And that's just at the default pulse width, it can actually go lower than that in this monitor.
The way this works is by flashing the entire screen to show the frame at the right time and then turning off the backlight. The frame visibility time is reduced to 25% at the default pulse width, achieving a 4x motion clarity improvement: 360Hz × 4 = 1440Hz-equivalent, and 1/1440Hz × 1000 = ~0.7ms of persistence.
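A quick sanity check of those numbers (using the Blur Busters rule of thumb quoted above, nothing measured):

```python
# Persistence math for a strobed backlight: the frame is only visible
# while the backlight is on, so persistence = duty cycle x frame time.

refresh_hz = 360
duty_cycle = 0.25  # default pulse width: backlight on for 25% of the frame

persistence_ms = duty_cycle * (1000 / refresh_hz)
equivalent_hz = 1000 / persistence_ms  # sample-and-hold rate with equal blur

print(f"persistence: {persistence_ms:.2f} ms")                # ~0.69 ms
print(f"equivalent sample-and-hold: {equivalent_hz:.0f} Hz")  # ~1440 Hz
```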
Yeah, but motion CLARITY does not matter.
What matters is motion lag; if you wave your hand in front of your face it will look blurry.
Blur is not a problem, input lag is.
Great video! Also love the even hoody strings
Amazing in-depth review, don't really see content like this anymore!
One of these decades, we'll get back to CRT quality. Getting there.
same 1 comment guy
I posted this already under another video: a new firmware (MCM104) for the PG27AQN came out recently. Could you test it? I would love your opinion and measurements on that. As far as I can tell, the color performance especially has improved by a lot. New sliders, "color vibrance" and "ULMB 2 Pulse offset", are available. The menu has been cleaned up. I'm not quite sure how to set ULMB 2 Pulse offset.
I can't use GPU scaling after this update, only display scaling works
Just ordered my AQN! Hype!
Can we get a review of the XG 2431?
I gave up asking for more IPS 1080p 240Hz reviews from Tim a long time ago (XG249CM, XG2431 and S2522HG). It's obvious that if a 1080p 240Hz monitor costs less than $350 it's not getting reviewed, no matter how good the specs or price-to-performance could be... just look at the last time a 1080p 240Hz IPS display was reviewed by him :/
1440p is the sweet spot
@@TheWayOfZorro That's why I have the LG OLED. And the AW3423DW. 1440p 240Hz OLED >
@@RathOX If only 1440p 240Hz+ OLEDs came in 23.8" or 24.5" monitor sizes, I fucking hate 27"+
@@shanksisnoteventhatstrongbruh Yeah, it's a shame it's not on the OLED roadmap either, sadly.
Whoa, Tim got a new guy to help out on the channel! So good to see the channel taking off enough to afford new hires. He looks like a Steve to me, so I'll call him Camera Steve. ;)
We need this tech on a high refresh OLED monitor
I think you can take Nvidia at their word with the 1000Hz clarity comment, or at least test for it. Blur Busters already puts CRTs and ULMB 1 @ 10% pulse width in the 1000Hz "holy grail" class of clarity.
2000th like was mine 😁
This comment section is full of AMD fangirls. Get a grip, gals: Nvidia locked you out, it ain't the end of the world. Like you all have that $1500 monitor....
9:02 I don't understand the wording here, could you please clarify if this is the result of testing in the normal or extreme mode?
16:08 Some guy on the Blur Busters forums claims he managed to use ULMB 2 on an AMD GPU by deleting the range limit numbers in CRU, or by disabling "include if slot available" below it. Would be interesting to see this re-tested, if not on this PG, at least on the upcoming PG248QP 540Hz.
Thank you, I can use it now with my 7900 XTX. All I have to do now is set a custom resolution of 1280x960, but I can't seem to do it: the OSD says 2560x1440, even though I have GPU scaling off in AMD software. I need to find timings for 1280x960 at 360Hz.
Great video and channel.
Am I the only one extremely impressed by that webcam quality
Love your reviews. Hope to see a review of INNOCN 27M2V soon
That webcam is amazing.
Thank you for this!
That was a brilliant review
Tim: contrast is often also affected when changing the response times of a panel.
So, did you perform a new contrast measurement after the firmware update? 😉
Congratulations, you're now big enough that companies are optimizing their product tuning to beat your benchmarks specifically. Fortunately your benchmarks are good enough at representing real-world performance that that's actually a good thing.
You say that, but it's better to play FPS games on my old 360Hz 1080p monitor than on my new 240Hz OLED.
The OLED appears to stutter, even compared to my 165Hz 1440p IPS panel.
There is no accounting for how an analog observer (a human) witnesses motion and how it's perceived.
It's only measured by other digital media, which is totally irrelevant; the only valid measurement digital media can make is how quick it is to respond to input.
Anything else is entirely made up, you don't see still images as an analog being.
@@konnj You are talking about judder. That's normal and you'll get used to it. OLED needs to be 1000Hz to get rid of judder, or specifically, 1 frame having 1ms of screen time.
I prefer OLED even with judder, as you'll simply get a cleaner picture nearly 100% of the time. Plus, that actually transfers to when you are playing the best-looking games at lower fps. The monitors in the video are nice, but how often are you actually going to achieve 1440p 360Hz outside of CS:GO or R6S?
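For context, the "1 frame having 1ms of screen time" point is just standard sample-and-hold persistence math; a rough sketch of the numbers (illustrative, not measurements):

```python
# On a sample-and-hold OLED each frame stays lit for the full refresh
# interval, so tracked motion blurs by persistence x eye-tracking speed.

track_speed = 1000  # px/s, the usual Blur Busters reference speed

for hz in (120, 240, 360, 1000):
    persistence_ms = 1000 / hz
    blur_px = track_speed * persistence_ms / 1000
    print(f"{hz:4d} Hz: {persistence_ms:5.2f} ms persistence -> "
          f"~{blur_px:.1f} px blur at {track_speed} px/s")
```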
@@jwhi419 I can easily get to the 300 in-game FPS cap in a game like Apex Legends with a 13900K+4090 on my 1080p 240Hz monitor (XL2546K). Also, I think a 4090 will be easier to control with an actual G-Sync module. That's why I'm considering upgrading to this monitor. You can definitely stay above 360 FPS in those games you mentioned and Overwatch. Plus it's probably a good future-proof monitor. Big product leaps in IPS monitors seem to have crawled to a halt lately, 540Hz monitors are nowhere close to being readily available, and most companies are pivoting their efforts to new OLED products that still have a lot of kinks to work out. It's going to be a long time until a new IPS monitor comes out that's better than the PG27AQN and actually worth the cost to upgrade.
Patiently waiting for a 1440p/4k 240hz OLED w/backlight strobing
Unticking the VRR boxes in CRU might fix the issue of ULMB 2 not working on Radeon GPUs, would be worth a try.
Next up should be BFI in their PG27AQDM!
This is the feature I want the monitor industry to focus on (and an open version for all GPU brands, though). It would be amazing if it had a 60fps mode for fighting games.
So, AQN or AQDM?
ULMB not working on AMD is just pure DRM purgatory territory. The OSD setting to disable VRR you speak of already exists in much, much cheaper ASUS monitors such as the VG259QM. It's not like they don't know how to have that functionality; it's done deliberately.
I've made my choice on an XG2431 and I just don't really want anything to do with Nvidia-sponsored hardware in general, as this is just their way of ensuring you will always use their products for fear of compatibility issues.
That is the biggest downside with Nvidia. They always do it for THEIR products.
While AMD releases open source or "cross" compatibility.
Of course, it opens up AMD's income if they create something open source. But they are more or less more open to availability than Nvidia.
I, for one, have an Nvidia GPU and Intel CPU.
Bought a monitor on sale a while back which only has FreeSync... which is no problem with whatever games I've been playing.
I'm not saying that either company is better than the other. I'm just saying that both Nvidia and AMD do great things. It's just infuriating that AMD keeps piggybacking on Nvidia.
ULMB 1 worked the same way: my original PG279Q has no adaptive sync toggle and requires you to disable G-Sync in the driver to be able to enable the ULMB option in the OSD. What Tim didn't cover is that there is also the option of setting a per-game preference between G-Sync and ULMB in the driver under game settings, and it'll swap modes when you launch the game, which is useful. I dunno if it's a deliberate move to lock out AMD; it's just a carryover of how it worked 7 years ago when ULMB first launched.
AMD can also fix this on their end by actually disabling adaptive sync when you change the setting; they're the only one with this issue. The Intel driver and the Windows basic driver work the same way as the Nvidia one.
@@DanielFrost79 If AMD had the same market share as NVIDIA, they would've done the same thing.
You are forgetting that these are corporations first, and the only thing they care about is money (I wish we had a Silverhand).
Idk, maybe just stop buying AMD products then. Sure, they're passable products, but it's clear they're always going to lose the fight to Intel and Nvidia in terms of products and marketing. You pay the price premium for a reason. I thought I'd be smart once and built an AMD system. Worst mistake I ever made, with constant issues. That's why I don't use AMD products anymore. The small amount of money you save just isn't worth the hassle. Their products constantly underperform anyway.
Is there a video like this for the Acer Predator XB273U F? It's also 1440p and 360 Hz and has also received a firmware update for ULMB2. I've seen someone say it uses the same panel as the ASUS PG27AQN, but I don't know if that's true.
I would really love quality information on that monitor model. 😶
Thanks for the excellent content on this supposedly comparable model. :)
That's interesting. After Optimum Tech's review, I thought this was going to be a no-compromise, obvious next step for premium monitors that focus on motion clarity. Guess it isn't as clear cut. I'm excited to see what kind of improvement we can get in the motion clarity of LCDs in the future.
We do compare IPS to TN here, which is a key difference when it comes to motion clarity.
Looking forward to some "premium" (E-sports focused) TN monitors featuring ULMB2 to even the playing field & have a more direct comparison to DYAC+.
Hopefully we see ULMB2 in some more affordable monitors.
If DyAc+ and ULMB2 are comparable, then I'd consider DyAc superior since it doesn't require vendor lock-in.
Now we can see the difference with Optimum's newest video; seems like ULMB2 is still superior.
I hope that other "older" monitors get this upgrade. Would love to use it on my AW2723DF
No, the refresh rate has to be above 250Hz
@@Avioto So the new 300Hz Asus monitor can get this ULMB 2 functionality?
The AW2723DF goes up to 280
How come the Blur Busters approved monitors aren't on your lists? Have you not yet reviewed the ViewSonic XG270 and XG2431?
ua-cam.com/video/9o0rAvZtM7w/v-deo.html
hopefully ulmb2 can work on amd gpus in the future when I upgrade my setup to 1440p
if this is not possible, I'll probably go with the benq XL2566K
I believe I'm getting tearing or stutters, I can't really tell, but with G-Sync on it feels more fluid.
Can you do a review of the omen 27qs monitor when it comes out?
I wonder if individual horizontal backlight zones have ever been tried out to reduce the backlight async issue.
I remember Mark from Blur Busters talking about multiple backlight zones being a possible solution to the crosstalk issues, especially at lower refresh rates.
I'd like to see at least one overdrive setting with roughly the same response times as the old firmware. I'm not sure about upgrading as I don't want my screen to slow down. On the other hand I would like to try ulmb2.
Nvm, I did the update. Now I understand. I can feel exactly what you described in numbers; that's a huge improvement over before. Cyberpunk 2077 and Hogwarts Legacy are capped to 75 FPS, and with the esports overdrive it just feels so amazing! It feels like much more FPS: super snappy, no artefacts, just super smooth and super fun. I upgraded from a BenQ 2546K and it feels like a different world. Thank you for your great review, and also for your other videos where you explained what all these values, like cumulative deviation, mean. ULMB 2 feels great in Overwatch 2, but I'm not sure what to prefer because G-Sync feels more snappy. Overall this is a great product; I'm so happy that I chose the 27AQN over the 27AQDM.
@@he1go2lik3it Have you tried the XL2566K? I'm contemplating between the AQN and the BenQ.
@@erich-r4y I think you will get the best gains on the 27AQN: 1440p, perfect colors, true G-Sync and ULMB, which as far as I understand is superior to DyAc+. Overall, the 2566K is inferior to the 27AQN.
@Doctor0Who However, the 27AQDM is not a perfect all-rounder. Due to the pixel layout, it is not possible to work and read text very well on the OLED. As for burn-in, Asus does not provide any coverage, which would be too risky for me.
@@he1go2lik3it I got the monitor last week. I'm also not confident about updating because it will slow down my response time. Is the response time degradation significant? Will motion and picture quality be better in trade for response time? I play competitive games.
ARE YOU GONNA DO THE CORSAIR MONITOR
Have you considered having a persistent chart of all your monitor reviews, with results etc?
Do you plan on making an update about the 2023 edition 27GP850P?
I would have really liked to see what input latency results this monitor has with ULMB2 enabled!?
Do you have the monitor?
Just for the funzies, it would be interesting to see just which Nvidia GPUs the upgrade will work with. For instance, would it work with a GTX card? Wondering, as it would be an even "better" feature if it didn't. Now I know that the chance that someone investing in this monitor would be stuck with anything but a card from the RTX 30 or 40 series, or the RX 6000 or 7000 series, is so low it's ridiculous, but I've got an inquiring mind...
Also, does it have to support G-Sync? In other words, would it work with a 1080 but fail with a 970? At least I think the 1080 supported G-Sync, or have I got that mixed up?
Updated the firmware today with my reference GTX 970 and Windows 10, but I can't find a way to enable ULMB2 on my new all-AMD setup 😢
What if you disable VRR in something like CRU? Will the monitor allow you to enable ULMB?
If it works, it may also allow you to update the firmware, but it is risky.
Thanks for the update! Have you tried disabling VRR in windows settings on the AMD system?
Hey, do you guys plan on reviewing the new Samsung G9 OLED 49?
ULMB is an Nvidia feature, so it makes sense that the firmware update doesn't work when the GPU is AMD. Also, you have to "disable" G-Sync in the Nvidia Control Panel on the PC to enable ULMB2 on the monitor.
Pulse width 100 looks clearer to me than pulse width 10.
You needed an Nvidia card for the update. Did Nvidia or AMD matter for the testing? Like, if I buy an updated screen and then use an AMD graphics card, will it still run as well as tested?
Do you need to choose ULMB in the Nvidia Control Panel ("monitor technology") or fixed refresh rate?
What is better for competitive gaming?
Sounds like Nvidia at least did a very good job with ULMB2 compared to other techniques, since I would imagine the bottom of the screen is way less important than the middle and top in most games, and definitely in shooters. So they made the right choices, but yes, hopefully they can perfect it someday. Sounds like a great way to get clear motion now that they've fixed the brightness problems too. I might end up getting this Asus 360Hz monitor and use my OLED TV for single-player games. At the same time, I might just go with a 240Hz OLED as I bet those monitors are so good.
I don't understand. What does the marketed 1ms response time stand for if it's 5-10ms?
Also, why not go for OLED when it's cheaper? (I am asking for competitive FPS purposes)
Do we need to lock frames for best ULMB experience?
Please review the MSI G274QPX
since the Gigabyte monitors are still out of stock
Hi, are there any 32" 4k monitors with auto brightness? I am fiding it hard to believe that my 10 year old 24" NEC beats latest and greatest monitors in features and connectivitity.
What are the best settings to use for ULMB 2 in competitive games, and what should I use in single-player games?
I would rather be able to deal with overshoot and have the sub-2ms response time... Now I don't even have that option with the update.
I just got the Asus PG27AQN and have the ULMB2 update. In the Nvidia Control Panel, should I have V-sync on or off, and what should the "monitor technology" setting be set to? I have it on ULMB with V-sync off.
HELP!
I want to buy this monitor but have questions:
1. When you use ULMB 2, can you still use the overdrive modes? I mean, can you use the 1.9ms GtG or automatic mode, or does the monitor use the normal 3ms GtG with ULMB 2?
2. Is it still worth it in 2024?
Sorry, English isn't my first language 😅
The only thing I want is a 32" 4K OLED at 144Hz+. When is that coming out?
What does the smoothness feel like with G-Sync disabled and ULMB 2? Do you use V-sync ON? I hate tearing with V-sync off.
So would you recommend ULMB 2 be on for fast paced fps gaming?
Since there is no guarantee you get a burn-in warranty on the Samsung Odyssey G8 OLED in Australia, would you buy the Alienware AW3423DWF instead?
If u want burn in then yes
I'm trying to update the firmware but I get an error during flashing saying it can't switch to 60Hz, and it tells me to lower it to 60Hz, but the only option I have is 360, both in Nvidia Control Panel and in Windows adapter settings. Please help!
It was a little confusing for me: does the game need to run at 360fps or more for ULMB 2 to work? If I run a game at 200fps, for example Warzone 2, will ULMB 2 not help at all?
Please review the BenQ EX270QM 240Hz; I think it's a good option for those with AMD cards.
Will ulmb2 be coming to the newer 240hz oled monitors?
What's the reasoning for this level of emphasis on low refresh-rate testing? The information it might or might not reveal can't be extrapolated to anything meaningful, both because of "the eye test" (caveat: *60fps*) - as you even said, "what this looks like in actual visuals... is hard to describe" - and because of the inorganic use-case scenario. A made-up example of a similar test for another device targeted at comp gaming: "this 4K mouse performed poorly at lift-off detection when I set the polling rate to 100." I guess what I'm not understanding is the significance / why this is relevant.
Why would someone lower refresh rate and not resolution to "accommodate" low in-game FPS (on a 2K monitor) targeted at competitive gaming?
@@garrusvakarian8709 I understand that, but the importance of that is based on an ever-decreasing number of people. Who goes out and purchases 360Hz monitors? People who play FPS games... 99% of competitive FPS games can easily hit 400+ fps. If you already sacrifice getting a 4K monitor for a 2K 360Hz one (that costs the same or more than really great 4K 144Hz monitors), a fair assumption to make is that single-player games are secondary, so why are we emphasising a small number of people and/or a secondary use-case scenario to this degree?
Wouldn't the logical thing be to test the performance of this monitor at a LOWER resolution, since the users who can't hit 360+ fps in *some* titles but still want the competitive benefits would then LOWER the resolution, NOT the refresh rate?
240Hz BenQ DyAc users are probably disproportionately on a non-native resolution (CS:GO players etc.), so where's the odd / low resolution testing? Wouldn't that at the very least be an equally common scenario for someone thinking about purchasing this monitor (as a 2K 60Hz scenario is)?
**Reviewing something based on its intended & probable use case (outside of your core viewership, perhaps) and reviewing something based on your audience's most likely use case (or at least that of those who comment / respond to polls) are not mutually exclusive, is what I'm realising has become my point, I guess - anyway, thanks for the video :)
@@garrusvakarian8709 Right, the minority of people got that answered by this video - great - that's fine. Like I said, the emphasis on that VERY specific scenario is what I'm questioning. The monitor is 2K 360Hz, an extremely specific combination; a "jack of all trades" test being the "main focus" of the review doesn't seem to fit with the product's selling points / main competitor -
which is evident by the very monitor everyone is comparing this to, the BenQ 360 DyAc, a PURE competitive gaming monitor. So why not ALSO include a purely competitive comparison, where the resolution is dropped instead of the refresh rate? This wouldn't REPLACE the jack-of-all-trades test, but sit alongside it!
In your own words, "this is not complicated!" YOU CAN DO BOTH!
@@garrusvakarian8709 That's just disingenuous to the point above - the very thing that is updated is ULMB, which again is a hyper-specific technology, most associated with competitive gaming at these refresh rates - again, hence why it is being compared to the BenQ DyAc. The "updated firmware" puts this monitor into a great spot to compete in the competitive gaming monitor market, so your argument as to why competitive testing is not done is obviously incredibly silly, since that is a result of the firmware update, and you accurately point out the firmware update is the point of the video :).
You yourself claim "not everyone can get high fps all the time at 2K resolution."
I say: okay! Let's test what happens if you drop the resolution down so you maintain the competitive advantage of 360Hz. Will the new firmware update still be as impressive at low(er) resolution with ULMB2 enabled, in the absence of 2K resolution? Or will the BenQ DyAc be noticeably better at 1080p, or even at 4:3 resolutions where the top / middle / bottom zones of the monitor really are tested differently, as the "area of focus" is greatly distorted? We know the top, middle and bottom of the monitor perform differently in terms of backlight strobing accuracy, so that could be one area for further testing.
Then you respond with "NOPE".
Why not just say "yeah, maybe that could be an interesting thing to also test"?
Why don't they just develop a monitor with simultaneously updating pixels instead of a vertical scan? What's so tough about that - power consumption concerns? Manufacturing costs? Surely it isn't impossible to do.
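For scale, here's the row timing a scanned panel deals with (simple refresh math ignoring blanking intervals, not vendor data). A global-update panel would have to buffer the whole frame and drive every row at once, which implies much beefier row/column driver hardware:

```python
# Back-of-the-envelope display timing: rows are written top to bottom,
# so the last row lands almost a full refresh interval after the first.

refresh_hz = 360
rows = 1440

frame_ms = 1000 / refresh_hz        # ~2.78 ms per frame
row_us = frame_ms * 1000 / rows     # ~1.93 us per row (blanking ignored)
print(f"frame time: {frame_ms:.2f} ms, ~{row_us:.2f} us per row")
```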
Is it possible to disable adaptive sync on AMD cards through registry editor?
also curious about this. Seems like there has to be a way to force it off entirely.
Could you check out the "HP X34"? It's an ultrawide 1440p monitor with an IPS display, 165Hz and FreeSync. Seems to be a great value product at 360-400€ after taxes... I think it will be about the same in dollars.
Not interested
I want this on OLED TVs.
Can I turn my Asus TUF Gaming VG279QM into a ULMB monitor? Is there going to be any software update for it in the future?
Hi there,
Would it be possible to check out the firmware update 1.3.1 on the cooler master gp27u?
What might be worth trying is to modify the EDID to fool the AMD driver into thinking the monitor doesn't support Adaptive-Sync... A kludgy-as-hell fix, but it might work?
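For anyone curious, that trick usually means blanking the EDID's Display Range Limits descriptor so no VRR range is reported. A minimal, heavily caveated sketch of the idea (hypothetical helper; offsets assume a plain 128-byte base block, and FreeSync ranges often live in extension blocks instead, which is why tools like CRU exist):

```python
# Sketch only: wipe the Display Range Limits descriptor (tag 0xFD) in a
# 128-byte EDID base block, then fix the checksum (the block must sum to
# 0 mod 256). Prefer a driver-level override tool like CRU in practice.

def hide_vrr_range(edid: bytearray) -> bytearray:
    # The four 18-byte display descriptors sit at offsets 54, 72, 90, 108.
    for off in (54, 72, 90, 108):
        if edid[off:off + 2] == b"\x00\x00" and edid[off + 3] == 0xFD:
            edid[off:off + 18] = bytes(18)  # blank the range-limits descriptor
    edid[127] = (-sum(edid[:127])) % 256    # recompute base-block checksum
    return edid
```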
Now we only need BFI in the 240Hz LG OLED panels...
I've never seen a higher-than-144Hz LCD with strobing enabled in person, but mine (1440p @ 144Hz) gives me huge headaches when using the strobing mode. Maybe it's easier on the eyes at 360Hz.
So you only need an NVIDIA GPU to update the monitor, and after that I can use ULMB without an NVIDIA card?
wow, Nvidia specifically told you they are using the testing methodology that your team has designed??
I'd be really curious to try it with an Intel GPU. Not that Intel's current lineup could drive a monitor of this caliber, but since you said that ULMB 2 works on AMD cards with the Windows generic driver I wonder if it's a driver issue on AMD's end or if Asus is actively disabling it for AMD on purpose.
I've found quite a few things that work with Nvidia and don't work with AMD that suddenly also work on Intel.
Coming back to the LG C9 OLED's VRR not working on RDNA2 GPUs: Intel Arc VRR works perfectly fine.
The same goes for productivity performance in, say, Affinity Photo: Nvidia scores about 10-20x higher than AMD, and so does Intel.
Having Intel in the GPU space helps to clear up a lot of those strange cases.
Something like Overwatch would not be hard to drive for an Intel GPU. If you are seeking that kind of improvement, chances are you are playing competitive at low graphics anyway.
Just me or does the Asus look brighter vs the BenQ?
If only the Asus monitor stands (over the last several years) didn't have that awful looking copper portion... are they trying to compete with Noctua for worst brown?
How would a 3080 and 11700K perform with this monitor? And how does this monitor perform if I set it to 240Hz instead?