@@Floturcocantsee Idk if I trust that it really has zero performance and input lag impact; unless someone actually benchmarks it, it's still safer to use exclusive fullscreen. Besides, it's a Win 11 feature, isn't it? Most people are on 10.
@Floturcocantsee I actually dislike fullscreen. I like not having to hit my Windows key or alt-tab every time I want to move my cursor over to my other two monitors.
Got this monitor yesterday and I have it hooked up to an RTX 4080 Super's DP1.4 connector, the other end plugged into the monitor's DP2.1 mini port. It's been running absolutely fine at 4K 240Hz (using DSC). The picture quality is great and there are several HDR colour options in the menu so you can tailor everything to your liking. It'll be interesting to test it out with an RTX 5080 when those come out (hopefully with a UHBR20 DP2.1 connector), but for now everything is working great and I'm pleased with my purchase.
Am I right in thinking you can turn pixel shift/screen move OFF in your settings? I'm interested in this monitor solely because it seems to be the only QD-OLED that allows you to do this. Is that correct?!
2 hours ago?! Impeccable timing, Tim! I ordered the Aorus without 2.1 last night and was wondering if it's a big deal and if I should cancel the order, I guess not! Brilliant video as always, cheers!
@@Dubulcle Well, the super long mode switches (alt+tab/fullscreen) with DSC are painfully annoying, and I'll be more than glad when GPUs come with 2.1 so that DSC can go to hell. It sometimes takes several seconds, which is annoying when you want to play a new game and have to hope the first few seconds are just company logos.
Just the video I was hoping for! With that said, I sort of feel that the DSC part was a bit underwhelming. For instance, DSC is a lossy compression since it removes data. This will lead to compression artifacts. There is also the error rate for the reconstruction of the removed data. On top of that, DSC does not use the previous frame to improve compression quality like other algorithms do. This means that from frame to frame, there will always be artifacts. DSC is at its worst during very sudden changes from frame to frame: an explosion in a video game or a gun muzzle flash, for example. Latency is a non-issue; it will approximate the same latency as running with 4:2:2 subsampling (the monitor has to reconstruct to RGB before it can use the signal). With all that said, one could say that viewing an image in PNG vs JPG is similar to native vs DSC. The differences are there (for every frame), but do those differences matter? I think you should be able to showcase these differences between a DP 1.4 graphics card and a DP 2.1 UHBR 20 graphics card by slow-motioning them side by side during an explosion in a video game. EDIT: reference with sample: forums.blurbusters.com/viewtopic.php?t=12235
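For context on when DSC is even needed, here's a rough bandwidth calculation (my own figures, not from the video; it ignores blanking overhead, which adds several percent in practice):

```python
# Does 4K 240 Hz 10-bit RGB need DSC? Approximate check, no blanking overhead.

def data_rate_gbps(w, h, hz, bpc, channels=3):
    """Uncompressed video data rate in Gbit/s (pixels * bits per pixel)."""
    return w * h * hz * bpc * channels / 1e9

need = data_rate_gbps(3840, 2160, 240, 10)   # ~59.7 Gbps

# Effective payload rates after the link's line coding:
DP14_HBR3   = 32.4 * 8 / 10      # DP 1.4, 8b/10b coding    -> ~25.9 Gbps
DP21_UHBR20 = 80.0 * 128 / 132   # DP 2.1, 128b/132b coding -> ~77.6 Gbps

print(f"needed: {need:.1f} Gbps")
print("fits DP1.4 uncompressed: ", need <= DP14_HBR3)      # False -> DSC required
print("fits UHBR20 uncompressed:", need <= DP21_UHBR20)    # True
print("fits DP1.4 with 3:1 DSC: ", need / 3 <= DP14_HBR3)  # True
```

So the signal fits uncompressed only on UHBR20; on DP 1.4 it needs roughly 3:1 DSC.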
Thanks for explaining the differences and what we need to make things work! I recently bought some nice DisplayPort cables that can handle DP80, so now I just need a video card and monitor that also do. I am hoping the new Nvidia 5000 cards have this working. Thumbs up.
DP2.1 really is one of these things where you go: Yeah cool it's there I'll take it, but it'll not sell a device purely based on that. My monitor has been working pretty much flawlessly with DSC, but I also have an AMD GPU.
Thanks for confirming my thoughts that DSC really doesn't degrade visual quality in any perceptible sense. I like that movie compression example. Tbh I cannot tell any difference between a full Blu-ray and an encode at half the size, even with pausing and staring on my 83" TV. Feeling more reassured with my new AW3225QF now.
@worldwanderingfisherman2166 It's very much like modern lossy audio compression. It throws out data that can either be reconstructed from the other data (usually at reduced accuracy) or an imperfect human organ isn't sensitive to (much like chroma subsampling which focuses on the most important aspect of an image in relation to the human eye: luminance).
I was concerned at my Alienware AW3423DWF stepping down to 8-bit signal to get the full 165Hz, great to see that question answered and it's basically no difference except in math.
That's 8 bit + FRC, not DSC. If it had DSC it would run 10 bit all the way to 165 Hz. The AW3423DW/F doesn't support DSC. Few (if any) people can probably tell the difference between true 10 bit and 8 bit + FRC, but we are talking about two different things, although from my understanding they're both used to save bandwidth.
So this is a little like when I bought my ultrawide in 2020 - I opted against going for HDMI 2.1, thinking it was pointless at the time. Ultimately, I regretted it, but it took 4 years for me to regret it. By that time, new monitors like these awesome 32" 4k OLEDs were launching, and it didn't matter.
I used to have random black screen problems. In the end, it was all because the cable was not up to spec; after changing to a validated one, the problems were gone.
As a matter of fact, every 120Hz-capable TV should also have it, if only because of the HDMI 2.1 situation on Linux. And even if that situation didn't exist, it should still be there!
Great video. To be honest, it's too soon for DP 2.1, but it's nice that some companies are putting it in. At these monitors' price range, the more features you have the better.
The DisplayPort 2.1 standard now supports link training, so it's theoretically possible to detect a degraded cable and fall back on the source side. I think it just isn't implemented in current PHYs.
Following up on this: based on the DisplayPort RX Subsystem LogiCORE IP Product Guide (PG233) from Xilinx, which implements a DP 2.1 RX for FPGAs, the GPU/TX is responsible for determining the bandwidth after the link training period. The RX side can suggest a bandwidth in advance of link training, but the GPU is the ultimate arbiter of what bandwidth is actually targeted for the communication. Thus, I think the problem here is actually on AMD's side in not using the training results to determine bandwidth, rather than on the monitor.
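A rough sketch of the source-side fallback described above, assuming the GPU simply retries training at progressively lower link rates. The rate names are real DP 2.1 link rates; `link_trains_at` is a hypothetical stand-in for the actual PHY training handshake:

```python
# Source-side link-rate negotiation sketch (not a real driver implementation).

DP_RATES_GBPS = {"UHBR20": 80.0, "UHBR13.5": 54.0, "UHBR10": 40.0, "HBR3": 32.4}

def negotiate(link_trains_at):
    """Try rates highest-first; return the first one the link trains at."""
    for name, rate in sorted(DP_RATES_GBPS.items(), key=lambda kv: -kv[1]):
        if link_trains_at(rate):
            return name
    return None  # no usable link -> black screen

# A marginal cable that only holds up to 40 Gbps still lights up, just slower:
print(negotiate(lambda r: r <= 40))  # 'UHBR10'
```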
This is perfect timing for me. I wanted to get the Aorus FO32U2P because it seemed like the ideal OLED monitor for me, especially its DP 2.1 UHBR 20 support. I then found out that WOLED monitors are better with lots of ambient lighting, especially if they have a slightly more matte display like the recent LG 32" 4K 240Hz monitor, so I might be willing to sacrifice DP 2.1 for a better experience with an open window. I don't think I'd ever use DSR on a 4K monitor, so it likely doesn't matter to me.
Thanks Tim! This is EXACTLY why I have yet to buy this monitor. Waiting for your review. Great explanation of DP2.1 UHBR20. You just confirmed what I thought about future-proofing. In Singapore, the prices of all the latest QD-OLEDs are quite similar, so if the FO32U2P has similar performance to the MSI, then I would buy the Gigabyte. I keep my monitors for a long time and go through 2-3 GPUs during the life of each monitor, across 4-5 generations of GPU; meaning I change monitor every 5-6 years. Anyway, you wouldn't notice ANY difference in a YT vid comparing DP2.1 & DP1.4 because the footage is compressed and maxes out at 4K60. Text scrolling wouldn't show it either, but the UFO test will be interesting. I guess only hi-res RAW pictures & low-compression HDR10-12 videos will show some difference. From what I read, DSC works by converting RGB signals to a colour space that represents what our eyes can detect, then decreasing the bit depth of the chroma channels from 10 bits to 8 or 6 bits. The monitor will then interpolate adjacent pixels' chroma channels so colour gradients will not exhibit banding from the drop in bit depth, hence dynamic range. Pictures that may exhibit issues with DSC are high-contrast photos of leaves or grass against a bright background, or brightly lit fur/feathers. But I must admit that at such high DPI, most people will not be able to spot it. This is probably more obvious with 50"+ displays. Will be interesting to see if high refresh rates affect grey-to-grey response time. Looking forward to the full review! Edit: Pairing this with the 7900XTX so I hope a good 2m DP2.1 cable will work. At worst, maybe do what Linus did and buy an active optical DP2.1 cable, but those are REALLY expensive!
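On that description of how DSC works: the spec's first stage is indeed a reversible RGB-to-YCoCg conversion, which concentrates most visual information in the luma channel so the chroma channels can be coded more coarsely. A minimal sketch of that lossless transform (my own illustration, not from the video):

```python
# DSC's YCoCg-R colour transform: exactly reversible in integer arithmetic.

def rgb_to_ycocg_r(r, g, b):
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def ycocg_r_to_rgb(y, co, cg):
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

# Round-trips exactly (that's what the "-R" reversible variant guarantees);
# any loss in DSC comes from the later quantization stage, not this step.
assert ycocg_r_to_rgb(*rgb_to_ycocg_r(200, 120, 30)) == (200, 120, 30)
```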
The fact they haven't even produced a VESA-certified DP80 cable longer than 1.5m boggles my mind. I'm sure we'll have 2m out by the time the 5090 drops though. I trust NVIDIA will support DP 2.1 too.
Finally, a proper 10 bit monitor. I notice that Blackmagic Design make DeckLink monitor cards up to 12 bit, with SDI/HDMI 2.0a, REC 2020, 4:4:4 colour space, up to 8K. Not sure if they will work with this monitor though, or if you could play games with them. Great for editing though.
It may not be mandatory, but I am not dropping over $1,000 only to be forced to use an old standard connection when there is no legitimate excuse for a top of the line monitor to not have DisplayPort 2.1 (UHBR20) in 2024. DP 1.4 forces you to use DSC at 4K 240Hz, whereas you can get that natively for 2.1
I spent quite some hours trying to understand all the limits on Wikipedia and was worried that I would need 2.1. But no GPU supporting it was quite crazy to me. Turns out I can run my future 240Hz 4K OLED with DSC. That's definitely not a deal breaker then, especially since you've shown the compression is not too awful. Thanks sir for the deep dive.
Also, with DSC as it is, whenever you use a monitor that requires DSC to get all your shiny features (res, refresh, bit depth, etc.), you're effectively limiting the number of monitors you can connect to your GPU, because DSC will use more than one display head to drive that screen. I am typing this on the monitor discussed above, and I bought it in anticipation of full-spec DP 2.1 arriving with the next generation of GPUs (Nvidia 5000 series, AMD 9000 series) so that I could purchase two more of these monitors and connect them to the same GPU without having to install a secondary GPU in my system and halving my PCIe lanes (Z790 here).
I have found that between my 4K 120Hz display and my 4K 240Hz display with DSC, I can tell a difference visually when using DLSS. So it's only at non-native resolution that it doesn't scale as well with DSC enabled. That is apparent in R6 Siege when looking at saran-wrapped crates, for example. Outside of that I can't see a difference. This is on the MSI MPG 321URX.
I ordered the Aorus FO32U2P. And I didn't really do that for DP2.1, though that is one of the reasons I bought it. The main reason I bought it is that it was one of the few QD-OLEDs I could actually get, since everything else is out of stock. Also, where I live, the price difference between the DP2.1 and the cheaper 1.4 version wasn't big enough to warrant going for the cheaper one, as they already cost a lot of money out of the gate.
This is so good! Have you by chance made a video in this format for HDMI 2.1? I remember some that discussed it a lot on HU, but one in this format would absolutely demystify that for beginners, too. Thank you!
Eagerly waiting for the review of the Aorus. Here in Germany the Asus is 1800€ and the Aorus 1450€, so if there are no big differences in features I will still go for the Aorus, just because Asus is insane in the pricing department.
Another limit of using DSC is multimonitor support. You're limited to 2 monitors in this configuration. From Nvidia: When a display is connected to the GPU and is set to DSC mode, the GPU may use two internal heads to drive the display when the pixel rate needed to drive the display mode exceeds the GPU’s single head limit. This may affect your display topology when using multiple monitors. For example if two displays with support for DSC are connected to a single GeForce GPU, all 4 internal heads will be utilized and you will not be able to use a third monitor with the GPU at the same time.
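Nvidia's head-allocation rule above can be sketched roughly as follows; note the per-head pixel-rate limit used here is a made-up round number for illustration, not an official Nvidia figure:

```python
# Sketch: a display whose pixel rate exceeds the single-head limit
# consumes two internal heads (per Nvidia's note quoted above).

SINGLE_HEAD_LIMIT_MPIX = 1200   # assumed Mpixels/s per head (hypothetical)
TOTAL_HEADS = 4                 # typical consumer GeForce GPU

def heads_needed(w, h, hz):
    mpix = w * h * hz / 1e6
    return 2 if mpix > SINGLE_HEAD_LIMIT_MPIX else 1

# Two 4K 240Hz DSC displays plus one 1440p 144Hz display:
displays = [(3840, 2160, 240), (3840, 2160, 240), (2560, 1440, 144)]
used = sum(heads_needed(*d) for d in displays)
print(f"heads used: {used} of {TOTAL_HEADS}")  # 5 of 4 -> third display won't run
```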
I actually prefer the monitor not working when the cable doesn't support the DP2.1 spec; it's better than being oblivious to the shitty quality of my cable.
You do realize a small text warning in the OSD is very easy to implement? It could tell you the cable quality is subpar while still letting you use your monitor until a new one arrives. Don't make excuses for companies.
Also don't forget that it's easy to add an OSD toggle to activate and deactivate DSC. Asus does it, and I think one of the other manufacturers does too. It would allow you to run DSC on lower-quality cables and run without it on good ones (in the case of the Asus, it locks you to 120Hz without DSC since it doesn't support DP2.1).
What an absurd take. Why would you want something to not work when it can most probably work in a different mode, especially in the default configuration? All it needs to do is keep falling back until it finds a mode that works, then have the OSD pop up a box showing the current connection configuration once it gets a functioning signal.
@@houssamalucad753 I agree with you, but unfortunately my trust in the companies making these monitors is low. Your solution would be the perfect one, but if it isn't implemented I would rather the monitor not work at all instead of working in a half-assed way without telling me.
Ugh, nice idea, except they come with the monitor's own cables so it's redundant. Also it's very hard to tell any speed difference; maybe we will once the 5090s come.
Regarding DSR/DLDSR and DSC, from the TFT article: “NVIDIA DSR, NVIDIA DLDSR and NVIDIA Image Scaling are supported when DSC mode is enabled, if the pixel rate needed to drive the display mode does not exceed the GPU’s single head limit. If GPU uses two or more internal heads to drive the display, NVIDIA DSR, NVIDIA DLDSR and NVIDIA Image Scaling are not supported”
It would be good to see direct input lag comparisons with DSC on and off, and inspection on whether DSC affects the image in any way when there's a lot of very fast movement (like OW2 combat).
There's one thing you didn't mention, which I'm quite sad about: Black screens when going in and out of Fullscreen games. In my opinion, it is the biggest down-side of DSC. On some monitors it's only about a second, but I've tested ones where the black screens last more than 5 seconds. When trying to Alt+Tab out of a game quickly to check something, or answer a voice call, etc, it can really be a big annoyance. Secondly, I've also read reports on the Blurbusters forum of people experiencing a "heavy mouse effect" when using DSC. This is probably negligible for most people, but for people who are very into competitive gaming, it is something to consider.
What you're saying is true, but I fail to see the point you're trying to make. In the video he DOES talk about Nvidia-specific issues, namely missing features etc. He dedicates a section of the video to issues arising from the use of DSC, and in my opinion forgets the most annoying one. Nvidia has the obviously bigger market share of GPUs, and displays that require the use of DSC are most often used for gaming, meaning Nvidia-specific issues are issues for the majority of people using DSC.
Kept this monitor over the MSI MPG 321URX QD-OLED. HDR is great; the MSI version was flickering in dark areas but this monitor isn't, and I didn't want to rely on firmware updates months later. I'm extremely happy with the AORUS FO32U2P, and as a plus my MB and GPU are both Gigabyte so I can control everything in GCC.
Shame the rest of the new OLED monitors didn't come with DP2.1; it would've been nice. Though I expect to see it on the future wave of monitors. Just today Blur Busters teased a 4K 1000Hz demo monitor. I'd expect 1080p 1000Hz to hit as the next jump, along with a DFR mode for 4K monitors. The bandwidth is there.
I just bought this monitor mostly because of the DP output for daisy chaining working with its KVM switch. It allows me to use the same setup for my dual-monitor gaming PC and my work laptop without sacrificing high refresh rate, as I would with an external KVM switch, which are usually limited to 60Hz.
Lastly, I agree; I won't buy a "DSC native" monitor again after owning the Predator X27. I disliked the color compression and the jagged edges it introduced. While they say "visually lossless", I say "run text and show me visually lossless", especially on Discord: try someone with a red name with DSC and you'll see immediately that the text "bleeds". Edit: The only place DSC may be acceptable is 8K, since the PPI is so high your eyes will have difficulty seeing DSC.
Integer scaling is one of the features that you can't use with DSC enabled, which is such a shame because I was really looking forward to playing older titles that don't scale well at 4K, with 4K being the perfect resolution to integer scale 1080p titles. I got the FO32U2P to future-proof myself, but yeah, if Nvidia could actually just update their latest drivers to enable integer scaling, I would appreciate that a WHOLE lot more than waiting for potential DP2.1 80Gbps GPUs to hit the market.
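For what it's worth, the reason 4K is ideal for integer scaling 1080p titles is simple divisibility; a quick sketch:

```python
# Integer scaling works only when the target resolution is an exact
# whole-number multiple of the source on both axes.

def integer_scale_factor(src, dst):
    sw, sh = src
    dw, dh = dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2 -> crisp 2x2 pixel blocks
print(integer_scale_factor((2560, 1440), (3840, 2160)))  # None -> must interpolate (blur)
```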
Hey! Firstly, love your videos; they helped me a lot in finalizing my monitor. I went for the AW3225QF. Coming to my question: I want to use my monitor's DP port to keep my HDMI ports open for a future console setup. Right now I have a laptop with full Thunderbolt 4 functionality. Which cable/adapter/dock should I go for to get the full 4K@240Hz from my monitor? I have a 4080 175W laptop.
Great video as always. I consider myself an informed buyer, but videos like this are eye opening. I was unaware that DP 2.1 had different standards for bandwidth; good to know that UHBR20 is the one to look for. I personally have only had one issue with DSC, which was on my newly purchased LG 32GS95UE. I was using DP 1.4 with the monitor's onboard speakers, but when launching Fallout 76, the game would not output to the speakers. If I cranked the resolution down to 2560x1440 at 120Hz, the speakers would work fine, which I assume has something to do with DSC as the game is launching. I ended up switching over to HDMI 2.1 and locked the game at 139FPS due to a physics issue; that combination seems to have fixed the DSC issue, as the speakers work fine now. I am wondering if 4K at 139FPS is just under the bandwidth requirement for DSC to activate. In a perfect world, I'd prefer to use DP 2.1/UHBR20, but HDMI 2.1 seems to fill the gap well enough.
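A quick back-of-envelope check on that guess (approximate, ignoring blanking overhead, so the real crossover sits somewhat lower than computed here):

```python
# Is 4K 139 Hz 10-bit under the no-DSC ceiling of each link?

def gbps(w, h, hz, bpc=10):
    """Uncompressed data rate in Gbit/s for 3-channel RGB."""
    return w * h * hz * bpc * 3 / 1e9

HDMI21_FRL = 48 * 16 / 18   # ~42.7 Gbps effective (16b/18b FRL coding)
DP14_HBR3  = 32.4 * 8 / 10  # ~25.9 Gbps effective (8b/10b coding)

rate = gbps(3840, 2160, 139)
print(f"4K 139 Hz 10-bit: {rate:.1f} Gbps")           # ~34.6
print("under HDMI 2.1 ceiling:", rate < HDMI21_FRL)   # True  -> no DSC needed
print("under DP 1.4 ceiling:  ", rate < DP14_HBR3)    # False -> DSC still needed
```

So the switch to HDMI 2.1 is likely what removed DSC from the equation; on DP 1.4, even 139 fps at 4K 10-bit would still exceed the uncompressed limit.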
8:25 You need to put an edit into the video, one of those text boxes showing the correction here. I couldn't understand why Gigabyte would ever include such a short cable, and then later you released a whole video explaining that those specific high-end DP cables only come in 1.2-meter lengths. I'm happy you released a video, but please edit this one as well to include the correction.
Well, I'm pretty sure the RTX 5000 series will have DisplayPort 2.1, and because Gigabyte also makes GPUs they already know this, which is why they are making a DisplayPort 2.1 monitor with UHBR20. I might be wrong, but I would bet on this. 🙂 Thanks for sharing. I'm curious if it is better than the PG32UCDM or mostly the same; I already got the PG32UCDM and love it. I don't know if it's like this, but maybe with an uncompressed signal, if you alt-tab from a game there will be no flicker or black screen for a moment? Anyway, the main differences will be DisplayPort 2.1 on the Aorus side and Dolby Vision (which should be implemented later this year) on the Asus side. Which is the better choice? What do you think? Thank you!
I've seen a few posts about HDMI problems where people just assume it's the GPU or display (whichever is the newest of the two). No one thinks it's the cable; we are not used to cables being the problem.
DisplayPort daisy-chaining will require twice the bandwidth. Daisy-chaining the DisplayPort signal is useful if using the first monitor as a KVM and you want to connect a laptop to multiple monitors using a single cable. Of course, still waiting on those laptops with DisplayPort 2.1. FYI, might want to test the KVM and daisy-chain capability in the full monitor review.
I should note here that I couldn't even test the longer DisplayPort cables I showed in this video with this monitor, as they were normal DP cables - so they might not work, and probably won't, because they're not officially DP80 certified by VESA. This is another layer of confusion: cables can be DP2.1 certified but not certified for the full bandwidth, as DP2.1 has multiple bandwidth specs. I bought them to see whether they'd work, but ultimately, because the W7800 doesn't support UHBR20 over full-sized DP, I was unable to check their advertised claims.
EDIT: As several commenters have pointed out, the longest certified DP80 cables are 1.2m long, which further restricts cable compatibility and usability. DP1.4 monitors can easily use 5m cables but if you want to go with DP2.1 UHBR20 you'll be stuck with these short cables by the sounds of things, at least for now
Was going to say, Gigabyte's PR told me that longer DP80 cables aren't yet available!
The longest officially certified UHBR20 DP2.1 cable is 1.5m lol...
Worse than the HDMI cable they shipped the LG27GR95QE with, which caused the monitor to turn black randomly until you changed it
@@Phixiso Funny that you happened to mention this. I do notice this with my 27GR95QE-B sometimes but very rarely.
Any update to report regarding your long-term Burn-In test?
For my 57" Samsung, I got a UGreen DP2.1 8K@240Hz cable and it seems to work. As you were saying, for the cost of the monitor and GPU they could include a longer cable. (Stingy.) My computer is on the opposite side of the monitor.
The monitor sitting on the very edge of your desk had me clenching lol
It's actually slightly off the desk
@@monitorsunboxed love how you just make it worse haha
Same here! My balls were itching.
Don't worry. Only Linus breaks things.
@@tekelupharsin4426 didn't MKBHD break Fisker?
(No, he didn't, they did, it's just a joke before anyone kicks off)
That's a lot of letters and numbers in the script, the fact you can roll them off without a slip up is amazing.
Teleprompter?
@@Ni5ei Doubt I could do it with a teleprompter!
@@cptwhite You're not the only one 😂
He has been doing this a while. I'm sure it's practice.
Best regards
In regards to the 1m DP2.1 cable, at the moment, the longest CERTIFIED DP2.1 cable is only 1.2m long (Club3D). AFAIK, there isn't a longer DP2.1 certified cable yet that can be guaranteed to work. Perhaps in the future, there will be longer cables when they improve the standards for DP2.1. You can find the list of certified DP2.1 cables on the VESA Displayport site.
Actually I think it is even worse - the longest fully certified DP80 (not DP40) cable is 1.2m.
Definitely enough in terms of length IMO.
@@mnkeyclwrl Ah yes, you are correct. Fixed my number.
@@kissu_io Unfortunately, for a lot of people whose display setups are not directly next to their computer, it is not enough, especially if you also account for the bends you have to make in the cable to get around objects like parts of the table or the display mount for cleaner cable management.
@@kissu_io For most people? Yes.
But some people like to put their PC in other rooms.
Appreciate the clarity on the situation.
Thank you for addressing this. During the time I was looking at the new 32" OLED displays, I saw so many comments about DP 2.1 vs 1.4.
Love the research coverage into DSC
This video came at the best time ever, I was researching some monitors with DP 2.1 and this video popped up. HUGE thanks!
For fuck's sake
All they had to do was not name them all DP 2.1
It'll just lead to more confusion like with the new HDMI naming...
Somehow the companies in charge of naming standards are less than useless. I guess it helps screw the consumer, so... that's good?
I would be fine with the occasional HDMI 3 or DP 3 or USB 17. It'd be so much easier to just increment the number sensibly. I'm looking forward to PCI 6x16+3.5 or something. :D
Add USB to that list too, it's like they're all in on it together to con customers.
Just make sure it says DP80 or UHBR20 in the product description and you should be fine. I know what you mean though, they made this far more complicated than it needed to be.
They know exactly what they are doing. Just look at USB standards. They want to confuse consumers so we just buy thinking we got the right stuff.
I took a shot for every acronym in this video... I'm now dead.
Yay! I've been waiting for DSC vs non-DSC video coverage for months. Thanks for explaining, Tim. I'm glad to hear that it's a negligible difference.
Thanks Tim, even though the test was flawed lol.
+Test is Flawed, different monitors used.
@@Keltzzzz They use the same display made by Samsung.
It's very noticeable with differently colored text. If you use Discord you will notice it, night and day. I gave my $2000 Predator X27, which I'd had since right after its release, to a family member who doesn't use Discord often and mainly games on it, because I knew DSC wouldn't bother them as much, although I showed them how to lower the refresh rate so DSC wasn't an issue.
Fantastic video, Tim. Exactly the kind of info I was looking for.
Thank you for explaining this. The question we all had but no one answered
Finally it's being talked about, and at such detail!
Thank you for being so clear and concise with all the information we could possibly need.
Good Video!
Excellent video and thanks for the information.
Learned quite a bit.
I have a recommended category "Best secondary Monitor"
Since secondary monitors aren't really used for gaming and will have static images most of the time, shakes things up for different suggestions.
Where picture comes more into play over just speed.
i actually really appreciate this video. thank you
I love that you did research reading through papers! Thanks for doing so. I was so worried to get the AW3225QF because of DSC, but it is so much cheaper.
The AW3225QF has been reported on the Nvidia forums to have VRR screen flicker issues. It also has an active fan inside and is curved (which isn't great for viewing angles). For those 3 reasons I avoided it and went with the FO32U2P. Zero issues so far, so I'm pleased I chose it over the AW one. Plus it's a bit more future-proof in that it supports proper DP2.1 UHBR20. No doubt the Nvidia 50XX series cards and AMD's new offerings will also sport at least one DP 2.1 UHBR20 connector.
@@0perativeX I got the AW3225QF because of its slight curve. The FO32U2P is not officially certified by Nvidia as "G-Sync Compatible", whereas the AW3225QF is. This gives me hope that Alienware can provide a firmware update to improve VRR, like MSI already did. Also, it is not yet supported by the driver. The fan does not bother me at all; if it helps keep the OLED cooler, all the better. Also, it is 330€ cheaper, and as OLED is not that bright yet, I hope that 2025/2026 panels will be brighter, so I will upgrade nonetheless.
@@0perativeX All OLEDs will have VRR flicker, it seems. I've literally not seen one without it.
@@lilpain1997 The Aorus FO32U2P I picked up yesterday doesn't have any VRR flicker so far. Haven't had any with my 42" LG C2 OLED either. For the FO32U2P I'm using the Mini DP 2.1 cable that came in the box. With the C2 I'm using HDMI 2.1. Both with VRR/G-Sync Compatible mode enabled. I wonder if the flicker that others have reported occurs in conjunction with Nvidia DSR use?
@@interceptor001 It may not be officially certified, but I can tell you right now through personal use that G-Sync Compatible mode is working perfectly with this monitor and I've had zero VRR flicker so far.
This was an extremely helpful explanation of the situation with these monitor connections! Thank you so much!
I have this exact model and WHOA!!! Perfect picture clarity, HDR is AMAZING and bright, colors POP vibrantly, DP 2.1 UHBR20 future-proofing is a chef's kiss, KVM yes please, and many other great features and tricks to make you happy that you're alive at this moment, I promise you. But don't take my word for it, see for yourself. Also make sure you download an ICC profile for the perfection I'm speaking of.
Thank you for educating us
Thank you Tim. This is the monitor I've been waiting for, so I can't wait to see the review.
Thank you, this is the best info on the topic I have found so far.
I wish latency had been tested between uncompressed 80 Gbps and 40 Gbps + DSC. I know that's been done time and time again with other signals like 1440p 240Hz, since that can be done without DSC at 8-bit but not 10-bit. Still would've been nice to see a graph though.
Does latency improve with DSC off in those tests?
Latency testing will be done in the actual monitor review, but given the DP1.4+DSC monitors have
@@monitorsunboxed What if pumping 80 Gbps of bandwidth is worse?
Some people swear that DSC creates some kind of delay or a different feel with the mouse cursor.
From experience and what others have said: it has issues alt-tabbing out of full-screen games, black screen flickers, etc. I used to not have that issue until upgrading to 4K 240Hz. My 1440p 240Hz had no such issues. @vincentrowold1104
Another two reasons people dont want to use DSC is faster alt-tabbing, and potentially lower latency. Testing that would be interesting.
The alt-tabbing is fixed by upgrading games to use flip-model DXGI surfaces with MPO; exclusive fullscreen should be avoided entirely.
@@Floturcocantsee Idk if I trust that it really has zero performance and input lag impact; unless someone actually benchmarks this, it's still safer to use fullscreen exclusive. Besides, it's a Win 11 feature, isn't it? Most people are on 10.
This should have been the main thing to test here actually. How could it be avoided in the first place?
@@Floturcocantsee I just run my games in borderless.
@Floturcocantsee I actually dislike fullscreen. I like not having to hit my Windows key or alt-tab every time I want to move my cursor over to my other two monitors.
Thanks! You answered my questions like backwards compatibility, differences in DP2.1 standards and image quality.
Got this monitor yesterday and I have it hooked up to a RTX 4080 Super's DP1.4 connector, the other end plugged into the monitor's DP2.1 mini port. It's been running absolutely fine at 4K 240hz (using DSC). The picture quality is great and there are several HDR colour options in the menu so you can tailor everything to your liking. It'll be interesting to test it out with a RTX 5080 when those come out (hopefully with a UHBR20 DP2.1 connector) but for now, everything working is great and i'm pleased with my purchase.
There are no HDR color options
@@markshenefelt7611 There are several HDR presets: HDR Standard, HDR Game, HDR HLG, HDR Vivid, HDR Cinema.
Am I right in thinking you can turn pixel shift/screen move OFF in your settings?
I’m interested in this monitor solely because it seems to be the only qd-oled that allows you to do this. Is that correct?!
@@thepadster123 You can indeed. The option is in the OLED Care settings menu. You can toggle pixel shift on and off in there.
@0perativeX
Thank you! 🙏
Very well explained. Thank you!
2 hours ago?! Impeccable timing, Tim! I ordered the Aorus without 2.1 last night and was wondering if it's a big deal and if I should cancel the order, I guess not! Brilliant video as always, cheers!
They really nitpicked those words for the answer to "what is meant by visually lossless".
wish we had a system power consumption comparison between DSC and non DSC, idle clocks etc.
Thank you, I was about to put in an order for this just because of the port.
Even without considering the port, it's the greatest 32" 4K OLED so far.
Why? It's literally worthless
@@De-M-oN Objectively untrue.
@@Dubulcle Very true for me. What, in your mind, is the best one?
@@Dubulcle Well, the super long mode switches (alt-tab/fullscreen) with DSC are painfully annoying, and I'll be more than glad when GPUs come with 2.1 so DSC can go to hell.
It sometimes takes several seconds, and that's annoying when you want to play a new game and have to hope the first few seconds are just company logos.
great information! thanks for your work.
+$200 for that connection. That's how much it costs over the FO32U2, which only has DP1.4.
And they only provide that short cable...
The FO32U2P only costs £50 more than the vanilla FO32U2 over here in the UK.
@@0perativeX Yeah, because you've already been fleeced on the FO32U2 price.
The DP 2.1 standard only allows 1-meter passive cables; 2.1a added support for 2-meter cables.
To be fair, this model also has daisy chaining, although I don't know where that could be useful.
@@kwadwo1000 What about the MSI MAG 321UPX ?
Just the video I was hoping for! With that said, I sort of feel the DSC part was a bit underwhelming. For instance, DSC is lossy compression, since it removes data. This will lead to compression artifacts. There is also the error rate in the reconstruction of the removed data. On top of that, DSC does not use the previous frame to improve compression quality like other algorithms do. This means that from frame to frame, there will always be artifacts. DSC is at its worst during very sudden changes from frame to frame: an explosion in a video game or a gun muzzle flash, for example. Latency is a non-issue; it will be approximately the same latency as running with 4:2:2 subsampling (the signal has to be reconstructed to RGB before the monitor can use it).
With all that said, one could say that viewing an image as PNG vs JPG is similar to native vs DSC. The differences are there (in every frame), but do those differences matter? I think you should be able to showcase the differences between a DP 1.4 graphics card and a DP 2.1 UHBR 20 graphics card by slow-motioning them side by side during an explosion in a video game.
EDIT: reference with sample: forums.blurbusters.com/viewtopic.php?t=12235
I love the jingle at the end ❤
Thanks for explaining the differences and what we need to make things work! I recently bought some nice DisplayPort cables that can handle DP80, so now I just need a video card and monitor that also do. I am hoping the new Nvidia 5000 cards have this working. Thumbs up.
Anyone else skeptical about DSC being visually identical to uncompressed, but now believe it since Tim said so? Thanks Tim!
Great video. Thank you!
DP2.1 really is one of these things where you go: Yeah cool it's there I'll take it, but it'll not sell a device purely based on that. My monitor has been working pretty much flawlessly with DSC, but I also have an AMD GPU.
Thanks for confirming my thoughts that DSC really doesn't degrade visual quality in any perceptible sense. I like that movie compression example. Tbh I cannot tell any difference between a full Blu-ray and an encode at half the size, even with pausing and staring on my 83-inch TV. Feeling more reassured with my new AW3225QF now.
He said researchers saw degradation in the image sometimes, so it's not visually lossless.
@worldwanderingfisherman2166 It's very much like modern lossy audio compression. It throws out data that can either be reconstructed from the other data (usually at reduced accuracy) or an imperfect human organ isn't sensitive to (much like chroma subsampling which focuses on the most important aspect of an image in relation to the human eye: luminance).
Encodes aren't done on the fly while you watch them, but DSC is applied on the fly. That adds another level of potential issues.
Nice video. Still, I'd want 4K@240Hz without DSC, so I'll wait a little until it's widely available.
"Cables Unboxed" when? lol
The cable provided with the monitor in the box is likely for daisy chaining the display to another monitor rather than for plugging it into your PC
I don't think so. The daisy-chain output is only via the full-size DP port, and the mini DP is input only. Both the mini input and the cable are full UHBR20.
I was concerned about my Alienware AW3423DWF stepping down to an 8-bit signal to get the full 165Hz. Great to see that question answered, and it's basically no difference except in math.
That's 8-bit + FRC, not DSC. If it had DSC it would run 10-bit all the way to 165Hz. The AW3423DW/F doesn't support DSC.
Few people (if any) can probably tell the difference between true 10-bit and 8-bit + FRC, but we are talking about two different things, although from my understanding they're both used to save bandwidth.
DSC also usually disables Multi-Plane Overlay (MPO) on connections where two GPU heads are used.
So this is a little like when I bought my ultrawide in 2020 - I opted against going for HDMI 2.1, thinking it was pointless at the time. Ultimately, I regretted it, but it took 4 years for me to regret it. By that time, new monitors like these awesome 32" 4k OLEDs were launching, and it didn't matter.
I used to have random black screen problems. In the end, it was all because the cable wasn't up to spec; after changing to a validated one, the problems were gone.
As a matter of fact, every 120Hz-capable TV should also have it...
Just because of the situation with HDMI 2.1 on Linux... And even if that situation didn't exist, it should still be there!
Great video. To be honest, it's too soon for DP 2.1, but it's nice that some companies are putting it in; at these monitors' price range, the more features you have the better.
Fantastic video, as always
The DisplayPort 2.1 standard now supports link training, so it's theoretically possible to detect a degraded cable and fall back to a lower rate with the source. I think it just isn't implemented in current PHYs.
Following up on this, based on the DisplayPort RX Subsystem LogiCORE IP Product Guide (PG233) from Xilinx, which implements a DP 2.1 RX for FPGAs, the GPU/TX is responsible for determining the bandwidth after the link training period. The RX side can suggest a bandwidth in advance of link training, but the GPU is the ultimate arbiter of what bandwidth is actually targeted for the communication. Thus, I think the problem here is actually on AMD's side in not using the training results to determine bandwidth, rather than on the monitor's.
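For anyone curious what that source-driven fallback would look like in practice, here's a toy sketch. The rate list and the `train_link` callback are illustrative stand-ins, not any real driver API; a real source would also retry different voltage-swing/pre-emphasis settings at each rate before dropping down.

```python
# Toy sketch of DisplayPort-style link-rate fallback during link training.
# RATES_GBPS holds the raw per-link rates the source could attempt, fastest
# first (UHBR20, UHBR13.5, UHBR10, HBR3, HBR2); `train_link` stands in for
# the PHY-level training exchange and returns True if the rate trains cleanly.

RATES_GBPS = [80.0, 54.0, 40.0, 32.4, 21.6]

def negotiate_link(train_link, rates=RATES_GBPS):
    """Try link rates from fastest to slowest; return the first that trains."""
    for rate in rates:
        if train_link(rate):
            return rate
    return None  # no usable link at any rate, e.g. a bad cable

# Example: a hypothetical cable that can only sustain 40 Gbit/s.
best = negotiate_link(lambda r: r <= 40.0)
print(best)  # 40.0
```

The point of the sketch is the commenter's observation: the loop lives in the source (GPU), so whether a marginal cable silently drops to a lower rate or fails outright is the GPU vendor's call, not the monitor's.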
This is perfect timing for me. I wanted to get the Aorus FO32U2P because it seemed like the ideal OLED monitor for me, especially its DP 2.1 UHBR 20 support. I then found out that WOLED monitors are better with lots of ambient lighting, especially if they have a slightly more matte display like the recent LG 32" 4K 240Hz monitor, so I might be willing to sacrifice DP 2.1 for a better experience with an open window. I don't think I'd ever use DSR on a 4K monitor, so it likely doesn't matter to me.
super informative video, thanks!
Thanks Tim! This is EXACTLY why I have yet to buy this monitor. Waiting for your review.
Great explanation on DP2.1 UHBR20. You just confirmed what I thought about future-proofing. In Singapore, the prices of all the latest QD-OLED are quite similar so if the FO32U2P has similar performance as the MSI, then I would buy the Gigabyte. I keep my monitors for a long time and I go thru 2-3 GPUs during the life of my monitors across 4-5 generations of GPU. Meaning I change monitor every 5-6 years.
Anyway, you wouldn't notice ANY difference in a YT video comparing DP2.1 & DP1.4 because the footage is compressed and maxes out at 4K60. Text scrolling wouldn't show it either, but a UFO test will be interesting.
I guess only hi-res RAW pictures and low-compression HDR10-12 videos will show some difference. From what I read, DSC works by converting the RGB signal to a colour space closer to what our eyes detect, then reducing the bit depth of the chroma channels from 10 bits to 8 or 6. The monitor then interpolates adjacent pixels' chroma channels so colour gradients don't exhibit banding from the drop in bit depth, and hence dynamic range.
Pictures that may exhibit issues with DSC are high-contrast photos of leaves or grass against a bright background, and brightly lit fur/feathers. But I must admit that at such high DPI, most people will not be able to spot it. This is probably more obvious on 50"+ displays. It will be interesting to see if high refresh rates affect grey-to-grey response times.
Looking forward to full review!
Edit: Pairing this with the 7900 XTX, so I hope a good 2m DP2.1 cable will work. At worst, maybe do what Linus did and buy an active optical DP2.1 cable, but those are REALLY expensive!
the 7900 xtx is uhbr 13.5, not uhbr 20.
@@MrJojo6713 Correct. It's mentioned in one of the videos in this series that MU used the AMD Radeon Pro card to test.
Finally getting DP 2.1... now we just need to hope next-gen TVs start using it... or at least that it hurries along the next HDMI spec.
TVs will never use DP as the standard. You can stop dreaming now.
Why? HDMI 2.1 at 48 Gb is sufficient for the refresh rates the current TVs operate at, no compression required.
@@boshi9 Because I want them to have fewer excuses not to go higher; I've been waiting for a 4K 240Hz TV for a while now.
TVs don't go for DP… they want HDMI's copy protection…
@@deama15, 1-2 more generations of LG TVs and we’ll be there. The technology is there, it just comes down to development.
The fact they haven't even produced a VESA-certified DP80 cable longer than 1.5m boggles my mind. I'm sure we'll have 2m out by the time the 5090 drops though. I trust Nvidia will support DP 2.1 too.
Finally, a proper 10-bit monitor. I notice that Blackmagic Design makes DeckLink monitor cards up to 12-bit, with SDI/HDMI 2.0a, Rec. 2020, 4:4:4 colour, up to 8K. Not sure if they will work with this monitor, though, or if you could play games with them. Great for editing, though.
It may not be mandatory, but I am not dropping over $1,000 only to be forced to use an old standard connection when there is no legitimate excuse for a top of the line monitor to not have DisplayPort 2.1 (UHBR20) in 2024.
DP 1.4 forces you to use DSC at 4K 240Hz, whereas you can get that natively for 2.1
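The arithmetic behind that claim is easy to check. A minimal sketch (my own numbers, counting active pixels only; blanking and protocol overhead push the real requirement somewhat higher, and the 12 bpp DSC target is just a commonly used figure):

```python
# Why 4K 240Hz 10-bit needs DSC on DP 1.4 but fits natively on DP 2.1 UHBR20.
# Usable payload rates after line coding, for reference:
#   DP 1.4 HBR3 (8b/10b):      32.4 Gbit/s raw -> ~25.9 Gbit/s usable
#   DP 2.1 UHBR20 (128b/132b): 80 Gbit/s raw   -> ~77.6 Gbit/s usable

def stream_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * hz * bits_per_pixel / 1e9

native = stream_gbps(3840, 2160, 240, 30)  # 10-bit RGB = 30 bits/pixel
dsc    = stream_gbps(3840, 2160, 240, 12)  # a common DSC target of 12 bpp

print(f"native: {native:.1f} Gbit/s")  # ~59.7: exceeds DP 1.4, fits UHBR20
print(f"DSC:    {dsc:.1f} Gbit/s")     # ~23.9: fits DP 1.4's ~25.9 Gbit/s
```

So the uncompressed stream overflows DP 1.4 by more than 2x but sits comfortably inside UHBR20, which is exactly the gap DSC papers over.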
I spent quite a few hours trying to understand all the limits on Wikipedia and was worried I would need 2.1. But no GPU supporting it was quite crazy to me.
Turns out I can run my future 240Hz 4K OLED with DSC. That's definitely not a deal breaker then, especially since the compression is not too awful.
Thanks sir for the deep dive.
Until you get issues like black screen flicker because of DSC and G-Sync.
Also, with DSC as it is, whenever you use a monitor that requires DSC to get all your shiny features (res, refresh, bit depth, etc.), you're effectively limiting the number of monitors you can connect to your GPU, because DSC will use more than one display head to drive that screen. I am typing this on the above-mentioned monitor, and I bought it in anticipation of full-spec DP 2.1 coming with the next generation of GPUs (Nvidia 5000 series, AMD 9000 series) so that I could buy 2 more of these monitors and connect them to the same GPU without having to install a secondary GPU and halve my PCIe lanes (Z790 here).
I have found that between my 4K 120Hz display and my 4K 240Hz display with DSC, I can tell a difference visually when using DLSS. So it's only at non-native resolution that it doesn't scale as well with DSC enabled.
That is apparent in R6 Siege when looking at saran-wrapped crates, for example. Outside of that I can't see a difference.
This is on the MSI MPG 321URX
I have the FO32U2 (non P) version and it's the brightest of the new QDOLEDs. I love it.
I ordered the Aorus FO32U2P, and I didn't really do it for DP2.1, though that is one of the reasons I bought it.
The main reason I bought it, is that it was one of the few QD-OLEDs I could actually get since everything else is out of stock.
Also where I live the price difference between the DP2.1 and the cheaper 1.4 version wasn't big enough to warrant going for the cheaper one, as they already cost a lot of money out of the gate.
This is so good! Have you by chance made a video in this format for HDMI 2.1? I remember it being discussed a lot on HU, but one in this format would absolutely demystify it for beginners, too. Thank you!
Eagerly awaiting the review of the Aorus. Here in Germany the Asus is €1800 and the Aorus €1450, so if there are no big differences in features I will still go for the Aorus, just because Asus is insane in the pricing department.
Another limit of using DSC is multimonitor support. You're limited to 2 monitors in this configuration.
From Nvidia:
When a display is connected to the GPU and is set to DSC mode, the GPU may use two internal heads to drive the display when the pixel rate needed to drive the display mode exceeds the GPU’s single head limit. This may affect your display topology when using multiple monitors. For example if two displays with support for DSC are connected to a single GeForce GPU, all 4 internal heads will be utilized and you will not be able to use a third monitor with the GPU at the same time.
I actually prefer the monitor not working when the cable doesn't support the DP2.1 spec; it's better than being oblivious to the shoddy quality of my cable.
You do realize a small text warning in the OSD is very easy to implement? It could tell you the cable quality is subpar while still letting you use your monitor until a new one arrives. Don't make excuses for companies.
Also, don't forget it's easy to add an OSD toggle to enable and disable DSC; Asus does it, and I think one of the other manufacturers does too. It would let you run DSC on lower-quality cables and go without it on good ones (in the case of the Asus, it locks you to 120Hz without DSC since it doesn't support DP2.1).
What an absurd take. Why would you want something to not work when it can most probably work in a different mode, especially in the default configuration? All it needs to do is keep falling back until it finds a mode that works, then have the OSD pop up a box showing the current connection configuration once it gets a functioning signal.
@@houssamalucad753 I agree with you, but unfortunately my trust in the companies making these monitors is low. Your solution would be the perfect one, but if it isn't implemented, I would rather the monitor not work at all than work in a half-assed way without telling me.
@@Your_Paramour Change the monitor to 1.4 mode and it will work. If I recall correctly, he even said in the video that you can do that.
For me, a VR user, DP2.1 is a must, so I hope GPU makers will use it with the highest possible bandwidth
So if you use HDMI for the higher data rate, will the compression be less?
@monitorsunboxed , which monitor you prefer the Asus PG32UCDM or the Fo32u2P?
Ugh, nice idea, except they come with the monitor's own cables, so it's redundant. Also, it's very hard to tell any speed difference; maybe we will once the 5090s come.
Great episode!!! Finally DSC can be put to rest. Any latency differences?
Regarding DSR/DLDSR and DSC, from the TFT article: "NVIDIA DSR, NVIDIA DLDSR and NVIDIA Image Scaling are supported when DSC mode is enabled, if the pixel rate needed to drive the display mode does not exceed the GPU's single head limit. If GPU uses two or more internal heads to drive the display, NVIDIA DSR, NVIDIA DLDSR and NVIDIA Image Scaling are not supported"
It would be good to see direct input lag comparisons with DSC on and off, and inspection on whether DSC affects the image in any way when there's a lot of very fast movement (like OW2 combat).
Brilliant video thanks
At least you can quickly tell when the cable you are using is not giving you its full potential.
Could DSC compression/decompression be adding a bit of latency?
There's one thing you didn't mention, which I'm quite sad about: Black screens when going in and out of Fullscreen games. In my opinion, it is the biggest down-side of DSC. On some monitors it's only about a second, but I've tested ones where the black screens last more than 5 seconds. When trying to Alt+Tab out of a game quickly to check something, or answer a voice call, etc, it can really be a big annoyance.
Secondly, I've also read reports on the Blurbusters forum of people experiencing a "heavy mouse effect" when using DSC. This is probably negligible for most people, but for people who are very into competitive gaming, it is something to consider.
This is true, but it's specific to Nvidia; they've known for over a year and don't care, sadly.
What you’re saying is true but I fail to see the point you’re trying to make. In the video he DOES talk about NVidia specific issues, namely missed features etc. He dedicates a section of the video to talk about issues arising from the use of DSC, and in my opinion forgets the most annoying one. NVidia has the obviously bigger market share of GPU’s, and displays that require the use of DSC are most often used for gaming, meaning NVidia specific issues, are issues for the majority of people using DSC.
Kept this monitor over the MSI MPG 321URX QD-OLED. HDR is great; the MSI version flickered in dark areas but this monitor doesn't, and I didn't want to rely on firmware updates months later. I'm extremely happy with the AORUS FO32U2P, and as a plus, my MB and GPU are both Gigabyte so I can control everything in GCC.
Great work Tim, very well explained.
Shame the rest of the new OLED monitors didn't come with DP2.1; it would've been nice. Though I expect to see it on the next wave of monitors. Just today Blur Busters teased a 4K 1000Hz demo monitor. I'd expect 1080p 1000Hz as the next jump, along with a DFR mode for 4K monitors. The bandwidth is there.
I just bought this monitor mostly because of the DP output for daisy chaining working with its KVM switch. It allows me to use the same setup for my dual-monitor gaming PC and my work laptop without sacrificing high refresh rate, as the external KVM switches I'd otherwise need are usually limited to 60Hz.
Your most anticipated video of the year for me
Will you guys be doing a 321UPX review touching on how non-upgradeable firmware affects a monitor of this class?
Lastly, I agree. I won't buy a "DSC native" monitor again after owning the Predator X27. I disliked the color compression and the jagged edges it introduced. While they say "visually lossless," I say: run text and show me visually lossless, especially on Discord. Try someone with a red name with DSC and you'll see immediately that the text "bleeds."
Edit: The only place DSC may be acceptable is 8K, since the PPI is so high your eyes will have difficulty seeing DSC artifacts.
great vid thanks 👍👍
Integer scaling is one of the features you can't use with DSC enabled, which is such a shame because I was really looking forward to playing older titles that don't scale well at 4K, 4K being the perfect resolution to integer-scale 1080p titles. I got the FO32U2P to future-proof myself, but yeah, if Nvidia could just update their latest drivers to enable integer scaling, I would appreciate that a WHOLE lot more than waiting for potential DP2.1 80Gbps GPUs to hit the market.
He said researchers saw degradation in the image sometimes, so it's not visually lossless.
I just ordered one and this video made me worried. Isn't this the best OLED because it has 2.1? I had the Alienware QF; am I gonna notice a difference?
Hey! Firstly, love your videos; they helped me a lot in finalizing my monitor. I went for the AW3225QF.
Coming to my question:
I want to use my monitor's DP port to keep my HDMI ports open for a future console setup. Right now I have a laptop with full Thunderbolt 4 functionality. Which cable/adapter/dock should I go for to get the full 4K@240Hz from my monitor? I have a 4080 175W laptop.
Great video as always. I consider myself an informed buyer but videos like this are eye opening. I was unaware that DP 2.1 had different standards for bandwidth, good to know that UHBR20 is the one to look for.
I personally have only had one issue with DSC, which was on my newly purchased LG 32GS95UE. I was using DP 1.4 with the monitor's onboard speakers, but when launching Fallout 76, the game would not output to the speakers. If I cranked the resolution down to 2560x1440 at 120Hz, the speakers would work fine, which I assume has something to do with DSC as the game is launching. I ended up switching over to HDMI 2.1 and locked the game at 139FPS due to a physics issue; that combination seems to have fixed the DSC issue, as the speakers work fine now. I am wondering if 4K at 139FPS is just under the bandwidth requirement for DSC to activate. In a perfect world, I'd prefer to use DP 2.1/UHBR20, but HDMI 2.1 seems to fill the gap well enough.
thx for the knowledge drop.
I love your videos they are great, very nice work 👍👍👍👍
8:25 you need to put an edit into the video, one of those text boxes showing the correction here. I could not understand why Gigabyte would ever include such a tiny cable, and then later you released a whole video explaining that those specific high-end DP cables only come in 1.2-meter lengths. I'm happy you released a video, but please edit this one as well to include the correction.
Well, I'm pretty sure the RTX 5000 series will have DisplayPort 2.1, and because Gigabyte also makes GPUs, they already know this, which is why they are making a DisplayPort 2.1 monitor with UHBR20. I might be wrong, but I would bet on this. 🙂 Thanks for sharing. I'm curious whether it is better than the PG32UCDM or mostly the same; I already got the PG32UCDM and love it. I don't know if it works like this, but maybe with an uncompressed signal, if you alt-tab from a game there will be no flicker or black screen for a moment? Anyway, the main differences will be DisplayPort 2.1 on the Aorus side and Dolby Vision (which should be implemented later this year) on the Asus side. Which is the better choice? What do you think? Thank you!
I've seen a few posts about HDMI problems; people just assume it's the GPU or the display (whichever is the newer of the two). No one thinks it's the cable; we're not used to cables being the problem.
And I use the same DP 1.4 cable I used for my 1440p monitor; there are no issues at all.
This is important info 6:48
DisplayPort daisy-chaining requires twice the bandwidth. Daisy-chaining the DisplayPort signal is useful if you're using the first monitor as a KVM and want to connect a laptop to multiple monitors with a single cable. Of course, we're still waiting on laptops with DisplayPort 2.1. FYI, you might want to test the KVM and daisy-chain capability in the full monitor review.
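To put a rough number on that doubling: two uncompressed 4K 240Hz 10-bit streams would exceed even UHBR20's usable rate, so a chain at those settings still leans on DSC. A quick sketch with my own illustrative figures (active pixels only, blanking and overhead ignored):

```python
# Rough check: can one UHBR20 link carry two chained 4K 240Hz 10-bit streams?

def stream_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * hz * bits_per_pixel / 1e9

one_stream  = stream_gbps(3840, 2160, 240, 30)  # ~59.7 Gbit/s uncompressed
two_streams = 2 * one_stream                    # ~119.4 Gbit/s for the chain
uhbr20_usable = 80 * (128 / 132)                # ~77.6 Gbit/s after 128b/132b coding

print(two_streams > uhbr20_usable)  # True: chaining two such streams needs compression
```

In other words, even the fastest DP 2.1 link only fits one uncompressed stream at these settings, which is why daisy chains at full refresh still rely on DSC or reduced modes.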