I think we also have to take into account the price of the monitor. I don't expect, nor need, DP 2.1 80Gbps on a 200€ monitor, and I doubt most other people do either. But a monitor with top end specs that costs over 1000€ should also come with the newest standard and an appropriate cable included in the box (looking at you Sony). When a monitor already costs 1000+ the cost of the scaler and cable becomes much less of an issue. People buying such high end monitors are not going to skip it because it's 50-200 more expensive. They may however skip it if it has deficiencies in the specs compared to its competitors.
Same stupid shit! Combine this with cables that are labelled with a specific spec but in reality don't support it (cheap Chinese junk). So much wasted time figuring out what is broken when a new cable isn't working properly.
It's the same white collar criminal bean counters and marketing scammers making these decisions as those behind killing 346 people in the Boeing MCAS crashes.
DSC is not always that great. For some reason on my Odyssey OLED G6, when I enable 10-bit at both 240Hz and 360Hz I get a constant green tint and there is nothing I can do about it. So if I want to use HDR I have to set it to 120Hz, which is a bit disappointing.
It’s hard to know. Passive copper cables are going to be difficult for sure at long lengths and high bandwidths. Maybe a different solution will be found though
HDMI already has optical cables for those who need longer cable lengths at the same high speed. The cables cost more, but it's an option. DP 2.1 will have to support the same thing; it's the only option.
They do make converters, they come in various types, but it looks like most are thumbstick style dongles (a bit larger though) that plug into a DP port and have FO connectors on the other side.
@@kaseyboles30 could you list an example? Like, why would someone need it if the monitor is DP 1.4? Even if you convert it to fiber optics you're still hardware bottlenecked by the monitor's main board. The only solution for the time being is getting a fiber optic cable that supports UHBR20 for lengths of more than 1m.
@@MrFaleh1129 DP/HDMI to fiber converters are mostly used for distance today, not bandwidth. For example, Linus of Linus Tech Tips has a server rack filled with various things including a PC for each family member, and uses something like this to have the monitor/keyboard/mouse/etc. wherever it's needed. He then has all the heat generation in one water cooled rack (and the heat is dumped via a heat exchanger into his pool).
@@kaseyboles30 no, I believe he has a normal HDMI cable for video to his monitor, but the USB goes through an Icron hub (costs like $1000) to get his USB peripherals to his gaming room. Those HDMI-to-fiber converters are not 2.1 from what I looked up; I needed this use case and did a lot of research, and there were simply none that support 2.1. If you want the 2.1 standard you need a fiber optic cable, not a converter.
@@MrFaleh1129 He's almost certainly using some sort of extender for HDMI if he's on 2.0 or 2.1; 10-15 feet isn't very far. Though you can get upwards of 50' on HDMI 1.4, I don't see Linus doing 1.4 if he can put in better.
I really wanted a 6K panel with some sort of local dimming (miniLED or OLED) and a high refresh rate (120Hz or more). Having worked with the Pro Display XDR for a few months makes me wish the PC market had something like that, even if the price is outrageous.
Out of all the OLED panels you have tested, which one has the best white balance, i.e. whites that appear white rather than yellow? I'm stuck between the PG32UCDM and 32GS95UE. For me screen coating doesn't matter, only the white point. Any help would be appreciated 👍
Well that kind of depends on what white point you’re used to. Both have pretty good calibration to a 6500K white point but whether that looks white or a bit yellow to you depends on what you’re used to. If you’ve been using a cooler screen it might still look yellow. Both have good adjustment options though at least
@@tftcentral I do prefer cooler tone displays; previously I was using an LG 27GN950. But if it's accurately calibrated to 6500K I'm OK with it. So between these two, the ASUS PG32UCDM and LG 32GS95UE, which one comes with the better calibration out of the box? I don't have any colour calibration tool with me.
@ yeah I have already watched it, but I still couldn't decide because it wasn't a direct comparison. Since you have experienced both panels I would like to hear from you; it would be really helpful for me to make a decision!
I've seen alt+tabbing out of a fullscreen game without DSC causing several second lag too. My understanding is that lag is from switching control back to the OS for driving the display & caused overhead as resolution, refresh rate, etc. gets initialized and communicated. Are you saying that DSC adds more lag or the switch only lagged with it on in your testing? Thanks for summarizing this mess. Would have been nicer if more of the charts had 1.4 and the several 2.1 speeds marked on them but seems easy enough to follow with jumping around.
@@veilmontTV it's better for alt-tabbing, but often has impacts on latency and framerate and on the game's ability to use and control some advanced graphics capabilities. You can run into different results when trying to capture the game, such as for streaming with obs-studio. Sometimes there are bugs impacting the preferred choice. How things are affected can also depend on the game and the graphics library it's currently using, what GPU is in use, and the hardware connected to it, such as multiple monitors. Newer games, OS graphics standards, and GPUs are making things better, but it's still not automatically equal.
So the answer is:
- VESA should release a real DP standard that stops everything being optional;
- people should stop buying Nvidia cards;
- and it is going to take freaking forever for USB4v2/TB5 120 Gbps display connections to be up and running (maybe outside of Apple).
Is there any news about tandem OLED? Tandem OLED has 1000 nits fullscreen brightness, which is very good for BFI, and it has a much higher lifespan than normal OLED. I personally think tandem OLED is the ultimate display technology.
A new oled monitor or tv should last at least 5 years unless you're a super heavy user. People have ten year old oleds with 10k+ hours and no burn in. Just don't abuse it. If you don't want to baby your display at all get an lcd.
@@tftcentral I appreciate what you guys do, I really mean it. But not everything warrants a video and everything is pushed to youtube these days when it could be just as easily portrayed in a bullet point list. I miss being able to read stuff.
Does it even matter buying a DP 2.1 40 or 80 Gbps cable when the monitor has DP 1.4, even with the latest Nvidia GPU? It does not make any difference, right? Or? I am really confused..
@@tftcentral the cable actually matters regardless of the GPU. The monitor is what matters, but you won't get the full Hz of the monitor utilised. Stop the false information.
Well it goes without saying that it needs to be a decent / certified DP 1.4 cable yes. I meant you don’t need a newer DP 2.1 cable if you’re only using a DP 1.4 connection
That’s the safest bet, to cross-reference a cable with their site, but DP 1.4 has been around for ages; you can also go off well known brands or reviews on Amazon etc I’d say
16:57 I've got a very long DP 2.1 cable that will probably support the UHBR20 standard, but it's an active fiber optic cable. I guess that's the only solution right now if you want the DP 2.1 standard.
I bought the MSI 27" OLED 360Hz. The newest firmware is installed and I can switch off DSC, but then I can't activate 360Hz: as soon as I toggle off DSC, only 240Hz is available. But if you do the maths it looks like it should be possible to drive 10-bit WQHD at 360Hz without DSC. The new Asus OLED 480Hz can also support 360Hz without DSC over the same connection, HDMI 2.1. Why isn't it possible on the MSI model? I am sad...
@@aleksandarkrstic7182 I assume the monitor is a 271QRX? Do you use HDR? Since that increases bandwidth requirements. I also think DLDSR can cause limitations if you use that. Also ensure it's not the cable, but if it came with the monitor, it shouldn't be. If all else fails, I'd just contact MSI support and ask about it.
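For what it's worth, the maths on that question really is marginal. A back-of-envelope sketch in Python (the ~10% blanking overhead and the payload figures are my own assumptions for illustration, not anything from the video):

```python
# Does 2560x1440 @ 360 Hz, 10-bit RGB fit uncompressed? A rough sketch only.
def required_gbps(h, v, hz, bpc=10, blanking=1.10):
    # 3 colour channels x bits per channel, plus an assumed ~10% blanking overhead
    return h * v * hz * 3 * bpc * blanking / 1e9

# Usable payload rates after link encoding overhead (raw rate x coding efficiency)
links = {
    "DP 1.4 HBR3 (8b/10b)":      32.4 * 8 / 10,    # 25.92 Gbps
    "HDMI 2.1 FRL6 (16b/18b)":   48.0 * 16 / 18,   # ~42.7 Gbps
    "DP 2.1 UHBR10 (128b/132b)": 40.0 * 128 / 132, # ~38.8 Gbps
    "DP 2.1 UHBR20 (128b/132b)": 80.0 * 128 / 132, # ~77.6 Gbps
}

need = required_gbps(2560, 1440, 360)  # ~43.8 Gbps with this overhead guess
for name, cap in links.items():
    verdict = "fits uncompressed" if cap >= need else "needs DSC"
    print(f"{name}: {verdict} ({cap:.1f} vs {need:.1f} Gbps)")
```

With that overhead guess even HDMI 2.1's ~42.7 Gbps payload falls just short, which may be why one monitor exposes the mode and another doesn't: the margin depends entirely on the blanking timings each firmware uses.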
I've just bought the MSI 27QPX for my rig based on a 5900X and 7900 XTX. My monitor seems out of sync and I can barely work/game on it for more than 5 minutes; it's blurred in a way that messes with my eyes. Based on my LG C1 experience (which I returned), and having an S90C and a Zenbook OLED which are great, how come my new monitor feels this bad, like binoculars with a bad diopter setting?
@@tftcentral The most likely thing is that they will announce it at CES 2026 and it will come out around the end of that year, I am between the 1440p 480hz monitor or continue waiting for the 4k 360hz one.
Don't think that's even an official designation. Are these cables listed on the VESA website as certified DP 2.1 as well? I expect there are many cables out there listed as supporting DP 2.1 when they don't really, or at least haven't been formally certified as DP80 and so on
The abuse happening is automatic, no question. The system has been actively designed to be confusing, easy to obfuscate and easy to use in misleading marketing. Sadly not surprised; codes and standards that afford deception seem to be the desired norm in all modern updates.
VESA's practices need to come under the spotlight. A lot of industries could not get away with these non-standard standards. If it doesn't meet the UHBR10 spec at a minimum they should not be able to call it DP 2.1, full stop. That such products can still get VESA certification is a joke. VESA needs to be held to account. This is just as shameful as the USB "standards"
Sony just released a 480Hz OLED. Do you have it, and are you going to review it? Those cheap bozos didn't include an HDMI 2.1 cable, just a cheap DP one. The FPS Pro+ mode stutters and lags when facing opponents; at first I thought it was connection lag.
Recent manipulations will not break my determination. I'll wait and get a display with DisplayPort 2.1a (with full UHBR20) support, and also graphics card(s), in future. It could be in 2025... or later. I'm in no rush... Companies are more than welcome to postpone collecting the money I am ready to pay for the right product(s). I don't have to convince them to produce the right products; they have to convince us with the right products at the right price. I am not going to waste my money on DisplayPort 1.4 or any HDMI connection... I only invest in "future proof" specs, not "already dead" features...
Dear friend @@ShalowRecord, that is correct. But as you know, that model has its own issues... There will be better and cheaper models in future, if not soon. I'll be waiting and monitoring the market.
I resonate with you, but the situation from a manufacturer & seller standpoint is more complicated. They operate in a capitalistic environment and have to sell stuff & services to exist.
I see you have an article about the two new tcl mini led monitors, are there any plans of reviewing them? They look very interesting for those of us who don't want to buy oled monitors
TL;DW:
1) Not all DP 2.1 connections are the same: UHBR10 = 40 Gbps, UHBR13.5 = 54 Gbps, UHBR20 = 80 Gbps. You need UHBR20 for max speed.
2) UHBR is optional. DP 2.1 = support for DSC, plus one of these: UHBR, Adaptive-Sync secondary data packets, or Link-Training Tunable PHY Repeaters.
3) Input source devices (graphics cards, AMD + Nvidia) are not DP 2.1 ready yet: AMD consumer R7000 up to UHBR13.5; AMD workstation W7000 up to UHBR20; Nvidia nothing in the G4000 or A4000 series.
4) DP 2.1 scalers aren't common; high prices.
5) DP 2.1 cables are very limited; high prices. DP40 = 40 Gbps, DP54 = 54 Gbps, DP80 = 80 Gbps. 1m or 1.5m max for 80 Gbps is quite short.
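To put those tiers in context, here's a rough sketch of what each one could drive uncompressed at 4K 10-bit. The 128b/132b encoding efficiency and the ~10% blanking overhead are my own assumptions, so treat the numbers as ballpark only:

```python
# Rough ceiling on uncompressed 4K 10-bit refresh per DP 2.1 UHBR tier.
TIERS_GBPS = {"UHBR10": 40, "UHBR13.5": 54, "UHBR20": 80}

def max_refresh_hz(raw_gbps, h=3840, v=2160, bpp=30, blanking=1.10):
    # Usable bits/s after 128b/132b link encoding, divided by bits per frame
    payload = raw_gbps * 1e9 * 128 / 132
    return payload / (h * v * bpp * blanking)

for tier, raw in TIERS_GBPS.items():
    print(f"{tier}: ~{max_refresh_hz(raw):.0f} Hz at 4K 10-bit uncompressed")
```

The sketch lands around 140 Hz for UHBR10, 190 Hz for UHBR13.5 and 280 Hz for UHBR20, which is why only full UHBR20 covers today's top-end 4K high refresh panels without DSC.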
Disagree with the comment you made about UHBR10 and up. DP allows connecting a display over one lane, two lanes or four lanes, all at the same per-lane bit rate. Therefore names like UHBR40, 54 and 80 would not make any sense.
Well I understand why they are called what they are, the point was more that for an average consumer it’s confusing naming schemes. Especially when you’ve then got different names for cables
I just want a 42"-48" 4K 240Hz OLED TV with insane brightness, DP 2.1 UHBR20/HDMI 2.1, and BFI/ELMB for around $1000-$1400. I don't think I'm asking for too much. Lol.
Excellent video... Nvidia will release its next versions of GeForce RTX 50xx PC graphics cards in Q1 2025. They will have 3 DisplayPort 2.1 ports. So, this video should be updated with a few seconds about that.
You are absolutely correct. Regardless, just about anywhere you check as of today you get something like: "Yes, the Nvidia GeForce RTX 5090 is widely expected to support DisplayPort 2.1, with many rumors pointing toward the inclusion of the UHBR20 (Ultra High Bit Rate 20) variant. UHBR20 allows a data rate of 20 Gbps per lane, enabling up to 80 Gbps total across four lanes, supporting very high refresh rates and resolutions, such as 8K at higher frame rates or 4K with HDR at extremely high refresh rates. This would be a substantial upgrade from DisplayPort 1.4a in terms of display capabilities, especially for high-end monitors." I'm thinking it would be helpful to work a few seconds of that into your video so there is a more up-to-date spin on the situation. If 2025 is going to be the year where DisplayPort 2.1 and 4K video start to take off, maybe some of your viewers would want to think about that. Putting out an updated version is going to get you hits with minimal effort; nothing wrong with that.
@@tftcentral according to the company that manufactures it lol. Why do you trust VESA so much? Why do you trust their marketing so much? In the same way companies like Sony are being deceitful in their marketing for the Inzone M10S why don't you think VESA could be too?
VESA doesn't manufacture DSC. But this is based on my experience using countless displays. You'll find other major reviewers all say the same thing too; there's no visual or discernible difference visible.
If it is NVIDIA... I expect the 5090 will have full 2.1 80, the 5080 will have 2.1 40, the 5070 will have the slowest 2.1 aka 20, and the 5060 will have that fake 2.1 aka DP 1.4 speed... That would be a very NVIDIA like release! 😂😂😂
Thanks for the video! I understood everything and all I can say is that it's such bullshit that we have to pay 1000eur for a monitor and they can't even make it without DSC. Bunch of aholes. I won't buy a monitor, I guess; I'll stick with my 1080p@240Hz TN panel and buy a VR headset instead, which costs half the price of the monitor ffs.
Lol, that would make it worse... Actually, that might work. :)) But then, how would the industry be able to create confusion in the market when it needs to? xD
Great video. All monitors that are advertised as 2.1 and not uhbr20 are misleading on purpose, plain and simple. They can say it's 2.1 per the standard but we all know what they are trying to pull.
A personal thank you for busting the myths of DP 2.1. Idk why VESA has to make everything so complicated, when in the modern world everything has to be simplified af.
These kinds of concerns really make me want to hold off on getting a monitor. When I buy a monitor, I kind of look at it as a long-term investment, and I can't help feeling like 2025 or 2026 might be the sweet spot for some of this new technology.
I think mandatory DSC plus any one other feature is actually reasonable. I don't really care much as long as I get the advertised high resolution + high refresh, no problems whatsoever. It's not the same bullshit as USB 3.0 being renamed twice
I think the issue with it though is that every bit of DP 2.1 marketing and promotion is about the new speeds it can support. That’s what the connection means in the minds of the consumer, and to then say that actually that feature is optional is open to all kinds of confusion and misinformation. “displayport 2.1” doesn’t actually mean new, faster speeds at all, at least it doesn’t have to
It will be a future display spec for sure. There are clear benefits in image clarity, detail etc from the increased pixel density and resolution. And also benefits with motion clarity, system latency, gaming experience etc from a higher refresh rate. So yes, you’d be able to see it 😀
I don't understand why HDMI and DisplayPort still need to exist now that there's a USB4 standard that supports up to 80Gbps, which is the same as the fastest DisplayPort version. USB should just become the one truly UNIVERSAL standard; anything that requires higher speed should just use OCuLink (aka external PCIe)
USB4 isn't a display signal transfer protocol. It only transfers data via a universal port/connector. You also need a protocol like HDMI or DisplayPort to actually produce and display the data. And DP is actually what is internally used even if going via USB-C.
@@AngelicStreak yes so why do we still need to keep the physical displayport connector around? why do we still have 2 seperate display adapters(dp/hdmi) for monitors and tvs even though the difference is becoming more and more arbitrary
@@Intelligenz_Bestie I didn't say we need connectors. I talked about protocols. And I did that in response to your "i don't understand why hdmi and displaysport still need to exist." And yes, we can go via USB-C port, but will still need an underlying display protocol. As to why we need HDMI/DP ports, we don't.
DP 2.1 is simply around 5-10 years ahead of its time at best. We're already at the point where we see diminishing returns on both resolution and frame rate for desktop size monitors, especially on current gen hardware. Beyond that, the metric that will be pushed in terms of hardware capability is NOT resolution at higher refresh rates; it's graphical fidelity and complexity. So realistically, when it comes to strictly looking at viable resolution x refresh rate numbers, we will see things stagnate almost entirely for years to come.

Going from 1080p 60Hz a decade ago to 4K 240Hz today on DP 1.4 was a big leap. But beyond this point the returns are far less noticeable, yet the cost increases exponentially, so it's simply unattractive to pursue higher gains when we're already bordering on perfection. It's why AMD shot themselves in the foot by trying to force DP 2.1 onto their current gen graphics cards. No one can even make use of it, because any visually intensive task (that would benefit from these data rates) is limited by the graphics hardware itself long before the display output becomes a concern. It's like shoving a bigger engine into your car expecting your commute through traffic to be faster: ultimately you're limited by entirely different factors, and your existing engine is hardly working as it is, so upgrading wouldn't net you any benefit whatsoever.

DP 2.1 is just a bit of a waste for 99.99% of consumers. And it taking the route of USB naming and branching doesn't help either. Literally, if someone were to ask me how DP could ever lose its position to HDMI, I'd say "waste effort on things nobody will realistically use, and make the segmentation so obfuscated that people don't understand what they're buying". And that's literally what they have done. An absolute self-inflicted wound. Sometimes I really can't fathom how professional people are behind these decisions... Amateurs.
We need 8K displays in 16:9 and 16:10 formats, and we need them to have the electronics to drive all the mainstream lower resolutions integer-scaled at correspondingly higher refresh rates (4K, 1440p/1600p, 1080p/1200p, 720p/800p). A 40 inch 16:10 8K monitor capable of all that, with the best reliable panel technology available, would be a marvelous thing. One day... And DSC is a compromise, a workable one perhaps, but DP 2.1 and higher would allow some of these resolution & Hz combinations to be displayed without compromise.
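The integer scaling wish above is at least geometrically clean for 16:9. A quick sketch checking that the listed mainstream resolutions divide evenly into 7680x4320 (just arithmetic; whether a given panel's scaler supports it is a separate question entirely):

```python
# Check which mainstream 16:9 resolutions integer-scale into 8K (7680x4320).
targets = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
    "720p":  (1280, 720),
}

W, H = 7680, 4320
for name, (w, h) in targets.items():
    # A clean integer scale needs the same whole-number factor on both axes
    ok = W % w == 0 and H % h == 0 and W // w == H // h
    label = f"clean {W // w}x integer scale" if ok else "no clean integer scale"
    print(f"{name}: {label}")
```

The 16:10 equivalents work out the same way against a 7680x4800 panel: 1600p is 3x, 1200p is 4x, 800p is 6x.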
@@edfort5704 That's a whole lot of yapping to still say nothing. DP 2.1 is useless for the next 5-10 years because 99.99% of people will never have any use for it. You're not gonna run anything at 4k +400hz in the first place so why support it. Waste of effort and money. Better to invest into optical display cables then rather than waste it on useless old wire tech.
@@Real_MisterSir Higher refresh rates benefit anything and everything that involves moving images, including simple browsing or scrolling through documents. We are very far from reaching any perceptibility threshold in this area, and the only diminishing returns with higher refresh rates come when the jump is too small (e.g. from 240Hz to 360Hz). Many games may be hard to run at very high fps, but higher Hz will benefit even fps-capped games through lower latency. It's just that it's not worth it for the average consumer to make incremental hardware upgrades, only large ones (2x, 4x etc.). I too hope to see the potential of optical cables explored more by the industry.
What's the need though? Nobody is pushing 1.4 anyway, so it seems entirely useless. The actual games worth playing on a high end graphics card will never get anywhere close to pushing past 4K 240Hz in the first place. Cyberpunk maxed out on a 4090 barely runs at 40fps without DLSS at 4K, and that's been out for years now. Any game worth getting a 5090 for will not pass even 144Hz at 4K, so why even bother forcing DP 2.1 that no one will use other than for arbitrary bragging rights and trying to run CS2 at 500+ fps.. It's so useless. Diminishing returns that 99% of people will never care about.
I think to solve the cable length problem, fibre really is the only way. That’s going to be insanely expensive though.. there is no way I could use a 1 meter cable in my setup. I’m using 2x 5 meter club3d DP 1.4 cables and they work great.
I would not say fiber is expensive. Economy of scale: as it becomes more common the prices will drop, as they did when internet cables went from copper to fiber. A 10m fiber cable currently costs 65€. Assuming a 5m cable would be half that at 32.5€, it's on the high side but reasonable. However, current DP 2.1 fiber cables are only UHBR13.5 from what I see, labelled DP54 (54Gbps). I see no reason why a 3-5m fiber cable supporting the full 80Gbps could not be sold for less than 20€ in the future.
DSR is NVIDIA's downscaling feature, while DLSS is the upscaling feature. I believe AMD's VSR downscaling feature is also incompatible with DSC based on my limited testing. NVIDIA's next generation of GPUs will support DP 2.1a based on recent rumours. Hopefully this will give monitor manufacturers the impetus to support it. Longer UHBR20 cables would be nice, too. EDIT: Ah right yeah you mention this in the video :/ My bad for commenting before watching the whole thing.
I look at DSR as upscaling, but I see what you mean. From a monitor point of view it allows upscaling, accepting a higher res input than the res of the panel. But from a gfx card point of view it's downscaling that higher res to the panel's lower res.
Can't find the source now (kind of a niche topic), but I'm pretty sure AMD's VSR is compatible with DSC, unlike DSR. Building on your comment though, it sounds dumb, but at 1080p using DSR to render at 1.5x and then downscale, combined with DLSS 3 Quality/DLAA, looks amazing in the games I've run it in.
@@tftcentral Well, it is a bit strange to call it "upscaling" when it does the complete opposite of the widely understood "upscaling" that features such as DLSS or FSR do
Hi, please go try it yourself: DSC causes input lag you can easily feel versus native. The easiest way to prove this to yourself is to test it using gyro at 8k polling rate. With gyro you can easily feel input lag, unlike with a mouse, where you can compensate for lag by increasing force with your hand. Use HDMI 2.1 native on an OLED 240 vs HDMI 2.0 with DSC; it's night and day. It's also very easy to see visually in terms of clarity in competitive games: just snipe and look around fast, it's night and day. Native is so much better
@@imadecoy. It's literally so annoying that people say this and never test it. I can literally tell 100/100 times if DSC is on or not within 5 seconds 🤦‍♂️ I don't care what people think the math number is, it's irrelevant; it's night and day in terms of feel when all the other end to end latencies are tuned and low
Vesa must be colluding or they all need to get fired. This is *literal* false marketing. They're using a technicality, but for false marketing it just has to be misleading - which it is, and seems to be designed this way on purpose.
Because people think that compression = bad. There was a time when FLAC was called an inferior format because it compressed the audio, but people grew out of that. We just need time and reviews
DSC makes editing the EDID difficult, which is important if the manufacturer sets the wrong nits value in the EDID. Also, DSC means no DLDSR (unless there's a weird workaround). At 1440p at 27 inch, the screen looks pixelated compared to 4K; DLDSR would fix that. Also, alt-tabbing or going from one full screen app to another will result in a 5 to 7 second black screen.
It's sad how hard they dropped the ball on standardizing the actual cable spec, allowing sellers to get away with using the DP 2.1 branding without supporting the full bandwidth of the spec.
Agreed. Same boat as HDMI 2.1 on that front
Yup
The worst part is that they can use the DP2.1 label and only offer the same speeds as DP1.4
@livedreamsg yeah I don’t see any reason why that should be allowed personally. If it’s only old speeds, they should have to call it DP 1.4. Fine if they add other new bits to that, but DP 2.1 is ALL about the new speeds in the minds of the consumer
Bean counters and marketing scammers shouldn't be allowed within firing range of making standards.
I'm not surprised; for example, in the past, we had "HD ready" display, and then "Full HD".
Now, I guess, we would have "DP 2.1 ready", and then "Full DP 2.1" or something...
HD ready vs full hd: These mean two very different things, and I'm not talking about resolution.
HD Ready was an actual set of requirements by the standards body, and minimum resolution was just one of them. It was to protect consumers from being duped by manufacturers.
Full HD, 1080p, etc, were random labels that manufacturers stuck on their TVs. They had no value other than honour. A Full HD TV could very well not satisfy all the HD Ready requirements.
It was manufacturers massively muddying the waters.
That's also why back then you found beamers advertising Full HD while having a much lower resolution. Their small print would say they accept Full HD input; it had no legal meaning. You could sell water bottles with a "Full HD" label.
What a trainwreck of a specification this is... thanks for shining a light on it!
One thing not mentioned here that Wendell (Level1Techs) has mentioned: the cables. Apparently even the DisplayPort connector is a limiting factor, and AMD has made some kind of Mini DisplayPort connector for its Pro level graphics cards which support 2.1. It apparently handles the signal much better. He did a video on it.
He talked at length about the cables currently available being too short to make the standard worth using at all.
Wow, they certainly do know how to mess and mud things up! Thanks for the heads up!
Thank you for such a clearly explained no nonsense video covering this confusing topic, very helpful.
Thank you for the feedback, I’m glad you enjoyed it 🙂
HDMI forum: We f*ed the HDMI 2.1 specs at launch.
VESA: Hold my beer!
And now you know what "learning from their mistakes" really means.
Why hasn't the industry adopted an optical connection for displays?
2 meters of OM5 optical cable that supports 100Gbps costs about $18, which is cheaper than a hypothetical 2 meter DP 2.1 80Gbps cable.
And optical cables can go hundreds of feet in length.
The transceivers use about 4 watts of power, so that is not the reason.
Probably the cost of the transceivers then?
Optical cables are less durable for the same cost, so mass producing optical cables would be significantly more costly and require massive changes to current manufacturing processes, and the benefits for 99% of users would be negligible
We should, but digital rights management is king, and eventually no one will own DVDs or Blu-rays and your computer will just be a kiosk; it's what Windows and Intel are working up to. Also helps marxists stay in control by spying on what you do. You should check out the creepiness of Intel's driver assist manager.
@@deloford what about OCuLink?
Because it's not just the cable that would need to change
@kougamecs3876 64 Gbps max
Great informative video with no BS.
This was incredibly insightful! I will have to refer to this at a later date, as some of the finer details are often lost in translation.
It's worth mentioning that losing Nvidia features (DSR, DLDSR, etc.) has nothing to do with DSC. It's entirely to do with hitting the pixel rate that causes the GPU to use more than one display head (of a total 4 on the card) for that output. This is why sometimes the features are available with DSC and sometimes not. It probably also causes the extended black flickering issue.
Monitors Unboxed covered this in their DP 2.1 video.
That’s true, and their info was based on our original article on the topic that was shown in this video 🙂 I realise I should have linked to it in the description too, now added 👍
I made a comment on the review video for the PG27AQDP thinking I understood but of course I didn't. Thank you for explaining this and I apologize for my ignorant comment even if you may not remember. I will do better next time rather than make assumptions about things I've read somewhere on the internet
No worries at all. I remember the conversation but I’m glad you found this additional video 🙂
The PS5 has 8K labelled on its box, what do you expect? They'll do anything to increase sales, even tell the "half truth."
460p upscaled to 8K… enjoy!
😂😂😂
You can be sure that even on the PS5 Pro they will tout some games as 8K; it will be 8K upscaled, not native.
I think we also have to take into account the price of the monitor. I don't expect, nor need, DP 2.1 80Gbps on a 200€ monitor. I doubt most other people do either.
But buying a monitor that has top end specs and that costs over 1000€ should also come with the newest standard and appropriate cable included in the box (looking at you Sony).
When a monitor already costs 1000+ the costs of the scaler and cable becomes much less of an issue.
People buying such high-end monitors are not going to skip them because they're 50-200 more expensive.
They may, however, skip one if it has deficiencies in the specs compared to its competitors.
What an absolute mess. How can anyone with common sense have created this standard? It’s like USB… what a mess that is too!
Same stupid shit! Combine this with cables that are labelled with a specific spec but in reality don't support it (cheap Chinese junk). So much wasted time figuring out what is broken (a new cable not working properly).
usb hdmi and now display port 🤮
It's the same white-collar criminal bean counters and marketing scammers making these decisions as those behind killing 350 people in the Boeing MCAS crashes.
thats what bureaucracy gets ya
Can you review the FO27Q2?
Really enjoy your content, it's very informative. Subscribed.
Many thanks :)
DSC is not always that great. For some reason on my Odyssey OLED G6, when I enable 10-bit at both 240Hz and 360Hz I get a constant green tint, and there is nothing I can do about it. So if I want to use HDR I have to set it to 120Hz, which is a bit disappointing.
Have you tried different cables?
Im still using my 10ft 1.4 DP cable 😊
Do you think manufacturers will be able to eventually make longer 2.1 cables in the future, or is it unlikely?
It’s hard to know. Passive copper cables are going to be difficult for sure at long lengths and high bandwidths. Maybe a different solution will be found though
@@tftcentralmaybe it's time for fiber cables? It seems that is inevitable just a question of when, right?
@@Silverhks It's such a clear solution there are already media-converters that plug into a DP or HDMI port and have a fiber-optic output.
HDMI already has optical cables for those who need longer cable lengths at the same high speed. The cable costs more but it's an option. DP 2.1 has to support the same thing; it's the only option.
So it looks like the 5090 will have DisplayPort 2.1. Will the GIGABYTE AORUS FO32U2P be the one to buy?
Yes
With full bandwidth
You never know when it's Nvidia…
Maybe full 80 with 5090
40 with 5080
20 with 5070
And fake 1.4 version in 5060…
That would make sense!
😂😂😂
Brilliant Explanations! Exactly what i was looking for.
“Visually lossless” is a slimy term. It’s no wonder people don’t trust it. It’s like calling mp3 “aurally lossless”.
When will this copper be replaced by light? No constraints on length and MUCH higher bandwidth (up to 800 Gbps).
They do make converters, they come in various types, but it looks like most are thumbstick style dongles (a bit larger though) that plug into a DP port and have FO connectors on the other side.
@kaseyboles30 could you list an example, like why would someone need it?
If the monitor is DP 1.4, even if you convert to fiber optics you're still hardware-bottlenecked by the monitor's main board.
The only solution for the time being is getting a fiber optic cable that supports UHBR 20 for lengths of more than 1m.
@@MrFaleh1129 DP/HDMI to fiber converters are mostly used for distance today, not bandwidth. For example Linus of Linus tech tips has a server rack filled with various things including a pc for each family member and uses something like this to have the monitor/ keyboard/ mouse/ etc. wherever it's needed. He then has all the heat generation in one water cooled rack (and the heat is dumped via a heat exchanger into his pool).
@@kaseyboles30 no
I believe he has normal HDMI for video to his monitor, but the USB is the Icron hub (costs like $1000) to get his USB and peripherals to his gaming room.
Those HDMI-to-fiber converters are not 2.1 from what I looked up. I needed this use case and did a lot of research, and there were simply none that support 2.1.
If you want the 2.1 standard you need a fiber optic cable, not a converter.
@@MrFaleh1129 He's almost certainly using some sort of extender for hdmi if he's on 2.0 or 2.1. 10-15 feet isn't very far. Though you can get upwards of 50' on hdmi 1.4, but I don't see Linus doing 1.4 if he can put in better.
Can you review the Asus XG27ACDNG 360hz please!
Any more info about active DP 2.1 80-compatible cables? Price? How long are they?
Also, the daisy chaining feature is going to be great.
I really want a 6K panel with some sort of local dimming (miniLED or OLED) and a high refresh rate (120Hz or more). Having worked with the Pro Display XDR for a few months makes me wish the PC market had something like that, even if the price is outrageous.
any news about 5k2k OLEDs?
Out of all the OLED panels you have tested, which one has the best white balance, i.e. whites appear white rather than yellow? I'm stuck between the PG32UCDM and the 32GS95UE. For me screen coating doesn't matter, only the white point. Any help would be appreciated 👍
Well that kind of depends on what white point you’re used to. Both have pretty good calibration to a 6500K white point but whether that looks white or a bit yellow to you depends on what you’re used to. If you’ve been using a cooler screen it might still look yellow. Both have good adjustment options though at least
@tftcentral I do prefer cooler-tone displays; previously I was using an LG 27GN950, but if it's accurately calibrated to 6500K I'm OK with it. So between these two, the ASUS PG32UCDM and LG 32GS95UE, which one comes with the best calibration out of the box? I don't have any colour calibration tool.
@Arno641 have a check on our reviews of both on the main site as it’s tested there in a variety of modes you may want to use 😀
@ Yeah, I have already watched it, but I still couldn't decide because it wasn't a direct comparison. Since you have experienced both panels, I would like to hear from you; it would really help me make a decision!
@Arno641 I mean the written reviews on the website, they contain the measurements you’d need. I can’t provide any more info than in there really
very well explained, good video, thanks!
DSC has more latency than without? Or the same
The same
I've seen alt+tabbing out of a fullscreen game without DSC causing several second lag too. My understanding is that lag is from switching control back to the OS for driving the display & caused overhead as resolution, refresh rate, etc. gets initialized and communicated. Are you saying that DSC adds more lag or the switch only lagged with it on in your testing?
Thanks for summarizing this mess. Would have been nicer if more of the charts had 1.4 and the several 2.1 speeds marked on them but seems easy enough to follow with jumping around.
Just play your games in borderless
@veilmontTV Better for alt-tabbing, but it often has impacts on latency + framerate and on the game's ability to use and control some advanced graphics capabilities. You can run into different results when trying to capture the game, such as for streaming with obs-studio. Sometimes there are bugs impacting the preferred choice. How things behave can also depend on the game and its graphics library, what GPU is in use, and what hardware is connected to it, such as multiple monitors. Newer games, OS graphics standards, and GPUs are making things better, but it's still not automatically equal.
So the answer is:
- VESA should release a real DP standard that stops everything being optional;
- people should stop buying Nvidia cards;
- and it is going to take freaking forever before USB4v2/TB5 120 Gbps display connections are up and running (maybe outside of Apple).
Thanks for the informative video
Is there any news about tandem OLED? Tandem OLED has 1000 nits full-screen brightness, which is very good for BFI, and it has a much higher lifespan than normal OLED. I personally think tandem OLED is the ultimate display technology.
A new oled monitor or tv should last at least 5 years unless you're a super heavy user. People have ten year old oleds with 10k+ hours and no burn in. Just don't abuse it. If you don't want to baby your display at all get an lcd.
is there a tldw?
Not really. You’ll find it’s no nonsense, to the point, so it’s all worth watching to learn about these problems 😀
@@tftcentral I appreciate what you guys do, I really mean it. But not everything warrants a video and everything is pushed to youtube these days when it could be just as easily portrayed in a bullet point list. I miss being able to read stuff.
Does it even matter buying a DP 2.1 40 or 80 Gbps cable when the monitor has DP 1.4, and the latest Nvidia GPU too? It doesn't make any difference, right? Or? I am really confused...
If your graphics card and/or monitor only has DP 1.4 then just get a regular DP 1.4 cable, which can be longer and will be cheaper too.
@tftcentral the cable actually matters regardless of GPU; the monitor is what matters, but you won't get the full Hz of the monitor utilized. Stop the false information.
Well it goes without saying that it needs to be a decent / certified DP 1.4 cable yes. I meant you don’t need a newer DP 2.1 cable if you’re only using a DP 1.4 connection
@tftcentral so every VESA-certified cable on their page is safe to use, and I don't need to worry about the pin problem?
That’s the safest bet to cross refer a cable with their site but DP 1.4 has been around ages, you can also go off well known brands or reviews on Amazon etc I’d say
16:57 I've got a very long DP 2.1 cable that will probably support the UHBR20 standard, but it's an active fiber optic cable.
I guess that's the only solution right now if you want the DP 2.1 standard.
Yeah fibre cables will prob be ok but any kind of passive copper cable will struggle
I bought the MSI 27 OLED 360Hz. The newest firmware is installed and I can switch off DSC, but I can't activate 360Hz: as soon as I toggle off DSC, only 240Hz is available. But if you do the maths, it should be able to drive 10-bit WQHD 360Hz without DSC. The new Asus OLED 480Hz can also support 360Hz without DSC, over the same connection, HDMI 2.1.
Why isn't it possible on the MSI model? I am sad...
Mine is that OLED and I can power it fine at 360Hz; you must have a crappy cable.
It is possible. Which one do you have?
A cable has two connections, so, what does your video card support?
@@Mortac rtx 4090
@@aleksandarkrstic7182 I assume the monitor is a 271QRX?
Do you use HDR? Since that increases bandwidth requirements.
I also think DLDSR can cause limitations if you use that.
Also ensure it's not the cable, but if it came with the monitor, it shouldn't be.
If all else fails, I'd just contact MSI support and ask about it.
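For what it's worth, a rough back-of-envelope check suggests why this mode is so marginal. These are my own assumptions (RGB 10-bit, roughly 6% blanking overhead, HDMI 2.1 FRL at 48 Gbps with 16b/18b coding; real timings vary), not figures from the video:

```python
# Back-of-envelope check: does WQHD 360Hz 10-bit fit in HDMI 2.1?
# Assumptions: RGB (no chroma subsampling), ~6% blanking overhead,
# FRL 48 Gbps link with 16b/18b coding -> usable payload.
pixels_per_sec = 2560 * 1440 * 360              # WQHD @ 360 Hz
need_gbps = pixels_per_sec * 30 * 1.06 / 1e9    # 10 bpc x 3 channels + blanking
hdmi21_payload = 48 * 16 / 18                   # ~42.7 Gbps usable

print(f"need ~{need_gbps:.1f} Gbps, HDMI 2.1 payload ~{hdmi21_payload:.1f} Gbps")
```

On these numbers the mode needs about 42.2 Gbps against roughly 42.7 Gbps of payload, i.e. it only just squeezes in, so small differences in the timings a monitor's firmware advertises could plausibly tip it either way.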
Great video thanks!
Alt tab is so annoying when you have to use DSC. I ended up going from 2k 360hz to 2k 240hz because of DSC
Just run games in borderless...
Nvidia does need to fix their shit
If they're running 360hz they care about latency and IIRC fullscreens still provide reduced input lag. @@veilmontTV
I've just bought the MSI 27QPX for my ROG build based on a 5900X and a 7900 XTX. My monitor seems out of sync and I can barely work / game on it for more than 5 minutes; it's blurred in a way that messes with my eyes. Based on my LG C1 experience (which I returned), and having an S90C and a Zenbook OLED which are great, how come my new monitor feels this bad, like binoculars with a bad diopter setting?
Any news about 4k, 360hz, 32 inch, full gorilla glass coating monitors?
No none yet. Only company using gorilla glass is Dough, and they’re currently working on their 32” 4K 240Hz dual mode model
@tftcentral Most likely they will announce it at CES 2026 and it will come out around the end of that year. I'm torn between the 1440p 480Hz monitor and continuing to wait for the 4K 360Hz one.
Most importantly 12-bit color depth and type-c connection
All of my Certified DP cable are all DP 2.1A. Does this A have any real meaning?
Don't think that's even an official designation. Are these cables listed on the VESA website as certified DP 2.1 as well? I expect there's many cables out there which are listed as supporting DP 2.1 when they don't really, or at least haven't been formally certified as DP80 and so on
The abuse happening is automatic, no question. The system has been actively designed to be confusing, easy to obfuscate, and easy to use in misleading marketing. Sadly not surprised; these codes and standards that afford deception seem to be the desired norm in all modern updates.
Yeah, they probably got paid off to allow the obfuscation and misleading possibilities.
Of course. They will take like 2-3 releases to get it right. Very scummy.
VESA's practices need to come under the spotlight. A lot of industries could not get away with these non-standards standards. If it doesn't meet the UHBR10 spec at a minimum, they should not be able to call it DP 2.1, full stop. That you can still get VESA certification is a joke. VESA needs to be held to account. This is just as shameful as the USB "standards".
Good info, I like my pc far away. 5m cables are not gonna support this then.
Not for now no, and perhaps never 👎
Thanks ❤❤❤
Sony just released a 480Hz OLED. You guys have it and are going to review it? Those cheap bozos didn't include an HDMI 2.1 cable, just a cheap DP one. The FPS Pro Plus mode stutters and lags when facing opponents; at first I thought it was connection lag.
They also have a weak, measly 1-year warranty on it. That's pathetic.
No plans to review their model at the moment I’m afraid
@@tftcentral :(
Display Port introduces new standard with confusing names
USB: First time?
So Should i buy a case that fits on my desk to future proof for DP2.1 ? 😅
Haha yes absolutely 👍
Just realized I ordered a DP40 cable for my FO32U2P... Yeah this naming scheme is seriously bad
Looks like the clusterfook that happened with USB 3.0/3.1/4.0
Recent manipulations will not break my determination.
I'll wait and get DisplayPort 2.1a (with full UHBR20) supported display and also graphics card(s) in future.
It can be in 2025... or later. I have no rush.... Companies are more than welcome to postpone the money collection which I am ready to pay for right product(s).
I do not have to convince them to produce the right products. They have to convince us with the right products at the right price.
I am not gonna waste my money for DisplayPort 1.4 or any HDMI connection...
I only invest in "future proof" specs, not "already dead" features...
Gigabyte already has a 4K OLED with full-bandwidth DP 2.1.
Dear friend @@ShalowRecord
That is correct. But as you know, that model has its own issues... There will be much better and much cheaper models in the future, if not soon.
I'll be waiting and monitoring the market.
I resonate with you, but the situation from a manufacturer & seller standpoint is more complicated. They operate in a capitalistic environment and have to sell stuff & services to exist.
I see you have an article about the two new tcl mini led monitors, are there any plans of reviewing them? They look very interesting for those of us who don't want to buy oled monitors
No plans I’m afraid
@@tftcentral alright, thank you for answering
Why leave Intel Arc cards out? They have UHBR13.5.
No specific reason, just didn’t mention them. Sorry Intel 😀
TL:DW;
1) not all DP 2.1 connections are the same
UHBR10 = 40 Gbps, 13.5 = 54 Gbps, 20 = 80 Gbps
need UHBR20 for max speed
2) UHBR is optional. DP 2.1 = support DSC, and one of these: UHBR, AdaptiveSync secondary packets, or Link-Training Tunable PHY Repeaters
3) input source devices not DP 2.1 ready yet
graphics cards, AMD + Nvidia
AMD consumer R7000 up to UHBR13.5
AMD workstation W7000 up to UHBR20
Nvidia nothing in G4000 or A4000 series
4) DP 2.1 scalers aren't common
high prices
5) DP 2.1 cables are very limited
high prices
DP40 = 40 Gbps, DP54 = 54 Gbps, DP80 = 80 Gbps
1m or 1,5m max for 80 Gbps is quite short
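To make those tiers concrete, here's a rough link-budget sketch. The assumptions are mine (RGB with no chroma subsampling, ~6% blanking overhead, 8b/10b coding on DP 1.4 and 128b/132b on DP 2.1; real timings vary), checking whether 4K 240Hz 10-bit fits uncompressed:

```python
# Which DP links can carry 4K 240Hz 10-bit RGB uncompressed?
# Rough sketch: ~6% blanking overhead assumed; real timings vary.
def required_gbps(width, height, hz, bpc, blanking=1.06):
    return width * height * hz * bpc * 3 * blanking / 1e9  # 3 = RGB channels

LINKS = {  # raw link rate x coding efficiency = usable payload (Gbps)
    "DP 1.4 (HBR3)":   32.4 * 8 / 10,     # 8b/10b    -> ~25.9
    "DP 2.1 UHBR10":   40.0 * 128 / 132,  # 128b/132b -> ~38.8
    "DP 2.1 UHBR13.5": 54.0 * 128 / 132,  #           -> ~52.4
    "DP 2.1 UHBR20":   80.0 * 128 / 132,  #           -> ~77.6
}

need = required_gbps(3840, 2160, 240, 10)
for name, cap in LINKS.items():
    verdict = "fits uncompressed" if need <= cap else "needs DSC"
    print(f"{name}: ~{cap:.1f} Gbps payload -> {verdict} (need ~{need:.1f})")
```

On these assumptions only UHBR20 carries 4K 240Hz 10-bit without DSC, which is the crux of point 1: a "DP 2.1" port at UHBR10 or 13.5 still relies on compression for the headline modes.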
Disagree with the comment you made about UHBR10 and up, because DP allows connecting a display over one, two, or four lanes, all at the same per-lane bitrate. Therefore names like UHBR40, 54, and 80 would not make any sense.
Well I understand why they are called what they are, the point was more that for an average consumer it’s confusing naming schemes. Especially when you’ve then got different names for cables
I just want a 42"-48" 4k 240hz OLED TV with insane brightness, DP2.1/HDMI2.1 UHBR20, and BFI/ELMB for around $1000-$1400. I dont think I'm asking for too much. Lol.
You and everyone else 😂stand in line.
It will take 2-5 years to get to that price because DP 2.1 is new tech.
It’s already available but only for 32in
Not gonna happen… but who cares… TVs will be HDMI devices to the end…
Excellent video...
Nvidia will release its next versions of GeForce RTX 50xx PC graphics cards in Q1 2025. They will have 3 DisplayPort 2.1 ports. So, this video should be updated with a few seconds about that.
That’s not officially announced yet though is it?
You are absolutely correct.
Regardless, just about anywhere you check as of today you get something like:
"Yes, the Nvidia GeForce RTX 5090 is widely expected to support DisplayPort 2.1, with many rumors pointing toward the inclusion of the UHBR20 (Ultra High Bit Rate 20) variant. UHBR20 allows a data rate of 20 Gbps per lane, enabling up to 80 Gbps total across four lanes, supporting very high refresh rates and resolutions, such as 8K at higher frame rates or 4K with HDR at extremely high refresh rates. This would be a substantial upgrade from DisplayPort 1.4a in terms of display capabilities, especially for high-end monitors."
I'm thinking it would be helpful to work a few seconds of that into your video so there is a more up to date spin on the situation.
If 2025 is going to be the year where DisplayPort 2.1 and 4k video starts to take off - maybe some of your listeners would want to think about that.
Putting out an updated version is going to get you hits with minimal effort - nothing wrong with that.
DSC is not lossless, it's lossy
Visually lossless I said. Which it is
@@tftcentral according to the company that manufactures it lol.
Why do you trust VESA so much? Why do you trust their marketing so much? In the same way companies like Sony are being deceitful in their marketing for the Inzone M10S why don't you think VESA could be too?
VESA don't manufacture DSC. But this is based on my experience using countless displays. You'll find other major reviewers all say the same thing too; there's no visual or discernible difference visible.
@heatnup It's visually lossless; I agree with them based on my experience.
Will the 50 series have dp2.1 80 gbps?
No one knows yet. We hope so as it will help move things forward when NVIDIA adopt it
Most likely yes.
Rumors are saying yes
If it is NVIDIA… i expect that
5090 will have full 2.1 80
5080 will have 2.1 40
5070 will have the slowest 2.1, aka 20
And 5060 will have that fake 2.1 aka dp 1.4 speed…
That would be very NVIDIA like release!
😂😂😂
Thanks for the video! I understood everything, and all I can say is that it's such bullshit that we have to pay 1000eur for a monitor and they can't even make it work without DSC. Bunch of aholes. I won't buy a monitor, I guess; I'll stick with my 1080p@240Hz TN panel and buy a VR headset instead, which costs half the price of the monitor ffs.
They should have just made it UHBR 10 = DP 2.0, UHBR 13.5 = DP 2.1, UHBR 20 = DP 2.2
Lol, that would make it worse... Actually, that might work. :)) But then, how would the industry be able to create confusion in the market when it needs to? xD
What is the game at 1:45?
Not sure I’m afraid. It was a demo at Gamescom last year in 2023
Looks like some kind of MOBA.
wow
Well that sucks.
Another way to screw the consumer. We need a law to stop this anticonsumer behaviour and splitting up of standards.
Between VESA and the forum that handles USB. Making things easy for consumers has been totally borked.
4k 480Hz Oled.
Great video.
All monitors that are advertised as 2.1 and not uhbr20 are misleading on purpose, plain and simple. They can say it's 2.1 per the standard but we all know what they are trying to pull.
I’d be very wary of any manufacturer who doesn’t explicitly refer to the UHBR speed for sure. Red flag!🚩
Not true
A personal thank you for uncovering the myths of DP 2.1. Idk why VESA has to make everything so complicated when, in the modern world, everything should be simplified af.
These kind of concerns really make me want to hold off on getting a monitor. When I buy a monitor, I kind of look at them as a long-term investment. And I'm certain to feel like 2025, 2026 might be the sweet spot for some of this new technology.
So, it's a mess, just as I expected.
Will somebody please wake me up in 5 years to check the state of DisplayPort 2.1...
I think mandatory DSC plus one other feature is actually reasonable. I don't really care much as long as I get the advertised high resolution + high refresh with no problems whatsoever. It's not the same bullshit as USB 3.0 being renamed twice.
I think the issue with it though is that every bit of DP 2.1 marketing and promotion is about the new speeds it can support. That’s what the connection means in the minds of the consumer, and to then say that actually that feature is optional is open to all kinds of confusion and misinformation. “displayport 2.1” doesn’t actually mean new, faster speeds at all, at least it doesn’t have to
4K 360hz??? What media would ever use that? Are my eyes even capable of *seeing* it?
It will be a future display spec for sure. There are clear benefits in image clarity, detail etc from the increased pixel density and resolution. And also benefits with motion clarity, system latency, gaming experience etc from a higher refresh rate. So yes, you’d be able to see it 😀
I don't understand why HDMI and DisplayPort still need to exist now that there's a USB4 standard that supports up to 80 Gbps, the same as the fastest DisplayPort version. USB should just become the one truly UNIVERSAL standard; anything that requires higher speed should just use OCuLink (aka external PCIe).
USB-C operates using displayport Alt mode though. Same challenges exist
USB4 isn't a display signal transfer protocol. It only transfers data via a universal port/connector. You also need a protocol like HDMI or DisplayPort to actually produce and display the data. And DP is actually what is internally used even if going via USB-C.
@AngelicStreak yes, so why do we still need to keep the physical DisplayPort connector around? Why do we still have two separate display connectors (DP/HDMI) for monitors and TVs even though the difference is becoming more and more arbitrary?
@@Intelligenz_Bestie I didn't say we need connectors. I talked about protocols. And I did that in response to your "i don't understand why hdmi and displaysport still need to exist." And yes, we can go via USB-C port, but will still need an underlying display protocol. As to why we need HDMI/DP ports, we don't.
@tftcentral we don't want that; there's more compression and latency. No.
DP 2.1 is simply around 5-10 years ahead of its time at best. We're already at the point where we see diminishing returns on both resolution and frame rate for desktop size monitors, especially on current gen hardware. Beyond that, going forward, the metrics that will be pushed in terms of hardware capability is NOT resolution at higher refresh rates. It's graphic fidelity and complexity. So realistically when it comes to strictly looking at viable resolution x refresh rate numbers, we will see it stagnate almost entirely for years to come. Going from 1080p 60hz a decade ago, to 4k 240hz today on DP 1.4, was a big leap. But beyond this point the returns are far less noticeable, yet the cost increases exponentially - so it's simply unattractive to pursue higher gains when we're already bordering on perfection.
It's why AMD shot themselves by trying to force DP 2.1 on their current gen graphics cards. No one can even make use of it because pushing any visually intensive task (that would benefit from these data rates), is simply way too limited by the graphics hardware itself before the display output even becomes a concern. It's like shoving a bigger engine into your car, expecting your commute through traffic to be faster. Ultimately you're limited by entirely different reasons and your existing engine is hardly working as it is anyways, so upgrading wouldn't net you any benefit whatsoever.
DP 2.1 is just a bit of a waste for 99.99% of consumers. And it taking the route of USB data naming and branching doesn't help either.
Literally if someone were to ask me how DP could ever lose their position to HDMI, I'd say "waste effort on things nobody will realistically use, and make the segmentation obfuscated so people don't understand what they're buying". And it's literally what they have done. Absolute self shot. Sometimes I really can't fathom how professional people are behind these decisions... Amateurs.
We need 8K displays in 16:9 and 16:10 formats, and we need them to have the electronics to drive all the mainstream lower resolutions integer-scaled at correspondingly higher refresh rates (4K, 1440p/1600p, 1080p/1200p, 720p/800p).
A 40inch 16:10 8K monitor capable of all that, with the best reliable panel technology available would be a marvelous thing. One day...
And DSC is a compromise, a workable one perhaps, but DP 2.1 and higher would allow some of these resolution&Hz combinations to be displayed without compromise.
@@edfort5704 That's a whole lot of yapping to still say nothing. DP 2.1 is useless for the next 5-10 years because 99.99% of people will never have any use for it. You're not gonna run anything at 4k +400hz in the first place so why support it. Waste of effort and money. Better to invest into optical display cables then rather than waste it on useless old wire tech.
@@Real_MisterSir Higher refresh rates benefit anything and everything that involves moving images, including simple browsing or scrolling through documents.
We are very far from reaching any perceptibility threshold in this area, and the only diminishing returns with higher refresh rates come when the jump is too small (e.g. from 240Hz to 360Hz).
Many games may be hard to run at very high fps, but higher Hz will benefit even fps-capped games through lower latency. It's just that it's not worth for the avg. consumers to make incremental hardware upgrades, but large ones (2x, 4x etc.)
I too hope to see the potential of optical cables more explored by the industry.
@@edfort5704 bro unironically thinks going past 240hz on 4k means anything I can't 💀
with the rtx 5090 which will have dp 2.1, having a qd-oled 4k240hz dp 2.1 screen will make sense in my opinion
Can we guess that, in theory, the first product will be an 8K 120Hz display or TV if it supports 4K 240Hz, in 2-3 GPU generations?
Hope the next generation of Nvidia cards overhauls their DP connection (1.4 included); it might fix their current bugs. At least there's a chance.
Whats the need tho? Nobody is pushing 1.4 anyways, it seems entirely useless. The actual games worth playing on a high end graphics card will never get anywhere close to pushing past 4k 240hz in the first place. Cyberpunk maxed out on a 4090 barely runs at 40fps without DLSS at 4k, and that's been out for years now. Any game worth getting a 5090 for will not pass even 144hz at 4k so why even bother forcing DP 2.1 that no one will use other than for arbitrary bragging rights and trying to run CS2 at +500 fps.. Its so useless. Diminishing returns that 99% of people will never care about.
@Real_MisterSir The hope is that getting an RTX 50 series card would remove the 1.4 DSC bugs (black screens) for a 1440p 360Hz OLED.
I think to solve the cable length problem, fibre really is the only way. That’s going to be insanely expensive though.. there is no way I could use a 1 meter cable in my setup. I’m using 2x 5 meter club3d DP 1.4 cables and they work great.
I would not say fiber is expensive: economy of scale. As it becomes more common, prices will drop, as they did when internet cables went from copper to fiber.
A 10m fiber cable currently costs 65€. Assuming a 5m cable would be half that at 32.5€, it's on the high side but reasonable.
However, current DP 2.1 fiber cables are only UHBR 13.5 from what I see, labelled DP54 (54 Gbps). I see no reason why a 3-5m fiber cable supporting the full 80 Gbps could not be sold for less than 20€ in the future.
@@Raivo_K it’s not so much the cable but more the electronics at each end that would cost a heap.
Nightmare for consumers!
DSR is NVIDIA's downscaling feature, while DLSS is the upscaling feature. I believe AMD's VSR downscaling feature is also incompatible with DSC based on my limited testing. NVIDIA's next generation of GPUs will support DP 2.1a based on recent rumours. Hopefully this will give monitor manufacturers the impetus to support it. Longer UHBR20 cables would be nice, too. EDIT: Ah right yeah you mention this in the video :/ My bad for commenting before watching the whole thing.
I look at DSR as upscaling, but I see what you mean. From a monitor point of view it allows upscaling, allowing a higher res input that the res of the panel. But from a gfx card point of view it’s downscaling the higher res to the panels lower res.
Can't find the source now kind of a niche topic but I'm pretty sure AMD's VSR is compatible with DSC unlike DSR. Building onto your comment though it sounds dumb but at 1080p using DSR to upscale 1.5x than downscale to DLSS 3 Quality/DLAA looks amazing in the games I've run it in.
@@tftcentral That's a fair standpoint. The end result is SSAA, so perhaps 'upsampling' is a better term.
Yeah agreed. 👍
@@tftcentral Well, it is a bit strange to call it "upscaling" when it does the complete opposite of the widely understood "upscaling" that features such as DLSS or FSR do
Hi, please go try it yourself: DSC causes input lag you can easily feel versus native. The easiest way to prove this to yourself is to test it using gyro at 8k polling rate. With gyro you can easily feel input lag, unlike mouse, where you can compensate for lag by increasing force with your hand. Use HDMI 2.1 native on a 240Hz OLED vs HDMI 2.0 with DSC; it's night and day. It's also very easy to see visually in terms of clarity in competitive games: just snipe and look around fast, it's night and day. Native is so much better.
DSC is done in hardware. It adds something like 1/100,000th of a second latency.
@@imadecoy. It's literally so annoying that people say this and never test it. I can literally tell 100/100 times whether DSC is on or not within 5 seconds 🤦♂️ I don't care what people think the math number is, it's irrelevant. It's night and day in terms of feel when all other end-to-end latency is tuned and low.
@@TheOneGhost12 It's placebo man. This kind of thing is easily tested with a high speed camera or Nvidia LDAT.
@@imadecoy. 🤦♂️literally not at all I can prove this I can immediately tell 100/100 times Iv even proved it to my brother it’s so easy to tell
@@TheOneGhost12 Yeah sounds like BS to me. You'd be the only one in the world who can tell.
VESA must be colluding, or they all need to get fired. This is *literal* false marketing. They're using a technicality, but for false marketing it just has to be misleading - which it is, and it seems to be designed this way on purpose.
Gigabyte's 32-inch OLED has DP 2.1 UHBR20.
Basically, it's HDMI 2.0 all again.
LOL so the only thing that's standardized is the name "DisplayPort 2.1".. It's just like HDMI
yeah a bit of a mess unfortunately
I’ll never understand why it bothers a High end PC gamer to just run DSC.
Because people think that compression = bad. There was a time when FLAC was called an inferior format because it compressed the audio, but people grew out of that. We just need time and reviews.
DSC makes editing the EDID difficult, which is important if the manufacturer sets the wrong nits value in the EDID. Also, DSC means no DLDSR (unless there’s a weird workaround). At 1440p at 27 inches, the screen looks pixelated compared to 4K; DLDSR would fix that. Also, alt-tabbing or going from one full-screen app to another will result in a 5 to 7 second black screen.
I use DSC all the time
Why even go for DP40 when they have HDMI 2.1, I ask.
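The DP40 vs HDMI 2.1 question mostly comes down to usable data rate. A rough comparison, assuming the standard line-coding efficiencies (16b/18b for HDMI FRL, 128b/132b for DP 2.x) and ignoring blanking overhead, so real requirements are a bit higher:

```python
# Sketch: uncompressed RGB payload vs link data rates.
# Link efficiencies are the nominal line-coding figures;
# blanking overhead is ignored, so treat results as ballpark.

def payload_gbps(w, h, hz, bpc):
    """Uncompressed RGB payload in Gbit/s (3 components per pixel)."""
    return w * h * hz * bpc * 3 / 1e9

need = payload_gbps(3840, 2160, 240, 10)     # 4K 240 Hz, 10-bit
print(f"4K240 10-bit needs ~{need:.1f} Gbps")  # ~59.7 Gbps

links = {
    "HDMI 2.1 FRL (48G raw, 16b/18b)":    48 * 16 / 18,    # ~42.7
    "DP40 / UHBR10 (40G raw, 128b/132b)": 40 * 128 / 132,  # ~38.8
    "DP80 / UHBR20 (80G raw, 128b/132b)": 80 * 128 / 132,  # ~77.6
}
for name, rate in links.items():
    verdict = "fits" if rate >= need else "needs DSC"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict}")
```

So DP40 and HDMI 2.1 are in the same boat for something like 4K 240Hz 10-bit (both need DSC); only a full UHBR20 link carries it uncompressed.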
I love your content, but you speak soo slooowly. Could you speed it up a bit?
Really?!! I don’t think I speak slowly at all, maybe you need to alter the playback speed in UA-cam?
@@tftcentral No need, you're speaking at a nice pace. If someone would like to have it faster, change the playback speed indeed...
I reckon they must have had me set on 0.5x or something 😀
What are those strange triple-driver speakers by your monitor?
They're an old Creative 2.1 set, still going strong :)