Nvidia's RTX HDR Has A MAJOR Problem! - HUGE Performance Loss 10 Games Tested
- Published 3 Jul 2024
- Join us as we delve into the latest developments from Nvidia, who have rolled out a long-awaited overhaul of their GPU control panel. Exciting features like RTX HDR promise to revolutionize visual fidelity for PC gamers, but there's a catch. In this in-depth analysis, we uncover the truth behind Nvidia's RTX HDR, revealing shocking performance impacts that could leave gamers reeling. From benchmarking multiple games to uncovering hidden pitfalls, we leave no stone unturned in our quest to determine whether the allure of RTX HDR is worth the potential sacrifice in FPS. Tune in for a comprehensive examination that sheds light on the practical implications of embracing Nvidia's latest technology. Don't miss out on crucial insights that could shape your gaming experience!
• Nvidia Finally Fixed A...
• RTX HDR is absolutely ...
SOURCE:
www.nvidia.com/en-us/software...
Buy an RTX 4080 Super HERE!
amzn.to/49av5Zg
BUY A 5700X3D HERE:
amzn.to/3w1nAoI
BUY A 12600KF HERE:
amzn.to/3HRMlGI
Buy Intel 14th gen CPUs here: amzn.to/40Vt4g1
Buy the ROG Apex Z790 Encore here: amzn.to/49jqeon
Buy the AMD Ryzen 7 7800X3D here: amzn.to/3SRSycr
Check out my 13900K Hyperthreading 40 Game Benchmark Vid
• Is Hyper-Threading Use...
Watch my 13700K E Core ON vs OFF 40 Game Benchmark video
• I Did Not Expect These...
Watch my RTX 4090 vs 3090 42 Game Benchmark video
• RTX 4090 vs RTX 3090 M...
Watch my RTX 4090 at 2GHz video
• If Nvidia Didn't Incre...
Buy an RX 7800XT Here:
amzn.to/3EuQYoh
Support the channel by donating
**** www.paypal.com/donate/?hosted... ****
Want to build a Gaming PC? Use my Amazon Affiliate link below!
United States - amzn.to/2JKq8uW
Canada - amzn.to/2GYSkbM
United Kingdom - amzn.to/2qupqda
Follow me on Twitter
/ dannyzreviews
#Nvidia #GPUs #AI #HDRGaming - Science & Technology
RTX HDR is meant for older games which don't natively support HDR. Performance loss is minimal there
7:00 They still need to fix the performance hit not going away after switching the setting back to off, though.
At the end of the day, a fair amount of HDR implementations range from lackluster to dogshit, with bad gamma, raised black levels and no way to change peak brightness. Sometimes RTX HDR is the better option, if you're happy to cop the performance hit. RTX HDR is not just meant for old games.
But I mean, how many people knew that? It's not obvious at all.
Then use Windows 11 HDR.. no performance loss, and it can be used in older games.
@@channelthepigslove you probably haven't calibrated your screen or have a lackluster HDR panel
It's still beta. These should be reported to Nvidia.
They are aware, you think they used pokemon nintendo DS emulators to test the RTX HDR technology? lol
You are right. The Nvidia app is still beta, you are not forced to download it, RTX HDR is off by default.
@@maximusasauluk7359 there is a reason for beta release, smart guy.
@@space4ace582 did I say you are forced? Beta release means test by users so that more bugs will be reported back. I just said that. Somehow every stupid reply finds my comments.
wow, 20% performance loss is like enable ray tracing kind of FPS drop. then stick a perma performance drop even if you turn RTX HDR off, I'm just going to stay away from the nvidia app until they get their 💩together, not even worth trying at this point imo. Digital Vibrance can always be adjusted on the old nvidia control panel, so just do that if you want more color pop
yeah, same. As much as I love HDR, I'm just gonna wait.
I am using it on older games that have no HDR and it's amazing.
Good, because that's literally its purpose; it says so in the control panel itself. It's meant to be used for games that don't natively support HDR
Which games are you using it on and what makes it amazing
@@rodiculous9464 Dragon's Dogma Dark Arisen and Arkham Knight.
@@rodiculous9464 Dragon's Dogma Dark Arisen, Arkham Knight and Darksouls 3.
@@wolfstorm5394 Incorrect. It's used for games that support it as well and even the tool tip says to "disable ingame HDR." You will get better HDR with this - it's not even close. This actually makes computer monitors look good. I wish it could be extended to the desktop as well since Microsoft's HDR and Auto HDR system is garbage tier.
One important thing about Nvidia's HDR implementation is the fact that I can fix and tune it to be more accurate. Can't tell you how many times the native in-game HDR is broken, or Auto HDR is clearly over-correcting or under-correcting.
Nvidia's at least lets me tweak it to be exact.
Is it fixed now ?
Screw the FPS loss with HDR, I'm experiencing about an 11% average FPS loss and 13% lower 1% lows just after installing the Nvidia app, on a 5800X3D, 32GB B-die DDR4 14-14-14-35, RTX 4090 system. EDIT: I clean uninstalled it with Revo Uninstaller and got my old Ferrari performance back in all of my games
The 5800x3d definitely bottlenecks a freaking 4090.
Bro listed out those ram timings like they actually matter lmao
Since downloading the app I have been having some lower performance too and that’s without the RTX feature on. Have the same CPU as you
@@ZackSNetwork I also looked at the setup and thought what weird specs this guy has.
To all the people above, let me rephrase then: has anyone else witnessed an 11% performance loss after installing the Nvidia app? Not compared to GeForce Experience, but to a clean system with neither of them.
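For anyone wondering how figures like the 11% above are derived, it's just the relative drop in average FPS. A minimal sketch, using hypothetical round numbers rather than anyone's actual benchmark data:

```python
def pct_loss(before_fps, after_fps):
    """Percent of average FPS lost going from before_fps to after_fps."""
    return (before_fps - after_fps) / before_fps * 100

# Hypothetical round numbers: an 11% average loss means a
# 100 FPS baseline dropping to 89 FPS after installing the app.
print(pct_loss(100, 89))  # 11.0
```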
How did you get RTX HDR working in the last of us? It says game unsupported for me. I would love to get it working in this game 😢
the last of us supports HDR natively?...why tf would you do that to yourself, this RTX HDR thing is for old unsupported games, not modern ones with...full HDR support already lmao
Well, because it's a dogshit native implementation, 10,000nits peak, no HDR Calibration app support and the in-game options only change overall brightness. It doesn't look great. Exactly what this app is designed to improve. 😂
That's too much of a loss in performance. Hopefully they optimize it in future drivers and bring the cost to less than 5%; then it would be an easier choice to enable.
I'm not experiencing any where near this much performance loss. Also, this creator is making it seem like 5-10 FPS is a HUGE loss. It isn't. You won't even notice it.
Great video! I appreciate the detail on the testing. I briefly tested RTX HDR and RTX Vibrance and noticed a performance drop as well. However, I did not do the extensive amount of testing that you have. Great job! Thanks for the shoutout!
Thanks!! Likewise great job on your vid as well
Hopefully they can optimise this a bunch, as this is honestly so much better than Auto HDR. Was using it in a heavily modded Skyrim on an AW3423DW and it made a night and day difference. Tested out Phasmophobia as well and that looked great, and I didn't notice any drop in either of those. Also, most of the games you tested have native HDR, so not sure why those would be tested specifically: Alan Wake 2, Baldur's Gate 3, Hogwarts, Spider-Man Remastered, Cyberpunk, pretty sure MW has it as well, Starfield has it but it's pure ass and needs a mod called Luma.
Yeah not enabling it on my 3090ti, maybe later on if they optimize it a little better
Don't forget, you can change RTX HDR Quality to Low with I think Nvidia TrueHDR mod from nexus and then in Nvidia Profile Inspector it should show up (does for me now). Then it should only cost like 4% perf loss, if that. Maybe none. Please try this and test in another video!
The reason RTX HDR has a perf hit is due to the debanding Nvidia uses. By default, it is set to Very High. But if you change it to Low, you get no debanding algorithm lowering your fps.
I had to uninstall it, it was causing my games to crash, not sure why. Did not try rtx HDR, I don't know what use case this has when you are using windows.
Wow. I installed this before trying out the Phantom Liberty DLC on cyberpunk and thought my frames were a bit low. After uninstalling I jumped from 100-110 to a stable 140+, This is with a 3090. Far too big of a hit.
Wow, a 3090! You are a lucky person,
so please enjoy that :)
@@harounjouti I have a 4080 Super, am I lucky?
I think the biggest problem for me with RTX HDR is the entirety of its settings.
One problem I have is, it can be enabled but it can't be configured through the overlay sometimes. Despite still being enabled and functional.
Next up, it seems to like to either turn itself off, or turn itself back on, on subsequent game launches. This is especially problematic for me in Nioh 2, where I use native HDR. While it shouldn't work, I've gotten HDR on top of HDR with Nvidia before.
Global off seems to be ignored and sometimes will turn itself back on. Per game settings are often more a suggestion 😂 for RTX HDR to ignore.
Performance I can deal with, while it's not great ATM I am fine with that in beta for better visuals. But the settings issues are obtuse, and annoying.
For the record, I haven't used DDU yet, so maybe some problems can be fixed that way. But it's just completely arbitrary. Doing it per game with the hack is probably better for both performance and the settings nonsense.
Lastly, borderless window functionality works fine, but it flickers, almost like slow BFI. Which is also another way I noticed HDR on top of HDR, since only RTX HDR flickers.
Quality wise it's mostly better, but sometimes it fails, especially on full screen whites. While I hate getting flash banged, I expect to get flash banged with HDR. This isn't always the case, in most games I do die, but when it doesn't kill my eyes I feel sad. When not being blasted by 1100 nits of pure white joy.
i have tested it on some old games and there is a performance hit for sure, but since i lock them to 120 FPS anyway (don't really see the point of running single-player games at 240...) it's not a big deal...
image quality wise its amazing on both old and new games.
i actually used it to dial in the native setting for HDR in some games with bad HDR.
I like how the thumbnail shows Elden Ring, a game that has fps issues completely unrelated to hdr
I believe Nvidia is using AI to calibrate the HDR brightness, so the gpu's tensor cores are getting used, which will add latency to frame times. Unless they stop using the tensor cores, there will always be latency. It's not something they can fix with a driver update. The best hope is that maybe they can optimize it a bit. I'm not really sure if AI is the best way to go about converting SDR to HDR. Windows' AutoHDR isn't all that bad. RTX HDR might be better, but probably not drastically better. Seems like Nvidia has got into a mindset of using AI to solve all problems, but this illustrates why that's not always a good approach.
People just have the wrong idea of this feature. It's not supposed to replace Windows HDR or necessarily be better; if you read the little info that pops up in the control panel, it literally says this is to be used with games that don't have native HDR support. It's kinda like using RTX Remix to make an old game look nicer: it simply uses the tensor cores to produce an HDR-like image in games that don't have native HDR support. Using this feature on top of a game that already has native HDR support isn't going to do you any good and might just cause more problems, speaking from experience... you're basically stacking HDR on top of HDR and still taking the performance hit for zero benefits
@@wolfstorm5394 Windows' Auto HDR feature is not the same thing as Windows HDR being enabled. It is literally meant to do the same thing as RTX HDR, giving games a more HDR look. You can't run Auto HDR and RTX HDR at the same time; RTX HDR tells you to disable Auto HDR.
@@dremy746 So what exactly is your point
@@wolfstorm5394 I do feel like there could be a potential benefit of using rtx hdr on top of native hdr. Well, not really now as our TV's don't get bright enough.
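If the thread above is right that the hit comes from a fixed amount of per-frame AI post-processing on the tensor cores, then the FPS cost isn't a constant percentage: a fixed millisecond cost eats proportionally more of a short frame time. A rough sketch of that model (the 2 ms filter cost is purely illustrative, not a measured figure):

```python
def fps_with_overhead(base_fps, overhead_ms):
    """FPS after a fixed per-frame post-processing cost is added to frame time."""
    return 1000.0 / (1000.0 / base_fps + overhead_ms)

# The same hypothetical 2 ms filter cost hurts high frame rates far more:
print(round(fps_with_overhead(60, 2.0), 1))   # 53.6
print(round(fps_with_overhead(240, 2.0), 1))  # 162.2
```

On that model, the same ~2 ms cost is roughly an 11% hit at 60 FPS but over 30% at 240 FPS, which would line up with high-refresh setups feeling the loss more.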
In the 2 games I managed to get it to work in, it looks amazing, and in my eyes at least the performance did not tank that much. I currently have it on Dying Light 2 and Black Ops 3. The image difference between Auto HDR and RTX HDR is night and day on my panel; I am able to calibrate the brightness and saturation levels while in game, something that can't be done with Auto HDR.
dying light 2 doesn't have hdr support?
who in their right mind would use rtx hdr in a game that has native hdr support?
It may provide better HDR coverage than the actual native implementation. Starfield is a good example of this. Native implementation is trash.
The default RTX HDR Quality is Very High. Medium/High/Very High enables a deband filter that reduces quality and costs performance. Use NvTrueHDR with Very Low and dithering to fix it.
If it is using some of their AI to make HDR decisions, then it will always have a performance hit. With that in mind, I think the feature is likely targeted at older titles where you may be CPU limited anyway.
One thing that I would like Nvidia to do, is allow users to simply input their display specs related to luminance range, or even use display calibration data if a user has a colorimeter, and perfectly match the HDR to the capabilities of the display.
Appreciate the test, but I disagree with your opinion on it.
It's called RTX HDR, so the GPU is using its cores to produce this, and some FPS loss is understandable, but I hope they can patch it or at least improve it to only a few FPS. However, I do feel that in many games it's worth using even now in beta form, because boy does it look amazing.
I don't think they're going to be able to really help the FPS loss much. I need to test this myself thoroughly.
Everybody thinks that new technology is supposed to run smooth and perfect when it's first made.
Right now they just have the hardware that they have to work with.
Most of this new technology is going to shine more in the later iterations of RTX gpus.
I imagine the 5000 series will be able to do this quite well.
People don't like the strategy Nvidia uses but it works because they get everybody to test this stuff out "early" so they can make improvements along the way.
All this stuff (RT, DLSS, Ai HDR, color enhancement,etc) needs to do heavy computational work. Performance will be hit sometimes.
This tech isn't for everyone. I'm ok with it. I use the enhancements when I can.
I'm still glad NVIDIA innovates.
Nvidia pushed game developers to not allow reshade. Then turns around and released its own GPU based reshade. Why do you people keep letting Nvidia rob you?
The problem with reshade is people freak out when you mess with post FX and think it's "cheats", bc you're playing on a monitor that lets you crank gamma and not some old-ass CRT with no settings.
I like HDR in some movies but I think it looks fake in games, especially on a high quality monitor. I haven't experimented with RTX HDR as yet, probably won't bother with it until it's launched and hopefully optimised.
it does look amazing though, deep rock with rtx hdr on looks stunning on my c2
The problem is not only with RTX HDR; it's when you install the Nvidia App in general, 100 percent. With no settings changed or anything enabled in the Nvidia App, game performance drops by 10% on average, for example in The Witcher 3 Next Gen. I thought I was doing something wrong when I saw my fps counter go lower than usual, but after uninstalling the Nvidia App everything went back to normal. Totally weird behaviour. It has to be reported.
same
Not surprised by this, I was wondering what kind of performance loss this would inflict on games at this point since the RTX Video Enhancement/HDR had a 20 to 30% GPU usage when I tested it at release.
That being said RTX Video Enhancement/HDR did make SD content look better, but again, when I noticed a 30% usage and the GPU fans spinning just for watching SD streaming content, I just turned that off, not worth it in my opinion, having the GPU working in such a way just by streaming content.
So it looks like the same would translate to RTX HDR for games, but the performance hit while gaming will definitely be noticed and isn't worth it at this point. I do hope they can polish this accordingly during the beta (hopefully RTX Video Enhancement/HDR too); I would definitely use these options as long as they are properly optimized and calibrated, without affecting performance this way.
Looks great on Elden Ring and I never dip from 60 fps at 4k, so I don't really see the problem. The performance cost is worth not having to use Windows HDR and it looks much better imo.
Hot take : if they made it faster I would prefer the dense and simple old ui from nv.. Oh hey, plasma tv is a nice dude. Cool to see him here
Several developers have made reshade plugins to inject real HDR for over a year or two now, with much less of a performance hit and more customization. Retroarch also injects HDR into old games perfectly, again with no visible performance hit and more customization.
So RTX HDR isn't worth using until Nvidia hopefully fixes it, although I admit it's slightly easier to use, and works on Vulkan
I also had to DDU driver and go back
I love plasma gaming TVs channel. I discovered him when he had about 400 subs, everything he produces is gold
I switched on hdr on Windows 11 and all the colours changed to purple and pink so I switched it off.
Love all the features nvidia is putting out, they just really need to stay on top of bugs and deliver frequent fixes
Nvidia next year: "50 series exclusive HDR cores"
Nvidia The generation after that: "60 series second generation AI HDR"
Lol. It is obvious it would do that. It's making HDR effects happen in real time.
NVIDIA tech is awesome. I want them to keep it coming.
I'm a PC enthusiast so I want not just fps but great graphics even more.
Exactly. If it’s a better hdr then I’m fine with a little performance hit. Hopefully they can improve it. We still have tech like frame gen etc.
@@Ray-dl5mpSame Nvidia is insane.
Lol, its just Lol
@@Ray-dl5mp It's not "better" people just seem to ignore the details, this feature is meant to be used on games without native HDR support, it says so in the control panel itself, it's using tensor cores to do all the processing, so if you're using this thing with a game that already has native HDR support you're just asking for problems
@@wolfstorm5394 I totally agree. But people don't trust games' implementations of HDR these days, and not every game has good controls to adjust the full scope of the HDR image, so they might end up trusting Nvidia to do it better. But to be fair to accuracy and truth, I've seen nothing yet showing that Nvidia is even doing a good job with RTX HDR, performance aside. It would be nice to see breakdowns, and to see whether they are really maxing out HDR in a great way or not.
The HDR is done on the fly by AI calculations. It's not a feature within the game engine when you use NV's HDR. That's why it's relatively expensive. Windows' Auto HDR is also AI-calculated, but I guess it's not as precise as Nvidia's, so there's almost no performance tax if you stick with Auto HDR. Not really sure how much Nvidia can optimize the tax
No worries. My cheap Philips 4K TV doesn't even support HDR. 😅
If a feature reduces performance and increases cost, energy use, etc., then it's not a feature but a downgrade, or worse, a scam. A feature must not do anything negative and should bring only good things, or in the best-case scenario give you more performance and lower power usage.
By that logic you would be playing on the lowest graphics possible at all times no matter what.
this app replaces GEFORCE EXPERIENCE, not the control panel. just the same shit with a different interface this time.
Nvidia knows how to make a 200 fps game run at 50 fps. And then on top of that they want you to add DLSS to drop quality, just in case..
I'm fine with my IPS 1440P panel. HDR isn't worth all the hassle and expense. I've seen HDR and while I can tell a difference, it isn't that much to justify the cost or headache. I say this and I love ray tracing.
eye candy will never be free. not sure what you guys were expecting. notice how all performance improvement workarounds like SR and FRAMEGEN degrade visual quality in one way or another?
HDR is quite literally free eye candy if implemented properly
1. Go into Nvidia control panel.
2. Change refresh from 144 to 120
3. Turn hdr shit on
4. Play at 120 and it's still fine unless you're a right fussy one
And I typed this before the benchmarks lols
Could you maybe do a simple quick write up guide on how you used DDU, just in case. Thanks
This isn't a MAJOR problem, nor is it a HUGE performance loss. You won't even notice it during actual gameplay. You sure you're playing with a 4090? Did you undervolt it or something? Because I get much higher framerates at 4K than you do with RTX HDR on and off. For example, in your Cyberpunk bench at 4K your highest is 79 fps? I average 90-120 fps (capped at 120) and have never dipped below 60 fps like you do.
Honestly this looks better than adding ray tracing. It's a much smaller hit as well.
Keyword here is BETA. If this were an official release it would be one thing.
I lose 30 fps in games like Deus ex Mankind and Assassin's Creed Mirage.
Just so you guys know, RTX HDR is to be used with games that don't already have HDR support; it literally says so in the control panel itself. So if you've got a game that already natively supports HDR, there's no point using this. The whole reason there's a performance loss is that it has to use the tensor cores on the GPU to do all the processing that forces HDR into non-HDR games.
Thanks for pointing out the absolute obvious
@@BornToKill780 The guy in the video, along with a bunch of people in the comments here, didn't seem to read the description of the feature; they're thinking this is somehow going to replace or compete with Windows HDR, and they're using it on games which already have HDR support
Another reason for nVidia to sell you cards
Um... you do realize that ray tracing has a similar effect on frame rates... This is what Nvidia does nowadays... release features no one asked for that TANK FRAMERATES... and then release other features to fix the problems they themselves created...
I don't think HDR is worth investing in until there is an opensource standard. I guarantee you people are buying all these expensive HDR TVs and monitors and in 5 years there will be some opensource HDR that is better than everything else right now and all that money will be wasted.
Everything high end you buy today will be outdated in five years regardless.
@@OmahaGTP Having no standard makes things way, way worse. Just like Betamax and VHS, just like Blu-ray and HD DVD. Standards do upgrade, but they take longer to do so, and with a set standard you waste way less money. You could buy an HDR TV now only to find in 2-3 years that the standard is finally set and it isn't what you invested in. It took a while before 4K became the standard, way longer than it took 3D TVs to die out. The HDR issue is even worse with computer monitors.
RTX: ON
FPS: OFF
It's still beta
Of course it will kill performance, Sherlock, it's using GPU cores to run an AI algorithm.
If a monitor already has HDR, is this irrelevant?
If you have an HDR monitor this makes it relevant. It means you can actually make the most of HDR content instead of just watching SDR on an HDR monitor.
If your monitor already has HDR, this feature doesn't benefit you unless you're playing games that don't support HDR; this feature is to allow HDR support in non-HDR games
@@Callum.Thomson Do you even have any idea what you're talking about
@@wolfstorm5394 I think you're mistaken. You can have an HDR display, but if the content you are watching is SDR then it just looks like SDR.
With this Nvidia feature, it turns your SDR content (older games) into HDR content using your GPU.
The guy asks the question thinking a HDR display only displays HDR, however that is wrong as he still requires the content he is watching to be HDR, hence getting more value out of his HDR monitor.
@Callum.Thomson OK, so you're saying that even though I can toggle HDR on my monitor, if the source isn't HDR-enabled then I am not getting the full HDR experience. So adding in Nvidia's HDR AI will make it so the source appears to be HDR, thus allowing me to get a better picture.
Just AMD guys looking for any cons they can find out of envy. Remember "fake frames!" with frame gen then all of a sudden AMD gets it's own frame gen and suddenly it's amazing tech. And remember "your 4090 will burn your house down" and it turned out it was only very tiny amount and 100% user error lol.
Don't be an Nvidia fanboy then, while mocking others for the same. The 12VHPWR connector has a poor design which was exacerbated by user error; that is a form of manufacturing defect as well, and that's why it's been redesigned mid-generation...
@@ij6708 Don't be a dumbass and learn how to plug cables in?
Monitor monitor monitor
Barely any performance loss for me.
As a 4090 user this doesn't faze me, as performance is never an issue.
mb
These features will be optimized on the 50 series and used to justify a higher price.
Personally I will be buying a 600-700 4090.
I don't do betas, just not a beta type of guy
btw not many people know this, but normal native HDR also impacts performance, though of course not by much
I use it in Apex Legends at 4K 144 Hz with Nvidia HDR turned on, and it's so much better than the native HDR that I don't care about a potential performance drop.
Nvidia is probably using some kind of algorithm for their HDR, so you will always have a performance penalty.
Of course, I have an RTX 4090 that used to push 60% usage in Apex and is now doing almost 80%. Still, I don't care, because the image quality is so much better. In a game like Apex, seeing your enemy first is everything, and every esports gamer is going to want this even if it costs 15 to 20 fps; you just get a better card or lower your settings (very smart, Nvidia, another selling point). The other HDR solutions are a pain in the butt for most people, so I, and probably others, will use the Nvidia app !!
fake hdr is as bad as fake orgasm
i don't have an HDR display but man, RTX Vibrance is looking too good. obv HDR will look better, but the performance hit is huge
You don't need 7200 MHz RAM for gaming, that's wasting your money
Let me guess you saw Jayz video 🤣
Bigger number better scrub
Only a n00ob like Jayz thinks that 😂😂😂😂
I am an Nvidia owner but their drivers have been shit lately and makes me want to go back to AMD.
Remember when we played our games at 30 fps? We had fun. I downclock my 4090 because I'm not a try-hard that needs validation through earning numbers in a video game.
with that reasoning why even buy a 4090. might as well buy a cheaper gpu...
@@BIG_HAMZ Games are fine for me at 45 fps minimum. But I certainly don't need 144 fps+ like most people believe they need today. The 4090 provides solid, yet low-power 4K max settings at lower FPS, and has 24GB of VRAM plus a beefy RTX pipeline for when it's needed in really demanding RT games, which will come out more and more soon. Lesser GPUs don't provide this.
Anything RTX is going to use GPU resources lol. The fact that people are shocked it uses resources and drops fps is hilarious. Ofc you will take a performance hit. The only people this affects are the frame rate snobs, the guys who need 50 million fps cause it makes them good at games 😂