I never understood why other reviewers didn't include the power draw numbers alongside the FPS, this is incredibly valuable.
Many did, you need to watch more reviews. The 4090 draws about the same power as the 3090 Ti.
@@knightsljx Except it clearly does not?
Most other reviewers run the power test on a synthetic load separate from the rest of the benchmarking suite, so you only see the maximum power draw and not the actual efficiency of the card.
Probably because no one ever cared about power consumption until bills went up 3x.
@@knightsljx They mean on the same chart as the FPS; everyone else does it in a separate chart. It's nice to have it in one spot.
Also no it doesn't, in general it's drawing less power than the 3090 Ti, confirmed by all outlets.
The power stuff is really interesting, but I think the assumption is a bit wrong. It's like everyone is kind of assuming that this card is the 4000 series workhorse. This is the Ferrari, the 4080 is the card that will make sense to worry about efficiency. This isn't really for most people, this is to max the charts and wow people. I can totally see why they squeezed that performance.
This is official now. der8auer is now my new best tech reviewer ever. Such attention to detail, and especially to power consumption, which a lot of others just ignore, comparing bare FPS. Thank you for your hard work.
Him and Gamers Nexus are the best
True! specially for SFF builds, 300W vs 450W GPU is so massive that you may need to change to a more expensive PSU, heck even from SFX to ATX PSU.
Thank you very much
@@Mart-E12 I'm pretty sure it's German Nexus
Agreed!
Your overclocking experience really shines through in the way you thoroughly evaluate products and look at their performance from an engineering perspective. Thanks so much for your work! I've been really getting into power scaling graphs and using that to tweak my personal system.
Thank you! Happy to hear that
@@der8auer-en Great video! Could you help clarify please? I've just bought a Gigabyte 4090 Gaming OC card. So is your recommendation to simply install MSI Afterburner and set a power limit to 70% and nothing more? 🙂🙏
Omg, finally a reviewer who REALLY pays attention to power efficiency. Subscribed :)
thanks :)
Cause Germany is in an energy crisis lmao
@@funtourhawk only because the government shut down the plants
I think this is because companies did not push chips to their absolute limits out of the box before, and reviewers are relatively slow to catch up with changes.
For example, it was the same back in the Kepler vs GCN era, when almost all reviewers measured "donut" (FurMark) power consumption even though it is not representative of gaming consumption. That made AMD cards look much worse, because they were running at full speed, burning up their VRMs, while Nvidia downclocked to stay close to gaming consumption.
@@CrazySerb That's more of an issue of manufacturing tolerances. They were so sloppy before that they couldn't reliably manufacture the higher clocks, so they sold them at a clock the hardware could definitely work at.
This is why I watch multiple reviews. GN has excellent charts, and LTT has some technical hot takes (PCIe 5 and DP 2.0), but you're the first one that tested the entire curve, and when you consider you can drop the card's power target by 33% and only lose 5% of the performance, the RTX 4090 becomes manageable. It's going to reduce transients, case temperatures, and noise. Overall I think the card still doesn't make sense for gaming, but this puts it in a much better light.
Makes plenty of sense for people who have high refresh rate 4K monitors.
@@justinjohnson476 The next generation of cards is gonna smoke the performance/price ratio of the 4090; it's like buying a Titan that will surely be a waste of money. But yes, it seems like currently it is a good choice for said use.
@@justinjohnson476 Not without DP 2.0 it don't. Get wrecked. JK. But seriously, the only way to get that on this card is with chroma subsampling, not because the card can't do high refresh without it, but because it can't output it.
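A rough sanity check of that bandwidth claim, sketched in Python. The DP 1.4a payload figure and the resolution/refresh numbers are my assumptions, not from the video:

```python
# Back-of-the-envelope DisplayPort 1.4a check (my assumptions):
# HBR3 x4 lanes = 32.4 Gbit/s raw, ~25.92 Gbit/s payload after 8b/10b.
# Pixel data alone for 4K 144 Hz RGB at 8 bpc, ignoring blanking:
payload_gbps = 32.4 * 0.8                    # ~25.92 Gbit/s usable
pixels_gbps = 3840 * 2160 * 144 * 24 / 1e9   # ~28.7 Gbit/s of pixel data
print(pixels_gbps > payload_gbps)            # True -> needs DSC or subsampling
```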
GN also did power consumption. No?
High resolution VR headsets and DCS. It is a naughty rabbit hole.
Thank you for the power consumption tests with lower targets. Most others are overlooking this test. It is a huge selling point for me when I'm trying to keep my draw lower and thus less heat being produced.
Yah I completely agree.. I really don't want a GPU exceeding ~300-320W power draw. It just seems like such a waste of power and a ton more heat for such a small gain. The same with the 30 series. Nvidia really needs to just stop pushing these things to ever higher power demands and keep things reasonable.
@@Shadowdane It's a race to the bottom; both NV and AMD push their GPUs to the absolute limit just so they can score a tad better than their competitors in reviews. Nowadays you can extract less than 10% extra performance from extreme OC; on a 6900XT you gain 5% by pushing TDP from 300 to 400W. I still fondly remember my GTX970, which had 30% OC headroom.
Also lets me know if i need a new dedicated line from the substation.
You can afford a $1600 GPU, but can't keep your room cool?
@@Shadowdane You do know that the 4060 will be coming, right? And AMD will have a competing card as well. It is okay for products that aren't exclusively tailored to you to exist. There are plenty of people that do not care about power consumption or heat. All they care about is the TOTL performance, and that's who this is for.
Like many here, I watched a bunch of 4090 videos today. Yours is the most interesting and thorough. I am grateful for the amount of effort you put into your videos - especially putting them out in English as well as German. Thank you Roman.
Thanks for going into the power efficiency stuff in such detail!
WireView looks exciting too! Awesome that that solves cable management with a 180 adapter and provides a power readout. Looking forward to seeing that released
Was going to say the same thing. This card will not fit in a lot of cases because of the width + power cable
I appreciate this review, awesome work! Most reviewers are focused on performance, but you did something unique with the lower power target tests, which was exactly what I wanted to know about this GPU.
I've been wanting to upgrade, but the 4090's 450 Watt power draw and possibly high temps were a bit concerning to me. Seeing that it ran cool and quiet at stock, and that you can lower the power target by 30% and barely lose any performance to achieve even cooler temps, this definitely seems like a great card.
It's the best new card of the last 15 years easily.
@Silver Joystix Definitely not true. The 3090 Ti was pretty laughable when it comes to performance and power draw compared to a 3090.
@Silver Joystix I'm not talking about just being the fastest card, but also the improvement and features it brings over the previous gen flagship. No card has excited me more in 15 years of following the GPU scene.
That's why I said best *new* card, as in compared to the others when they were new too, not just best card now.
@Silver Joystix Braindead response
Price still sucks major ass. I'll stay with my 2070 longer.
Excellent work. Tested like a real engineer; this makes me happy as I work in R&D.
You and GN are my go-to reviewers. The WireView module is a great solution to a problem that shouldn't exist and the extra function of power monitoring is a good idea.
I was planning to undervolt it anyway when I get mine, but it's great to get confirmation that good efficiency exists in this architecture. Great video.
There should be an 'Efficiency Mode' switch in the Nvidia settings to have a simple way to adjust the power curve in native software, for the not so technical users.
_With so many gamers out there (I welcome all the new peeps), but many of them aren't tech enthusiast, they just want their system to work._
Could also be an advertising point to bring up its ability to go full beast, or still be powerful but also efficient.
Wow, that power graph is insane. Thank you for the details!
The power target graphs and the associated performance per watt tests were just what I was looking for. Information like that is very hard to find (if it exists at all) outside of personal testing.
Imagine taking this card back in time to the 9800 GT era (2007), when cards were struggling to do 60fps at 1280x720 on very simplistic graphics. And here we are at 3840x2160 120FPS+ with ray tracing... it's pretty mind blowing.
But back then AAA games cost 5-20 million dollars to develop and were innovative, while now they cost hundreds of millions and even billions (GTA 6 has so far cost Rockstar 2 billion) and come out broken and rehashed, and it's all due to graphics.
Graphics/visuals take 95% of the budget of AAA games and are the reason they are approaching and surpassing billion dollar budgets.
@@rattlehead999 i don't understand your point. Mine was about showing how far we have come technologically.
Yeah but then, you would be running at 1080p 90fps because of old cpu bottlenecks lol
@@kingbigcheese4335 okay ill take an entire pc back in time :)
@@rattlehead999 Explain how a game costs $2 billion.
The power curve is very interesting. I wonder if there is another reason. Like RDNA3 and Nvidia wanting to be chart topping.
Such a waste of electricity for those few extra FPS
I truly appreciate your consideration of Power Efficiency and Consumption, given the high electricity bills we have nowadays. Subbed!
This is SO MUCH better and more complete than any other tests from ANY other YouTubers. Your review is top tier: having this power draw info, testing every power limit to understand how the card behaves, asking why Nvidia has done it like that, and more importantly how everyone should configure it, because energy is NOT cheap. You're really great, thank you.
Which is why you watch lots of different reviews to get a complete picture.
It was expected.
8nm vs. 4nm production.
It's not the power consumption that's the real story here, it's the ability to produce it at all, I guess.
soo good.
Appreciate you showing the new 5nm node efficiency. Agree it would have been better to reduce power and heat. Good honest review 👍
It is 4nm, not 5nm.
@@LordLab basically a 5nm ++
@@rishirajsaikia1323 It is officially 4nm by TSMC.
It’s marketing speak, 4nm is a ‘node’ that was made specifically for nvidia gpus. And you fell for it.
@@jasonmajere2165 There is nothing "made specifically"; Apple was the first who got it. About 2 years before production you buy a volume of wafers from TSMC with specific specs and use it for whatever you want. Apple is the biggest customer, so they are first in line to get the new stuff.
I found something very similar on the 2070 Super Strix I just bought - I ran some OC Scanner overclocks at various power limits and discovered that at 65% power, the card lost 1.5% of its stock performance.
Insane - from 255 W to 165 W, with no perceptible impact on frames🤣
(Roman, you might want to try an OC Scanner overclock at those lower limits... you might close the gap even further!)
Interesting... "I ran some OC Scanner overclocks at various power limits":
Please give a more detailed explanation of "how".
@@stormking1973 1. Set the power limit in Afterburner and apply.
2. Run OC Scanner and apply its settings.
That's all.
@@JMUDoc Thanks
Would it be better to use 3 or 4 cables connected to the 4090 for testing? With 3 cables you can set a power limit from a 450 watt max; with 4 cables, from a 600 watt max. Would one be more stable than the other? You could achieve the same power limit either way; I just want to test it in the most stable way possible.
You should make a step-by-step tutorial for people to save 130 watts with an undervolt!
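In the meantime, a minimal sketch of the power-limit half of that, assuming the official NVML Python bindings (nvidia-ml-py) and admin rights. This turns the same knob as Afterburner's power-target slider; it is not a V/F-curve undervolt:

```python
# Minimal sketch, assuming nvidia-ml-py (pip install nvidia-ml-py)
# and admin rights. Sets the power limit to ~70% of the default;
# this is a power-target cut, NOT a V/F-curve undervolt.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementDefaultLimit,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)                         # first GPU
default_mw = nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
nvmlDeviceSetPowerManagementLimit(gpu, int(default_mw * 0.70))
print(f"power limit set to {default_mw * 0.70 / 1000:.0f} W")
nvmlShutdown()
```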
"Legendary video card, with legendary bad timing." Love it. Glad to see you're closing in on 100k subs on this channel too! You deserve it. 👍
Where do I go when embargoes lift? Straight to der8aur for the facts.
thanks :)
The Cyberpunk rasterization FPS was insane. Going from 25 to 41 in one generation, incredible!
Shitty optimization and effort is shit no matter the gen
@@n00buo I smell an idiot in here... Are you complaining without any basis whatsoever???
@@n00buo you sound like a cry baby amd fanboy bitch
@@n00buo it just shows the brute force of the 4090, don't care about the actual max fps lol
@@almightydeity I don't know if we watched the same video, but 40%+ at 4K is very very good, if we ignore price and size
Love the power consumption testing and would love this to be done on all the new cards!
This guy is a few levels above the other tech guys for sure.
It is freaking expensive in Europe! 2.8k?? Holy!!!
I appreciate the detail in examining power draw. I feel like we have shifted from trying to squeeze more performance from silicon by overclocking and increasing power draw, to power limiting or undervolting now that CPUs and GPUs are becoming ever more powerful and power hungry. With the launch of the Ryzen 7000 series, it was interesting to note how dropping to eco mode or tuning an undervolt can significantly reduce power draw and temps while maintaining 90 - 95% performance, and the same appears to be true here.
I have a question: if undervolting a GPU can significantly reduce power draw and temps while maintaining 90% of the performance, why aren't the cards set that way at the factory?
@@alexqiu1758 ...because it's like a choice? some people definitely are fine w/ running everything at max power/temps and whatnot
@@alexqiu1758 Mostly, I believe it's for reviews. Nvidia and AMD know that people look at benchmarks to make a buying decision. They crank power to the max to have the highest bar in the charts, and take the "lead" so to speak. In my opinion, these monstrously powerful cards would look more attractive running the more power efficient modes at stock, and allowing users to squeeze the 5% more performance with higher power draw if they want, especially under current power climates around the world.
@@alexqiu1758 There are no regulations on power consumption, and it makes more financial sense to just max out on performance, as most people don't even understand what a watt is.
In my opinion, manufacturers set their "reasonable" curve that way because of stability. They want a safe range that limits any potential instability, and therefore less headache dealing with worldwide customers. Most buyers are simple and will be pissed if their card came running at the "exactly right" power/performance spec and then crashed due to infinite component variables and energy states (i.e. dirty residential power vs clean lab power).
Roman, wtf, the WireView is fantastic. Good job mate.
With that gadget, you have just resolved the cable management that everyone was so worried about:)
4:06 I was confused at first but now I understand, it is a 180 degrees cable adapter with a 90 degrees display.
I think that the overhead power target does make sense in the top-of-the-line product, indeed. It's better to have it and let the users dial it according to their needs. Your recommendations can be implemented in cheaper/slimmer SKUs.
Completely agree.
I also agree... BUT I have always thought that for a "halo" product like this, user adjustment of the vCore on the GPU should be allowed. It's such a shame it's only up to the silicon lottery how far you can get within the power limit wattage; at the end of the day the GPU will be voltage limited. If there was just room for 0.1-0.2 V of adjustment.. :)
I got a 4090 with the intention of running it at 60%-70% power output and really look forward to the low fan speeds
This is easily the best 4090 video out right now! The power consumption information is really helpful for all of us looking to undervolt (which we should all be doing). 4000 series looks to be similar to 3000 series, in that undervolting grants a significant amount of efficiency.
I remember someone saying that the 3000 series was really intended for TSMC, not Samsung's fab. To get the performance from them, they ran the chips hard.
Now with TSMC, the efficiency is back.
👏 Best review I've seen, the power consumption information is extremely valuable.
11:44 check your math: 80 fps is +344%, not 444.
13:37 139 fps is +239%, not 339
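For anyone re-deriving those corrections, the confusion is multiples vs. increases. The baselines below are inferred from the quoted percentages, not read off the charts:

```python
# A 4.44x multiple of the baseline is a +344% increase, not +444%.
def pct_increase(old_fps, new_fps):
    return (new_fps / old_fps - 1) * 100

print(pct_increase(18, 80))   # ~344% (inferred 18 fps baseline)
print(pct_increase(41, 139))  # ~239% (inferred 41 fps baseline)
```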
Thanks very much for your elaborate testing! Amazing findings! I will definitely buy the 4090 FE. I will mainly run MSFS 2020 in DLSS 3. Hopefully it will perform as well in VR as it does in 2D.
Keep in mind that DLSS 3.0 frame insertion (the big new feature of DLSS 3.0 and huge talking point from Nvidia) dramatically increases your latency.
For example: in the video review of the 4090 by Optimum Tech, they showed a chart testing MSFS in 4k with different features enabled. With no DLSS or frame insertion they had 107 fps and 27.8ms latency. Turning on DLSS 3.0 increased their fps to 145 and lowered their latency to 19.5ms. After adding the frame insertion feature their fps further shot up to the insane 212 fps, but their latency almost doubled, going from 19.5ms to 37.4ms.
I personally never tried VR, but that sounds like it could be a pretty big deal (causing nausea and stuff like that?). You can always just turn off that feature and lose some fps, but lower your latency a lot.
@@petar1401 Thanks for your info Petar! I am not aware that DLSS 3.0 can be enabled without enabling the frame insertion feature. Do you have a reference where this is explained?
Great real talk review. It is nice to hear intelligent points and opinions on the subject and not just numbers.
Dude you are amazing! The details you provided were very informative and helpful! Thank you so much ♥ You and JayzTwoCents are my fav tech reviewers.
I think DLSS has some artifacts with reflections; not just in the water seen in the earlier benchmark, but the bottles above the bar in the Cyberpunk benchmark have bright, blown-out reflections on them.
It seems very similar to what AMD did with their new CPU power consumption targets. Buildzoid was amazed by how much performance you got limiting the chip to 150w.
Very nice video, man. That 70% power plan is such a great find. With that program you used, do you just dial it in to 70% and that's it? Or do you need to change other settings as well?
Wireview looks great
Showing the power draw and watt stuff.. Extra testing you did.. Extremely valuable! I'm highly interested into this.
In my life, from the Voodoo 1 onward, we never saw such a high generational leap in performance. I just hope mid-tier Ada will provide similar improvements.
You won't see the same performance leap with the 80 and below cards. The 4080 (16 GB) has only 59% of the 4090's core count (9728/16384). So the 70 and 60 class cards will not have this same performance leap as the 90 series.
@@Arunpaarthi If "4070 tier" and/or "4070 price" GPU can match 3090 performance, specs don't really matter. Regarding this launch there were so many rumors almost every week, from 800W to 3x performance, but we got a pretty good gen. leap without excessive power usage.
@@alpha007org If it can match that performance at 60% of the power consumption, can't ask for more.
@16:18 THANK YOU! This is an actual demonstration of a device's efficiency. I've been complaining that people aren't doing something similar with the 7950X for weeks. Everyone wants to fixate on 95C instead for no reason, without actually showing that the extra temp is negatively affecting energy efficiency.
The performance curve compared to the power draw is really really important here. So many watts are really overkill, I would prefer something much more efficient while maintaining the performance.
Realistically most people will buy last gen cards, so if you have done a similar video using the 3060ti or 3070, that would be awesome information.
22:30 This can be from a bad bearing. It would be hard to hear on a low static pressure fan, but this can also be from air intermittently being blocked along the path of the blade. Air raid sirens use this to generate sound: the RPM sets the frequency, but the volume of air that pulses out of the siren as the blades pass the blockages determines the loudness.
I don't know how I feel about so many of these benchmarks being provided by Nvidia. It obviously skews the data in their favour out of the entire pool of games which users will use the 4090 on.
Really glad someone came up with performance per watt research. Great video, keep up the good job!
I was very skeptical but I'm actually impressed by the power efficiency (ironic that its a 400W card) and DLSS3.
I'm still gonna vote with my wallet for RDNA3 if it is at least 95% as good as the RTX4000, more power efficient and priced reasonably.
Surely if those factors matter, considering the (real) 4080 makes sense too?
Power efficiency is something I've already tried to explain in forums and on YouTube as a 3090 owner. In most indie games the card is almost idling; I have games that run at 800MHz and like 35W because it's no work at all for these powerful GPUs, and to run 120fps limited by vsync it uses like 170-220W, when an OC'd 1080Ti could barely do 110 for 330W. It's only when you start maxing RT that these cards start sweating (people forget those are separate cores, which are now getting hot too).
@@fredEVOIX And the tensor cores used for DLSS are separate too, causing a very similar effect.
Your FPS per watt chart is the BEST, especially combined with your comment that it's better than trying to undervolt. Super, super helpful. Thank you!
The numbers are crazy - the design decisions even more so. Let's hope Nvidia are running scared with good reason, i.e. desperate to hold on to the FPS crown, because they know they're going to lose out to RDNA3 in every way that matters to those of us with more sense than money!
I undervolt my 3070 to about 75% power draw but it still maintains 97% of its stock configs benchmark scores. If I feel like being really stingy with power draw I can tune it to 45% power draw and still get 75% of the stock curve's performance.
Ampere was wildly power inefficient out of the box as well, it's good to know 40 series appears to be tuning friendly as well even if it's not the standard undervolting method.
Excellent video, I'm new to the channel but I've been enjoying your takes on Zen 4 as well.
Alright, you definitely have earned a subscriber here. Between your testing and reviews of the AMD 7000 series, diving into the quirks and issues with its IHS, and your thorough and consumer-focused review of the 4090 and its power target, you've shown great ability to appreciate performance while also being able to critically examine these companies' questionable decisions. Keep up the good work and thank you.
Thank you for the power to performance scaling! It is a very useful piece of content among the hundreds of reviews online.
+1 for PUBG benchmarks including eSports settings. So rare for reviewers to include this type of detail. Awesome work! Would love to see 1080p benchmarks at these settings to see how the 1% lows would pair up with the new BenQ XL2566K 360Hz monitor for the ultimate experience.
To me this says the reason EVGA left is that nvidia literally spec- and price-floored AIBs out of selling a normal 2 or 3 slot cooler card for around 1200 usd instead. They want their best chip on Kingpin-class products only, which pretty much every AIB model is.
Much love. Seriously impressive how much key info is packed into this video
The cooler design being way over the top seems to fall in line with their supposed late u-turn on 600W bios.
I was also thinking about the power consumption bill... you got it precisely. Many major tech reviewers couldn't get that factor into their perf/value studies. Good review.
7:56 I think the most accurate way of checking that would be to put each video in a different track in the video editor, one on top of the other, and set the top one’s fusion mode to difference.
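One way to script that difference idea outside an editor is ffmpeg's blend filter; a sketch with hypothetical filenames, assuming both captures share resolution, frame rate, and length:

```python
# Difference-blend two captures so identical frames come out near-black.
# Filenames are hypothetical placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "native.mp4",   # reference capture
    "-i", "dlss.mp4",     # DLSS capture
    "-filter_complex", "[0:v][1:v]blend=all_mode=difference[v]",
    "-map", "[v]",
    "diff.mp4",           # bright pixels = where the two clips differ
], check=True)
```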
The 18+ sign during the Cyberpunk comparison shows a very clear difference and what appears to be the DLSS sharpening and lighting influences
Wow, great idea Roman, taking those USB power testers and adapting them to the GPU power connector.
@10:13 The leaves on the tree are a pretty clear difference. Especially compared to Performance mode, they seem to be smeared out quite a lot. However, I'm glad that we can make these drastic performance increases today without having to resort to more and more power consumption.
Awesome testing with the power curve and efficiency charts
The TL;DW up front makes me want to watch more of the video - thank you
Great video, I can really see all the work that went into the video. I learned a lot so thank you!
Tyvm for the vid. Recently got a TUF 4090 because I wanted to wait until the power cable issue got resolved. Reducing the power limit is a much easier way to achieve stability and keep performance than undervolting. Really appreciate the info. I'm probably going to target 75-80% as the max and 70% at the lowest, depending on my game tests.
I've got a 3080 Ti Gaming X Trio I've tuned with the V/F curve in Afterburner: 16% less power draw, 10% more FPS after getting the clock to stick at 1800-1830 MHz. So with the 4090 it'd be even more beneficial to work with the V/F curve, looking at your findings, once it's fixed.
Thank you for the power efficiency curve. Liked and subscribed
In your A Plague Tale DLSS footage, you can notice some odd water sparkles as waves hit the rock; it's almost like a large white pixel box. Naturally this game isn't even out with proper drivers, but that part stood out for me.
A dual BIOS for the FE 4090 would have been good: a Normal mode that uses the settings you found to be very efficient for a small FPS loss, and a Boost mode that gives you the current power and the current FPS. Great video. Thanks. I sometimes don't watch der8auer videos for months, and then when something interesting is released he is always my favourite to watch.
At 20:36 the fan sounds like it's not getting enough air and creating vacuum bubbles, so it sounds like a racing car's engine or a helicopter blade hitting the air.
It's the cooler itself, it seems.
15:45 is amazing - great work !!
I really like your approach to testing, particularly the way you looked at the power draw; a lot more practical than a lot of the other reviewers.
I spotted differences, but not when pausing. In motion there is clearly flickering in the Cyberpunk demo at 14:32 for DLSS Quality, but it is hard to spot much with YouTube compression.
Great review! However, the input lag for DLSS 3 would be interesting, as the algorithm is buffering 2 images and then putting some in between. So even if you have 1000 fps instead of the original 18 fps, the gameplay would be terrible.
World class analysis regarding the power targets vs. performance. I wish power was something more reviewers focused on in general.
The truly mind-blowing thing is how pricing starts at $2400 in Romania. That's in US dollars.
Does the overkill cooling also improve performance per watt? More cooling, more efficiency? Hope we get this kind of cooling going forward.
Thanks man. Because of you now I can run 4090 on my 550w PSU! Amazing video.
7950x performance is already mind blowing and now the 4090... I wish they stopped making such improvements so we could not care about upgrading ><
The 7950X also has the same "problem": 38k in Cinebench R23 for 200W, but left unlimited you get 38.7k for 250W, which makes no sense (+1.8% score for +25% power usage).
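Running those quoted numbers as points per watt makes the point plain:

```python
# Points per watt from the quoted Cinebench R23 figures:
print(38000 / 200)                # 190.0 pts/W at the 200 W limit
print(38700 / 250)                # 154.8 pts/W unlimited
print((38700 / 38000 - 1) * 100)  # ~+1.8% score for +25% power
```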
I had the exact same fan issue with my 3080 Ti straight out of the box. Due to the shortage at the time, my only option was to RMA my brand new card 2 hours after buying it. To Nvidia's credit the process was simple enough and I had a new card in under a week.
This jump in performance is huge. It is early, but I cannot wait to see the next generation of GPUs! TSMC 3nm plus a chiplet design, probably. I hope we see like a 20 percent improvement over this gen but a heavy cut in power consumption.
I've fixed my own RTX 2070's strange noise that also came from a bearing gone bad. I first removed the sticker, then drilled into the back of the fan hub and refilled it with mineral oil (aka. vaseline oil, paraffinum liquidum, etc.), then put the sticker back on (had to use some glue to get it to stick right though). Works perfectly fine now, and if it should ever happen again, I'll just repeat the process. Tell your friends. It's kinda like the baking your graphics card to reflow the solder trick, but for the one moving part.
Der8auer: I tried doing the same thing with the 3070 Ti that I have, and concluded that the highest efficiency per watt was at 55% power draw. But I did one more thing: I tried using automatic overclocking in MSI Afterburner and limiting TDP at the same time, and efficiency improved even more. I was testing my card against the MSI Kombustor benchmark suite and found that with the combination of both, I had 97.9% of performance at 75% TDP. The card is now silent, the room is less hot, and yet I can't notice any difference.
Great Review! Really appreciate the power usage test that you have done. I am now very curious to see what the 4080 is going to be like, especially power vs performance.
That power meter is such a cool invention. Can't wait to need one with a 40 series card a couple years from now.
Hey Roman, that WireView would make a perfect plug adapter for those cards: just without the electronics inside, male plug on one side, female on the other, a U-shaped plug adapter so the stiff 4-way pigtail adapter doesn't have to be extremely bent to clear side panels on a case that can fit the card itself but doesn't have enough clearance for the cable. Also good for kinda hiding the ugly cable adapter. Just a product suggestion to consider.
Great review, Roman. Thank you for making your vids in english as well 👍
Thanks for the very interesting point of view on the power side, awesome videos as always, Roman!
Power consumption and power target tests are very valid test cases. It opens up a lot of options for how to use your GPU. Thanks for being so comprehensive, der8auer :)
@4:20 Wattage = voltage multiplied by amperage. It’s not added.
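A quick worked example of that formula, using the connector's nominal 12 V and the 4090's 450 W stock limit (my figures, not from the timestamp):

```python
# Power (W) = voltage (V) x current (A) -- multiplied, not added.
volts = 12.0    # 12VHPWR nominal rail voltage
watts = 450.0   # 4090 stock power limit
amps = watts / volts
print(amps)          # 37.5 A through the connector
print(volts * amps)  # 450.0 W back again, not 12 + 37.5
```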
Great reviews
Thanks for including the power consumption numbers.
I just followed your example of cutting power to 70%, and other percentages in 5% increments.
On my RTX 3080 12GB, every drop in power drops clock speeds. ... Dropping power to 70% drops clocks from about 1950-2000MHz down to 1600MHz.
Tested in The Witcher 3 Upgrade Edition, at max settings.
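A small sketch for logging that clock-vs-power behavior yourself, assuming the same nvidia-ml-py bindings as the earlier snippet:

```python
# Log SM clock and power draw once a second while you step the
# power target in Afterburner (assumes nvidia-ml-py).
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetClockInfo, nvmlDeviceGetPowerUsage, NVML_CLOCK_SM,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
for _ in range(30):                                   # ~30 s of samples
    mhz = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_SM)  # current SM clock
    watts = nvmlDeviceGetPowerUsage(gpu) / 1000       # reported in mW
    print(f"{mhz} MHz @ {watts:.0f} W")
    time.sleep(1.0)
nvmlShutdown()
```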
Also, with a lower power solution the card could fit in all cases and support lower wattage power supplies, for less costly upgrade paths.
GN Steve spotted the odd fan noise thing too. We shall see how this plays out. Although tempted, I'm going to hold a while longer before jumping on the 4090