you don't have to wait a decade, the ps6 and next xbox, which will be released in 5 to 7 years will have similar or even better performance than the rtx 4090
I just got a used 3090. This is basically an announcement for what you will be able to buy in 2 years for a high price and 4 years for a reasonable price.
It's fascinating to see this level of graphical grunt, but I must say my 3070ti is doing more than I need right now and the fact that 4090 costs the same as my entire pc build is just nuts.
well it is way more powerful, also im pretty happy with my 3070 as well its sucks that they hard lock newer software features, dlss 3 would have increased the life of 3000 series cards by a huge margin
@@itsd0nk Yeah, he misspoken at 3:08 reading the script. Should've been "almost a terabyte" of bandwidth. Instead, he read the script as "a gigabyte of bandwidth" which there is 1018 of them. I am surprised he didn't caught that as being off when reading the script.
Digital foundry always has this very calm vibe when talking about technology and I love how their charts set them apart from the rest. Their analysis always offers a unique perspective often comparing frame by frame gameplay to learn about new technologies and get a better understanding of new architectures.
I agree. The same goes for channels like Hardware Unboxed, which have really amazing charts, and set themselves apart from places like Gamer's Nexus and Digital Foundry. Their unique perspectives and insight into average performance for games is very calming for how objectively they tend to look at things. It's really chill. (...Heh.)
Great that you mentioned the issue with G-Sync not working with DLSS 3.0. That's something that would annoy the hell out of me after dropping 1.5K on a GPU.
So if you have a native g-sync processor/monitor, the DLSS 3.0 would cause issues? I ask cause I have the Alienware aw2521H 360hz monitor that has the g-sync module, and I play fps/competitive games so the decrease in latency is worrisome. I wonder, is there a way to run DLSS 2.0 on the new cards? Sounds like the last gen cards are better for my scenario…
yea, i dont see any reason to buy one now, specially not for this price, but maybe in 2-3 years for 300 or whatever will be out there..... it`s not worth is at all. there is no game i want to play that needs more power than my 3080 can deliver, and i dont think this changes in the next few years, i dont need to play on ultra, for this money i almost can buy a noe motocycle that lasts longer than 2 years :D
@@DubElementMusic I mean aren't the 90 series beyond gaming? The productivity on this card is absolutely phenomenal according to the Linus video. Buying it to only play games is not worth imo.
@@braxalt2228 the reason this exists is probably because people were using productivity focused cards for gaming and NVIDIA just took advantage of that.
@yello regardless of that, the pro version will still come out in 2024(equivalent close to 3090 perhaps) for 3 years run before ps6 (2027) imo they need to catch up on games requirement too especially this incoming UE5
Yeah, in 5 years these things used will likely be in the $500 range, certainly under $700 which will be amazing - it's fantastic to see such a huge leap in performance for basically no increase in cost - that said the cost of the 3090 was pretty insane already but at least we didn't get a huge jump there (and when you factor in inflation it's actually a bit _cheaper_ than 3090 was). Either way it's an incredible bit of kit - and we haven't even seen the 3rd party cards which will be even faster.
@@ReKonstructorraytracing the 5070 could be better, but not at 4k max settings. But another point is that you can always wait for better tech, and at some point you have to buy something. We don't know how long we live and we only live once. So if you don't want to wait until Q1 2025 you should buy a 4090 if you can afford it.
The last rig I built was back in 2016, using a GTX 1080 FE. At that time, the gains over previous generations were significant and allowed me to really enjoy then-current gen games in a way that my Xbox One and PS4 could never approach (21:9 aspect ratio, FOV sliders, nearly 60 FPS at ultrawide-HD, and so on). It was pricey at the time, costing about $500, if I remember correctly. Today, with so few games taking advantage of the Series X I own, at $500, I cannot rationalize why I would spend nearly $2,000 for a 4090, much less $1000 for a 3090ti, to play the same damn games, at similar fidelity. It's not a question as to whether I can afford the card, but rather a question of whether it makes any real sense to do so, given the current state of gaming and the fact that I'm content with my Series X at the moment. Buying one of these just feels... irresponsible. I don't want to encourage this kind of pricing for a consumer-grade PC component dedicated to playing games.
As always with PC, it's up to you and your wallet. I have a modest PC (GTX 1070ti and Ryzen 2700X) but i just can't keep up with the evolution and the price of the new technology. I've bought a Series X and a Switch instead so i have everything i need to enjoy games in descent condition, and my PC is still around if i need to play something that isn't on Consoles. It's a very efficient way to enjoy everything without having to sell a kidney.
@@MadX8 It not just about price also. Sure I think a good majority of people can afford these cards. They just make no sense value wise. I have a demanding job and family. I have a good PC, 2080 Ti and Ryzen 3600. I haven't gamed in it in 2 weeks because I've just had no time. When I am relaxing, Im just on my steam deck. A $1600 GPU makes no sense for people who have any sort of life outside of video gaming. For productivity and gaming, I could see someone making a case for it.
@Alarge Corgi2 from 2 outlets already I've seen complain about it I believe you have to reduce quality in some way in the form of sub sampling to get 144, I'll have to go to their videos and see if they made corrections or not or if its true.
It's kind of weird seeing mainstream game bloggers and influencers trashing things like the Steam Deck and other handhelds for their battery life, that they should be smaller, more powerful, (and make waffle fries too) while there seems to be no end in sight to how large and power hungry things will be on this other side of the market. I mainly find it nonsensical that they don't use even options given to them and don't consider the balance act manufacturers go though providing these devices at reasonable costs, but it's also like they don't pay any attention to just how crazy power consumption, size, etc has become or how impressive it is to be running the latest games on complete systems that max out at like 40 watts.
"performance with say 150W TGP" Unlikely to happen. Rate of transistor shrinking is slowing down. Will need advancements in material science to boost clocks with same current draw, or massive die sizes.
@@JimBob1937 technology will always move forward wether it be Intel or a third party having extreme high efficiency chips is just a matter of time physical limitations are only that until a new tech comes forward and breaks that limit.
This is massive in 4k dude anywhere from 40 to 2x the performance even natively sometimes most of the time sitting around 70 to 80%gains natively that's freaking massive and massive in ray tracing. I'm truely shocked because we are only used to seeing 30-40% gains to generations. Also the cpu limits it even further can't imagine how good it is with even more powerful cpu's in the future! I truely didn't expect anything like this massive lol. I'm truely shocked at the raster gains!
A lot of reviewers who used new ryzen and old 5800 3d seen massive cpu bottlenecks, even at 4K in some game. I’m waiting to see how new 13900k 24 core handles the 4090 before I buy a cpu.
@MirroredVoid Uh, well given 12900k was a roughly a 12% uplift over 11900k, and 7950x was a 15% uplift over 5950x 10% would be disappointing actually. Rumors are that 13900k will be a larger jump over 12900k that that over 11900k so maybe check those numbers and educate yourself there chief.
Are you joking? Current CPUs can carry it just fine, hell, old gen CPUs can. Next gen will be a walk in the park. CPUs have gotten insanely strong, don't underestimate them lmao
I still follow high-end PC, but I stopped keeping up years ago. If I were to budget correctly, I could afford this kind of technology. Only just though... I'd rather not
Pretty impressive performance jump, but of course it's too expensive. But in the least it shows that tech didn't stagnate at all. Excited to see how amd responds.
It's the absolute top of the line GPU, of course it's not priced for average people. That's like complaining that a Bugatti isn't priced for the average driver.
@@Totenglocke42 yeah, but games shouldn't be targeted just for richest people. and people wont buy game that needs high end hardware to run. It would be financial suicide for devs.
@@avatarxs9377 State of the art will always price out normal people. You can run games perfectly fine on lower end cards but if you want the best of the best you gotta pay for it. It will always be this way.
It didn't quite stagnate yet, but for me, it still looks like a game of being between a rock and a hard place: -they'll run out of die shrinks eventually, and in fact, effective die size for lower end GPUs has been increasing even taking into account node shrinks: GTX 1060 (200mm^2, TSMC 16nm); RTX 2060 (445mm^2, TSMC 14nm); RTX 3060 (276mm^2, Samsung 8nm, which is a die size reduction compared to the RTX 2060, but still an increase compared to the 1060 despite going from TSMC's 16nm to Samsung's 8nm); RTX 4060... -Bigger dies mean more power hungry chips, and Nvidia have been able to stick with 250w max for top GPUs pretty much until the RTX 2000 series, but that was no longer the case with the RTX 3000 series and onwards, with all the implications associated (ATX3.0 and PCIe 5.0 connectors to handle the power increase) -Silicon wafer sizes and dies sizes have physical limits from a production pov (silicon wafers are currently 300mm in diameter, and reticle limit allows for die sizes of ~800mm^2 max), as well as economical limits from a production pov (how much will a RTX 6060 with a +500mm^2 die size cost?) -There's only so much performance gains a new GPU architecture can do at any given node shrink The good news is that what's already out there should be enough for quite a few years, but something will have to radically change eventually...
I want this for VR so bad... Makes me think I'll build a new system over the next year and wait for prices to be a bit more reasonable before biting the bullet on the video card
Maybe wait for VR headsets with eye tracking (and therefore supporting foveated rendering) to come out so you don't need to spend a month's rent on a GPU.
If you wait like a year you might even get that same performance for maybe half the price and maybe power draw. Maybe the RTX 5070/5080... Or an equivalent AMD card, if you prefer to have efficiency and still want to have a smaller looking card.
@@liamness Hoping the PSVR2 releases at an affordable price, sees wide adoption, and begins to make foveated rendering standard in high-end VR titles. Really a lot is banking on Sony's success in that area in my opinion.
these prices are especially rough in today's economy, I think definitely so for the 4080 and fake 4080. but one thing i do appreciate about nvidia is they continue to push for big gains of GPU power even though they're already best in class performance. I'll almost certainly be picking up another Nvidia card next but I still want to see how RDNA 3 comes out
Thank you for the very good review guys. I actually enjoyed this one more than any other reviews. You did go talk about things you didn’t before and I respect that to the fullest. There was a key moment now for the 4090 that is convincing me that this time around, as expensive as this card is, it makes the games work better across the board without DLSS - that is a HUGE seller for me. Now to wait for the RDNA3 stuff because you never know but if I am to upgrade with NVIDIA this generation, I guess so far it will be this one or when they have the Ti version of it even for extra kicks. THANK YOU for going the extra mile on this review Rich and team. Much respect.
I'd love a 4090, but I built up my first PC in 10+ years with an EVGA 3080ti not too long ago, so I'll wait for the 5000 series and the competition. Hopefully we get the same 4K RT power with much less energy consumption.
@@snowpuddle9622 They indeed added new dedicated hardware to enable DLSS3 reconstruction, that's way input lag was increased as rendering pipeline has now more steps.
7:25 am interesting use for DLSS 3.0 would be to allow frame doubling above the engine limit. It would be a life saver for 30fps console ports, and also pretty cool to reach 500fps on other titles.
It probably worsens the experience as it increases latency. It would look way smoother, but IMO the latency is by far the main problem when playing at low framerates. Though, it may turn out to be a valuable feature specially for lower tier future RX 40##
@@ziokalco It would certainly be bad for games like Dark Souls 1, but for slow games it should be just fine of it can be forced from the drivers. When latency is important, the best case scenario would be to turn 120fps into 240fps to take advantage of your Shiny new monitor.
I wonder if DLSS 3 can evolve to predict frames at arbitrary spacing between frames, similar to Topaz algorithms. That would allow it to nullify visual frame stutter.
That Portal clip is just further proof that DLSS will always be a necessity if you want 4k quality visuals with RT. Thankfully a lot of games aren't path traced yet otherwise we would be in some biggie trouble
"Thankfully a lot of games aren't path traced yet" Do you really think they are gonna start making a lot of games even the most powerful hardware can't handle?
Wouldn't be surprised if NVidia at some point uses the DLSS3 software feature to cut down on chip size so the real performance gains only go to the bigger chips for raw performance, which NVidia then most likely, only will use on enterprise level cards... and then the biggest chips you will get in Nvidia consumer graphics cards will be max. 300-350mm²
Do we really need that amount of horse power? I feel like games are getting really grand in scale but also in detail, which makes most games nowdays a visual spectacle but also an empty shell, all this while also ramping up development time, crunch culture, costs and because of that, releasing incomplete games with predatory practices like loot boxes or season passes to make up for everything i said before. I appreciate advancements in technology, 4K 120fps with ray tracing is amazing, but i don't this necessary at the moment for the games right now (maybe it's great for other applications outside gaming realm) Such a high celling and the race for the highest fidelity has shadowed the game design and artistry behind games. I really wish more devs would use easier and less demanding engines so they could focus on delivering finished, inventing and passionate products instead of trying to impress with state of the art tools.
In a way, raytracing does lessen the load on devs though. Less requirement for carefully placing probes and baking lighting information into a scene. Then going through the process again if there is any change in the scene, or if there are other dynamic elements like time of day / weather.
Could not disagree more. So many of the problems in gaming tech have to do with devs using paradigms that are OUT OF DATE. Many highend technologies actually REDUCE the burden on devs in the long run despite an initial investment (like PBR or ray tracing)!
4090 is a beast, biggest gen on gen Rasterization increases I've ever seen. Price certainly reflects this. Thanks for all your hard work DF on this launch video. Would love to see DLSS Quality mode compared to Native (Both DLSS 2 and 3) as that's what I play with since DLSS Perf has some loss in quality vs Native but Quality mode does not for the most part. Seems like 4090 is a 120-144 Hz 4k Max settings beast. Can't wait to see Unreal 5 games played on it!
I think it should be noted that DLSS3 isn't quite just adding a "few ms" of input lag and massively improving performance. It's adding that extra few ms vs the base framerate but vs the framerate it's targeting, it's significantly laggier. So for instance a 30FPS game frame generated to 60 will still feel like a bit worse than the same game at 30FPS input latency wise, and while it'd look like 60FPS it won't play like 60FPS. Ofcourse the higher the framerate the lower the impact of it as the differential narrows down.
Unless I’m misremembering, it’s adding a few ms to the input lag you’d get from DLSS 2.0, not from the base frame rate. This ends up still being more responsive than native resolution, but slightly less responsive than DLSS 2.0.
@@DevNug But it's significantly less responsive than the framerate it's targeting if it were to run at that framerate natively. If a game has 80ms of lag at 30 FPS and 50ms lag at 60fps. Then using DLSS3 will make it "run" at 60FPS but the input lag will not be anywhere near 50ms, instead it'll be closer to 80ms.
Why anyone would even think of dlss 3.0 with a 4090 is beyond me. Just select a native resolution and you can still hit as much as 360fps conveniently with zero drops.
As someone who has been showcasing AI based interpolation since 2020, to see it done in realtime rendering is bananas. Hats off to Nvidia. Also can't wait for Nvidia Remix!
@@gavinderulo12 lol why is that? Certainly their pricing is high this time. I don’t plan on buying a 4000 series card until maybe the 4060 or 4070 honestly.
@@wizzenberryTV frame interpolation has huge input lag. It just doesn't matter since its not used for interactive media. The fact that nvidia is able to achieve this in real time, with better results, is more than impressive.
Would be interesting if you'd test these massive high end GPUs in VR applications, things like Flight Simulator or Assetto Corsa Competizione, on which the 3000 gen struggles massively compared to pancake games
I'll say this much, with as advanced as Unreal Engine 5 and these new GPUs are, if developers don't use UE5's Chaos Engine to its full potential, games built with it aren't going to be much more than just visual improvements. Fully physics-based maps/worlds though where nothing's static and everything is destructible or interactable? Now that's revolutionary. I thought we were headed in that direction with the likes of the Red Faction series, but we all know what happened there. All we were left with was the Battlefield series. Just Cause 3-4 were good, but I don't recall being able to affect the environment/map. That needs to change. Not just in First Person Shooters, but in any game really where you have powerful abilities. Even racing games would benefit from fully destructible environments. Imagine crashing, and then instead of abruptly stopping when you hit a wall or fence, you go through it.
The 4090 is a phenomenal GPU. Literally doubles the performance of the 3090 in most ray traced games, and that's referring to raw performance. No upscaling or frame gen.
I'm actually surprised. The 4090 is a certified beast compared to the 3090. We haven't seen this sort of jump in GPU power from generation to generation in a really long time.
@UA-cam Account Eh I would suppose otherwise and I don't know why people are so butthurt about "fake frames" when in reality they won't be able to tell a goddamn difference, but they still gotta cry because it isn't 'genuine'
I love every time Richard sneaks "streets ahead" into his reviews or analyses, as if it was a common phrase. I appreciate this. I just wanted to acknowledge it.
Surprisingly we have yet to see any Unreal Engine 5 games to release and they alreadly planning on an UE 5.1 soon. While we have some games that were announced to be made using it, including Hellblade 2, we haven't seen any release date to any of them yet. Guess development still take a long time even with all these conveniences to reduce said time.
geez, the raster performance of this thing is insane. Even at lower resolutions like 1440p you still gain 30+ frames.. at 4k with dlss 3 it makes my current 3090 eat shit.. both in and out of RT and with and without dlss 3. Massive gains in performance.
I got mine and the performance is insane. I'm so frequently CPU limited now even at 5120x1440. I can make pretty good use of my 240Hz refresh rate even in a lot of AAA games. Waiting for my 13900K to ship now to try and alleviate the cpu bottlenecking.
There's no energy crisis in America, at least not in red states where we actually care about things like quality of life and not being 3rd world peasants.
Would be interesting to see DLSS 3.0 working with Microsoft Flight sim in VR seeing as it's mostly bottlenecked by the cpu. Those artificial frames could really help smooth things out. Currently I get around 45fps with a fairly even mix of medium, high, and a couple of ultra settings on my Reverb G2 with RTX 3080ti. 4090 is too pricey though. I wont be upgrading until at least next summer when I get my bonus and maybe by then 30 series stock will have depleted leading to a price drop on 40 series?
I watched many reviews of this GPU and apparently everybody forgets the performance uplift that gtx 1080 ti had over gtx 980 ti while costing 699$... yes, rtx 4090 is quite the thing but a lot less impressive when you factor in the price tag... again, Nvidia's marketing worked wonders in persuading consumers that 1600$ is the norm for a GPU...
@Sorey there are a lot of good options it terms of any other type of hardware except the GPUs… that’s true… maybe the route from now on is used GPUs when the newer gen comes out…
This card is for paypigs and professionals, everyone else can wait for the cheaper ones. The silicon shortage isn't over, everyone in the business wants to get every dollar they can for the chips they can actually get from TSMC. Same reason AMD won't be rescuing us from high prices: they have a certain amount of silicon to sell and it's going to be made into high-end CPUs first. It's shitty for us but it's good business.
Appreciated the discussion on "fake frames". Yes, it's definitely not "increasing performance", but the end result is indistinguishable from better performance: Higher framerate at good frame pacing. So, does it even matter?
Input latency is a real thing and even this video mentioned wanted to test it in a fast paced FPS (for example). That latency may well be too much in some scenarios.
Yeah, but if you're playing a fast paced FPS, you likely won't be cranking up settings to the maximum, and you can just disable DLSS3. If you're playing a very demanding 1 player game though a small latency increase doesn't matter.
It's interesting how you would probably want to wait for a few CPU generations and display technology advancements to actually make good use of the 4090. At that rate, delaying your high end GPU purchase till CPUs catch up might be the most economical option. If you're only going to use 50% of 4090's power by capping frame rate for current and even next gen games, you're essentially sitting on a solution looking for a problem in the, ahem, here and now.
Makes no sense to me we already have 4k 144hz screens and the 4090 is only cpu bound at 1080 and sometimes at 1440p but the obvious resolution this card is aiming for is 4k so cpu performance doesn't matter mostly...
@@i3l4ckskillzz79 I both agree and disagree. Only DP 1.4 has the bandwidth for 4k144, and HDMI 2.1 can only go up to 4k120, these are the two ports 4090 supports. Secondly, we can see that with DLSS3, 4090 can very much provide ~4k200 experience in Spider-Man, a heavy AAA title from 2022, all those extra frames are beyond the display output spec. With optimised settings and engines designed for next gen, who's to say that can't become a norm with DLSS3.
Detailed rundown as always. I wonder if in those times when even at 4k it's running in excess of the refresh rate if it's time to use DLDSR with DLSS3.
Nice. so now we as consumers would have to pay enormous amount of money to buy these cards so game publishers can release their titles without any optimization knowing 40 series cards would run it. no issue.
Or a much more realistic scenario, no new game will use the extra power of those cards, because are just entusiast things, who are pretty rrarre to find, and cost too much to buy, so there is no market for developers.
People that will own 4080s and 4090s are in the minority. In general they will optimize for the cards that most people can buy I would imagine because that's where the most money comes from
u mean topaz well they still not using the monster 3080 which i got good it needs more updates once that happens yea it will run real good i know cause i did a 30 min cartoon with waifu and it took 10 min whereas topaz with the same one takes 1hr-hr and 30 min
@@cmoneytheman topaz is much more suited for irl picture, CG etc for cartoons and anime stuff i found ESRGAN with a custom model to be the best but thats for images, the video upscalers use a a whole different methods and algorithms, i.e the results aren't as perfect thats y they are much more lightweight
@@ShahZahid i wasnt comparing i just was saying the speed difference is way better with waifu i expect topaz to run that way once it gets more updates i have always thought that was gonna be the case with these monsters 30 series cards
Really feels like if we continue to get at least half this uplift generation on generation, in two more generations the 6-tier of GPUs will comfortably be getting ~200+ FPS in 4k. I genuinely believe that will happen by or before 2026 and that 4k at 200+ FPS really be a comfortable spot for most of players to stop. That thought makes me really giddy thinking about VR headset resolution potential by then as well.
The problem with a "next-generation of games" is that we are not going to get them until the consoles and mid-range cards will be able to run them, too, very few people will actually have a 4090 in their PC so developers may not bother at this moment, and with the way nVIDIA is pricing the whole series, I'd say we're in for a complete AAA market stagnation, which is already filled with stagnation and microtransactions.
Very true. The 1650 class of card is still by far the most used card on steam hardware survey. Few developers are going to put themselves in a position where 95% of gamers can't run their game. Nvidia's pricing is so far outside the realm of reality as to be a joke so nobody is upgrading to this shit.
It'd be great if the frame interpolation was a feature that could be turned on independent of the game engine. It'd be a great way of giving a boost to older games, particularly those which are console ports where things like controls and physics break if you run beyond 60fps. Imagine being able to 'upscale' old 8 and 16 bit games to 120fps. It's such a potentially transformative technology. I'd love to see some analysis though on the inserted frames to see whether there is visible ghosting, artifacting, etc.
Might as well wait for the 4070 since the 4090 is so powerful. No need spend that much money if DLSS 3 is bumping up specs like that. I still wanna see what AMD does this time around with RT/FSR/RSR
VR would have been worth a mention. Something like the Reverb G2 with 2 x 2160 panels at 90Hz pushed 3090Ti too far. This looks like the card to solve it.
Honestly I was expecting even a MENTION on 8k performance on this card, I remember when the 3090 came out and Nvidia was saying to tech reviewers test the 8k possibilities of that card, so I was interested to see how the 4090 compares with that.
@@MrVoland44 it is interesting to play at 8k even on a 4k 27"panel you notice the difference in sharpness, i tried mirrors edge the first one from 2008 at ultra 8k on 3070 and was getting 30 fps or so
Nvidia 1080 user here. :0) Glad i waited i will most likely pic up the 4080ti once it hits then i should be covered for a loooooooooooong time and if i can't then my 1080 is still perfect!
Very impressive card, but Will wait for the next card in 2024...and like everyone said before, pricing is really an issue. most of us doesnt are modern salves that doesnt earn much at all, and with the pricy cost of just living, its hard to buy something like this.
best thing about all this IMO is DLSS3. The future of it working with GSync will be great because you won't need a top tier GPU. A 4080 or less will stil be getting amazing framerates and making these new monitors even more worth having.
The Next Level In Electricity Bills too. This thing is like new Taylormade driver that comes out every year I always roll my eyes after I saw one announced and said "Well at least now I can get 2019 model for reasonable price"
@@CaptToilet How are you defining special? If it can hit 60 fps in most games at 8k then it would be the first card to do it, that would be special to me.
@@Noliving It would be really cool to see what 8K resolution would actually look like on an 8K TV for gaming. How much can you tell the difference over 4K.
Considering I was able to snag two 3090s ($700 each) for less than the price of a 4090, Price is the issue here. Used cards are plentiful and are not at all a bad buy. I refurbished both of my 3090s with fresh thermal paste and thermal pads and get solid temps better than new. So is 60% to 80% performance increase worth $900 more with the current market of used cards and heavily discounted new cards? It will be interesting to see what RDNA3 brings and the pricing AMD decides. The 4090 isn't a bad product, just a bad price.
the price isn't even *that* bad, it's the 4080 and the '4080' where price is abysmal. performance should be considered with what resolution you are using; sub 4k high refresh and it won't be worth it in the majority of titles for a couple of years.
If frame times are too high, and you don't want stutter, would enabling fast sync work? ie. To eliminate screen tearing for framerates above the monitor's supported rate.
Here's the thing. Developers make games for the PS5 and XBSX. Having this much power only makes sense for VR on PC, where you actually do need hi-res high refresh rate rendering, but even that will be leaps and bound a improved with foveated rendering. So I honestly don't see the use case for these GPUs atm.
Going all in on PC for the next gen seeing all games with exception of Nintendo going to PC on day 1 or within 1 year. I have a pc with a 3070 and 3700X cpu but will give that to my daughter and will build a new pc with a 4090 or 4080 with either a 7900 cpu or 13900K Intel. Can’t wait to just have one systems for my Xbox,PS and PC gaming for VR and Triple A titles
Amd's GPU offerings unveiled on Nov 3rd, unlikely but Nvidia may adjust prices for their lower end. The 7X003D cpu's should release Q1/2. An investment into your screen might be a wiser choice in the meantime, regardless all the tech is killer atm have fun!
i dont think its gonna get cheaper as we are now reaching the limits of chip manufacturing, either we need a breakthrough in chip chemistry and architecture or lesser performance gains that way the price will be justified
Thanks for the review !! Just a correction, at timestamp 3min:08s of the video. " A Gigabyte per second of memory bandwidth " 1018GB/s is actually aproximately a Terabyte per second of memory bandwidth.
Can't wait to get my hands on one next decade
you don't have to wait a decade, the ps6 and next xbox, which will be released in 5 to 7 years will have similar or even better performance than the rtx 4090
🤣 I'm with you
I just got a used 3090. This is basically an announcement for what you will be able to buy in 2 years for a high price and 4 years for a reasonable price.
@@MGTOW89 ya right
And that’s about when you’ll need one to actually play a game.
It's fascinating to see this level of graphical grunt, but I must say my 3070ti is doing more than I need right now and the fact that 4090 costs the same as my entire pc build is just nuts.
well it is way more powerful, also im pretty happy with my 3070 as well its sucks that they hard lock newer software features, dlss 3 would have increased the life of 3000 series cards by a huge margin
I just got a 3080
.. upgraded from a 1080ti. I just don't ever set a time that I need anything this beastly!!!
@@jcaashby3 here I am hoping this causes a massive drop in 3080 and 3090 pricing
My ass still running a 1050Ti 😢
Rich people exist and here we are.
Not a Gigabyte per second bandwidth, the Bandwidth is approx 1 Terabyte per second. Which is insane.
?!
@@itsd0nk memory
@@itsd0nk Yeah, he misspoken at 3:08 reading the script. Should've been "almost a terabyte" of bandwidth. Instead, he read the script as "a gigabyte of bandwidth" which there is 1018 of them. I am surprised he didn't caught that as being off when reading the script.
Radeon VII had that years ago :P:P
@@adi6293 Which was tbh a test batch of a GPU. AMD is in a bit of trouble with this kind of bumps in performance...
Digital foundry always has this very calm vibe when talking about technology and I love how their charts set them apart from the rest. Their analysis always offers a unique perspective often comparing frame by frame gameplay to learn about new technologies and get a better understanding of new architectures.
they are the best analysis channel, hands down
The atmosphere is why I always start with DF. Next up: GN
I agree.
The same goes for channels like Hardware Unboxed, which have really amazing charts, and set themselves apart from places like Gamer's Nexus and Digital Foundry. Their unique perspectives and insight into average performance for games is very calming for how objectively they tend to look at things. It's really chill.
(...Heh.)
Great that you mentioned the issue with G-Sync not working with DLSS 3.0. That's something that would annoy the hell out of me after dropping 1.5K on a GPU.
1.5k is cheap, in EU the FE costs €2K and the aftermarket starts from €2.4k
So if you have a native g-sync processor/monitor, the DLSS 3.0 would cause issues? I ask cause I have the Alienware aw2521H 360hz monitor that has the g-sync module, and I play fps/competitive games so the decrease in latency is worrisome. I wonder, is there a way to run DLSS 2.0 on the new cards? Sounds like the last gen cards are better for my scenario…
@@EndoV2 DLSS 2.0 is also possible to use on the 4000 series
I'm definitely not the target demo for this card but it's cool to see the crazy performance of it. I'd like to see it run marbles.
yea, i dont see any reason to buy one now, specially not for this price, but maybe in 2-3 years for 300 or whatever will be out there..... it`s not worth is at all. there is no game i want to play that needs more power than my 3080 can deliver, and i dont think this changes in the next few years, i dont need to play on ultra, for this money i almost can buy a noe motocycle that lasts longer than 2 years :D
@@DubElementMusic
I mean aren't the 90 series beyond gaming? The productivity on this card is absolutely phenomenal according to the Linus video. Buying it to only play games is not worth imo.
@@DubElementMusic 300? LOL.
@@MavihsLH none of the cards with their pricing right now is for consumers
@@braxalt2228 the reason this exists is probably because people were using productivity focused cards for gaming and NVIDIA just took advantage of that.
It always great to see what I will be having in 6 to 8 years.
me as egyptian watching this while still using gtx 1060 for gamng and using half broken rift cv1 to play vr games
Yep, hopefully the RTX 6060 can pull this off in late 2026 at a reasonable price. Just in time for the PS6 announcement =/
@yello regardless of that, the pro version will still come out in 2024(equivalent close to 3090 perhaps) for 3 years run before ps6 (2027) imo they need to catch up on games requirement too especially this incoming UE5
@@adzamree6870 im expecting the pro to be closer to the 4080 especially if they price it at 600$
@@johannessguten2527 thats going to be overpowered lol considering its optimized
Glad this exists to push graphics forward, hopefully we can get something with this performance in the mainstream in around 5 years
Yeah, in 5 years these things used will likely be in the $500 range, certainly under $700 which will be amazing - it's fantastic to see such a huge leap in performance for basically no increase in cost - that said the cost of the 3090 was pretty insane already but at least we didn't get a huge jump there (and when you factor in inflation it's actually a bit _cheaper_ than 3090 was). Either way it's an incredible bit of kit - and we haven't even seen the 3rd party cards which will be even faster.
In 5 years you will be begging Nvidia to get their xx60 for 1600usd.
Fake „generated“ FPS with scaling is no „Performance“. Lets hope AMD will bring a card that will get 120FPS in real native 4K.
@@wiLdchiLd2k
No?
You are just
hater.
The numbers are clear,what more do you need?
Especialy if it looks better than Native 4K.
@@wiLdchiLd2k They won’t. Can crush those hopes right now.
Just built my first pc. I913900k and 4090 founders edition. This pc is unbelievable
The 50 series specs was just leaked, and the 5070 is 15%ish percent better than your 4090 lol
@@ReKonstructor No it wasn't. Why did you just lie so comfortably like that? You are ridiculous.
@@ReKonstructor They won't even release the 5070 til Q2 2025💀 "5070 gOt LeAkEd". Bro fell for a yt clickbait vid
@@ReKonstructorEven the 5080 won't be as good as 4090 when it comes to rasterization.
@@ReKonstructorraytracing the 5070 could be better, but not at 4k max settings. But another point is that you can always wait for better tech, and at some point you have to buy something. We don't know how long we live and we only live once. So if you don't want to wait until Q1 2025 you should buy a 4090 if you can afford it.
Despite it's price, you have to admit the 4K performance is stunning. Almost seems to good for most current games.
This aged well
@romankozlovskiy7899 it did? 4090 is a 4k monster still
The last rig I built was back in 2016, using a GTX 1080 FE. At that time, the gains over previous generations were significant and allowed me to really enjoy then-current gen games in a way that my Xbox One and PS4 could never approach (21:9 aspect ratio, FOV sliders, nearly 60 FPS at ultrawide-HD, and so on).
It was pricey at the time, costing about $500, if I remember correctly.
Today, with so few games taking advantage of the Series X I own, at $500, I cannot rationalize why I would spend nearly $2,000 for a 4090, much less $1000 for a 3090ti, to play the same damn games, at similar fidelity.
It's not a question as to whether I can afford the card, but rather a question of whether it makes any real sense to do so, given the current state of gaming and the fact that I'm content with my Series X at the moment.
Buying one of these just feels... irresponsible. I don't want to encourage this kind of pricing for a consumer-grade PC component dedicated to playing games.
Great comment. It definitely creates a tough decision, it’s just so costly. I currently looking at videos detailing this card in terms of production.
Gotta love Richard's casual use of the phrase "streets ahead".
In the here and now
For one of this graphics card only, you can buy all 3 major gaming consoles
You can build a PC and a PS5 for the cost of this GPU lol.
As always with PC, it's up to you and your wallet.
I have a modest PC (GTX 1070ti and Ryzen 2700X) but i just can't keep up with the evolution and the price of the new technology. I've bought a Series X and a Switch instead so i have everything i need to enjoy games in descent condition, and my PC is still around if i need to play something that isn't on Consoles.
It's a very efficient way to enjoy everything without having to sell a kidney.
@@MadX8 It not just about price also. Sure I think a good majority of people can afford these cards. They just make no sense value wise. I have a demanding job and family. I have a good PC, 2080 Ti and Ryzen 3600. I haven't gamed in it in 2 weeks because I've just had no time. When I am relaxing, Im just on my steam deck.
A $1600 GPU makes no sense for people who have any sort of life outside of video gaming. For productivity and gaming, I could see someone making a case for it.
And I'd take a 4k 240hz capable gpu with DLSS 3 over some console with reduced graphics, meh 1 effect RT and low fps and poor joypad control schemes.
I REALLY wish they provided DP2.0 with the 4090.
honestly a surprising disappointment for me
@@aaz1992 Im curious as to how this is disappointing to you? Lower msrp than the 3090 was and much faster? what more could you want?
Another SHOCKER, officially the $1600 Geforce GPU is not supporting PCIe Gen 5.
@@randomguydoes2901 easily do 144fps/hz on alot of games and capped at DP 1.4 120hz LEL
@Alarge Corgi2 from 2 outlets already I've seen complain about it I believe you have to reduce quality in some way in the form of sub sampling to get 144, I'll have to go to their videos and see if they made corrections or not or if its true.
Clearly very powerful in say 5 - 6 years I would like to see this kind of performance with say 150W TGP that’s the real future
It's kind of weird seeing mainstream game bloggers and influencers trashing things like the Steam Deck and other handhelds for their battery life, that they should be smaller, more powerful, (and make waffle fries too) while there seems to be no end in sight to how large and power hungry things will be on this other side of the market.
I mainly find it nonsensical that they don't use even options given to them and don't consider the balance act manufacturers go though providing these devices at reasonable costs, but it's also like they don't pay any attention to just how crazy power consumption, size, etc has become or how impressive it is to be running the latest games on complete systems that max out at like 40 watts.
"performance with say 150W TGP"
Unlikely to happen. Rate of transistor shrinking is slowing down. Will need advancements in material science to boost clocks with same current draw, or massive die sizes.
@@JimBob1937 I wouldn’t say it’s unlikely it’s just a matter of time
@@CH-bn7qb , we're hitting physical limitations of silicon. Intel's latest 1.8nm announcement has transistor feature size at 9 silicon atoms.
@@JimBob1937 technology will always move forward wether it be Intel or a third party having extreme high efficiency chips is just a matter of time physical limitations are only that until a new tech comes forward and breaks that limit.
This is massive in 4k dude anywhere from 40 to 2x the performance even natively sometimes most of the time sitting around 70 to 80%gains natively that's freaking massive and massive in ray tracing. I'm truely shocked because we are only used to seeing 30-40% gains to generations. Also the cpu limits it even further can't imagine how good it is with even more powerful cpu's in the future! I truely didn't expect anything like this massive lol. I'm truely shocked at the raster gains!
@@KoeiNL No, it doesn't. It's almost 3x as efficient as 3090ti.
@@KoeiNL are you high? It used less power then the 3090ti lol
Sounds like these will pair really well with the Ryzen 7000 3D cache variants when they get released. Can't wait to see those results!
A lot of reviewers who used new ryzen and old 5800 3d seen massive cpu bottlenecks, even at 4K in some game. I’m waiting to see how new 13900k 24 core handles the 4090 before I buy a cpu.
@@4gbmeans4gb61 lol, Intel's own first party benchmarks show the 5800X3D beating the 13900K in *_multiple_* titles.
@@mitlanderson will there be a 7950x 3D? I’m looking to build my first pc?
@@johannessguten2527 yes, q1 2023
@@mitlanderson that great to hear I guess I’ll wait then maybe these gpus will also have minor price drops by then as well
absolutely brutal, hope the next generation of CPU's will be able to carry this beast.
@MirroredVoid Uh, well given 12900k was a roughly a 12% uplift over 11900k, and 7950x was a 15% uplift over 5950x 10% would be disappointing actually. Rumors are that 13900k will be a larger jump over 12900k that that over 11900k so maybe check those numbers and educate yourself there chief.
@@ross-carlson butt man
Are you joking? Current CPUs can carry it just fine, hell, old gen CPUs can. Next gen will be a walk in the park. CPUs have gotten insanely strong, don't underestimate them lmao
@@crylune no way i have r9 5900x and i barely get 60fps on spiderman with rt and eith only 70%gpu usage
@@eslamarida What GPU and what resolution?
I still follow high-end PC, but I stopped keeping up years ago. If I were to budget correctly, I could afford this kind of technology. Only just though... I'd rather not
Idk bro. I save like 5k after expenses but I cant even stomach this cost.
Pretty impressive performance jump, but of course it's too expensive. But in the least it shows that tech didn't stagnate at all. Excited to see how amd responds.
It's the absolute top of the line GPU, of course it's not priced for average people. That's like complaining that a Bugatti isn't priced for the average driver.
‘Too expensive’. Seriously what else did people expect at this level of performance??
@@Totenglocke42 yeah, but games shouldn't be targeted just for richest people. and people wont buy game that needs high end hardware to run. It would be financial suicide for devs.
@@avatarxs9377 State of the art will always price out normal people. You can run games perfectly fine on lower end cards but if you want the best of the best you gotta pay for it. It will always be this way.
It didn't quite stagnate yet, but for me, it still looks like a game of being between a rock and a hard place:
-they'll run out of die shrinks eventually, and in fact, effective die size for lower end GPUs has been increasing even taking into account node shrinks: GTX 1060 (200mm^2, TSMC 16nm); RTX 2060 (445mm^2, TSMC 14nm); RTX 3060 (276mm^2, Samsung 8nm, which is a die size reduction compared to the RTX 2060, but still an increase compared to the 1060 despite going from TSMC's 16nm to Samsung's 8nm); RTX 4060...
-Bigger dies mean more power hungry chips, and Nvidia have been able to stick with 250w max for top GPUs pretty much until the RTX 2000 series, but that was no longer the case with the RTX 3000 series and onwards, with all the implications associated (ATX3.0 and PCIe 5.0 connectors to handle the power increase)
-Silicon wafer sizes and dies sizes have physical limits from a production pov (silicon wafers are currently 300mm in diameter, and reticle limit allows for die sizes of ~800mm^2 max), as well as economical limits from a production pov (how much will a RTX 6060 with a +500mm^2 die size cost?)
-There's only so much performance gains a new GPU architecture can do at any given node shrink
The good news is that what's already out there should be enough for quite a few years, but something will have to radically change eventually...
i love how the "developer test tool" for Cyberpunk could still be mistaken for standard gameplay. especially that hiccup at 12:58
I want this for VR so bad... Makes me think I'll build a new system over the next year and wait for prices to be a bit more reasonable before biting the bullet on the video card
Dripping with supersampling
Maybe wait for VR headsets with eye tracking (and therefore supporting foveated rendering) to come out so you don't need to spend a month's rent on a GPU.
If you wait like a year you might even get that same performance for maybe half the price and maybe power draw. Maybe the RTX 5070/5080... Or an equivalent AMD card, if you prefer to have efficiency and still want to have a smaller looking card.
@@liamness Hoping the PSVR2 releases at an affordable price, sees wide adoption, and begins to make foveated rendering standard in high-end VR titles. Really a lot is banking on Sony's success in that area in my opinion.
@@ericd.9038 Quest Pro just announced and has eye tracking too, definitely not affordable though!
these prices are especially rough in today's economy, I think definitely so for the 4080 and fake 4080. but one thing i do appreciate about nvidia is they continue to push for big gains of GPU power even though they're already best in class performance. I'll almost certainly be picking up another Nvidia card next but I still want to see how RDNA 3 comes out
I would say they are absolutley diabolicle and ludicrous .
Thank you for the very good review guys. I actually enjoyed this one more than any other reviews. You did go talk about things you didn’t before and I respect that to the fullest.
There was a key moment now for the 4090 that is convincing me that this time around, as expensive as this card is, it makes the games work better across the board without DLSS - that is a HUGE seller for me.
Now to wait for the RDNA3 stuff because you never know but if I am to upgrade with NVIDIA this generation, I guess so far it will be this one or when they have the Ti version of it even for extra kicks.
THANK YOU for going the extra mile on this review Rich and team. Much respect.
I'd love a 4090, but I built up my first PC in 10+ years with an EVGA 3080ti not too long ago, so I'll wait for the 5000 series and the competition. Hopefully we get the same 4K RT power with much less energy consumption.
And a more compact form factor
Energy consumption is not that bad here, its unlikely to get much better in the future. Physics.
I also invested in a EVGA 3080Ti equipped machine..it’s a beast!
The leaks suggest slightly better raster and slightly worse RT performance with lower TDP from RDNA 3.
@@Batman-bh6vw slightly worse RT performance? Top RDNA 3 sku targeting 4080 12gb - 4018 16gb RT performance, and 4018 16gb at best case scenario.
The technology in this thing is actually insane
there'sno technology, they just slapped more wattage to it
@@snowpuddle9622 ok
@@snowpuddle9622 They indeed added new dedicated hardware to enable DLSS3 reconstruction, that's way input lag was increased as rendering pipeline has now more steps.
@@snowpuddle9622 it has better perf/watt so even if you capped the 4090 at a lower TDP, it'll still shit on equivalent AMD or NVIDIA GPUs.
@@lyserberg keep being delusional
7:25 am interesting use for DLSS 3.0 would be to allow frame doubling above the engine limit. It would be a life saver for 30fps console ports, and also pretty cool to reach 500fps on other titles.
It probably worsens the experience as it increases latency. It would look way smoother, but IMO the latency is by far the main problem when playing at low framerates.
Though, it may turn out to be a valuable feature specially for lower tier future RX 40##
If it can do that*
@@ziokalco It would certainly be bad for games like Dark Souls 1, but for slow games it should be just fine of it can be forced from the drivers.
When latency is important, the best case scenario would be to turn 120fps into 240fps to take advantage of your Shiny new monitor.
for 30fps locked games there are generally mods and methods to bypass those, either a mod or a launch parameter
Wouldn't the games also need to internally support DLSS3? I think older console ports that run 30fps are highly unlikely to get DLSS3
I wonder if DLSS 3 can evolve to predict frames at arbitrary spacing between frames, similar to Topaz algorithms. That would allow it to nullify visual frame stutter.
Whoa! That's a good point.
Stutter is definitely the plague of PC gaming. It would be great if dlss was able to mitigate the issue.
Nvidia Engineers screaming at their interns to write that shit down as we speak
It looks like it does if we go by the frametime graphs of the benchmarks. DLSS 3 is almost butter smooth
That Portal clip is just further proof that DLSS will always be a necessity if you want 4k quality visuals with RT. Thankfully a lot of games aren't path traced yet otherwise we would be in some biggie trouble
No we won't just turn that shit off problem solved
@@kickassguy211 no
"Thankfully a lot of games aren't path traced yet"
Do you really think they are gonna start making a lot of games even the most powerful hardware can't handle?
Not true, as Portal doesn't use ray tracing, it uses path tracing which is many times more intensive on performance than RT.
Wouldn't be surprised if NVidia at some point uses the DLSS3 software feature to cut down on chip size so the real performance gains only go to the bigger chips for raw performance, which NVidia then most likely, only will use on enterprise level cards... and then the biggest chips you will get in Nvidia consumer graphics cards will be max. 300-350mm²
Waiting for November to see the new rival from AMD 😎
Do we really need that amount of horse power? I feel like games are getting really grand in scale but also in detail, which makes most games nowdays a visual spectacle but also an empty shell, all this while also ramping up development time, crunch culture, costs and because of that, releasing incomplete games with predatory practices like loot boxes or season passes to make up for everything i said before.
I appreciate advancements in technology, 4K 120fps with ray tracing is amazing, but i don't this necessary at the moment for the games right now (maybe it's great for other applications outside gaming realm) Such a high celling and the race for the highest fidelity has shadowed the game design and artistry behind games.
I really wish more devs would use easier and less demanding engines so they could focus on delivering finished, inventing and passionate products instead of trying to impress with state of the art tools.
100% Exactly what I was thinking.
Well said 👏
In a way, raytracing does lessen the load on devs though. Less requirement for carefully placing probes and baking lighting information into a scene. Then going through the process again if there is any change in the scene, or if there are other dynamic elements like time of day / weather.
Could not disagree more. So many of the problems in gaming tech have to do with devs using paradigms that are OUT OF DATE. Many highend technologies actually REDUCE the burden on devs in the long run despite an initial investment (like PBR or ray tracing)!
4090 is a beast, biggest gen on gen Rasterization increases I've ever seen. Price certainly reflects this. Thanks for all your hard work DF on this launch video. Would love to see DLSS Quality mode compared to Native (Both DLSS 2 and 3) as that's what I play with since DLSS Perf has some loss in quality vs Native but Quality mode does not for the most part. Seems like 4090 is a 120-144 Hz 4k Max settings beast. Can't wait to see Unreal 5 games played on it!
I think it should be noted that DLSS3 isn't quite just adding a "few ms" of input lag and massively improving performance. It's adding that extra few ms vs the base framerate but vs the framerate it's targeting, it's significantly laggier. So for instance a 30FPS game frame generated to 60 will still feel like a bit worse than the same game at 30FPS input latency wise, and while it'd look like 60FPS it won't play like 60FPS. Ofcourse the higher the framerate the lower the impact of it as the differential narrows down.
Unless I’m misremembering, it’s adding a few ms to the input lag you’d get from DLSS 2.0, not from the base frame rate. This ends up still being more responsive than native resolution, but slightly less responsive than DLSS 2.0.
@@DevNug But it's significantly less responsive than the framerate it's targeting if it were to run at that framerate natively. If a game has 80ms of lag at 30 FPS and 50ms lag at 60fps. Then using DLSS3 will make it "run" at 60FPS but the input lag will not be anywhere near 50ms, instead it'll be closer to 80ms.
yes and yes, as DF said, good framerate health before the DLSS 3 effect is crucial.
Why anyone would even think of dlss 3.0 with a 4090 is beyond me. Just select a native resolution and you can still hit as much as 360fps conveniently with zero drops.
As someone who has been showcasing AI based interpolation since 2020, to see it done in realtime rendering is bananas. Hats off to Nvidia. Also can't wait for Nvidia Remix!
But nvidia baaad...
@@wizzenberry tv based interpolation is pretty bad. There’s just no comparison between what’s happening here and what TVs do with their interpolation.
@@gavinderulo12 lol why is that? Certainly their pricing is high this time. I don’t plan on buying a 4000 series card until maybe the 4060 or 4070 honestly.
@@wizzenberryTV frame interpolation has huge input lag. It just doesn't matter since its not used for interactive media. The fact that nvidia is able to achieve this in real time, with better results, is more than impressive.
Would be interesting if you'd test these massive high end GPUs in VR applications, things like Flight Simulator or Assetto Corsa Competizione, on which the 3000 gen struggles massively compared to pancake games
Hello fellow VR sim master race. vr in racing games is an absolute game changer.
@@TheScyy thank you man, what title did you test it with?
I'll say this much, with as advanced as Unreal Engine 5 and these new GPUs are, if developers don't use UE5's Chaos Engine to its full potential, games built with it aren't going to be much more than just visual improvements. Fully physics-based maps/worlds though where nothing's static and everything is destructible or interactable? Now that's revolutionary. I thought we were headed in that direction with the likes of the Red Faction series, but we all know what happened there. All we were left with was the Battlefield series. Just Cause 3-4 were good, but I don't recall being able to affect the environment/map. That needs to change. Not just in First Person Shooters, but in any game really where you have powerful abilities. Even racing games would benefit from fully destructible environments. Imagine crashing, and then instead of abruptly stopping when you hit a wall or fence, you go through it.
The 4090 is a phenomenal GPU. Literally doubles the performance of the 3090 in most ray traced games, and that's referring to raw performance. No upscaling or frame gen.
I'm actually surprised. The 4090 is a certified beast compared to the 3090. We haven't seen this sort of jump in GPU power from generation to generation in a really long time.
Since 980 Ti to 1080 Ti basically.
Or if you're AMD, from Radeon 7 / 5700 XT to 6900 XT / 6950 XT.
that power consumption tho...
@UA-cam Account Keep dreaming lmao
yep gonna try to get one tomorrow morning and sell my 3090 for $900
@UA-cam Account Eh I would suppose otherwise and I don't know why people are so butthurt about "fake frames" when in reality they won't be able to tell a goddamn difference, but they still gotta cry because it isn't 'genuine'
I just got my 4090 FE to be able to stream PC RT titles 👍 can’t wait to see what games in 2024 will utilize RT (Hellblade II is a given so far)
if I don't pay rent or buy food for 3 months, I could afford this
Richard: "In the here and now, there is a new GPU king".
I love every time Richard sneaks "streets ahead" into his reviews or analyses, as if it was a common phrase. I appreciate this. I just wanted to acknowledge it.
I was just looking to see if someone else acknowledged it!
I wonder if joules per frame will become α common benchmark for video cards in the future.
@@rockapartie i have never seen it before
Now we need next gen games. Good ones.
Rtx 5000 will be out by then 🤷🏼.
"Make gaming great again."
Surprisingly we have yet to see any Unreal Engine 5 games to release and they alreadly planning on an UE 5.1 soon. While we have some games that were announced to be made using it, including Hellblade 2, we haven't seen any release date to any of them yet. Guess development still take a long time even with all these conveniences to reduce said time.
It took 3 generations, but now I have a worthy upgrade for my 1080ti
Good you had the patience, I didn't and went for the 30 series. Kinda wish I waited.
sadly for 3x the price
Im upgrading my 980ti lol
1080 here. New Year build once all the hardware is in the wild
My GTX 1080 has served me very well for last the 6 years, but it's finally time to upgrade my whole build!
geez, the raster performance of this thing is insane. Even at lower resolutions like 1440p you still gain 30+ frames.. at 4k with dlss 3 it makes my current 3090 eat shit.. both in and out of RT and with and without dlss 3. Massive gains in performance.
How does that make you feel?
@@Stardomplay it makes me feel all antsy in my pantsy
@UA-cam Account 30+ frames is a big deal going from 30 to 70
Sounds like CPU’s are gonna need to get a whole heck of a lot faster, This is insane, i love competition!
Even displays. This can be a 8k card!
@@zacthegamer6145 ...and there's no point in playing anything in 8k.
@@esaedvik except vr games
@@mirukuteea Probably not, but that remains to be seen. Haven't really seen many AAA VR games yet.
People just need to hop on to 4K 240Hz displays which already cost less than this card. CPU bottleneck auto-resolved.
I got mine and the performance is insane. I'm so frequently CPU limited now even at 5120x1440. I can make pretty good use of my 240Hz refresh rate even in a lot of AAA games. Waiting for my 13900K to ship now to try and alleviate the cpu bottlenecking.
Nice dude
There's no energy crisis in America, at least not in red states where we actually care about things like quality of life and not being 3rd world peasants.
I remember no long ago the 3090 and 3090ti was a beast. Time flies.
2080 ti is a beast
4K, RTX or VR! Why is everyone forgetting how demanding VR is?!
Would be interesting to see DLSS 3.0 working with Microsoft Flight sim in VR seeing as it's mostly bottlenecked by the cpu. Those artificial frames could really help smooth things out. Currently I get around 45fps with a fairly even mix of medium, high, and a couple of ultra settings on my Reverb G2 with RTX 3080ti.
4090 is too pricey though. I wont be upgrading until at least next summer when I get my bonus and maybe by then 30 series stock will have depleted leading to a price drop on 40 series?
I love how cyberpunk glitches during a benchmark at 12:58 :D
Fantastic review, as always. Kudos to Digital Foundry. The information at 13:32 was especially 'eye-opening' too 🤣
I watched many reviews of this GPU and apparently everybody forgets the performance uplift that gtx 1080 ti had over gtx 980 ti while costing 699$... yes, rtx 4090 is quite the thing but a lot less impressive when you factor in the price tag... again, Nvidia's marketing worked wonders in persuading consumers that 1600$ is the norm for a GPU...
@Sorey there are a lot of good options it terms of any other type of hardware except the GPUs… that’s true… maybe the route from now on is used GPUs when the newer gen comes out…
This card is for paypigs and professionals, everyone else can wait for the cheaper ones. The silicon shortage isn't over, everyone in the business wants to get every dollar they can for the chips they can actually get from TSMC. Same reason AMD won't be rescuing us from high prices: they have a certain amount of silicon to sell and it's going to be made into high-end CPUs first. It's shitty for us but it's good business.
Appreciated the discussion on "fake frames". Yes, it's definitely not "increasing performance", but the end result is indistinguishable from better performance: Higher framerate at good frame pacing.
So, does it even matter?
Input latency is a real thing and even this video mentioned wanted to test it in a fast paced FPS (for example). That latency may well be too much in some scenarios.
Yeah, but if you're playing a fast paced FPS, you likely won't be cranking up settings to the maximum, and you can just disable DLSS3.
If you're playing a very demanding 1 player game though a small latency increase doesn't matter.
Richard is the most zen UA-camr out there right now. Always living in the here and now.
It's interesting how you would probably want to wait for a few CPU generations and display technology advancements to actually make good use of the 4090. At that rate, delaying your high end GPU purchase till CPUs catch up might be the most economical option.
If you're only going to use 50% of 4090's power by capping frame rate for current and even next gen games, you're essentially sitting on a solution looking for a problem in the, ahem, here and now.
Makes no sense to me we already have 4k 144hz screens and the 4090 is only cpu bound at 1080 and sometimes at 1440p but the obvious resolution this card is aiming for is 4k so cpu performance doesn't matter mostly...
@@i3l4ckskillzz79 I both agree and disagree. Only DP 1.4 has the bandwidth for 4k144, and HDMI 2.1 can only go up to 4k120, these are the two ports 4090 supports. Secondly, we can see that with DLSS3, 4090 can very much provide ~4k200 experience in Spider-Man, a heavy AAA title from 2022, all those extra frames are beyond the display output spec. With optimised settings and engines designed for next gen, who's to say that can't become a norm with DLSS3.
@@DivjotSingh you are just guessing
@@astrolillo yes of course
Detailed rundown as always. I wonder if in those times when even at 4k it's running in excess of the refresh rate if it's time to use DLDSR with DLSS3.
good question. I'm sure Alex will get to the bottom of that
Nice. so now we as consumers would have to pay enormous amount of money to buy these cards so game publishers can release their titles without any optimization knowing 40 series cards would run it. no issue.
Just look at the "requirements" for A Plague Tale 2. A 3070 for 1080/60. Wooow.
Or a much more realistic scenario, no new game will use the extra power of those cards, because are just entusiast things, who are pretty rrarre to find, and cost too much to buy, so there is no market for developers.
Dont worry consoles are still the common denominator. Games still need to run at atleast 30 fps on 2070 equivalent console hardware.
People that will own 4080s and 4090s are in the minority. In general they will optimize for the cards that most people can buy I would imagine because that's where the most money comes from
4K-120fps-RT-ultra graphics. Chefs kiss
Good review! Although why you can't show frame rates and percentages at the same time is beyond me.
Guys, VR has been a thing since March 29, 2013 . . . it's time to start adding a few VR games into the benchies :D
I'm really interested to see what image AI tools will be able to do with this
u mean topaz well they still not using the monster 3080 which i got good it needs more updates once that happens yea it will run real good i know cause i did a 30 min cartoon with waifu and it took 10 min whereas topaz with the same one takes 1hr-hr and 30 min
@@cmoneytheman no Open AI, Stable Diffusion, Midjourney, Dall-E and others like that
@@amj2048 oh well should be great these cards are monsters even the one i got 3080
@@cmoneytheman topaz is much more suited for irl picture, CG etc for cartoons and anime stuff i found ESRGAN with a custom model to be the best but thats for images, the video upscalers use a a whole different methods and algorithms, i.e the results aren't as perfect thats y they are much more lightweight
@@ShahZahid i wasnt comparing i just was saying the speed difference is way better with waifu i expect topaz to run that way once it gets more updates i have always thought that was gonna be the case with these monsters 30 series cards
Really feels like if we continue to get at least half this uplift generation on generation, in two more generations the 6-tier of GPUs will comfortably be getting ~200+ FPS in 4k. I genuinely believe that will happen by or before 2026 and that 4k at 200+ FPS really be a comfortable spot for most of players to stop. That thought makes me really giddy thinking about VR headset resolution potential by then as well.
Reply only to be clear: 6-Tier meaning 7*6*00XT or 40*6*0 to 40*6*0 Ti etc.. Around there.
I saw comments like this in the past decade.. New Plague Tale game requires RTX3070 for 1080p 60fps, and games are going to get more demanding..
you might but before you know it 4k will be obsolete and they will be focusing on 8k or some newer form of tec
Litrally the best 4090 review all over UA-cam period.
The problem with a "next-generation of games" is that we are not going to get them until the consoles and mid-range cards will be able to run them, too, very few people will actually have a 4090 in their PC so developers may not bother at this moment, and with the way nVIDIA is pricing the whole series, I'd say we're in for a complete AAA market stagnation, which is already filled with stagnation and microtransactions.
Are you dumb? New gen can run games just as good as mid range pc… so stfu not everyone can afford to spend all their $$ on games 🤦♂️ 🤡
Very true. The 1650 class of card is still by far the most used card on steam hardware survey. Few developers are going to put themselves in a position where 95% of gamers can't run their game. Nvidia's pricing is so far outside the realm of reality as to be a joke so nobody is upgrading to this shit.
graphics settings exist for a reason, compare rdr2 max graphics compared to consoles
It'd be great if the frame interpolation was a feature that could be turned on independent of the game engine. It'd be a great way of giving a boost to older games, particularly those which are console ports where things like controls and physics break if you run beyond 60fps. Imagine being able to 'upscale' old 8 and 16 bit games to 120fps. It's such a potentially transformative technology. I'd love to see some analysis though on the inserted frames to see whether there is visible ghosting, artifacting, etc.
The interpolator does require access to motion vectors and depth information though, would be difficult to hack into older games
it could work not as well when doing 30fps to 60 interpolation
Buy a samsung tv...
Might as well wait for the 4070 since the 4090 is so powerful. No need spend that much money if DLSS 3 is bumping up specs like that. I still wanna see what AMD does this time around with RT/FSR/RSR
VR would have been worth a mention. Something like the Reverb G2 with 2 x 2160 panels at 90Hz pushed 3090Ti too far. This looks like the card to solve it.
THE EMBARGO LIFTED! HOORAY!
13:29: Magnus dominus!
23:18: Noice!
I just got a 3080. I'm about to turn 50. I don't see myself ever getting or needing a 4090. it's just overkill for anything im trying to do.
Do you play much?
We have made it. Memory bandwidth is no longer in GB/s it’s in TB/s
Honestly I was expecting even a MENTION on 8k performance on this card, I remember when the 3090 came out and Nvidia was saying to tech reviewers test the 8k possibilities of that card, so I was interested to see how the 4090 compares with that.
Saw someone doing like 8k60 in GTA V. If you type 4090 review there are plenty.
@@simptrix007 and it wasnt getting even 60fps at 4k ultra settings in gtav an almost decade old game
we just hit 4K 120fps standard, and you already want 8K? to play what, in 45fps?))
@@MrVoland44 it is interesting to play at 8k even on a 4k 27"panel you notice the difference in sharpness, i tried mirrors edge the first one from 2008 at ultra 8k on 3070 and was getting 30 fps or so
The details at 13:31 are just mouth watering.
DLSS quality vs performance would be great to see vs native.
dlss 3.0 graphically is the same as 2. just with frame generation. so if you look at any video comparing dlss 2.0 it will be the same
@@lordadz1615 You can run DLSS 3 without any upsampling.
@@Safetytrousers oh i didnt know that lol. i guess if thats the case it would look the same as any other setting ur using
Nvidia 1080 user here. :0)
Glad i waited i will most likely pic up the 4080ti once it hits then i should be covered for a loooooooooooong time and if i can't then my 1080 is still perfect!
Very impressive card, but Will wait for the next card in 2024...and like everyone said before, pricing is really an issue. most of us doesnt are modern salves that doesnt earn much at all, and with the pricy cost of just living, its hard to buy something like this.
best thing about all this IMO is DLSS3.
The future of it working with GSync will be great because you won't need a top tier GPU. A 4080 or less will stil be getting amazing framerates and making these new monitors even more worth having.
me as egyptian watching this while still using gtx 1060 for gamng and using half broken rift cv1 to play vr games
22:33 What happened to the audio here? 😂
Love the video, keep up the good work.
The Next Level In Electricity Bills too.
This thing is like new Taylormade driver that comes out every year I always roll my eyes after I saw one announced and said "Well at least now I can get 2019 model for reasonable price"
If you reduce the power by 33% to 300w it only reduces performance by 5%
Picked one up from microcenter earlier today. Just started cyberpunk right now. Freaking AMAZING card.
I would love to see an 8k resolution rasterization benchmark on this card.
It won't be anything special. The higher your resolution the bigger your diminishing returns.
@@CaptToilet This card might be the first to have real 8K potential, like the 1080 Ti did for 4K.
@@CaptToilet How are you defining special? If it can hit 60 fps in most games at 8k then it would be the first card to do it, that would be special to me.
@@Noliving It would be really cool to see what 8K resolution would actually look like on an 8K TV for gaming. How much can you tell the difference over 4K.
@@loki76 My enthusiasm for 8k is the fact that you don't really need Anti-Aliasing at that resolution.
Considering I was able to snag two 3090s ($700 each) for less than the price of a 4090, Price is the issue here. Used cards are plentiful and are not at all a bad buy. I refurbished both of my 3090s with fresh thermal paste and thermal pads and get solid temps better than new. So is 60% to 80% performance increase worth $900 more with the current market of used cards and heavily discounted new cards? It will be interesting to see what RDNA3 brings and the pricing AMD decides. The 4090 isn't a bad product, just a bad price.
the price isn't even *that* bad, it's the 4080 and the '4080' where price is abysmal. performance should be considered with what resolution you are using; sub 4k high refresh and it won't be worth it in the majority of titles for a couple of years.
$700 for a single piece of a PC seems ludicrous.
Wow the performance is great. Seems like something i can buy after 5 years. Meanwhile I am happy rocking games on a CRT with 1050Ti 😁.
I play games on a CRT too even with my 3060
I mean this level of performance will be in the 50 series..in about 7 years or so.
Get an oled and use Black Frame Injection. The same what a crt do, lighter, bigger, less piwer consumption
@@alucardh.6413 As someone with both, no they do not do the same thing although both are amazing.
23:24 - 23:26 lol I see that Richard is still young at heart with that little quip. You can even hear him stifle a laughter right after too
Another masterful single take, unscripted review from Rich.
Perfect card for an LG C2 full 4K 120hz. I'll see you both in 5 years my friend, I will start saving.
this card alone uses more watt than my entire rig. maybe even the monitor included.
Yeah right, your entire rig uses less than 400 watts? Is your rig a laptop?
If frame times are too high, and you don't want stutter, would enabling fast sync work?
ie. To eliminate screen tearing for framerates above the monitor's supported rate.
Here's the thing. Developers make games for the PS5 and XBSX. Having this much power only makes sense for VR on PC, where you actually do need hi-res high refresh rate rendering, but even that will be leaps and bound a improved with foveated rendering. So I honestly don't see the use case for these GPUs atm.
Basically this card and next ones will be useful for true next gen games. There still hasnt been one though..
@@Masilya111 Are you taking about the PS6? It's been 2 years of PS5 games. PS6 is 3 years away, but then, the 5+series will be out.
Going all in on PC for the next gen seeing all games with exception of Nintendo going to PC on day 1 or within 1 year.
I have a pc with a 3070 and 3700X cpu but will give that to my daughter and will build a new pc with a 4090 or 4080 with either a 7900 cpu or 13900K Intel. Can’t wait to just have one systems for my Xbox,PS and PC gaming for VR and Triple A titles
Amd's GPU offerings unveiled on Nov 3rd, unlikely but Nvidia may adjust prices for their lower end. The 7X003D cpu's should release Q1/2. An investment into your screen might be a wiser choice in the meantime, regardless all the tech is killer atm have fun!
my 3060ti I got at launch is still doing just fine for my needs luckily.
This makes waiting for the 50 series more enjoyable. Knowing if I start saving i can actually give $$$ instead of a kidney.
Bro you gotta get the BBC GPU
22:34, nice audio transition you got there
Between the power requirements and price for these new gpus, it's sustainably unaffordable.
Nvidia doesn't give a shit
I can afford them
dont buy it simple, its like iphones, they are expensive af but people still buy them,
There is less left for "The Arrival" to Mars.
Guess I'll wait for the 5070 to buy it at an affordable price
i dont think its gonna get cheaper as we are now reaching the limits of chip manufacturing, either we need a breakthrough in chip chemistry and architecture or lesser performance gains that way the price will be justified
Thanks for the review !!
Just a correction, at timestamp 3min:08s of the video.
" A Gigabyte per second of memory bandwidth "
1018GB/s is actually aproximately a Terabyte per second of memory bandwidth.