Maffs hard, and I'm a dummy: Should be 2.7X not 1.7X if it's 2 dies.
Of course there's no perfect scaling though, just like with the B100/200
Was going to say your math doesn't make sense. I know training is different from gaming, but didn't he say it could do the same training at the same speed with 1/4 of the GPUs, 2,000 vs 8,000? Now you could say those 2,000 GPUs are really 4,000 because each is two dies connected, but it was also 1/4 the power consumption, so there must be some efficiency improvements, no?
You're just trying to cover your behind with your previous 35% uplift prediction by also predicting 70%, so you can always claim "you're right" as a "leaker". GB102 will not be multi-chiplet. Most of the ~70% performance uplift will come from the programmable L1 cache, better clock speeds, and ~50% more SMs.
@@louisfriend9323 you caught me man
It depends on keeping the added resources busy - en.wikipedia.org/wiki/Amdahl%27s_law .
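For anyone who wants to see what that link implies in numbers, here is a minimal sketch of Amdahl's law applied to the dual-die rumor; the 85% parallel fraction is purely an illustrative assumption, not a leaked figure:

```python
# Amdahl's law: overall speedup from scaling only part of the workload.
# speedup = 1 / ((1 - p) + p / s), with p = scalable fraction, s = resource multiple.
def amdahl_speedup(parallel_fraction: float, scaling: float) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / scaling)

# Illustrative only: if 85% of a frame's work spreads across two dies,
# doubling the hardware yields ~1.74x, in the ballpark of the rumored ~70%.
print(round(amdahl_speedup(0.85, 2.0), 2))  # 1.74
```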
I expect a 50% increase. 30% on the die and 20% from GDDR7.
70% faster and it will only cost your remaining kidney and pinky finger.
You mean you have sold your car and children to get this gpu.
Doesn't it already cost a kidney to purchase an Nvidia GPU?
A kidney costs $2,000? Just get a part-time job and you'll be able to save enough money before release.
Buy more save more!
@@weyo14 lil bro part time job doesn’t even give you $1000
I have heard this before. I won't believe it until I see benchmarks.
The only accurate prediction will be the price.
When did it ever not come true???
The 4090 is around 75% faster than the 3090, so I don't see 70% faster being unreasonable.
The sad part is that the 5080 will only match the 4090, meaning the 5090 will be 70% faster than the 5080. In other words, it'll be worse than the 40 series, with the gap per tier being twice as wide.
@@NadeemAhmed-nv2br How is that sad?
So they'll be $2800 a piece then. Might as well be 500% faster at that price.
It could be 10,000x better, I still won't care at a $3K price point lol 😂😂😂
That's not how Nvidia's premium product pricing works. 70% faster than the 4090 would be like $4K minimum, because they're still going to sell the 4090 for $1.8-2K and somehow gimp the 5070 to be slower than the 4090, so they can keep charging $2K for the 4090 while the 5070 is out and "only" costs probably $1K. Hey, you can get almost a 4090 for "just" $1K, what a deal!
@@Andytlp What a market we're in now...
That, or we get two 5070 editions: one with less VRAM that is just a slightly better 4070, and the "normal" 5070, all at Nvidia's ridiculous pricing.
@@ever611 That's likely too. But unlike the Nvidia of a year or two ago, they have infinite money now, so with their gaming GPU division making cents on the dollar compared to AI accelerators, I'd be surprised if they still pull this crap. It happened because they had lots of Ampere GPUs to sell, so they jacked up the 4000-series prices at every tier to move Ampere. It worked, and people still paid the premium. So there's a high chance they'll jack Blackwell prices even higher than Ada's. If so, we're never getting more performance for the same price ever again.
@@Andytlp I know I'm in the vast minority, and this is wishful thinking more than anything, but I do believe that the prices of the 50-series will not be as ridiculous this time. If 5080 costs $750-$800, then it's not as bad, depending on performance, that is.
70% faster at heating my apartment maybe.
If Dragon’s Dogma 2 says anything about the future, we need these sorts of leaps because devs can’t optimize their games 😅
To say the game is poorly optimized is incorrect imo. The game has incredibly good graphics. There comes a point where "better optimization" isn't enough and people need to upgrade their hardware. On the other hand, there comes a point where a better-looking game hits sharply diminishing returns... it doesn't look that much better to the average person for how much harder it is to run.
What the game does need, though, is a low-settings performance mode that runs the CPU and GPU a lot less hard, maybe with extra-downscaled textures, fewer NPCs, or whatever.
@@ofon2000 clearly you don't play many games, because there are significantly better-looking games that run way better than DD2. Alan Wake 2, Forbidden West (which just released today), Cyberpunk: all of these look leagues better and, more importantly, run better too. So yes, it is badly optimized, and that's not up for debate.
@@ofon2000 you have no idea what you're talking about
I most appreciate the aesthetics of games from the turn of the century: 2D renders and sharp early 3D. The new aesthetics and the pursuit of realism do not suit me at all. That's why I'm glad that I haven't had to replace my hardware for 10 years hahaha.
Dragon's Dogma 2 is struggling HARD even on the best CPUs today. The game needs much more work under the hood on the CPU side of the render queue. It performs like trash.
"5090 could be 70% FASTER than the 4090 "
Isn't that what we hear every time?
No, we always hear 2 times, 2.5 times, or even 3 times as fast haha. A 60-70% increase is quite a realistic expectation, and it has ended up true more often than not.
@@arenzricodexd4409 you're right. The 4090 is 70% faster than the 3090. A 40-50% increase is more realistic for the 5090, though.
When did it ever not come true?
What he meant is the 5090 makes money 70% faster.
What new DLSS feature will they lock to the 50 series? Neural texture compression or whatever that's called?
DLSS 4.0 on the 50 series only
There must be something for sure, and it would be implemented either in the next AAA game or in Cyberpunk.
@@Ghost-pb4ts Cyberpunk won't be receiving any more major updates, so that's off the table. Maybe Alan Wake 2, though
@@Vorexia nahhh, Alan Wake 2's player base is already dead.
And Alan Wake 2 looks like a generic Unreal Engine horror game with no style.
Cyberpunk's open world presents all kinds of environments for the graphics to shine.
@@Vorexia so if not Cyberpunk, then GTA 6 will probably be Nvidia's new plaything.
If a 5090 were two 5080s fused together, I would expect NVIDIA to price the 5090 at twice the 5080, so I expect them to cost $1,500 and $3,000.
This video was 2 DIE FOR amirite?
Ba dum tsss
Dab dab
I wouldn't go that far.. death? For a GPU? Nah.. not really mate!
there is something VERY DIFFERENT about your voice in this video...... it just doesn't sound right.... did you use some AI where you feed it a few samples of you talking and you can type what you want and it'll speak it in your voice??? I can't tell if that's the case, but there's something SERIOUSLY DIFFERENT with your voice, and it's NOT a good thing
Aye, I did see two dies.
70% more performance for 700% more money and 7,000 times the power consumption. This gaming thing looks like some exotic hobby nowadays.
I don't know. It's never really been any different, and it's always the "enthusiasts" who get shafted. Your average PC gamer doesn't need path tracing @ 4K resolution and 120+ fps using fake frames.
Just in case you forgot: OG Doom required a ~$2K PC to run at 320x200 pixels at a capped 30(!!!) fps :) People simply forgot that "Ultra" settings were meant for future hardware, "Medium" settings should be the norm, and "High" is for, well, current high-end hardware. These days, the completely bonkers expectation is that sub-$300 GPUs can run every game at high settings, 1440p (the new 1080p), with 60+ fps...
@@totalermist I tend to make future-proof purchases. Today the machine is capable of Ultra settings, next year High, then Medium... and so on.
@@DarkGT Fair enough. But looking at the dev cycles of major games, I still think it's safe to assume that skipping one or two generations is fine even for more mainstream cards (currently that'd be your 4060/7600 up to maybe 4070/7800XT). It's just that expectations have to be managed somewhat, that's certainly true.
@@DarkGT I just use 720p monitor and refuse to upgrade. My gpus last a very long time...
I don't understand how you did the math here, but I will at least provide an actual logical argument.
You're making the premise that you need a 5090 to game on your computer?
That would only be true if you're gaming at 16M pixels and above, where 16M is dual 4K monitors.
And even then, you'd be talking about 16M pixels at a 240Hz refresh rate.
Unless you're doing that sort of gaming, you can buy a 7900 XTX and a 7th-gen X3D processor and call it a day with a $3-3.5K PC that can wreck 4K 120FPS gaming.
You don't need to exaggerate or make shit up, because a 5090/Ti is a use case for gaming on the G95NC at max output on Ultra graphics, perhaps even with ray tracing, not for your average gamer's setup.
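A quick sanity check of the pixel numbers in this comment (assuming the G95NC is the 7680x2160 dual-4K super-ultrawide):

```python
# Pixel counts behind the "16M pixels" figure above.
single_4k = 3840 * 2160       # 8,294,400 (~8.3M pixels)
dual_4k = 2 * single_4k       # 16,588,800 (~16.6M), e.g. a 7680x2160 panel
print(f"{single_4k:,} vs {dual_4k:,}")
```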
The price? Second mortgage and an internal organ?
They are not in the GPU business; they are an AI company.
There was no mention of power requirements... What's up with that? Will the GPU require 1000W for itself alone or what???
I can't wait to have 2 towers on my desk. One for my pc and one for my future 6090!
Nope, the 6090 will be the desk.
It will take about 9-10 months before the 5090 comes out.
You can have a kid by that time, sell him/her, and get a 5090.
One would probably need to install three-phase power to run that new card :D
and an industrial size HVAC unit ;)
You'd need a 1000w PSU just for the dual-die GPU, 1300w if overclocking it.
Why would you need to overclock it? Lol. At what point is there enough frames? Monitors can't keep up.
@@Squidgy55 Monitors can't keep up? We have 240Hz 4K 0.1ms OLED panels now. The 4090 can't produce 240 frames per second at 4K; I doubt the 5090 could either.
@@metallboy25 Like any regular gamer can afford all that shit lol. Don't come back at me with prices coming down in the future because I've seen it all before. Something will replace OLED before it's affordable for the average consumer. It's just a repeat of Plasma displays.
I don't think the 5090 is targeted at gamers, except maybe a small % of rich ones. The majority of buyers will be content creators and AI content producers. I believe the majority of 4090 buyers were content creators too; I heard rumors from a salesman. PRO Nvidia cards cost $10K and the 4090 was $2K, so Nvidia is filling the niche at $4K for consumers who don't have $10K but still want the best they can afford, and for whom the 4090 is too slow. If that's true, then it's smart: filling a market gap.
they would say it is actually 170% faster
@@coladict Source?
I feel this is more an AMD thing; they hyped up the 7900 XTX and then it was suddenly just a 4080 competitor :p
I would not spend over a thousand dollars on a GPU, period. Even a thousand is absurd in my mind.
There are people who make a lot of money, so to them it's not really a big deal to spend a few grand every few years. Then there are people like me who will soon sell the 4090, make most of the money back, add $100-300, and get the 5090. I've been doing this every generation.
Brokey
@@Eskoxo Now it won't work anymore ($4K)
There are those who can buy them every day and not feel it, but most of us have a limited budget and, above all, common sense.
@@tristankordek based on what evidence will it be $4K lol
4 grand? 😂 They ain't getting shit from me. Bwahahaha!
Don't think they care.
Rumor right now is $2500 USD
I wonder how far we are from being able to use photonics for the chip to chip interconnects.
Possible RTX 5090 (Highly Speculative):
CUDA Cores: Approx. 28,000 (+70% is a significant increase)
Boost Clock: 2.8 GHz - 3.0 GHz (Smaller clock speed gains are typical)
Memory: 24GB - 32GB GDDR7 (New memory standard likely)
Memory Speed: 24 Gbps - 28 Gbps (GDDR7 offers potential)
TDP: 550W - 650W (Power consumption increases are expected)
RTX 5090 launches at $1799 - $1999 (Highly Speculative)
It'll also cost 70% more than a 4090.
Rumor right now is $2500 USD
I don't give a fuck how much faster it is, I care about PRICE and not being ripped off!
How can you be ripped off?? Do they force you to buy?? If you think it's too expensive, just don't buy it. Why do people cry so much? It's not basic food or water; you can live without it, especially since consoles have years of catching up to do. Just buy a slower card.
You don't have to buy Nvidia products or believe their hyped up lies.
Remember that!
@@ErraticPT I somewhat agree, but what's your opinion of Monopolies?
@@cccalhoun It is not a monopoly; a monopoly is when there is no other good option. People just choose not to buy AMD graphics. That is their free choice.
@@juriscervenaks8953 So it's a duopoly? What if they collude, is that different than what a monopoly does?
Monitor & other peripherals aside, the max I can justify on a complete new pc build is around £1500 (~$1900).
Ignoring that, I'd still hate to see what kind of PSU power and cooling would be required for a dual-chip 5090.
probably get almost 60fps in 4K with max settings with the latest Unreal Engine
Unreal Engine has been ruining a lot for me lately 😂
80fps max in the Harry Potter game because of a CPU bottleneck
Unreal Engine has a CPU problem, not a GPU one
As the advanced packaging capacity is currently a production bottleneck I do not expect it will be used for consumer GPUs.
I would expect a ~50% performance increase, for the 5090, 30% from additional transistors/SMs and another 20% from clock uplift.
No need for NV to go all out, as RDNA4 is rumored not to have a high-end variant. Maybe NV could launch a dual-die 50-series (Titan?) card as part of a mid-gen refresh in 2026, if RDNA5 turns out to be good.
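As a side note, if the 30% transistor/SM gain and the 20% clock gain in this comment multiply rather than add, the combined uplift lands a bit above 50%; a one-line check using those hypothetical figures:

```python
# Compounding a 30% SM/transistor gain with a 20% clock gain.
print(f"{1.30 * 1.20:.2f}x")  # 1.56x, i.e. ~56% if the gains multiply
```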
I don't mind if they sell a $5K or $10K USD GPU, as long as there is a 30-35% improvement throughout the whole stack.
Improvement in pricing upwards perhaps
And 100% more expensive knowing Nvidia. So no need to get hyped...
In blurry DLSS? Or in RTX, which everyone turns off?
What would you consider to be the ideal build for a YouTube content creator?
Thank you!
Cool, finally they've replaced the relatively really old 4090.
God bless.
A serious question: why does Nvidia seem to have stopped actual GPU development? Take away all the AI- and cache-based changes, and what have they done?
Simple answer: a lack of competition breathing down their neck.
Yes, there is AMD, but only on low-to-mid-range GPUs, and Intel is on the horizon for low-end GPUs. So we can expect somewhat reasonable prices in those segments, but not in the highest segment. Every company will squeeze the market as much as the market will allow.
What improvement should be made to GPUs anyway? The only thing most gamers really care about is faster FPS. When Nvidia pushed RT, most gamers shunned it as useless because it tanks performance. Take RT out of the equation and a GPU like the 4090 can give you a bajillion FPS even at 4K. The majority of gamers out there? They're still on their 1080p monitors, so most of them are already doing very well if they can get something like an RX 6600 level of GPU.
@@arenzricodexd4409 AI is and will be a thing.
12:00 I would like the next generation of graphics cards to be a significant improvement over the 4000 series in terms of power efficiency. Ideally, it would consume around 30-50% less power while maintaining moderate performance, within a 10-20% increase over the corresponding 4000-series SKU.
The 4000 series is already crazy power efficient though; you have 4090s running at 95% or so of stock performance @ 360W / 80% power limit, and my 4090 uses less power than my 3080 Ti most of the time... it would be impressive if they managed 30-50% less power, but I doubt it... I just hope they at least keep going in the right direction.
I'm here in my garage with a 1060 and I give 0 f's about new expensive GPUs
Are you looking forward to Battlemage at all?
Thoughts on Intel GPUs?
Salt of the earth
In that case, why don't you save yourself the time and not click on videos about new Nvidia GPUs?
More like 0fps
5700xt still going so stronk
Well, I guess my math is wrong. I know you can't get perfect scaling, but double the transistors for just 35% more performance seems wrong to me.
I suppose it depends on what they use them for; some blocks on the die do work that won't really translate into benchmark numbers.
My bad, I pinned a comment about that. Me dumb.
2x die size for less than 2x performance?
7:40 GA100 is Ampere, not Ada (and the H100 is Hopper-based, not Ada).
I've been waiting for their MCM GPU for years now; I got a 3060 laptop in 2021 or 2022 to hold me over while I wait, and before that I'd been playing games on an Intel 6700HQ laptop since like 2016.
If they can merge two dies, what's stopping them from merging more dies in a 3D stack? Or am I talking weird nonsense now?
Heat removal is probably the largest factor, followed by signal integrity.
@yurriaanvanduyn check my last video
Ada GA100? GA100 was *Ampere*, but the process and transistor count are correct.
I'm not really trying to guess at pricing or how many dies and whatnot. I'll wait until I see it, then decide where to go from there, but normally the new 80-series is around 25-35% faster than the outgoing 90-series, so 70% better is at least feasible.
It could be, maybe. But it won't be.
Why is every single Nvidia product cycle preceded by this exact prognostication?
Why would they need 2 dies for the 5090? Wouldn't a 30% increase in density, paired with 30% faster GDDR7 and a higher clock speed, get you to the 70% figure on its own? Or is performance only tied to density?
As far as I have seen, that seems accurate. I would expect 50 percent at least, but a lot of that will be GDDR7 and density, with clocks making up the rest. I can't imagine the GPUs clocking THAT much higher, personally. Kopite7kimi, whom he mentioned in the video, has been a leaker for a long time and usually has pretty reliable information. I don't see a double-chip design making it to us gamers; in fact, I'm almost certain the 5090 won't be that. No way they give that to us with the INSANE profit margins they get from enterprise cards and hardware.
I would cop a 2 die 5090 in a heartbeat. That sounds insane
Maybe the 5090 will be some dual-chip special, but all the real consumer stuff will be nothing more than 30% more expensive, with the same amount of memory and the same memory bus. If we see 50% performance improvements on each SKU compared to the 40-series, then maybe it will be slightly easier to swallow. But I can see the 5090 coming in at nearly $3,000, especially if it's some kind of dual-chip thing.
While consuming "only" 50W more
10:12 how does this math work? It's more like 170% if you add the 35% increase from each of the two dies on top of the 100% from doubling the original.
Good point
10k$ GPUs lets gooooooooo💰
I do not have $4K, or even $2K.
The real question is: if they release a dual-die gaming GPU... wouldn't that finish off the Quadro line? Also, what about export restrictions to China? As much as I like tech and gaming, decisions are still made based on business needs.
I don't know what you're trying to say at 9:38. A GPU consisting of two dies each having 35% more transistors than a 4090 would have 170% more than a 4090, not 70% more. This point is moot though, as I find it very unlikely that the 5090 will have two dies.
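Spelling out the transistor arithmetic these comments are debating (counting raw transistors only, ignoring real-world scaling losses):

```python
# Two dies, each with 35% more transistors than a 4090's single die.
per_die = 1.35
dual = 2 * per_die                       # 2.7x the 4090's transistor count
print(f"{(dual - 1) * 100:.0f}% more")   # 170% more, not 70%
```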
There will be no dual-die gaming GPU, just as there was no ray-tracing coprocessor.
It probably can be done, but I don't think Nvidia wants to go down that route.
Clearly you haven't watched their keynotes lately. The bridge between the dies is such high bandwidth that software can address them as one single die.
The more you pay, the faster the tessellation 😅
I doubt they'll release such a powerful GPU for simple gaming, since they're so hellbent on AI. After all, they are pitching this to AI companies, not gamers; they'll probably scrape the bottom of the barrel of the silicon lottery to "give" to gamers and sell the good stuff to the AI giants. In simple terms, I'd still expect the 5090 to be 35% or even 40% faster than the 4090, not the professional-level 70% faster GPU promised to companies.
I went from a 980 Ti to a 5700 XT and then to a 3080, so I've been planning to go for the 5090 when it comes, but only if it costs $2,500 or less. Over that, and I wait.
They're only saying that because they'll make something like DLSS 4.0 exclusive to the 50 series and call it a day, and that's a fact. It'll be like 15-20% faster at best.
NVIDIA needs to work on making their CUDA library quantum-computing compatible, and they need to make quantum computers as well...
No surprise here with the die-to-die interconnects, as these and other advanced-packaging features form the wave of the future. Still, the most important transition in gaming and in all computing will eventually be the SoC that incorporates everything, with a large pool of very fast unified memory, similar to what Apple has done with the M-series chips or what AMD is doing with the MI300X. DDR will still be modular and upgradeable; however, it will be treated as a slower tier below the unified memory pool.
If only AMD were as advanced as Nvidia, that would be a really important step forward.
No way in hell they will make 5090 a 2-die.
Joining chips together is cheaper, not more expensive.
I would consider myself a value purchaser. In the Netherlands I paid 1,200 euros for the 4080 last year; the 4090 is about 1,900-2,000 euros. The only reason I needed a GPU that powerful is that I upgraded to 4K monitors. A 5090 sure would be powerful, but it's also not that necessary, and so far from being considered optimal value that I would fail to understand the purchase. I guess if you have the money, more power to you, but I don't really see a real reason to upgrade. Maybe 8K?
If the 5090 is really 70% faster while being on the N4P node, it's gonna be huge, expensive and gobble up tons of wattage.
Lisa Su and Jensen Huang are both of Asian heritage. They are both the same age. They both work in the same field.
There are *_zero_* doubts that they are indeed intercoursing.
They are cousins …
@@marcs2960 🤫 and once removed.
@@marcs2960 anddddd
More like interconnecting!!!!
Sad that it will probably be 70% more expensive as well!
70% scaling for dual GPU is about in line with the best-case scenario back when SLI and Crossfire were a thing. I ran both, and had to do tricks like forcing triple buffering to eliminate micro-stuttering, but got hit with a ~20ms latency penalty. I call B.S. when he says it won't need programming to work; all GPUs have tuned profiles in the drivers to make them work their best. It might not need software changes beyond the usual, but I'm certain there will still be a penalty, and it will show up in frame times and very slight latency.
I might be going nuts in my old age (38), but GPUs used to run dead solid when pegged at 60Hz or 100Hz etc., and now they somehow feel inconsistent, like frame times are off from the readings. And now Intel engineers are confirming my paranoia lol
I used to be excited for GPU releases, not anymore.
Hmmm... guess it's time to update my CV and start applying for a second job...
no competition, high price.
11:56 only if it comes with >32GB VRAM. Other than that, meh, I'm content with my 4090.
They can take their 5090 and stick it 100% not 70% where the sun don’t shine because that’s how it’s going to be priced. F GreedVidia
Is the Coreteks voiceover AI-generated in this video? It sounds like it, in the first half more so than the second. Anyway... I reckon the pricing will not justify the upgrade for gamers when the 5090/5080 are released; only productivity and VR developers can justify the cost.
These are AI chips. Jacketman said it himself - they aren't a graphics company anymore, they are an AI company. Cost can be high if you make $$ with the hardware, which a lot of people will.
70% more expensive. FIFY
Exiting. I’m 100% sure I’ll get this 10 years from now in order to replace my 15 year old one, for a very good price.
By the way, why aren’t they just stacking in top of each other? What happened to that architecture?
4090 cost 1 kidney, 5090 will cost 2 kidneys and a cornea 😢
Looks like Jensen found footage in the thumbnail lol
good news for 4k gamers
Is it actually 70 times better, or is it just because of DLSS?
70%****, which I highly doubt. A 70% higher spec doesn't necessarily equate to a 70% gaming performance uplift.
Tbh I think I could be tempted to spend $4K on a GPU, but realistically $2K max is what I would spend on a NEW GPU. I can't imagine Nvidia releasing a $4K USD GPU for the consumer market other than for AI purposes, because again, the 3090's MSRP was $1,500 and the 4090's was $1,600. So realistically the next GPU will probably be $2-2.5K.
But is the NVIDIA 90 series (RTX 3090, 4090, 5090) a card designed for gaming? It is too monstrous to use only for that; was that really the destiny intended for it? The gaming cards are the cheaper ones: the 60 and 70 series (and their Ti variants), with the 80 as the maximum. What's more, at some point an RTX 5050 (Ti) will come out, and it could be at the price budgeted for building a gaming PC for ordinary people like me.
I would honestly rather buy a $4K 5090 with 70% more performance than the 4090 than a 4090 Ti disguised as a 5090 with 30% more performance at $2.5K.
It's not like the product goes from cheap to expensive; it goes from expensive to expensive. You're paying the price to get the most performance, and in that sense only the dual-die version, or a proper 5090 Ti, deserves that kind of hype or price, to me at least as a consumer.
It makes no economic or business sense for NVIDIA to allocate their precious dies to a 2-die consumer GPU tier instead of charging corporate enterprise customers premium prices for them during the current AI boom.
The entire GTC focus was on AI instead of gaming.
No sane consumer will ever buy a GPU for $4K, or even $2K; with that budget you can easily build an entire desktop computer. At a $4K price even enthusiasts will face a hard decision, but for the professional market that money will not be an issue if the GPU saves time and generates more money or profit.
You haven't heard of sim racing, right? We're buying $3K wheels, $2K pedals, triple 4K monitors... Or flight-sim enthusiasts. So yeah, I can see people buying a $4K GPU just for gaming. Edit: but you are probably right, the dual-die config is for enterprise customers.
You're most likely right but let's not forget Nvidia is still charging the same amount for Blackwell Data center despite doubling the die size.
Supply may not be an issue too since they are sticking with the 5nm family instead of moving to 3nm.
@@illumi3604 But isn't packaging currently the bottleneck at TSMC?
If the 4090 uses 400 watts of electricity, would the 5090 then use 800 watts? This blows my mind, and with electricity prices always on the up, running this card 4 hours a day at today's prices costs about $4, so my annual electricity bill goes up by $1,460. Americans might have cheaper electricity, but where I live it is prudent to constantly monitor for lights left on and air conditioning that could be turned off. I'll pass on this portable heater.
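For anyone wanting to plug in their own numbers, a small sketch of the annual-cost math; the wattage and rate below are assumptions, not measured figures:

```python
# Hypothetical annual running cost for a GPU at a given power draw.
def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 800 W for 4 h/day at $0.50/kWh (an expensive market) is ~$584/year;
# the ~$1,460 figure above implies a rate of roughly $1.25/kWh.
print(f"${annual_cost(800, 4, 0.50):,.0f} per year")
```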
That's just not happening. Perhaps in certain use cases like full on path tracing
I will be getting the 5090 day one. If it's $2,500 or more I will probably pass, but up to $2,500, as long as it's roughly 50 percent more powerful than the 4090, I'm in. I put 1,500-plus hours of gaming into my card over the year, so I can justify paying a higher price, though I won't be happy about it. Even $2,500 might be my cutoff; I'm not sure I could justify spending that much. Spending $1,500 for a non-top-tier card is also a no-go. After last generation's debacle, my guess is the 5080 won't be more than $1,000, and I can't imagine them trying to sell a 5090 for more than $2,000. My hunch is the 5080 at $999 and the 5090 at $1,999, which lines up nicely with what you are saying. But the 5090 had better be 50 percent better than the 5080 if it's going to be that much more expensive.
Can't see any need for it, as the 4090 is more than enough for the garbage games currently being distributed.
Wouldn't adding another chip make it another 135% faster, so the total gains should be 170%, right?
No one has mentioned power draw? 400W + 280W (70%) = 680W
That's mental.
$4k cost and 2k Watt power supply...
I cannot imagine nvidia doing a two-die 5090. Maybe the Titan branding will make a comeback, though? That'd also be easier to sell than a >$4K 5090.
I don't even want to think about power consumption, since 4NP doesn't really give much in the way of efficiency. Nvidia will have to go to 600W and probably still downclock compared to the 4090.
Honestly, almost the more interesting question is if nvidia will give us a third generation capped at 24GB VRAM, or if we'll see more. Personally I think they'll keep consumer parts at a maximum of 24GB to make sure they don't cut into their sale of professional cards for AI workloads.
If cost per frame stays about the same I don't doubt that they could sell $4000 or even $8000 GPUs. Especially if 360-480 Hz 4k OLED displays get released soon.
Even among my acquaintances I knew a guy who always tried to go for SLI setups when it was almost never worth it.
$4000 for just a GPU? If I were a professional who required such equipment, yes, I'd have one or more. As a gamer? Hell no.
Maybe a Titan version (and other professional cards) of the GPU would cost $2,000+,
but I don't think the 5090 will cost much more than the current 4090 (maybe 5-10% more, e.g., $1,699 or so).
Don't forget that's for gaming (and RTX 4090 owners are probably happy with what they've got; besides, I don't see AAA games on the horizon that would drive the promotion of that GPU... and The Witcher 4 is much later...)
@Coreteks - Are you using AI-generated narration? It sounds like you, but kind of generative.
@Druze_Tito I've always been an AI
@@Coreteks I've been following you since the beginnings and I can negate what you say, dear "AI" :)
Seems like I have to sell body parts to get what I want.
It's hard to justify a $500 graphics card, much less a $4,000 card.
Coreteks, beware of bots in the comments
If you're too broke to justify $500 GPUs, then it's time to get a console, bud.
Haha, an $8K top Nvidia "mainstream" GPU makes perfect sense. There will be people who buy it, more so if you get around 50 GB of fast GDDR with it.
They might actually have designed something with HBM for a double-die chip with that much performance and cost. But more, cheaper memory makes sense for the people buying this stuff for AI and GPU rendering.