@@AmUnRA256 5070 and 4070 definitely look like older 60-class GPUs. 5080 looks a lot like an 80-class to me. The 5090 is weird because it's their first 512-bit bus GPU in almost 20 years, so it's beyond the 4090, 3090 and old Titan cards. There's a big 4090-shaped hole in this lineup where they could have another GPU with 128 SMs and 24GB on a 384-bit bus.
@@coryray8436 Did you see the difference between the 5090 and 5080?!?!? If the leak is to be believed, which part of 50% less VRAM and 50% fewer cores says 80-class GPU to you?
Why would they? They don't need to win any popularity contests and they don't have any competition. Jensen is gonna rob PC gamers blind this generation... again!
It's sad, no, maddening actually, that it has come to this, but yeah, old CPUs and now even older GPUs are fine for many people's needs, unless they want to be $#tards and drive 4K at Ultra at 100 fps! I say downgrade to 27-inch 1080p monitors and stick to older GPUs!
@@50-50_Grind True. I only upgraded because I wanted to use Stable Diffusion for a project I'm working on. It doesn't work on older AMD cards. The 3060 had 12GB VRAM, making it half the price of the next "budget" 12GB card.
NVIDIA has no reason to care. They're at all-time highs, AMD competition is dead, Intel GPUs are a joke, and AI data centers will eat up wafers at any price since tech companies have infinite money. GeForce is just an annoyance for NVIDIA at this point.
16GB should have been standard with the 3000-series graphics cards. 😐 And it looks like it's not even going to be with the 5000 cards! The 3070 and 3070 Ti are perfectly capable cards, but are significantly dragged down by having just 8GB. And 12GB is an aberration. It means a 192-bit bus, when all 70-class cards had 256-bit for more than a DECADE, since the GTX 670! 😠
@Heinz76Harald The desktop 4080 had a 256-bit bus: 8 x 2GB chips, so 16GB total. With 3GB chips it would be 8 x 3GB, which would give 24GB. The laptop 4080 uses the desktop 4070 die: 192-bit bus, 6 chips, and 6 x 2GB gives us 12GB of VRAM. With 3GB chips, the laptop 4080 can have 6 x 3GB, which will give us 18GB. Perfectly adequate for 1440p gaming.
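Chip count is just bus width divided by the 32-bit channel each GDDR chip occupies, so the capacities above are easy to sanity-check. A minimal sketch (Python; the bus widths and chip densities are the ones in the comment):

```python
# VRAM capacity from bus width and per-chip density.
# Each GDDR memory chip sits on a 32-bit channel, so chip count = bus width / 32.
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_gb(256, 2))  # desktop 4080: 8 x 2GB = 16GB
print(vram_gb(256, 3))  # same bus with 3GB chips: 24GB
print(vram_gb(192, 2))  # laptop 4080 (4070 die): 6 x 2GB = 12GB
print(vram_gb(192, 3))  # with 3GB chips: 18GB
```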
Techtubers with influence need to bluntly tell people not to buy certain Nvidia cards. I am more than happy with my 7900 XT, especially since ray tracing is still just a gimmick for the most part that tanks your performance.
@@pituguli5816 And those same tech-tubers have the audacity to tell us that they're 'just like us', or 'we understand you'. ... Pfft, _no_ they do not. Most of 'em are just a bunch of corpo sponsor teat suckers who will look the other way once they see that fat check clear their bank accounts.
That's like saying car YouTubers should start telling their viewers not to buy a Ferrari. People who can't afford it won't buy it, and those who can and think it is worth it will, just as it has always been.
@@searingsword-4-21 My friend has a Ferrari. I drove it; nothing special, not worth $745k, just like the 5090 won't be worth thousands of fiat. This is coming from someone who can afford to buy multiple 5090s. I'm just not a numpty.
On the flip side, I bought a $300 second-hand 3090 with 24GB VRAM. The 5090 at a projected $1800 is only 40% faster according to these charts. There is a HUGE gap in pricing between the used and new GPU markets, and it's a waste for gamers not to utilize it.
AMD and Nvidia most likely have a non-compete agreement to allow Nvidia to occupy the halo slot. That way, they set the prices and AMD gets the profit margin benefits while maintaining the status quo. When they had a node advantage, they deliberately stayed off the top performance tier. I gave up on this hobby, it's full of useful idiots that reward this kind of anti-consumer behavior. Still like to support this channel, though.
AMD simply doesn't make these giant GPU dies like Nvidia does. AMD regularly comes close to Nvidia performance with only half the silicon. They don't see going with big dies as viable.
Having a written agreement like that would actually be criminal and prosecutable, BUT AMD's behavior, and Nvidia's too, has definitely been very suspect at times. In fact, plenty of times. It really seems like AMD does not want to take any market share away from Nvidia even when they had clear opportunities.
The messed-up thing is that Nvidia will more or less be forced to "compete" on the low end, since there is legitimately some competition at that point, but anything north of whatever they price the 5070 at is gonna be even further inflated. The VRAM numbers are pretty sad on everything other than the 5090. Since most ray tracing sucks, I don't regret buying my 7900 XT.
It probably won't burst. It's so fresh that there are still oodles of things to discover, and when all of that is discovered we'll use it as a stepping stone into the next level and keep innovating.
@@yomatokamari You are absolutely mistaken. AI is already being used to increase productivity in multiple areas. AI is already being used to review police body cam footage and write police reports. This is just one example, but there are a lot of other ways it's being used right now.
At some point… the entities that endlessly buy up Nvidia GPUs (Amazon, Microsoft, Google, Meta and every cloud provider of GPUs under the sun) will have to actually make a return on that investment, otherwise they won't be able to buy more and more cards. Simply put, the AI products need to make a return to justify the investment in hardware. The most obvious example of this would be OpenAI… at some point THEY NEED to make money and actual PROFIT from their AI products, otherwise they will go bust from their electricity bill alone.
It's been 6 years since my last major upgrade. I think with these prices I will switch to a 10+ year upgrade cycle once I upgrade to the 5090, unless future generations drop to half the current prices, inflation included.
Everyone and their mom are going to be trying to buy a 5000 series GPU at launch before any tariffs are implemented, so unless you are lucky and able to get one early, you could easily end up with no GPU until mid 2025 when you are finally able to buy a 5090 for $2500-3000 after tariffs.
With incoming US tariffs on Chinese imports, trust me, the 4090 is gonna be worth a lot more in a few months even with the 5090 available (tho the 5090 will sell out for at least 6 months at any price).
Your price and performance predictions are normal from one generation to the next. But I disagree that Nvidia doesn't have competition. The people suckered by Nvidia marketing tactics are the real problem. The 7900 XTX is 80% to 110% of a 4090 in rasterization performance but at less than half the price... that's a deal/steal. Nobody really cares about RT, DLSS or any of those other gimmicks.
Problem is, the 7900 XTX is a $1000 GPU, not a $200 one. When you pay that much, it had better excel at everything. Pay $1000 and it still can't be no. 1 at everything? Some people do not take that well. Same with the 4080. That's why people opt for the 4090. Plus Nvidia is able to retain its used price better.
I don't know why everyone thinks GPU prices need to go up every year, that's crazy. Soon an 8050 will be $1k. Nvidia just will not be an option for people at all at this rate.
€1200+ including VAT for the cheapest Biostar/Manli/Inno3D card with the most underwhelming heatsink, a single noodle-sized heatpipe and two of the cheapest $0.50 fans. Great value...
I paid €500 for a 1070 Ti during the 2018 crypto boom, which was already arguably too inflated. So from that we're looking at what, a 100%+ price markup on the new xx70 Ti class? Lmao.
But who is buying those $2000+ GPUs for gaming? Besides a few pro teams that make a lot of dough, who else? I can't believe there are MILLIONS of users/gamers buying them.
Less than 1.3% of gamers on Steam bought 4080s and 4090s; you are right that the majority of PC gamers aren't buying them. Gamers aren't the demographic for Nvidia's chips nowadays; they really don't care if gamers buy them. They will ride this AI boom all the way till the corpos cotton on that there is no money in it for them; so far Nvidia is the only one making a motza. As for us? Well, I'm picking up a 7900 XTX this week, bro, and I am DONE with buying anything for at least 5 years, which suits me fine since all I play is D2, DRG, SOT with indies and some AA games. I haven't bought a AAA game since 2019.
The funny thing is that we are lucky to even get 50-series GPUs at any price this cycle. Datacenter makes Nvidia 10x more money, per their financial earnings reports, so consumer GPUs are barely a drop in the bucket to them.
If you think that you're lucky to be able to buy it, instead of a company thinking it is lucky to be able to sell it to you, then you already lost and are part of the problem. Sorry for the come-to-Jesus moment.
@@Terrylovesjougurt I will pay any amount for a GPU. Running local AI models makes my work easier and saves me time, so the return on investment is huge for a lot of folks.
@@thenextension9160 Well, then this isn't about you. And by "any" you mean as long as it's profitable. Too bad in many cases (possibly not yours) this cost will be transferred to consumers, who may not even want any AI features that will be crammed into everything, artificially (pun intended) ballooning the price. Be ready for AI-enhanced asparagus.
I'm about to hurt some more feelings, but everyone needs this economics lesson. I built a top-2% gaming PC in 2016 that was very expensive, for 4K ultra gaming. The video cards were 3x 980 Ti Classifieds OC'd to 1500 MHz with an OC'd 5960X. This yielded a Fire Strike Ultra 4K score of 11405. It cost $650 per card plus $180 for each waterblock (2016 dollars). I currently have a 600W watercooled 4090 overclocked to 3025 MHz with an OC'd 7800X3D. This build puts down a Fire Strike Ultra 4K score of 25568. The watercooled 600W 4090 cost $2250 (2023 dollars). Usually video cards gain about 15% performance per generation. I think it'll be about 25% this gen. If we make the conservative estimate of a 15% perf uplift, we get 29,403 for $2500 (2024 dollars).

So let's compare price to performance and ADJUST FOR INFLATION (the step no one wants to do). We have to pick a year, so to drive the point home I'm bringing everything to 2024 using the government's inflation numbers (which are manipulated down, but that's another tangent).

The 980 Ti SLI cost $3,274.92 in 2024 dollars (31% inflation) for 11405 performance (or 3.48 pts per $).
The 4090 WC cost $2,330.95 in 2024 dollars (3.6% inflation) for 25568 performance (or 10.97 pts per $).
LOW (+15%) *EST* 5090: $2500 in 2024 dollars (0.0% inflation) for 29,403 performance (or 11.76 pts per $).
HIGH (+25%) *EST* 5090: $2500 in 2024 dollars (0.0% inflation) for 31,960 performance (or 12.78 pts per $).

So what does this all mean?? TLDR?? Video cards are cheaper and more powerful than they have ever been. We live in a golden age of computer performance that would have seemed preposterous not even 10 years ago in 2016, with a SINGLE CARD. Back then, if the game wasn't optimized well, you'd get 1/3 or 1/2 of your total performance. I would have to debug a game for about a week before I could play with my friends. Play a game on launch? Probably not. And in most games you'd have to dial back settings to medium or low to get 4K to run a reliable 60fps. Now I can run ANY game at 4K ultra and usually get about 120+ fps. I run circa-2016 games on THREE 144Hz 4K 43" monitors and get a solid 144fps with a 160-degree FOV.

The real problem is governments and central banks inflating currency, which drives down the value of our money (which we trade hours of our lives for). The last massive surge of money printing has raised cumulative inflation by over 100%, doubling the cost of all goods. This rapid change in money value kills us consumers because our pay is the LAST thing to go up. First the cost of raw materials goes up. Then the cost of completed goods goes up. Then, begrudgingly, the cost of labor (our pay) goes up (because if employers jump the gun on pay, a competitor who can hold out a few months longer will murder them in the market, and the company could suffer a short-term shock that cripples them). The effect of the stimulus is that every single person living in a country that deals in dollars took about a 50% pay cut. Ask yourself: if your pay doubled, would a $2500 card seem outrageous?
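The same pts-per-dollar arithmetic as a quick sketch (Python; all figures are the ones quoted above, including the commenter's inflation-adjusted prices and estimated 5090 scores):

```python
# Inflation-adjusted price/performance, using the figures from the comment above.
builds = {
    # name: (price in 2024 dollars, Fire Strike Ultra 4K score)
    "980 Ti SLI (2016)": (3274.92, 11405),
    "4090 WC (2023)":    (2330.95, 25568),
    "5090 est. +15%":    (2500.00, 29403),
    "5090 est. +25%":    (2500.00, 31960),
}

for name, (price_2024, score) in builds.items():
    print(f"{name}: {score / price_2024:.2f} pts per 2024 dollar")
```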
Anyone who buys a 5070 or 5080 with a measly 12GB-16GB of VRAM deserves to get ripped off like a gullible sucker. ... But seriously, tf is with that performance gap between the 5080 and 5090?! It's basically half the GPU.
I would never buy any Nvidia product ever again; I'd rather quit gaming. This has been the case since the 20 series, and I still play well with AMD at the moment, but I hope that Intel will become strong one day.
"ppl should stoo giving Nvidia money" I want the best GPU, I want best power efficiency, best day 1 drivers (or day 1 drivers at all on a consistent level), I especially want DLSS upscaling, DLSS FG, I want the whole Nvidia suite like RTX HDR for example, I want to use the most graphically intensive settings to get the best visuals you can possibly get (even if I would buy in the 600-1000€ range, I obviously want to use as many of the best features as possible. and when I use upscaling, I want the best form of it), and because of all that I will def and absolutely and obviously buy an RTX 5090. As said even if I could afford only like 1000€ for a GPU, I would never buy a 7900XTX instead of a 4080. I play at 4k 240hz. You WILL use upscaling, especiallx at 4k. And bc you can use DLSS upscaling at Quality level, you will ALWAYS activate it bc you cant see any visual dow grade to native 4k. often its looks BETTER than native with TAA!! If I had to pay 1000€ for a XTX and then everytime I would avtivate FSR, I would know that I have literal worse image quality compared to a GPU running DLSS. And if i buy a high end GPU, you can be damn sure that I want to have all bells and whistles and want my GPU to handle ultra intensive but game changing features like Path Tracing in Alan Wake 2 or Cyberpunk for example. And there will be more and more games that use that. AMD makes the best CPUs atm. In fact I own a 9800x3D pc since 2 weeks. But for GPU I just wont buy AMD until they are EXACTLY as good as Nvidia in ALL cases. This wont happen bc Nvidia wont allow that. So since I buy ultra high end (90 class), I can buy only Nvidia anyways. AMD is interesting in the ultra low end to mod range of 200-500$ range. but 4070 and up I would def not buy any AMD GPU. Nvidia costs more, they can ask for more bc you get more.
@@pituguli5816 wtf are you talking about? I objectively laid out why I'm buying Nvidia GPUs and not AMD. And I also objectively made my decision about the best CPU, according to benchmarks, which is the 9800X3D. In GAMING.
Rough times ahead? As if the GPU market hasn't been a garbage heap for the past decade. Also, none of this matters. Consooomers will slop it up anyway, further reinforcing the problem.
I knew it was only going to get worse. Bought a 4080 and left it at that; it will have to carry me for the next several generations, since all future NV GPUs will become even less appealing.
Nvidia is about to leave a gap in the market with their expensive pricing. This is the perfect time for AMD to come in with the RX8000 Series to fill that gap.
@@rogerk6180 high end. But yeah I wouldn't be surprised if they threw in the towel in a decade or so. We deserve the monopoly that we're willing to buy into.
Yeah, AMD has only ever matched the pricing that Nvidia sets whilst giving you a little bit of extra performance per dollar. If Nvidia increases their pricing then AMD will increase their pricing. AMD CPUs, for instance, are now very expensive. AMD is just as bad as Nvidia.
No, they're just going to make shitty 200mm² die cards which no one will buy, because all of the influencers and e-gamers are going to use 5090 Tis, and Nvidia has better upscaling and other technologies.
Since WHEN has Nvidia cared about gamers? Don't be confused. We buy their products because they're top in their category, not because they care about gamers. Same goes for AMD vs Intel in CPUs.
People really need to stop giving Nvidia money.
They definitely won't
@@BlueMax109 True, unfortunately.
But we need faster cards.
I'm starting to reach a point where I can't even if I want to. Prices are getting so ridiculous that instead of upgrading every generation like I used to in the pre-crypto days, it's now skip a gen, and if they keep bumping prices up it will be skip two gens.
I'm not paying for a GPU that costs as much as the rest of the PC.
People really need to stop bitching about the price of every new tech product. The GPU market has changed. There is a ton of competition for cutting-edge process nodes. Nvidia can sell GPUs to data centers and enterprise clients if you don't want to pay for it.
People normalized BS from companies.
Well not people, but the 60-70% of sheep out there.
@@Thor_Asgard_ So, sheeple. We all know what he means.
Easy to blame people who have a face rather than faceless companies who are simply participating in capitalism. It's a monopoly.
@@BleedForTheWorld Please learn what a monopoly actually is. I challenge everyone who would like to buy a new GPU this upcoming generation to go with AMD. They actually can.
@@Terrylovesjougurt it's a monopoly by design. Do you really think that Lisa Su doesn't own shares in Nvidia?
No, you actually cannot buy AMD when it comes to the upcoming 5090 because there is no competitor.
You deserve what you tolerate.
Does this also work for pain?
Unfortunately the market is often driven by what other people will tolerate, and the rest of us have to put up with it regardless.
@@krazed0451 We can also just exit the market.
@@50-50_Grind 😂
@@50-50_Grind
Yeah, it's called Stockholm syndrome
$800 for 60 class silicon. Man these guys are insane.
What's more insane is that people will still buy them and Nvidia will profit a lot.
Is it so insane now?
@@Alakazam551 it's even more insane now. That sounds like a cult.
It's the consumers that are insane. Nvidia just does what any business does and prices accordingly.
This has been a crime since the RTX 2000 series, strange that people have only just started to be bothered by it.
Nvidia could make dumb gamers buy anything.
16GB in 5080 is a crime.
Indeed 24GB should be the minimum
Then don't buy it!!
Really? I feel like that’s the perfect amount.
This entire lineup is bogus. Just go used if you really need a GPU.
they're saving it for the 5080 super ti ultra pro diamond titan platinum card
5060 with 2GB of VRAM. That'll be $1000, plus a monthly subscription of $25 to enable your GPU coolers.
I like it! You should be working for Nvidia friend 👍😊
@@bulutcagdas1071 and Nvidia can turn it off remotely if your payment is late 🤣
They're taking the BMW approach.
HDMI port pre-enabled free of charge, rest of the ports subscription-based.
@NotAnonymousNo80014 HDMI port only enabled for 1080i for free 😈 they will love that interlacing 🤣🤣
I don't know much about business but having a 60% net profit margin is actually crazy as far as I know
My parents and grandparents both own businesses. My parents run at 28% net profit, my grandparents at 32% net profit. 60% really is an insane figure.
EDIT: I should add that in the field where my family runs their businesses, these figures are towards the higher end too.
@@Christopher_S Yep, always predict around 30% for a healthy business.
All tech is way overpriced
Gillette razor blades have a higher margin
The margin being reported by the company is not net profit. Back when AMD was still in the red every quarter, their margin was something like 40%.
Criminal that I can sell my 2 yr old 4090 for more than I bought it for ($1700).
Feels like this will kill (or severely reduce) PC gaming
but should you?
Same here. We'll see how the 5090 and 5080 will impact the 4090. If there is no new sexy feature in the 5000 series, the 4080 and 4090 will retain their value for even longer it seems.
You can't 😅
I got mine for about the same, like $1650 before taxes. Is it inflation or actual scarcity?
I read this as “criminal that I can sell my 2 year old.”
I'm looking forward to the 8800XT at this point.
AMD cards are perfectly fine if you don't care about maxing out ray tracing. Even right now a 16GB 7800 XT can be bought new for $450; that's a hell of a lot of bang for the buck.
@@Flyon86 AMD says they're not going for the high-end market, but I find that claim questionable. Intel was best at gaming until it wasn't, but it still stayed ahead in productivity. Then AMD sneakily improved AVX-512 performance between the 7000 and 9000 series. Now everyone wonders which AMD chip to buy while Intel's offerings literally melt. People don't seem to realize that AMD is using the same playbook on Nvidia: catch up in terms of raw compute, let them have their better accelerator chips, then eviscerate those without so much as an announcement. I expect that in the next 5 years Nvidia will be scrambling to figure out how AMD passed them in performance.
@@Mavendow Idk, they skipped high-end cards for the 1st RDNA generation, so they just might do that again. When it came out, the 5700 XT was a pretty good deal for the money, and a good deal for the money is all they really need, since Nvidia seems hell-bent on scalping the gaming market for every extra cent right now lol.
*Gaming is dead* ☠️
Gamers are no longer the target audience
Eh, it's not like games are really pushing graphics anymore anyway. Cyberpunk is 4 years old and still looks better than basically anything. AAA devs are just using the increasing power of GPUs to get lazier with optimization.
People will stupidly pay the markup with a smile, as they’ve been doing since 2020.
Since 2016 you mean. Probably even longer than that 😂
@@xthesayuri5756 I know for a fact since 2011. Nvidia didn't even have the best card then.
@@xthesayuri5756 The 980 Ti was the beginning of steady markups; before that it was up and down with halo GPUs.
What markup? 4090 was actually quite good value, at least if used for content creation. For gaming, well, less so.
@@zapador Content creation? You mean making dogshit 4K videos no one needs, or VR content 7 people are engaging with?
People seem not to understand that these GPUs are scraps from the big AI chips that didn't pass QC.
Would you sell a chip at 2k when it can be sold for over 10k, with people knife-battling over it?
That's fine, but if they are scraps then they can give us cheap prices for the leftovers.
@@megurinemiku8201 They can't sell these for servers at 10k because these dies are not working... You make no sense.
@@ShaneMcGrath. It's already insanely cheap compared to their AI chips lol.
@@Androidonator all chips are “scraps”. They make one die and the ones that need cores disabled get binned lower. It’s all the same silicon but if 50% of cores are bad that’s a 4070. If 95% are good it’s a 4090. That’s how all CPUs work as well.
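A toy sketch of the binning idea described above (Python; the yield thresholds and SKU labels are made up for illustration, not Nvidia's actual cutoffs):

```python
import random

# Toy model of die binning: dies come off the wafer with a random fraction
# of working cores, and each die is binned into a SKU by that fraction.
def bin_die(good_core_fraction: float) -> str:
    if good_core_fraction >= 0.95:
        return "top SKU (x90 tier)"
    elif good_core_fraction >= 0.75:
        return "mid SKU"
    elif good_core_fraction >= 0.50:
        return "cut-down SKU"
    return "scrap"

random.seed(0)
for _ in range(5):
    frac = random.uniform(0.4, 1.0)  # fraction of cores that passed test
    print(f"{frac:.0%} good cores -> {bin_die(frac)}")
```

(In reality the 4070 and 4090 use different dies, but binning within a single die family works roughly like this.)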
I'll just stick with my GTX1060 @ 1080p until sanity is restored.
If you are not a graphics addict, you could still do that.
Otherwise, the RTX 30xx gen still seems almost reasonable in terms of TDP, performance and (hopefully soon) price.
Well, to be completely honest, I doubt the GPU prices are going to drop anytime soon. This is the sad reality we live in.
@@derpsterPRO I ordered a 4060 yesterday🤭
Same, although I have a 5700 XT. I was fortunate to build my new computer the year before the scalpers struck the market and people were dumb enough to buy at the inflated prices.
I can still run anything I want with really nice settings at 1440p. I'd dearly love a gfx card with oodles of RAM to play around with AI, but apart from that this machine I built in 2019 does everything I need. I refuse to buy at these moronic prices; at least CPUs are sanely priced.
The good news is, if AMD and Intel bring out great cards and compete heavily in the low to mid end of the market, prices will be extremely competitive and you will get very good value for money. And the top end? Well, the top end will fall and prices will tumble in response, or Nvidia's sales will plummet.
Now, I do believe AMD will target the top end, but only when UDNA is ready, which will be 2026 at the earliest, most likely 2027. By then maybe sanity will have at least been partially restored.
Never going to happen... Every year the same story... Too expensive.
600W for a 5090. Damn. That's going to heat up a room.
A 13900K and 4090 setup is uncomfortably hot in the summer. I put my system in a different room because of that. It can heat up the other room as much as it wants.
@@LiveType I have the same issue.
That's like half the power draw of a fucking iron
That's half the power of an induction cooking stove. Imagine running it for 4 straight hours a day.
And here I thought the 225W for my 5700 XT was crazy.
The specs of the "5080" make it seem like it should have been called the 5070.
Don't worry, Nvidia will release a 5080 Super, then a 5080 Ti, 5080 Ti Super, 5080 Ti Superduper... As long as fanboys keep cheering for their daddy Jensen, he will milk them.
Nah, the 80 series used to be the top one; that is now the 90 series. The new 80 series is what used to be a 200-dollar 60 series.
Seriously what even is the difference between the 5080 and the 4080?
@@iamjacobnz No, that's not true. The X80 card was never the largest die in its uncut form.
Paying $400 for a 4090 with a blown 5V fuse and soldering a new one on: priceless.
Well played!
You just paid the right price that GPU should have been, lol.
Nvidia builds that GPU for $200 and sells it for $2000. Fanboys are just too dumb to understand that.
@@parm2-x7h You are right, but it's not that low; more like $500.
Lol genius.
@@pedroferrr1412 The bill of materials is, in fact, estimated at around $200. There are other costs ofc (marketing, support, etc.) but the card itself is only that much.
Considering how cut down the 5080 is compared to the 5090, I cannot see it being faster than the 4090 if the 5090 is just 30-35% faster.
The 4090 is up to 35% faster than the 4080 in heavy games, and around 25% on average.
Either the 5090 is 35% faster than the 4090 and the 5080 is at parity with the 4090, or the 5090 is 45-50% faster than the 4090 and the 5080 is 10% faster than the 4090.
Just buy an RTX 4090 and cruise for the next 5 years.
Nvidia would not want export restrictions on anything below the 5090. Regulations will get adjusted eventually but right now they'd need to keep it around the performance of the "4090 D". They'll use efficiency gains and whatever new gimmicks they have to justify it.
Why would I buy an RTX 4090 for $2000 now when I can get a 5090 for the same price at launch (it'll increase and stay high 1-2 months after launch till it's discontinued) and use that for 2 years before the 6090 launch?
Not necessarily. It might just not scale well at very high CUDA core counts.
@@justhomas83 It's been my GPU for the last two years. I'm definitely not switching; it doesn't seem that the 5090's generational leap will be as big as it was from the 3090 to the 4090.
But if I were to get one, I'd wait for the 5000 series to release; the 4090 price will drop... at least if the 5080 is faster.
If the 5080 isn't even faster, with a laughable 16GB of VRAM for $1200-1300, and there is a GPU shortage due to mining and AI... I bet the 4090 will still sell expensively.
At this point they may as well just charge $1,200 for the 5060.
Nvidia will probably bring a new frame generator that triples the FPS output and claim that the 5000 line is much faster than it really is; influencers will sell this crap and the normies will buy into it.
I was thinking exactly the same: they will improve the quality of FG and use that improvement to offer 3x FG. It's perfectly doable, as shown by Lossless Scaling, where you have 3x FG, and if you apply it to a game running at 60fps, the resulting 180fps output looks quite good. And that's just with a software FG, which has no data on the motion vectors. So yes, this will be the thing... And then they will market it like a powerhouse (of fake frames).
I mean, I've used frame generation and it's not bad at all if used properly; that means using it only if you have at least 45-50fps without frame generation. It's not bad at all, especially AMD FSR 3.1, which allows you to enable frame gen even on older GPUs.
@@luigicorciulo8190 Frame gen can work depending on the game, however even when you're already getting 60fps baseline the latency can make some games feel weird.
don't forget even worse latency, just like current framegen
@@Koozwad It's still very much playable; in fact, even some competitive multiplayer titles have integrated FSR 3.1 frame gen. The games I've used frame gen the most in are Bodycam, God of War Ragnarok and Cyberpunk, and I had no issues aside from some occasional little visual artifacts. Bodycam is a multiplayer game and it is difficult to run even for a 4080 without using frame gen. I think a lot of people are throwing shit at frame gen without even trying it or watching reviews of the tech. I also don't like the trend of people always complaining: when a new feature requires upgrading hardware they complain that "they could just make it software but they won't, to lock us into buying new hardware", but when instead it's a literal button to double the FPS, they complain anyway. Btw, God of War Ragnarok is good if you want to test them all side by side yourself like I did, because it includes AMD FSR 3.1, Intel XeSS 1.3 and Nvidia DLSS 3.7 frame generation and you can switch between them mid-game.
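To put numbers on the frame generation discussion above, a toy model (Python; the one-frame-of-buffering latency figure is a rough assumption, not a measured value):

```python
# An N-x frame generator multiplies displayed frames, but input is still
# sampled at the base render rate, and FG typically buffers an extra frame.
def framegen(base_fps: float, multiplier: int) -> tuple[float, float]:
    displayed_fps = base_fps * multiplier
    added_latency_ms = 1000.0 / base_fps  # ~one extra base frame of delay
    return displayed_fps, added_latency_ms

for base in (45, 60):
    for mult in (2, 3):
        fps, lat = framegen(base, mult)
        print(f"{base}fps base, {mult}x FG -> {fps:.0f}fps shown, ~{lat:.0f}ms extra latency")
```

This is also why the 45-50fps floor mentioned above matters: at a low base frame rate, each buffered frame costs more milliseconds and the interpolation artifacts are more visible.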
The MSRP of the 4090 was $1600.
The 5090 has a 22% larger die than the 4090.
Meaning the price will also go up 22% or more.
1.22 x 1600 = 1952.
A $1999 MSRP for the 5090 is all but confirmed.
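The comment's assumption, price scaling linearly with die area, as a one-liner check (Python):

```python
# Check of the die-area -> price scaling assumed above.
msrp_4090 = 1600       # USD, 4090 launch MSRP
die_area_ratio = 1.22  # 5090 die ~22% larger (per the comment)

print(f"${msrp_4090 * die_area_ratio:.0f}")  # -> $1952, close to a $1999 MSRP
```

If anything, large dies tend to cost more than linearly per unit area, since yields drop as die size grows, which only strengthens the "or more".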
To all who kept feeding the monster with their money - I hope you enjoy this one.
Lol. Most people who buy these GPUs aren't even gamers. Do you think gamers buy a GPU every year/new release? Haha!
@@sfguzmani Some guys with money in the enthusiast crew definitely do; they actually find the performance of new games fine and don't want guys with sub-x90 GPUs to play with max settings/4K/RT etc. These guys tend to have stock in Nvidia, so by pushing up the prices of the low end and mid-range, they're actually getting a net profit off of it.
@@gameurai5701 Well, usually I buy 2 GPUs each generation (son and myself), but this time... enough is enough, the prices have become ridiculous! And I'm not rich.
@@pedroferrr1412 Interesting...
People blaming tech companies for the prices is silly, as the reality is that those who bought from scalpers caused this. Why should AMD/Nvidia release a card for £500 when scalpers buy them up and people are dumb enough to pay them £2000? Why should scalpers make all the money when all they did was be scum? Your life would not have ended if you didn't have the latest gfx card. We would still have very powerful and reasonably priced gfx cards, but for some insane reason people bought them at crazy prices, and this is where we are.
Hoping for more competition from Intel to bring down the prices of previous-gen products. But I have been disappointed for the last 7 to 8 years.
None of these companies will have a GPU faster than a 5070ti.... So maybe you can get a 5070ti for a bit cheaper than what Nvidia sells it for, but with worse raytracing and upscaling. If the 5070ti is roughly the performance of a 4080 super, then that's probably all you will need even for 4k gaming. If you are looking for cheap, you can probably get a used 30 series card cheap right now.
@johnc8327 There are no used 30-series cards here; if anything is available, it will be as costly as a new one.
@@johnc8327 A 4080 Super isn't all you need for 4K gaming if you're playing new games.
Even in 6-year-old games like Red Dead 2, if you crank to ultra and run native 4K, I drop to 55fps in the city.
Stalker 2: again, an average of like 55fps.
Cyberpunk 2077: even at 1080p with Overdrive mode, the 4090 only gets 45fps.
Black Myth: Wukong at 4K ultra only gets 23fps on the 4090.
So sure, if you're playing easier-to-run games or games that are 6+ years old, you can get over 60fps at 4K ultra. But it's definitely not "all you need"; there are plenty of games that demand more. Most new AAA games, and any from the last 2 or so years to be honest.
And you can argue "you don't need 4K ultra though"; I agree. But you don't "need" a $1000 graphics card either. And you are the one that said 4K gaming, so don't say "just turn the graphics down".
"from Intel"... kekw
I absolutely think 5090 will be MINIMUM $1999. But nobody knows for sure because Jensen famously decides on the price moments before going on stage.
The MSRP doesn't matter anyway. In the end, supply and demand sets the price. The 4080 Super has an MSRP of $999, and prices vary from $950-1200. The 4090 has an MSRP of something like $1400, but they sell for about $2000-2200. So...if the 5090 has an MSRP of $1999, and it actually sells for $1999, that's less than a 4090. On the other hand, suppose they set the MSRP at $1400, but it sells for $2400, that's a big increase.
The more you pay, the more you save.
@@carlr2837don’t forget tariffs & scalpers
@@carlr2837 the 4090 MSRP was $1600.
Honestly, the root of the issue is that for many people, gaming is an addiction. And when you're addicted, price does not matter. I know it's an ugly statement, but it's actually the truth IMO.
Most studios are changing to UE5, so... StutterEngine 5 is going to ruin any upgrade you planned to make anyway... Save your money, see how it all plays out.
There were ~17000 games released on Steam in 2023; I doubt even 5% will use UE5 in 2025... There are maybe 4-5 high-budget PC games every year that are worth buying, and Kingdom Come: Deliverance II doesn't use UE5, so we will be fine.
UE5 works fine if you have 12GB+ VRAM.
@@kkrolik2106 BS. Wukong, Stalker 2, Silent Hill 2... all run like SHIT and just compete over which game will murder your CPU, your GPU, or both. The stutter can't be removed with any current hardware.
@@HybOj Hogwarts Legacy, Wuthering Waves, really, basically any UE5 game... even Fortnite is ridiculously heavy compared to what it was at the beginning, even if you use the lowest settings.
UE stuttering is because of shitty texture streaming.
No amount of RAM will fix that.
I know this is a gaming-centric channel and there is a lot of hate here, but for 3D/VFX artists who rely on GPU rendering this looks fantastic.
Some early leaks indicate nearly double the performance of the 4090 in Redshift/Octane etc., while drawing only 200 watts more power 🤯.
People don't realize how good we have it; this type of performance doubling is pretty impressive.
I currently render with 2x 4090s. Two of these 5090s will be like having 4x 4090s with considerably less power draw/heat etc. Between higher clock speeds, more RT cores, more CUDA cores and faster memory bandwidth, just two 5090s are the rough equivalent of nearly 36x 1080 Tis, and many consider the 1080 Ti sort of the gold standard of Nvidia's GPUs for its time. Think about how much power a setup like that would draw, if you could even make it happen with PCI lanes etc.
Really looking forward to these.
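Roughly how the "36x 1080 Ti" figure above falls out (Python; the 2x is the leaked render doubling the comment cites, and the 9x 4090-to-1080 Ti ratio is the value implied by the comment's own total, not a measured one):

```python
# Back-of-envelope check of the "two 5090s ~= 36 x 1080 Ti" claim.
cards = 2
r_5090_vs_4090 = 2.0    # leaked Redshift/Octane doubling (per the comment)
r_4090_vs_1080ti = 9.0  # implied 4090 : 1080 Ti render ratio (assumption)

print(cards * r_5090_vs_4090 * r_4090_vs_1080ti)  # -> 36.0 "1080 Ti equivalents"
```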
TBC means another 8GB of VRAM. No big surprise. We already got this leak way earlier, back when the 4060 launched. They are not hiding it anymore this time around. The people who are still supporting Nvidia are the same people who will support broken games too. We got UE5, we got greedy companies gimping GPUs, it can't get any worse, right? Right?
3GB memory chips are coming, which means 12GB on a 128-bit bus. Hopefully.
People are mad the RTX 5090 will go for more than $2000 when the RTX 4090 is already being sold for more than $2000 everywhere.
just try to remember a time when the flagship was $900
CUDA cores? More like CASH cores.
LOL
Alan Wake runs at 40 fps on a 4090. We need a 6090 to run current games at decent frametimes and fps. The Unreal Engine 5 epidemic is ruining the industry.
With corporations spending billions on their AI hardware, gamers will be left behind.
Look man, if you hypothetically get a card that's twice as fast and has double the VRAM, why wouldn't they ask double the price?
Getting more power for the same price every 2 years is actually kind of unique to the tech world.
Why buy a new one when you are not getting a better product? It's not like a GPU wears out.
What kind of stupid take is the calculation of a 5090 being 35% faster than a 4090 and then a 5080 10% faster? So you're saying a 5090 with TWICE the core count of a 5080 would only be a 25% difference, when the 4090 had ~60% more cores than the 4080 and was 30% faster?
You don't make any sense. Also, the rumor is that the 5090 is more like 45-50% faster than a 4090. And given that the 5090 is TWICE the 5080, I don't expect the 5080 to be more than on par with the 4090 or so. Perhaps a little faster in some instances.
The 5080 is just a cheaper 4090, but with less VRAM, at most.
He also ignores the gigantic increase in memory bandwidth between 4090 and 5090
There's always a difference between RT and rasterization
New to tech?
Ever heard of Amdahl's Law?
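For reference, Amdahl's Law is why doubling cores rarely doubles frame rates: only part of the work scales with core count (the 80% parallel fraction below is illustrative):

```python
# Amdahl's Law: speedup = 1 / ((1 - p) + p / s), where p is the fraction of
# work that parallelizes and s is the speedup of that fraction (e.g. 2x cores).
def amdahl_speedup(parallel_fraction: float, scale: float) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / scale)

# If 80% of a frame's work scales with shader count, 2x the cores
# yields only ~1.67x the performance, not 2x.
print(round(amdahl_speedup(0.8, 2.0), 2))  # -> 1.67
```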
No way am I gonna spend 2-3k to get 80fps instead of 60
how about 60 instead of 40
But many others will
@derbybOyzZ I game at 1080p.
@@stolenlaptop Even at 1080p... if you're talking about native resolution, you'll be lucky to get a stable 60+ FPS in any currently released AAA game... with a 7800X3D or better.
@@lamikal2515 you are out of touch.
So the Apple Mac Mini is now the value play, with 16GB of RAM and a full SoC at $550. Unbelievable times we live in.
The Mac Mini uses 16GB of shared memory, so it's actually closer to 8GB of VRAM for 3D applications. Apple is a straight scam as well.
Here's our new GPU. It's 20% faster and costs 20% more! It's safe to upgrade now.
Please don't buy Nvidia; half their revenue is profit, it's insane how they milk the market with overpricing. Your current graphics cards are fast enough, and if not, the AMD RX 8800 XT in January hopefully might be a good choice for ~$500. I'm gonna switch to the 8800 XT from a GTX 1080. Intel also releases its 2nd-gen cards soon.
600W max power consumption for the RTX 5090 is ridiculous and irresponsible. Even 400W is insane. My whole PC consumes
I'm in the same boat. I bought a secondhand 7800X3D for 200€ a few months ago, and I'm still on a GTX 1080, all custom watercooled. I could afford the 1080 Ti at the time, but the TDP was too high. I want efficiency, because I have a medium-sized room and don't need a powerhouse getting me sick. I'll also buy the 8800XT, and with the difference I'll buy an OLED FreeSync monitor and keep the G-Sync one I have as a second monitor. Nvidia has always been about marketing: PhysX, G-Sync, hair animation, DLSS... which all end up being mainstream and free after a while.
And games are so boring nowadays, I always end up going back to games from around 2012-2016. I'm past the time of having the biggest one; I just stick with what I need.
Gamers want a cheaper 4090 with 4x 8-pin power connectors, not a more expensive GPU nobody would buy.
Everyone is expecting the 5080 at $1000. Ain't gonna happen. I expect $1400 MSRP, $1500-1600 actual, and out of stock for months. Nvidia won't ever give us better price/performance.
I'll keep on buying AMD or perhaps even an Intel GPU, depending on how RDNA4 performs vs Battlemage.
NGreedia is an absolute joke.
Criminal more like.
AMD is ditching RDNA for UDNA to board that AI train, thankfully. It's about time.
I'm not interested in buying Nvidia or AMD anymore because of all the neckbeards on either side with superiority complexes. I think I'll go for intel if battlemage is good.
Guess I'll wait for the 6000 series then to get my 5080.
Wild idea... Buy an AMD card.
Wait for Radeon udna 😊
Unlikely. Production of the 5080 will stop and the price will instantly jump up, like with the 4090.
You have two other options, as invalid as they may be for someone who wants a 5080: the 8800XT and the B970, since they're targeted at the 5070 and 4070 respectively.
Or wait for the 6000 series and check out UDNA. They can't fuck it up more than RDNA, surely.
@@krazed0451 AMD lol
@@ZackSNetwork Some of us can afford more than 1080p monitors and appreciate the extra VRAM.
According to Coreteks the RTX 5090 = 30-35% faster than the RTX 4090, and the RTX 5080 would be 10% faster than the RTX 4090? LOL, that's BS, as the RTX 5090 has double the specs of an RTX 5080; the difference will be way higher than 20%.
Not sure about the lower cards, but the 5090 is surely over 40% faster than a 4090 with those specs. They are going back to the good old days of the 512-bit bus; I miss those days.
He is also ignoring the increase in memory bandwidth when calculating performance for some reason
Yes, he doesn't make any sense at all. Even if the 5090 were close to 50% faster than a 4090, the 5080 would still be no more than parity with a 4090, barely faster.
The 5090 literally has twice the core count of the 5080. There is no way in hell the 5090 would be only 35% faster than a 4090 with the 5080 then being 10% faster.
That would mean the difference between the 5090 and 5080 would be no more than 25%.
True. Bandwidth is a good indication: the 5090 is 512-bit. This is very rare; we are seeing a 512-bit card after a long time. The RTX 5080 has only 256 bits when it should be at least 384. This is just Ngreedia cutting stuff as usual; the RTX 5070 should be 256-bit.
Is the price still over 900? 1000?
The 5080 should have 24GB minimum 😡
The very last Nvidia GPU I bought was an "XFX GeForce GTX 7600GT" back in 2006.
So... I gave a total of $0 (zero) to Nvidia in the last 18 years.
I am still using a Sapphire R9 380 (4 GB) from 2015.
I do not mind Nvidia overpricing all of its GPU models. I am not going to buy any of them anyway...
It is not my responsibility to make Nvidia shareholders rich.
I'll have a look at "Battlemage" and "RDNA4" options.
If they (AMD & Intel) continue their price-fixing crimes and don't bring good value options to the market, I'll hold on to my R9 380.
The PC component world has never seen a dark era like this in its history.
I'm still on a used 6600XT at 1080p. Good enough for the games I play. My next card will likely be at the 7800XT's performance level but more efficient than the 7800XT is today. I would be happy with that.
Okay, again, I'm not upgrading my 3090. 10K CUDA cores are enough.
Hard times; greed ruins everything.
I'm gonna go against the grain and say that I don't think Nvidia is going to have big price increases for most of Blackwell. They'd be dumb to put the 5080 at over $1200, and even that seems excessive. The 5070Ti is gonna be a little faster than a 4080 Super for $800 or $900. This isn't great, but it's technically slightly better price-performance than Ada.
The 5080 price looks compelling until you notice it is really a 5070 (Ti); same goes for the 5070 -> 5060.
@@AmUnRA256 5070 and 4070 definitely look like older 60 class GPUs. 5080 looks a lot like an 80 class to me. The 5090 is weird because it's their first 512b bus GPU in almost 20 years, so it's beyond the 4090, 3090 and old Titan cards.
There's a big 4090 shaped hole in this lineup where they could have another GPU with 128SM and 24GB on a 384b bus.
@@coryray8436 yeah the hole is missing a true 5080 with 24 gb
@@coryray8436 Did you see the difference between the 5090 and 5080?!?!? If the leak is to be believed, which part of 50% less VRAM and cores says 80-class GPU to you?
Why would they? They don't need to win any popularity contests and they don't have any competition. Jensen is gonna rob PC gamers blind this generation... again!
Most people should stick to buying old and/or used cards. I'm very happy with the $200 RTX 3060 I got 2 months ago.
It's sad, no, maddening actually, that it has come to this, but yeah, old CPUs and now even older GPUs are fine for many people's needs, unless they want to be $#tards and drive 4K at Ultra at 100 fps! I say downgrade to 1080p 27-inch monitors and stick to older GPUs!
... but what happens in 3 years when the old used 5060 is $350? It starts today, this is bad news for the future.
I need a graphics card for cat pictures and cat videos. Any ideas?
A second hand card from 10 years ago with a TDP of only 20 Watt? 💪 Yes, please!
@@50-50_Grind True. I only upgraded because I wanted to use Stable Diffusion for a project I'm working on. It doesn't work on older AMD cards. The 3060 has 12GB of VRAM, making it half the price of the next "budget" 12GB card.
5070 still at just 12GB, WTF???
NVIDIA has no reason to care. They're at all-time highs, AMD competition is dead, Intel GPUs are a joke, and AI data centers will eat up wafers at any price since tech companies have infinite money. GeForce is just an annoyance for NVIDIA at this point.
A 5080 with 16GB of VRAM is surely a joke.
16GB should have been standard with the 3000 graphics cards. 😐 And looks like it's not even going to be with the 5000 cards!
The 3070 and 3070Ti are perfectly capable cards, but are significantly dragged down by having just 8GB.
And 12GB is an aberration. It means a 192-bit bus, when all 70-class cards have had 256-bit for more than a DECADE, since the GTX 670! 😠
If the 5080 were 20 or 24GB at that price I would probably buy it. Guess I will wait for a theoretical 5080 Ti.
5080 Super Ti
3GB vram modules might be used to give the 5080 24GB.
@@kninezbanks Since the 5080 actually is a 5070, a 5070 Ti at best, I highly doubt they will.
Wishful thinking....
@kninezbanks they really think we are stupid and try the same shit again, like they tried with the 4080
@Heinz76Harald The 4080 desktop had a 256-bit bus: 8 x 2GB chips, so 16GB total. With 3GB chips, it would be 8 x 3GB, which gives 24GB.
The laptop 4080 uses the desktop 4070 die: a 192-bit bus, 6 chips. 6 x 2GB gives 12GB of VRAM; with 3GB chips, the laptop 4080 can have 6 x 3GB, which gives 18GB. Perfectly adequate for 1440p gaming.
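The chip math above generalizes: each GDDR chip sits on a 32-bit slice of the bus, so capacity is (bus width / 32) x chip density. A small sketch of that rule of thumb (ignoring clamshell mode, which doubles the chips per slice):

```python
# VRAM capacity from bus width: one GDDR chip per 32-bit slice of the memory bus.
def vram_gb(bus_bits: int, chip_gb: int) -> int:
    return (bus_bits // 32) * chip_gb

print(vram_gb(256, 2))  # 16 GB: desktop 4080, 8 chips x 2GB
print(vram_gb(256, 3))  # 24 GB: the same 256-bit bus with 3GB chips
print(vram_gb(192, 3))  # 18 GB: a 192-bit laptop-4080-style die with 3GB chips
```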
Techtubers with influence need to blatantly tell people not to buy certain Nvidia cards. I am more than happy with my 7900 XT, especially since ray tracing is still mostly a gimmick that tanks your performance.
Bro, that's how they make fiat, leeching off of corpos; their souls are bought very cheap.
@@pituguli5816 And those same tech-tubers have the audacity to tell us that they're 'just like us', or 'we understand you'.
...
Pfft, _No_ they do not. Most of 'em are just a bunch of corpo sponsor teat suckers who will look the other way once they see that fat check clear their bank accounts.
That's like saying that car youtubers should start telling their viewers not to buy a Ferrari. People who can't afford it won't buy it and those who can and think it is worth it, will, just as it has always been.
@@searingsword-4-21 My friend has a Ferrari. I drove it; nothing special, not worth $745k, just like the 5090 won't be worth thousands of fiat. This is coming from someone who can afford to buy multiple 5090s. I'm just not a numpty.
RIP
it's truly over for Ngreedia
I was just able to afford getting a 3080 😐 God, I wish we could go back to the golden days of the 10-series GPUs.
On the flip side I bought a $300 second hand 3090 with 24GB VRAM. The 5090 at a projected $1800 is only 40% faster according to these charts. There is a HUGE gap in pricing between the used and new market for GPUs and it's a waste for gamers not to utilize it.
AMD and Nvidia most likely have a non-compete agreement to allow Nvidia to occupy the halo slot. That way, they set the prices and AMD gets the profit margin benefits while maintaining the status quo. When they had a node advantage, they deliberately stayed off the top performance tier. I gave up on this hobby, it's full of useful idiots that reward this kind of anti-consumer behavior. Still like to support this channel, though.
AMD simply doesn't make these giant GPU dies like Nvidia does. AMD regularly comes close to Nvidia performance with only half the silicon.
They don't see going with big dies as viable.
Having a written agreement like that would actually be criminal and prosecutable, BUT AMD's behavior, and Nvidia's too, has definitely been very suspect at times.
In fact, plenty of times. It really seems like AMD does not want to take any market share away from Nvidia even when they have had clear opportunities.
Jensen and Su are related. So yeah, the non-compete applies.
@NatrajChaturvedi verbal agreement most likely and yeah, completely corrupt. I don't understand how nobody in the FTC has gotten wise to it yet.
I just built a Threadripper PRO 7965WX system a month ago in the lead-up to the crazy tariffs that are about to hit.
*pats rtx3060* ready for another gen, little guy?
All we need now are entry-level mice that cost $150 and an entry-level Sound Blaster at around $300. Good luck folks. Remember - this is just a hobby.
The messed up thing is that Nvidia will more or less be forced to "compete" on the low end since there is legitimately some competition at that point, but anything north of whatever they price the 5070 at is gonna be even further inflated. The VRAM numbers are pretty sad on everything other than the 4090. Since most ray tracing sucks I don't regret buying my 7900XT.
It's honestly crazy that they're shifting down the dies AGAIN. First it was an -07 die for the 4060, and now an -05 die for a 70-class card? Crazy.
Can't wait to see the AI bubble burst.
Gonna be a long wait.
It probably won't burst. It's so fresh that there are still oodles of things to discover, and when all of that is discovered we'll use it as a stepping stone to the next level and keep innovating.
AI is overvalued by almost 500 percent. It has no monetary value in the short, mid, or long term. The only one making money is Nvidia.
@@yomatokamari You are absolutely mistaken. AI is already being used to increase productivity in multiple areas. AI is already being used to review police body cam footage and write police reports. This is just one example, but there are a lot of other ways it's being used right now.
At some point… the entities that endlessly buy up Nvidia GPUs (Amazon, Microsoft, Google, Meta, and every cloud provider of GPUs under the sun) will have to actually make a return on that investment, otherwise they won't be able to buy more and more cards.
Simply put, the AI products need to make a return to justify the investment in hardware.
The most obvious example of this would be OpenAI… at some point THEY NEED to make money and actual PROFIT from their AI products, otherwise they will go bust from their electricity bill alone.
It's been 6 years since my last major upgrade. I think with these prices I will switch to a 10+ year upgrade cycle once I upgrade to the 5090, unless future generations drop to half the current prices, inflation included.
Sell your 4090 now for more than you paid for it while you still can and get the 5090 when it releases
Everyone and their mom is going to be trying to buy a 5000 series GPU at launch before any tariffs are implemented, so unless you are lucky and able to get one early, you could easily end up with no GPU until mid 2025, when you are finally able to buy a 5090 for $2500-3000 after tariffs.
Good luck with that. I'll hold onto mine, thanks.
Remember how that went last time when people sold their 2080 Tis and couldn't buy a 3090?
I'm thinking I will wait till the 5080ti to sell mine
With incoming US tariffs on Chinese imports, trust me, 4090 gunna be worth a lot more in a few months even with the 5090 available (tho 5090 will sell out for at least 6 months at any price)
NVidia gatekeeping VRAM is disgusting.
Your price and performance predictions are normal from one generation to the next.
But I disagree that Nvidia doesn't have competition. The people suckered by Nvidia marketing tactics are the real problem.
The 7900XTX is 80% to 110% of a 4090 in rasterization performance at less than half the price... that's a deal/steal.
Nobody really cares about RT, DLSS, or any of those other gimmicks.
Problem is, the 7900XTX is a $1000 GPU, not $200. When you pay that much it had better excel at everything. Pay $1000 and it still can't be No. 1 at everything? Some people don't take that well. Same with the 4080. That's why people opt for the 4090. Plus, Nvidia is able to retain its used price better.
I don't know why everyone thinks GPU prices need to go up every year; that's crazy. Soon an 8050 will be $1k, and Nvidia just will not be an option for people at all at this rate.
$1000 for a 5070 ti is INSANE!
it's very good, it's very good ! :D
it is a 5060 ti chip actually.
He's talking nonsense.
€1200+ including VAT for the cheapest Biostar/Manli/Inno3D card, with a most underwhelming heatsink using a single noodle-sized heatpipe and two of the cheapest $0.50 fans, is a great value...
I paid €500 for a 1070 Ti during the 2018 crypto boom, which was already arguably too inflated. So from that we're looking at what, a 100%+ price markup on the new xx70 Ti class? Lmao.
I'll never pay more for a GPU than a whole console.
But who is buying those $2000+ GPUs for gaming?
Besides a few pro teams that make a lot of dough, who else? I can't believe there are MILLIONS of users/gamers buying them.
Less than 1.3% of gamers on Steam bought 4080s and 4090s, so you are right that the majority of PC gamers aren't buying them. Gamers aren't the demographic for Nvidia's chips nowadays; they really don't care if gamers buy them. They will ride this AI boom all the way till corpos cotton on that there is no money in it for them; so far Nvidia is the only one making a motza. As for us? Well, I'm picking up a 7900XTX this week, bro, and I am DONE with buying anything for at least 5 years, which suits me fine since all I play is D2, DRG, and SOT with indies and some AA games. I haven't bought a AAA game since 2019.
Will wait for the 5080 Ti.
Doesn't make sense to upgrade from a 4080 with the same amount of VRAM.
The funny thing is that we are lucky to even get 50-series GPUs at any price this cycle. Datacenter makes Nvidia 10x more money, per their financial earnings reports, so consumer GPUs are barely a drop in the bucket to them.
Yeah, I'll check back 3 quarters from now and see if gaming has sunk under the 10% threshold...
If you think that you're lucky to be able to buy it instead of a company thinking it is lucky to be able to sell it to you, then you already lost and are part of the problem. Sorry for giving a come to Jesus.
@@Terrylovesjougurt I will pay any amount for a GPU. Running local AI models makes my work easier and saves me time, so the return on investment is huge for a lot of folks.
@@thenextension9160 Well, then this isn't about you. And by "any" you mean as long as it's profitable. Too bad that in many cases (possibly not yours) this cost will be transferred to consumers, who may not even want the AI features that will be crammed into everything, artificially (pun intended) ballooning the price. Be ready for AI-enhanced asparagus.
Folks are going to find another hobby or just game on console until that option gouges wallets.
99% of PC owners won't be able to afford these brute-force cards because of NV's greed. So what's the point?
I'm about to hurt some more feelings but everyone needs this economics lesson.
I built a top 2% gaming PC in 2016 that was very expensive for 4k ultra gaming.
The video cards were 3x 980 Ti Classifieds OC'd to 1500MHz, with an OC'd 5960X.
This yielded a Firestrike Ultra 4K score of 11405.
It cost $650 per card plus $180 for each waterblock (2016 dollars).
I currently have a 600W watercooled 4090 overclocked to 3025MHz, with an OC'd 7800X3D.
This build puts down a firestrike ultra 4k score of 25568.
The water-cooled 600w 4090 cost $2250 (2023 dollars).
Usually video cards gain about 15% performance per generation. I think it'll be about 25% this gen.
If we make the conservative estimate of a 15% perf uplift, we get 29,403 for $2500 (2024 dollars).
So let's compare price to performance and ADJUST FOR INFLATION (the step no one wants to do).
We have to pick a year, so to drive the point home I'm bringing everything to 2024 using the government's inflation numbers (which are manipulated down, but that's another tangent).
The 980 Ti SLI cost $3,274.92 in 2024 dollars (31% inflation) for 11405 performance (or 3.48 pts per $).
The 4090 WC cost $2,330.95 in 2024 dollars (3.6% inflation) for 25568 performance (or 10.97 pts per $).
LOW (+15%) EST 5090: $2500 in 2024 dollars (0.0% inflation) for 29,403 performance (or 11.76 pts per $).
HIGH (+25%) EST 5090: $2500 in 2024 dollars (0.0% inflation) for 31,960 performance (or 12.78 pts per $).
So what does this all mean?? TLDR??
Video cards are cheaper and more powerful than they have ever been. We live in a golden age of computer performance, with a SINGLE CARD doing what would have seemed preposterous not even 10 years ago in 2016. Back then, if the game wasn't optimized well, you'd get 1/3 or 1/2 of your total performance. I would have to debug a game for about a week before I could play with my friends. Play a game on launch? Probably not. And in most games you'd have to dial settings back to medium or low to get 4K to run at a reliable 60fps.
Now I can run ANY game at 4K ultra and usually get about 120+fps. I run circa-2016 games on THREE 144Hz 4K 43" monitors and get a solid 144fps with a 160-degree FOV.
The real problem is governments and central banks inflating the currency, which drives down the value of our money (which we trade hours of our lives for). The last massive surge of money printing has raised inflation by over 100% since then, doubling the cost of all goods. This rapid change in money's value kills us consumers because our pay is the LAST thing to go up.
First the cost of raw materials goes up.
Then the cost of completed goods goes up.
Then begrudgingly the cost of labor (our pay) goes up (because if employers jump the gun on pay, a competitor who can hold out a few months longer will murder them in the market and the company could suffer a short term shock that cripples them).
The effect of the stimulus is that every single person living in a country that deals with dollars took about a 50% pay cut.
Ask yourself, if your pay doubled, would a $2500 card seem outrageous?
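Taking the commenter's scores, prices, and inflation factors entirely at face value (none of them independently verified), the points-per-dollar comparison is easy to reproduce:

```python
# Inflation-adjusted price/performance, using the commenter's own figures.
def pts_per_dollar(score: float, price: float, inflation: float) -> float:
    """Firestrike Ultra points per 2024 dollar, given cumulative inflation."""
    return score / (price * (1 + inflation))

builds = [
    # (name, Firestrike Ultra score, price paid, cumulative inflation to 2024)
    ("980 Ti SLI (2016)", 11405, 2490, 0.315),  # 3 x ($650 card + $180 block)
    ("4090 WC (2023)",    25568, 2250, 0.036),
    ("5090 low est.",     25568 * 1.15, 2500, 0.0),
    ("5090 high est.",    25568 * 1.25, 2500, 0.0),
]

for name, score, price, infl in builds:
    print(f"{name:18s} {pts_per_dollar(score, price, infl):5.2f} pts/$")
```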
Think I'll wait for a 6090
30-35% perf increase doesn't excite me enough to spend 2k
It'll be 50%
He forgets to account for clock speed and IPC increases.
Anyone who buys a 5070 or 5080 with a measly 12GB-16GB of VRAM deserves to get ripped off like a gullible sucker.
... But seriously, tf is with that performance gap between the 5080 and 5090?! It's basically half the GPU.
I would never buy any Nvidia product ever again; I'd rather quit gaming. This has been the case since the 20 series, and I still play fine with AMD at the moment, but I hope Intel will become strong one day.
"ppl should stoo giving Nvidia money"
I want the best GPU, the best power efficiency, and the best day-one drivers (or day-one drivers at all on a consistent level). I especially want DLSS upscaling and DLSS FG, and I want the whole Nvidia suite, RTX HDR for example. I want to use the most graphically intensive settings to get the best visuals you can possibly get (even if I were buying in the 600-1000€ range, I would obviously want to use as many of the best features as possible, and when I use upscaling, I want the best form of it). Because of all that, I will definitely and absolutely and obviously buy an RTX 5090.
As I said, even if I could only afford around 1000€ for a GPU, I would never buy a 7900XTX instead of a 4080. I play at 4K 240Hz. You WILL use upscaling, especially at 4K. And because you can use DLSS upscaling at the Quality level, you will ALWAYS activate it, since you can't see any visual downgrade from native 4K; often it looks BETTER than native with TAA!! If I paid 1000€ for an XTX, then every time I activated FSR I would know that I have literally worse image quality compared to a GPU running DLSS. And if I buy a high-end GPU, you can be damn sure I want all the bells and whistles, and I want my GPU to handle ultra-intensive but game-changing features like path tracing in Alan Wake 2 or Cyberpunk, for example. And there will be more and more games that use that.
AMD makes the best CPUs atm. In fact, I've owned a 9800X3D PC for 2 weeks now. But for GPUs I just won't buy AMD until they are EXACTLY as good as Nvidia in ALL cases, which won't happen, because Nvidia won't allow it. So since I buy ultra high end (90 class), I can only buy Nvidia anyway. AMD is interesting in the ultra-low to mid range of $200-500, but from the 4070 up I would definitely not buy any AMD GPU. Nvidia costs more; they can ask for more because you get more.
Are you a marketing bot? It's getting hard to know who is real and who's a bot...
i think a lot of people agree with this though. I do :/
@@pituguli5816 wtf are you talking about?
I objectively laid out why I'm buying Nvidia GPUs and not AMD. And I also objectively made my decision about the best CPU according to benchmarks, which is the 9800X3D. In GAMING.
First, you have a picture of Ben Shapiro; there is no way you're playing anything other than the yearly pump-and-dump game from Activision lol.
@@DELTA9XTC Yeah, and you sound like ChatGPT...
GFX cards costing more than an entire build should cost; how long can this be viable? Jesus.
It's not viable for gamers, but corpos don't care; gamers are not the primary focus.
Honestly, imma keep my 3070 for some more years... yeahh...
Planned obsolescence with a price sticker that might as well be a QR code that redirects to a picture of a middle finger.
Thanks I am staying with my RX 6800
No way 5080 will be at 1.3k. $1000-1100 max.
4080 Supers are still £1000.
Why would anyone buy a 16GB GPU at that price to begin with?
@@MetallicBlade because
rough times ahead? as if the GPU market hasn't been a garbage heap for the past decade
also, none of this matters. consooomers will slop it up anyway, further reinforcing the problem
I will stick with my 3060Ti until it dies.
You poor people. I'm buying the new RTX 5090 even if it's $5,090. I'm rich.
If its less than 5090$ I won't buy it. I don't want poor trash in my pc.
Hahahahaahahaha
I will buy it too and then burn another 5000 USD just to prove I'm retarded.
One kidney, sell my vehicle, and I should be able to buy. Fingers crossed.
I knew it was only getting worse. Bought a 4080 and left it at that, it will have to carry me for the next several generations since all future NV GPUs will become even less appealing
Can't wait for Radeon and Arc to decimate Nvidia in sales numbers.
When this AI bubble pops/bursts, I'll be laughing damned hard.
Can't come fast enough, for reasons other than the trivial topic of GPU prices.
Trump will give them a ramp on tariffs, for TSMC to move factories to the US.
Nvidia is about to leave a gap in the market with their expensive pricing. This is the perfect time for AMD to come in with the RX8000 Series to fill that gap.
AMD has had that same gap for over 20 years now.
AMD says it is leaving the market.
@@rogerk6180 high end. But yeah I wouldn't be surprised if they threw in the towel in a decade or so.
We deserve the monopoly that we're willing to buy into.
Yeah, AMD has only ever matched the same pricing that Nvidia sets whilst giving you a little bit of extra performance per dollar.
If Nvidia increases their pricing then AMD will increase their pricing. AMD CPUs, for instance, are now very expensive.
AMD is just as bad as nvidia
No, they're just going to make shitty 200mm² die cards which no one will buy, because all of the influencers and e-gamers are going to use 5090 Tis, and Nvidia has better upscaling and other technologies.
"You will own nothing, and you will be happy." - Jensen Huang
Holy ****! 600W gpu!! Defecting to Team Red ASAP!
4090 was a 600W capable card. However, the insane price on Nvidia stuff is pushing me to the 8800XT instead.
Guess I'm never upgrading
Since WHEN has Nvidia cared about gamers?
Don't be confused. We buy their products because they're top in their category. Not because they care about gamers.
Same goes for AMD vs intel in CPUs
> 16 Gigs on the 5080
> 12 Gigs on 5070
That's gonna be a pass from me dog