NVIDIA is doing the NVIDIA thing! People complain, but still buy NVIDIA more than ever! NVIDIA's market share has gone up, from 84% to 88% in the latest figures, and it will most likely go even higher with the 5000 series! And people wonder why NVIDIA will continue to make subpar low/mid range GPUs... they do it because people buy!
Sounds right. The 24GB 5080 really is the real 5080, possibly as a Ti, SUPER, or Ti SUPER, which is why there is such a HUGE gap below the 5090. The 5080 16GB is really the 5070, and the 5070 12GB really is a 5060.
I have a 3070 Ti; why bother upgrading? The upgrade for me would be an AMD card with WAY more VRAM than what Nvidia is offering. FUCK THEM. Looks like this 3070 Ti will be my last green team card for a long time.
Smartest thing for Nvidia to do with a 12GB 5070 is ship it with DLSS 4 and extra texture sharpening: pin the actual texture setting to medium, but the graphics setting will say ultra max AI enhanced.
I refuse to acknowledge the existence of any 50 series card that has below 16GB of VRAM and costs over 250€. I'll probably get one of AMD's 8000 series GPUs anyway, but if Nvidia dares insult us consumers with a 12GB 5070 priced anywhere above 500€, I will be avoiding all Nvidia products at any cost for about 10 years.
I wonder if people will have the discipline to shun the 5080 to send Nvidia the message that we know what they are doing. They may have labeled it a 5080, but spec-wise it should have been the 5070. Worse yet, they are pricing it where a true 5080 should be. (Which of course is the point.)
The RTX 5080 is actually the 5070... the 5070 is the 5060... they are just sliding the numbers UP so they can charge more for less. Look at the 4090's cores, divide by roughly 2, and you get the 4070... the 5080 is half the cores of the 5090... pay attention, consumers.
Content creators complain about Nvidia, but they can't wait to hand them $2,000 or more for the latest 90 series card, and then they tell everyone else they shouldn't give their money to Nvidia and should teach them a lesson. 😢
Hopefully, sales of these cards will fall off a cliff. I do not feel compelled to buy any Nvidia card this time, and this is from a loyal buyer since 2005. I bought my last Nvidia GPU in 2021. Never again...
Currently got an RTX 3080 10GB, not upgrading to anything with less than 16GB VRAM, Nvidia are just greedy and pathetic really. After the 4080 12GB scam I really don't want to give them a penny.
Last gen they shifted the silicon down a tier relative to the name, rebranding what would have been the xx70 Ti as the 4080, the xx60 Ti as the 4070, and so on. Now they're trying to shift by two tiers: xx70 = 5080 and xx60 = 5070. I guess they will keep doing it until they find the absolute limit of what customers are willing to accept. Remember when the x60 card was a midrange GPU? Now the 4060 is literally the lowest tier.
If the 8800XT doesn't suck at RT, I'll go for it instead of a 5070 for my next upgrade. I don't care about GDDR7; 20Gbps GDDR6 is enough for the performance tier of the rumored 8800XT.
Wait, people were expecting something different? Nvidia stuck with 8GB for three generations and you thought they would keep 12GB for just a single generation? Remember that they wanted to sell a "4080 12GB"?
Regarding memory capacity for the next generation, this would be ideal:
5090 32GB 512-bit
5080 20GB 320-bit
5070 16GB 256-bit
5060 12GB 192-bit
5050 8GB 128-bit
Then for Ti and Super cards, clamshell or use 3GB memory modules to bump up those base capacities. It's not going to happen, but that WOULD have been an enticing lineup, even for people who are happy with their 30 and 40 series cards.
@@GreyDeathVaccine More VRAM is generally better, but when you get that low in the stack, it's hard to say if that extra 2GB would really help all that much. Besides, it would have to be a 160-bit card at that point, and the price would go up accordingly.
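The bus widths in the wishlist above pin down the possible capacities: each GDDR chip occupies a 32-bit channel, so bus width fixes the chip count and chip density fixes the total. A rough sketch in Python (the 1/2/3 GB densities and the clamshell option are assumptions based on common GDDR6/GDDR7 parts):

```python
# Rough sketch of how memory bus width constrains VRAM capacity.
# Each GDDR chip uses a 32-bit channel, so capacity = (bus_width / 32) * chip_size.
# The 1/2/3 GB chip sizes are assumed GDDR6/GDDR7 densities.

def vram_options(bus_width_bits, chip_sizes_gb=(1, 2, 3), clamshell=False):
    channels = bus_width_bits // 32             # one 32-bit channel per chip
    chips = channels * (2 if clamshell else 1)  # clamshell mounts chips on both PCB sides
    return [chips * size for size in chip_sizes_gb]

print(vram_options(192))                  # 192-bit bus -> [6, 12, 18] GB
print(vram_options(256))                  # 256-bit bus -> [8, 16, 24] GB
print(vram_options(256, clamshell=True))  # clamshell doubles it -> [16, 32, 48] GB
```

This is also why a hypothetical 10GB 5050 would need a 160-bit bus with 2GB chips, as the reply above notes.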
I'm looking forward to upgrading my almost four-year-old 3070. This is by far the GPU I've kept the longest in my life. And my upgrade WILL be at least 140% this time.
Great news tbh! This means there is more chance that people are gonna buy the new AMD 8000 series cards. If everything goes as rumoured, I am going to buy one, too.
Damn... historically speaking, everything under a 320-bit bus aged poorly in the long run. So this gen, like last gen, everything under the top GPU will age badly. Nvidia surely knows how to keep people buying their cards every two years.
Sony brings new technology that lets games look and run as well as PC games at high/ultra settings at 60fps, looking as good or even better in some cases at 4K, with minimal work from devs, backward compatible with PS4 games and with 60-plus PS5 games at launch, but at $699 everyone is upset and says it's not worth the 50-plus percent performance bump in raster and 3x improvement in RT. Nvidia gives PC gamers less VRAM than they should, barely a 10-15 percent performance improvement, charges 50 percent more (way more in some product stacks), and PC gamers are going to line up, buy it, and say "it has DLSS 4.0, so worth it" lol
Increase the VRAM and you get 4K gaming ability. VRAM = game resolution, new RTX = new RT performance, just held back by VRAM:
12GB VRAM = 1080p real-time ray-traced gaming
16GB VRAM = 1440p real-time ray-traced gaming
24GB VRAM = 2160p real-time ray-traced gaming
The new RTX cards will be expensive. Nothing we can do about it, and 99% of you people will buy them anyway.
Seriously, if the 8800XT was as fast as a 7900XTX or 7900XT, why would AMD go for slower 16GB GDDR6 on a 256-bit bus? I expect RX 7900 GRE rasterization performance with 4070 Ti-level ray tracing.
Maybe the hardware modding scene could see a broader revival; there already are extreme enthusiasts that swap the memory chips on consumer GPUs to reach higher capacities and overclocking results, the hardest part seems to be the modding of the VBIOS/Firmware so that the GPU is actually using the higher memory capacity.
They are just calling the 5070 a 5080 and calling the 5060 a 5070 and still charging the higher prices. I seriously hope AMD brings it and Intel steps up to put more pressure on Nvidia to actually compete.
If AMD can wise up, learn from the mistakes they made this gen, improve FSR, release fewer cards, and drop in a midrange contender at about 4080-level performance (or 3-5% slower), priced as dirt cheap as they can from the start while still making a slight profit, I think that would be the foundation of some good competition to come.
If the 5070 really only gets 12GB of VRAM... wow. People will buy it regardless. Nvidia can do whatever they want; the majority does not care and buys ray tracing. lol.
Lol, at this point CES 2026 will be an RTX 50 Super refresh with faster, higher-capacity GDDR7: 5070 Super 18GB, 5080 Super 24GB, and maybe even a 5090 Super 48GB (with more SMs enabled), lined up against RDNA 5.
A 12GB 5070? Wow! I've been holding off upgrading my 3000 series card in the hope of more VRAM and performance with the 5000s. An $800 12GB 5070 that will probably perform at around 4070 Ti level? Sorry, Nvidia, no dice. I can get a 4070 Ti Super for less than that with 16GB of VRAM; I'd rather buy your old stock than get a 12GB card. No graphics card over $500 should come with less than 16GB, and I refuse to purchase one.
I just fired up the Diablo 4 expansion and set the graphics settings to max with DLSS Quality at 7680x2160. My system has a 7950X3D, 4090, 96GB DDR5, and a Samsung 57" Neo G9, and it can't handle this game at maximum resolution; the frame rate drops when the in-game situation gets busy. Turning the settings down to performance mode made it look a bit crap, but I'm having a great time with this game. This is a good benchmark telling me it's time to get a new computer. I'm getting the 5090 and 9950X3D as soon as they come out.
@@MrKardovalk Yeah, but they need to cram like 4-5 cards between the 5070 and 5080, bro. Lineup: 5070 12GB, 5070 Super 12GB, 5070 Super 16GB, 5070 Ti Super 16GB, 5080-minus-1fps, and then the 5080.
They do learn; they just don't care. They knew after AMD's announcement that they could flex prices on the high end and cripple the mid tiers to drag their customers towards the higher options that they control. I believe they know AMD never misses an opportunity to miss an opportunity.
Locking the card down to 12GB just so we don't have the freedom to occasionally game at 4K is so fucking scummy. They know it would be good enough if they did it, and then no one would buy their 5080, etc.
NVIDIA do learn from their mistake… to make them again but sneakily…
And they keep the market share because the cards work and are more than enough, while having no competition. What the hell do you expect?
@@Javier64691 this time they will get away with it. If that card is faster than everything available for both intel and AMD then they win.
@@mikelay5360 Eventually a savior will come. Until then stop crying and spend more shekels. They're merely a luxury item, it's not like Nvidia sells food or electricity LOL
What mistakes? They own over 70% of the market, with the 4060 becoming the best-selling card.
They are not mistakes but scams and morons still keep paying no matter what.
This is how it should be done:
5090 = 32GB
5080 = 24GB
5070 = 16GB
5060 = 12GB
5050 = 8 GB
In my opinion, 12gb and 8gb shouldn't exist anymore
@@r.m8146 On lower end cards it's fine. Like a 5050 with 16GB doesn't really make sense and would only increase the price.
@@r.m8146 Damn son. Way to roast my 12GB 6750xt!
@@r.m8146 where you from.. a rich country? dont be ridiculous
@@aeromotive2 It's literally a $10 difference, they're being uber cheap
That's a 50% increase from the 1070, amazing!
😂
At double the price Yaaay! 👏👏👏
GTX 1070 had double the VRAM of the GTX 980.
@@MarekNowakowski 😂 very funny. I would like to know the performance increase too 💀💀
😂😂😂😂😂😂
I am not crowdfunding more leather jackets, 12GB is a joke.
facts
210Billion in the bank. He's already laughing.
He has his own wagyu beef farm at this point
Everybody wants more VRAM, but are you willing to pay for it? You think nobody has to get paid for these things? These days everyone is crying about VRAM as if there aren't GPUs available with more VRAM. We always want more while not being willing to pay more for it, which is why people expect so much from low to midrange GPUs.
They will introduce the 18GB Super version; they planned to use 3GB chips, but those were probably not ready in time. I'm more pissed that they cheap out on the PCB by lowering the memory bus width. The bigger on-board caches save the day for Full HD gaming, but 4K performance is castrated.
Can’t wait for RTX 5050 Tie 2GB🎉
A 32-bit memory bus and PCIe with two lanes, but, but, but think of the exquisite DLSS and texture compression. 😅
“If you don’t like it buy the 5090”
There will be a 5050, since they already use the 5050's chip for the 5060.
*3GB
@@lorsch. 5050 Tie Super has 3GB
even on my 4070ti I sometimes end up limited by the memory before I get limited by the actual power of the card. Brutal.
On what games? The only game that has even remotely come close for me is Cyberpunk with path tracing and Halk's quality textures mod.
@@ArcticVXR1 Ark Ascended also takes up a lot of memory
@@ArcticVXR1 There are tons of games that use over 12GB of VRAM at 4K; the card would be powerful enough for 4K if it had 16GB.
@@Dempig The 4070 Ti can run most (not very demanding) games at 4K w/o any issues.
Big modern games are a different story though. The card runs out of power way before it has any issues with VRAM capacity: Alan Wake 2, Hellblade 2, Star Wars Outlaws, etc. - the 4070 Ti is too slow to run them at 4K. Once you enable DLSS and/or reduce some settings to achieve 60 fps, VRAM usage decreases as well.
That's exactly the point, and why Nvidia gives you 12GB: to limit the use and life of the card. 😂🎉 Nvidia: you get what you paid for, a subscription.
All NVIDIA supporters can now enjoy the fruit of their labor, mmm, I am sooo satisfied...
Same pc nerds deserve this for crying all the time ahaha vengeance is mine
The forbidden fruit
I will enjoy my 5090.
I am truly satisfied. If the 5070 has 12GB, this means game devs will have to optimize their games better to fit a 12GB buffer. So I can hold on to my 4070 Ti 12GB a bit longer.
@@stangamer1151 Someone will optimize their game just because you bought a low-VRAM GPU? Right.
16gb truly should be the minimum now
Yep, will definitely go for that for my next gpu.
@@masterlee1988 Honestly, just wait for the 4070 Ti Super to drop in price and pick one up. I would also suggest a 4080 Super, but I don't think the prices will drop that drastically for the 4080 Super.
I reckon 12 is still good, but they have to upgrade the 70-class chip somehow; then 12GB should be normalised on the 60 chip, and 8GB minimum for the 50 chip.
Shaking my head... sure, we can all afford midrange... and there isn't an AI chip boom going on.
Yeah, for a 5060 lol. Definitely not acceptable for a 5080. But it's fine. We all know that a couple of months later they're going to rebrand the 5080 into a 5070 Ti Super and then just release a 24GB 5080. Classic Nvidia moves.
5080 is really a 70 class card and the 5070 is the 60 class.
True but the competition is doing the same. They have no incentive to change.
Well said. It really is a repeat of the 40 series: only the 90 series gave value for money, but it was overkill for the average gamer.
@@RubenGugis You are so wrong. The 40 series had significant competition on every front except the 4090. This time I don't think AMD will even reach 5070 performance, even in raster.
Screw Nvgreedia
@@mikelay5360 The 'competition' does the same because the CEOs of both companies are Uncle and Niece. Why are people expecting anything from family-run businesses?
This is why last year I switched to AMD with a 7800xt. For $475 it's a bargain compared to the competition, with 16GB of VRAM; my old GPU was starting to show its age with only 8GB in some newer games. 12GB was a concern for me even a year ago, as some newer games such as RE4 were using 11.5GB at 1440p; 12GB in 2025, at the price they are going to be asking, is ridiculous. I'm honestly beginning to hate this company more than Apple. They are limiting VRAM to increase your upgrade frequency, price gouging, and intentionally misleading consumers with dishonest marketing. I certainly won't be buying anytime soon, not until they get their s**t together.
Same
I only got into pc gaming a few months back. Went with the 7800xt. I love the card, can even do native 4K at frames I find acceptable
Nvidia really went from the 4070 being 1/2 of the 4090 to the 5070 being almost 1/4 of the 5090.
But the 5080 is 1/2 of the 5090... sounds like the 12GB 4080 fiasco again.
No sane person will buy a 5070 12GB.
The refresh with 18GB when the 3GB chips come out might be ok
No, that's your opinion. It depends on needs and price; not everyone plays at 4K or plays the same shit.
Who said the 5080 will have 12GB ?
@@mikelay5360 Thanks - that was a typo - fixed.
The 16GB 5080 would be interesting if it isn't too pricy
@@Ruleta_23 With the amount of high-quality meshes and textures developers aren't bothering to compress properly, the bigger worlds they're trying to create, and the additional RT features they're trying to implement, 12GB simply is not enough at times, even at 1080p native.
@@mattzun6779 I'm getting that 5080 if it's 1200 or less. Especially if it touches or even goes beyond the 4090 in gaming.
Everyone complains and in the end they will still go buy Nvidia. Sad but true
Because most of them are just casual gamers who aren't obsessed with ultra settings. Forget 12GB; 8GB should be fine until the PS6 arrives. I mean, even freaking 4GB cards still run new releases, despite supposedly being completely dead when the PS5 came out in 2020.
It's not like there is something better they could get.
@@nossy232323 Under 1k, AMD is objectively better for gaming. If you don't think so, that just means you're an Nvidia simp and part of the problem.
@@nossy232323 intel is better
@@Cryostal Sure.
The RTX 5070 is now rumored to have just 12GB of VRAM for $800, which exactly matches my prediction. A 12GB buffer is in no way competitive with RDNA3, let alone RDNA4. There is no way to offset a too-small VRAM buffer with fancy tricks: when the buffer is full, it's full, and the thing slows to a crawl.
They are pushing consumers up a tier. Is what it is. Nvidia makes their money in datacenter right now.
RDNA 4 is said to be really competitive in the low-end and midrange segments. Navi 48, which will be the top-end RDNA 4 GPU, is expected to offer between 7900 XT and 7900 XTX levels of raster performance and be roughly on par with the 4080 in ray-traced games, at between $450 and $500. If that actually ends up being the case, it's going to make the 5070 and 5080 look absolutely pathetic; they will be DOA. The 5080 is also rumoured to have only 16GB of VRAM, which is pitiful for a next-gen 80-class GPU. Nvidia is making a huge mistake, and it will be a total embarrassment for them. RDNA 4 and Battlemage will make RTX 50 look pathetic. Nvidia is being anti-consumer as fuck, and only strong competition can stop that.
@@thenextension9160 This data centre goldrush isn't going to last forever for Nvidia.
The 5080 is so cut down that the 5070 will really be a 60-class card at a rip-off price. Depressing. I can afford any GPU, but I don't want to reward greed and mug myself. They are destroying PC gaming.
@@thenextension9160 Moving the prices up a tier while moving the performance down a tier. Sad
If this is true, I don't care how powerful the core is, I'm going with the RDNA 4 8800XT.
It's an 8700 XT; it should be behind the 7900 XT.
I'm personally getting Battle Mage, Nvidia is not gonna screw me.
@@Jaxon-c4k We don't know if it will ever come to desktop tho
Bro, Nvidia tried to get me to buy an $800 12-gigabyte 4070 Ti in 2023. I said forget that and forget Nvidia. I'm glad EVGA left, and I left too: for $1000 I got a 24GB 7900 XTX in 2023.
@@FrostyBud777 I just hope the fanboys also leave Nvidia; otherwise, they get to charge whatever they want and people will buy anyway.
It depends on the price point. If the 5070 were only $350, I think it would be acceptable... but we know it will be $600.
Depends on performance. Reviews will determine its worth.
Can you even get a 3070 for 350 dollars lol
@@TheOptimisedTech yes on the used market
@@mikelay5360 How many AAA games launching in 2025 will need more than 12GB of VRAM? Spoiler: all of them. Only buy the 5070 if you're playing older games.
@@tringuyen7519 What a vague statement. It depends on the resolution you are gaming at. If your goal is 4K gaming, buy a 5090 and stop whining.
A 5070 with 12GB will be a very grave mistake.
Sheeple will still buy this shit.
@@GreyDeathVaccine Not really. I'm looking forward to the 50 series to upgrade my almost four-year-old 3070. But I'll shoot for a big upgrade: if it's not the 5070, then it'll be the 5080.
This is what will happen: people will buy the 5070, play the latest games, have a billion stutters, crashes, and missing textures, complain that the games are unoptimized, go on UA-cam to see what the issue is, and find out that 12GB is too little for ultra settings. Then they'll see the 5080 is $1k, ask their parents for it, and get slapped in the face.
@@tamodolo Then Nvidia got exactly what they wanted from you.
@@KarimM-m4u They can do that today with the 4070 and 4070 Super; in MANY games it's not enough when all the gadgets are on.
They spent all of their money on leather jackets instead 💀
12GB isn't enough to use the features they will be promoting with it, even at 1080p.
There are games out right now that, at 1080p, ultra settings, with RTX, DLSS, and frame gen on, exceed 12GB of VRAM.
Obviously, anyone buying a card like this more than likely isn't playing at 1080p; they want 4K, or at the very least 1440p. And newer games are only going to require more!
This is selling you a product that is physically not going to be able to use the features it promotes, strictly because of the lack of VRAM. This is horseshit lol
Sue them.
@@GreyDeathVaccine for what?
Funny how with Nvidia the same performance next gen gives you less VRAM. Example:
RTX 2080 Ti 11GB > RTX 3070 8GB
RTX 3090 24GB > RTX 4070 Ti Super 16GB > RTX 5070 12GB (speculated)
Yeah, nothing above £/$600 should have less than 16GB, ffs, it's nearly 2025.
GTX 1080 Ti 11GB -> RTX2070 Super 8GB.
But only since RTX, GTX980 Ti 6GB -> GTX1070 8GB.
GTX780Ti 3GB -> GTX970 3.5GB.
@@magnusnilsson9792 true, we are regressing
Just got a 3090. The base model sells while the Ti sits at the Microcenter here; my guess is people are doing AI stuff. That's why I got mine. Nvidia is between a rock and a hard place: if they added the RAM they should, people like me would likely go cheaper as long as the performance is there. Not sure their investors would allow that. The greed is real.
12GB VRAM for $800 bucks...
lol
Yet people keep buying Nvidia.
8gb for $1200 next gen
What is the problem with that? No game uses more than that, and you don't know the price.
@@caylarabdk8389 we need ARM gpus very fast
@@Ruleta_23 Incorrect. Some new games are breaking the 12GB buffer, and it will only get worse.
Nvidia: what do you mean? I've been doing this for ages.
The 4070 already can't run some games with FG enabled because it increases VRAM usage.
Imagine being gated out of an advertised feature in a new card because of VRAM.
It's our fault that we had paid so much for GPU in 2020😢.
This is it. If people keep buying, the problem isn't solved. If people stop buying, they'll cut the prices in half so fast.
wtf are they thinking with this VRAM? Who tf is even gonna want these cards?
Everybody will buy this card, that’s the problem. Nvidia do this because they can.
semi pro.
@@kongyiu I wish people would vote with their wallets. It doesn't make any sense to spend money on these 50 series cards, but I'm sure some will, just because they're the latest GPUs in name.
@@kongyiu I don't think so. I was planning to upgrade with the 50 series, but with this news I am out.
Nvidia makes their big bucks from ai, they no longer care about gamers.
Nvidia will claim that GDDR7 will be more memory efficient and that they have new methods to cut down on vram usage lol
You mean like Apple laptops, where they said 8GB is the same as 16GB? In reality, 4x slower.
Tbf this could be partially true. We will see when benchmarks come out
AMD had 16GB back in 2018, crazy.
Ultimately you can thank the consumer for this, if people bought more AMD cards they would still compete with high-end Nvidia.
Nvidia is trying to sell their high end by crippling 4K-capable cards through limited VRAM.
A 12GB 5070 is ridiculous.
A 192-bit bus works with 12GB of VRAM (12x 1GB or 6x 2GB chips),
or 24GB (24x 1GB or 12x 2GB chips).
no way the 5070 gets 24gb vram
@@mr.electronx9036 I was hoping for 16gb on it.
@@famousfighter2310 You will need the 5070 Super then.
@@ZackSNetwork I'm waiting for RDNA 5.
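The chip math in this thread can be checked by enumerating which module configurations hit a given capacity on a given bus. An illustrative sketch, assuming one 32-bit channel per chip and clamshell mounting doubling the chip count:

```python
# Enumerate GDDR chip configurations reaching a target VRAM capacity.
# Assumes one 32-bit channel per chip; clamshell puts two chips on each channel.
# The 1/2/3 GB densities are assumed GDDR6/GDDR7 options.

def configs_for(target_gb, bus_width_bits, densities_gb=(1, 2, 3)):
    channels = bus_width_bits // 32
    configs = []
    for per_channel in (1, 2):  # 1 = single-sided, 2 = clamshell
        chips = channels * per_channel
        for density in densities_gb:
            if chips * density == target_gb:
                configs.append((chips, density))
    return configs

print(configs_for(12, 192))  # [(6, 2), (12, 1)]: 6x 2GB, or 12x 1GB clamshell
print(configs_for(24, 192))  # [(12, 2)]: 12x 2GB clamshell
print(configs_for(18, 192))  # [(6, 3)]: 6x 3GB, the rumored Super refresh route
```

The 18GB result is why an 18GB 5070 Super only becomes possible once 3GB modules ship.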
Can't believe my 7-8 year old 1080 Ti has the same amount; it's nuts.
With no competition, this is what companies do.
They don't need competition... they need customers with brains, nothing more. If people stop buying their overpriced products, they'll need to cut the prices in half.
when they had it during the 3000 gen the ydid the same, AMD competed very good with RDNA 2, people didnt care. is liek Apple, people dont care if the competition offers better products, they buy a name.
@@nicane-9966 This is true sadly.People can vote with their wallets.If they play the nonbrainer and pay every price this never changed and nGreedAI gives you the 6060 with 6gb 96bit card for 1200$......
There must be something wrong with these leaked specs, the 4080 is basically the same as the alleged 5080.
This crap is why I bought a 7900 XTX. I only looked back at Nvidia for DLSS, but then I remembered how overpriced their cards are for so little VRAM.
What's the point in being faster in raytracing if Nvidia doesn't give you enough vram to actually run it? Nvidia wants to sell gamers on vram hungry features like frame generation and raytracing but won't give you the vram necessary to use them.
Are you happy with your AMD card? I have a 6700 XT 12GB and I really hope AMD kicks Ngreedia's ass with the 8800 XT, if that's the competitor to the 5070, 5070 Ti, and 5070 Super. This is their chance to take a huge portion of the market.
@@Ivan80054 "I really hope that AMD will"... oh please. You DO know that AMD's CEO is the niece of Nvidia's CEO, right? It's all a family business, and they act in tandem. It is NO coincidence that AMD ALWAYS 'fails' to capitalize on Nvidia's 'mistakes'...
@@KnightofAges Just because they're family doesn't mean they are not competing in the GPU market lol. AMD is just amazing at shooting themselves in the foot, and that's it.
It's not like they have some particular reason to want to be worse than Nvidia…
@@KnightofAges So if AMD wants more respect, they need to fight for it themselves. Paying Ngreedia for their overpriced GPUs led to this. People complain and complain about Ngreedia and their prices, and they still buy their cards. Well, they deserve to be ripped off. I have a 6700 XT and I'm waiting to see what the RX 8800 and RX 8800 XT will be. I simply refuse to pay Nvidia much more than what I think is a decent price for a GPU.
NVIDIA is doing NVIDIA thing!
People complain, but still buy NVIDIA more than ever! NVIDIA's market share has gone up… and will most likely go even higher with the 5000 series!
NVIDIA's market share went from 84% to 88% in the latest market share report! And people wonder why NVIDIA keeps making subpar low/mid-range GPUs… they do it because people buy them!
when there is no competition, this is what happens. I'm glad I'm over playing PC games, this kinda stuff is ridiculous.
Nowadays VRAM is barely enough for raster, let alone RT
Don't buy any 12GB card for 1440p or 8GB card for 1080p anymore; 16GB is the minimum.
Nvidia: Our new cards aren't much better, but they have new DLSS that only works on this card!
Next gen Nvidia: RTX 5080 4G, RTX 5080 6 GB, RTX 5080 8 GB, RTX 5080 12 GB, RTX 5080 16 GB, RTX 5090 32 GB 😂
I will chill with my RX 7900 XT 20GB for a loooooooong time!!!!
Future gaming will call your bluff ...😂
@@EddieMcclanahan He'll be alright for some time. I mean, the GPU I'm getting, an RTX 2080, is still a powerhouse in 2024.
Been using mine since 2022, at this rate I’m going to wait until 2030 before upgrading.
Sounds right.
The 5080 24GB is really the real 5080, possibly as a Ti or SUPER or Ti SUPER, which is why there's such a HUGE gap between the 5080 and the 5090.
The 5080 16GB is really the 5070.
The 5070 12GB is really a 5060.
12GB of VRAM on a card with a $600-800 price just sounds like a terrible deal.
I have a 3070 Ti, so why bother upgrading? The upgrade for me would be an AMD card with WAY more VRAM than what Nvidia is offering. FUCK THEM. Looks like this 3070 Ti will be my last green team card for a long time.
The 5090 will be the only good 50 series GPU. If you want a good mid range GPU the RX 8800xt should be cool.
Smartest thing for Nvidia to do with a 5070 12 GB is ship it with a DLSS 4 with extra texture sharpening. Pin the actual texture setting to medium but the graphics setting will say ultra max AI enhanced.
Yeah, I don't care how good it is, I'm NOT paying $1,000-2,000 for a GPU.
I refuse to acknowledge the existence of any 50 series card that has below 16GB of VRAM and costs over 250€.
I'll probably get one of AMD's 8000 series GPUs anyway, but if Nvidia dares insult us consumers with a 12GB 5070 priced anywhere above 500€, I will be avoiding all Nvidia products at any cost for about 10 years.
in 2025? hell, no. 300€ for 12GB at max.
I wonder if people will have the discipline to shun the 5080 to send nvidia the message that we know what they are doing. They may have labeled it a 5080 but spec wise it should have been the 5070. Worse yet they are pricing it where a true 5080 should be. (Which of course is the point).
The RTX 5080 is actually the 5070... the 5070 is the 5060... they are just sliding the numbers up so they can charge more for less. Look at the 4090's cores, divide by roughly 2, and you get the 4070... the 5080 is half the cores of the 5090... pay attention, consumers.
Let's all join hands in boycotting Nvidia
Let the milking begin
There are more than enough sheep.
the problem isnt nvidia the problem is the fan boyz buying it.
they're clearly upselling the 5080 by crippling the 5070.
Content creators complain about Nvidia, but they can't wait to hand them $2,000 or more for the latest 90 series card, and then tell everyone else they shouldn't give their money to Nvidia and should teach them a lesson. 😢
Well, the 5070 is actually the 5060, but we don't need to talk about that lol
Hopefully, these cards will fall off a cliff for sales. I do not feel compelled to buy any Nvidia card this time. And this is from a loyal buyer since 2005. I bought my last Nvidia GPU in 2021. Never again...
Currently got an RTX 3080 10GB, not upgrading to anything with less than 16GB VRAM, Nvidia are just greedy and pathetic really. After the 4080 12GB scam I really don't want to give them a penny.
Last gen they downshifted the cards by one tier, renaming them from xx70 Ti to 4080, from xx60 Ti to 4070, and so on. Now they're trying to downshift by two tiers: xx70 = 5080 and xx60 = 5070.
I guess they will do it until they find the absolute limit that the customers are willing to accept.
Remember when the x60 card was a midrange gpu? Now the 4060 is literally the lowest tier.
If the 8800 XT doesn't suck at RT, I'll go for it instead of a 5070 for my next upgrade. I don't care about GDDR7; 20Gbps GDDR6 is enough for the performance tier of the rumored 8800 XT.
The 5090 is actually a 5080 😮 Brilliant 👏
Both nvidia and Intel marketing heavily for AMD it seems.
Jensen responds to gamers' requests to increase VRAM. "Suck my jacket" he says.
My guess is we'll get an 18GB 5070 Super with 3GB modules down the road.
That would be nice, especially if it somehow manages to stay under 1k, but it won't...
I just bought the 4070 Ti Super with 16GB... smdh... 16 needs to be the minimum.
Wait people were expecting something different?
Like Nvidia had 8gb for 3 generations and you thought that they would have 12gb for just a single generation?
Remember that they wanted to have "4080 12gb"?
I'd be concerned about price inflation, not so much from mining this time, but from AI rental demand?
Regarding memory capacity for the next generation, this would be ideal:
5090: 32GB, 512-bit
5080: 20GB, 320-bit
5070: 16GB, 256-bit
5060: 12GB, 192-bit
5050: 8GB, 128-bit
Then for Ti and Super cards, clamshell or use 3GB memory modules to bump up those base capacities.
It's not going to happen, but that WOULD have been an enticing lineup, even for people who are happy with their 30 and 40 series cards.
5050 should have 10 gigs IMHO
@@GreyDeathVaccine More VRAM is generally better, but that low in the stack it's hard to say if the extra 2GB would really help much. Besides, it would have to be a 160-bit card at that point, and the price would go up accordingly.
Those are most likely 5080 Ti and 5070 Ti models you mentioned. But yeah, they should be the basic models.
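For what it's worth, the wishlist lineup above is internally consistent: every capacity is exactly (bus width / 32) chips of 2GB each, and the suggested 3GB-module or clamshell refreshes follow mechanically. A quick sanity-check sketch (names and numbers taken from the comment, assuming 32-bit-wide chips):

```python
# Hypothetical lineup from the comment above: name -> (capacity GB, bus bits)
lineup = {
    "5090": (32, 512),
    "5080": (20, 320),
    "5070": (16, 256),
    "5060": (12, 192),
    "5050": (8, 128),
}

for name, (cap, bus) in lineup.items():
    chips = bus // 32                     # one 32-bit chip per channel
    assert cap == chips * 2, f"{name} doesn't fit 2GB chips"
    # Refresh options: swap to 3GB modules, or clamshell the 2GB chips.
    print(f"{name}: {chips}x2GB = {cap}GB; "
          f"3GB modules -> {chips * 3}GB; clamshell -> {cap * 2}GB")
```

Every entry checks out, which is the point the comment makes: the refresh SKUs come almost for free once the base boards exist.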
Beautiful!
I'm looking forward to upgrading my almost 4-year-old 3070. This is by far the GPU I've had the longest in my life. And my upgrade WILL be at least 140% this time.
16GB of GDDR6 costs $36, and they will charge you $300 for it.
The RTX 5080 is just a renamed RTX 5070 based on the leaked specs. Nvidia learned their lesson this time.
A 5070 Super with 16 gigabytes to satiate pissed-off gamers a year after the 5070 launch. SMH.
Great news tbh! This means there's more chance that people will buy the new AMD 8000 series cards. If everything goes as rumoured, I'm going to buy one too.
Damn... historically speaking, everything under a 320-bit bus has aged poorly in the long run. So this gen, like last gen, everything under the top GPU will age badly. Nvidia sure knows how to keep people buying their cards every two years.
Sony brings new technology that lets games look and run as well as PC games at high/ultra settings and 60fps in 4K, sometimes looking even better, with minimal work from devs, backward compatible with PS4 games, with 60-plus PS5 games at launch, but at $699 everyone is upset and says it's not worth the 50-plus percent performance bump in raster and 3x improvement in RT.
Nvidia gives PC gamers less VRAM than they should get, barely a 10-15 percent performance improvement, charges 50 percent more (and way more in some parts of the product stack), and PC gamers will line up to buy it and say "it has DLSS 4.0, so it's worth it" lol.
This VRAM hoarding from Nvidia has forced me to buy an AMD card for my next graphics card... F%^k 'em. It's just ridiculous. Talk about price gouging.
Increase the VRAM and you get 4K gaming ability; VRAM capacity basically sets your game resolution. The new RTX cards bring new RT performance but turn the VRAM down:
12GB VRAM = 1080p real-time ray-traced gaming
16GB VRAM = 1440p real-time ray-traced gaming
24GB VRAM = 2160p real-time ray-traced gaming
The new RTX will be expensive, there's nothing we can do about it, and 99% of you will buy it anyway.
12gb of VRAM?
"Nice" 1080p card. $800?
Of course we know they will release a 16GB Ti/Super variant later, but that does not redeem them of their stupidity and sin.
Agreed! Also, there's currently a huge market in the open-source AI space, where large image/video diffusion models can use up a lot of VRAM.
Seriously, if the 8800 XT were as fast as a 7900 XTX or 7900 XT, why would AMD go for slower 16GB GDDR6 on a 256-bit bus? I expect RX 7900 GRE rasterization performance with 4070 Ti-level ray tracing.
Maybe the hardware modding scene will see a broader revival; there are already extreme enthusiasts who swap the memory chips on consumer GPUs for higher capacities and better overclocking results. The hardest part seems to be modding the VBIOS/firmware so the GPU actually uses the higher memory capacity.
They are just calling the 5070 a 5080 and calling the 5060 a 5070 and still charging the higher prices. I seriously hope AMD brings it and Intel steps up to put more pressure on Nvidia to actually compete.
So that is basically the 4080 12gb that never got released
If AMD can wise up, learn from the mistakes they made this gen, improve FSR, release fewer cards, and drop in a mid-range contender at about 4080-level performance (or 3-5% slower) priced as dirt cheap as they can from the start while still making a slight profit, I think that would be the foundation of some good competition to come.
AMD is ran by the niece of Nvidia's CEO. Why would she do that to her uncle?
If the 5070 really only gets 12GB of Vram.. wow. People will buy it regardless. Nvidia can do whatever they want, majority does not care and buys Raytracing. lol.
Lol, at this point CES 2026 will be an RTX 50 Super refresh with faster, higher-capacity GDDR7: 5070 Super 18GB, 5080 Super 24GB, and maybe even a 5090 Super 48GB (with more SMs enabled), lined up against RDNA 5.
Of course it will be more expensive.
These specs are laughable, can't wait for these announcements. Probably be some backlash from the consumers.
They're doing what they do best; raw dogging the customer.
A 12GB 5070? Wow! I've been holding off upgrading my 3000 series card in the hope of more VRAM and performance with the 5000s.
An $800 5070 12GB that will probably perform at around 4070 Ti level? Sorry, Nvidia, no dice.
I can get a 4070 Ti Super with 16GB of VRAM for less than that. I'd rather buy your old stock than get a 12GB card.
No Graphics card over $500 should come with less than 16GB and I refuse to purchase one.
He's wrong; it's going to be around $600 to $700 with 12GB.
What fkin $800? XD It'll be $600 max at launch or nobody will buy it.
This settles it....picking up a 4080 Super and eff 5xxx series.
Smart decision the only good 50 series GPU will be the 5090.
I was hoping the 5070 would have 16GB and a 256-bit bus.
Look, if AMD gives up on releasing high-end graphics cards, NVIDIA can unfortunately put out whatever crap they want and brand it 5080...
I just fired up the Diablo 4 expansion and set the graphics settings to max with DLSS Quality at 7680x2160. My system has a 7950X3D, a 4090, 96GB of DDR5, and a Samsung 57" Neo G9, and it can't handle this game at maximum resolution; the frame rate drops when things get busy in game. Turning the settings down to performance mode made it look a bit crap, but I'm having a great time with this game. This is a good benchmark to say it's time to get a new computer.
I'm getting the 5090 and 9950X3D as soon they come out.
If that’s the 5070 then we can already imagine the 5060 (I mean the 5050) 😂
They already have plans for Super and Ti cards. 'The more you buy, the more you save'.
Ofc it's 12GB. It gives them a chance to release a 16GB model 6 months later: the 5070 Super 16GB.
But we already have 4070 ti super for that 😂
@@MrKardovalk Yeah, but they need to cram like 4-5 cards between the 5070 and 5080, bro.
Lineup: 5070 12GB, 5070 Super 12GB, 5070 Super 16GB, 5070 Ti Super 16GB, "5080 minus 1 fps", and then the 5080.
Because AMD tapped out. Did anyone expect anything else? Have fun saving more!
They do learn, they just don't care, and they knew after AMD's announcement that they can flex prices on the high end and cripple the mid tiers to drag customers toward the higher options they control. I believe they know AMD never misses an opportunity to miss an opportunity.
Locking the card down to 12GB just so we don't have the freedom to occasionally game at 4K is so fucking scummy. They know the card would be good enough to do it, and then no one would buy their 5080, etc.
Nvidia's dGPU share is 88%; they can do whatever they want. The GPU market has been overpriced for 4 years.
Since the 20 series, so 6 years.