12 GB in 2025? Really? The 5080 at 16GB should be 24GB, and the 5070 should be 16GB. This is dumb.
Why dumb? NV doesn't want you to future-proof your buy. They want you to buy the 6000 series as well.
I wouldn't spend more than $350 for a 12GB GPU.
A 5080 at 20GB would be acceptable, but 16 is sad.
Two words for you: "planned obsolescence"
Well, maybe we should stop looking only at VRAM? Idk... since it's GDDR7, having more of it on a GPU in that performance bracket might not benefit you as much as you might think. We'll just have to see.
Not getting a card without 16GB in this day and age. 12GB on a 5070? eff off.
Cause it's really a '60-class' card with a '70-class' name and, most likely, an '80-class' price tag.
It is not the card number; it is that we already have games that will need more than 12GB of VRAM, and the price of the 5070 is as damn high as the 4070 already was, and they do not add 16GB of VRAM? How come they can add 16GB of VRAM on the 4060 Ti but not on the 4070/5070?
@@corndog9482 I couldn't have said it better myself, sir 😂
I have a 4070 Super 12GB and I play most of the time on my 4K 60Hz TV. There are right now only two games that I can't play on ultra without ray tracing: Wukong and Stalker 2. In my opinion, the 12GB is not the biggest problem now. Even with a 4070 Super 16GB the performance would be the same. There are no missing textures. If the 5070 is faster than a 4080, it will still handle 4K material extremely well. They should have added at least 1-2GB; 13GB/14GB would be fine.
@@ZSergioZ1 Going from a 4070 Super to a 4070 Ti Super is +15%, and you don't have to worry about VRAM.
Lmao. If you’re not stuttering because of Unreal Engine 5, you’re stuttering from having no Vram.
RTX 4060 Ti 16 GB vs RTX 4060 Ti 8 GB.
@@Crimson_Lockhart I have a 3070 Ti 8GB and a 4060 Ti 16GB. If you play in 4K, the 4060 Ti 16GB is the better card for higher 0.1% lows compared to the 3070 Ti. However, in 1440p the 3070 Ti will smoke the 4060 Ti 16GB. That said, in UE5 games in 4K, the 4060 Ti wins every time, because it's not hampered by the lack of VRAM like the 3070 Ti is.
Yep. Low VRAM = stuttering and hitching, which means unplayable, because you can't compensate for stuttering. If the video card just isn't powerful, that only means lower frames, but it's consistent, which is playable, because you can adjust your movement and mouse input to account for it.
@@Crimson_Lockhart ? what ?
16GB on the 5080, the second card in the lineup, is an absolute joke. It should be a minimum of 20GB, preferably 24GB.
It's crazy that AMD's last-gen 80-class equivalent (vs the 4080) has 24GB…
24gb minimum
The new Stalker game consumes 22gb vram on the 4090 :o
@@themarketgardener nVidia can get away with this because everybody knows the 7900XTX isn't actually comparable to the 4080 in anything but raw raster. As soon as you talk about upscaling or raytracing, it falls way behind.
Yup, the 7900XTX is a banger in price to performance: $4.99 a frame, compared to the 4090 sitting at $9.10 per frame!!!! When you look at the breakdown like this, it plainly puts into perspective how much Nvidia is overcharging, and it's bullsh*t!!! Source: Tom's Hardware
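For context, that metric is just MSRP divided by average benchmark frame rate; the FPS figures below are illustrative assumptions (back-solved from the quoted dollar-per-frame numbers), not Tom's Hardware's actual test data:

$$\text{cost per frame} = \frac{\text{MSRP}}{\text{avg FPS}}, \qquad \frac{\$999}{200\ \text{FPS}} \approx \$5.00/\text{frame}, \qquad \frac{\$1599}{176\ \text{FPS}} \approx \$9.09/\text{frame}$$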
@@themarketgardener And what are you going to do with 24GB of VRAM if the card can't even play games at 1440p that use 8GB of VRAM? 16 is fine for 1440p.
Honestly, if Nvidia comes out with a 5070 Ti that has less than 16GB of memory, I will buy AMD just to give them the finger.
That's me too
It will be 16GB of VRAM; laptop specs already leaked, and the 5080 laptop will be a cut-down 5070 Ti chip, just like the 5090 laptop will be a cut-down 5080 desktop chip, and they all have 16GB of VRAM.
Now the 5070 probably only has 12gb of vram.
Why wait? Do it now
@@PaintrainX absolutely...
i fully plan on going back to amd with this 4080, after this brief dip into nvidia territory following 14 years of ati/amd (my last REAL nvidia card before this was an 8800GT; i don't count the gtx 970 because mine was a defective piece of shit that i ultimately RMA'd). this card has been good to me, but i hate how huge it is, and it cost me the same as the 6700xt did at the peak of mining, except there wasn't any crisis when i bought this one... and i technically got the 6700xt for free, since i sold my 5700xt for the same price i paid for it.
If Nvidia really thinks it can get away with selling 5060 series with anything less than 16gb, let alone a 5070, they can go milk themselves vigorously.
That reminds me of the Nicolas Cage movie Lord of War. His uncle says, "I told him to go have intercourse with himself." LOL!!
Hot comment! 😂😊
Nah, the 5060 will never have 16GB, maybe 12GB
I honestly think they couldn't care less this cycle. They make 10x more from the datacenter right now with everyone ramping up their AI infrastructure. TBH we're kinda lucky they're even throwing us a bone.
@@thenextension9160 You get it.
This is exactly the issue.
They don't need us and they lose money on every single die they sell compared to just allocating the same silicon to the data center stuff. They own 90% of the market already so they are going to milk it for everything they can get. And planned obsolescence is one of the methods they like to use to screw us over.
Welcome to Nvidia 2025. Here are the patch notes and changes:
090 = 090
080 = 070 ti
070 = 060 ti
100% accurate. Don’t forget
060 = 050
@@themarketgardener and
050 = 030
@@eQui253 ur forgetting the 50 series will have GDDR7 memory..
090 = 80 TI
must be an amd fanboy that apparently cant read
Thx for not being a clickbait channel like Gamer Meld, Red Tech..
I know exactly how you feel
Look at the title and thumbnail of this video, clearly clickbait
Gamer meld is so awful
I stopped watching those garbage channels last year
lol gamer meld, that dude talks like he had a partial stroke.
Nvidia only seem to be advancing the top of the stack while the middle and bottom are essentially stagnant. Yay................
they move stagnation up the product stack every generation
@@anttikangasvieri1361 they're gonna keep this up until their $3000 7090 is their best price-performance card, with 4x the core count of the 7080.
A mere reflection of society as a whole. You're either rich or miserable. No grey zones, no in between, no middle class whatsoever.
Maybe their attitude is "Let AMD have the middle, we rule the top"?
@BBRBGR If that were the case, why even release a xx60 card then? Or a xx70 card? It's just greed. They know they have the best feature stack, so they just don't care about taking care of their low-end customers. It's like they look at the Steam survey, see 1080p as the top dog still, say 'well, that's enough market research', and call it there. If I were LG and Samsung, I'd be getting angry at this point that Nvidia is, in a way, torpedoing sales of newer monitors. There is no way with current trends that 12GB is going to be enough for 1440p gaming by EOL of the 50 series, unless game developers really figure out some new texture optimizations, and real soon. I see more and more people just sticking with 1080p, because the experience will be dismal higher up until you get to the xx80/90 cards.
More and more my 7900XT is looking like a wise investment.
The 7900xt is great if you only play old games. But any current-gen and future games require upscaling, and FSR is unusable, it's so terrible.
Thought the same thing... I see all the hate AMD gets, but my 7900xtx looks like money well spent
@@Acupofwater_WAW Lol no it can't. The 7900xt gets 30-40 fps at 1440p in most UE5 games at native resolution. What's the point in lying? In Wukong, for example, the 7900xt drops below 30 fps at native 1440p.
@@Acupofwater_WAW The 7900xt is nowhere close to a 4080 Super, first off, and second, you are the one that claimed a 7900xt can run "EVERY" (you put every in all caps) game at native 1440p ultra. Now you have nothing but excuses. There are literally 100+ games a 7900xt can not run at native 1440p with acceptable framerates. Also, the 7900xt is not AMD's best card; the 7900xtx is. You are wrong about everything you posted.
It shouldn't require it! It's an upscaler! We want raw visuals, raw performance; all this AI-generated performance is baffling to me. Like, why do we need that??? I have seen benchmarks of new and old games using the 7900XT and the performance is amazing. Wtf are you on?!
I mean, the 5080 specs are already suspected to be horrendous, so it doesn't surprise me that much
I don't believe these specs; Nvidia can't leave such a gap between the 5090 and the 5080, unless they want to force people to buy a 5090. I don't think so
@ I mean, we're talking about Nvidia; their greed surpasses expectations. I don't believe these numbers as long as they're not official, but I clearly wouldn't be surprised
@@Julien-hu4cc They've already set a precedent with that. The MSRP of the 4080 was atrocious compared to the performance gap with the 4090; you might as well just cough up more for the 4090. Unless their prices are much better than the 40 series', I don't expect it to change.
@@Zaney_ Hmm.. personally I didn't understand it like that. Precisely: very few people bought a 4090 because its price gap with the 4080 was too huge. So for coherent prices, it would take a 5090 at 2000 dollars and a 5080 at 1000 dollars; with, on top of that, roughly twice the performance according to the specs, that's huge. We have never seen that before! I know it's probably to make room for a 5080 Ti, but to me it's weird.
@@Julien-hu4cc I can definitely see Nvidia doing this. The monopoly is theirs. They lose nothing by flipping the bird at us.
The 5080 looks gimped to hell compared to the 5090. Either the 5090 is going to be amazing... or the 5080 is going to be disappointing.
They want to push people to buy 5090 or settle for less
The 5080 is still expected to match or even be slightly faster than a 4090 in rasterization. In ray tracing it should beat it by a decent margin due to its much faster ray tracing cores. If they can launch it for $999 it'll be a great card.
5090 will obv be great. The rest of them are GARBAGE. Allegedly the 5080 ti/super, whatever tf it will be…will be pretty decent. Otherwise 50 series is dog water.
@@godnamedtay Yeah, I want to upgrade as soon as possible, but seeing this, it just doesn't make sense to buy a 5080 or 5070. The 5090 is too expensive, so I kinda have to wait for a Super or Ti to come out..
@@03chrisv Performing like last gen's 90-tier card is exactly what a 70-tier card should be able to do.
Not for those prices, nope
The new AMD cards are going to be the sweet spot. Let them be a smaller 7900XT with better RT and AI upscaling.
now we just need gamermeld to 1 to 1 copy your video and make his own :)
good video btw
Don't forget the extra clickbaity titles, where it's minor news in the video but he puts 50 series thumbnails on for clicks
@@themarketgardener and his annoying yapping noises
cant stand that guy
Good. This is the future Nvidia fanboys deserve for their brand loyalty.
Amd for life...unless they pull some weird shit like this.
It’s not brand loyalty, AMD can’t compete
Fanboys did not even make up 1% of Nvidia GeForce sales.
@@Felale it is brand loyalty. AMD offered much better GPUs last gen with 50%-100% more VRAM (RX 6800, 6700 XT) for less money than a RTX 3070 and 3060 TI, had the same or much better raster performance and aged far better -- didn't get any meaningful share. Did it again this gen by offering 4GB extra VRAM vs Nvidia across most SKUs: 12GB RX 7700 XT (that was beating the embarrassment that is the 8GB 4060 TI in most cases) and the RX 7800 XT (matching or beating the 12GB RTX 4070 in most games besides Cyberpunk and a handful of Nvidia-backed titles) and went even further for the upper high end (7900 XT vs 12GB 4070 TI -- 8GB VRAM extra), whilst offering 50-80% of the RT performance. Still didn't make any meaningful difference. Intel are going to bail on the GPU market completely next gen after nobody buys Battlemage and AMD will most likely follow suit and stop bothering to release anything that's not mid-low tier with a guaranteed huge market share, only focusing on the server/AI market. Just don't complain when you're still getting 8-12GB VRAM Nvidia cards in the $300-$600 price range 5 years from now and the flagship SKUs like the 4090s move up to the $2500+ range like the Titan cards did. The only places where AMD can't compete yet are in CUDA and AI/machine learning and 99% of Nvidia users aren't using either, so there's no excuse left anymore.
@@ma-af3753 The problem with AMD is they need to fix their upscaler. FSR still looks bad compared to DLSS. Hopefully AMD fixes this with FSR4 using AI cores (they are already doing that in the PS5 Pro with PSSR, although PSSR has many of its own issues). And many games these days are requiring upscaling to get playable frame rates... so having a bad upscaler is why AMD's market share is so small
RTX 5080 looks like a scam lmao. Wtf are those specs
In the 10 series, the 1070 was 75% of the 1080 and the 1060 was 50%; now the 5080 is 50% of the main card and the 5070 Ti is 41% of the power of the 5090. Ngreedia is really milking these cards
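Those percentages track the CUDA core counts. The Pascal numbers are official; the 50 series figures below come from the leaked specs, so treat them as rumors:

$$\frac{1920}{2560} = 75\%\ \text{(1070 vs 1080)}, \qquad \frac{1280}{2560} = 50\%\ \text{(1060 vs 1080)}$$

$$\frac{10752}{21760} \approx 49\%\ \text{(5080 vs 5090)}, \qquad \frac{8960}{21760} \approx 41\%\ \text{(5070 Ti vs 5090)}$$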
Wow, this is the most dissatisfied I've ever been with GPU news!
Look at the new Intel GPUs. 4070 Super/4070 Ti performance for cheap. And also 16-24GB of VRAM
@ that’s great and all but I don’t trust intel
I was more disappointed when the 4070 Ti was (officially) announced. Because you just KNEW that the 4070 was gonna be shittier. And the mfs wanted to call it "4080 12GB"! 🤬🤬 The entire 4000 lineup except for maybe the 4090 is shifted by 1-2 tiers. Then they wondered last year why they had the lowest GPU sales in over 20 years (actual article). Gee, I WONDER WHY. And now I'm even MORE disappointed to learn that the "4060" (which is really a 4030Ti in disguise) is now top of the Steam Hardware Survey charts. 😠 People make me so angry. It just means that they're gonna pull the same shit in the future. VOTE WITH YOUR WALLETS, SHEEPLE!
You know how to solve the problem: don't buy it. Let it fail and rot
GPU Memory should be
16gb, 20gb, 24gb, 28gb & 32gb
How the fuck could a 5080 POSSIBLY be on par with a 4090, as cut down as it is?! No achievable amount of clock or IPC boosts can do that literal magic... It's just a goddamn 4080 all over again
I still cant get over the 12gb vram config for the rtx5070 ........ i cant! somebody wake me up from this nightmare!
Such a joke. I wanted to buy a 5070, but only with a minimum of 16GB VRAM. We have to wait for Super refreshes at this point, or we'll regret the low VRAM very soon
@@rejectxz AMD has an opportunity here if they can play on par with the 5080 and put 24GB of VRAM on it
@@PopePlatinumBeats They already did that with the XTX; I don't think they can match the 5080 with RDNA4 lacking a high end, unfortunately.
Yeah, they don't want people to buy the 5070... they want people to buy the 5080, and they want to sell some 5090s to idiots like me that go broke to buy a GPU. They want people who buy 5070s to have to upgrade every gen, as punishment for not paying them enough
@@s7r49 feels like it! im just gonna get the 5080 and hope for the best!
The difference in specs between the 5090 and the 5080 is absolutely disgusting behaviour. The 5080 isn't good enough for probably 1299, and the 5090 is way too power hungry and expensive at probably 2k+
I'm more interested if we will finally get full DP 2.0 speeds.
If so, it only took 5 whole years for the implementation...
Never felt the need for it anyway. But curious nonetheless
@@mikelay5360 8k 144hz?
I see it being useful for the new 4K OLED displays that support DP 2.1, which allows for 4K 240Hz WITHOUT DSC (Display Stream Compression)
@Chicken-o5e 8k 144hz needs it too. I want to upgrade my 4k 144hz
There's no reduction in cost per transistor from TSMC this generation. Thus you should expect the price per SM to be the same between generations, with perf/$ coming from frequency and maybe small architectural improvements. So the most probable pricing keeps the 40 series refresh pricing from the 5080 on down. And the die for the 5090 was designed from the ground up to be a $1999 card, because at the time that die was designed, the fastest card could carry any price and there would be enough demand for it.
A reasonable expectation is the 5080 at $999, the 5070 Ti at $799, the 5070 at $549, and the 5090 at $1999. Ideally most cards would be $100 cheaper, but Nvidia has pretty much used the same basic multiplier between its manufacturing cost and MSRP since at least the 10 series, if not longer.
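As a rough illustration of that price-per-SM logic: the 4080 Super ships 80 SMs at $999 (confirmed), while the 84 SM figure for the 5080 below is from the leaks, so it's an assumption. Holding price per SM constant lands almost exactly on the same shelf:

$$P_{5080} \approx P_{4080\mathrm{S}} \cdot \frac{\mathrm{SM}_{5080}}{\mathrm{SM}_{4080\mathrm{S}}} = \$999 \cdot \frac{84}{80} \approx \$1049$$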
Is the 50 series based on the same process node this generation? If so, you should still expect price reductions on the TSMC side; as nodes mature, prices tend to go down slightly. It's why, in the past, Nvidia would do budget cards on more mature nodes
@@alexmeek610 Supply and demand; it's not just Nvidia buying. Other industries tend to jump nodes a year or two behind the computer market, and no new nodes means competition for manufacturing slots. There's something stupid like 80+ microchips in a single car these days.
Eh I think you need to tack on at least $100 to all of them except maybe the 5090. Maybe $200. The original 4080 was $1199 on release.
I doubt they undersell themselves from the last gen when costs have gone up.
@@alexmeek610 Except TSMC has increased its prices instead of reducing them, due to recent extra demand for the node. But that change is probably compensated for by yield improvements.
12GB of VRAM is an utter joke for the regular 5070. 12GB on the Ti variant is more than taking the piss. Wtf is going through Jensen's head?
well he knows his consumers are brand brain-washed idiots and is expertly exploiting their biases.
Being worth like $200B
Tbf there's no mention of 12gb on 5070Ti in the leak, but i wouldn't expect Nvidia to not F over consumers at this point
If the 5070 has 16GB, then more than likely the 5070 Ti does too. Now, if it got 18 or maybe even 20GB, that would be solid.
The more you buy, the more you save. That is all 😂😂
After the mining craze GPUs became a license to print money. A 2.5x increase in price or so for the same range of cards.
The fact that they also don't know how much VRAM it has in the leak is troubling, lol. But it's probably kept secret so people will buy the 5070 and not wait for the 5070 Ti.
It’s 16GB VRAM according to Techpowerup. But it’s going to be over $800 making it a big scam.
Going from a 1080 Ti to a 5070 Ti or 4070 Ti. May go AMD if the price is right. Will decide in February/March
so, again, the 50"70" here is actually a '60-class' card. nvidia with their stupid shenanigans, as always.
I like the idea of the 4070 Ti being a specialized card that caters to a dual setup: a 1440p PC monitor, and a TV-style system where entry-level 4K is a thing.
Nvidia doesn't exist to make consumers happy, it exists to make shareholders happy
Any business exists for only one reason, to make money.
@@ololometer536 Exactly. And guess what, people will still buy it anyway, just to flex online.
At this point they're just relying on the "higher number better" crowd to buy their stuff whether it's really better or not
So glad I bought a used 7900xtx for $650 last week. At least it will hold me over until the sh## storm blows over.
Great deal! I just bought a 7900xtx from Micro Center open box for 760. Didn't shop around, just saw it and bought it.
this is exactly why I bought a 4070 ti super last month. Was debating waiting for the 5000 series, but unless you're buying the 5090, this gen looks pretty unappealing.
16GB + 256-bit or gtfo. 😠
It was the exact same disappointment with the 4070 Ti. Which for some reason they decided to release first. Remember, the POS execs at Nvidia wanted to name it "4080 12GB"!
So you just KNEW that the 4070 would've had shittier specs. 😫 Same situation now. When the 70-class has had 256-bit for a DECADE, since the GTX 670. 😠
I HAVE STOPPED BLAMING NVIDIA FOR ALL OF THIS RUBBISH...
THE CONSUMERS ARE TO BLAME...
THERE IS A REASON WHY MICROSOFT AND SONY WENT WITH AMD...
People are so stupid for buying the 4060, because now it's one of the most popular GPUs. Anyone who is at least relatively experienced with PC gaming will know the 4060 is a 4050. Nvidia has gotten away with putting an 07 chip, which is normally used for the 50-class GPU, into a 60-class GPU. This is why I call it the fake 4060: it has the performance of a 4050, but with the 4060 naming scheme and price slapped onto it. Worst of all, the 4060 managed to get this popular with only 8GB of VRAM on a measly 128-bit bus. This proves people really are that stupid and don't bother doing any research before buying a GPU. Hopefully, the 5060 doesn't suffer from the same problem, and it actually ends up using the full 06 chip and not the 07 chip. Fanboys are a major reason why the PC gaming industry constantly gets screwed.
god pls AMD save us make the flagship at least as good as 5070 ti
Doesn't matter; FSR makes AMD cards unusable
@@Dempig Then just don't use it. If it's cheap enough, I wouldn't care. The RX 7000 series wasn't cheap enough. Besides, XeSS is not that bad.
Lol, it's like Nvidia is praising me for buying a 7900xtx an hour ago. Like, yes please, keep reassuring me that not waiting for the 5070 was the right decision. This 3070 I'm about to upgrade from will always hold a special place in my heart though, since it's my first GPU.
Not giving the 5080 a bare minimum of 20 gigs is insane, unless they are planning a 5080 Super or Ti Super with more VRAM. Yeah, I'm going to get a 4080S on Black Friday and just wait for the 6000 series cards. Congratulations, you've priced out the majority of your consumers with bad price to performance
Dumb buy if it's more than $800
@ it’s $900 right now but I’m fine with that
Man, the 7900xtx is on sale for $800 here, and its raster is similar to the 4080S. But in my case, I want a good stream encoder, which Nvidia is way better at...
@ Same, man; the 7900xtx was my first choice, but I'm still going to have to go with the 4080S. $800 is a steal for that card, and apparently it's going down to $788. I just hope the 4080S can get to $890; that would be perfect
AMD is our only hope for the every day gamer that doesn't want to spend a mortgage payment on a video card... They need to have the 8800XT absolutely BLOW their 5070 out of the water.
Sad thing is, there is still no Nvidia consumer video card that costs over $150 to make.
There's several things that don't make sense here:
1.) I thought this was supposed to be a generational improvement, as in, more power efficient? Nope. All these cards are rated at higher watts, meaning pretty much all the performance increase we're getting comes from simply adding more cores and more watts, plus slightly better memory bandwidth (well, the 5090 is almost doubling its bandwidth, but for the rest... about a 20% improvement)
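For reference, memory bandwidth is just bus width times per-pin data rate. Plugging in the leaked 5090 configuration (512-bit, 28 Gbps GDDR7, both rumored) against the 4090's known 384-bit, 21 Gbps GDDR6X shows where that near-doubling comes from:

$$\text{bandwidth} = \frac{\text{bus width}}{8} \times \text{data rate}: \qquad \frac{512}{8} \times 28 = 1792\ \text{GB/s} \quad \text{vs} \quad \frac{384}{8} \times 21 = 1008\ \text{GB/s}$$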
2.) The 5090 (and the 4090 before it) is the only card getting good treatment here. It's got like 40-50% MORE SMs (and RT cores) than the 4090 (which had a lot more than the 3090), whereas the other cards are just getting like 5% more than their previous counterparts. The 4090 was already almost 100% BETTER than the 4080, which was reflected in the cost, I get it, but this 5090 will be literally more than twice as good as the 5080. The 5080 will be 4% better than the 4080/5070 Ti. It's ridiculous.
3.) 12GB ?? On an $800 card?! ARE YOU SERIOUS RIGHT NOW
The only saving grace would be HUGE IPC increases.
The only card technically worth buying is the 5090. Economically, no.
5090 is way too expensive, takes way too much power, will probably melt cables and be a fire hazard like the previous xx90 generations. It makes sense performance-wise, but it's not a card for the regular user. AMD, here I come. Nvidia will disappear from the gaming/PC market.
"The more you buy the more you get" is Ngreedias motto.
You get the planned obsolescence for free
As someone who is buying a new pc next year I will see what amd and intel have in store for mid range
They're pulling an Apple, saying that 12GB of VRAM is just as good as another GPU's 24GB of VRAM.
I can't believe people actually expected Nvidia to add more VRAM to a 70 card. The 4080 is already a cut-in-half 4090, which is hilarious. The only thing I can see happening this gen is Nvidia adding about 3-4 different 5080 models to keep milking the fanboys.
More than 24GB of VRAM? Modded Skyrim; there will probably be someone playing 4K stacked with texture mods, maybe with VR added into the mix
We know why. Nvidia HATES you! They have new customers now, they thought if they kept feeding gamers crap, they’d go away.
But they didn’t! Now Nvidia just gets to fleece consumers forevermore.
And it turns out people LIKE getting fleeced. 😐 The "4060" (which is really a 4030Ti in disguise) is now top of the Steam Hardware Survey. Un-be-fucking-lievable. 😐 Nvidiots keep buying them, signaling to Ngreedia that it's OKAY to charge $300 for a card that should be sub-$100.
Why is Nvidia not leaving how much VRAM to use up to the AIBs to decide? They can limit the power and speed of the core die, but leave the VRAM capacity open.
Do not buy RTX 5090
I will eventually
I started boycotting Nvidia three years ago.
This makes me almost regret buying a PC with an NVIDIA 4070 Super GPU in it recently, just because it's NVIDIA
Depends on the price. Right now a 4070 Ti is $700 and a Super with 16GB of RAM is $740. If NVIDIA releases it at $700 and it beats the 4080 by 10-15%, they may be OK, but I feel that is the minimum acceptable improvement. The problem is tariffs: we get 10-20% added on, and even a $700 card becomes $770-840.
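That last range is just the straight pass-through math, assuming the full tariff lands on the sticker price:

$$\$700 \times 1.10 = \$770, \qquad \$700 \times 1.20 = \$840$$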
Wow, so the 5070 TI is just a 4080. How revolutionary and groundbreaking, Nvidia. Bravo. /s
This is what happens when there isn't any competition: HUGE prices and underwhelming specs/performance. They can charge what they want for a minimum boost in performance. They haven't released yet, I know, BUT I'm calling it early. I have a good feeling they ALL will be more expensive than we want AND disappoint us with the performance numbers.
classic:
reduce the features but don't change the price...
marketing 101...
I believe the highest VRAM usage I've seen was with Hogwarts, reaching 14-16GB depending on resolution, closely followed by Cyberpunk
Funny enough, they are last-gen games. UE5 games don't have many issues with VRAM; they are very memory efficient.
@ hopefully we don’t need more than 20gbs until the 6000 series comes out
LOL. So glad I just bought a 7900 XT for $584 shipped.
They only give 16GB of VRAM to the 5080???
The new Stalker game consumes 22GB of VRAM on the 4090 :o
The only card that has proper VRAM is the 5090.
12GB is not enough for the 5070.
I'm right now holding back on buying a GPU. I wanted to get a 3060 Ti or a 4060,
but then I thought of waiting for the new Nvidia GPUs. But it looks like I'd rather wait for the new AMD and Intel GPUs instead..
I need proper VRAM amounts, even on the low- and medium-end cards
That's 22GB allocated, not used; you get the same thing in Call of Duty games. And besides, Nvidia cards are known to use way less VRAM than AMD's due to optimisation, for example (see the sketch after the list below for how to watch what your card actually holds).
These are the specs I think:
5060 - 8gb
5060 Ti - 8gb
5070 - 12gb
5070 Ti - 12gb
5080 - 16gb
5090 - 28/32gb
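If you want to watch your card's memory while a game runs, here's a minimal sketch using NVIDIA's NVML bindings (the `nvidia-ml-py` package; assumes an NVIDIA GPU and driver are present). One caveat: NVML, like in-game overlays, reports memory reserved by processes, i.e. allocation rather than the true working set, so the practical test of "allocation vs need" is still whether the same game runs stutter-free on a card with less VRAM:

```python
# Minimal VRAM monitor sketch using NVIDIA's NVML bindings.
# Install with: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        # Returns total/free/used VRAM in bytes, as the driver sees it
        # (reserved/allocated memory, not the true working set).
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:5.1f} GiB / {mem.total / 2**30:5.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```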
Don't write off Intel. Battlemage is supposed to trade blows with the 4070.
@@KamWysz nvidia textures also look worse at times because of 'optimization'
The 5070 (Ti) will be a true 60-class performer, with a 70-class nametag, for 80-class money...
Looking at the 5070Ti, I can only think of one thing and that is RIP the 5070.
Given that gen-to-gen uplifts aren't what they used to be, limiting the offered VRAM is Nvidia's guarantee to drive sales of the next refresh/generation.
yeah... but next generation amd.
Gaming in 2024 demands 16GB VRAM minimum. Typical greedy companies. Vote with your dollars
5090, I'm confused? I thought they weren't going to put DP 2.1a on anything, just vanilla DisplayPort?
Friendly reminder: do not buy, to fix the prices. Another friendly reminder: it's always cheaper the generation afterwards.
I blame it on people for buying their brand new products.
Let their sales flop.
Keep your old cards or go for AMD.
hmmm i wonder how AMD's new 8000 series will perform against these 5070's?
Me too. For my needs, my 3070 is still good, but once I eventually want to upgrade I'll definitely be considering getting an amd gpu rather than nvidia
Based on the performance we expect from the mainstream cards out of Nvidia, it will either be a Zen moment that kickstarts a massive market shift, or they will mess it up and market it as 'vs Nvidia', when all they need to do is walk out, do a $299 mic drop, and watch the community lose it and attack Nvidia themselves. Gaming on PC is becoming ridiculously expensive with how demanding games are getting; some cheap and competent cards are what we needed, like, yesterday.
@@kristiandevil I'm happy with my current setup that I built around the 7900 XTX, which can go neck and neck with the 4080 Super WITHOUT upscaling, and tbh I don't really care much for upscaling; some games perform better without it.
@@elysian3623 we shall wait and see then shall we?
I'm willing to bet it will perform similarly to the RTX 5070 but have more VRAM, better AI features, and a better price, at around $500 or lower
So not only does it have less VRAM than a 4070 Ti Super, but it also draws a bit more power; that makes no fkn sense
ugh, sick of Nvidia!
Don't buy any of their products. No one is holding a gun to your head.
I feel like they're missing a card that's a straight replacement in the market for the 4090. I realise a forthcoming 5080Ti could be that card, but it seems weird for NVidia to stop production on the 4090 and then create a lot of demand in the secondhand market. You'd think they'd want to encourage firsthand sales.
Well, I'm happy I found out this is my skip generation. But these GPUs are looking more and more like planned redundancy and upselling; as for why the upselling, I think a lot of the comments have nailed it. Step the name down and you get the real level of the GPU: 80=70, 70=60, etc. And look at the 5090 and its single 12-pin that is going to pull 600W; that sounds like a story waiting to happen. It will be interesting to watch what happens when they get out into the wild.
This is what 90% market share does. Keep going. Buy their stuff with all the money in the world like there is no tomorrow.
Nvidia is letting AMD win the mid tier. The 5080 and up is where they want to be
A 70-class card on a 192-bit bus. You aren't running that thing at high resolutions
vrchat goes over 24
Hard pass. The fanboys can eat that crap up if they want, but 12gb VRAM is minor league level stuff, especially when you consider what the price tag on it will probably be.
And people will STILL buy it; they proved it with the 1650, 3050, 3060 (non-Ti), 4060 and 4070 (non-Ti). The amount of brand loyalty people have with Ngreedia is just astonishing. These people do not care about you!
Imo the 5060 should be 12gb. 5070 should be 16gb. The 5080 should be 20gb and the 5090 should be 32gb.
I've skipped 2 gens so far. Hope AMD delivers this time around. Ain't gonna pay a premium for a 12GB card
From the 5090 to the 5080, it's not a drop, it's a slap in the face
So they are going to be more expensive like always, but without really more performance, because you have to use them at lower settings to avoid the VRAM limit....
They really only care about their server sales, since it's 90% of their revenue.
LLMs can use a lot of VRAM
They are just making it so that people with the 1080 Ti don't have to upgrade. Keep gimping the cards, Nvidia; my 1080 Ti still has a few good years left in her at this rate.
A next-gen 70-class card should be better than the previous 70-class card in every way. But the stupid decision to put 12GB of VRAM on it again makes the 4070 Ti Super the better option
If the 5070 has 12GB of memory, the 5060 will most likely be 8 or 10. I sincerely hope AMD makes the 8600 XT have 12GB of memory just to screw Nvidia over.
I'm super disappointed it's not the generational leap I was hoping it would be. They've dumped all their R&D into these $30,000 AI boards.. gamers seem second class to their overall business right now, and they don't care because they don't have any competition. I'm actually rooting for Intel and AMD to make a comeback and possibly leapfrog them.. just like AMD came along and swiped 70% of Intel's customers these past 2 generations!
Competition is good!
I was going to get back into gaming next year.. but not for more than £1000... I am NOT paying £2000+ for a fucking graphics card. Asus ROG 4090 24GB is £1999 still...
New Nvidia Gpu? It's milking time!
This feels an awful lot like the Kepler launch, especially considering there will be more segments coming in 2025 with the new 50% larger capacity memory chips. The 5090 won't even be the top card when the full stack is done launching.
Hopefully Intel BMG can be good enough to eat into the midrange market. Give me a B770 with 16GB and 4070 Ti performance for $500 over a 5070 with 12GB for $600 any day!
Why ? Because THEY Can. And You will still buy it and praise Them.
I'll buy a card next year to replace my 8-year-old (at that point) 11GB 1080TI - "an elegant weapon for a more civilized age". For 2025 I'm okay with a 16GB card for under £500. For around £600, I would expect 20GB minimum.
The 7900 GRE (4070/4070 super equivalent) with 16GB for £500 was released almost a year ago, so if NVIDIA doesn't offer something noticeably better next year, I'll just wait for AMD and their new cards.
I own a 3070 Ti. Hot, expensive, and slow (less than 10% faster than the 3070). The 4070 Ti was also super disappointing: almost the lowest price-performance of the 40xx generation! The idea of an xx70 Ti card is to squeeze extra blood out of the stone which is the xx70 card. So I think that ALL xx70 Ti CARDS ARE DISAPPOINTING. There is no counterexample yet!
12GB of VRAM is not enough if you play with ultra textures and frame generation enabled
If these leaks are correct, it seems the gaming laptops will be very disappointing, and they won't match the performance of last generation's desktop GPUs like they have done for the last 3 or so generations.
Asking why at this point is kind of silly. It's Nvidia.
I knew Nvidia wasn't going to focus on increasing the performance of the RTX 5070. After all, that's the graphics card most PC gamers are interested in.
Looks veeeery much like my next card is gonna be AMD, was heading in that direction already.
Audio sidenote: I suspect your noise filter is skewing the voice quality (which often happens with deep voices). Realtime voice mod software would fix that, and there are plenty of free options around. Or noise panels on the ceiling, if you want a low-tech fix.
Do not get the 12GB 5070 or the 16GB 5080. Games are already struggling with VRAM, exceeding 16GB in 4K, and next-gen ray tracing and frame gen will use up even more VRAM. These GPUs are not a good buy, because they will be outdated too quickly.
There was a rumour that the 5090 is gonna be around 2000 dollars, and people are complaining about the price of the PS5 Pro 🤔