9070 XT is not a terrible name though. It makes it really obvious which Nvidia graphics cards AMD thinks it's comparable to, ones which also have 70 in the naming scheme. It's better than naming a 700 XT class product as the 800 XT class product, like they did with the 7800 XT.
Personally just bought a pc with 9800x3d with rtx 4080 super, I never had anything that’s even close to it, I’ll not upgrade forever. Enjoy your 50s tho if you can buy
@@DempigVery true, I’d like to see another company be able to compete with nvida at the higher end cards. Last generation’s AMD high end cards were plagued with driver issues.
@@baconfacegamer792 Yep I made the mistake of trying AMD last gen with a 6950xt. Constant driver crashes in dx12 , green screen crashes, and very poor FSR upscaling made certain I won't buy AMD again
@@joefield5217idk where here is. I’m in the US, where the msrp of the 4080 super is 1000. I’m not talking about the price that you will most likely buy at, I’m talking about msrp.
@@SMGJohnLike I said, those leaks are most likely on purpose to get you to accept 1000-1200. But hey we will know in like 36 hours so who knows maybe I’ll be completely wrong. We will just have to wait and see
Extremely underwhelming lol the 3070 was right around a 2080ti which was the previous gens flagship card. The 4070 was a massive fail, they corrected it with the 4070 super a year later. If the 5070 really is only 4070ti performance then nvidia is just repeating what the did with the 40 series. Better off waiting for the refresh...better yet...buy the 9070xt if it actually lives up to the hype and is $100 below the 5070
@@Sp3cialk304 yeah lol. I find it so hypocritic that people accept 7800xt being very similar in performance to 6800xt but have problem with Nvidia being only slightly faster.
@@bimsbarkas I bought a GTX 1080ti for £650 in 2017. 4090 came out in 2023 for like £2000. A casual 200% increase in 6 years. Super sustainable and consumer friendly. PC gaming sucks now and it's entirely due to AMD and Nvidia GPUs sucking in terms of pricing/price-to-performane
It doesn't "need" it, would be nice but we know it isn't happening. As far as NVIDIA is concerned It only needs to sell. NVIDIA's arrogance has eclipsed Apple's at this point. They need their cards to rot on shelves if we want them to reset their product approach. That is just not going to happen.
RX 6800 owner here, I got mine for $380 a little over a year ago, I play at 1440p and try for above 100 FPS native in whatever game, but if I have to turn on upscaling I will. I'm not playing the most ultra-modern and demanding games, but I'm not playing half life either, if you know what I mean. I just got done playing the RE4 Remake with the textures cranked, no raytracing, quality upscaling and I had a very smooth experience throughout the game, didn't turn on the FPS counter for an exact number, but it was using about 10-12 GB. seeing that, I am comforted in my choice, 16 GB should be the bare minimum for anything above $500, if I'm paying $800+ (which I will not be, I'd buy an OLED monitor before spending that much on a graphics card), I'm gonna need 20 GB of VRAM.
You are one of the very few smart consumers in the comments here in regards to spending. You will see a drastic improvement buying an OLED monitor than a GPU that only gives 20% better performance. Especially coming from a 1080p TN monitor that was $150. These FOMO gamers will buy a 4080 and then still stick to a 1080p TN monitor they had 8 years ago...
Also have an RX 6800 but I play at 4K FSR Quality. In some demanding games it is not enough to hit 60 fps at High Settings. I bought it for 400 USD 1.5 years ago. I want to upgrade but I dont want to spend more than 450 USD. I am screwed...
I dearly miss my 6800XT 16 GB. It was very good for VR, too, but I had to go to Nvidia because I now own a few Pimax headsets. I can feel the pinch with that measly 10 GB. Newer 80 series cards are bonkers and more expensive.
@xPhantomxify Some sold their 2 year old hardware for the same price they bought it to mitigate cost. The biggest sheep are the ones egging on scalper trolls pushing sky is the limit pricing!
16gb in a 5080 just feels too low and they're only doing it so they can add a TI/Super card later with a bit more vram for an even higher price. Should be: 5050 - 8 GB 5060 - 12 GB 5070 - 16 GB 5080 - 24 GB 5090 - 32 GB
Lot of ignorance/misunderstanding spread by these Unboxed guys. Games don't require as much VRAM as they claim already, but they're also adding new texture compression technology to the 5000 series cards so they'll use less VRAM than previous generations.
@@StaceyJensenn it's not about fanboys, it's just market right now. Companies are working hard with psychologists and already mastered perfect marketing and selling techniques that are simply manipulating human mind and convincing any narrative they want to sell. Look at apple, they sell premium device for ridiculous price, then they sell some mid tier for not much less. Now that super expensive option does not look that expensive if you see that twice as worse product is just not that much cheaper. Better to stash more and get that premium. And people that are really strict on budget but very focused on buying will buy that overpriced mid tier anyway. Jensen have the same strategy now. Just look at steam chart and GPU ranking, how on earth 4090 is in so high position? Because people thinking they can afford 4080 they will just stash a little more and buy 4090. It will be the same with 50 series. They will but ridiculous pricing on 5090 and then adjust all other prices to that premium. People are willing to pay more regardless. Amd is not saint either, their options are not that cheaper than Nvidia, so if people can choose inferior product for a little less, it's better to buy full product for a little more. People crying about RT is useless, but use it anyway because there is an option so it must be turned on.
I want nvidia for the best upscaling,ray tracing performance and their rtx HDR, only thing I dont like is the price 😂. If AMD can tick all those boxes for me then Ill think about them.
@@saiyaman9000DLSS is fake improvement, ray tracing doesn't improve graphic. Real graphic improvement and performance are raw power as well as more polygons on screen. Lightning is just a strawberry on the cake
For 1000$, i expect max settings at 4k on any currently released title. That is not possible with 16GB and thus is unacceptable for the 5080. The 5080 is a 700$ gpu. No more. And that statement is ignoring the 70class specs of the die...
Just play medium settings. Max settings and RT are gimmicks with low returns and high costs. Medium looks just as good and you increase your GPU longevity by x3. Medium settings only uses 6GB VRAM at most.
@@xPhantomxifyNo... not if I payed 1000$ or more. We are allowed to have high expectations when the price is high. Playing on medium settings on a 1000$+ gpu at the beginning of the lifecycle is simply unacceptable. with every price hike that nvidia pushes, performance expectations also go up. I do not accept a flat price hike without a performance increase and more importantly a price/performance increase.
@@i_grok_u2902if enough people cry about it, things do change. Look at the original 4080 launch. That product was left on the shelf and with the "super" launch, they were forced to lower the price point by 200$. If everyone thinks like you, then no nothing ever will change. If people think like me, nvidia will have a very difficult launch and will be forced to drop prices. You take your pick if you want lower prices. if you want things to change, change starts with you.
The 5090 will be the best GPU money can buy, they will charge whatever they want. What most consumers should be worried about is the price per performance of the 5080 and lower.
@@alm31 If there are only 2 cards available on release day I would expect it to sell out. No one, yet, has required a minimum amount of a product to qualify as a 'sellout'.
Honestly you'd be surprised at how naming is important to the average consumer. Even take the Wii and then the wiiu and what impact that had. Consumers are not the smartest bunch
Honestly it's not even about intelligence, its about most people simply not caring enough to research absolutely everything coming out, and can you blame them? I'm extremely familiar with tech (I'm here) and even I didn't realise intel had what is essentially the 15th CPU generation, because their new unnecessary name change made me think they were some sort of mobile or server CPUs...
I miss when you could get a XX80 card, brand new, for just $599-699. High end PC gaming was really affordable to get into, just a couple of years ago. But now, I'm not sure if I gonna bother with upgrading or making a new build anymore. If people didn't pay these insane prices, and the cards would be left rotting on the shelves, then Nvidia would be forced to lower the prices. But nope, even if they charge $5000 for the RTX6090, people will still buy them. 😂 99% fake frames, will make it worth the money, right? 😂
I've been pc gaming since 2003 and console gaming since 91.( I'm 37) I'm fed up with Nvidia and their BS gimmicks. I don't want upscale, fake frames. I just want to play my games at native resolution. Overpriced crap
Well 350 € in 2013 are about 550 € in 2025 (adjusted with historic inflation data). Today here in the EU/Germany the RTX 4070 is about 550 €, thx RX 7800 XT about 490 € (depends on the region) incl. VAT and so on. So, which prices are now outrages? Using your own data. Remember to be able to play in 1080p, all you need is an Intel B580 or equivalent card, for about 350 € more or less.
@@ThePeperich 350€ in 2014 is 445€ today, not 550€. RTX 4070 launch price was 650€. So lets say 450€, compared to 650€ for the same class of GPU. That's a 44% increase.
@ could be that I used the wrong data (or the wrong public inflation calculator). Which percentages used differ (total, average, electronics and so on). Nevertheless, the „classes“ were changed by Nvidia. The 9„70“class is in today‘s naming not to be compared to a 4070. All names/numbers are moved one to the left, so once 80 is now 70 and so on; may be even smaller. As said in the video, naming is not important, only what a card can do is (so which games at which resolution/setting) at which price. Btw Nvidia is not a cheap one and has huge markups - so profits. The main problem is, that people compare apples and oranges. Once usable for more or less 1080p, compared to 2160p („4K“) and so on - which is incomparable.
@@ThePeperich The classes have not been "moved to the left", now you're just pulling stuff out of your butt. If anything they have made the product skews worse. They tried to launch a RTX 4080 12 GB, but was forced to pull it because of the outrage and later relaunched it as the RTX 4070 Ti. ua-cam.com/video/jKmmugnOEME/v-deo.html
Knowing AMD, they'll release the 9070 @ 800 USD and it'll be a DOA product. Edit: many of you guys are optimistic, that gives me hope. Let's just wait and see I guess.
You are just making things up at this point to tell yourself not to buy AMD... Just say you are a Nvidia fanboy and weak minded FOMO consumer and get done with it.
My guess is that the change of name is to make consumers compare models with Nvidia easier, but it's going to bite them in the ass (according to the leaks) when the 5070 beats the 9070 and the 5070 Ti beats the 9070 XT.
Knowing Nvidia, it would be strange if they beat their own 4070 Ti Super in price/performance. Edit: $50 less for the 5070 Ti, actually good if the MSRP matchs the real price of the cheap models.
Very unlikely. It's why I got it last month, also in fear of lack of stock and the fact I had no GPU. It's already been painful enough to try and get a 9800x3d
Heh, recently sold my 4090 for more than I paid 2 years ago and got me a 4070 Ti Super. Made almost 1.000€ 'profit'. Figured the 4090 will lose way more of its value, let's see if the gamble was the right move.
You cant blame nvidia when consumers are the ones that eats up whatever nvidia sells. Nvidia keeps hiking up the prices and they still sell out regardless. If everyone actually had standards and held out on buying these cards at ludicrous prices maybe we could see affordable cards again.
The 5080 is nothing but Nvidia's big carrot for the 5090. The only reason for the hug gap in specs between the two and they know very well that we all want the best of the best.
Which benchmarks did you see to make this kind of claim? Who has leaked 50 series cards? Also, I for one have never ever been interested in the "best of the best". I prefer value for money.
@@xPhantomxifynot everyone cares about value though. People with no budget certainly don’t. That’s why the top-end products of virtually any hobby are priced ridiculously
Fanboys will still defend Nvidia for gimping the hardware for their gaming graphics cards, except for the flagship, and still charge absurd prices for them. It's honestly just sickening. Fanboys have been totally brainwashed by Nvidia. They believe every single Nvidia product is perfect and should not receive any criticism at all. Nvidia is full of fake shit nowadays. Nvidia is replacing hardware and raw performance with gimmicky software features. I don't give two fucks about these gimmicky, overrated AI technologies. Just give me more raw performance and hardware. AMD has done shady things as well, but most of it has come from Nvidia. Remember the 4080 12GB? Anyone who was at least relatively experienced with PC gaming knew that was the 4070. They unlaunched it and renamed it as a 4070 Ti for $100 cheaper. Nvidia keeps naming its products the tier above what they should be to try and trick consumers into thinking they are getting a better product. Nvidia is making consumers pay more for less, and idiot fanboys still defend Nvidia.
Nvidia is really the only choice. Yes, they are overpriced, purposely neutered, and cut down garbage. They get away with it because of zero competition. Everyone wants Nvidia products because they are the best. Everyone just wants them to be cheaper. If Nvidia priced their GPUs cheaper, AMD wouldn't sell a single gpu, and Nvidia would be declared a monopoly. This is what Nvidia wants to avoid at all costs.
lol Just buy their company shares to offset the cost of buying their products. Since their share price goes up when they make big profits, use that to fund the purchase of the GPU.
Like AMD fanboys aren't doing the same thing? I just saw some comment in a different video saying the Rx 9070xt will be faster than the RTX 5080 and costs like $500! Lol so delusional
Hopefully nvidia doesn't continue launching 8 gb vram gpus in 2025... even in 1080p some games are barely under that. At least 10 gb is required at this point. Edit: I would like to mention that if your card is capable of handling higher settings fps wise but is unable to do it because of its vram limitation, that's where the problem starts. If your card is never gonna be able to use its 8 gb of vram because it isn't capable of producing playable fps on the settings where it DOES end up using all 8 gb of vram, then that doesn't matter. So basically if you are buying a graphics card in 2025 with 8 gb of vram UNDER 200 US DOLLARS, don't do it. You will have to turn down a lot of settings even when your card seems capable of handling more, and that is something I know many people will not want to do.
@@xPhantomxify It is dumb to assume a 350+ Euro GPU shouldn't be capable of maxing out textures at 1080p. The 2016 1060 released with 6GB of Vram back in the day. The same amount the 980ti (top of the line GPU) had the previous generation. Six gigs were plenty back in the day. It is not okay to sell a GPU in 2025 with 8 gigs of Vram if you charge above 200 bucks. No mental gymnastics will change that.
@@MichaeltheORIGINAL1 There is no need to max textures when the returns are not noticeable enough for the cost. Nvidia shifted its focus to AI bros, because they pay any price regardless. Until that changes, there is no use screaming in the void. Its smarter to play medium settings, save VRAM, and get better longevity out of your GPU.
@@MichaeltheORIGINAL1Games are simply becoming much more detailed and demanding than what GPUs can offer. The focus should be on creating a good and fun game instead of these UE5 dime a dozen slop
I have a big issue with AMD changing the naming, and yes while it's not a big issue for reviewers or those in the know that skews your perspective. I can tell you that people who are not that knowledgeable are familiar with Nvidia since it's been consistent and mostly linear (mostly as in 9X0 to 10X0 series). AMD went from the RX 400/500, Radeon VII, Vega 56/64, RX 5000/6000/7000 and when it seemed like we had consistent naming it's switched up yet again. I would know how confusing it is, because that confused person was me a few short years back. Steve from GN put his 2 cents in and I can't agree enough, it's time to pick a naming system and stick with it.
A completely new naming scheme is better than using the same scheme in an inconsistent way. Naming the replacement for the 6700 XT as the 7800 XT was a big marketing fail, because it made reviewers as well as consumers compare it more against the 6800 XT, when the launch price was actually slightly LOWER than the launch price of the 6700 XT if you adjust for inflation, and it was definitely not a replacement for the 6800 XT. Copying Nvidia's naming is a perfectly good way of making it easier for less knoweledgeable consumers to understand what Nvidia cards the AMD cards are most closely comparable to, but some people just don't like change, and find it annoying when the same system isn't adhered to every generation. I find it annoying too, but not THAT annoying. They can call it the Donkey F***er Ai 1000QRL XTX and I will still buy it if it's a good value product.
20% is still not worth the upgrade. In my personal book I only upgrade for 2X performance. Given a 30 FPS game as an example. 10% = 33 FPS. 20% = 36 FPS . 30% = 39 FPS (at least it makes sense to upgrade at this point). But I personally prefer 2X , 200% , so you get 60 FPS.
At least 2 Generations before upgrade, ideally 3 generations, I have RTX 2060 and am planning on buying next Gen cards around 500-600$ range, would be very nice upgrade, anyone who upgrades every generation if it’s for anything other than Professional stuffs is just dumb imo
16GB definitely isn't enough for a 1000+ dollar card anymore. Indiana Jones, which is possibly the best optimized game which also has heavy ray tracing, requires more than 16GB of vram to run with maxed out settings. The RTX 4080 has enough GPU compute performance to max out the settings, but it runs out of vram. Of course, you can tweak the settings and still get a very good experience, but it's a bit unfortunate, especially for anybody who spent 1200 or more on one. If the ray tracing advantage is supposed to matter so much, then you need to have lots of vram to be able to actually do it properly. 16GB for a 1200 or 1400 dollar RTX 5080 sounds like a terrible value for a gaming system. The 7900 XTX has been regularly available for around 800 USD, or even lower, in the US and at least some other countries. I've seen the 7900 XT as low as 566 USD. 16GB isn't even that impressive anymore even at 500 USD, and is starting to be a bit on the low side even for an 800 dollar graphics card.
You don't have to play with max settings. Medium settings looks just as good and only uses 6GB VRAM at most. High/ultra/RT are gimmicks with low returns and high costs. If you go with the mindset of "needing" to play every game in max settings, you will have to upgrade your GPU every 2 - 3 years...
@@xPhantomxifymate, if you're paying over $1,000 you'd expect to be able to max texture settings. That's the point people are making. 16gb is fine on a cheaper card, but if you're paying that much you shouldn't have to worry about turning some settings because of lack of vram
And it especially matters when you use frame generation. Those 1% lows really tank the current 4080 in 1440p and 4k max settings just due to the 16GB of VRAM.
@@PCgamingbenchmarktime Kinda, Nvidia definitely need to stop skimping on the VRAM. But AAA game hardware requirements have increased much faster than hardware performance, it feels like. Games don't look that much better than they did some years ago where games could look pretty impressive and run fine on 8GB cards. Nvidia is definitely being greedy assholes with the VRAM, there's no denying that, but at the same time, it feels like game optimization is a relic of the past and it's all about slapping as much eye candy and fancy features on it as possible. 1440p/4K gaming getting more common doesn't help either, of course. Back when Minecraft was at it's peak, it ran on a potato but you could make it run like shit on a capable PC by adding shaders and photo realistic textures. I blame the game devs as much as i blame Nvidia. Devs need to optimize their games and Nvidia need to stop being assholes. Probably won't happen, though lol
The biggest problem with the 5080 isn't the price, it's the 16GB at that price. I'll wait for the 24GB 5080 Super to upgrade. I think a lot more people will do the same.
I really want to upgrade now but seeing the 16gb and how I got fucked already with an 8gb GPU I will try to wait also. It would feel so bad to drop 1500€ for a GPU and not being able to max out games due to VRAM.
5080 with 24gb vram will be absurdly expensive because of AI, people will see it as an alternative to the 5090. 16gb is probably a point where it's still mostly for gamers
16gb Vram for the second high end gpu on the market is an insult, its not just bad. Lets analyse it from another perspective. The real gaming platform is console gaming. Since PS5 appeared it had 16gb vram. All developers started making games that gradually increased the necessary vram from 8gb to 16gb. And 16gb of vram will be the standard until PS6 will arrive in probably 4 years. But we are speaking of PC gaming, which should be better. We are speaking about a high end gpu which alone costs almost 2 times the price of a gaming console. Of course it should be much, much better in every conceivable aspect. Otherwise a gamer should just buy a PS5 pro and minipc or a cheap desktop, or a laptop without discrete gpu for normal PC ussage such as browsing, office, multimedia and such.
_"Since PS5 appeared it had 16gb vram..."_ LOL! The PS5 has 16GB of *SHARED* DRAM, of which the base Kernel takes about 2GB, so you're left with 14GB for the game engine *and* VRAM allocation. The delusional comments are just hilarious!
That's not the case at all, the VRAM on the PS5 and Xbox are not dedicated, it's more like an even split down the middle when it comes to resource allocation, the CPU in most games takes about 8 gigs of memory and 8 is dedicated to the GPU, nobody is developing for a known 16GB of VRAM limitation, you only can get this kind of utilization on PC if you are allowing the game to load in with details or resolution above the consoles limitations. For example, God of War Ragnarök is a native PS5 title, and it happily will run with 6GB of VRAM on a dedicated GPU on PC, this is a first party title and should leverage everything the PS5 is capable of. Nothing on PS5 or Xbox even comes remotely close to allocating full system memory for GPU alone, and I don't think the design documents even allow this. Hell, even the PS5 pro only gains 2GB of CPU memory so it can allocate just a little more to video, 10GB of VRAM (the max a running PS5 Pro can use) is still below the 12GB threshold of your middle of the road gaming GPU, not to mention the XX80 class GPUs are orders of magnitude faster. the PS5 Pro makes about 16.7 Tflops in raw computational performance, a 4080 Super makes 49 Tflops.
Ps5 can have 200000 gb vram won't make up for its dogshit performance. Some games are upscaled from as low as 420p to get 60 fps. And the graphics are equivalent to medium settings on pc. It's trash.
The lack of VRAM alone should be enough of a reason NOT to buy Nvidia cards. The fact that prices are going to be astronomical as well should make you feel like a fool if you want one of them...
The low VRAM is planned obsolescence by nVidia. That way they can sell the next gen with a bit more RAM and performance and still charge a lot to force an upgrade.
A 5070 Ti with 16GB of VRAM and a 20-30% performance increase of the 4070 TiS is absolutely fine if it isnt to expensive. You dont need the 5060 to be better than the 4090 just because its one generation further. Doesnt make sense imo. The 5080 with 16GB is totally underwhelming yeah, this one should have become a 20 or 24GB card for sure.
In 2012 i got an gtx680 2gb for 500€ and i thought thats just mental. But today 500 for an 80 class card would be a bargain, the price hike is insane..
I'm expecting a modest uplift in performance, coupled with a pretty steep price bump. Nvidia at least has already shown their cards here. They stopped production on the 4000 series a while back, so there won't be a price comparison issue with the old cards since you won't be able to find them. If you're buying used that's a different story, but they're just looking at the new in box retail market. And regardless of how insane the pricing is I expect they will sell out a few minutes after they go live.
I already called out the 3080 as being underpowered re. vram before it even launched and I was right. Games are already using in excess of 16gb now. Nvidia think gamers are stupid.
@@grahamt19781Only if you are playing in max settings in a tiny few games like Indiana Jones. Medium settings only uses 6GB VRAM at most and looks just as good. 95% of games are not that demanding. 95% of gamers play easy to run competitive games and indie games. Or at most double A games like Warhammer 2, Helldivers 2.
@@grahamt19781 _"Games are already using in excess of 16gb now..."_ Complete and utter bullshit. No game "uses" more than 16GB or VRAM; it may *allocate* more, but it doesn't *use/require* it. The PS5 Pro _still_ has 16GB *shared* DRAM and if you think PC ports need more than that, you are delusional...
Great video guys. I bought my 1080ti for $700.00 back in 2018 and I still haven't found a good reason to upgrade since it works fine at 1080p. Gpu prices are insane and I can't believe they are still trotting out gpu's with only 8gb's of ram for over $400! I was tempted by the 7800xt this generation and bought one for my son, but my old eyes can't tell any difference. I need more bang for my buck. 😛
24:00 The average user probably has a GPU lifetime of 5-6 years. If I spend 1000+$ on a GPU I certainly want it to last more than 2 years. As always nobody here has a crystal ball but to say that the GPU is justified if it lasts 2 years is a very shortsighted view in my opinion.
Pretty sure he's talking about the manufacturers production cycle and having cards relevant to the top end gaming requirements during those periods for the purpose of sales, rather than how soon you will update.
I had decided to get a 5080 before i bought my 3080 when it released, and skipping the 40 series was really easy, it now seems i will end up skipping the 50 series aswell.
I'm on a 3080. If the 9070xt is 4080 performance for $499 that's what I'll be upgrading too, back up plan is a second hand 4080 when they drop on the market after the announcements. These new cards are useless if the prices are increasing again. Two generations with price increases, that's not a good sign
@awebuser5914 yeah.....that's a 40% uplift from the 7800xt. That's not unreasonable to expect. Many cards have had similar uplifts for the same price. The 2080 to 3080 was the same price with a 50% uplift. Don't act like this is impossible, If Amd is serious about taking low to midrange marketshare like they said that's the performance and price they need to hit for them to achieve that. The 5070 will be $600 and probably not too far off that performance target. It will probably still beat the 9070xt in RT, the 70xt around beat it in raster. If not...what's the point of Amd's card, it will just be another dud. AMD announcing their cards before nvidia could mean that they're confident with what they're putting out. That or they're delusional and they price it too high and/or the performance isn't where it needs to be. But considering they've copied the naming scheme of nvidia....it would be pretty stupid if they can't come close to the 5070 lol, and they need to be $100 below nvidia to stand a chance
The new naming is genius for marketing! It's more intuitive and familiar, which makes people feel more connected to the products, and that will definitely boost sales.
Exactly and also there will be no conflict with naming between gpu and cpu. Also people are lazy and they don’t want to do research to compare amd to nvidia cards and Most don’t even realize that amd cards do raytracing or VR. And always the same bull*** about drivers. Yes it happens. Yes nvidia has long run of driver releases that work, but they had issues also in the past. But the driver fail sticker goes to amd 😂 I had in my 2 year run with 6950xt 2 times issue with new driver release and 3 times had some driver issue. Except that they worked flawless.
Nope, games are as they've always been, they didn't come perfect 10-15 years ago and won't now. You still needed a pretty good for the time GPU for 60+ fps at 1080p ultra settings. The difference is the consoles are a lot closer to PCs in power now and some people just don't accept the performance targets. Like a game could be perfectly optimized, then pump the graphics until it hits all the resolution + 60 fps at max settings combos for all tiers of GPUs but people will just call it unoptimized anyway. Which makes the word lose all meaning. Also I think part of it is people coping with bad AMD purchases because modern resolutions that are fine on Nvidia means they have to use FSR.
RTX4090 already costs a kidney, based on RTX5080 pricing, RTX5090 is gonna cost both balls, a chunk of liver and probably quite a few unimportant organs like ears. By the time RTX9090 comes out, I'll be out of organs to sell.
I really don't know why a gamer is concerning himself over the price of a 4090 and 5090, when its not targeted for them? It will cost you a brain if you keep fretting about it... Stick to the 4070 Super equivalents.
Peeps concerned with AI inferencing/training besides gaming will be more concerned with the amount of and generation of Tensor cores, amount of memory and its bandwidth. Matrix cores (Cuda) is not that of a deal beaker, imo. So yes, give us bang for buck in the tensor department without the sacrifice of organs.
People old enough to remember when the Ati Radeon 9700 series came out to go up against the GeForce FX 5800. We’re expecting the RTX 5080 to come out soon so I’m pretty sure the choice to use 9070 is someone at AMD’s naming department having some fun.😂
Oh god PLEASE let the RX 9070XT come out at an actually reasonable MSRP of 500 or below. If AMD end up releasing another DOA product due to poor pricing, its actually so over for them, and Ngreedia are gonna keep charging 2 kidneys worth for a GPU
Depends on the actual Performance. If its really only 5% slower then a 4080S, like rumors suggest , then 550 or even 600 would be still be a fire Price.
@@rulewski33I agree in theory, but nvidia has such a hold on "mind share" that AMD need to really undercut the price to make a dent in the market share difference.
If 50 series has worse performance than 40 series Steve and Tim needs to use gaming community recovery stim packs. AMD could have named it as Ryzen 4070 for RDNA4
I'm my opinion, I believe AMD had a name change to keep the number similar to their CPU generation number (seeing as they want to unify both brands this makes sense), but needed to switch a 0 around to avoid confusion for the consumer.
@@sollice01 UDNA is a unification of RDNA and CDNA (which is basically just GCN with the occasional RDNA derived feature), not a unification of RDNA and RYZEN. Other than that, I definitely think that the choice to switch around the zero was to avoid collision with the CPUs, but if they had just called it a 8800XT, that wouldn’t happen. My question is, what are they going to do with the switch to UDNA (presumably) immediately after this generation? Are they going to swap over to a 3 digit naming scheme? So I can get an UX 580 xGB in the not so distant future? And that’ll then definitely collide with their Intel naming scheme Laptop parts? And are they going to change the naming scheme on desktop after this 9000 series? Maybe to match the new laptop parts and Intels new naming scheme for all CPUs? Now with even more guarantied naming collisions!
@@levygaming3133 thanks yeah good point. I can't really see why else they would do it, I just assumed they'd jump to a 10700 and a 10070 or maybe just go a new direction.. I suppose time will tell
So, guys, with my 3090 ftw3 24GB Vram, playing games at 1440p monitor and occasionally 4k Oled TV, would the 5080 be an upgrade or will the 16GB Vram be a real limitation in performance?
If you’re an editor, you can upgrade, if you’re a gamer there’s no absolute need to upgrade, the AAA slops that we’ve been getting for the past 15 years doesn’t justify upgrading in any way
I enjoy AAA games. Most people that pretend to hate AAA just have potato PCs that can't play them. Already pre ordered Kingdom Come Deliverance 2 and Final Fantasy Remake pt 2 , they are both better than any game from 15 years ago
the 9070xt is supposed to be just under a 7900xt in performance, and that goes for $700 right now. i would expect them to offer a better price to performance than last gen, so i would think it to be in the $650 range or have much better performance in ray tracing/ AI workloads.
$650 would be absolutely terrible if its that performance level. The 7900xt has been that price before lol. It needs to be closer to the 4080 and a maximum price of $499. If its below the 7900xt it needs to be $449 at the most. No one is buying that for $650 when the 5070 will be $600 and out perform if it really was less than a 7900xt performance wise lol. I don't think those benchmarks were accurate , if they are then Amd has a another dud generation lol
@@PCgamingbenchmarktime i agree, but this is just the way things tend to go. and by that i mean they go poorly for people looking to purchase a video card.
7900 xtx performance for half price $500 is gonna sell like hotcakes especially since Nvidia is getting greedier and people wont even be able to afford 5080 and 5070 likely will be overpriced too.
There is no way that AMD could make a 7900xtx for $500. That is not realistic at all. Latest rumors are that rx 9070xt will match 7900GRE in raster and have 20-30% better RT performance.
@@nidz3876 And those rumors are stupid since the GRE is only 10% faster than the 7800XT. Do you really think 10% is all that AMD manages to achieve, especially when the rumored price range is 500-600$? Common sense tells you that doesn't work out.
Steve with your wood working tools you could make the old classic coffin just for things like this hahaha. Get to it and I expect a 12 part build series hahaha
I agree, 16 GB is not enough already, I recently bought myself a powerful Radeon RX 7600 XT video card, and it turned out to be weak, in 4K at ultra settings it outputs only 30-40 fps, and I was told that this video card, with its 16 GB of memory, will last for 3-4 years!
@@Igor-Bond VRAM isn't everything. The 7600 XT is a Full HD intended card, not a 4K monster. It has enough VRAM, but its core performance isn't too powerful
26:00 - To be honest, I got an RX 7600 XT on launch and I'm happy that it's doing well with the 16 GB in games compared to my previous 8GB low budget card. And I play on 1440p and I'm happy.
We can’t say that people won’t buy AMD over Nvidia at a $50 discount if the cards are equal. Because the cards haven’t been equal in a long, long time, whether that be because of feature set or because of drivers.
They're not even close to equal. A 7900 xtx is worse than a 4070 ti super. It only wins slightly in raster and in vram, but everything else is hot garbage. It's like buying a car with a slightly better engine and slightly more space, but has no heating, no parktronic, no cameras, no electronic windows etc.
Yesterday I attended an RTX50 preheating event in Beijing. On the way, I wanted to ask the staff about the price of the RTX5080, which will be launched this month. He told me that the price is similar to the initial price of the RTX4080, which is equivalent to about RMB 9,000, not the US$1,499 advertised on the Internet.
On a multi monitor setup i'm using 1-1.5gigs of vram just idle with like a youtube video open or a movie playing, which is not factored in when talking about vram. The only scenario in which you have 100% of your 8 or 12 gigs is with one monitor and nothing else open but the game, even steam uses like 300mb of vram. 8 gigs is not enough today, 12 gigs is fine as of NOW but you're not buying a gpu for now (to the dismay of nvidia), you're buying it for a few years so 16-20 is absolutely necessary.
8:10 AMD was one step ahead of us and decided to not reveal RDNA4 GPUs at their keynote, probably so they could wait for Nvidia's pricing and avoid the same song and dance you described :D
$450 for a 9070 XT that is like 10% or so less in performance than 5070 (assuming 5070 matches 4080 performance) does sound good. I think the issue is that FSR 4 is going to make it look faster and they will try to charge more but reviewers are going to find it's not as good as they claim and not really worth the $549 or so they try to charge. Thumbs up.
I don't think we expect the 5070 to match 4080 performance. Instead it will be more like 4070 ti super (or whatever the best 4070 is called). But yea will buy a 450 card with that performance
The RTX 5070 will be RTX 4070 Ti, the RTX 5070 Ti will be closer to RTX 4080. I don’t think FSR 4 is a factor as long as is good, but leaks suggest the bare minimum will be $500 by closer to $600, maybe $550, but for RX 7900 XT to RTX 4080 performance, is not going to be bad.
It's crazy how everyone says "amd better not dare to price their product above 600". Meanwhile Nvidia gets a pass and people will buy it for whatever jensen decides 😂 AMD literally prices better than nvidia and people still complaining and want them to have it priced dirt cheap 😂... Hypocrites
Everyone is complaining about amd because they already know there is no hope with nvidia, nvidia hasn’t released a card worth buying new since the 2000 series
@@obj_obj Even a gimped card like 4060 can mawl through every game even with unnecessary settings turn on. Sometimes it's better not to fix stuff that isn't broke. Normies wouldn't understand.
Well because AMD prices their products just barely below its NVIDIA counterpart but without all the Nvidia features, granted they usually have better raster performance, but still AMD should realise they are always playing catch-up with Nvidia, they need to be aggressive with their pricing or else they will get decimated like always
Amd is pricing themselves slightly cheaper to nvidia while all their features apart from raster are far inferior. They need to be much much cheaper to be option for many. Their shrinking market share seems to agree. I've used amd gpu's for over 10 years now but im not blind to the fact that amd drops the ball 90% of the time.
I think confusing product names are a positive for gpus. When i built my first computer, i was so overwhelmed and confused by gpus. I had no idea what i was buying because it was so hard to follow what was good vs what was bad, what it meant when it was an nvidia msi gpu vs nvidia asus gpu. The confusion makes it impossible to make an informed choice.
In 2022, because of high GPU prices I took up chess and bought a beautiful hand carved set for $1,200. It is still beautiful and played Metro Exodus on my PC. It was a very good year.
@@hyakkimaru1057well the 7000 were already pretty great, didn’t really see how they could get much better than those, I fully expected 9000 series to be a slight upgrade
If Nvidia has any sense, they would price things like they did with the 30 series. I don't think people will be running out to upgrade from the 40 series..and it'll be interesting to see how much of an improvement there even is on the 5080/90 from the 4080/90. I don't think it'll be more than 20-25% better.
I disagree, I think naming is quite important to the "average" consumer. On reddit you still get questions from people who build a pc 10+ years ago and havent paid attention to hardware since, to explain where a 7700xt falls compared to nvidias lineup. On the otherhand I think changing your naming scheme too often is also quite damaging. Think of how often youve heard "i7=good" from people who arent as in the know as we are. Thats only the case because intel stuck with that naming scheme for so long. The same goes for gpus, people are buying 4060s because they bought the xx60 series back when they build their last computer, and it has served them well so far
Unfortunately, the latest rumors are the 5080 MSRP being above $1300, which would effectively kill the product. Nvidia's greed will catch up with them.
a used RX 6800 XT for like 300 - 350 bucks is fine. also a RX 7900 XT new on sale for around 600 bucks. Arc 580 is a good deal for 250 bucks. So ... especially NVIDIA prices are a joke!
@@hassosigbjoernson5738 7900 XT is 750 here. Arc 580 is sold out everywhere. And the ones that are available are 400+. 6800 XT is 450 here on the used market. But its old tech. i want new tech for under 400.
Hey Hardware Unboxed team, I would find it interesting if you had as part of your GPU charts an overview of mean vram usage across you suite for each GPU at each resolution. I currently find it hard to judge how long-term valuable 16GB vs 20GB is. Maybe with a differentiation raytraced and raster only. Just had this idea. GL at CES stay healthy, safe and have fun
As long as the 5090 wipes the floor with everything else then it's worth the price to me. It's not really expensive if you always deal with the 90 series and the Titan series since you can always sell those for a little more than you actually pay for them to pay for the new card.
@@TheEchelon thats what I'm saying either they will lower all the prices... which is definitely not gonna happen,or they will bump up the prices of new stuff and honestly i live in Europe and the prices are definitely not the same stuff is more expensive here so i know ill have to pay more than a 1000 for the new gpus if im stupid and i wont wait and honestly i want to build a pc soon i had to sell my old one last year and i miss playing sometimes in my spare time
I bought the ASRock Steel Legend Intel Arc B580 Dec 13th at Microcenter in Overland Park, KS. It was about $300 USD with tax. It wasn't $249, but rather, $269 for that particular model. Performance has been fantastic for what I paid for the card. If any of you out there can get one, buy one and use it. 1080p performance would be great. I'm using mine on an LG OLED 42" C2, which is 4K.
Also what I noticed about high end GPUs: There's a newish market there with streamers. Those will almost always gravitate towards the high end. It's simply a buisness expense to them and it's more affordable to them than your average gamer. Additionally with the people you talked about, there is now a significant ammount of people who can and will afford those $2000+ GPUs. Normal gamers not lucrative enough for Nvidia anymore
Streamers are the worst thing that could happen to people. Imagine watching and wasting your precious time watching someone play games instead of yourself, give them money on top and form a parasocial relationship with streamers that only look at you as a number and metric. They will of course overspend on GPUs more than what they actually need, since 90% of them play competitive games casually and are not playing in tournaments.
They are an important part of Nvidia's marketing strategy. It's extremely inexpensive marketing when all you have to do is give away one graphics card to get recurring product placement for a couple of years or more... But they didn't even have to pay for it in most cases, at first.
Yep, there are a lot of gamers with plenty of disposable income- we aren't all still living at home, retired. $2k+ is a lot of money, but not enough to put me off if I want it.
Nothing is released yet, hence I am still hoping, but my frustration is growing more and more. I want to upgrade from my RTX 3090 to something more powerful, since it powers my 4K TV. Based on the rumors, I do not see a viable upgrade path: downgrading to 16 GB VRAM while paying more than double the price of the used RTX 3090 that I currently use seems like a bad deal. Paying (more) than three times the price for an RTX 5090 seems just as ridiculous. AMD and Intel won't offer a high-end option. Used RTX 4090 costs more than it did a year ago brand new...
I'm in the exact same boat as you mate. I want to play at 4k with path tracing but my 3090 doesn't cut it without severe compromises. The 5080 vram issue is a major turn off and then the 5090 is just plain stupid in terms of probable price. We could be left without a viable upgrade path until 5080 24gb comes out (assuming it will).
Im still running a 2070 super... yeah it's starting to get real rough... i want a good generation, but im actually leaning on picking up a 7900xt/xtx or equivalent from AMD this year
@JensAllerlei I see what you mean, but I still don't see RT as good as it was promised. The performance hit is still too great. I could be proven wrong with the 50 series though. Dlss is what has kept my 2070 super running tbf, but I don't see the cost as worth it anymore. Fsr has got alot better. Steam os should be round the corner too. Edit: HL2 RTX will probably go hard though 👌
How can you compare frame rates from 8 yrs ago till now. What generational cpu are u pairing up with a 1070 or a 2070 or even a 3070? A 1080 didn't have a 9800x3d as a cpu to back it up.
Of course, Tim, who reviews monitors professionally, is not bothered by terrible product names!
Good one 😂
true lmao
Fair point, he's probably desensitised.
What you mean Q27R53X600LCD4XYQ rev 3r is confusing?
9070 XT is not a terrible name though.
It makes it really obvious which Nvidia graphics cards AMD thinks it's comparable to, ones which also have 70 in the naming scheme.
It's better than naming a 700 XT class product as the 800 XT class product, like they did with the 7800 XT.
Yes, definitely december. No. Stop looking at the calender, it's 100% december still.
I almost want to do the ''who's here on December 2024'' comment, like they post under songs.
@@damdibidum you understood wrong they canceled his Christmas for 2025, increased productivity so much that he's 11 months ahead of schedule. lol
Parallel Universe it's December somewhere 🤪!
Personally just bought a pc with 9800x3d with rtx 4080 super, I never had anything that’s even close to it, I’ll not upgrade forever. Enjoy your 50s tho if you can buy
I've watched a couple of Christmas movies in the last few days and still have one or two left. It's still December to me! 😆
Remember folks, if you keep buying them at these stupid prices they'll keep offering them for these stupid prices.
Doesn’t matter then they won’t make them. Unless you go and crash the AI boom Nvidia won’t ever back down.
I like games. I buy what I need to play those games. AMD cards are unusable for my needs, Nvidia is the only option
@@DempigVery true, I’d like to see another company be able to compete with nvida at the higher end cards. Last generation’s AMD high end cards were plagued with driver issues.
yup
@@baconfacegamer792 Yep I made the mistake of trying AMD last gen with a 6950xt. Constant driver crashes in dx12 , green screen crashes, and very poor FSR upscaling made certain I won't buy AMD again
That one poor dude really thinks 5080 will be 1000$
Edit: i sure am glad i was wrong
It probably will be close to 1000. These leaked prices are just to get you to accept 1000-1200
@@FreightTrain22 The 4080 super is 1300 here. So, I expect the 5080 to be around the same price, if not more due the scalpers.
@@FreightTrain22
All leaks show a price of 1600 euro. Thats horrific if true
@@joefield5217idk where here is. I’m in the US, where the msrp of the 4080 super is 1000. I’m not talking about the price that you will most likely buy at, I’m talking about msrp.
@@SMGJohnLike I said, those leaks are most likely on purpose to get you to accept 1000-1200. But hey we will know in like 36 hours so who knows maybe I’ll be completely wrong. We will just have to wait and see
5070 being 4070ti performance would be underwhelming
Absurdly underwhelming honestly. You can get 5-10% more performance than that overclocking a damn 4070 super
Extremely underwhelming lol the 3070 was right around a 2080ti which was the previous gens flagship card. The 4070 was a massive fail, they corrected it with the 4070 super a year later. If the 5070 really is only 4070ti performance then nvidia is just repeating what the did with the 40 series. Better off waiting for the refresh...better yet...buy the 9070xt if it actually lives up to the hype and is $100 below the 5070
I know that would be like the 7800xt being similar in performance to the 6800xt.
@@Sp3cialk304 yeah lol. I find it so hypocritic that people accept 7800xt being very similar in performance to 6800xt but have problem with Nvidia being only slightly faster.
It’s basically a 5060 named 5070 to upsell
I still remember buying high end cards for 500$… wasn’t even that long ago… WTH happened
I bought a GeForce 4 ti for 350 iirc.
AI
I remember buying a 980ti for 600 dollars and crying about it but enjoying cranking everything up to max.
These prices are ridiculous.
@@jackofsometradesmasterofnoneif you bought a 980ti at launch in 2015 for $600 that would be around $800 today due to inflation!
@@bimsbarkas
I bought a GTX 1080ti for £650 in 2017.
4090 came out in 2023 for like £2000.
A casual 200% increase in 6 years. Super sustainable and consumer friendly.
PC gaming sucks now and it's entirely due to AMD and Nvidia GPUs sucking in terms of pricing/price-to-performane
The RTX5080 needs at least 20GB of memory.
Extremely unlikely.
@@sirius4kNo it's functionally impossible at this stage they're just saying what it should have had
It doesn't "need" it, would be nice but we know it isn't happening. As far as NVIDIA is concerned It only needs to sell. NVIDIA's arrogance has eclipsed Apple's at this point. They need their cards to rot on shelves if we want them to reset their product approach. That is just not going to happen.
@@sirius4k I know, so we wait for the TI/Super. Current architecture + GDDR modules don't allow for that.
@@megageek8509 If its supposed to be a 4K card then it needs it.
RX 6800 owner here, I got mine for $380 a little over a year ago, I play at 1440p and try for above 100 FPS native in whatever game, but if I have to turn on upscaling I will. I'm not playing the most ultra-modern and demanding games, but I'm not playing half life either, if you know what I mean. I just got done playing the RE4 Remake with the textures cranked, no raytracing, quality upscaling and I had a very smooth experience throughout the game, didn't turn on the FPS counter for an exact number, but it was using about 10-12 GB. seeing that, I am comforted in my choice, 16 GB should be the bare minimum for anything above $500, if I'm paying $800+ (which I will not be, I'd buy an OLED monitor before spending that much on a graphics card), I'm gonna need 20 GB of VRAM.
You are one of the very few smart consumers in the comments here in regards to spending. You will see a drastic improvement buying an OLED monitor than a GPU that only gives 20% better performance. Especially coming from a 1080p TN monitor that was $150. These FOMO gamers will buy a 4080 and then still stick to a 1080p TN monitor they had 8 years ago...
Welcome to the club, got my 6800 in 2021 and its still a beast in 1440p...and it has got 16 Gigs of VRAM lol
Smart!!!
Also have an RX 6800 but I play at 4K FSR Quality. In some demanding games it is not enough to hit 60 fps at High Settings. I bought it for 400 USD 1.5 years ago. I want to upgrade but I dont want to spend more than 450 USD. I am screwed...
I dearly miss my 6800XT 16 GB. It was very good for VR, too, but I had to go to Nvidia because I now own a few Pimax headsets. I can feel the pinch with that measly 10 GB. Newer 80 series cards are bonkers and more expensive.
Jensen wasn't happy with charging for just one of your kidneys. He wants both this time.
Only if gamers are dumb enough (which they are) to even think about buying the 5080/5090...
@@xPhantomxify hard agree, idiots will offer what they have still.
This is the most worn out joke ever
@@xPhantomxify people with money will obviously do that. Competition is non existent
@xPhantomxify Some sold their 2 year old hardware for the same price they bought it to mitigate cost. The biggest sheep are the ones egging on scalper trolls pushing sky is the limit pricing!
The 4080 Super was basically a $200 price cut and Mr. Leather Jacket wants that back. So the 5080 MSRP will be the original 4080 MSRP + $200 = $1400.
I love leather
you forgot trump new tarrifs day one . add that also.
Why?
@@StaceyJensenn Tariffs are a bluff to negotiate and a fatal self-inflicted blow if applied.
😂
OMG we're doomed! 😭
16gb in a 5080 just feels too low and they're only doing it so they can add a TI/Super card later with a bit more vram for an even higher price.
Should be:
5050 - 8 GB
5060 - 12 GB
5070 - 16 GB
5080 - 24 GB
5090 - 32 GB
I wish it was this instead of what they do where they add a TI and super model in between and you never no what’s a good price to pay for something.
Welp, that was some wishful thinking...
@ 😔
F nvidia...
F ppl buying these overpriced cards and distorting the market...
F miners...
F greed...
F me for being so "poor"
Everything is correct, you just forgot to add F AMD for doing same shit as nvidia...
F
Mining is very dead right now
@lowzyyy it is now, but the prices will never return to pre mining.
Amen
3060 with 12GB VRAM was released 4 years ago, 5070 having same amount of VRAM in 2025 is Nvidia straight up making fun of their consumers.
First 12 GB GPU I remember was the Titan X and it launched in 2015…. For $999
Then going from 3060 12GB to 4060 8GB is utterly scamming their customers
Lot of ignorance/misunderstanding spread by these Unboxed guys. Games don't require as much VRAM as they claim already, but they're also adding new texture compression technology to the 5000 series cards so they'll use less VRAM than previous generations.
If you change the 00:00 timestamp from "Welcome Back to Hardware Unboxed" to "Intro", it'll split the timeline for easier chapter selection.
RTX 50 series is gonna suck for these prices.
doesnt matter, fanboys will buy day one anyway, and that also doesnt matter for me , it changes nothing
@@StaceyJensenn it's not about fanboys, it's just market right now. Companies are working hard with psychologists and already mastered perfect marketing and selling techniques that are simply manipulating human mind and convincing any narrative they want to sell.
Look at apple, they sell premium device for ridiculous price, then they sell some mid tier for not much less. Now that super expensive option does not look that expensive if you see that twice as worse product is just not that much cheaper. Better to stash more and get that premium. And people that are really strict on budget but very focused on buying will buy that overpriced mid tier anyway.
Jensen have the same strategy now. Just look at steam chart and GPU ranking, how on earth 4090 is in so high position? Because people thinking they can afford 4080 they will just stash a little more and buy 4090.
It will be the same with 50 series. They will but ridiculous pricing on 5090 and then adjust all other prices to that premium. People are willing to pay more regardless.
Amd is not saint either, their options are not that cheaper than Nvidia, so if people can choose inferior product for a little less, it's better to buy full product for a little more. People crying about RT is useless, but use it anyway because there is an option so it must be turned on.
I want nvidia for the best upscaling,ray tracing performance and their rtx HDR, only thing I dont like is the price 😂. If AMD can tick all those boxes for me then Ill think about them.
50, 60, and 70 RTX has been terrible in both value and spec for years now 😂
@@saiyaman9000DLSS is fake improvement, ray tracing doesn't improve graphic. Real graphic improvement and performance are raw power as well as more polygons on screen. Lightning is just a strawberry on the cake
For 1000$, i expect max settings at 4k on any currently released title. That is not possible with 16GB and thus is unacceptable for the 5080. The 5080 is a 700$ gpu. No more. And that statement is ignoring the 70class specs of the die...
Just play medium settings. Max settings and RT are gimmicks with low returns and high costs. Medium looks just as good and you increase your GPU longevity by x3. Medium settings only uses 6GB VRAM at most.
Guess you won't be getting a new card for many years, prices aren't going down. Crying about it isn't going to change a thing, sorry.
@@xPhantomxifyNo... not if I payed 1000$ or more. We are allowed to have high expectations when the price is high. Playing on medium settings on a 1000$+ gpu at the beginning of the lifecycle is simply unacceptable. with every price hike that nvidia pushes, performance expectations also go up. I do not accept a flat price hike without a performance increase and more importantly a price/performance increase.
@@i_grok_u2902if enough people cry about it, things do change. Look at the original 4080 launch. That product was left on the shelf and with the "super" launch, they were forced to lower the price point by 200$. If everyone thinks like you, then no nothing ever will change. If people think like me, nvidia will have a very difficult launch and will be forced to drop prices. You take your pick if you want lower prices. if you want things to change, change starts with you.
@@i_grok_u2902 discussing it is crying? wtf
A 5090 at 3000 would sellout day 1. Just how little faith I have in consumers.
Would it sellout at 5000?
I’m an nvidia representative and I’m trying to gauge the market
For productivity users, for AI modellers, that investment makes sense. For gamers? Nope.
oh it will sell out at msrp, then scalpers will sell in bulk to small companies at that price
The 5090 will be the best GPU money can buy, they will charge whatever they want. What most consumers should be worried about is the price per performance of the 5080 and lower.
@@alm31 If there are only 2 cards available on release day I would expect it to sell out. No one, yet, has required a minimum amount of a product to qualify as a 'sellout'.
Honestly you'd be surprised at how naming is important to the average consumer. Even take the Wii and then the wiiu and what impact that had. Consumers are not the smartest bunch
GTI versus some floppy Korean car name
Honestly it's not even about intelligence, its about most people simply not caring enough to research absolutely everything coming out, and can you blame them?
I'm extremely familiar with tech (I'm here) and even I didn't realise intel had what is essentially the 15th CPU generation, because their new unnecessary name change made me think they were some sort of mobile or server CPUs...
I haven’t seen good evidence that the Wii U failure had anything to do with the name rather than people just not liking the product.
Yeah, you can even slap "Ti" onto an 8 gb RTX 4060 and make it sound like an actual videocard
My friend buys the 4060 purely based on having a previous 60 series gpu
I miss when you could get a XX80 card, brand new, for just $599-699. High end PC gaming was really affordable to get into, just a couple of years ago. But now, I'm not sure if I gonna bother with upgrading or making a new build anymore.
If people didn't pay these insane prices, and the cards would be left rotting on the shelves, then Nvidia would be forced to lower the prices.
But nope, even if they charge $5000 for the RTX6090, people will still buy them. 😂
99% fake frames, will make it worth the money, right? 😂
I've been pc gaming since 2003 and console gaming since 91.( I'm 37) I'm fed up with Nvidia and their BS gimmicks. I don't want upscale, fake frames. I just want to play my games at native resolution. Overpriced crap
Hard to believe that the GTX 970 i bought back in the day was only 350€, GTX 1080 Ti for 850€ and RX 6800 XT 750€. These prices are outrageous.
Well 350 € in 2013 are about 550 € in 2025 (adjusted with historic inflation data). Today here in the EU/Germany the RTX 4070 is about 550 €, thx RX 7800 XT about 490 € (depends on the region) incl. VAT and so on.
So, which prices are now outrages? Using your own data. Remember to be able to play in 1080p, all you need is an Intel B580 or equivalent card, for about 350 € more or less.
@@ThePeperich 350€ in 2014 is 445€ today, not 550€. RTX 4070 launch price was 650€. So lets say 450€, compared to 650€ for the same class of GPU. That's a 44% increase.
@ could be that I used the wrong data (or the wrong public inflation calculator). Which percentages used differ (total, average, electronics and so on).
Nevertheless, the „classes“ were changed by Nvidia. The 9„70“class is in today‘s naming not to be compared to a 4070. All names/numbers are moved one to the left, so once 80 is now 70 and so on; may be even smaller. As said in the video, naming is not important, only what a card can do is (so which games at which resolution/setting) at which price. Btw Nvidia is not a cheap one and has huge markups - so profits.
The main problem is, that people compare apples and oranges. Once usable for more or less 1080p, compared to 2160p („4K“) and so on - which is incomparable.
@@ThePeperich The classes have not been "moved to the left", now you're just pulling stuff out of your butt. If anything they have made the product skews worse. They tried to launch a RTX 4080 12 GB, but was forced to pull it because of the outrage and later relaunched it as the RTX 4070 Ti. ua-cam.com/video/jKmmugnOEME/v-deo.html
@@ThePeperich Literally justifying Nvidia's anti consumer bullshit of shifting everything in the stack down 1 or even two tiers
At this rate my 2070super will only be replaced if it catches on fire 😂
Same here 😭💀
My kid is still running a 2070 non super...looking at a used 6800 for $350 which appears to get near double the fps at 1440p
I had the 2070 super I loved it. Have a 4070 ti currently.
Same for my 2070 super. I'm not paying these ridiculous prices.
You realize you can buy used right?
Knowing AMD, they'll release the 9070 @ 800 USD and it'll be a DOA product.
Edit: many of you guys are optimistic, that gives me hope.
Let's just wait and see I guess.
Yeah, but 2 months later it'll drop to 550$ and be a killer value compared to Nvidia.
@@X2ytthe people who want a top end card won’t wait and will have already bought Nvidia.
that doesn't make ANY sense lmao. it's a 70 class. i see 500-600
@@X2yt More like a year later
You are just making things up at this point to tell yourself not to buy AMD... Just say you are a Nvidia fanboy and weak minded FOMO consumer and get done with it.
My guess is that the change of name is to make consumers compare models with Nvidia easier, but it's going to bite them in the ass (according to the leaks) when the 5070 beats the 9070 and the 5070 Ti beats the 9070 XT.
Knowing Nvidia, it would be strange if they beat their own 4070 Ti Super in price/performance.
Edit: $50 less for the 5070 Ti, actually good if the MSRP matchs the real price of the cheap models.
Very unlikely. It's why I got it last month, also in fear of lack of stock and the fact I had no GPU. It's already been painful enough to try and get a 9800x3d
@@harmony-b4m how much did u spend on the 4070ti super ?
Heh, recently sold my 4090 for more than I paid 2 years ago and got me a 4070 Ti Super. Made almost 1.000€ 'profit'. Figured the 4090 will lose way more of its value, let's see if the gamble was the right move.
Not sure, I still see 30 series gpus going for same price as 40 series here.
@@KarlTheExpertim tempted to do the same with 5090 but I have never sold a part before
0:16 Great December release guys.
Oh yeah they killed it.
It’s not even December anymore
The 3080 launched only 4 years ago with an msrp of $699. Same with the 2080 and 1080 before it. That’s what it should be.
Laughs in market forces
You cant blame nvidia when consumers are the ones that eats up whatever nvidia sells. Nvidia keeps hiking up the prices and they still sell out regardless. If everyone actually had standards and held out on buying these cards at ludicrous prices maybe we could see affordable cards again.
the 2080 was 1500$, i bought it when it released.
Stick to a 3070 or 4070. Don't aim high when your wallet can't. 3080 + cards are overpriced and provide low returns vs the cost
@@enylaaa8851 I bought the 2080 new. It was £639. That's not $1500. That's ridiculous.
The 5080 is nothing but Nvidia's big carrot for the 5090. The only reason for the hug gap in specs between the two and they know very well that we all want the best of the best.
Which benchmarks did you see to make this kind of claim? Who has leaked 50 series cards? Also, I for one have never ever been interested in the "best of the best". I prefer value for money.
Only FOMO small brain gamers want the best of the best, when you can get better value from low to mid range cards...
@@xPhantomxifynot everyone cares about value though. People with no budget certainly don’t. That’s why the top-end products of virtually any hobby are priced ridiculously
Eventually 5080Ti will come out in mid-cycle product refresh when Nvidia accumulates enough partially defective GB202 silicon.
@AutieTortie if you prefer value for money you won’t be buying the 5080 either…
So 1 more day before all the GPU details unravels.
3 days
@@shuma3401 No, both AMD and Nvidia Keynotes are tomorrow. AMD is at 2pm EST and Nvidia at 3.30pm.
Then we gotta wait for gpu reviews 😢
@@shuma3401 1 day
@@saiyaman9000 yup, I think people will Start buying them in early Feb after proper analysis
Fanboys will still defend Nvidia for gimping the hardware for their gaming graphics cards, except for the flagship, and still charge absurd prices for them. It's honestly just sickening. Fanboys have been totally brainwashed by Nvidia. They believe every single Nvidia product is perfect and should not receive any criticism at all. Nvidia is full of fake shit nowadays. Nvidia is replacing hardware and raw performance with gimmicky software features. I don't give two fucks about these gimmicky, overrated AI technologies. Just give me more raw performance and hardware. AMD has done shady things as well, but most of it has come from Nvidia. Remember the 4080 12GB? Anyone who was at least relatively experienced with PC gaming knew that was the 4070. They unlaunched it and renamed it as a 4070 Ti for $100 cheaper. Nvidia keeps naming its products the tier above what they should be to try and trick consumers into thinking they are getting a better product. Nvidia is making consumers pay more for less, and idiot fanboys still defend Nvidia.
Tldr: don't buy luxury products
Nvidia is really the only choice. Yes, they are overpriced, purposely neutered, and cut down garbage. They get away with it because of zero competition. Everyone wants Nvidia products because they are the best. Everyone just wants them to be cheaper. If Nvidia priced their GPUs cheaper, AMD wouldn't sell a single gpu, and Nvidia would be declared a monopoly. This is what Nvidia wants to avoid at all costs.
lol Just buy their company shares to offset the cost of buying their products. Since their share price goes up when they make big profits, use that to fund the purchase of the GPU.
Like AMD fanboys aren't doing the same thing? I just saw some comment in a different video saying the Rx 9070xt will be faster than the RTX 5080 and costs like $500! Lol so delusional
@@adlibconstitution1609 well thats obviously dumb, cause neither card is even launched and tested
Hopefully nvidia doesn't continue launching 8 gb vram gpus in 2025... even in 1080p some games are barely under that. At least 10 gb is required at this point.
Edit: I would like to mention that if your card is capable of handling higher settings fps wise but is unable to do it because of its vram limitation, that's where the problem starts. If your card is never gonna be able to use its 8 gb of vram because it isn't capable of producing playable fps on the settings where it DOES end up using all 8 gb of vram, then that doesn't matter. So basically if you are buying a graphics card in 2025 with 8 gb of vram UNDER 200 US DOLLARS, don't do it. You will have to turn down a lot of settings even when your card seems capable of handling more, and that is something I know many people will not want to do.
Medium settings uses at the very most 6GB VRAM. High/Ultra/RT are gimmicks with low returns and high costs. The VRAM panic is very dumb.
@@xPhantomxify It is dumb to assume a 350+ Euro GPU shouldn't be capable of maxing out textures at 1080p. The 2016 1060 released with 6GB of Vram back in the day. The same amount the 980ti (top of the line GPU) had the previous generation. Six gigs were plenty back in the day. It is not okay to sell a GPU in 2025 with 8 gigs of Vram if you charge above 200 bucks. No mental gymnastics will change that.
I have bad news for you, friend. The 5060 is supposedly 8GB and the 5070 is supposedly 12GB.
@@MichaeltheORIGINAL1 There is no need to max textures when the returns are not noticeable enough for the cost. Nvidia shifted its focus to AI bros, because they pay any price regardless. Until that changes, there is no use screaming in the void. Its smarter to play medium settings, save VRAM, and get better longevity out of your GPU.
@@MichaeltheORIGINAL1Games are simply becoming much more detailed and demanding than what GPUs can offer. The focus should be on creating a good and fun game instead of these UE5 dime a dozen slop
I have a big issue with AMD changing the naming, and yes while it's not a big issue for reviewers or those in the know that skews your perspective. I can tell you that people who are not that knowledgeable are familiar with Nvidia since it's been consistent and mostly linear (mostly as in 9X0 to 10X0 series). AMD went from the RX 400/500, Radeon VII, Vega 56/64, RX 5000/6000/7000 and when it seemed like we had consistent naming it's switched up yet again. I would know how confusing it is, because that confused person was me a few short years back. Steve from GN put his 2 cents in and I can't agree enough, it's time to pick a naming system and stick with it.
Oh really? Is AMD naming going to ruin your life? Poor baby, grow a set
In 2029 Nvidia will also have a 9070 graphics card.
A completely new naming scheme is better than using the same scheme in an inconsistent way.
Naming the replacement for the 6700 XT as the 7800 XT was a big marketing fail, because it made reviewers as well as consumers compare it more against the 6800 XT, when the launch price was actually slightly LOWER than the launch price of the 6700 XT if you adjust for inflation, and it was definitely not a replacement for the 6800 XT.
Copying Nvidia's naming is a perfectly good way of making it easier for less knoweledgeable consumers to understand what Nvidia cards the AMD cards are most closely comparable to, but some people just don't like change, and find it annoying when the same system isn't adhered to every generation. I find it annoying too, but not THAT annoying. They can call it the Donkey F***er Ai 1000QRL XTX and I will still buy it if it's a good value product.
@@syncmonismIf that's the case then AMD will be the biggest selling gpu
@@syncmonism who cares. Bunch of big babies
20% is still not worth the upgrade. In my personal book I only upgrade for 2X performance.
Given a 30 FPS game as an example. 10% = 33 FPS. 20% = 36 FPS . 30% = 39 FPS (at least it makes sense to upgrade at this point). But I personally prefer 2X , 200% , so you get 60 FPS.
2X minimum.
This is the way
I find a 50% improvement to be good enough if the price is also reasonable, but 2x is definitely the point when an upgrade becomes most desirable.
I have rtx 3060 in mafia DE I got 90 fps highest settings 1080p, I upgraded to Rx 7800xt I got 190fps same location.
At least 2 Generations before upgrade, ideally 3 generations, I have RTX 2060 and am planning on buying next Gen cards around 500-600$ range, would be very nice upgrade, anyone who upgrades every generation if it’s for anything other than Professional stuffs is just dumb imo
5070 coming out with 12GB in 2025 is crazy. My 4070 is already running out of VRAM using PT & FG in Cyberpunk 2077.
I'm using roughly 14-15GB on a modded 2077 ran in 4k FSR3 with Raytracing on a 7900xt.
20gb is ideal
Right. Indiana jones you cant play on FUL HD with max settings because Vram is too low.
I love how we've all basically forgotten about Arrow Lake and the Core Ultra -9%
16GB definitely isn't enough for a 1000+ dollar card anymore.
Indiana Jones, which is possibly the best optimized game which also has heavy ray tracing, requires more than 16GB of vram to run with maxed out settings. The RTX 4080 has enough GPU compute performance to max out the settings, but it runs out of vram. Of course, you can tweak the settings and still get a very good experience, but it's a bit unfortunate, especially for anybody who spent 1200 or more on one. If the ray tracing advantage is supposed to matter so much, then you need to have lots of vram to be able to actually do it properly. 16GB for a 1200 or 1400 dollar RTX 5080 sounds like a terrible value for a gaming system.
The 7900 XTX has been regularly available for around 800 USD, or even lower, in the US and at least some other countries. I've seen the 7900 XT as low as 566 USD.
16GB isn't even that impressive anymore even at 500 USD, and is starting to be a bit on the low side even for an 800 dollar graphics card.
You don't have to play with max settings. Medium settings looks just as good and only uses 6GB VRAM at most. High/ultra/RT are gimmicks with low returns and high costs. If you go with the mindset of "needing" to play every game in max settings, you will have to upgrade your GPU every 2 - 3 years...
@@xPhantomxifymate, if you're paying over $1,000 you'd expect to be able to max texture settings. That's the point people are making. 16gb is fine on a cheaper card, but if you're paying that much you shouldn't have to worry about turning some settings because of lack of vram
And it especially matters when you use frame generation. Those 1% lows really tank the current 4080 in 1440p and 4k max settings just due to the 16GB of VRAM.
@@PCgamingbenchmarktime Kinda, Nvidia definitely need to stop skimping on the VRAM. But AAA game hardware requirements have increased much faster than hardware performance, it feels like. Games don't look that much better than they did some years ago where games could look pretty impressive and run fine on 8GB cards. Nvidia is definitely being greedy assholes with the VRAM, there's no denying that, but at the same time, it feels like game optimization is a relic of the past and it's all about slapping as much eye candy and fancy features on it as possible. 1440p/4K gaming getting more common doesn't help either, of course. Back when Minecraft was at it's peak, it ran on a potato but you could make it run like shit on a capable PC by adding shaders and photo realistic textures. I blame the game devs as much as i blame Nvidia. Devs need to optimize their games and Nvidia need to stop being assholes. Probably won't happen, though lol
@@xPhantomxify And you don't have to sell a card for a grand or more. But they do.
The biggest problem with the 5080 isn't the price, it's the 16GB at that price. I'll wait for the 24GB 5080 Super to upgrade. I think a lot more people will do the same.
i bought a 4090 just for the extra ram to future proof the card, ill be skipping the 50 series
@Sleepy.Time. You can wait for rtx 6000, the 4090 is a monster
It won't be a Super, just a 24GB version of the same card. The 5080 is already using the full GB203 die.
I really want to upgrade now but seeing the 16gb and how I got fucked already with an 8gb GPU I will try to wait also. It would feel so bad to drop 1500€ for a GPU and not being able to max out games due to VRAM.
5080 with 24gb vram will be absurdly expensive because of AI, people will see it as an alternative to the 5090. 16gb is probably a point where it's still mostly for gamers
08:11 Don't worry, AMD will be pricing the new 9070 XT so high that they'll "never miss an opportunity to miss an opportunity" 🤣
16gb Vram for the second high end gpu on the market is an insult, its not just bad. Lets analyse it from another perspective. The real gaming platform is console gaming. Since PS5 appeared it had 16gb vram. All developers started making games that gradually increased the necessary vram from 8gb to 16gb. And 16gb of vram will be the standard until PS6 will arrive in probably 4 years. But we are speaking of PC gaming, which should be better. We are speaking about a high end gpu which alone costs almost 2 times the price of a gaming console. Of course it should be much, much better in every conceivable aspect. Otherwise a gamer should just buy a PS5 pro and minipc or a cheap desktop, or a laptop without discrete gpu for normal PC ussage such as browsing, office, multimedia and such.
_"Since PS5 appeared it had 16gb vram..."_ LOL! The PS5 has 16GB of *SHARED* DRAM, of which the base Kernel takes about 2GB, so you're left with 14GB for the game engine *and* VRAM allocation. The delusional comments are just hilarious!
@@awebuser5914 And being shared, it doesn't have to duplicate stuff so the card can have it and the CPU can have it.
@@markhackett2302 _" it doesn't have to duplicate stuff..."_ Please don't comment if you are _completely_ clueless...
That's not the case at all, the VRAM on the PS5 and Xbox are not dedicated, it's more like an even split down the middle when it comes to resource allocation, the CPU in most games takes about 8 gigs of memory and 8 is dedicated to the GPU, nobody is developing for a known 16GB of VRAM limitation, you only can get this kind of utilization on PC if you are allowing the game to load in with details or resolution above the consoles limitations.
For example, God of War Ragnarök is a native PS5 title, and it happily will run with 6GB of VRAM on a dedicated GPU on PC, this is a first party title and should leverage everything the PS5 is capable of.
Nothing on PS5 or Xbox even comes remotely close to allocating full system memory for GPU alone, and I don't think the design documents even allow this.
Hell, even the PS5 pro only gains 2GB of CPU memory so it can allocate just a little more to video, 10GB of VRAM (the max a running PS5 Pro can use) is still below the 12GB threshold of your middle of the road gaming GPU, not to mention the XX80 class GPUs are orders of magnitude faster.
the PS5 Pro makes about 16.7 Tflops in raw computational performance, a 4080 Super makes 49 Tflops.
Ps5 can have 200000 gb vram won't make up for its dogshit performance. Some games are upscaled from as low as 420p to get 60 fps. And the graphics are equivalent to medium settings on pc. It's trash.
The lack of VRAM alone should be enough of a reason NOT to buy Nvidia cards. The fact that prices are going to be astronomical as well should make you feel like a fool if you want one of them...
The low VRAM is planned obsolescence by nVidia. That way they can sell the next gen with a bit more RAM and performance and still charge a lot to force an upgrade.
A 5070 Ti with 16GB of VRAM and a 20-30% performance increase of the 4070 TiS is absolutely fine if it isnt to expensive. You dont need the 5060 to be better than the 4090 just because its one generation further. Doesnt make sense imo. The 5080 with 16GB is totally underwhelming yeah, this one should have become a 20 or 24GB card for sure.
@@svenyproudyeah I’m gonna try and get a 5070 ti and if it ends up being worse than the 4070 tiS at the very least it’s cheaper
In 2012 i got an gtx680 2gb for 500€ and i thought thats just mental. But today 500 for an 80 class card would be a bargain, the price hike is insane..
I'm expecting a modest uplift in performance, coupled with a pretty steep price bump.
Nvidia at least has already shown their cards here. They stopped production on the 4000 series a while back, so there won't be a price comparison issue with the old cards since you won't be able to find them.
If you're buying used that's a different story, but they're just looking at the new in box retail market. And regardless of how insane the pricing is I expect they will sell out a few minutes after they go live.
5080 with 16gb will age worse than 3080 with 10gb.
I already called out the 3080 as being underpowered re. vram before it even launched and I was right. Games are already using in excess of 16gb now. Nvidia think gamers are stupid.
@@grahamt19781Only if you are playing in max settings in a tiny few games like Indiana Jones. Medium settings only uses 6GB VRAM at most and looks just as good. 95% of games are not that demanding. 95% of gamers play easy to run competitive games and indie games. Or at most double A games like Warhammer 2, Helldivers 2.
@@grahamt19781 _"Games are already using in excess of 16gb now..."_ Complete and utter bullshit. No game "uses" more than 16GB or VRAM; it may *allocate* more, but it doesn't *use/require* it. The PS5 Pro _still_ has 16GB *shared* DRAM and if you think PC ports need more than that, you are delusional...
@awebuser5914 you're wrong but ok. Have a good one.
@@xPhantomxify exactly. If you want to max out games with full path tracing on pc like IJ you better have more than 16gb vram. 👍
Great video guys. I bought my 1080ti for $700.00 back in 2018 and I still haven't found a good reason to upgrade since it works fine at 1080p. Gpu prices are insane and I can't believe they are still trotting out gpu's with only 8gb's of ram for over $400! I was tempted by the 7800xt this generation and bought one for my son, but my old eyes can't tell any difference. I need more bang for my buck. 😛
Dlss, frame Generation, much more power. The list is long why you should Upgrade.
Snapped a brand new 7900xt for 600. Awesome deal, awesome performance
I ended up with 7900xtx for $860… i figured this was a good deal. I was contemplating between that and the 7900xt due to price, but I figured why not?
@@Jibarin88 even better
24:00 The average user probably has a GPU lifetime of 5-6 years. If I spend 1000+$ on a GPU I certainly want it to last more than 2 years. As always nobody here has a crystal ball but to say that the GPU is justified if it lasts 2 years is a very shortsighted view in my opinion.
1000 GPU for 24 months. That is 40 a month... Which sounds like stupid amount of money for single component. Not even counting power.
@@_EkarosYou're ignoring resale value though. Old cards will hold it better if new ones are a bad deal.
Pretty sure he's talking about the manufacturers production cycle and having cards relevant to the top end gaming requirements during those periods for the purpose of sales, rather than how soon you will update.
I had decided to get a 5080 before i bought my 3080 when it released, and skipping the 40 series was really easy, it now seems i will end up skipping the 50 series aswell.
I'm on a 3080. If the 9070xt is 4080 performance for $499 that's what I'll be upgrading too, back up plan is a second hand 4080 when they drop on the market after the announcements. These new cards are useless if the prices are increasing again. Two generations with price increases, that's not a good sign
Why not switch to AMD. Do you realy need raytracing ?
@@-jc8578 I mostly need shadowplay, and amd version of that sucks.
@@PCgamingbenchmarktime _"If the 9070xt is 4080 performance for $499..."_ Riiight....
@awebuser5914 yeah.....that's a 40% uplift from the 7800xt. That's not unreasonable to expect. Many cards have had similar uplifts for the same price. The 2080 to 3080 was the same price with a 50% uplift. Don't act like this is impossible, If Amd is serious about taking low to midrange marketshare like they said that's the performance and price they need to hit for them to achieve that. The 5070 will be $600 and probably not too far off that performance target. It will probably still beat the 9070xt in RT, the 70xt around beat it in raster. If not...what's the point of Amd's card, it will just be another dud. AMD announcing their cards before nvidia could mean that they're confident with what they're putting out. That or they're delusional and they price it too high and/or the performance isn't where it needs to be. But considering they've copied the naming scheme of nvidia....it would be pretty stupid if they can't come close to the 5070 lol, and they need to be $100 below nvidia to stand a chance
Install a skylight with a hatch and repel down crushing the box of the product...?
The new naming is genius for marketing! It's more intuitive and familiar, which makes people feel more connected to the products, and that will definitely boost sales.
Agreed. Especially when they’re not aiming for high end anymore. They changed their marketing tactics. Might as well change the name too
Exactly and also there will be no conflict with naming between gpu and cpu. Also people are lazy and they don’t want to do research to compare amd to nvidia cards and Most don’t even realize that amd cards do raytracing or VR. And always the same bull*** about drivers. Yes it happens. Yes nvidia has long run of driver releases that work, but they had issues also in the past. But the driver fail sticker goes to amd 😂 I had in my 2 year run with 6950xt 2 times issue with new driver release and 3 times had some driver issue. Except that they worked flawless.
I think gpu ram depends on how often you upgrade. If you intend to wait 5 years to upgrade, i think id want more than 16.
half the gpu hikes are shockingly bad optimization on new games and its getting worse
Nope, games are as they've always been, they didn't come perfect 10-15 years ago and won't now. You still needed a pretty good for the time GPU for 60+ fps at 1080p ultra settings. The difference is the consoles are a lot closer to PCs in power now and some people just don't accept the performance targets. Like a game could be perfectly optimized, then pump the graphics until it hits all the resolution + 60 fps at max settings combos for all tiers of GPUs but people will just call it unoptimized anyway. Which makes the word lose all meaning.
Also I think part of it is people coping with bad AMD purchases because modern resolutions that are fine on Nvidia means they have to use FSR.
RTX4090 already costs a kidney, based on RTX5080 pricing, RTX5090 is gonna cost both balls, a chunk of liver and probably quite a few unimportant organs like ears. By the time RTX9090 comes out, I'll be out of organs to sell.
its 2500$, 3000€ in europe
I really don't know why a gamer is concerning himself over the price of a 4090 and 5090, when its not targeted for them? It will cost you a brain if you keep fretting about it... Stick to the 4070 Super equivalents.
@@xPhantomxifydidn't mention being a gamer anywhere. Just expressing concern over stupid prices.
Peeps concerned with AI inferencing/training besides gaming will be more concerned with the amount of and generation of Tensor cores, amount of memory and its bandwidth. Matrix cores (Cuda) is not that of a deal beaker, imo. So yes, give us bang for buck in the tensor department without the sacrifice of organs.
who cares, you are not forced to buy it just to have the " oh look internet, i got best gpu" , cause no one cares
People old enough to remember when the Ati Radeon 9700 series came out to go up against the GeForce FX 5800. We’re expecting the RTX 5080 to come out soon so I’m pretty sure the choice to use 9070 is someone at AMD’s naming department having some fun.😂
Oh god PLEASE let the RX 9070XT come out at an actually reasonable MSRP of 500 or below. If AMD end up releasing another DOA product due to poor pricing, its actually so over for them, and Ngreedia are gonna keep charging 2 kidneys worth for a GPU
Depends on the actual Performance. If its really only 5% slower then a 4080S, like rumors suggest , then 550 or even 600 would be still be a fire Price.
@@rulewski33I agree in theory, but nvidia has such a hold on "mind share" that AMD need to really undercut the price to make a dent in the market share difference.
Just confirming this video DIDN'T come out in December :(
my 3060 still alive since pandemic, will upgrade when gpu broken :p
bet Ngreedia is regretting releasing this product to this day
If 50 series has worse performance than 40 series Steve and Tim needs to use gaming community recovery stim packs. AMD could have named it as Ryzen 4070 for RDNA4
Maybe the price in US will be 1200$ but in Denmark we will probably pay around 1600-1700$ for the same product
I'm my opinion, I believe AMD had a name change to keep the number similar to their CPU generation number (seeing as they want to unify both brands this makes sense), but needed to switch a 0 around to avoid confusion for the consumer.
Wow. Interesting opinion. Stupid. But interesting.
@AutieTortie wouldn't that make sense though with udna being their goal?
@@sollice01 UDNA is a unification of RDNA and CDNA (which is basically just GCN with the occasional RDNA derived feature), not a unification of RDNA and RYZEN.
Other than that, I definitely think that the choice to switch around the zero was to avoid collision with the CPUs, but if they had just called it a 8800XT, that wouldn’t happen.
My question is, what are they going to do with the switch to UDNA (presumably) immediately after this generation? Are they going to swap over to a 3 digit naming scheme? So I can get an UX 580 xGB in the not so distant future? And that’ll then definitely collide with their Intel naming scheme Laptop parts? And are they going to change the naming scheme on desktop after this 9000 series? Maybe to match the new laptop parts and Intels new naming scheme for all CPUs? Now with even more guarantied naming collisions!
@@levygaming3133 thanks yeah good point. I can't really see why else they would do it, I just assumed they'd jump to a 10700 and a 10070 or maybe just go a new direction.. I suppose time will tell
@@levygaming3133this just in: they're gonna start naming their graphics cards like asus monitors.
So, guys, with my 3090 ftw3 24GB Vram, playing games at 1440p monitor and occasionally 4k Oled TV, would the 5080 be an upgrade or will the 16GB Vram be a real limitation in performance?
just use a tool to protocol vram usage in your games et voila.
The only option spec-wise is the 5090 but price might be a problem. Forget about 5080 (unless they refresh with a super model with more vram).
@@thehunk1 Why isn't anyone realizing a 5080 with 24 GB would sell for $1500 if not more. The demand for AI is too high.
If you’re an editor, you can upgrade, if you’re a gamer there’s no absolute need to upgrade, the AAA slops that we’ve been getting for the past 15 years doesn’t justify upgrading in any way
Very reasonable and pro-art of you to denounce literally every AAA game since 2010.
@ about 90% of games, garbage, that means ALL
I enjoy AAA games. Most people that pretend to hate AAA just have potato PCs that can't play them. Already pre ordered Kingdom Come Deliverance 2 and Final Fantasy Remake pt 2 , they are both better than any game from 15 years ago
I have a 9800x3d with a 4080 Super lol
If you're an editor you can just wait a few more seconds for whatever it is also 😮
the 9070xt is supposed to be just under a 7900xt in performance, and that goes for $700 right now. i would expect them to offer a better price to performance than last gen, so i would think it to be in the $650 range or have much better performance in ray tracing/ AI workloads.
$650 would be absolutely terrible if its that performance level. The 7900xt has been that price before lol. It needs to be closer to the 4080 and a maximum price of $499. If its below the 7900xt it needs to be $449 at the most. No one is buying that for $650 when the 5070 will be $600 and out perform if it really was less than a 7900xt performance wise lol. I don't think those benchmarks were accurate , if they are then Amd has a another dud generation lol
@@PCgamingbenchmarktime i agree, but this is just the way things tend to go. and by that i mean they go poorly for people looking to purchase a video card.
7900 xtx performance for half price $500 is gonna sell like hotcakes especially since Nvidia is getting greedier and people wont even be able to afford 5080 and 5070 likely will be overpriced too.
it will be more like a 7900 GRE 😉
There is no way that AMD could make a 7900xtx for $500. That is not realistic at all. Latest rumors are that rx 9070xt will match 7900GRE in raster and have 20-30% better RT performance.
Dream on, dreaaam oon.
At best youll get 7900xt performance for 600usd.
@@user-wq9mw2xz3j that doesn't even make sense because 7900 XT is already $600 in Microcenter bundle.
@@nidz3876 And those rumors are stupid since the GRE is only 10% faster than the 7800XT. Do you really think 10% is all that AMD manages to achieve, especially when the rumored price range is 500-600$? Common sense tells you that doesn't work out.
Great breakdown. Very helpful. 👍
i just don't want to spend over a £1000 for a 5080 when my whole computer costs less than a £1000 without the graphics card.
Nvidia's own 3080 was $699 not long ago, for MSRP. This is the problem with a lack of high end competition.
Space marine 2 with 4k textures at 1440p x 3440p using 22gb vram. No 16gb vram is mid range any card priced above 500usd should have 20gb + vram.
Steve with your wood working tools you could make the old classic coffin just for things like this hahaha. Get to it and I expect a 12 part build series hahaha
16GB of VRAM is the *bare* minimum. Anything less is daylight robbery.
I agree, 16 GB is not enough already, I recently bought myself a powerful Radeon RX 7600 XT video card, and it turned out to be weak, in 4K at ultra settings it outputs only 30-40 fps, and I was told that this video card, with its 16 GB of memory, will last for 3-4 years!
Well, its not a robbery until you buy that. People need to stop buying that stuff...
@@Igor-Bond VRAM isn't everything. The 7600 XT is a Full HD intended card, not a 4K monster. It has enough VRAM, but its core performance isn't too powerful
26:00 - To be honest, I got an RX 7600 XT on launch and I'm happy that it's doing well with the 16 GB in games compared to my previous 8GB low budget card. And I play on 1440p and I'm happy.
We can’t say that people won’t buy AMD over Nvidia at a $50 discount if the cards are equal. Because the cards haven’t been equal in a long, long time, whether that be because of feature set or because of drivers.
If they were identical on performance and features, I would still choose Nvidia for £50 extra due to energy efficiency alone.
Are they really that much more efficient though.
@@ZinoAmareyes
They're not even close to equal. A 7900 xtx is worse than a 4070 ti super. It only wins slightly in raster and in vram, but everything else is hot garbage. It's like buying a car with a slightly better engine and slightly more space, but has no heating, no parktronic, no cameras, no electronic windows etc.
@@kerkertrandov459 Everything else? You mean non-gaming related?
Yesterday I attended an RTX50 preheating event in Beijing. On the way, I wanted to ask the staff about the price of the RTX5080, which will be launched this month. He told me that the price is similar to the initial price of the RTX4080, which is equivalent to about RMB 9,000, not the US$1,499 advertised on the Internet.
On a multi monitor setup i'm using 1-1.5gigs of vram just idle with like a youtube video open or a movie playing, which is not factored in when talking about vram. The only scenario in which you have 100% of your 8 or 12 gigs is with one monitor and nothing else open but the game, even steam uses like 300mb of vram.
8 gigs is not enough today, 12 gigs is fine as of NOW but you're not buying a gpu for now (to the dismay of nvidia), you're buying it for a few years so 16-20 is absolutely necessary.
as owner of 2009 100$ ati radeon 4770, i welcome our new 9070 overlords!
Is that still your daily use GPU? :D
8:10 AMD was one step ahead of us and decided to not reveal RDNA4 GPUs at their keynote, probably so they could wait for Nvidia's pricing and avoid the same song and dance you described :D
$450 for a 9070 XT that is like 10% or so less in performance than 5070 (assuming 5070 matches 4080 performance) does sound good. I think the issue is that FSR 4 is going to make it look faster and they will try to charge more but reviewers are going to find it's not as good as they claim and not really worth the $549 or so they try to charge. Thumbs up.
I don't think we expect the 5070 to match 4080 performance. Instead it will be more like 4070 ti super (or whatever the best 4070 is called). But yea will buy a 450 card with that performance
The RTX 5070 will be RTX 4070 Ti, the RTX 5070 Ti will be closer to RTX 4080.
I don’t think FSR 4 is a factor as long as is good, but leaks suggest the bare minimum will be $500 by closer to $600, maybe $550, but for RX 7900 XT to RTX 4080 performance, is not going to be bad.
HAPPY NEW YEAR
It's crazy how everyone says "amd better not dare to price their product above 600". Meanwhile Nvidia gets a pass and people will buy it for whatever jensen decides 😂
AMD literally prices better than nvidia and people still complaining and want them to have it priced dirt cheap 😂... Hypocrites
Everyone is complaining about amd because they already know there is no hope with nvidia, nvidia hasn’t released a card worth buying new since the 2000 series
@@obj_obj Even a gimped card like 4060 can mawl through every game even with unnecessary settings turn on. Sometimes it's better not to fix stuff that isn't broke. Normies wouldn't understand.
Well because AMD prices their products just barely below its NVIDIA counterpart but without all the Nvidia features, granted they usually have better raster performance, but still AMD should realise they are always playing catch-up with Nvidia, they need to be aggressive with their pricing or else they will get decimated like always
Amd is pricing themselves slightly cheaper to nvidia while all their features apart from raster are far inferior. They need to be much much cheaper to be option for many. Their shrinking market share seems to agree. I've used amd gpu's for over 10 years now but im not blind to the fact that amd drops the ball 90% of the time.
@@dvornikovalexeiwouldn't you want it for normies to understand? Unless they not the target audience.
I think confusing product names are a positive for gpus. When i built my first computer, i was so overwhelmed and confused by gpus. I had no idea what i was buying because it was so hard to follow what was good vs what was bad, what it meant when it was an nvidia msi gpu vs nvidia asus gpu. The confusion makes it impossible to make an informed choice.
DLSS turning into a way game developers can be more lazy and not optimize their games correctly
I was thinking the same thing, hardware is outpacing software
Sounds about right honestly
Don't buy new broken unoptimized games.
The 9070 XT needs to come in at the same price the 7700 XT sits right now.
Its renamed 8800 XT so it will replace 7800XT. 500$ is more realistic.
In 2022, because of high GPU prices I took up chess and bought a beautiful hand carved set for $1,200. It is still beautiful and played Metro Exodus on my PC. It was a very good year.
AMD should be renamed MOC for 'Missed Opportunity Company.
so true, i was so dissapointed when the ryzen 9000 launched, besides 9800x3d the rest have no greal performance increase compared to ryzen 7000
@@hyakkimaru1057well the 7000 were already pretty great, didn’t really see how they could get much better than those, I fully expected 9000 series to be a slight upgrade
@@emma6648 true, the value of the 7600 and 7800x3d is insane
If Nvidia has any sense, they would price things like they did with the 30 series. I don't think people will be running out to upgrade from the 40 series..and it'll be interesting to see how much of an improvement there even is on the 5080/90 from the 4080/90. I don't think it'll be more than 20-25% better.
I disagree, I think naming is quite important to the "average" consumer. On reddit you still get questions from people who build a pc 10+ years ago and havent paid attention to hardware since, to explain where a 7700xt falls compared to nvidias lineup. On the otherhand I think changing your naming scheme too often is also quite damaging. Think of how often youve heard "i7=good" from people who arent as in the know as we are. Thats only the case because intel stuck with that naming scheme for so long. The same goes for gpus, people are buying 4060s because they bought the xx60 series back when they build their last computer, and it has served them well so far
100% true- plus you have your own identity.
Ah yes, reddit, where I get my world view and market standards from...
If the 5080 is the same performance or lower than 4090 it should be 800 dollars. It still has 16gb vram.
Sir!? Why 5090 priced in as 2.5 kidneys ?!!! I have only two !
NVDA: Do you have kids ?
The latest DLSS is not working on 30-series cards. Will we see a repeat of that?
Unfortunately, the latest rumors are the 5080 MSRP being above $1300, which would effectively kill the product. Nvidia's greed will catch up with them.
Maybe JH will announce that 50 series will be priced the same as the 40 series. That would be a big announcement.
GPU prices are a joke !
Just say you can't afford it and move on. Stick to 6700 XT, everything above is overpriced
a used RX 6800 XT for like 300 - 350 bucks is fine.
also a RX 7900 XT new on sale for around 600 bucks.
Arc 580 is a good deal for 250 bucks.
So ... especially NVIDIA prices are a joke!
@@hassosigbjoernson5738 7900 XT is 750 here.
Arc 580 is sold out everywhere. And the ones that are available are 400+.
6800 XT is 450 here on the used market. But its old tech. i want new tech for under 400.
Hey Hardware Unboxed team, I would find it interesting if you had as part of your GPU charts an overview of mean vram usage across you suite for each GPU at each resolution. I currently find it hard to judge how long-term valuable 16GB vs 20GB is. Maybe with a differentiation raytraced and raster only. Just had this idea.
GL at CES stay healthy, safe and have fun
Thanks STIM!
As long as the 5090 wipes the floor with everything else then it's worth the price to me. It's not really expensive if you always deal with the 90 series and the Titan series since you can always sell those for a little more than you actually pay for them to pay for the new card.
I wonder what a simple RX 9070 would be like?
More than 1k it can't be cheap
@@Filip5552_RDNA4 is mid to low end, they're not gonna be even close to 1k
Realistically it will be closer to USD 600, since its basically offering RX 7900 GRE level of performance.
@@Lovelandmark That would offer no added value then since the RC 7900 GRE was the same price.
@@TheEchelon thats what I'm saying either they will lower all the prices... which is definitely not gonna happen,or they will bump up the prices of new stuff and honestly i live in Europe and the prices are definitely not the same stuff is more expensive here so i know ill have to pay more than a 1000 for the new gpus if im stupid and i wont wait and honestly i want to build a pc soon i had to sell my old one last year and i miss playing sometimes in my spare time
I bought the ASRock Steel Legend Intel Arc B580 Dec 13th at Microcenter in Overland Park, KS. It was about $300 USD with tax. It wasn't $249, but rather, $269 for that particular model. Performance has been fantastic for what I paid for the card. If any of you out there can get one, buy one and use it. 1080p performance would be great. I'm using mine on an LG OLED 42" C2, which is 4K.
Too less Performance
Also what I noticed about high end GPUs: There's a newish market there with streamers. Those will almost always gravitate towards the high end. It's simply a buisness expense to them and it's more affordable to them than your average gamer. Additionally with the people you talked about, there is now a significant ammount of people who can and will afford those $2000+ GPUs. Normal gamers not lucrative enough for Nvidia anymore
Streamers are the worst thing that could happen to people. Imagine watching and wasting your precious time watching someone play games instead of yourself, give them money on top and form a parasocial relationship with streamers that only look at you as a number and metric. They will of course overspend on GPUs more than what they actually need, since 90% of them play competitive games casually and are not playing in tournaments.
They are an important part of Nvidia's marketing strategy. It's extremely inexpensive marketing when all you have to do is give away one graphics card to get recurring product placement for a couple of years or more... But they didn't even have to pay for it in most cases, at first.
Yep, there are a lot of gamers with plenty of disposable income- we aren't all still living at home, retired. $2k+ is a lot of money, but not enough to put me off if I want it.
coffin would be mixed singaling, since sitting is neutral and lying down during 9800x3d was positive.... it would have to be vertical coffin
Nothing is released yet, hence I am still hoping, but my frustration is growing more and more. I want to upgrade from my RTX 3090 to something more powerful, since it powers my 4K TV. Based on the rumors, I do not see a viable upgrade path: downgrading to 16 GB VRAM while paying more than double the price of the used RTX 3090 that I currently use seems like a bad deal. Paying (more) than three times the price for an RTX 5090 seems just as ridiculous. AMD and Intel won't offer a high-end option. Used RTX 4090 costs more than it did a year ago brand new...
I'm in the exact same boat as you mate. I want to play at 4k with path tracing but my 3090 doesn't cut it without severe compromises. The 5080 vram issue is a major turn off and then the 5090 is just plain stupid in terms of probable price. We could be left without a viable upgrade path until 5080 24gb comes out (assuming it will).
Im still running a 2070 super... yeah it's starting to get real rough... i want a good generation, but im actually leaning on picking up a 7900xt/xtx or equivalent from AMD this year
@@grahamt19781 which will probably have an insane price tag too :(
@@buddybleeyes lack of DLSS and RT performance have kept me from buying a 7900 XTX so far (i.e. I am looking forward to Half-Life 2 RTX)
@JensAllerlei I see what you mean, but I still don't see RT as good as it was promised. The performance hit is still too great. I could be proven wrong with the 50 series though. Dlss is what has kept my 2070 super running tbf, but I don't see the cost as worth it anymore. Fsr has got alot better. Steam os should be round the corner too. Edit: HL2 RTX will probably go hard though 👌
How can you compare frame rates from 8 yrs ago till now.
What generational cpu are u pairing up with a 1070 or a 2070 or even a 3070?
A 1080 didn't have a 9800x3d as a cpu to back it up.
"They could use monitor names and people will still buy it." Had me bursting out with laughter.