This shouldn't even be a discussion. VRAM is cheap, but sold to users at a huge premium. VRAM amount should *never* be a bottleneck for gaming, but because of greed, it is. Having spare VRAM is fantastic. It improves the longevity of the GPUs by enabling users to run newer games, on lower settings - but keeping textures and whatnot high, at little to no performance cost. It keeps the hardware usable for longer, which obviously isn't in Nvidias interest. So they limit the amount, to ensure that we keep on upgrading, even though we otherwise wouldn't need to. They're generating E-waste for the sake of profit. It's a version of planned obsolescence.
this comment My 1070 lasted me for nearly 7 years purely because of 8GB vram which at the time of it's release was a lot, even an overkill at that moment because most of 2016 games used no more than 4 GB of vram. I guess nvidia saw what they did and how people didn't upgrade their GPUs for a long time and decided enough is enough, if you don't want to upgrade at least every 2 generations we're gonna force you to do it with GPUs crippled with insufficient vram Recently I got a 7900 XT and those 20 gigs are a godsend, It will be plenty of vram before the GPU itself starts struggling
@@miha1999grobarwhat you people don’t realize is that the demand for vram will not go up forever, we’re pretty close to the best graphics possible and I can bet that even in 20 years, 20Gb of VRAM will still be more than fine.
It has always been the strategy. I was saying this to my friends up to 4 years ago. One of them already changed GPU twice since then because of low VRAM which lead to poor performance. Yet all still fail to recognize anything but NVIDIA as a viable option. It's crazy how brainwashed people are.
TBF, DLSS reduces the vram required vs having it off. I would still rather just have a strong enough GPU and enough vram to not need DLSS but that's expensive.
@@louisshin642and you get less features as a result, they’re not equivalent. Youre making trade offs to get that lower price. If you don’t care about the trade offs then it’s a good deal
@@louisshin642 amd cards release at around the same price as nvidia cards ie where they fit in nvidia's line up and then drift down in price later. also significantly cheaper depend on the tier of gpus cause in some case thats an outright lie since the 7600 and he 4060 are basically the same price.
People need to keep in mind, higher texture quality is the easiest way to increase graphical fidelity. It has the most impact on how nice your game looks. And with sufficient VRAM it has basically zero impact on your performance. This has been the case forever...and I don't understand why the average person forgot this.
This! No amount of RT and AI-frames will ever improve visual fidelity as much as high-resolution textures do. Having 8GB on an entry level card might still be fine, but as Steve and many others habe rightfully said time and again, having 8GB on any product above the entry level is borderline offensive.
That's why paths in Gothic 2 (2002) looked better than in Gothic 3 (2006). No amount of lighting, normal maps and shadows could cover up that the resolution got halved.
People need to keep in mind that texture quality are based on texture resolution in most games so trying to use 4k textures will slam your vram but if you are playing at 1080p or 1440p you are getting literally zero benefit from that.
@@angel_killzit yes, you can use AI built into our drivers, that's the most important part, as for games, just just upscale with dlss and double those frames with FG, it will be the best experience of your life, and you will be able to buy more cards for your family to save even more!
To be fair, the most expensive 8GB GPU of this generation is "only" $400, so maybe that'll drop to $300-350 in the next generation. Of course, 8GB should only be available for entry-level sub-$200 GPUs.
As a 3070 Ti owner.. I am still annoyed at Nvidia's decision to only include 8GB on this card. It had so much more potential. Definitely feeling regret buying this..
Yeah it's bullshit. It was the only reason to skip that while generation for me. And why I saved more to go for a 4080super. Wouldnt have gone that high if the 4070 had proper vram amount. So I guess their trick is working...
@@Perrajajaja 15% of performance increase for 40% more money between 4070 Super and 4080 is not justified at all and that the 4070 Ti is almost cost the same as a 4080 i just pure scam.
@@MuckytujaI agree it's a scam. I do not agree with the 15% performance you mention. The 12gb just isn't enough for a future proof 4k card. So the 4070s are not a real option imo
Sold off my 8 gb 4070 laptop about a month ago and got one with a 4080, just because the gpu was powerful enough for higher rez games, it just didn't have the vram to back it up.
Nothing wrong with a rig like I just bought with a 4060 for 800$. I don't game above 1080p though, so yeah stay away if you got the money and urge to game above 1080p!
I was supposed to buy a RTX 4060 TI 16gb then luckily I've saved some $ to the other part then upgraded it to rx 7800 xt. I'm happy with my very first pc build 😊
@cccpredarmy RTX 4060 ti 8gb or 16gb is overpriced for it's performance. To make it simple, 4060ti is good at 1080p resolution while RX 7800 XT have no sweat on 1440p.
@@JomeFromWork I was just surprised if 16gb on 4060ti are different compared to 7800 XT. I myself run a 6700 xt 12gb and will upgrade soon for a 6900 xt 16gb which I got for a bargain. My 6700 xt runs 1440p just fine
@cccpredarmy both the 8gb and 16gb version are the same GPU just the other has more VRAM which does not translate to a significant performance. It's just a high end 1080p GPU which is 👎 BTW, nice upgrade.
It has always been. Claim that the technology is worth overpaying while not caring about the hardware. And then turns out that software requires appropriate hardware to run well!
Create the problem and sell the solution. They have so much market share and marketing they can turn a feature like "ray tracing" that was nearly universally panned and make it something gamers absolutely can't leave on the table when they could get superior raster for their budget with Radeon.
Listen…….we the faithful Nvidia sheep must keep believing everything they tell us as words of gospel now that being said I believe the saying was “the more you buy the more you save” which translates to upgrade to the highest tier that your bank will allow you and upgrade every generation it’s a simple concept really
Nvidia VRAM professional says 6-8GB is more than enough until 2030 so buying an rtx5050 ti, 6050ti and 7050super ti with 4-8GB will be worth it at $400 to a fairly maximum price of $500+
Wrong. 6-8GB isn't enough. It was great ten years ago with the 980ti 6gb cards, but with games often having more textures or higher resolution textures now, 11-12GB is a more comfortable amount. It's not even just about a smoother experience, it's that some games or programs really do need that memory capacity. It also allows to do editing and mesh creation much better.
@@MiniDevilDF but according to the Nvidia textures and settings experts the 5050ti super with its dashing 8GB of VRAM at a low asking price of $500 entry level gpu can will be able to do low settings with ultra low Raytracing at 30-45fps what more d u want? Be more appreciative of our overlords at Nvidia please 🙏🏻
Finally a comprehensive video on VRAM usage in games. I wish you have had added a comparison of VRAM consumption when playing with the 7900 xtx and the 4090 as both have 24 GB of VRAM. I always hear that "20 GB VRAM in Radeon GPUs are equivalent to 16 GB VRAM in Nvidia GPUs". I would only believe it if someone like you shows a side by side comparison.
well thats bullshit nvdia fan boys talk too much crap .I have a 7900 xtx and my mate as a 4080s and we playin same game in coop and yes my xtx used more vram but it was only by 700 to 800 mb less then 1 gig . Space marines 2 was game.
@@albzc8345You spent around 1000$ and this card has perfomance at RT like 4060TI? Also bad fsr, more heat, usual bad coil whine and high power usage. I would go with 4080s. 😅
@@albzc8345 TRUE. In smaller cards it goes like 500mb, 200 mb + than a nvidia equivalent. Nvidia users say a lot of shit. Children. Its all about them.
yeah delivering a new product that has the exact same features, so both are bottlenecked, seems like a perfect way to introduce planned obsolescence....
A "methodology video" in how you measure actual usage would be SO NICE. Allocation vs usage is always a hotly debated topic, so laying it bare for all to see would end a lot of debate.
@@thatguy3000 Sure. People who now upgraded from their 900series, keeping them for more than 3 generation are well advised going for the 16 GB models by AMD.
These tests revealed something new for me. Frame generation is actually bad for anything below 16gb VRAM. Which means both of the main gimmicks of RTX cards ii.e RTX and Frame generation are only good for high end cards.
This is exactly why I went with the 6750 XT instead of the 4060. At first I thought I'd take the VRAM and Performance hits because surely RT and Frame Gen are enough, right? And turns out that no. After watching several benchmarks I noticed that RT and Frame Gen can't run together on the 4060 due to it's 8GB of VRAM, even at 1080p. Literally Nvidia's main selling points right now outside of productivity out the drain LMAO. Another reason was because I really want to play Cyberpunk 2077 and turns out that Phantom Liberty demolishes the 4060 even without RT because of how good it looks.
@@diego_chang9580I was also faced with the same dilemma of what to purchase and made the same choice. I already have some games using over 8gb of VRAM (not just allocation) so I'm pretty satisfied. I still have a 1080p monitor for now so upscaling on a 4060 would be out of the picture anyway, and the gap in raw performance compared to the 6750 XT is large.
Even 6GB still not enough for 60 fps on Medium. But luckily FG works like wonder but still horrible that Medium textures ain't looking anything special but blurriness. 🤷🏻♀️
I had an RTX 3070 in 2021. I played games with a youtube walkthrough and an excell sheet on my second monitor. I was getting frame stutters, lag spikes, and problems switching between active windows. I upgraded to a 6800xt for $900 in February 2021. No problems with VRAM since. Also upgraded to 64GB or RAM and. 5950x from 32GB and a 3900x a month later. Each time i could feel the upgrade. I like to multi task, and i found the 16 core over the 12 core really helps when multi tasking in games like elden ring. Cyberpunk, City Skylines 2.
12GB would have made a lot of sense for the 4060 ti, but it was designed primarily around the laptop configuration, where it made more sense to have less vram and a narrower memory bus, because when the GPU is clocked lower in the laptop variant, and with a smaller screen, you're likely to turn settings down lower anyway, and so 8GB is more likely to be adequate... But, jeez, what am I saying here?! I sound like an Nvidia shill. 8GB is still pretty weak even for the laptop variant! XD
Wow, very informative data and charts, I'm blown away. I really appreciate that you tested different graphics settings at different resolutions. When VRAM problems are brought up, I always thought "meh, you can just play on High settings anyway, it looks the same with higher FPS" but now I see some games use more than 8GB even on 1080p Medium which is the lowest settings that you would use on a new graphics card. It's criminal that you can pay upwards of $400 and play on 1080p Med, and still experience frame drops
Yes and no. You have to keep in mind there's some games that ask to _allocate_ as much vram as possible while not actually _needing_ it. While 8gb is absolutely getting to be a hard limitation on the settings you can use going forward, I highly doubt any game can't be played at 1080p medium. Ratchet and Clank runs perfectly fine on my kids' 1660 super at 1080p medium and that's only a 6gb card while Steve's chart shows it should run horribly with an 8gb minimum. Something is definitely wrong there. If it wasn't using quality textures it certainly wasn't noticeable in any way and probably needs to run WAY over the vram buffer before you start to notice muddy textures. While I played it cranked up on a 3080ti with RT and frame gen and I'm _positive_ I never noticed muddy textures or frame drops even though Steve's graph shows 13-14gb usage. It's easy enough to find clips on YT of someone playing just fine with a 1660s as well as another with a 3070 at 1440 with a mix of medium/high DLSS quality and vram is _not_ going over 6200mb, and even enabling RT reflections he didn't go over 7300 or experience frame drops. When the 3070 is the poster boy of too powerful but not enough vram something weird is going on with Steve's graphs when he's saying the 3070 should be horribly short on vram for those settings. I suspect R&C might be one of those games that asks for a ton of vram regardless if it actually uses it, and when offered the 24gb of a 4090 it asks for notably more than it needs. I've noticed FarCry games are notorious for this as well allocating everything they can get but utilizing notably less. Vram issues are very real, but I still think they get exaggerated, if at least a little. I'm not saying go out and buy an 8gb card today, but 8gb is hardly as useless today as this video is suggesting.
@@zodwraith5745 Yep. Using a 4090 for all of these tests unfortunately skews the numbers beyond what a cheaper card would use, and says nothing about performance. A vid like this is just to add fuel to the fire of people freaking out about vram numbers, when really they should be looking at performance of different cards in their favorite games. Like, I'm not saying that nvidia doesn't suck for cheaping out on vram. But people are really overreacting and vids like this don't help the situation, and don't help the consumer decide what they should buy.
@@sntslilhlpr6601 It's sad that you have to make sure to put in that you're not going to bat for Nvidia or you know you're going to rile up the AMD fanboys. But I do have to say Steve has been at the forefront of the vram debate and can often exaggerate it's effects. When his testing only says "medium, high, very high, and features enabled" makes me wonder if he had _textures_ cranked for every test when I know for a FACT that R&C doesn't need 8gb for "1080p medium". Which is obviously the biggest sponge for vram. It's like they're assuming people are incapable of adjusting settings outside of a slider. This debate got old a long time ago but I'll happily admit the 4060ti is _absolutely_ too powerful to be a measly 8gb card. I get it on the GDDR6X 4070 because the 6X costs literally twice as much as the cheap crap AMD uses, but it's unforgiveable on the 4060ti and below that uses the same cheap memory.
There's something comical about 4060 Ti 8GB buyers getting the card for the ray tracing and framegen, only to have those features not work properly because a lack of VRAM.
So wtf is the point of the 4060 then!? I recently bought a 3060, but was considering the 4060. When I found only 8GB options, I decided to go for the 3060 12GB and spent the savings on more system memory.
The point of the 4060 is to upsell you on a higher tier card for more money. These companies don't want to give you a product that will fulfill all your needs at the price you're willing to pay, so if you need more VRAM but you're primarily shopping on the lower end, they're going to do everything in their power to get an extra $100-200 out of you for what you really want.
4060 is a hard sell. It's not bad the price is or was just terrible for what it has to offer. It is still a good card just not for 1440p and probably a lot used in asia and internet cafes and stuff where not very demanding games are being played. The 4060ti with 16GB Vram is pretty good, except for the price of course. And you get to use DLSS3 and FrameGen. Not saying the 4060 line-up is great but if one finds a good deal the Ti version is a good thing.
@@zilverman7820 It's true that you would rarely be running games with settings that benefit from having more than 8GB with the 3060, but it definitely can happen sometimes. Also, while it may be very rare that it would be beneficial to use anywhere near the full 12GB with this card, having more than 8 is still very useful, as a lot of games will use more than 8GB but less than 10GB with certain combinations of settings. Having more vram can also help a lot with games that are not very well optimized, or with user created mods (especially graphical mods), which tend not to be well optimized, and can use a lot of vram. Having lots of vram on the videocard can also make up for having less system RAM, though that's not likely to be a very important consideration, especially because RAM is relatively cheap right now.
I‘m so happy I went with team red for my last GPU upgrade. The 20 GB on my 7900 XT seem to be more than enough, even in the most demanding games at 4K. I also purchased it at its all time low price.
Yes, AMD cards are hitting the value sweet-spot, now pricing for them is dropping. The drivers have now completely fixed the power consumption issues with multiple monitors & video play-back etc., and seem to be just as stable as Nvidia. AMD RT is actually equal to Nvidia, on a $ to $ basis, eg. XTX RT equals 4070 Ti 12gb RT, but raster is between 4080 - 4090, so a great mix.
What CPU do you have paired with your 7900x? I have a 5600x and know it will bottleneck the GPU a bit, but these prices seem great. $640 for a 20GB card seem good but i haven't looked at upgrades in about 4 years 😂
@@mthlay15 I‘m running a 7800X3D on a B650E board, so I’m GPU bottlenecked 100% of the time. I‘d say the 5600X is fine for most games at 4K, but might limit you in games that yield high fps numbers at 1440p and 1080p. I‘d recommend upgrading to the 5800X3D/5700X3D, if you want to stay on AM4 and have the budget.
I got a 7900xtx with my tax return last year. My 5800x might be bottlenecking, though I did plan on getting an x800x3d (9800x3d should be out) next tax time. At that point, I should be set for a few yesrs
The 7900xt is a wonderfully smooth card with years of service ahead. It amazes me some games are already at 16.5GB. I bought Hogwarts Legacy for the kids and now am playing. A 12GB card is the minimum, but the 7900xt is just wonderful.
This story is nearly 20 years old. Problem is not JUST the RAM but the BUS on which it is operated. Back in DOOM 3/Quake 4 days manufacturers have started slapping cheap DDR (NOT GDDR3 which is the 1st DRAM specifically getting the G in its name to differentiate it from standard DRAM) in bulk on low/mid range GPUS. I remember the 1650 NON XT with 512 MB DDR LOL - slow DRAM on 128 bit BUS ..... the card could never fill it up properly. I have a feeling if the market and MINERS weren't a factor HBM would have been the way to go, AMD/RADEON as usual tried to push the industry in right direction regarding G-RAM but at the totally wrong time.
That's a nuanced point most ignore. I made a similar argument a couple of weeks back on another video. In GPU's, scaling up memory scales up the memory bus bandwidth.
Fury and Vega were way ahead of their time, but were received poorly because idiots like me back then bought the GTX 770 3G instead of a r9 290 that lasted for quite a lot more years. It was about that time the big shift away from the high end for AMD.
Those were the days. Laughing at my work mate who'd bought an Nvidia FX5000 series card with a ton of video memory (for the time), but it was as slow as hell.
It's insane that something like the 3070ti is already becoming outdated. That should not be a card that requires you to already be at the point of turning down settings/resolution. The 4060 shouldn't even exist.
3070 ti was the only card I ever regretted purchasing. I paid $850 for it on launch day from best buy, not even a scalper. I sold it in less than a year. I was having VRAM issues even back in 2021 when it launched. Wont be buying nvidia again any time soon. Got a 6950 xt for $550 and loving it.
Welp, at 1080p it works out for me and due to space restraints, i am not moving to 1440p for a while. Once i do, maybe i will look into an upgrade, but at this moment it works ok, just runs a bit too hot for my taste.
@X_irtz Even at 1080p, it can run into vram issues. You can't play Forza Horizon 5 at max settings with ray tracing on at 1080p because the card doesn't have enough vram, even though the card has enough processing power to do so.
Since I play a lot of VR. VRAM is by far the most important aspect for smooth non-hitchy gameplay. if I play HL:Alyx, even at "high" texture quality (not even "very high") running at 5400x2700 my Radeon RX6800 utilize 14.5GB of VRAM. Before when I had an 8GB card I always had stutters. Same goes for flightsims, like DCS, even without VR you easily breach the 12GB boundary with highest texture quality landing somewhere between 13-15GB of VRAM utilization.
@@silverfoxvr8541 lol wut. I run vrchat perfectly fine on a 1050ti (no headset though) on both my 2k and 4k monitors. Are headsets really using THAT much of your vram? I was thinking of investing in a headset but no longer want one if that's the case.
@@thelonercoder5816 Your 1050ti doesn't even meet min spec for first gen VR hardware, 1060 6GB or RX 480 8GB or better were the absolute minimum for the early headsets which are a quarter of the resolution of even today's cheap headsets.
@@DigitalMoonlight That's insane. This is my old rig btw. I'm building a new rig with a 7800x3d and 7900 GRE (16 GB VRAM). Even I feel like this wouldn't be enough for vr it seems. Seems anything sub-20GB isn't worth it lol. Man I was looking forward to investing in a VR headset but IDK anymore.
My 6700xt is still going great but if i were upgrading 16gb would indeed be the minimum. My next one probably be 20gb one 8800xt or something like that.
I wonder what kind of feces soup slop you people play to need that enormous pile of VRAM. I am honestly, 100% sincerely doing great with mere 6GB, in any and all games I own, playing at 1200p since 2009. Enjoying smooth 60+ fps in all titles, with modestly high-ish, custom settings of mine.
@GugureSux hogwarts legacy cyberpunk2077 with mods. Games which are fine with 6gb but are complex graphically downgrade texture quality to give you fps. Cp77 does it out of box hence mods fix those. If that's fine for you that's great but not everyone likes the same thing.
@@GugureSux "1200p (read: just slightly more than 1080p) since 2009" "smooth 60+fps" ... yeah as if we needed any validation that you have nothing meaningful to say
Minimum today should be like 12gb or preferably 16gb. If youre buying new card. But even older 8gb cards (20 or 30 series) are still fine if you drop down some settings.
I feel bad for all the mugs who've bought the 8GB 4060 & 4060ti. Looking at the Steam survey, that's no small number either! The 4060+4060ti accounts for 4.96% (2.73+2.23) and while there's sadly no discrimination between 8GB & 16GB versions, I'd wager heavily that, due to prebuilts, 8GB cards make up the overwhelming majority. On the flip-side, the RX 6700 XT+6750 XT+6800 account for only 1.31%! (0.74 + 0.35 + 0.22), and mid-tier RDNA3 cards don't command enough marketshare to even register!
@@The_Noticer.Or those that bought a "4808 12GB"4070Ti with 12GB for 800-1,150 Euros from launch to the first half year. 12GB for 670 is taking the piss.....12GB for 800+ Euros is.....sheer unmitigated arrogance!
I'm still rocking a 1080ti and I don't have issues running out of VRAM, its still playing modern games just fine with details up pretty high but some games now require a bit of FSR/XESS to keep the frame rate up whilst still having a lot of graphical settings maxed out :)
The allocation is important too, as the game is able to cache the textures in vram for when they're needed. This may work ok pulling them out of ram, but usually those with a video card lacking sufficient vram aren't rocking a monster cpu and the latest ram.
Modern PCIe uses DMA copy, so it doesn't hit the CPU hard like a bus-copy in the olden days. The CPU only needs to do a quick trip to the kernel to set up the DMA copy, and then it gets notified some time later when it's done. But your general point still stands: People with entry level GPUs often also have entry levels of system RAM. If they're also running their PCIe x8 card on a last-gen-PCIe platform then things get really bad really fast, and making sure the game only allocates up to the *exact* amount of VRAM available becomes crucial.
@@andersjjensen Depends, some games DRM forces the data to be decrypted constantly, in addition even with DMA, if the GPU is using the RAM, your CPU is waiting around. And that's best case scenario. The whole thing is silly and all down to nvidia playing games with vram. Going with just enough vram will bite you sooner or later, just as nvidia intends so they can sell you that 5070 with 16gb, just barely enough to make it through the product cycle.
Actual usage given upscaled textures, reshades, light to medium ray tracing, among model and other population mods. Incredibly noticeable in Hogwarts legacy with 60 mods or so. Comparing 1080 1440 4k between my two rigs with the GTX 1080 and my new 7900XT.
@@handlemonium Ignoring the first ridiculous sentence, you're not getting 32GB of VRAM for $699. You're also not getting a new XTX card for that price either (RX 8000 series will be mid range cards with probably 16 GB anyway so come on). The 6900XT had a NAVI21XTX GPU and that was $999 at launch in 2020. No gaming card has ever had more than 24 GB of VRAM and with 32 GB you can do some AI workloads not possible with 24 so that's gonna cost you extra if you even get it all (probably not for a couple generations). In fact, the cheapest card to have 24GB is the 7900XTX and that launched at $999 too. ATi charged $949 to get the highest VRAM card (ATi FireGL X1 256MB) way back in 2003 when 128MB cards like the 3DLabs Wildcat VP970 were going for $899. You want more VRAM than even Nvidia's future nearly $2000 card will give you for less than $1000?
nVidia: Please buy our GPUs to play all these *totally* efficient graphics-arms-race games with our RTX features and frame-gen! Also nVidia: What do you mean you need VRAM for that?
I predict the 7900XT is going to age particularly well from this generation with its 20GB of VRAM. The 4080 outclasses it in raw performance, but that extra VRAM will probably keep it performing consistently for at least one extra generation over the 4080 if I had to guess
Meh, at that point you're kind of talking about limping it along with reduced quality, and that's exactly the kind of thing that happens when graphics cards aren't aging well. @@jozefhirko6024
The 6950XT I bought for a little over €500 over two years ago is pushing my 3440x1440 OLED to its 165 FPS limit just fine in Ghost of Tsushima which is literally the only good AAA game that has come out since I bought that card
@@jozefhirko6024 I imagine it’ll be kind of like the 1060 vs the Rx580/480. Early on, the 1060 looked a good deal better, but as time went on, the 580 maintained an acceptable level of performance for much longer. Nothing crazy, mind you, but if you were trying to stretch your card as long as possible, the 580 was the better purchase
@@dsc9894 thats actually what happend to my dumb ass i was fooled by nvidia (bought 1060 3GB) had to upgrade within some years if i would have had an 480 or 580 with 8gb for literally the same price i wouldnt have to upgrade (and yes at the time it was already pretty ass i got a uhd 28" gifted for free which was far better then my 1050p dell)
Welp. Rough week for my 1080 to die on me, but at least I now have some data to tell me how much lube I'll need when I start browsing those NVIDIA shelves
I have a 7900 XTX so, 24 GB of vram, and I've seen some games use up to 18.5. Avatar Frontiers of Pandora for example (amazing looking game btw). I would say more RAM the better. But 24 for now is more than plenty.
Yes, 8gb is just ridiculous for a mid tier GPU to have but It’s pointless to use a 4090 with 24GB and then generalize the results for GPUs with less VRAM. With texture compression and dynamic allocation, plus driver quirks for each GPU class, memory allocation varies wildly. A bit more care or critical thinking before publishing the video wouldn’t hurt. A 4060 Ti with 8GB can absolutely run Cyberpunk and Alan Wake 2 with ray tracing without massive frame drops due to data swapping with system memory. A correction here would be ideal.”
the 7900GRE is only 50$ cheaper and actually performs identical even with higher VRAM anyway. You are paying 50$ for the better features. Which is a small difference given dlss is still superior than fsr, until fsr actually starts updating again.
FWIW lots of people are upscaling 1080p to 4K and 960p to 1440p via DLSS. That cuts the VRAM needs a lot. I do think that this shows that 16GB should be the norm going forward at this price range.
Would be almost as bad as paying 600+ for a GPU with poor RT performance and no AI upscaling. Certainly more games that force RT or make you use low settings for things like shadows, AO, reflections, GI to remove RT than there are games that use 12gb at 1440p. And certainly more games that use your upscaler for the AA and have it set in the default presets. Which makes since, DLSS offers a much larger performance budget for developers and looks better than the AA solutions that modern games use (TAA,FXAA,TSR)
Because AMD isn't as fast in RT as Nvidia people think that AMD flat out can't do RT. You can't convince them that the 7800XT brutally murders the 4060ti in raster and still exceeds it by a bit in RT.
Features>Performance. AMD is catching up but they are still miles behind in basic driver features that Nvidia offers. Also, DLDSR is amazing and AMD has no answer to that yet.
You have to realize that a lot of gen Z children that were introduced into PC hardware during COVID don't really have a great understanding in hardware and were easily finessed. These same people are are fb marketplace trying to sell their 5600x and 6800xt systems for 1.5-1.8k.
I paid £420 for the 16gb 4060ti gpu when I could have saved £60 and paid £360 for the 8gb version. But I thought that £60 extra to double the VRAM seemed a good deal and that is why I got the 16gb instead of the 8gb. And I play games like Cyberpunk 2077 and Hogwarts Legacy and they play like a dream because of the choice that I made. And I don't regret it at all.
Don't discount the 6800 non-XT. It's precisely as fast as a 7700XT and uses precisely as much power, but has 16GB. There aren't many of them left, but when you can find them they're usually only a little bit more expensive than a 6750XT which makes them a stupid good buy.
It would depend on the DLSS quality setting used and how much overhead DLSS carries. Not every game eats a ton of extra VRAM at higher resolutions as well.
I am not surprised to see the new DLSS 4.0 (FG2) will be exclusive to the 5000 series just like FG to the 4000 series. We all know the importance of "Buy more to save more."
This is exactly what i've been saying on many AAA discussion pages on steam. People complaining about stuttering, but refusing to believe its their 12GB buffer on their 4070. Or for that matter, Nvidia's weak DX12 driver with CPU overhead. But I also know, these people will buy a 5070 and the cycle continues. C'est la vie. I have seen Forbidden west use up to 14.5GB on my 6800XT, but that was after extended playing.
I finished Forbiden West at 1440p with my RTX 4070ti at around 100 fps with a smooth frametime for the entire game, I'm super sensitive to these types of things! You have to learn that allocation does not mean real use.
@@BaNdicootBaNdiDo Horizon Forbidden West typically doesn't use more than 10 GB at 1440p, at least up to 10 minutes of gametime, so you shouldn't have issues. You don't know if the OP's VRAM usage was allocated or not though as it's possible VRAM usage went up after a long session. HWInfo64 shows what resources are actually being used, not allocated, and it gives you the maximum, average and minimum. The tests Steve showed in this video was use not allocation and we see multiple games use 12-14GB at the highest settings in 1440p with Avatar using 16.4 GB which is crazy but that's due to frame generation added to the Unobtanium setting.
It's what I keep telling people. My 6800M often pulls ahead in newer games over a desktop RTX2070 in 1440p because of the extra VRAM, but people don't fucking believe me. I even showed them the difference and they tell me it's faked and I underclocked the 2070.
Back in March of this year I built a tower, spenting $760 Aus (~$520 US) on a 16GB Asus 4060 Ti . It was actually the cheapest 16GB card that I could get my hands on at the time. Coming from someone who had previously been gaming on a Laptop with a 4GB GTX 1650 Ti for 4 years, I knew 8GB would do me for as a basic card but 12GB or better was a preference. I found the price jump between the 12GB cards to the 16GB one I got wasn't that much more and thought would be worth it. I don't regret it, and found it was a good buy compared to the other cards in the 40 series with 16GB that cost at least 600 ~ 1000 more than mine.
Less than 16 gb will age quite fast with newer games past 1440p but it always depends how you play if its not max settings than u can get away with less
Dude you would feel like a fool if the next gta game can only be run on your 4060 ti with medium settings just because of vram constraints, whereas it would be perfectly capable of playing high or ultra and imagine a 6700xt getting better frame rates being cheaper on higher settings.
I honestly find the “just lower settings” and “just don’t use ultra/high textures” as the worst coping I’ve ever heard but to each their own I find maxing out textures to be just as fun as playing the game sometimes 😂
@@seanchina9902 All Nvidia has to do are give more vram to their mid range gpus and give them a reasonable launch price , AMD wouldn't stand a chance if they do that.But no, greed is really strong with Nvidia at the moment.
I bought a 4090, and it was worth it. My games will not consume more than 14Gb of VRAM. With 10Gb VRAM I built a gaming expert AI assistant to run alongside using about 10Gb of VRAM. I have dumped manuals (text, Word, RTF, PDF, Web Pages, Wikis, Steam Guides, forum strategy, UA-cam video transcripts into my vector database for the AI (RAG aka "chat with docs). I can now put down a game for 6-12 months no problem and easily return and just ask questions. When you are a senior and your memory is failing making game hopping impossible; having a gaming expert at your side is fantastic. 24Gb VRAM was perfect.
Obviously! That's a bit of a dumb statement tbh - what did you expect from the most expensive GPU on the market, a slightly better 4060 Ti, for 4-5x the cost? This discussion is not for you, it's for those who have a normal fixed budget, and need to get best value & longevity.
@@Roll_the_Bones It was about full memory utilization and that is where I am at. Do you I imagine that retired seniors are not on fixed budget? I waited 8 years to build that PC. It’s amazing how petty some people get. Also, what most builder channels are not even realizing is VRAM reduction is not about NVidia greed. It is about big tech putting an end to open source AI.
I bought a 7800XT to "upgrade" from my 3070 Ti (tired of Nvidia's crap and I wanted more VRAM). I ran Cyberpunk 2077's benchmark with the same settings (1440p, DLSS/FSR enabled, ultra quality, volumetric fog high) as the 3070 Ti. While I saw 19 more average FPS on the 7800 XT, I saw almost double the 1% lows, (67FPS to 111FPS). Not sure if it was the raw raster uplift, or the VRAM. Either way, I'm pleased.
It blows my mind how many people say the 3080 with 10gb is still fine, I upgraded to a 4090 at the start of 2023 and nearly every new game at 4k uses over 10gb at High or Ultra, and plenty of games use around 10gb at 1440p as well. In-game my overlay shows most games want more than 10gb at 4k, even though the 3080 is plenty fast to run those games otherwise. Like the 4060Ti 8gb vs 16gb video showed, even with 8gb it'll be running out while showing 6gb or 7gb used, and you know what that means. It means people using a 3080 who haven't used cards with more aren't gonna notice they're still held back even if the game says it's using 8-9gb. That's potentially because of other applications, or multi monitors, for example, I find 1.5gb is used with nothing open, which drops down if I uplug my 2nd monitor.
So my next card will need at least 20Gb, got it. Thanks for this, it's surprisingly hard to find somewhat accurate information in terms of VRAM consumption.
That's because the big tech company channels always crank up settings and resolution, and don't even bother testing out 1080p no more, and way too often skip any sort of setting optimizations. As an user of a single-digit VRAM GTX card who still enjoys rock solid 60fps in all games, without any wonky upscaler tricks, I'm seriously disturbed by this VERY RECENT skyrocketing of system requirement. And no, it's absolutely not explained by any sort of jump in visual quality; some literal 10yo games look exactly the same, if not better, than modern slop.
Agreed. Honestly, it's like PC gamers don't understand that optimising games like Alan Wake 2 or Hellblade 2 will still end up with a fantastic looking game, even on medium settings. What does 'medium' even mean? It's arbitrary. Didn't the Avatar devs hide the Unobtainium graphics setting? They probably did that cos PC gamers can't help themselves and want to max out every setting, then complain that the game is unoptimised. I don't max out graphics on games, unless I see a significant visual benefit from it.
@@GugureSux if you don’t use upscalers, AMD is just better for gaming. raytracing also costs VRAM, so you probably don’t use that, either. if you don’t need it for productivity, keep AMD in mind should you upgrade ;)
I upgrade my GPU about every five years, and I game at 4K. I need all the VRAM I can get. I went from an RTX 2080 Ti 11GB. I'm now using an RTX 4090 24GB. It really depends on the resolution and how often you upgrade your hardware. Some people will use the same old GPU for many years. I still see people on Steam using an RX 580 8GB. That's crazy to me. It's so old now. Not everybody plays the latest games. It helps to have a lot of system RAM for when you exceed dedicated GPU VRAM. I would be in the red playing Resident Evil remakes at 4K ultra settings with the RTX 2080 Ti and the games still ran fine. My laptop has a mobile RTX 4070 with only 8GB VRAM. The Last of Us Part 1 glitched out in a strange way with white light all over the place and then crashed. You definitely want more than 8GB for 4K ultra settings for AAA games. Laptop chips are too cut down. I prefer desktop gaming.
Everyone talking about how bad the 4060 was at $400 for 8GB, but I think the 4070Ti at $800 for 12GB is far, far worse. Imagine spending $800 and you have to turn textures down after only a year, and turning DLSS and/or RT on gets you stutters. Yikes. Also, just got my 7800XT today, upgrading from an RX 5700 8GB. What a time to make a VRAM video! :)
Not true at all. These additional 4GB actually make a big difference. 12GB feel a bit better at 4K then 8GB at 1080p. Thus 4070 Ti is a better 4K card, then 4060/Ti as a 1080p card. But ideally 4070 Ti should be paired with 1440p screen (since it is not fast enough to run modern AAA games at 4K). And here it has absolutely no issues. I've been using my 4070 Ti for 1,5 years. And I haven't reduced texture quality in any game yet.
I think it was Digital Foundry that said that Avatar manages textures based on available VRAM so it would make sense that its trying to use a lot of it if there is enough available. So VRAM usage should vary depending on what GPU you use.
Here I am again with my GTX 1070Ti with 8GB VRAM @ 1080p, just gave it some love and changed paste because hot spot temp was getting uncomfortable for me (also, summer D:)
The reason why I didn't buy the 4070 Ti was that I found it unacceptable for its price to have only 12 GB. This was later corrected by the 4070 Ti Super but it didn't exist when I was buying. So I settled for the regular 4070 which is fine for what I play and decided to keep the extra money and put it in a future upgrade.
@@ej1025 wouldnt have worked with the bus width of AD103, and frankly 20GB is overkill; 7900XT was more an accident of having 5 * 4GB as the XTX needed chopping down somehow. I ended up with the XT as I needed something to drive a 4K screen and it had price adjusted below the 4080 12GB I mean 4070 Ti at the time. There was no way as I was going to pay that kind of money for a 12GB card at the time; and if the 7800XT had been on the market at the time would have gone with one of those. I personally dont find RT massively compelling currently outside of some novelty titles, we're waiting for the PS5 Pro or even the PS6 to properly bed in; and the console crowd are increasing demanding high framerate so that will only delay it further. The only useable implementations have been stuff by Insomniac or that Lego puzzler. It massively screwed up the Art & Lighting in Cyperpunk & Metro. Maybe DOOM Eternal had a reasonable implementation. The IQ is so much better driving raster with some smoke and mirrors at a decent framerate and close to native resolution than some more reflections or some GI that only updates every ten frames. It's the same problem with like Monster Hunter Rise on the Switch where everything more than 100m away suddenly animates at half framerate. It's really distracting.
You show horizon forbidden west not needing more than 12gb vram at 4k in your tests, but that does not match my experience playing on a 12gb gpu, where I frequently saw numbers at 11.8gb and accompanying stutters as it starts filling ram instead. That was in both main game and DLC. 10 min tests is nowhere near adequate.
And that is perfectly fine for most indie games. Just not anything remotely up to date like Enshrouded. Consoles can allocate 12GB to frame buffer which gets ported to PC as the "high" setting so just use medium or low. Medium of today is the ultra of 2016, when 8GB was normalized.
“VRAM or graphicscard memory has been a hot topic over the past few years” is so true. VRAM was crucial when I was looking to get a new GPU last Christmas, but it wasn’t even a consideration when I got my PC about 4 years ago.
I think this whole vram panic is very profitable for gpu makers. People are willing to give them more money and go for higher-tier gpus because they have more vram. I think both Nvidia and AMD are motivated to keep this going for another generation.
AMD has very good cards in the 6000 series, with 10 GB for the 6700 12 GB for the 6700XT and higher and 16GB for the rx 6800 and higher, with 192/256 bit bus respectively. having gotten an rx 6800 for 400€, these are budget options for mid-tier grapics cards
@@mitsuhh No it's not an 80-class card, not from performance and for sure not from the hardware perspective. Stop defending corporations man it's kind of stupid.
I wish I had this video when deciding between a 8gb and 16gb variant of intel arc. The tests I saw back when I was shopping around found that the 16gb one didn't get much more fps, but now I've been finding games that are pushing its vram to its limits.
Thanks for solid content Steve! I have 21:9 ultrawide monitor and wondering how much VRAM usage would 3440 x 1440 resolution take? Should I aim for 4K results for my VRAM requirement?
@@stevenostrowski6552 100+ isn't great when it's 144hz monitor and I was suggesting for ultra presets in games so that it could still be viable into the future, tell me what kind of games are you talking about
8GB: Can't max most game at 2K Can max most game at 1080p 12GB: Can max most game at 1080p upgraded to 2K (will usually lack of VRAM on very demanding VRAM or badly optimized game), will usually not be able to max game on 2K. 16GB: Well, technically you can max everything, so you're kind of future proof. (but there's a few game that already hit the 16GB VRAM by their poor optimization) 20GB: To my knowledge. None game is enough badly optimized to consume 20GB (max I've seen was around 17GB VRAM). Because I don't own a 4K monitor. I don't know if some game hit the 18-20GB VRAM usage on 4K ultra (+ RT if possible) especially with FSR or DLSS using more VRAM when enabled. (You will the limit with some heavy computing/IA simulation/video 24GB: Well. You're more than fine (Almost no task hit the 24GB but there's some oddly specific who does)
what you call 2k is not 2k. you mean 2560x1440. 1920x1080 is what some call 2k, because the horizontal resolution is approximately 2000 pixels. 3840x2160 is called 4k because the horizontal resolution is approximately 4000 pixels. 1080p, 1440p and 2160p are not resolutions, but video formats, because the p stands for progressive scan as opposed to interlaced 1080i, 1440i, 2160i. all games are rendered progressively and adding the p is the same as saying that your car has wheels every time you mention your car, though it's unfortunately common these days, even among hardware reviewers
One of the main reasons I went 7900xtx. Never wanted to run into a situation where I needed more. I’ve seen upwards of 20gb used in very demanding scenes.
These graphs don't make sense. Just because a game CAN use 16 GB doesn't mean it NEEDS to use 16 GB at all. This does not negate the fact that there really are games where 8 GB may not be enough for ultra settings. But there is manipulation in this test. We are shown 2 games where FPS drops, and in one of the games we are shown a video clip on the game engine (in such videos FPS is not important and higher-quality textures are loaded there, as well as the director can make the scene temporarily more difficult). And the second game was simply not optimized at the start Then, when the right idea is embedded in our brain, we are shown a dozen graphs from the most modern games. When you view these charts, remember a simple rule: The ability to use memory is not the same as the need.
Yeah it's difficult for the lack of an actually good assessment tool but he qualified it with first hand experience what happens at these particular settings with particular games, he pointed out several that went stuttery. It is actually pretty typical that the amount of VRAM occupied with objects hovers around what the game actually intends to use, and that you have some leeway to squash it but not very much. Longer term excessive caching isn't very good, since freeing the cache can lead to hitching in and of itself, so it's not a typical behaviour in well developed engines. And you do have a fallback algorithm that kicks in during scarcity, either the textures are getting eagerly removed (slightly stuttery), or loading emergency-halts (you get patches of very blurry textures), or a host allocation type is temporarily used for new loads instead of device type (reduces performance). I think unless you have results or first hand experiences to the opposite, that at a particular setting a game hasn't started degrading on an 8GB GPU in comparison to a more generous one, while he indicated that it would, you don't have a strong point.
A lot of folks bashing the 4060 but I actually like the 4060 as a low power card for upgrading low cost systems without having to upgrade the power supply at the same time. It just needs to be maybe 10% less expensive. The 8GB I think is acceptable on it. However, the rest of the Nvidia lineup sucks; high prices and not enough VRAM. That, and because I don't care much about RT, is why I went with an AMD 7800 XT this year; decent value and plenty of VRAM. I know it's not really better than a 6800 XT, but that's still a good card too.
I wanna buy 2 of them for work and gaming machine, for me and my wife. 8 vs 16 is 300 vs 400 € euro right now. 1080p is fine but I wanna use Minecraft Rtx, heavy shaders and mods. Maybe some AAA game? Maybe AI gets interesting for me? Feel like 16gb is slightly overkill but maybe safer
I have 20gb in my 7900XT, and I can tell you personally that it is NOT overkill for VR. There are many VR games that can get close to maxing out on using that VRAM. For regular gaming yeah probably, but I'm using a 5120x1440 super-ultrawide and some more graphically intemsive games do use more than 16gb of VRAM. 20gb is just really nice to have.
Still amazed by my 3080's 4K 60fps performance. Four years later, I am now upscaling 1440p Medium-High to 5K and then squashing back down to 4K for visual quality imperceptively similar to the native 4K High that I used to run in late 2020... still not hitting its 10GB limit. I can even run Medium with ray tracing without major stutters. Though I expect that I may need to swap my card within the next 2 years, I'm totally satisfied with 4 years and counting of 4K gaming!
Even better considering you won't be turning on any heavy ray tracing and probably don't care about running AI cause AMD so 24 Gb Raster machine is definitely still going to be a raster machine with plenty of VRAM in 2030+.
@@albert2006xp I'm playing Cyberpunk with Path Tracing at 250 fps. (1080p XeSS 1.3 Performance + FSR3 Frame Gen + AFMF). I'm keeping my card underclocked to 2000 MHz instead of the possible 2900 MHz.
@@raresmacovei8382Dang. And RDNA 4 is supposed to have a massive RT boost. If they maintain their generosity with VRAM (at least compared to Nvidia), they should have a highly-compelling product.
@@raresmacovei8382 whom are you lying to? Yourself? At 1080p upscaling is bad idea, at best DLSS quality looks decent, not FSR, despite that say you get 250fps with XeSS 1080p performance? Hilarious, that must have been mudy & blurry mess, I bet. Good luck to you 😂. Like, honestly? 540p upscaled to 1080p LMAO!!!!
Everyone building or buying a gaming pc should see this video first. I would not go below 16 gb in 2024 in a new system. Also why are gaming laptops dumbed down with low vram a 16 gb laptop costs a fortune. Anyway this was a really interesting watch
I recently acquired a 4070Ti with 12GB. I noticed most of the games I played had zero overlap with the games tested here. My games almost never exceeded 8GB of VRAM usage, even with ultra+dlss+RT+framegen at 1440p. The games were indie titles like Palworld, Helldivers, and Darktide.
@@ssjbargainsale and the said AAA slop is the kind of shit that always has THE worst optimization, worst art-design, and generally get forgotten within the same year they drop out. There's truly few exceptions happening these days, and they're usually made by old industry pros who understand visual directing and optimization.
DLSS and framegen are meant to save GPU processing power and VRAM. They are the crutches the modern devs use to excuse their poorly optimized Blender vomit visuals. Also lol @ calling big-publisher trash like HD and DT "indie"!
After watching Black Myth: Wukong gameplay footage, I will consider 16 GB of VRAM GPU because I want to maximize the setting for that beautiful graphic
It's the video games, completely out of bloody control. Near horrible optimization and carelessly packed. You can't expect everything to have hyperrealistic graphics 4k textures, ray tracing, bloom, fxaa and all the jazz then somehow expect GPU companies to magically slap an extra 10 gb of vram on their GPUs and still sell it affordable. Fix the gaming industry, the gaming hardware will follow. Keep the gaming industry in an eternal state of inflated requirements for negligible visual return and gaming hardware will continue to spike in price.
Yup. As greedy as NVIDIA is, sometimes it's game publisher companies not giving the devs time to optimize games and/or having difficulty accepting that PC gaming is more relevant to people now. It does not matter if AMD/NVIDIA actually listens to their customers, if lots of game publishers decided that (as an example) RTX 4090 or its 24GB VRAM is going to be minumum very low settings 24fps for 640x480p resolution in their games. Both GPU manufacturers may offer 32GB for their next series of xx60 (or x600 for AMD) but if publishers decided that 64GB required low settings 480p, then yeah....
I bought a 7800xt because of the 16gb of vram. I probably would have bought a 4070 super or 4070 ti if they had 16gb for the same price. But the 4070 ti super was nearly double the price of a 7800xt and I don't game enough to justify the extra money.
My friend spent $800 on a 4070ti 12gb and he barely gets more performance than my $400 6800xt, nvidia is straight up pathetic imo bc they know how to gimp cards perfectly
I know this is old, but you guys are awesome, and the information you provide is priceless! Thank you for everything you do! HU and GN are the 2 places I go before a hardware purchase- if you both have positive reviews of whatever it is, I buy it. Thanks for never letting me spend good money on underpowered or underperforming hardware.
That's nothing new. Gaming laptops have always had a short lifespan. If it isn't the heat killing it early, it's the lack of VRAM or lack of drivers because your laptop OEM requires ones for your specific laptop that they don't even bother updating.
This shouldn't even be a discussion.
VRAM is cheap, but sold to users at a huge premium. VRAM amount should *never* be a bottleneck for gaming, but because of greed, it is.
Having spare VRAM is fantastic. It improves the longevity of the GPUs by enabling users to run newer games, on lower settings - but keeping textures and whatnot high, at little to no performance cost. It keeps the hardware usable for longer, which obviously isn't in Nvidias interest. So they limit the amount, to ensure that we keep on upgrading, even though we otherwise wouldn't need to.
They're generating E-waste for the sake of profit. It's a version of planned obsolescence.
Can you be more dramatic please, you’re being too kind
this comment
My 1070 lasted me for nearly 7 years purely because of 8GB vram which at the time of it's release was a lot, even an overkill at that moment because most of 2016 games used no more than 4 GB of vram. I guess nvidia saw what they did and how people didn't upgrade their GPUs for a long time and decided enough is enough, if you don't want to upgrade at least every 2 generations we're gonna force you to do it with GPUs crippled with insufficient vram
Recently I got a 7900 XT and those 20 gigs are a godsend, It will be plenty of vram before the GPU itself starts struggling
@@miha1999grobarwhat you people don’t realize is that the demand for vram will not go up forever, we’re pretty close to the best graphics possible and I can bet that even in 20 years, 20Gb of VRAM will still be more than fine.
@@RoiDeCoeurs No way lol. I don't know for how long it will keep going up but it definitely will keep going up for years
GDDR6X is not cheap.
Nvidia marketing RT as the future of gaming and DLSS as the saviour of gamers, while also selling cards lacking the VRAM needed to run DLSS and RT.
It has always been the strategy. I was saying this to my friends up to 4 years ago. One of them already changed GPU twice since then because of low VRAM which lead to poor performance. Yet all still fail to recognize anything but NVIDIA as a viable option. It's crazy how brainwashed people are.
TBF, DLSS reduces the vram required vs having it off. I would still rather just have a strong enough GPU and enough vram to not need DLSS but that's expensive.
@@randy206 AMD's cards are significantly cheaper than Nvidia's.
@@louisshin642and you get less features as a result, they’re not equivalent. Youre making trade offs to get that lower price. If you don’t care about the trade offs then it’s a good deal
@@louisshin642 amd cards release at around the same price as nvidia cards ie where they fit in nvidia's line up and then drift down in price later. also significantly cheaper depend on the tier of gpus cause in some case thats an outright lie since the 7600 and he 4060 are basically the same price.
People need to keep in mind, higher texture quality is the easiest way to increase graphical fidelity. It has the most impact on how nice your game looks. And with sufficient VRAM it has basically zero impact on your performance. This has been the case forever...and I don't understand why the average person forgot this.
I dunno man. For me personally shadows and lighting are a lot more important than textures...
This!
No amount of RT and AI-frames will ever improve visual fidelity as much as high-resolution textures do.
Having 8GB on an entry level card might still be fine, but as Steve and many others habe rightfully said time and again, having 8GB on any product above the entry level is borderline offensive.
That's why paths in Gothic 2 (2002) looked better than in Gothic 3 (2006). No amount of lighting, normal maps and shadows could cover up that the resolution got halved.
People need to keep in mind that texture quality are based on texture resolution in most games so trying to use 4k textures will slam your vram but if you are playing at 1080p or 1440p you are getting literally zero benefit from that.
@@ancientweeb5984 riiiiight... 🤦♂️
as Nvidia marketing team expert i can say 6gb is enough in 2024
but with 6gb am i buying enough to save?
@@angel_killzityes. the more you buy, the more you save. that's right.
@@angel_killzit yes, you can use AI built into our drivers, that's the most important part, as for games, just just upscale with dlss and double those frames with FG, it will be the best experience of your life, and you will be able to buy more cards for your family to save even more!
You mean 3gb?
For a premium, you can always download more
I can't wait for $500 usd 8gb next gen cards! Let's go
Make that 600l
To be fair, the most expensive 8GB GPU of this generation is "only" $400, so maybe that'll drop to $300-350 in the next generation. Of course, 8GB should only be available for entry-level sub-$200 GPUs.
Nah, make it 550. People will still pay.
Didn’t kingpin sign with PNY to bring his GPUs back? So $750 for a 5050ti super kingpin ultimate mega edition
RTX 5060 8gb $500
As a 3070 Ti owner.. I am still annoyed at Nvidia's decision to only include 8GB on this card. It had so much more potential. Definitely feeling regret buying this..
Feel this with the 3080
U have problems with this gpu??? What problem ? Have you tested any of these games?
Yeah it's bullshit. It was the only reason to skip that while generation for me. And why I saved more to go for a 4080super. Wouldnt have gone that high if the 4070 had proper vram amount. So I guess their trick is working...
@@Perrajajaja 15% of performance increase for 40% more money between 4070 Super and 4080 is not justified at all and that the 4070 Ti is almost cost the same as a 4080 i just pure scam.
@@MuckytujaI agree it's a scam. I do not agree with the 15% performance you mention. The 12gb just isn't enough for a future proof 4k card. So the 4070s are not a real option imo
Wait until Nvidia releases 5070 laptop, again with 8gb for the 4th year in a row.
AMD Strix Halo to the rescue
1070 laptop had 8 gb as well so it's more like 8-9 years in a row.
Sold off my 8 gb 4070 laptop about a month ago and got one with a 4080, just because the gpu was powerful enough for higher rez games, it just didn't have the vram to back it up.
@@GrumpyWolfTechsame bought the 4070 and sold it after I found a good deal on a open box best buy deal on a 4080
Nothing wrong with a rig like I just bought with a 4060 for 800$. I don't game above 1080p though, so yeah stay away if you got the money and urge to game above 1080p!
Not only did the 1070 have 8GB of VRAM, it was double that of the 970!
3,5GB double? 😉😂
@@xDUnPr3diCtabl3 LOL It had 4GB total, it's just that half a gig was at a reduced speed.
@@TheZoenGaming True.
And 4 years after that, the 3070 released still with the same 8gb VRAM lol
@@HennesTobias Yeah, going from rx480 8GB to 5700 8GB to 3070 8GB is a joke
I was supposed to buy a RTX 4060 TI 16gb then luckily I've saved some $ to the other part then upgraded it to rx 7800 xt.
I'm happy with my very first pc build 😊
what's the difference between 4060 TI 16gb and 7800 xt 16gb? is 4060 ti just upscaling its 8gb?
@cccpredarmy RTX 4060 ti 8gb or 16gb is overpriced for it's performance. To make it simple, 4060ti is good at 1080p resolution while RX 7800 XT have no sweat on 1440p.
@@JomeFromWork I was just surprised if 16gb on 4060ti are different compared to 7800 XT.
I myself run a 6700 xt 12gb and will upgrade soon for a 6900 xt 16gb which I got for a bargain.
My 6700 xt runs 1440p just fine
@cccpredarmy both the 8gb and 16gb version are the same GPU just the other has more VRAM which does not translate to a significant performance. It's just a high end 1080p GPU which is 👎
BTW, nice upgrade.
Hey, I have an RTX 4060 Ti 16GB and a 3440x1440 monitor. I dunno why it's being called a 1080p card when it handles my resolution really well.
So DLSS frame gen uses Vram... VRAM that Nvidia doesnt give you at the mid-low range cards.
thats a hell of an upsell trick lol.....
It has always been. Claim that the technology is worth overpaying while not caring about the hardware. And then turns out that software requires appropriate hardware to run well!
Create the problem and sell the solution.
They have so much market share and marketing they can turn a feature like "ray tracing" that was nearly universally panned and make it something gamers absolutely can't leave on the table when they could get superior raster for their budget with Radeon.
Always remember the time nvidia sold 3.5GB + 0.5GB VRAM.
@@warnacokelatthat was a massive sh*t Strom when people found out what they did. Especially when that last 0.5 gb of VRAM was slower vram.
Listen…….we the faithful Nvidia sheep must keep believing everything they tell us as words of gospel now that being said I believe the saying was “the more you buy the more you save” which translates to upgrade to the highest tier that your bank will allow you and upgrade every generation it’s a simple concept really
Nvidia VRAM professional says 6-8GB is more than enough until 2030 so buying an rtx5050 ti, 6050ti and 7050super ti with 4-8GB will be worth it at $400 to a fairly maximum price of $500+
Wrong. 6-8GB isn't enough. It was great ten years ago with the 980ti 6gb cards, but with games often having more textures or higher resolution textures now, 11-12GB is a more comfortable amount. It's not even just about a smoother experience, it's that some games or programs really do need that memory capacity. It also allows to do editing and mesh creation much better.
@@MiniDevilDF He was joking
@@MiniDevilDFmy guy. He is taking the piss
@MiniDevilDF I don't know what you expect for a xx50 and xx30 they are they for low budget gamers. That means vram cutbacks
@@MiniDevilDF but according to the Nvidia textures and settings experts the 5050ti super with its dashing 8GB of VRAM at a low asking price of $500 entry level gpu can will be able to do low settings with ultra low Raytracing at 30-45fps what more d u want? Be more appreciative of our overlords at Nvidia please 🙏🏻
Finally a comprehensive video on VRAM usage in games. I wish you have had added a comparison of VRAM consumption when playing with the 7900 xtx and the 4090 as both have 24 GB of VRAM. I always hear that "20 GB VRAM in Radeon GPUs are equivalent to 16 GB VRAM in Nvidia GPUs". I would only believe it if someone like you shows a side by side comparison.
This is the same cope Apple uses in their phones. Sure you can optimize but memory is memory
well thats bullshit nvdia fan boys talk too much crap .I have a 7900 xtx and my mate as a 4080s and we playin same game in coop and yes my xtx used more vram but it was only by 700 to 800 mb less then 1 gig . Space marines 2 was game.
@@albzc8345You spent around 1000$ and this card has perfomance at RT like 4060TI? Also bad fsr, more heat, usual bad coil whine and high power usage. I would go with 4080s. 😅
@@albzc8345 TRUE. In smaller cards it goes like 500mb, 200 mb + than a nvidia equivalent. Nvidia users say a lot of shit.
Children. Its all about them.
@@albzc8345 and maybe its just because you have more Vram, so the game feels free to use it
Planned obsolescence.
bingo
The collective anguish of 8GB RTX 3060 Ti and RTX 3070 buyers during 2020-2022 cannot be understated.
yeah delivering a new product that has the exact same features, so both are bottlenecked, seems like a perfect way to introduce planned obsolescence....
They paid for it. Their headache.
@@zonta71 Don't forget we also paid for it all while GPUs were scalped to the high heavens and back. Thanks for your sympathy.
Graphics card from 7 years ago (1080ti) having more Vram than 2024 cards is criminal.
Flagship vs entry level be like
1080Ti was a flagship in case you didn't realize.
2080 ti has 11gb as well.
Hey, you could always start your own GPU company and just make cards with 1TB of VRAM.
1080 Ti doesn’t have more VRAM, unless you are comparing it to $300 GPUs which is just dumb
A "methodology video" in how you measure actual usage would be SO NICE. Allocation vs usage is always a hotly debated topic, so laying it bare for all to see would end a lot of debate.
Yep. It's really not clear how one could know what memory is "used" vs merely "allocated".
remember that the video is about current requirement, usually people buy for the next few years at least.
This
Thats nothing for people buying a new GPU each generation.
@@eventhorizon9598 But it's something devastating for the majority that keep their GPUs for a few Generations.
@@thatguy3000 Sure. People who now upgraded from their 900series, keeping them for more than 3 generation are well advised going for the 16 GB models by AMD.
If you want to play at 4k ultra for years you'll have to spend more money, sorry but that's how it works
RIP 3080 10gb . GPU performance would be still enough , but the vram is just too small now .
Just download more VRAM!! 🤣
Same power as my 6800xt but genuinely not enough vram, really is sad to see. rip 3080
wdym RIP 3080 10gb? just dial down to High settings and it's perfectly capable (ultra is stupid anyway)
RIP 6800XT (it was DOA for RT)
@@shlapmaster3080 users themselves talk about how they don’t even use RT but shore hate on 6800xt that never stated rt was a feature for it
nvidia pulling an apple move charge a ton for more memory - genius
These tests revealed something new for me. Frame generation is actually bad for anything below 16gb VRAM.
Which means both of the main gimmicks of RTX cards ii.e RTX and Frame generation are only good for high end cards.
Which have much less need of these features because they can achieve good performance through raw power. 😂
That's why just rely on benchmarks is not great for GPU reviews
This is exactly why I went with the 6750 XT instead of the 4060.
At first I thought I'd take the VRAM and Performance hits because surely RT and Frame Gen are enough, right? And turns out that no. After watching several benchmarks I noticed that RT and Frame Gen can't run together on the 4060 due to it's 8GB of VRAM, even at 1080p.
Literally Nvidia's main selling points right now outside of productivity out the drain LMAO.
Another reason was because I really want to play Cyberpunk 2077 and turns out that Phantom Liberty demolishes the 4060 even without RT because of how good it looks.
@@diego_chang9580I was also faced with the same dilemma of what to purchase and made the same choice. I already have some games using over 8gb of VRAM (not just allocation) so I'm pretty satisfied. I still have a 1080p monitor for now so upscaling on a 4060 would be out of the picture anyway, and the gap in raw performance compared to the 6750 XT is large.
No. You didn't pay attention to the graphs... The VRAM limits are reached because of Ray Tracing. FG used 1gb at the most.
Forbidden West DLC would make you drop down to medium on 8GB cards
Yep, seeing 14+ on my 6800XT in fleet's end.
@@The_Noticer.14gb in rust @1080p as well, I was surprised cause it was 15gb at 1440p so I thought it be less than 14gb @1080p tbh
@@puffyips rust has just trash optimalization
Avatar frontiers of pandora eat 15.5 gbs vram at 1080p ultra in my 4080 super
Even 6GB still not enough for 60 fps on Medium. But luckily FG works like wonder but still horrible that Medium textures ain't looking anything special but blurriness. 🤷🏻♀️
I had an RTX 3070 in 2021.
I played games with a youtube walkthrough and an excell sheet on my second monitor.
I was getting frame stutters, lag spikes, and problems switching between active windows.
I upgraded to a 6800xt for $900 in February 2021.
No problems with VRAM since.
Also upgraded to 64GB or RAM and. 5950x from 32GB and a 3900x a month later.
Each time i could feel the upgrade. I like to multi task, and i found the 16 core over the 12 core really helps when multi tasking in games like elden ring. Cyberpunk, City Skylines 2.
The multitasking has to do with your system RAM and not your video RAM.
@kennethd4958 i had 64gb of ram. Multitasking uses vram as well. And I ran out of vram on the 3070 while multitasking. So I upgraded to the 6800xt.
32gb RAM and 16gb VRAM, recommended balance on gaming PC 2024.
Anno 2016:
16GB RAM and 8GB VRAM
@@xDUnPr3diCtabl3 which is 8 years ago... what a stagnation.
64 and 24gb vram u meant
12GB would have made a lot of sense for the 4060 ti, but it was designed primarily around the laptop configuration, where it made more sense to have less vram and a narrower memory bus, because when the GPU is clocked lower in the laptop variant, and with a smaller screen, you're likely to turn settings down lower anyway, and so 8GB is more likely to be adequate...
But, jeez, what am I saying here?! I sound like an Nvidia shill. 8GB is still pretty weak even for the laptop variant! XD
you don't need more than just few kilobytes
Wow, very informative data and charts, I'm blown away. I really appreciate that you tested different graphics settings at different resolutions. When VRAM problems are brought up, I always thought "meh, you can just play on High settings anyway, it looks the same with higher FPS" but now I see some games use more than 8GB even on 1080p Medium which is the lowest settings that you would use on a new graphics card. It's criminal that you can pay upwards of $400 and play on 1080p Med, and still experience frame drops
Yes and no. You have to keep in mind there's some games that ask to _allocate_ as much vram as possible while not actually _needing_ it. While 8gb is absolutely getting to be a hard limitation on the settings you can use going forward, I highly doubt any game can't be played at 1080p medium.
Ratchet and Clank runs perfectly fine on my kids' 1660 super at 1080p medium and that's only a 6gb card while Steve's chart shows it should run horribly with an 8gb minimum. Something is definitely wrong there. If it wasn't using quality textures it certainly wasn't noticeable in any way and probably needs to run WAY over the vram buffer before you start to notice muddy textures. While I played it cranked up on a 3080ti with RT and frame gen and I'm _positive_ I never noticed muddy textures or frame drops even though Steve's graph shows 13-14gb usage. It's easy enough to find clips on YT of someone playing just fine with a 1660s as well as another with a 3070 at 1440 with a mix of medium/high DLSS quality and vram is _not_ going over 6200mb, and even enabling RT reflections he didn't go over 7300 or experience frame drops. When the 3070 is the poster boy of too powerful but not enough vram something weird is going on with Steve's graphs when he's saying the 3070 should be horribly short on vram for those settings.
I suspect R&C might be one of those games that asks for a ton of vram regardless if it actually uses it, and when offered the 24gb of a 4090 it asks for notably more than it needs. I've noticed FarCry games are notorious for this as well allocating everything they can get but utilizing notably less.
Vram issues are very real, but I still think they get exaggerated, if at least a little. I'm not saying go out and buy an 8gb card today, but 8gb is hardly as useless today as this video is suggesting.
@@zodwraith5745😅
@@zodwraith5745 Yep. Using a 4090 for all of these tests unfortunately skews the numbers beyond what a cheaper card would use, and says nothing about performance. A vid like this is just to add fuel to the fire of people freaking out about vram numbers, when really they should be looking at performance of different cards in their favorite games.
Like, I'm not saying that nvidia doesn't suck for cheaping out on vram. But people are really overreacting and vids like this don't help the situation, and don't help the consumer decide what they should buy.
@@sntslilhlpr6601 It's sad that you have to make sure to put in that you're not going to bat for Nvidia or you know you're going to rile up the AMD fanboys. But I do have to say Steve has been at the forefront of the vram debate and can often exaggerate it's effects. When his testing only says "medium, high, very high, and features enabled" makes me wonder if he had _textures_ cranked for every test when I know for a FACT that R&C doesn't need 8gb for "1080p medium". Which is obviously the biggest sponge for vram. It's like they're assuming people are incapable of adjusting settings outside of a slider.
This debate got old a long time ago but I'll happily admit the 4060ti is _absolutely_ too powerful to be a measly 8gb card. I get it on the GDDR6X 4070 because the 6X costs literally twice as much as the cheap crap AMD uses, but it's unforgiveable on the 4060ti and below that uses the same cheap memory.
There's something comical about 4060 Ti 8GB buyers getting the card for the ray tracing and framegen, only to have those features not work properly because a lack of VRAM.
So wtf is the point of the 4060 then!? I recently bought a 3060, but was considering the 4060. When I found only 8GB options, I decided to go for the 3060 12GB and spent the savings on more system memory.
The point of the 4060 is to upsell you on a higher tier card for more money. These companies don't want to give you a product that will fulfill all your needs at the price you're willing to pay, so if you need more VRAM but you're primarily shopping on the lower end, they're going to do everything in their power to get an extra $100-200 out of you for what you really want.
4060 is a hard sell. It's not bad the price is or was just terrible for what it has to offer. It is still a good card just not for 1440p and probably a lot used in asia and internet cafes and stuff where not very demanding games are being played.
The 4060ti with 16GB Vram is pretty good, except for the price of course. And you get to use DLSS3 and FrameGen.
Not saying the 4060 line-up is great but if one finds a good deal the Ti version is a good thing.
It really should have been the 4050 but Nvidia swapped the names around. Instead they gave a few percent performance increase +FG so not worth IMO
3060 12gb ram will not give more fps. 3060 is not fast enough to utilize that 12 gb ram.
@@zilverman7820 It's true that you would rarely be running games with settings that benefit from having more than 8GB with the 3060, but it definitely can happen sometimes. Also, while it may be very rare that it would be beneficial to use anywhere near the full 12GB with this card, having more than 8 is still very useful, as a lot of games will use more than 8GB but less than 10GB with certain combinations of settings.
Having more vram can also help a lot with games that are not very well optimized, or with user created mods (especially graphical mods), which tend not to be well optimized, and can use a lot of vram. Having lots of vram on the videocard can also make up for having less system RAM, though that's not likely to be a very important consideration, especially because RAM is relatively cheap right now.
I‘m so happy I went with team red for my last GPU upgrade. The 20 GB on my 7900 XT seem to be more than enough, even in the most demanding games at 4K. I also purchased it at its all time low price.
Yes, AMD cards are hitting the value sweet-spot, now pricing for them is dropping. The drivers have now completely fixed the power consumption issues with multiple monitors & video play-back etc., and seem to be just as stable as Nvidia. AMD RT is actually equal to Nvidia, on a $ to $ basis, eg. XTX RT equals 4070 Ti 12gb RT, but raster is between 4080 - 4090, so a great mix.
What CPU do you have paired with your 7900x? I have a 5600x and know it will bottleneck the GPU a bit, but these prices seem great. $640 for a 20GB card seem good but i haven't looked at upgrades in about 4 years 😂
@@mthlay15 I‘m running a 7800X3D on a B650E board, so I’m GPU bottlenecked 100% of the time. I‘d say the 5600X is fine for most games at 4K, but might limit you in games that yield high fps numbers at 1440p and 1080p. I‘d recommend upgrading to the 5800X3D/5700X3D, if you want to stay on AM4 and have the budget.
I got a 7900xtx with my tax return last year. My 5800x might be bottlenecking, though I did plan on getting an x800x3d (9800x3d should be out) next tax time. At that point, I should be set for a few yesrs
The 7900xt is a wonderfully smooth card with years of service ahead. It amazes me some games are already at 16.5GB.
I bought Hogwarts Legacy for the kids and now am playing. A 12GB card is the minimum, but the 7900xt is just wonderful.
Biggest visual flaw with a 4 year old game was limiting textures to 8gb for 1080p should tell you all.
This story is nearly 20 years old. Problem is not JUST the RAM but the BUS on which it is operated. Back in DOOM 3/Quake 4 days manufacturers have started slapping cheap DDR (NOT GDDR3 which is the 1st DRAM specifically getting the G in its name to differentiate it from standard DRAM) in bulk on low/mid range GPUS.
I remember the 1650 NON XT with 512 MB DDR LOL - slow DRAM on 128 bit BUS ..... the card could never fill it up properly. I have a feeling if the market and MINERS weren't a factor HBM would have been the way to go, AMD/RADEON as usual tried to push the industry in right direction regarding G-RAM but at the totally wrong time.
That's a nuanced point most ignore. I made a similar argument a couple of weeks back on another video. In GPU's, scaling up memory scales up the memory bus bandwidth.
AMD never misses an opportunity to miss an opportunity
Fury and Vega were way ahead of their time, but were received poorly because idiots like me back then bought the GTX 770 3G instead of a r9 290 that lasted for quite a lot more years. It was about that time the big shift away from the high end for AMD.
@@solmariuce5303 no, you were influenced by channels of this type.
Those were the days. Laughing at my work mate who'd bought an Nvidia FX5000 series card with a ton of video memory (for the time), but it was as slow as hell.
It's insane that something like the 3070ti is already becoming outdated. That should not be a card that requires you to already be at the point of turning down settings/resolution. The 4060 shouldn't even exist.
3070 ti was the only card I ever regretted purchasing. I paid $850 for it on launch day from best buy, not even a scalper. I sold it in less than a year. I was having VRAM issues even back in 2021 when it launched. Wont be buying nvidia again any time soon. Got a 6950 xt for $550 and loving it.
WOW AMD DOES NOT EXIST?
Yeah 3070 has been an absolute regret card for me.
Welp, at 1080p it works out for me and due to space restraints, i am not moving to 1440p for a while. Once i do, maybe i will look into an upgrade, but at this moment it works ok, just runs a bit too hot for my taste.
@X_irtz Even at 1080p, it can run into vram issues. You can't play Forza Horizon 5 at max settings with ray tracing on at 1080p because the card doesn't have enough vram, even though the card has enough processing power to do so.
Since I play a lot of VR. VRAM is by far the most important aspect for smooth non-hitchy gameplay. if I play HL:Alyx, even at "high" texture quality (not even "very high") running at 5400x2700 my Radeon RX6800 utilize 14.5GB of VRAM. Before when I had an 8GB card I always had stutters. Same goes for flightsims, like DCS, even without VR you easily breach the 12GB boundary with highest texture quality landing somewhere between 13-15GB of VRAM utilization.
THIS!! Try VRChat in a world with 60 people in custom Avatars. It will bury my 16GB 4080 instantly.
I have used 22,8GB out of 24GB in Kayak VR with max settings at native Resolution on the Pimax Crystal.
@@silverfoxvr8541 lol wut. I run vrchat perfectly fine on a 1050ti (no headset though) on both my 2k and 4k monitors. Are headsets really using THAT much of your vram? I was thinking of investing in a headset but no longer want one if that's the case.
@@thelonercoder5816 Your 1050ti doesn't even meet min spec for first gen VR hardware, 1060 6GB or RX 480 8GB or better were the absolute minimum for the early headsets which are a quarter of the resolution of even today's cheap headsets.
@@DigitalMoonlight That's insane. This is my old rig btw. I'm building a new rig with a 7800x3d and 7900 GRE (16 GB VRAM). Even I feel like this wouldn't be enough for vr it seems. Seems anything sub-20GB isn't worth it lol. Man I was looking forward to investing in a VR headset but IDK anymore.
Eye candy and frame gen using over 12gb isn't looking good for 12gb boys including myself with 4070.
So glad I got a 6950 over a 4070. They were both at $550 ish August last year when I bought.
so dont use frame gen, just use DLSS and settings tweaks.
As Steve mentioned at the end: 12 GB is bare minimum, 16 GB the go-to. I'm good with my RX 6800 16 GB :)
My 6700xt is still going great but if i were upgrading 16gb would indeed be the minimum.
My next one probably be 20gb one 8800xt or something like that.
I wonder what kind of feces soup slop you people play to need that enormous pile of VRAM. I am honestly, 100% sincerely doing great with mere 6GB, in any and all games I own, playing at 1200p since 2009. Enjoying smooth 60+ fps in all titles, with modestly high-ish, custom settings of mine.
@GugureSux hogwarts legacy cyberpunk2077 with mods.
Games which are fine with 6gb but are complex graphically downgrade texture quality to give you fps.
Cp77 does it out of box hence mods fix those.
If that's fine for you that's great but not everyone likes the same thing.
@@GugureSux Agreed. These 'enthusiast' thinks large vram is street cred. I'm getting by on a 8 and 10gigs with no issues.
@@GugureSux "1200p (read: just slightly more than 1080p) since 2009" "smooth 60+fps" ... yeah as if we needed any validation that you have nothing meaningful to say
Minimum today should be like 12gb or preferably 16gb. If youre buying new card.
But even older 8gb cards (20 or 30 series) are still fine if you drop down some settings.
I feel bad for all the mugs who've bought the 8GB 4060 & 4060ti.
Looking at the Steam survey, that's no small number either!
The 4060+4060ti accounts for 4.96% (2.73+2.23) and while there's sadly no discrimination between 8GB & 16GB versions, I'd wager heavily that, due to prebuilts, 8GB cards make up the overwhelming majority.
On the flip-side, the RX 6700 XT+6750 XT+6800 account for only 1.31%! (0.74 + 0.35 + 0.22), and mid-tier RDNA3 cards don't command enough marketshare to even register!
I feel worse for people that bought a 4070 12GB. 650 dollaridoo's for 12GB.
@@The_Noticer. Literally lol I found a used 3090 on eBay for around 500 bucks, that's an infinitely better deal.
@@The_Noticer.Or those that bought a "4808 12GB"4070Ti with 12GB for 800-1,150 Euros from launch to the first half year.
12GB for 670 is taking the piss.....12GB for 800+ Euros is.....sheer unmitigated arrogance!
Why feel bad? This vram situation is not a revelation. Plenty of videos on the topic at the time. 4060 8gb owners went into this eyes-wide open.
Yet the most popular AMD card(excluding Rx 580) is an Rx 6600 8gb.
I'm still rocking a 1080ti and I don't have issues running out of VRAM, its still playing modern games just fine with details up pretty high but some games now require a bit of FSR/XESS to keep the frame rate up whilst still having a lot of graphical settings maxed out :)
The allocation is important too, as the game is able to cache the textures in vram for when they're needed. This may work ok pulling them out of ram, but usually those with a video card lacking sufficient vram aren't rocking a monster cpu and the latest ram.
Modern PCIe uses DMA copy, so it doesn't hit the CPU hard like a bus-copy in the olden days. The CPU only needs to do a quick trip to the kernel to set up the DMA copy, and then it gets notified some time later when it's done.
But your general point still stands: People with entry level GPUs often also have entry levels of system RAM. If they're also running their PCIe x8 card on a last-gen-PCIe platform then things get really bad really fast, and making sure the game only allocates up to the *exact* amount of VRAM available becomes crucial.
@@andersjjensen Depends, some games DRM forces the data to be decrypted constantly, in addition even with DMA, if the GPU is using the RAM, your CPU is waiting around. And that's best case scenario. The whole thing is silly and all down to nvidia playing games with vram. Going with just enough vram will bite you sooner or later, just as nvidia intends so they can sell you that 5070 with 16gb, just barely enough to make it through the product cycle.
For games with mods I almost use all 24gb, others most iv seen is close to 16gb @ 4k
Same here, Skyrim, cyberpunk, Witcher 3, and more eat up my 20GB on my 7900XT.
@@DJ-fw7mi
Allocated or not?
Actual usage given upscaled textures, reshades, light to medium ray tracing, among model and other population mods. Incredibly noticeable in Hogwarts legacy with 60 mods or so. Comparing 1080 1440 4k between my two rigs with the GTX 1080 and my new 7900XT.
I'd love a 24GB RX 8800XTX for $599.
A 32GB "8850XTX" would be for $699 would be even better.
@@handlemonium Ignoring the first ridiculous sentence, you're not getting 32GB of VRAM for $699. You're also not getting a new XTX card for that price either (RX 8000 series will be mid range cards with probably 16 GB anyway so come on). The 6900XT had a NAVI21XTX GPU and that was $999 at launch in 2020. No gaming card has ever had more than 24 GB of VRAM and with 32 GB you can do some AI workloads not possible with 24 so that's gonna cost you extra if you even get it all (probably not for a couple generations). In fact, the cheapest card to have 24GB is the 7900XTX and that launched at $999 too.
ATi charged $949 to get the highest VRAM card (ATi FireGL X1 256MB) way back in 2003 when 128MB cards like the 3DLabs Wildcat VP970 were going for $899. You want more VRAM than even Nvidia's future nearly $2000 card will give you for less than $1000?
nVidia: Please buy our GPUs to play all these *totally* efficient graphics-arms-race games with our RTX features and frame-gen!
Also nVidia: What do you mean you need VRAM for that?
I predict the 7900XT is going to age particularly well from this generation with its 20GB of VRAM. The 4080 outclasses it in raw performance, but that extra VRAM will probably keep it performing consistently for at least one extra generation over the 4080 if I had to guess
I believe once 16GB will be fully utilized, 4080 will need DLSS to keep performance so VRAM usage will drop anyway.
Meh, at that point you're kind of talking about limping it along with reduced quality, and that's exactly the kind of thing that happens when graphics cards aren't aging well. @@jozefhirko6024
The 6950XT I bought for a little over €500 over two years ago is pushing my 3440x1440 OLED to its 165 FPS limit just fine in Ghost of Tsushima which is literally the only good AAA game that has come out since I bought that card
@@jozefhirko6024 I imagine it’ll be kind of like the 1060 vs the Rx580/480. Early on, the 1060 looked a good deal better, but as time went on, the 580 maintained an acceptable level of performance for much longer. Nothing crazy, mind you, but if you were trying to stretch your card as long as possible, the 580 was the better purchase
@@dsc9894 thats actually what happend to my dumb ass i was fooled by nvidia (bought 1060 3GB) had to upgrade within some years if i would have had an 480 or 580 with 8gb for literally the same price i wouldnt have to upgrade (and yes at the time it was already pretty ass i got a uhd 28" gifted for free which was far better then my 1050p dell)
i feel like shortly, (2026~) gamers will likely need nearing 16Gb of GPU memory which is pretty lame for a majority of people.
Majority yes, but the majority also bought 3.5GB 970's and 6GB 1060's and 8GB 3070's. So the majority will keep buying what the majority keeps buying.
I upgraded from a 16GB card to a 24GB card, so i'm set for the next few years.
You overdramatize as F😂 not everybody plays 4k with framegen on ultra settings. 12gb is going to be fine untill ps6.
@@Audiosan79I actually downgraded from 24gb 3090 to 16gb 4070 Ti Super. Don't ask me why.
There's so many people being mislead into buying 8gb cards thinking that's "future proof" because companies are telling them that.
Random guy: Vram makes the card more futureproof
Nvidia: 6GB it is....
Welp. Rough week for my 1080 to die on me, but at least I now have some data to tell me how much lube I'll need when I start browsing those NVIDIA shelves
No options except a 4080 or 4090 really 🤔
Just buy AMD. 16GB isn't that expensive.
@@motmontheinternet Some people want ray tracing/path tracing
I have a 7900 XTX so, 24 GB of vram, and I've seen some games use up to 18.5. Avatar Frontiers of Pandora for example (amazing looking game btw). I would say more RAM the better. But 24 for now is more than plenty.
Some games will try to use as much VRAM as Possible.
I use a RTX 3090 with 24GB - and in MSFS2020 and all other programs that I need to run, I seen as high as 20gb Vram being used XD
@@DeadPhoenix86DP Be glad, as it means less SSD read/write cycles.
i saw modded cyberpunk running over 24gb on my 4090, and Escape from Tarkov is around 19gb at 4k
@The_Noticer That's good info to know.
Also, unrelated, but is your name antisemitic 😂😂
Yes, 8gb is just ridiculous for a mid tier GPU to have but It’s pointless to use a 4090 with 24GB and then generalize the results for GPUs with less VRAM. With texture compression and dynamic allocation, plus driver quirks for each GPU class, memory allocation varies wildly. A bit more care or critical thinking before publishing the video wouldn’t hurt. A 4060 Ti with 8GB can absolutely run Cyberpunk and Alan Wake 2 with ray tracing without massive frame drops due to data swapping with system memory. A correction here would be ideal.”
I see what you are saying 👍
Imagine paying $600+ for only 12gb vram 🤯
My friend paid more for 8gb 😂
the 7900GRE is only 50$ cheaper and actually performs identical even with higher VRAM anyway. You are paying 50$ for the better features. Which is a small difference given dlss is still superior than fsr, until fsr actually starts updating again.
FWIW lots of people are upscaling 1080p to 4K and 960p to 1440p via DLSS. That cuts the VRAM needs a lot. I do think that this shows that 16GB should be the norm going forward at this price range.
Would be almost as bad as paying 600+ for a GPU with poor RT performance and no AI upscaling. Certainly more games that force RT or make you use low settings for things like shadows, AO, reflections, GI to remove RT than there are games that use 12gb at 1440p. And certainly more games that use your upscaler for the AA and have it set in the default presets. Which makes since, DLSS offers a much larger performance budget for developers and looks better than the AA solutions that modern games use (TAA,FXAA,TSR)
Imagine only caring about VRAM amount
4060ti 16gb sometimes costs more than 7800xt, what gives? Are people that dumb?
Yes, precisely.
Because AMD isn't as fast in RT as Nvidia people think that AMD flat out can't do RT. You can't convince them that the 7800XT brutally murders the 4060ti in raster and still exceeds it by a bit in RT.
@@andersjjensen 4060 Ti is cheaper than 7800xt
Features>Performance. AMD is catching up but they are still miles behind in basic driver features that Nvidia offers. Also, DLDSR is amazing and AMD has no answer to that yet.
You have to realize that a lot of gen Z children that were introduced into PC hardware during COVID don't really have a great understanding in hardware and were easily finessed. These same people are are fb marketplace trying to sell their 5600x and 6800xt systems for 1.5-1.8k.
I paid £420 for the 16gb 4060ti gpu when I could have saved £60 and paid £360 for the 8gb version. But I thought that £60 extra to double the VRAM seemed a good deal and that is why I got the 16gb instead of the 8gb. And I play games like Cyberpunk 2077 and Hogwarts Legacy and they play like a dream because of the choice that I made. And I don't regret it at all.
happy for you, it's very hard to get good deals in terms of VRAM for laptop gamers like me lol
So 12gb video card is the starting minimum🤔🤔
That RX 6700XT or 6800XT maybe the one i'll look in to
🤔🤞
Don't discount the 6800 non-XT. It's precisely as fast as a 7700XT and uses precisely as much power, but has 16GB. There aren't many of them left, but when you can find them they're usually only a little bit more expensive than a 6750XT which makes them a stupid good buy.
would be interesting to see how much dlss upscaling decrease vram usage
It would depend on the DLSS quality setting used and how much overhead DLSS carries. Not every game eats a ton of extra VRAM at higher resolutions as well.
Agreed, it’s a huge miss on this video. Everything else is showcased making it a little bit misleading.
@@giglioflex what is overhead ? Thanks brother
@@Will00 In this case it's the amount of VRAM required to run DLSS itself.
I am not surprised to see the new DLSS 4.0 (FG2) will be exclusive to the 5000 series just like FG to the 4000 series. We all know the importance of "Buy more to save more."
This is exactly what i've been saying on many AAA discussion pages on steam. People complaining about stuttering, but refusing to believe its their 12GB buffer on their 4070.
Or for that matter, Nvidia's weak DX12 driver with CPU overhead. But I also know, these people will buy a 5070 and the cycle continues. C'est la vie.
I have seen Forbidden west use up to 14.5GB on my 6800XT, but that was after extended playing.
I finished Forbiden West at 1440p with my RTX 4070ti at around 100 fps with a smooth frametime for the entire game, I'm super sensitive to these types of things! You have to learn that allocation does not mean real use.
@@BaNdicootBaNdiDo Horizon Forbidden West typically doesn't use more than 10 GB at 1440p, at least up to 10 minutes of gametime, so you shouldn't have issues. You don't know if the OP's VRAM usage was allocated or not though as it's possible VRAM usage went up after a long session. HWInfo64 shows what resources are actually being used, not allocated, and it gives you the maximum, average and minimum. The tests Steve showed in this video was use not allocation and we see multiple games use 12-14GB at the highest settings in 1440p with Avatar using 16.4 GB which is crazy but that's due to frame generation added to the Unobtanium setting.
It's what I keep telling people.
My 6800M often pulls ahead in newer games over a desktop RTX2070 in 1440p because of the extra VRAM, but people don't fucking believe me.
I even showed them the difference and they tell me it's faked and I underclocked the 2070.
Back in March of this year I built a tower, spenting $760 Aus (~$520 US) on a 16GB Asus 4060 Ti . It was actually the cheapest 16GB card that I could get my hands on at the time. Coming from someone who had previously been gaming on a Laptop with a 4GB GTX 1650 Ti for 4 years, I knew 8GB would do me for as a basic card but 12GB or better was a preference. I found the price jump between the 12GB cards to the 16GB one I got wasn't that much more and thought would be worth it. I don't regret it, and found it was a good buy compared to the other cards in the 40 series with 16GB that cost at least 600 ~ 1000 more than mine.
Less than 16 gb will age quite fast with newer games past 1440p but it always depends how you play if its not max settings than u can get away with less
Dude you would feel like a fool if the next gta game can only be run on your 4060 ti with medium settings just because of vram constraints, whereas it would be perfectly capable of playing high or ultra and imagine a 6700xt getting better frame rates being cheaper on higher settings.
I honestly find the “just lower settings” and “just don’t use ultra/high textures” as the worst coping I’ve ever heard but to each their own I find maxing out textures to be just as fun as playing the game sometimes 😂
@@puffyips lol ok have fun maxing out playing with yourself. Make sure you clean up those textures
Hopefully AMD keeps their larger VRAM buffers on their new "mid-range" cards and they show benefits longterm.
Hopefully they beat or at least match ngreedia in ray tracing and power draw. Nvidia ain't sweating yet coz they know they have an edge over amd.
@@seanchina9902 All Nvidia has to do are give more vram to their mid range gpus and give them a reasonable launch price , AMD wouldn't stand a chance if they do that.But no, greed is really strong with Nvidia at the moment.
I bought a 4090, and it was worth it. My games will not consume more than 14Gb of VRAM. With 10Gb VRAM I built a gaming expert AI assistant to run alongside using about 10Gb of VRAM. I have dumped manuals (text, Word, RTF, PDF, Web Pages, Wikis, Steam Guides, forum strategy, UA-cam video transcripts into my vector database for the AI (RAG aka "chat with docs). I can now put down a game for 6-12 months no problem and easily return and just ask questions. When you are a senior and your memory is failing making game hopping impossible; having a gaming expert at your side is fantastic. 24Gb VRAM was perfect.
Obviously! That's a bit of a dumb statement tbh - what did you expect from the most expensive GPU on the market, a slightly better 4060 Ti, for 4-5x the cost? This discussion is not for you, it's for those who have a normal fixed budget, and need to get best value & longevity.
@@Roll_the_Bones It was about full memory utilization and that is where I am at. Do you I imagine that retired seniors are not on fixed budget? I waited 8 years to build that PC. It’s amazing how petty some people get. Also, what most builder channels are not even realizing is VRAM reduction is not about NVidia greed. It is about big tech putting an end to open source AI.
I bought a 7800XT to "upgrade" from my 3070 Ti (tired of Nvidia's crap and I wanted more VRAM). I ran Cyberpunk 2077's benchmark with the same settings (1440p, DLSS/FSR enabled, ultra quality, volumetric fog high) as the 3070 Ti. While I saw 19 more average FPS on the 7800 XT, I saw almost double the 1% lows, (67FPS to 111FPS). Not sure if it was the raw raster uplift, or the VRAM. Either way, I'm pleased.
When you see a small uplift to AVG but a large uplift to 1% Lows it is nearly always VRAM related.
Would love if you tried revisiting the rtx 3060 vs rtx 4060 to see how the vram affects them
I never knew this! Thanks a lot for the info :)
It blows my mind how many people say the 3080 with 10gb is still fine, I upgraded to a 4090 at the start of 2023 and nearly every new game at 4k uses over 10gb at High or Ultra, and plenty of games use around 10gb at 1440p as well. In-game my overlay shows most games want more than 10gb at 4k, even though the 3080 is plenty fast to run those games otherwise. Like the 4060Ti 8gb vs 16gb video showed, even with 8gb it'll be running out while showing 6gb or 7gb used, and you know what that means. It means people using a 3080 who haven't used cards with more aren't gonna notice they're still held back even if the game says it's using 8-9gb. That's potentially because of other applications, or multi monitors, for example, I find 1.5gb is used with nothing open, which drops down if I uplug my 2nd monitor.
So my next card will need at least 20Gb, got it. Thanks for this, it's surprisingly hard to find somewhat accurate information in terms of VRAM consumption.
That's because the big tech company channels always crank up settings and resolution, and don't even bother testing out 1080p no more, and way too often skip any sort of setting optimizations. As an user of a single-digit VRAM GTX card who still enjoys rock solid 60fps in all games, without any wonky upscaler tricks, I'm seriously disturbed by this VERY RECENT skyrocketing of system requirement. And no, it's absolutely not explained by any sort of jump in visual quality; some literal 10yo games look exactly the same, if not better, than modern slop.
Agreed. Honestly, it's like PC gamers don't understand that optimising games like Alan Wake 2 or Hellblade 2 will still end up with a fantastic looking game, even on medium settings. What does 'medium' even mean? It's arbitrary.
Didn't the Avatar devs hide the Unobtainium graphics setting? They probably did that cos PC gamers can't help themselves and want to max out every setting, then complain that the game is unoptimised.
I don't max out graphics on games, unless I see a significant visual benefit from it.
@@GugureSux if you don’t use upscalers, AMD is just better for gaming. raytracing also costs VRAM, so you probably don’t use that, either. if you don’t need it for productivity, keep AMD in mind should you upgrade ;)
@@gonozal8_962 Does AMD really bad in productivity and editing? 🤷🏻♀️
@@GugureSux its cause DX12 is actually garbage at its job, games engines make it even worse.
I upgrade my GPU about every five years, and I game at 4K. I need all the VRAM I can get. I went from an RTX 2080 Ti 11GB. I'm now using an RTX 4090 24GB. It really depends on the resolution and how often you upgrade your hardware. Some people will use the same old GPU for many years. I still see people on Steam using an RX 580 8GB. That's crazy to me. It's so old now. Not everybody plays the latest games. It helps to have a lot of system RAM for when you exceed dedicated GPU VRAM. I would be in the red playing Resident Evil remakes at 4K ultra settings with the RTX 2080 Ti and the games still ran fine. My laptop has a mobile RTX 4070 with only 8GB VRAM. The Last of Us Part 1 glitched out in a strange way with white light all over the place and then crashed. You definitely want more than 8GB for 4K ultra settings for AAA games. Laptop chips are too cut down. I prefer desktop gaming.
Everyone talking about how bad the 4060 was at $400 for 8GB, but I think the 4070Ti at $800 for 12GB is far, far worse. Imagine spending $800 and you have to turn textures down after only a year, and turning DLSS and/or RT on gets you stutters. Yikes.
Also, just got my 7800XT today, upgrading from an RX 5700 8GB. What a time to make a VRAM video! :)
I’ve had no problems running cyberpunk at 1440p with RT ultra and frame gen. I think this is a blown over subject
Not true at all. These additional 4GB actually make a big difference.
12GB feel a bit better at 4K then 8GB at 1080p. Thus 4070 Ti is a better 4K card, then 4060/Ti as a 1080p card.
But ideally 4070 Ti should be paired with 1440p screen (since it is not fast enough to run modern AAA games at 4K). And here it has absolutely no issues. I've been using my 4070 Ti for 1,5 years. And I haven't reduced texture quality in any game yet.
@@licensetochill1299 cope
I think it was Digital Foundry that said that Avatar manages textures based on available VRAM so it would make sense that its trying to use a lot of it if there is enough available. So VRAM usage should vary depending on what GPU you use.
Here I am again with my GTX 1070Ti with 8GB VRAM @ 1080p, just gave it some love and changed paste because hot spot temp was getting uncomfortable for me (also, summer D:)
The reason why I didn't buy the 4070 Ti was that I found it unacceptable for its price to have only 12 GB. This was later corrected by the 4070 Ti Super but it didn't exist when I was buying. So I settled for the regular 4070 which is fine for what I play and decided to keep the extra money and put it in a future upgrade.
The same for me, I got the 7900XT instead.
same I went with 7800xt for now. I will wait for a good priced 20-24gb vram gpu later.
@@spacechannelfiver the 4070 Ti was an expensive GPU, you expect more at this price point, and the 4080 was ridiculously expensive.
Would've been nice if the 4080 Super was bumped to 20GB.
@@ej1025 wouldnt have worked with the bus width of AD103, and frankly 20GB is overkill; 7900XT was more an accident of having 5 * 4GB as the XTX needed chopping down somehow. I ended up with the XT as I needed something to drive a 4K screen and it had price adjusted below the 4080 12GB I mean 4070 Ti at the time.
There was no way as I was going to pay that kind of money for a 12GB card at the time; and if the 7800XT had been on the market at the time would have gone with one of those.
I personally dont find RT massively compelling currently outside of some novelty titles, we're waiting for the PS5 Pro or even the PS6 to properly bed in; and the console crowd are increasing demanding high framerate so that will only delay it further. The only useable implementations have been stuff by Insomniac or that Lego puzzler. It massively screwed up the Art & Lighting in Cyperpunk & Metro. Maybe DOOM Eternal had a reasonable implementation.
The IQ is so much better driving raster with some smoke and mirrors at a decent framerate and close to native resolution than some more reflections or some GI that only updates every ten frames. It's the same problem with like Monster Hunter Rise on the Switch where everything more than 100m away suddenly animates at half framerate. It's really distracting.
Definitely going to need 16 GBs of VRAM for any type of future proofing.
You show horizon forbidden west not needing more than 12gb vram at 4k in your tests, but that does not match my experience playing on a 12gb gpu, where I frequently saw numbers at 11.8gb and accompanying stutters as it starts filling ram instead. That was in both main game and DLC. 10 min tests is nowhere near adequate.
well then go ahead and test for yourself. Every game at all those settings and that for 1 hour each.
Most people on the steam survey have 8 GB VRAM GPU’s.
Yeah it sucks hey.
Yeah, the 4060 with it's laptop version alone is outselling like every other current gen card combined rigth now.
And that is perfectly fine for most indie games. Just not anything remotely up to date like Enshrouded. Consoles can allocate 12GB to frame buffer which gets ported to PC as the "high" setting so just use medium or low. Medium of today is the ultra of 2016, when 8GB was normalized.
@@Hardwareunboxed yeah man but people are buying low budget than all these overpriced GPU's from AMD and Nvidia. Intel A580 is only $149.99 USD.
@@JayzBeerz most people on steam also play e-sports and old games, almost all games tested here aren't even in the top 100 most played
How did you measure? for other video would you use any gpuopen or Nsight ?
“VRAM or graphicscard memory has been a hot topic over the past few years” is so true. VRAM was crucial when I was looking to get a new GPU last Christmas, but it wasn’t even a consideration when I got my PC about 4 years ago.
I feel like 8-12gb of VRAM does just fine in 1080p. But 1440p and 4k will definitely enjoy having 16-20gb for max settings.
I think this whole vram panic is very profitable for gpu makers. People are willing to give them more money and go for higher-tier gpus because they have more vram. I think both Nvidia and AMD are motivated to keep this going for another generation.
Its literally the apple move, overcharge for iPhones with more storage and force people to pay crazy money for it
Intel enters the chat
You are reversing things.
amd left the gpu race
AMD has very good cards in the 6000 series, with 10 GB for the 6700 12 GB for the 6700XT and higher and 16GB for the rx 6800 and higher, with 192/256 bit bus respectively. having gotten an rx 6800 for 400€, these are budget options for mid-tier grapics cards
NoVramIDIA.
On top of everything they started to sell 70-class GPUs in 80-class boxes for 90-class price.
And yet AMD marketshare still struggles. It shows what a small percentage of buyers would qualify as "enthusiasts".
@@benjaminoechsli1941 enthusiasts market doesn't exist anymore. (Not the U.S.A anyways)
Yeah!
We customers say! 8gb is more than enough! We buy nvidia! Or at least 88% of us think so!
No, RTX 4080 is an 80 class card in performance. It's just a bit pricey
@@mitsuhh No it's not an 80-class card, not from performance and for sure not from the hardware perspective. Stop defending corporations man it's kind of stupid.
I wish I had this video when deciding between a 8gb and 16gb variant of intel arc.
The tests I saw back when I was shopping around found that the 16gb one didn't get much more fps, but now I've been finding games that are pushing its vram to its limits.
Thanks for solid content Steve!
I have 21:9 ultrawide monitor and wondering how much VRAM usage would 3440 x 1440 resolution take? Should I aim for 4K results for my VRAM requirement?
4070 Ti Super at minimum or a 7900 XTX
@@Shimo_28 cap got a 7900 gre and hit 100fps+ in newer titles on 3440x1440p
@@stevenostrowski6552 100+ isn't great when it's 144hz monitor and I was suggesting for ultra presets in games so that it could still be viable into the future, tell me what kind of games are you talking about
8GB: Can't max most game at 2K
Can max most game at 1080p
12GB: Can max most game at 1080p upgraded to 2K (will usually lack of VRAM on very demanding VRAM or badly optimized game), will usually not be able to max game on 2K.
16GB: Well, technically you can max everything, so you're kind of future proof. (but there's a few game that already hit the 16GB VRAM by their poor optimization)
20GB: To my knowledge. None game is enough badly optimized to consume 20GB (max I've seen was around 17GB VRAM). Because I don't own a 4K monitor. I don't know if some game hit the 18-20GB VRAM usage on 4K ultra (+ RT if possible) especially with FSR or DLSS using more VRAM when enabled. (You will the limit with some heavy computing/IA simulation/video
24GB: Well. You're more than fine (Almost no task hit the 24GB but there's some oddly specific who does)
what you call 2k is not 2k. you mean 2560x1440.
1920x1080 is what some call 2k, because the horizontal resolution is approximately 2000 pixels.
3840x2160 is called 4k because the horizontal resolution is approximately 4000 pixels.
1080p, 1440p and 2160p are not resolutions, but video formats, because the p stands for progressive scan as opposed to interlaced 1080i, 1440i, 2160i. all games are rendered progressively and adding the p is the same as saying that your car has wheels every time you mention your car, though it's unfortunately common these days, even among hardware reviewers
One of the main reasons I went 7900xtx. Never wanted to run into a situation where I needed more. I’ve seen upwards of 20gb used in very demanding scenes.
All these high quality textures increasing VRAM usage only to get blurred by TAA, DLSS, and FSR, it's shameful.
These graphs don't make sense. Just because a game CAN use 16 GB doesn't mean it NEEDS to use 16 GB at all.
This does not negate the fact that there really are games where 8 GB may not be enough for ultra settings.
But there is manipulation in this test. We are shown 2 games where FPS drops, and in one of the games we are shown a video clip on the game engine (in such videos FPS is not important and higher-quality textures are loaded there, as well as the director can make the scene temporarily more difficult).
And the second game was simply not optimized at the start
Then, when the right idea is embedded in our brain, we are shown a dozen graphs from the most modern games.
When you view these charts, remember a simple rule: The ability to use memory is not the same as the need.
Yeah it's difficult for the lack of an actually good assessment tool but he qualified it with first hand experience what happens at these particular settings with particular games, he pointed out several that went stuttery. It is actually pretty typical that the amount of VRAM occupied with objects hovers around what the game actually intends to use, and that you have some leeway to squash it but not very much. Longer term excessive caching isn't very good, since freeing the cache can lead to hitching in and of itself, so it's not a typical behaviour in well developed engines. And you do have a fallback algorithm that kicks in during scarcity, either the textures are getting eagerly removed (slightly stuttery), or loading emergency-halts (you get patches of very blurry textures), or a host allocation type is temporarily used for new loads instead of device type (reduces performance). I think unless you have results or first hand experiences to the opposite, that at a particular setting a game hasn't started degrading on an 8GB GPU in comparison to a more generous one, while he indicated that it would, you don't have a strong point.
He did mention that he will show Dedicated VRAM instead of Allocated.
16gb if you plan to keep your GPU for the next 5 years especially if you're going to be playing at 2k, 4k is a pipe dream.
A lot of folks bashing the 4060 but I actually like the 4060 as a low power card for upgrading low cost systems without having to upgrade the power supply at the same time. It just needs to be maybe 10% less expensive. The 8GB I think is acceptable on it. However, the rest of the Nvidia lineup sucks; high prices and not enough VRAM. That, and because I don't care much about RT, is why I went with an AMD 7800 XT this year; decent value and plenty of VRAM. I know it's not really better than a 6800 XT, but that's still a good card too.
I wanna buy 2 of them for work and gaming machine, for me and my wife. 8 vs 16 is 300 vs 400 € euro right now. 1080p is fine but I wanna use Minecraft Rtx, heavy shaders and mods. Maybe some AAA game? Maybe AI gets interesting for me? Feel like 16gb is slightly overkill but maybe safer
I have 20gb in my 7900XT, and I can tell you personally that it is NOT overkill for VR. There are many VR games that can get close to maxing out on using that VRAM.
For regular gaming yeah probably, but I'm using a 5120x1440 super-ultrawide and some more graphically intemsive games do use more than 16gb of VRAM. 20gb is just really nice to have.
People keep talking about VR but I never see any games where it makes sense... what are you playing there?
@@andersjjensen ever played racing games ? they make the biggest sense.
Still amazed by my 3080's 4K 60fps performance. Four years later, I am now upscaling 1440p Medium-High to 5K and then squashing back down to 4K for visual quality imperceptively similar to the native 4K High that I used to run in late 2020... still not hitting its 10GB limit. I can even run Medium with ray tracing without major stutters.
Though I expect that I may need to swap my card within the next 2 years, I'm totally satisfied with 4 years and counting of 4K gaming!
Chilling with 24 GB on 7900 XTX. Gonna be set for a while.
Even better considering you won't be turning on any heavy ray tracing and probably don't care about running AI cause AMD so 24 Gb Raster machine is definitely still going to be a raster machine with plenty of VRAM in 2030+.
@@albert2006xp I'm playing Cyberpunk with Path Tracing at 250 fps. (1080p XeSS 1.3 Performance + FSR3 Frame Gen + AFMF). I'm keeping my card underclocked to 2000 MHz instead of the possible 2900 MHz.
@@raresmacovei8382Dang. And RDNA 4 is supposed to have a massive RT boost. If they maintain their generosity with VRAM (at least compared to Nvidia), they should have a highly-compelling product.
1080p with xess performance is basically 240p it looks horribly 😂👌
@@raresmacovei8382 whom are you lying to? Yourself? At 1080p upscaling is bad idea, at best DLSS quality looks decent, not FSR, despite that say you get 250fps with XeSS 1080p performance? Hilarious, that must have been mudy & blurry mess, I bet. Good luck to you 😂. Like, honestly? 540p upscaled to 1080p LMAO!!!!
My RX 6950 XT is definitely gonna last.
So far I’ve had no problems with my 10gb 3080, playing ultrawide. Even Cyberpunk runs nicely.
Everyone building or buying a gaming pc should see this video first. I would not go below 16 gb in 2024 in a new system. Also why are gaming laptops dumbed down with low vram a 16 gb laptop costs a fortune. Anyway this was a really interesting watch
Nvidia has almost always been stingy with VRAM. The GTX 10 series was an anomaly except for the 1060 3GB...
The VRAM cost of frame gen in Avatar is insane. Whats going on there? +4gb at 1080p?
I recently acquired a 4070Ti with 12GB. I noticed most of the games I played had zero overlap with the games tested here.
My games almost never exceeded 8GB of VRAM usage, even with ultra+dlss+RT+framegen at 1440p.
The games were indie titles like Palworld, Helldivers, and Darktide.
Indies and multiplayer titles wont use nearly as much VRAM as single player AAA. No point in testing those games
@@ssjbargainsale and the said AAA slop is the kind of shit that always has THE worst optimization, worst art-design, and generally get forgotten within the same year they drop out. There's truly few exceptions happening these days, and they're usually made by old industry pros who understand visual directing and optimization.
DLSS and framegen are meant to save GPU processing power and VRAM. They are the crutches the modern devs use to excuse their poorly optimized Blender vomit visuals. Also lol @ calling big-publisher trash like HD and DT "indie"!
@@GugureSux I agree. Very few AAA games worth playing in the last 5 years
@@ssjbargainsale that's not true. Medieval Dynasty is why I sold my 3060 Ti and got a 6950XT two years ago. Granted I play at 1440p but still
After watching Black Myth: Wukong gameplay footage,
I will consider 16 GB of VRAM GPU because I want to maximize the setting for that beautiful graphic
thanks for the explaination, you explain it really welll with perfect pace, not like other youtubers that explain it like they rushed to the toilet
It's the video games, completely out of bloody control. Near horrible optimization and carelessly packed.
You can't expect everything to have hyperrealistic graphics 4k textures, ray tracing, bloom, fxaa and all the jazz then somehow expect GPU companies to magically slap an extra 10 gb of vram on their GPUs and still sell it affordable.
Fix the gaming industry, the gaming hardware will follow. Keep the gaming industry in an eternal state of inflated requirements for negligible visual return and gaming hardware will continue to spike in price.
Yup. As greedy as NVIDIA is, sometimes it's game publisher companies not giving the devs time to optimize games and/or having difficulty accepting that PC gaming is more relevant to people now.
It does not matter if AMD/NVIDIA actually listens to their customers, if lots of game publishers decided that (as an example) RTX 4090 or its 24GB VRAM is going to be minumum very low settings 24fps for 640x480p resolution in their games.
Both GPU manufacturers may offer 32GB for their next series of xx60 (or x600 for AMD) but if publishers decided that 64GB required low settings 480p, then yeah....
I bought a 7800xt because of the 16gb of vram. I probably would have bought a 4070 super or 4070 ti if they had 16gb for the same price. But the 4070 ti super was nearly double the price of a 7800xt and I don't game enough to justify the extra money.
My friend spent $800 on a 4070ti 12gb and he barely gets more performance than my $400 6800xt, nvidia is straight up pathetic imo bc they know how to gimp cards perfectly
@@puffyips Considering the 4070 Ti is on average 18-20% faster than a 6800XT and is equal to a 3090, I guarantee you your friend is CPU limited.
@@puffyips4070 ti competes with 3090 ti performance wise. I'm pretty sure your friend has a weaker cpu.
@@riven4121cod just gets more fps on amd cards for some reason, but I did tell him to go with the 5800X3D instead of the 5900X atleast for gaming
@@riven4121+ just look up benchmarks the 4070ti will get let’s say 130 fps where my 6800xt gets 110, that’s not worth double the price.
How does the 4060 with only 8gb still outperform the 3060 with 12gb at 1080P and 1440P though?????????:?
With 8GB, 12GB and 16GB cards being mainstream you can expect low settings soon to be fir 8GB cards.
2016 to 2024, 450 becomes 600. About a 33% increase. I don't make 33% more :(
Well I bought GTX 1070 for €400 in 2017. 4070 Ti Super with 16GB in EU cost €900!
My wage certainly did not increased by 125%!!!
I know this is old, but you guys are awesome, and the information you provide is priceless! Thank you for everything you do! HU and GN are the 2 places I go before a hardware purchase- if you both have positive reviews of whatever it is, I buy it. Thanks for never letting me spend good money on underpowered or underperforming hardware.
so all gaming laptop users are screwed?
"Gaming" and "laptop" right next to each other is blasphemy in and of itself
Its been screwed for years, laptop users jist wont admit it 😅
They always been.
That's nothing new. Gaming laptops have always had a short lifespan. If it isn't the heat killing it early, it's the lack of VRAM or lack of drivers because your laptop OEM requires ones for your specific laptop that they don't even bother updating.