If you're running 8GB of VRAM even on powerful GPUs like the 4060/4060 Ti/3060 Ti/3070, or even the 10GB of the 3080, dropping graphics settings to High rather than Ultra would certainly make this VRAM issue solvable. The additional particles on Ultra compared to High always give diminishing returns versus their performance cost, no matter the card.
Those who like to mod games (especially with texture packs) will find the 12GB buffer can be exceeded much more easily. My previous GPU was the 1080 Ti. I didn't like the idea of upgrading to a card with only 1GB of extra VRAM, so I picked up a 6800 XT. The only Nvidia card I would consider buying now is the 4070 Ti Super, but it's still a bit too much on the expensive side for my taste.
My 6700xt rocks. Raytracing taxes the cpu too much for the benefits that I barely notice anyway, so thus far I have zero issues or games I can't play. Thanks for making all these comparison vids, very useful info, I appreciate all you do fellow Daniel.
Playing Cyberpunk 2077 at 1440p with everything maxed out, RT Psycho & path tracing on, with a 4070 at a constant 60 FPS, using less than 200W and 11.5GB of VRAM.
Because that's the current threshold for the 9th console generation. PS6 and the next XBOX generation will change that, but we are 4 years away from the start
I have a 6700 in my gaming rig, and a 5700 XT in my secondary attached to the 4K TV. I can play nearly everything on the secondary just fine. The 6700 just rocks without flinching.
Last year I was using a 4070 Ti, and 12GB of VRAM isn't enough juice to run AAA games, UE5 games or PS5 console ports at 1440p/4K ultra settings. Games like Forspoken, Immortals of Aveum and Hogwarts Legacy were very laggy and it destroyed the experience. After moving to a 4080, all the VRAM issues were solved; all the games above already push VRAM to around 13GB to 15GB at 1440p-4K ultra max settings.
When I got my 4070 at launch, I never bothered stressing about whether 12GB of VRAM was enough for 1080p or 1440p. Consoles have 16GB unified, so probably no more than 12GB of it acts as dedicated VRAM. For me, 12GB won't stop being enough when the PS6 comes out, but when they start releasing games for PS6 only, kind of like how early PS5 games still worked on PS4 but newer ones are PS5 only. So it's going to be a few years off, and when it does happen my 4070 won't have enough horsepower anyway.
Yeah basically 12Gb will hold... but be ready to not play at the highest settings in 3 years or so. You're gonna be turning down some RT, some textures way before PS6 games are a thing. PC release only features basically.
The thing to remember about consoles is that part of that “unified memory” is still reserved as system RAM. As someone else said as well, the consoles don't even run aggressive textures, since they still prioritize FPS over fidelity.
My 4090 rarely exceeds 12GB for 4k, which is why I went ahead with a 4080 laptop instead of a 4090 laptop (a sale at half the price). The 4080 laptop is a 4070 desktop equivalent. I don't expect problems until after the start of the next console generation and understand that I will have to utilize DLSS tech for the few most demanding games.
I'm currently playing Rise of the Tomb Raider, a 2015 game, and at 1440p DLSS Quality with everything maxed I'm going beyond 8GB of VRAM and getting massive stutters on my 3070 Ti. This is a 9-year-old game; if 8GB wasn't enough back then, why are we still seeing so many 8GB and 12GB cards in 2024?
23:45 - this. You should aim for 16GB even though PC has VRAM separate from system RAM. We simply run higher settings and resolutions, and sometimes on bad ports. We need the headroom.
3070 user here with a 4k monitor. I can get away with 1440p and med/high textures upscaled to 4k on top of frame gen and I usually have enough VRAM for that. Otherwise I'll try to run 4k DLSS Performance + Lossless Scaling as well.
Well, using this logic, I don't think upgrading a card will be driven by the VRAM but by the pure performance required to run the games that WILL REQUIRE more than 12GB of VRAM. In that case you are upgrading not because of the VRAM but because of the chip technology and what is required to get the FPS, or both.
4K is certainly doable on a 7700 XT/6800 in many games, especially when using upscaling. It would have been useful and interesting to have 4K comparisons as well.
Before watching the video, I already know this is mostly going to be about over-the-top settings in AAA games that I'm never going to play. Most people don't buy the kinds of cards that will run these games on the highest settings anyway. Since my 6GB card is only just now starting to struggle a bit with modern games, I'd say 8GB is fine for most people right now; 12GB is probably what I'll aim for in my next card and probably what I'd recommend people start buying if they're buying new. Anyone who tries to say we 'need' more than that has unrealistic beliefs about what and how most people play.
Heya. Don't know if this really helps you, but I have been playing Ghost of Tsushima on a Vega 64 (8GB) on High in 4K just fine! But it's with FSR Quality, capped at 60fps. (The GPU isn't even sweating, running below 100% most of the time if you cap the frame rate at 60; Tsushima in particular is super well optimized.) Generally speaking, the really outdated Vega 64 has been running everything just fine in 4K at medium-to-high graphics (30-60fps)! So you should be fine until your GPU upgrade if you are willing to turn down your settings a bit. I personally don't really see the difference between Very High and Ultra, and the performance gains are totally worth tuning it down a notch. Also (in my case) it helps that I don't even have the option to ray trace xD
@@blubbel222 Thanks, it does help. I use the 3060 12GB model, which is similar in performance to the Vega 64. At least now I know Ghost of Tsushima won't be a problem in 4K. I reckon there are many games that do not cross 8GB even in 4K, so hopefully it won't be a big issue.
Sometimes VRAM doesn't affect performance, but sometimes it does. It depends on the PC specs, the games you play, the monitor you use and the software you run. For overall use 8GB is fine, and 10GB or more is much better for future-proofing. Same as when building a PC: AM4 or LGA 1700 are fine, but for future-proofing get AM5.
I'm curious about HDR (not that it's related to this at all), but how many of you use HDR in your games? And if you do, do you use Windows HDR or RTX HDR from the new Nvidia app? I guess most people just use SDR for simplicity, and because that's what we've always had before.
I wonder if GPUs are designed to only use a specific percentage of the available memory, keeping a chunk generally available for spikes. If that were the case, you would almost never see fully utilized VRAM. Also, I would like to know what a GPU does as it approaches full VRAM. What specific adjustment to the visual quality does it make? It is possible that it first adjusts color, sharpening, etc. rather than FPS.
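One way to watch the headroom yourself is to poll the driver while a game runs. A minimal sketch using the pynvml bindings (pip install nvidia-ml-py); note it reports device-wide numbers, not per-game usage, and the 512MB "almost full" threshold is just an arbitrary example:

import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Poll total/used/free VRAM once a second while a game runs,
# to see how close the card actually gets to its limit.
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        free_gb = mem.free / 1024**3
        print(f"used {used_gb:5.2f} GB | free {free_gb:5.2f} GB")
        if mem.free < 512 * 1024**2:  # arbitrary "almost full" threshold
            print("less than 512 MB of headroom left")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()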
I really like this comparison, but it would be nice to see the GPU usage % in the top left as well, so at least sometimes we can rule out unoptimized games (low utilization) as the potential reason.
Thanks for this Daniel! Would it be possible to add a GPU percentage utilization stat to your overlay? I noticed the 6800 power/wattage dropped for most of the ray tracing tests vs the 7700 XT which makes me think the GPU was not fully utilized during ray tracing. Probably drivers as you mentioned but the % would give us a bit more info on this
Question: is there anything inherently wrong with VRAM being the limiting factor of a GPU? I understand it's a manufactured limitation, but would it be better to be limited in other ways, like power, clock, CUDA cores or bus width?
@@KrisDee1981 The 6800 also has more bandwidth due to its 256-bit bus vs the 192-bit bus on the 7700 XT. The effective gap is smaller than the bus-width difference because the 7700 XT uses faster memory, but if bandwidth were the bottleneck it should already have made some difference here.
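For reference, a quick back-of-the-envelope bandwidth calculation, assuming the commonly listed memory specs (16 Gbps GDDR6 on the RX 6800, 18 Gbps on the RX 7700 XT) and ignoring Infinity Cache, which also differs between the two:

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

rx6800 = bandwidth_gbps(256, 16.0)    # ~512 GB/s
rx7700xt = bandwidth_gbps(192, 18.0)  # ~432 GB/s
print(rx6800, rx7700xt, f"{1 - rx7700xt / rx6800:.1%} less")  # roughly 15.6% less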
It is very simple: 12 gigs of VRAM is enough for all or most games at native 1440p high settings, with or without some level of ray tracing, and you should still get a bare minimum of 60 fps, up to 100+ fps depending on the game.
16GB needs to be on the next 5070. The 4070 Super should have been 16GB. The extra memory is for advanced settings like DLSS and RT (both at the same time).
I don't understand why people say we need 16GB because the PS5 has 16GB; it's split between the CPU and GPU, and I see modern games use at least 10GB of CPU RAM, so...
I'm still using the RTX 2080ti that I bought used 2 years ago for only $300. 11GB of Vram is actually perfect for this performance class, and 2560x1440 or 3440x1440 resolutions.
Well, I was using a 42" LGC2 4K OLED and 2080ti was struggling in newer games like Alan Wake 2 even without RT, and with DLSS performance. It wasn't because of Vram, but just not enough GPU power. 6 months ago I changed to 34" 3440x1440 and it's a much better combo.
@@KrisDee1981 Alan Wake is a special case, as it is one of the few games that started using mesh shaders. NVIDIA cards didn't really make good use of their mesh shader hardware until the 4000 series.
@@ehenningsen Looking at the benchmarks, Turing performs as expected in AW2. 1440p max, no RT:
2080 Ti: 40fps
RX 6800: only 2% faster
4070: 30% faster
I know for sure Pascal GPUs are struggling big time in AW2, even the 1080 Ti, but Turing is still going strong in rasterization and light RT.
I've done my share of research and it all points to 12GB of VRAM being perfect for 1080p gaming; you only need 16GB for 1440p. But then the next question will be: is the VRAM on a 128- or 256-bit bus, and how many CUDA cores are there? For a GPU to be optimal, it needs fast VRAM and enough CUDA cores to make the most out of the 12GB. I was very interested in the ASUS Radeon RX 7600 XT Dual OC 16GB, but it turns out that this GPU doesn't really have enough "power" to make use of all that 16GB of VRAM, because that much VRAM is mostly needed at 1440p. So, in the end, I believe the real question is: for budget and mid-range, who are the kings of the 12GB GPUs? A bird whispered to me that the RTX 3060 12GB could be a winner here - for pure 1080p gaming. What's your opinion on this?
It seems that VRAM is not the only thing to consider, nor upscaling / ray tracing; we may be underestimating the optimisation factor. That makes it more and more difficult to buy hardware these days. That was a great video! Thanks.
Good choice of GPUs to use for the comparison. My opinion is that the 4070 TI variants are too powerful to make a comparison for these more commonly used resolutions.
The RTX 3080 Ti and 3090. Both have the exact same GPU die, memory bandwidth, and near-identical CUDA core counts... same manufacturer, same hardware generation. The RTX 3080 Ti is essentially a 3090 with half the VRAM.
Also, desktop resolution matters: a 4K desktop uses about 800MB of VRAM, 1080p uses 300MB or less, and that is included in the allocated value. Multiple monitors add up too.
Did not consider that. So with my dual 1440p setup I will be seeing slightly higher VRAM usage than if I only used one. Wow, that seems like such an obvious thing, but I never considered it.
@@mooseonshrooms yeah I only started considering it when using Nvidia DSR 1080p to 4K+DLSS for those borderless fullscreen only games where you need desktop at 4K and saw 300MB VRAM jump to 800MB!!!
I believe it does not matter, if you are using in-game exclusive fullscreen mode. But it would be interesting to make such a comparison anyway.
@alargecorgi2199 The thing about console games is that they don't run on Windows with its DirectX driver stack; optimization is better on consoles because there is less driver overhead. So if consoles allocate 12GB for VRAM, that doesn't mean it's applicable to PC. Consoles also often make use of upscaling, while many of us PC gamers prefer to play at native resolution, not using FSR or DLSS.
@@fleurdewin7958 Most people follow the misconception that "consoles allocate 12GB, etc." and apply those numbers directly to PC, when each platform has a different API and different driver overhead.
Is 10 million dollars enough for a house
I can't believe I have to specify but this is a joke
Not in the west.
@BattleToads Depends.
🤓replies
Market says 8 million wasn't enough so ofc we discussing 12
@@Sevastousmight be able to get away with 10mil if you're willing to upscale
So when some of the newest AAA games are already right at the edge of 12GB, we can logically expect VRAM usage to increase above 12GB in the coming years. That puts the current graphics card market in a really weird position, since Nvidia charges $800 for the first real 16GB option (I don't count the 4060 Ti). AMD starts with the 7800 XT, which is slightly under $500. That's A LOT OF MONEY on a graphics card alone for the normal PC gamer.
It won't use more until the next console generation. That's roughly 4 years away, when you would already be upgrading anyway.
@@jayceneal5273 how do you know that? whats your source?
@@svenyproud they arent going to make the minimum vram usage more than what consoles can use lol
@@jayceneal5273 That's not true, tons of games are made by AA studios or indie studios and they care less about consoles and these studios are the ones that are making great games this year. And just because consoles are limited doesn't mean the same game won't use more on PC.
@@No-One.321 if any indie game or AA game does then it's not very optimized and that's rarely the case anyhow, usually they are made to be run on potatoes. You're just fear mongering for something that isn't going to be a problem
6750 XT user. 12 GB has been enough. Never been an issue.
That card right there was the best value of last gen. Saw some good sales on it when it was still new.
6700xt here, killer.
@@Mattribute 6800 is the better deal honestly
Ye, there are plenty of other things besides that which have an impact on performance; VRAM is rarely actually the issue.
@@Mattribute 6800 or 6800 XT is a bit better in my opinion as it's more future proof
I crack up every time you move yourself to point things out 😂😂 I'm not even into gaming that much anymore, but these videos are fun to watch.
If it's about fun watching, my old streams were fun to watch; I didn't think about visual quality and just played games as usual.
I'm pretty sure they'll be automatically added to the studio; I'm too lazy to contact them and ask for a revival.
Curious to see 4070 ti vs 4070 ti super in 4k settings.. Similar difference to 7700xt / 6800
The 4070 was never meant for 4K, 1440p only. If you've got that 4K monitor, then upgrade to a 4090.
You should have tested TLOU as well! That game actually makes gpus perform worse when it tries to use more vram than the gpu has.
Played it a few weeks ago. Worked like a charm.
that was before the patches
Well duh any game is gonna run worse with more vram needed than allocated
That's with any game. If any game uses more VRAM than a card has, the game will start using system memory which is a lot slower than GPU memory.
It would not be a good test, since that game always allocates more VRAM than it needs. I saw someone do a system requirements test on it when it had just come out, and it tried to allocate 14GB of VRAM while not even using half of it; the settings were 1080p High at native resolution.
12GB at 1440p is fine for this console cycle. Developers will not create higher-resolution textures just for high-end PCs if they won't run on a PlayStation 5, which makes 24GB useless.
unless you actually do PC things and apply mods and reshaders...
@@totalermist that's true but that's a different argument. Personally, I have learned not to bother with mods.
@@mrnicktoyou Are you joking? Why not? There are so many games that have mods better than what the original developers even did. I'm still playing new Left 4 Dead 2 campaign maps for free that blow away anything Valve did. It turned a game that would've been entertaining for a couple of years into a game that is still immensely entertaining after 15 years.
Not to mention, that 24GB card is going to age a hell of a lot better in the next 2-3 years than a current card with 12GB-16GB.
@@TooBokoo I'm sure they will age better but I'm also sure the majority who spent top dollar for those cards today will be upgrading again next GPU Gen so 24gb helps the future used market more than first time owners of those cards.
@@totalermist The vast majority of PC players are not interested in modifying their games, so that issue is irrelevant in the general use of the GPU and its VRAM.
12GB is a very good amount for 98% of games at 1440p.
Exactly what I was thinking. It might not matter now, but it's good if you don't want to upgrade for a while.
Will it be enough for extreme settings? For example, the Cyberpunk 2077 ultra-realistic reshading mod at 1080p? (To clarify, it's a mod that pushes the graphics far, far beyond the normal Ultra settings.)
@@mukamuka0 nope
Exactly! By the time it's not enough, a huge leap or seismic shift will probably have happened. So it's not a real worry in the here and now. Get something that works with at least 12 gigs and be happy! But seriously, I purchased a 4070 Super and have no regrets for the price or my gaming needs. Cheers!
@@megadeth8592 dang it...too bad
As a 6800 user who plays less demanding, non-AAA games at 4K, it's also interesting to me that the 6800 appears to beat the 7700 XT in efficiency, drawing 30-50 watts less. Unless you require AV1 encoding, the 6800 seems like the sleeper GPU for 2024...
The wattage reports differently between rdna 2 and 3. Pretty sure the 6800 is only showing GPU power but the 7700xt is showing total board. I think in reality there isn't a meaningful difference in efficiency.
Nah, the RDNA2 GPUs don't show the full wattage that the RDNA3 and Nvidia GPUs do.
I have a 6800 but can't take advantage of it because i had to spend all my money on a 4060 laptop for school, instead of upgrading the 5 year old i5 that's paired with the 6800. It's painful to watch these videos and see how good the 6800 is, because i literally have it but i also kind of don't.
Daniel is correct. 6800 doesn't show the total board power usage. I think it's another 40-50 watts added onto the 6800 to get the total board wattage, so they basically have the same efficiency.
@@ninjafrozr8809For what school do you need a 4060 Laptop?
Something to consider are games that progressively use more vram during longer play sessions but there is no way for Daniel to test that effectively.
that's called a memory leak, it's not supposed to happen.
@@MaxIronsThird It's not "supposed to happen", but it does happen in some games.
@@legendp2011 So you're saying I need a card with more VRAM because of a bug, and that's normal? I don't get it.
@@MaxIronsThird what did you not get?
@@legendp2011he should also test with an operating system that is out of date. Perhaps had a virus. He could do a test while having 1000 tabs open. Yes, all valuable pieces of information
OK, it's starting to get annoying: how about 10GB? A LOT of people have the 3080, and it was/is a very good deal at certain times. And yet it gets completely overlooked again and again. I just wonder how limited it is compared to the 12GB version or the 4070 (Super).
A lot. The 3080 situation is completely insane: the 1080 Ti had 11GB, and many years later you get 10GB on a 3080. Also add the prices people paid for it...
the 3080 is gonna struggle in the very near future, 10gb isnt gonna be enough for 1440p really especially with newer games today
@@Definedd already struggles
How about 11Gb?
Well let me tell you. I have an RTX 2080ti and it's a perfect Nvidia midrange GPU. Bought used 2 years ago for only $300 before Vram drama. Used 3070 was much more expensive.
The 3080 was ridiculously expensive almost until it was discontinued due to scalpers. And by that time the 4070 was released which was the best new GPU option for that performance tier (similar performance with more VRAM and dlss3 for a lower price).
You were talking really well about this card (6800) last autumn. It went on sale with Starfield and I picked it up. Still going strong, and it can play just about anything I want on high at 1440p. Really happy with this purchase.
Thanks!! gr8 video.
I always had stutters in HL: Alyx with the RX 6600 XT (8GB, the only reasonable card during the GPU apocalypse). Now that I have an RX 6800 with 16GB I no longer have those issues, and the Radeon software reports 14.5GB usage in that game (when playing at 5400x2700), so I guess it was VRAM-related all along. (Or me not understanding that 8GB wasn't enough for "high" textures.) Today I just love the security of having plenty of VRAM. Throughout the ages, VRAM is the single thing that has increased the longevity of GPUs, and I am happy to have some proper headroom.
I have an 8GB RTX 3070, and in Horizon Forbidden West I had to change textures to High instead of Very High just because I didn't have enough memory, and my FPS dropped from 90-110 to as low as 36 in cutscenes... It's fair to say that 8GB is definitely not enough in 2024 for 1440p.
Just because you had to lower one setting, which you wouldn't even visually notice during gameplay, it's not enough? Really?
@@BastianRosenmüller Try running Stalker 2 with 8GB; it isn't even enough for 1080p.
@@essn6807 Stalker 2 is a terribly optimized game which runs like shit even on my RX 7800 XT with 16GB of VRAM. Anyway, what is the problem with lowering texture settings and tuning the game so that your card can run it? Most games run perfectly fine on 8GB cards. Just because a few badly optimized games don't run well doesn't mean 8GB is not enough. In 99% of cases it is enough, and in the 1% of cases where it isn't, you can just lower some settings.
One thing about 15:00:
It seems like the system RAM usage with the RX 7700 XT is way higher, 16.7GB compared to 15.1GB with the RX 6800. Not sure if there were other reasons for it, but it's something I find interesting.
It seems like the same happens at 1440p. It also happens in Starfield at 17:44, while it doesn't even come close to using 10GB of VRAM.
Hmmm..interesting..🤔
It varies. At 3:14 the 7700XT system is using less ram than the 6800 - 14.4 vs 15.4. In Alan Wake2 sometimes the 7700xt is higher and sometimes lower.
@@lawrencekallal6640 oh yeah there it is like that. Would be interesting to know why it is like this
I just ended up getting a 7900 GRE; 16GB seems like a safe spot and it was considerably cheaper than a 4070S. Sure, no Nvidia feature set, but it is what it is; so far I've been enjoying it!
Honestly, with this test I feel like it's difficult to compare 12GB vs 16GB because there aren't any games that actually use over 16GB of VRAM (unless RT). Once those become more mainstream due to consoles, then we'll see a performance difference, just like how 8GB VRAM cards are becoming irrelevant.
8GB cards started to struggle when the new-gen consoles were released. The same will happen to 12GB cards once the next-gen consoles launch in 3-4 years. Until then there won't be any major changes in VRAM usage. We have been seeing the same game behavior for ~3 years: 8GB has become insufficient for max (ultra) settings even at 1080p, but is still enough for high settings; 12GB is enough for 1440p/max and 16GB is enough for 4K/max.
The RTX 4070 and RX 7800 XT are basically the new GTX 1060 6GB and RX 580 8GB. Look how that turned out.
Yeah. The rx 580 aged far more gracefully.
Not the same. Nvidia Pascal and AMD Polaris have equal gaming features, whereas the RTX 4070 gets better ray tracing capability than the RX 7800 XT. As a gaming card, sure, the RX 580 is better today. However, for HTPC use the GTX 1060 is better because it has a hardware VP9 decoder, which YouTube playback requires; the RX 580 doesn't, and only the RX 5000 series and newer can decode VP9.
@@fleurdewin7958 Better ray tracing in some games. It's mostly Cyberpunk that heavily favors Nvidia; in other ray tracing titles AMD RDNA3 performance is on par with or sometimes even better than Nvidia.
@@yt-mull0r Alan Wake? Ratchet and Clank? Robocop? Avatar Frontiers of Pandora?
@@yt-mull0r that's not totally true
Keep in mind this is just one-time testing. If you do actual gameplay or longer gaming sessions, it goes nuts, especially in games where the memory leak issues are horrible, if you know what I mean. 🤷🏻♀️
10GB was enough-ish on my 3080, but I would have preferred 12-16 at least to feel safe.
10GB is enough for now on my RX 6700. In theory I can add 2GB more, but that requires some soldering and BIOS editing.
@@kkrolik2106 I'd want to do that just because I can. Because not a lot of cards can have that even with soldering and a custom bios. If the 6700 was 8GB it might've even been worth it for the trouble but I'd say the 10GB memory will well outlast the capabilities of that GPU.
@@kkrolik2106 To be fair, the RTX 3080 with its 320-bit memory controller would have deserved 20GB of VRAM, because that card is just much faster. Back then they put 1GB of GDDR6X on every 32 bits because 2GB GDDR6X modules weren't available. On the other hand, on the 6700's 160-bit bus AMD already put a 2GB GDDR6 module per 32 bits, so how would you add an extra 2GB? Would, for example, 4x 2GB modules plus 1x 4GB module work on that card; can a memory controller handle mixed module capacities? The only successful mod I saw was an RTX 3070 going from 8GB to 16GB, where the guy simply replaced the 1GB modules with 2GB modules on the 256-bit memory controller.
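To make the channel math explicit, a rough sketch (one GDDR6/6X module per 32-bit channel; this ignores clamshell configurations, which double the capacity per channel):

def vram_gb(bus_bits: int, gb_per_module: int, bits_per_module: int = 32) -> int:
    # Each memory module hangs off its own 32-bit channel.
    return (bus_bits // bits_per_module) * gb_per_module

print(vram_gb(320, 1))  # RTX 3080: 10 channels x 1GB GDDR6X = 10GB
print(vram_gb(160, 2))  # RX 6700:   5 channels x 2GB GDDR6  = 10GB
print(vram_gb(256, 2))  # RTX 3070 mod: 8 channels, 1GB modules swapped for 2GB = 16GB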
You don't need to feel "safe"; it's not like games won't run if you hit VRAM limits. Watch the 8GB video to see how little impact this has, even in 2024.
that's why I waited for 12GB variant to release (I got it on launch day and have it undervolted)
Maybe the 4070 Ti and the 4070 Ti Super would have been a closer match for comparing the VRAM more accurately, but these are still a good example. Nice comparison.
My exact thoughts.
4070Ti S is quite a bit faster than the regular Ti, the comparison wouldn't work.
It's less than 5% in most cases, which wouldn't make a big difference, and it would be a better representation, setting aside generational and architectural improvements.
@@Jav-z4s 10% you mean
Daniel has already made a comparison between the 4070 Ti and 4070 Ti Super at 1440p and 4K.
In 99% of cases 12GB was fine. There were a couple of examples where the Ti ran out of VRAM, but in those cases even the Ti Super could not provide a playable experience.
The VRAM panic is getting out of hand IMO. Even 10GB cards are still mostly fine, let alone 12GB models. It is just 8GB GPUs that struggle in many modern games, since they usually need at least 8-9GB to run without any issues.
I have my OG 3080 with 10GB and it's all working perfectly fine at 1440p, and even at 4K in many games.
I have a 6800 and play AAA games at 1440p and it smashes them all; never had a single issue, love mine.
But that's 16GB, not 12, right?
@@fabrb26 yeah
Love my 6800, bought it used but the coil whine is a bit annoying
@@jamieshephard2432 Pfftt, weak card, can't even use ray tracing, trash
@@JazanMuyanto yeah ok if you say so lmfao. ray tracing aint all that anyway n even if i wanted to use rt my gpu would handle it
I have a 7900XTX and just bought Avatar. I play at 4k with FSR and frame gen. The benchmark is pretty choppy with highs in the low 100s and lows as low as 25 yet actual gameplay is highs of 300 and lows of around 100. Allocated VRAM is 22gb!
From this video i can tell that even 8GB is almost enough for native 1440p Ultra settings. That means with a bit of DLSS/FSR 8GB is just the right amount of VRAM to have for 1440p in 2024. 10GB of VRAM will last you a bit longer, and 12GB of VRAM is FAR from being a bottleneck.
I actually have the RX 6800 16GB so i'm pretty well off but i also have a 4060 laptop which is 8GB so i occasionally check these videos haha.
I'm on 8gb 1440p and its great
It's not, games like Forbidden West would have noticeable chug on 8Gb at 1440p. And that's just an example off the top of my head. A lot of games allocate textures based on your VRAM as well, so you might get blurry textures as it approaches the cap even if you don't hit the drop FPS in half out of VRAM wall. 12Gb is probably fine for a couple years, but if you're buying a new card at this point I would assume you want to be covered for at least 5 years.
@@Zombie101 "Great"? You literally can't max out textures, which is what makes games look best; that is not great.
RTX 3070, 1440p High with RT reflections in Cyberpunk 2077: after opening the map or inventory I get drops to 15 FPS for 10 seconds because of the VRAM bottleneck. 8GB is not enough for 1440p High even without RT lol
@@albert2006xp This isn't buying advice. This is just telling 8GB card owners that they can hold out on an upgrade for another year.
Forbidden West is in this video and it uses right around 8GB at 1440p Very High. That allocation is on a 12GB card (games allocate extra memory if you have it), so it's safe to say it would use around 7.5GB on an actual 8GB card. And again, that's one of the most demanding games of 2024 at native QHD max settings. Then there's upscaling for when you need it next year, so you're covered there.
People who own 3070s or 6600 XTs should be fine until next-gen cards are available at sensible prices. I have a 6800 16GB, so I'm good for the foreseeable future.
You should test the 10GB and 12GB versions of the 3080, and just underclock the 12GB one by about 5% to match the performance of the 10GB. I'd love to see that video, since it would be a good showcase of whether 12GB is enough compared to 10GB, and whether those with the 10GB version were basically screwed by Nvidia.
A lot of gamers are looking at VRAM the wrong way, been saying this for years.
Just because VRAM allocation says 14GB doesn't mean you need 16GB; if you have 12GB it will just allocate less. What matters more is your memory clock and bandwidth: the faster your GPU handles memory, the less it needs to keep allocated.
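The allocated-vs-actually-needed distinction is easy to see outside of games, too. A minimal PyTorch sketch (the exact numbers depend on the caching allocator; this is an analogy for the reporting gap, not how game engines manage VRAM):

import torch

# Allocate and free a batch of tensors; PyTorch's caching allocator keeps the
# freed blocks reserved from the driver so future allocations are fast.
bufs = [torch.empty(64 * 1024 * 1024, device="cuda") for _ in range(8)]  # ~2GB of float32
del bufs
torch.cuda.synchronize()

in_use = torch.cuda.memory_allocated() / 1024**3    # what live tensors actually occupy
reserved = torch.cuda.memory_reserved() / 1024**3   # what the process still holds from the driver
print(f"in use: {in_use:.2f} GB, reserved: {reserved:.2f} GB")
# "Reserved" stays high even though nothing is needed anymore - the same idea as a
# game showing a big allocation number while its real working set is smaller.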
Although no one has thumbed up your comment, I thought the same as you based on my own experience. Using VRAM is like a subway with a limited number of seats: it can still carry more people than it has seats to their destinations without delay.
@@loxonin138 Ikr, people saying 12GB of VRAM is not enough are literally dumb. I can play games natively at 1440p at over 144 fps. I'm even using 1620p DLDSR with Quality DLSS (mainly because TAA is trash; if DLAA is an option I use that instead of DLSS) and ultra settings, and games still run like butter. Above 12GB for 4K, OK, sure, in SOME games. Most games aren't that demanding though, and the most demanding games are often poorly optimised or only perform badly with ray tracing, which is also true for the 4090. Try running a game with ray tracing on that card, you won't hit 100 fps. Clearly VRAM isn't everything, because a 4090 has 24GB of VRAM and still can't do high fps at 4K. I wish more people had brains and stopped spreading misinformation; all these console players coming to PC thinking they know about hardware when they know fk all are to blame, I think. Just because they come to PC and build one they think they know everything, but it's easier than ever to build a PC nowadays; I could teach an 8-year-old how to do it in a day. Back in the day, you would get crucified on sites like Tom's Hardware if you even dared to claim VRAM is the only thing that matters.
I got the standard RX 6800, and I'm happy with it. Does a great job.
I use my 6700XT in my living room on my budget 4K TV and it usually requires that some form of upscaling be used. So far I haven't found a game that simply refused to work on it, although there's discussion about the importance of a card's memory bandwidth when trying to push a 4K image even with upscaling.
What about 10GB on the RTX 3080 with ray tracing?
I've had no issues personally. Depending on how the 50 series looks, I might keep it for one more generation.
It's bad
My 2080 S runs raytracing fine with no issues. Cyberpunk ray tracing set to high and i get 40 fps mostly.
If you can't benchmark for 1-2 hours in each game to make sure your results are accurate, maybe include fewer games instead? Also, AMD VRAM usage and Nvidia VRAM usage are often not the same; a game that uses more than 12GB on AMD might use less than that on an Nvidia card, but I'm sure you already know that. Still, your video is titled as if AMD VRAM usage represents all GPUs.
Could also look at 3080 Ti (12gb) and 3090 (24gb). They are practically the same card.
Have you considered writing a PyTorch program which allocates a tensor of, say, 4GB to artificially restrict the VRAM available to the games? Would it even work? If it does, it would serve better than finding different variants of a single GPU.
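A minimal sketch of that idea, assuming the game and the script share the same GPU. It only roughly simulates a smaller card, since the Windows driver can still page the blocked VRAM out to system memory under pressure:

import argparse
import torch

# Pin a dummy buffer in VRAM so less of it is left for whatever runs next.
parser = argparse.ArgumentParser(description="Reserve VRAM to mimic a smaller GPU")
parser.add_argument("--gb", type=float, default=4.0, help="amount of VRAM to occupy")
args = parser.parse_args()

n_bytes = int(args.gb * 1024**3)
# One uint8 element per byte, written to so the pages are actually committed.
blocker = torch.ones(n_bytes, dtype=torch.uint8, device="cuda")
torch.cuda.synchronize()

free, total = torch.cuda.mem_get_info()
print(f"Holding {args.gb:.1f} GB; {free / 1024**3:.1f} of {total / 1024**3:.1f} GB still free.")
input("VRAM reserved - start the game, then press Enter here to release it...")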
I had VRAM issues on my 12GB 4070 Ti when using path tracing in Cyberpunk at DLSS Quality 1440p. I eventually upgraded to a 4090, and my takeaway is that 12GB is not an issue until path tracing and high-end ray tracing start to come into play, which is not something that matters much with AMD right now.
I remember upgrading from GTS250 to GTX1050Ti and being so happy that I can now play gta 5.
Now I'm struggling with modern games on my RTX2060
6800xt :) I run 1440p paired with a 5800x3d I’ll look forward to rdna 5
Same combo, its honestly faultless. I haven't had a single issue with framepacing, frametimes or framerates in any AAA title.
Having watched my 3070 suffer artificial, premature aging due to its lousy 8GB of VRAM makes me opt for a 16GB card, even if it's not needed yet.
I have the RTX 4070 Super, and considering that where I live computer parts are insanely expensive (my pure-performance build came to $2000, which is outrageous for a 4070S), I will have to make it last for at least a decade. With that said, I've settled on a 1080p monitor, hoping it'll be able to play all games at 1080p max at a smooth 60 FPS for the upcoming decade.
One game that uses an absurd amount of VRAM is Escape From Tarkov. I have a 12GB 3080 and I have had to turn DLSS to Performance when I have textures on High. When I turn textures to Medium, I can keep DLSS on Quality and still guarantee the VRAM usage stays under 12GB.
I ran Ghost of Tsushima at Very High preset (motion blur off/chromatic aberation off/depth of field off) on 3080Ti 12GB @5120x1440 (not 4k but nearly) and it was in the 80s-110s for the entire game. Worked fine.
I think at this tier of GPU 12GB is plenty, but with a 4070/Super, or especially a 4070 Ti, things like path tracing and 4K become more viable in general, and the cost of the cards is high enough that you're probably aiming to keep it for a few years and don't want to be lowering settings due to VRAM constraints.
I'd be very hesitant to get a 4070 ti (non-super), especially, for that reason. And if the rumors of the 50-series are true, we should expect the next gen of mid-tier cards for Nvidia to have the same VRAM limits, while presumably having more power in general.
In my opinion, memory issues in games come down to devs not properly managing memory, because it's hard and time-consuming.
They have 4090s in their dev PCs, so if it runs fine there they call it a day.
Nvidia is aware of this too: it limits VRAM on the mid-to-low end to drive players to spend more via planned obsolescence.
I'm quite happy with my 12 GB 6700xt playing on 1440, but to be fair, I tend to play games that aren't too demanding, either. I'm confident the card will serve me very nicely for several more years at least.
Nvidia used to have better texture compression algorithms than Radeon, potentially reducing vram usage. Not sure if that is still true.
Still true, the 4090 often used about 500MB to 1GB less than the 7900XTX in the same games at the same resolutions and settings.
@@camdustin9164 trying Arc rn and it's 1-2GB worse than Radeon. Like Far Cry 5 w/ HD textures took ~10GB on Radeon but takes ~11.5GB on Arc.
I'm running Ghost of Tsushima with a 5800x3d+4070 on High 21:9 1440p with DLSS on Quality and Frame Gen enabled getting 135 fps - 140 fps. 12GB of Vram still isn't an issue at all right now.
Cause it's a console game from 4 years ago. Congrats
The issue here is people using presets that are not meant for these cards. The hardware requirements for this game list the medium preset as the recommended one; very high is for top-end cards running the game at 4K.
I’d love to see an Nvidia RTX 2000 vs AMD Radeon Rx7000 series video comparing Ray Tracing to see how well Radeons RT compares to early Nvidia iterations 💪😇👍
Bro, the 7800XT RT is better than the 4070
And is a cheaper card
@@PanchoMngkn😂
@@PanchoMngkn wrong
@@PanchoMngkncompletely wrong
It’s mostly games that were designed for the Xbox Series X and PS5 that need more VRAM. Object density and textures were heavily optimized on consoles for a little over 8GB of shared memory. Hardware Unboxed did a good video on Harry Potter and Resident Evil needing more VRAM.
I do wish you’d used an Nvidia GPU for this test, because their drivers both allocate and use less VRAM. But still a good test, about as expected. Another thing to note is that if you’re struggling with VRAM and have a secondary monitor, try running your side monitor off the motherboard. That usually cuts off about half a GB for me.
The 12GB card only performs better at the VRAM bottleneck when the game just doesn't load the textures properly, so it has less to process.
nowadays you always have to check the visual result, 1% lows, runtime dependency, etc.
Low- to mid-tier GPUs can't keep up with technical dev demands anymore.
RDNA 3's RT advantage is not due to drivers; it's hardware tweaks to improve BVH traversal and instructions for discarding data, WTH lol
11:46 The 6800 could have been a bit cheaper with only 12GB, as that was plenty for the time.
I'd love to see these same tests at 2160p as well
The fact that my 4070 Super can run Cyberpunk 2077 at 1440p at the highest settings with modded 4K textures and a modded RT implementation (which is even heavier than the game's original RT), with FG and DLSS on, and the VRAM allocation never goes over 11GB, is enough evidence for me that 12GB of VRAM is enough for the next few years at 1440p.
Fps and stability can take a hit before VRAM is 100% full.
My 3070 Ti never went above 7.2GB playing Hogwarts Legacy, for example, but the performance stability was very bad and turning textures down gave me a very significant boost.
You know what's even better than having 12GB of Vram?...16 😂
I mean, if you're hitting almost your max on a current game, when do you expect to replace that card? I would expect it to still be capable of that for the next 4.5-5 years if I bought a refresh card in current year. Until 7000 series basically. And that's if you only game on your PC and don't run any local AI.
@@Augusto9588 I play the game for 1-2 hours at a time and never have any kind of issues other than my skill issue.
Use games that run only on current gen consoles. That’s the real test. Not cross gen games
I love it when you just searched for this yesterday, and now a detailed video explains it, yay
i have a superultrawide 1080p monitor and might upgrade to 1440p superultrawide, would the RX 6800 be better? It's slightly cheaper than the 7700XT in my country (for extra clarification, I will be using it as an eGPU via oculink with either an 8840U or strix point/lunar lake)
If you're running 8GB of VRAM even on powerful GPUs like the 4060/4060 Ti/3060 Ti/3070, or even 10GB on the 3080, reducing graphics settings to High rather than Ultra would certainly make this VRAM issue solvable; the additional particles on Ultra compared to High always give diminishing returns relative to the performance cost, no matter the card.
Those who like to mod games (especially texture packs) will find the 12GB buffer can be exceeded much more easily. My previous GPU was the 1080 Ti. I didn't like the idea of upgrading to a card with only 1 extra GB of VRAM, so I picked up a 6800 XT. The only Nvidia card I would consider buying now is the 4070 Ti Super, but it's still a bit too expensive for my taste.
My 6700xt rocks. Raytracing taxes the cpu too much for the benefits that I barely notice anyway, so thus far I have zero issues or games I can't play. Thanks for making all these comparison vids, very useful info, I appreciate all you do fellow Daniel.
Playing Cyberpunk 2077 at 1440p with everything maxed out, RT Psycho & Path Tracing on, with a 4070 at a constant 60 FPS, using less than 200W and 11.5GB of VRAM.
Why leave out the RTX 3080 12GB? I thought you even had one? I see the 3080 12GB is close to the 7700 XT and RX 6800 in most games.
Will you do this for 4k too? 12 v 16
answer is obvious my dude.
@@Rihardololz Because he already did the 4070 Super vs 3090 video which is essentially that
I run my games at 4k on a 12gb card with no issues 🤷♂️
Because that's the current threshold for the 9th console generation.
PS6 and the next XBOX generation will change that, but we are 4 years away from the start
@@ehenningsen No, the new Xbox launches about 2 years from now, because they gave up against the PS5 and want to start fresh with a banger console.
I have a 6700 in my gaming rig, and a 5700 XT in my secondary attached to the 4K TV. I can play nearly everything on the secondary just fine. The 6700 just rocks without flinching.
Which settings?
Last year I was using a 4070 Ti, and no, 12GB of VRAM at 2K/4K ultra settings doesn't have enough juice to run AAA games, UE5 games, or PS5 console ports. Games like Forspoken, Immortals of Aveum and Hogwarts Legacy were very laggy and it destroyed the experience.
After switching to a 4080 all the VRAM issues were solved. All of the games above already push VRAM to around 13GB to 15GB at 2K-4K ultra max settings.
When I got my 4070 at launch, I never bothered stressing about whether 12GB of VRAM was enough for 1080p or 1440p. Consoles have 16GB unified so probably no more than 12GB dedicated VRAM. For me, not only will it be when the PS6 comes out that 12GB might not be enough, but when they're releasing games for PS6 only, kind of like early PS5 games worked on PS4 and PS5, but now it's PS5 only. So it's gonna be a few years off, and when it does happen my 4070 won't have enough horsepower anyway.
The thing with consoles is that devs tend to not crank the settings to the max on consoles so of course they wouldn't use more than 12GB.
Yeah, basically 12GB will hold... but be ready to not play at the highest settings in 3 years or so. You're gonna be turning down some RT and some textures way before PS6 games are a thing. Basically the settings that are exclusive to PC releases.
The thing about consoles you have to remember is that part of that “unified memory” is still reserved as system RAM. And as someone else said, the consoles don’t even run aggressive textures, since they still prioritize FPS over fidelity.
My 4090 rarely exceeds 12GB for 4k, which is why I went ahead with a 4080 laptop instead of a 4090 laptop (a sale at half the price). The 4080 laptop is a 4070 desktop equivalent.
I don't expect problems until after the start of the next console generation and understand that I will have to utilize DLSS tech for the few most demanding games.
@@Iridiumcosmos Wha? Other way around. Consoles historically prioritise graphics over framerate
I'm currently playing Rise of the Tomb Raider, a 2015 game, and at 1440p DLSS Quality with everything maxed I'm going beyond 8GB of VRAM and getting massive stutters on my 3070 Ti. This is a 9-year-old game; if 8GB isn't even enough for that, why are we still seeing so many 8GB and 12GB cards in 2024?
23:45 - this. You should aim for 16GB even though PC has VRAM separate from system RAM.
We simply run higher settings and resolutions, and sometimes on bad ports. We need the headroom.
3070 user here with a 4k monitor. I can get away with 1440p and med/high textures upscaled to 4k on top of frame gen and I usually have enough VRAM for that. Otherwise I'll try to run 4k DLSS Performance + Lossless Scaling as well.
12GB is borderline for 4k but just fine for 1080p or 1440p. I think by time it's a problem at the lower resolutions it may be time to upgrade anyway.
Well, using this logic, I don't think the upgrade will be forced by the VRAM but by the pure performance required to run the games that WILL REQUIRE more than 12GB of VRAM. In that case you are upgrading not because of the VRAM but because of the chip technology and what is required to get the FPS, or both.
4K is certainly doable in a 7700 XT/6800 in many games, especially when using upscaling. Would have been useful and interesting to have 4K comparisons as well
If you're using up-scaling you're rendering at 1440p or less anyway. There might be slightly more vram usage over native but not much.
Before watching the video I already know this is mostly going to be about over the top settings in AAA games that I'm never going to play. Most people don't buy the kinds of cards that will run these games on the highest settings anyway. So I think since my 6 GB card is just now starting to struggle a bit in terms of playing modern games I would say 8 GB is fine for most people right now and 12 is probably going to be what I aim for in my next card and probably what I would recommend people start buying if they are buying new. Anyone who tries to say we 'need' more than that has unrealistic beliefs of what and how most people play.
Hi Daniel, this video was very useful. As I am transitioning to a 4K setup, it would help if you made an "is 12GB enough for 4K?" video.
It's not enough
Heya. Don't know if this really helps you, but I have been playing Ghost of Tsushima on a Vega 64 (8GB) on high at 4K just fine! But that's with FSR Quality, capped at 60fps. (The GPU isn't even sweating, running below 100% most of the time if you cap the frame rate at 60; Tsushima in particular is super well optimized.)
Generally speaking, the really outdated Vega 64 has been running everything just fine at 4K with medium to high graphics (30-60fps)! So you should be fine until your GPU upgrade if you are willing to turn your settings down a bit. I personally don't really see the difference between very high and ultra, and the performance gains you get are totally worth tuning it down a notch.
Also (in my case) it helps that I don't even have the option to ray trace xD
@@lupintheiii3055 Oh I see
@@blubbel222 Thanks, it does help. I use the 3060 12GB model which is similar in performance to the Vega 64. At least now I know Ghost of Tsushima won't be a problem at 4K. I reckon there are many games that don't cross 8GB even at 4K, so hopefully it won't be a big issue.
Bought a 4070 Ti Super for the VRAM. Right now those extra 4GB aren't really needed, but playing AAA games at 1440p it is borderline.
Sometimes VRAM doesn't affect performance, but sometimes it does. It depends on the PC specs, the games you play, the monitor you use, and the applications or software you run. For overall use 8GB is fine, and 10GB or more is much better for future-proofing. Same as building a PC: either AM4 or LGA 1700 works, but for future-proofing get AM5.
Ratchet and Clank Rift Apart is the only game I have spilled over VRAM on so far at max settings 1440p. 4070 12GB
Ever do RX-6650XT vs RX-6700 vs RX7600 ?
I'm curious about HDR, not that it's related to this at all, but how many of you use HDR in your games? And if you do, do you use the Windows HDR or RTX HDR from the new Nvidia app? I guess most people just use SDR for the simplicity and because that's what we've always had before.
I wonder if GPUs are designed to only use a specific percentage of the available memory, keeping a chunk free for spikes. If that were the case, you would almost never see fully utilized VRAM. Also, I would like to know what a GPU does as it approaches full VRAM. What specific adjustments to visual quality does it make? It's possible that it first adjusts color, sharpening, etc., rather than FPS.
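Not an answer to the driver-behavior question, but a minimal sketch of one way to watch it happen, assuming an Nvidia card and the nvidia-ml-py (pynvml) package installed: log used vs. total VRAM once a second while a game fills the buffer and see whether usage plateaus short of full capacity.

```python
import time
import pynvml  # provided by the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gib = mem.used / 1024**3
        total_gib = mem.total / 1024**3
        print(f"{time.strftime('%H:%M:%S')}  {used_gib:5.2f} / {total_gib:5.2f} GiB in use")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

This only shows device-wide allocation, not what the game engine or driver decides to evict or downgrade, so it hints at the "headroom" behavior rather than proving it.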
In the end, does anyone think I will run into difficulties in the future with my 7700 XT if I play at 1080p?
You could underclock the 6800's memory and overclock the 7700 xt's memory a bit to match the speed for an even better comparison
I really like this comparison, but it would be nice to see the GPU usage % in the top left as well, so at least sometimes we can rule out the potential reason of the games not being optimized (low utilization).
Thanks for this Daniel! Would it be possible to add a GPU percentage utilization stat to your overlay? I noticed the 6800 power/wattage dropped for most of the ray tracing tests vs the 7700 XT which makes me think the GPU was not fully utilized during ray tracing. Probably drivers as you mentioned but the % would give us a bit more info on this
I was playing Avatar today with everything on Max at 1440p, and it was eating up a little over 13gbs. It looks beautiful though.
Question: is there anything inherently wrong with VRAM being the limiting factor of a GPU? I understand it's a manufactured limitation, but would it be better to be limited in other ways, like power, clocks, CUDA cores, or bus width?
Wouldn't a 4070 Ti and a 4070 Ti Super have worked well too?
I think so, but the difference between them is slightly larger.
Just downclock 4070ti super to match performance and test vram hungry games.
It would actually be a more accurate comparison using two Ada GPUs.
@@KrisDee1981 There are lots of VRAM-hungry games with DLSS 3, so with DLSS 3 frame gen we could probably see more than 12 gigs of VRAM usage.
@@AlienGurke Another problem is the memory bandwidth difference: the 4070 Ti Super has 672GB/s vs 504.2GB/s (33% higher).
@@KrisDee1981 The 6800 also has more bandwidth due to its 256-bit bus vs the 192-bit bus on the 7700 XT.
At 21% the gap is a bit smaller, but that should already have made some difference if it mattered.
Thanks man for the vid!
would love to see the RT reflections only test on other cards at 4k
Would be interesting to see the RTX 4070 Ti at 1440p, because it will often be used with max settings and frame gen, which adds more VRAM usage.
It is very simple: 12 gigs of VRAM is enough for all or most games at native 1440p high settings, with or without some level of ray tracing, and you should still get a bare minimum of 60 fps and up to 100+ fps depending on the game.
16GB needs to be on the next 5070. The 4070 Super should have been 16GB. The extra memory is for advanced settings like DLSS scaling or RT (both at the same time).
I don't understand why people say we need 16GB because the PS5 has 16GB; it's split between the CPU and GPU, and I see modern games use at least 10GB of CPU RAM, so...
I'm still using the RTX 2080ti that I bought used 2 years ago for only $300. 11GB of Vram is actually perfect for this performance class, and 2560x1440 or 3440x1440 resolutions.
It's perfect for almost all 4k titles too, just use DLSS for the few that have issues
Well, I was using a 42" LGC2 4K OLED and 2080ti was struggling in newer games like Alan Wake 2 even without RT, and with DLSS performance. It wasn't because of Vram, but just not enough GPU power.
6 months ago I changed to 34" 3440x1440 and it's a much better combo.
@@KrisDee1981 Alan Wake is a special case as it is one of a few games that started using mesh shaders. NVIDIA cards didn't really have a good utilizing of hardware for mesh until the 4000 series
@@ehenningsen Looking at the benchmarks, Turing performs as expected in AW2.
1440p max, no rt
2080ti 40fps
Rx6800 is only 2% faster
4070 is 30% faster
I know for sure Pascal GPUs are struggling big time in AW2, even 1080ti, but Turing is still going strong in rasterization and light RT.
What gpu would you pick at these prices ( 25% tax included ) ? 4070 = 770€, and 4060 ti 16gb = 570€ ? I'm going to use NVIDIA features.
I've done my share of research, and it all points to this: for 1080p gaming 12GB of VRAM is perfect; you only need 16GB for 1440p.
But then the next question is whether the VRAM sits on a 128-bit or 256-bit bus, and also how many CUDA cores the card has.
For a GPU to be optimal, it needs fast VRAM and enough CUDA cores to make the most of its 12GB of VRAM.
I was very interested in the ASUS Radeon RX 7600 XT Dual OC 16GB, but it turns out that this GPU doesn't really have enough "power"
to really use all those 16GB of VRAM, because that much VRAM mostly comes into play at 1440p.
So, in the end, I believe the real question is: for the budget and mid-range tiers, who are the kings of the 12GB GPUs?
A bird whispered to me that the RTX 3060 12GB could be a winner here, for pure 1080p gaming.
What's your opinion on this?
The 10GB 3080 vs 12GB 3080 comparison is crying out to be done... can you borrow one, Daniel, just to push this comparison?
It seems that VRAM is not the only thing to consider, nor upscaling / ray tracing; it seems we could be underestimating the optimisation factor. Therefore, it becomes more and more difficult to buy hardware these days. That was a great video! Thanks.
Good choice of GPUs to use for the comparison. My opinion is that the 4070 TI variants are too powerful to make a comparison for these more commonly used resolutions.
Thanks for the info I was really stressing about the only viable upgrades in my price point being 12 GB
The RTX 3080ti and 3090. Both have the exact same GPU die, memory bandwidth, and near identical Cuda Core count... Same manufacturer, same hardware generation. The RTX 3080ti is essentially a 3090 with half the VRAM.