How about 11GB? Well, let me tell you. I have an RTX 2080 Ti and it's a perfect Nvidia midrange GPU. Bought used 2 years ago for only $300, before the VRAM drama. A used 3070 was much more expensive.
The 3080 was ridiculously expensive almost until it was discontinued due to scalpers. And by that time the 4070 was released which was the best new GPU option for that performance tier (similar performance with more VRAM and dlss3 for a lower price).
I wonder if GPUs are designed to only use a specific percentage of the available memory, keeping a chunk free for spikes. If that were the case, you would almost never see fully utilized VRAM. Also, I would like to know what a GPU does as it approaches full VRAM. What specific adjustments to visual quality does it make? It is possible that it first adjusts color, sharpening, etc., rather than FPS.
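There is in fact an OS-level mechanism along these lines: on Windows, the display driver model gives each process a video-memory "budget" somewhat below physical VRAM, which is one reason overlays rarely show 100% usage. As a rough way to watch headroom yourself, here is a hedged sketch using the nvidia-ml-py (pynvml) bindings; device index 0 is an assumption, and the helper still works if the library isn't installed.

```python
# Sketch: check how much VRAM the OS/driver currently leaves unused.
# Requires the nvidia-ml-py package (pynvml); device index 0 assumed.
def headroom_pct(total: int, used: int) -> float:
    """Fraction of VRAM not currently in use, as a percentage."""
    return 100.0 * (total - used) / total

try:
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free
    print(f"free: {headroom_pct(info.total, info.used):.1f}% of "
          f"{info.total / 1024**3:.1f} GiB")
    pynvml.nvmlShutdown()
except Exception:
    pass  # no NVML / no NVIDIA GPU; only the helper above is usable
```

Note this reports what the driver sees as in use, which is allocation, not necessarily what the game actively touches per frame.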
I really like this comparison, but it would be nice to see the GPU usage % in the top left as well, so at least sometimes we can rule out games not being optimized (low utilization) as the potential reason.
I'm curious about HDR. Not that it's related to this at all, but how many of you guys use HDR in your games? And if you do, do you use Windows HDR or RTX HDR from the new Nvidia app? I guess most people just use SDR for the simplicity and because that's what we've always had before.
Thanks for this video Daniel! It looks like 12GB of VRAM is good for now, unless a game is unoptimized and unfinished, or you want to completely max out graphics settings.
Will it be enough for extreme settings? For example, the Cyberpunk 2077 ultra-realistic reshading mod at 1080p? (To clarify, it's a mod that pushes graphics far, far beyond the normal Ultra settings.)
It seems that VRAM is not the only thing to consider, nor upscaling / ray tracing; we may be underestimating the optimization factor. Therefore, it becomes more and more difficult to buy hardware these days. That was a great video! Thanks.
@@legendp2011he should also test with an operating system that is out of date. Perhaps had a virus. He could do a test while having 1000 tabs open. Yes, all valuable pieces of information
Not the same. Both Nvidia Pascal and AMD Polaris have equal gaming features, whereas the RTX 4070 gets better ray tracing capability over the RX 7800 XT. As a gaming card, sure, the RX 580 is better today. However, for HTPC use, the GTX 1060 is better because it has a hardware VP9 decoder, which YouTube playback requires. The RX 580 doesn't; only the RX 5000 series and newer can decode VP9.
@@fleurdewin7958 Better ray tracing in some games. It's mostly Cyberpunk that heavily favors Nvidia. In other ray tracing titles, AMD RDNA3 performance is on par with or sometimes even better than Nvidia.
I have a 7900XTX and just bought Avatar. I play at 4k with FSR and frame gen. The benchmark is pretty choppy with highs in the low 100s and lows as low as 25 yet actual gameplay is highs of 300 and lows of around 100. Allocated VRAM is 22gb!
Maybe the 4070 Ti and the 4070 Ti Super could close the gap to compare the VRAM difference more accurately, but these two are still a good example. Nice comparison.
It's less than 5% in most cases, so it wouldn't make a big difference, and it would be a better representation, setting aside generational improvements and architecture.
I just ended up getting a 7900 GRE; 16GB seems like a safe spot and it was much cheaper than a 4070 Super. Sure, no Nvidia feature set, but it is what it is; so far I've been enjoying it! Honestly, with this test I feel like it's difficult to compare 12GB vs 16GB because there aren't any games that actually use over 16GB of VRAM (unless RT). Once those become more mainstream due to consoles, then we'll see a performance difference, just like how 8GB VRAM cards are becoming irrelevant.
8GB cards started to struggle when new gen consoles were released. Same will happen to 12GB cards, once the next gen consoles launch in 3-4 years. Until then there won't be any major changes in VRAM usage. We are experiencing the same game behavior for ~3 years - 8GB has become insufficient for max (ultra) settings even at 1080p, but still enough for high settings. 12GB is enough for 1440p/max and 16GB is enough for 4K/max.
I would say you should wait at this point. Your current card isn't exactly a slouch, so you can game just fine. If you were running Pascal, and you couldn't realistically play some titles you wanted to, it might be more understandable to buy now. Why did you wait until this late into 40 series to consider buying, and what changed?
I have a 3070 and I'm waiting. I don't see the point in buying something that will soon be superseded *and* should drop in price when the next gen is out.
I ran Ghost of Tsushima at the Very High preset (motion blur off / chromatic aberration off / depth of field off) on a 3080 Ti 12GB @5120x1440 (not 4K but nearly) and it was in the 80s-110s for the entire game. Worked fine.
Good choice of GPUs to use for the comparison. My opinion is that the 4070 TI variants are too powerful to make a comparison for these more commonly used resolutions.
Keep in mind this is just one-time testing. If you do actual gameplay or longer gaming sessions, it's going to go nuts, especially in games where the memory leak issues are horrible, if you know what I mean. 🤷🏻♀️
One other thing to consider is that recent poorly optimized games either used more VRAM when they first launched, or managed VRAM poorly and ran out in longer sessions. This usually gets fixed in a patch, but it's still a hassle.
I have an RTX 3080 10GB and I'm playing at 1440p. I haven't ever had it run out of VRAM with what I play, but I'm sure there's probably a game out there I could get it to overflow with.
As a 6800 user who plays less demanding non-AAA games at 4K, it's also interesting to me that the 6800 appears to beat the 7700 XT in efficiency, drawing 30-50 watts less. Unless you require AV1 encoding, the 6800 seems like the sleeper GPU for 2024...
The wattage reports differently between rdna 2 and 3. Pretty sure the 6800 is only showing GPU power but the 7700xt is showing total board. I think in reality there isn't a meaningful difference in efficiency.
I have a 6800 but can't take advantage of it because i had to spend all my money on a 4060 laptop for school, instead of upgrading the 5 year old i5 that's paired with the 6800. It's painful to watch these videos and see how good the 6800 is, because i literally have it but i also kind of don't.
Daniel is correct. 6800 doesn't show the total board power usage. I think it's another 40-50 watts added onto the 6800 to get the total board wattage, so they basically have the same efficiency.
I have an 8GB RTX 3070, and in Horizon Forbidden West I had to change textures to High instead of Very High just because I didn't have enough memory, and my FPS dropped from 90-110 to as low as 36 in cutscenes... It's fair to say that 8GB is definitely not enough in 2024 for 1440p.
I do wish you’d used an Nvidia GPU for this test, because their drivers both allocate and use less VRAM. But still a good test, about as expected. Another thing to note is that if you’re struggling with VRAM and have a secondary monitor, try running your side monitor off the motherboard. That usually cuts off about half a GB for me.
one game that uses an absurd amount of vram is Escape From Tarkov. I have a 3080 12gb and I have had to turn DLSS to performance when I have textures on high. When I turn textures to medium, I still have DLSS on quality to guarantee the VRAM usage stays under 12gb.
If you can't benchmark for 1-2 hours in each game to make sure your results are accurate, maybe include fewer games instead? Also, AMD VRAM usage and Nvidia VRAM usage are often not the same; a game that uses more than 12GB on AMD might stay below that on an Nvidia card, but I'm sure you already know that. Your video is titled as if AMD VRAM usage represents all GPUs, though.
23:45 - this. You should aim for 16GB even though the PC has VRAM separate from system RAM. We simply run higher settings and resolutions, and sometimes bad ports. We need the headroom.
It's mostly games that were designed for the Xbox Series X and PS5 that need more VRAM. Object density and textures were heavily optimized on consoles for a little over 8GB of shared memory. Hardware Unboxed did a good video on Harry Potter and Resident Evil needing more VRAM.
3070 user here with a 4k monitor. I can get away with 1440p and med/high textures upscaled to 4k on top of frame gen and I usually have enough VRAM for that. Otherwise I'll try to run 4k DLSS Performance + Lossless Scaling as well.
Have to say though, your 6800 has some really low clockspeeds. My 6800XT red dragon goes to 2440 out of the box, 2700 if you press it. I think you have it in the "quiet" mode, which is TDP limited. Cyberpunk equally very low powerdraw of 160W.
I'm quite happy with my 12 GB 6700xt playing on 1440, but to be fair, I tend to play games that aren't too demanding, either. I'm confident the card will serve me very nicely for several more years at least.
I had VRAM issues on my 12GB 4070 Ti when using path tracing in Cyberpunk at DLSS Quality 1440p. I eventually upgraded to a 4090, and my takeaway is that 12GB is not an issue until path tracing and high-end ray tracing start to come into play, which is not something that matters much with AMD right now.
Great video Daniel! Is a 7800 XT too much for 1080p max settings then? Is the 7700 XT enough for at least 5 years at that resolution? I want to buy one of those, but I'm pretty concerned about how it will perform in the future just for saving some bucks.
In my opinion, memory issues in games come down to devs not properly managing memory because it's hard and time-consuming. They have 4090s in their dev PCs, so if it runs fine there, they call it a day. Also, Nvidia is aware of this. Nvidia limits VRAM on the mid to low end to drive players to spend more via planned obsolescence.
I have the RTX 4070 Super, and considering how insanely expensive computer parts are where I am, my pure performance build came to $2000, which is outrageous just for a 4070S. I will have to make it last for at least a decade. With that said, I've settled on a 1080p monitor, hoping it'll be able to play all games at 1080p max at a smooth 60FPS for the upcoming decade.
I went with the 7700 XT Steel Legend instead of the XFX 6800 because I have a white build. 🤣 That said when I saw reviews the 6800 generally beat out the 7700 XT but I figured driver development would hopefully keep me going in the long run.
It really depends on how long you're using your GPU - all the 12GB discussion reminds me of the Nvidia 3000 generation and cards like the otherwise great 3070 (Ti) that are now crippled by VRAM alone. I tend to skip 2 generations, sometimes 3, and in the past I was lucky that I got the card with more VRAM, because the competition became obsolete much faster: when I got my 8GB RX 580, the competitors at the same if not usually higher price were the 1060 3GB and 6GB, which wouldn't have lasted half as long, and the competitor to my 6800 was the 3070, which has VRAM problems now and is already due for an upgrade.

You can turn down graphics, sure, but if that means I have to reduce textures (which is still the main thing for enjoying nice graphics but also takes up the most VRAM) I get cranky, and given how much I enjoy texture mods for e.g. Bethesda games, I want my headroom there; effects or RT are much less of a concern.

BTW, did you compare FSR with XeSS in Hellblade? If you're just in need of a small bump to get over that 60 like me, Intel XeSS could be the option for you; it's not as effective as FSR Quality but looks nicer and crisper. Something maybe to test and recommend when available, so an AMD recommendation doesn't always depend on FSR.
Consider that even if a card with less video memory shows no performance problems, that does not mean there are no quality problems, for example with textures.
Is 10 million dollars enough for a house
I can't believe I have to specify but this is a joke
Not in the west.
@BattleToads Depends.
🤓replies
Market says 8 million wasn't enough so ofc we discussing 12
@@Sevastousmight be able to get away with 10mil if you're willing to upscale
Also, desktop resolution matters: a 4K desktop uses about 800MB of VRAM, while 1080p uses 300MB or less, and this is included in the allocated value. Multiple monitors also add up.
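The gap between those figures and a single framebuffer is worth spelling out: one uncompressed 32-bit 4K buffer is only about 32 MiB, so totals like 800MB come from the compositor keeping several buffers plus per-window surfaces. A quick sketch of the per-buffer math (4 bytes per pixel is an assumption for standard 8-bit RGBA; HDR formats use more):

```python
# Rough framebuffer math for desktop VRAM use. Bytes per pixel is an
# illustrative assumption (8-bit RGBA); HDR surfaces are larger.
BYTES_PER_PIXEL = 4

def framebuffer_mb(width: int, height: int) -> float:
    """Size of one uncompressed 32-bit framebuffer in MiB."""
    return width * height * BYTES_PER_PIXEL / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    # The compositor multi-buffers and keeps per-window surfaces, so
    # real desktop usage is many such buffers, not one.
    print(f"{name}: {framebuffer_mb(w, h):.0f} MiB per buffer")
```

This also shows why each extra monitor adds a noticeable chunk: every display gets its own set of buffers.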
Did not consider that. So with my double 1440p setup I will be noticing slightly higher vram than if I only used one. Wow that seems like such an obvious thing but I never considered it.
@@mooseonshrooms yeah I only started considering it when using Nvidia DSR 1080p to 4K+DLSS for those borderless fullscreen only games where you need desktop at 4K and saw 300MB VRAM jump to 800MB!!!
Also, the reason why 12 GB is enough for 99% of games at 1440p is because consoles allocate 12 GB. So until next gen consoles allocate 16 GB, almost all games will be designed for 12 GB. In time 16 GB and then 24 GB VRAM will become the norm as more applications leverage AI and future ray tracing.
I believe it does not matter, if you are using in-game exclusive fullscreen mode. But it would be interesting to make such a comparison anyway.
@@alargecorgi2199 The thing about console games is that they don't run on Windows or rely on DirectX; optimization is better on consoles because there is less driver overhead. So if consoles allocate 12GB of VRAM, that doesn't mean it's applicable to PC. Also, consoles often make use of upscaling, while many of us PC gamers prefer to play at native resolution, not using FSR or DLSS.
Curious to see the 4070 Ti vs 4070 Ti Super at 4K settings... similar difference to the 7700 XT / 6800.
4070 never meant for 4k.. 2k only. If you got that 4k monitor, then upgrade to 4090
Thanks for this Daniel! Would it be possible to add a GPU percentage utilization stat to your overlay? I noticed the 6800 power/wattage dropped for most of the ray tracing tests vs the 7700 XT which makes me think the GPU was not fully utilized during ray tracing. Probably drivers as you mentioned but the % would give us a bit more info on this
I have a super-ultrawide 1080p monitor and might upgrade to a 1440p super-ultrawide; would the RX 6800 be better? It's slightly cheaper than the 7700 XT in my country. (For extra clarification, I will be using it as an eGPU via OCuLink with either an 8840U or Strix Point/Lunar Lake.)
man i love ur channel
keep up the good work ✌
+
You were talking really well about this card (6800)last autumn, It went on sale with starfield and I picked it up. Still going strong and can play just about anything I want on high at 1440p. Really happy with this purchase.
Thanks!! gr8 video.
I crack up every time you move yourself to point things out 😂😂I'm not even into gaming that much anymore but these videos are fun to watch
If it's about fun to watch, my old streams were fun to watch; I didn't think about visual quality and just played games as usual.
I'm pretty sure they'll automatically be added to the studio; I'm too lazy to contact support and ask for a revival.
Thanks for the continued analysis.
Question - I have a widescreen 3440x1440 monitor, so when choosing a video card (matching the budget), do I still look for something that is better value for standard 1440p gaming, or will it need to go higher, toward 4K gaming?
This was useful. Thanks
6750 XT user. 12 GB has been enough. Never been an issue.
That card right there was the best value of last gen. Saw some good sales on it when it was still new.
6700xt here, killer.
@@Mattribute 6800 better deal honestly
Ye, there are plenty of other things besides that which have an impact on performance; VRAM is rarely actually the issue.
@@Mattribute 6800 or 6800 XT is a bit better in my opinion as it's more future proof
So when some of the newest AAA games are already right at the edge of 12GB, we can logically expect VRAM usage to increase above 12GB in the next years. That puts the current graphics card market in a really weird position, since Nvidia charges $800 for the first real 16GB option (I don't count the 4060 Ti). AMD starts with the 7800 XT, which is slightly under $500. That's A LOT OF MONEY on a graphics card alone for the normal PC gamer.
It won't use more until the next console generation. That's roughly 4 years away, by which point you would already be upgrading anyway.
@@jayceneal5273 how do you know that? whats your source?
@@svenyproud they arent going to make the minimum vram usage more than what consoles can use lol
@@jayceneal5273 That's not true, tons of games are made by AA studios or indie studios and they care less about consoles and these studios are the ones that are making great games this year. And just because consoles are limited doesn't mean the same game won't use more on PC.
@@No-One.321 if any indie game or AA game does then it's not very optimized and that's rarely the case anyhow, usually they are made to be run on potatoes. You're just fear mongering for something that isn't going to be a problem
Nice comparison, floating mini Daniel strikes again.
Also, take a shot each time Daniel says "uh" 😵💫
I got the rx6800 standard version one, and I'm happy with it. Does a great job
I love it when you've just searched for something yesterday, and now a detailed video explains it, yay!
Thanks for the info I was really stressing about the only viable upgrades in my price point being 12 GB
Hey Daniel, is it possible that the 3080 with its 320-bit bus could be better than the 4070 with just a 192-bit bus in VRAM-limited scenarios?
Yes, but only if you're gaming at 4K resolution in certain games with very high quality textures. However, it would be a 10fps difference at best because of the massive amount of cache the 4070 has over the 3080 (on Ada that large cache is L2; it's AMD's Infinity Cache that is L3). The 4070 Super has even more cache, and it performs about the same as a 3080 Ti at 4K despite its 192-bit bus, because the cache compensates for the narrower bus.
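The raw bandwidth gap behind that question is simple arithmetic: bandwidth = (bus width / 8) × per-pin data rate. Using the published GDDR6X speeds for these two cards (19 Gbps on the 3080 10GB, 21 Gbps on the 4070), a quick sketch:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# Per-pin rates are the published GDDR6X speeds for these cards.
def bandwidth_gbps(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x rate."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbps(320, 19))  # RTX 3080 10GB: 760.0 GB/s
print(bandwidth_gbps(192, 21))  # RTX 4070: 504.0 GB/s
```

So the 3080 has roughly 50% more raw bandwidth; the large on-die cache on newer cards reduces how often that raw number matters, which fits the "small difference at best" point above.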
More memory bandwidth helps in higher resolutions, but when you run out of Vram it doesn't matter.
Newer will almost always beat similar older cards. The 50 series might mess that up though.
would love to see the RT reflections only test on other cards at 4k
I’d love to see an Nvidia RTX 2000 vs AMD Radeon Rx7000 series video comparing Ray Tracing to see how well Radeons RT compares to early Nvidia iterations 💪😇👍
Bro, the 7800XT RT is better than the 4070
And is a cheaper card
Thanks man for the vid!
Hey, what should I choose between the RTX 4060 Ti 16GB and the RTX 4070 12GB?
Have you considered writing a PyTorch program which allocates a tensor of, say, 4GB to artificially restrict the VRAM available to games? Would it even work? If it does, this could serve better than finding different variants of a single GPU.
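The idea can work in principle: a "ballast" tensor held alive by a background process does shrink the VRAM other applications see. A hedged sketch (requires PyTorch with CUDA; the size and device index are illustrative, and note that on Windows the driver can still spill game allocations into shared system memory, so this doesn't perfectly emulate a physically smaller card):

```python
# Sketch: hold a fixed "ballast" allocation so less VRAM is visible
# to other applications. Size and device index are illustrative.
def ballast_bytes(gb: float) -> int:
    """Convert a gigabyte figure to a byte count for the ballast."""
    return int(gb * 1024**3)

try:
    import torch

    def reserve_vram(gb: float, device: str = "cuda:0"):
        # One uint8 element per byte; keep the returned tensor alive
        # (e.g. in a long-running script) for as long as you test.
        return torch.empty(ballast_bytes(gb), dtype=torch.uint8,
                           device=device)
except ImportError:
    torch = None  # PyTorch not installed; only the size helper works.
```

One caveat: PyTorch's caching allocator reserves the memory as soon as `torch.empty` runs, so the ballast counts as allocated even though nothing is ever written to it.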
I want to know how a 12 GB card works on a 4K display. I’m still working with a 3070 hooked up to my desk monitors and my LG OLED tv. It’s extremely limited at 4K and I wanna know if it’s worth getting a 12 GB card or waiting for the next generation.
I do 4k gaming with a 12gb 3080. No issues at all, so long as I don't turn on ray tracing. Some of the more recent games I'm having to drop to High instead of Ultimate settings.
I use my 6700XT in my living room on my budget 4K TV and it usually requires that some form of upscaling be used. So far I haven't found a game that simply refused to work on it, although there's discussion about the importance of a card's memory bandwidth when trying to push a 4K image even with upscaling.
Damn, was hoping my 3080ti is the 12gb in the video but 7700xt is fine, good video.
One thing about 15:00:
It seems like the system RAM usage with the RX 7700 XT is way higher, 16.7GB compared to 15.1GB with the RX 6800. Not sure if there were maybe other reasons for it, but it's something I find interesting.
It seems like the same happens at 1440p, and also in Starfield 17:44, while that doesn't even come close to using 10GB of VRAM.
Hmmm..interesting..🤔
It varies. At 3:14 the 7700XT system is using less ram than the 6800 - 14.4 vs 15.4. In Alan Wake2 sometimes the 7700xt is higher and sometimes lower.
@@lawrencekallal6640 oh yeah there it is like that. Would be interesting to know why it is like this
I always had stutters in HL: Alyx with the RX 6600 XT (8GB, the only reasonable card during the GPU apocalypse). Now that I have an RX 6800 with 16GB I no longer have those issues, and the Radeon software reports 14.5GB usage in that game (when playing at 5400x2700), so I guess it was VRAM-related all along (or me not understanding that 8GB wasn't enough for "high" textures). Today I just love the security of having plenty of VRAM. Throughout the ages, VRAM is the single thing that has increased the longevity of GPUs, and I am happy to have some proper headroom.
What game overlay for stats do you use?
MSI Afterburner
You should have tested TLOU as well! That game actually makes GPUs perform worse when it tries to use more VRAM than the GPU has.
Played it a few weeks ago. Worked like a charm.
that was before the patches
Well duh any game is gonna run worse with more vram needed than allocated
That's with any game. If any game uses more VRAM than a card has, the game will start using system memory which is a lot slower than GPU memory.
It would not be a good test, since it always allocates more VRAM than it needs. I saw someone do a system requirements test on that game when it had just come out, and it tried to allocate 14GB of VRAM when it didn't even use half of that; the setting was 1080p high at native resolution.
What good is more ram 🐏 if the gpu can't properly utilize it with the games and software?
That's a good point (for older GPUs) but most of the anger with the Nvidia 40 series was powerful GPUs being held back by vRAM limitations (mostly in the laptop 4070 lineup but also in the desktop 4060 lineup).
@TechManMax Even if they release high-GB cards, I would argue the bus width held the 40 series back. 128-bit and 256-bit buses seem like garbage nowadays; 320-bit and higher should be the way to go.
@@bumperxx1 8GB cards are unacceptably low in 2023-2024; multiple YouTube videos are proof of this. Gimped bus sizes are also bad, though; the PS5 has a 256-bit bus. I think the 4070 Ti Super is OK, but the 192-bit bus on the regular 4070 is just garbage. As for the 4060 series, they are gimped in every possible way while charging more than a console that has double the bus width and double the VRAM, although prices have luckily gone down a bit for the 4060 now... And I do think the 4080/4080 Super should have a minimum of a 320-bit bus (256-bit is OK for the 4070 Ti Super). The 4080 Super has good VRAM and good performance, but who knows when it might end up limited by bandwidth.
Is the RX 6800 consuming 30W less power than the 7700 XT?
My 6700xt rocks. Raytracing taxes the cpu too much for the benefits that I barely notice anyway, so thus far I have zero issues or games I can't play. Thanks for making all these comparison vids, very useful info, I appreciate all you do fellow Daniel.
Isn't the 4070 Super 12GB and the 4070 Ti Super 16GB?
Hi Daniel, this video was very useful. As I am transitioning to a 4K setup, it would help if you made an "is 12GB enough for 4K?" video.
It's not enough
Heya. Don't know if that really helps you, but I have been playing ghost of tsushima on a vega 64 (8gb) on high in 4k just fine! But it's with fsr quality, capped at 60fps. (The GPU isn't even sweating, running below 100% most of the time if you capped the frame rate at 60, tsushima in particular is super well optimized)
Generally speaking, the really outdated Vega 64 has been running everything just fine in 4k with medium to high graphics (30-60fps)! So you should be fine until your GPU upgrade if you are willing to turn down your settings a bit. I personally don't really see the difference between very high and ultra and the performance gains you get are totally worth tuning it down a notch.
Also (in my case) it helps that I don't even have the option to ray trace xD
@@lupintheiii3055 Oh I see
@@blubbel222 Thanks, it does help. I use the 3060 12GB model, which is similar in performance to the Vega 64. At least now I know Ghost of Tsushima won't be a problem in 4K. I reckon there are many games that don't cross 8GB even in 4K, so hopefully it won't be a big issue.
I know you said you don't have one on hand, but I would really like to see this kind of testing on the 10GB 3080. The first game where I saw problems was the RE4 demo, crashing with RT on at ultrawide 3440x1440. Don't know about the full game, but clearly 10GB was not enough VRAM for such a powerful GPU.
OK, it's starting to get annoying: how about 10GB? A LOT of people have the 3080, and it was/is a very good deal at certain times. And yet it gets completely overlooked again and again. I just wonder how much it is limited against the 12GB version or the 4070 (Super).
A lot. The 3080 is completely insane: the 1080 Ti had 11GB, and many years later you get 10GB on the 3080, which is absolutely insane. Also add the prices people paid for it...
The 3080 is gonna struggle in the very near future; 10GB isn't gonna be enough for 1440p, especially with newer games today
@@Definedd already struggles
How about 11Gb?
Well, let me tell you. I have an RTX 2080 Ti and it's a perfect Nvidia midrange GPU. Bought used 2 years ago for only $300, before the VRAM drama. A used 3070 was much more expensive.
The 3080 was ridiculously expensive almost until it was discontinued due to scalpers. And by that time the 4070 was released which was the best new GPU option for that performance tier (similar performance with more VRAM and dlss3 for a lower price).
I have a 6800 and play AAA games at 1440p, and it smashes them all. Never had a single issue, love mine
But that's 16GB, not 12, right?
@@fabrb26 yeah
Love my 6800, bought it used but the coil whine is a bit annoying
@@jamieshephard2432 Pfftt, weak card, can't even use ray tracing, trash
@@JazanMuyanto yeah, OK, if you say so lmfao. Ray tracing ain't all that anyway, and even if I wanted to use RT my GPU would handle it
I wonder if GPUs are designed to only use a specific percentage of the available memory, keeping a chunk generally available for spikes. If that were the case, you would almost never see fully utilized RAM. Also, I would like to know what a GPU does as it approaches full RAM. What specific adjustment to visual quality does it make? It is possible that it first adjusts color, sharpening, etc., rather than FPS.
I really like this comparison, but it would be nice to see the GPU usage % in the top left as well, so at least sometimes we can rule out games not being optimized (low utilization) as the potential reason
I'm curious about HDR. Not that it's related to this at all, but how many of you use HDR in your games? And if you do, do you use Windows HDR or RTX HDR from the new Nvidia app? I guess most people just use SDR for simplicity and because that's what we've always had before
What about 10 GB on the rtx 3080 with ray tracing.
I've had no issues personally. Depending on how the 50 series looks, I might keep it for one more generation.
It's bad
Why leave out the RTX 3080 12GB? I thought you even had one? I see the 3080 12GB is close to the 7700 XT and RX 6800 in most games.
Ever do RX-6650XT vs RX-6700 vs RX7600 ?
Feels bad even watching the title with my 4gb laptop card lol
I have my OG 3080 10GB Ram and it's all working perfectly fine in 1440p and even in 4k in many games.
Thanks for this video Daniel! It looks like 12GB of VRAM is good for now, unless if a game is unoptimized and unfinished. Or if you want to completely max out graphics settings.
Nvidia said 8GB will be enough for 1080p for years to come, but for 1440p I'd say grab anything with 16
12GB is a very good amount for 98% of games at 1440p.
Exactly what I was thinking. It might not matter now, but it's good if you don't want to upgrade for a while.
Will it be enough for extreme settings? For example, Cyberpunk 2077's ultra-realistic reshading mod at 1080p? (To clarify, it's a mod that pushes the graphics far, far beyond the normal Ultra settings.)
@@mukamuka0 nope
@@mukamuka0 no
12 is very good at Full HD
It seems that VRAM is not the only thing to consider, nor upscaling / ray tracing; it seems we could be underestimating the optimization factor. Therefore it becomes more and more difficult to buy hardware these days. That was a great video! Thanks.
Something to consider is games that progressively use more VRAM during longer play sessions, but there is no way for Daniel to test that effectively.
That's called a memory leak; it's not supposed to happen.
@@MaxIronsThird It's not "supposed to happen", but it does happen in some games
@@legendp2011 So you're saying I need a card with more VRAM because of a bug, and that's normal? I don't get it.
@@MaxIronsThird what did you not get?
@@legendp2011 He should also test with an operating system that is out of date. Perhaps has a virus. He could do a test while having 1000 tabs open. Yes, all valuable pieces of information
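The "VRAM creeps up over a long session" pattern discussed in this thread can at least be flagged from periodic usage samples. A minimal heuristic sketch; the sample values below are hypothetical, and in practice you would collect them with a tool like nvidia-smi or a monitoring overlay:

```python
# Heuristic leak check over periodic VRAM-usage samples (in MB).
# A leak shows as usage that keeps rising and never settles back down;
# the 500 MB growth threshold is an arbitrary assumption for illustration.

def looks_like_leak(samples_mb, min_growth_mb=500):
    """True if usage rose at every sample and grew past the threshold."""
    if len(samples_mb) < 3:
        return False  # too few samples to call it a trend
    total_growth = samples_mb[-1] - samples_mb[0]
    always_rising = all(b >= a for a, b in zip(samples_mb, samples_mb[1:]))
    return always_rising and total_growth >= min_growth_mb

# Stable session: usage hovers around the same value -> not a leak
print(looks_like_leak([7100, 7200, 7150, 7180]))   # False
# Creeping session: usage climbs every sample -> suspicious
print(looks_like_leak([7100, 7400, 7900, 8600]))   # True
```

A real check would also want to ignore expected jumps, such as loading into a new area, so treat this purely as a sketch of the idea.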
Could also look at 3080 Ti (12gb) and 3090 (24gb). They are practically the same card.
I was playing Avatar today with everything on Max at 1440p, and it was eating up a little over 13gbs. It looks beautiful though.
The RTX 4070 and RX 7800 XT are basically the new GTX 1060 6GB and RX 580 8GB. Look how that turned out.
+++
Yeah. The rx 580 aged far more gracefully.
Not the same. Both Nvidia Pascal and AMD Polaris have equal gaming features; the RTX 4070 just gets better ray tracing capability over the RX 7800 XT. As a gaming card, sure, the RX 580 is better today. However, for HTPC use the GTX 1060 is better because it has a hardware VP9 decoder, which YouTube playback requires. The RX 580 doesn't; only the RX 5000 series and newer can decode VP9.
@@fleurdewin7958better raytracing in some games. It's mostly Cyberpunk that heavily favors Nvidia. In other raytracing titles AMD RDNA3 performance is on par or sometimes even better than Nvidia.
@@pcmark-nl Alan Wake? Ratchet and Clank? Robocop? Avatar Frontiers of Pandora?
I know you tested new games like Hellblade and Avatar, but you also tested Cyberpunk, Horizon and Ghost of Tsushima, which are 4 years old now and are PS4 games
Will you do this for 4k too? 12 v 16
answer is obvious my dude.
@@Rihardololz Because he already did the 4070 Super vs 3090 video which is essentially that
I'd love to see these same tests at 2160p as well
I have a 7900XTX and just bought Avatar. I play at 4k with FSR and frame gen. The benchmark is pretty choppy with highs in the low 100s and lows as low as 25 yet actual gameplay is highs of 300 and lows of around 100. Allocated VRAM is 22gb!
Maybe the 4070 Ti and the 4070 Ti Super could close the gap to compare the VRAM more accurately, but these are still a good example. Nice comparison
My exact thoughts.
The 4070 Ti Super is quite a bit faster than the regular Ti; the comparison wouldn't work.
It's less than 5% in most cases, wouldn't make a big difference, and it would be a better representation, setting aside generational improvements and architecture
@@user-hj2um5sz3h 10% you mean
what about 4k and 12GB?
I just ended up getting a 7900 GRE. 16GB seems like a safe spot, and it was relatively much cheaper than a 4070S. Sure, no Nvidia feature set, but it is what it is. So far I've been enjoying it!
Honestly, with this test I feel like it's difficult to compare 12GB vs 16GB, because there aren't any games that actually use over 16GB of VRAM (unless RT). Once those become more mainstream due to consoles, we'll see a performance difference, just like how 8GB VRAM cards are becoming irrelevant.
8GB cards started to struggle when new gen consoles were released. Same will happen to 12GB cards, once the next gen consoles launch in 3-4 years. Until then there won't be any major changes in VRAM usage. We are experiencing the same game behavior for ~3 years - 8GB has become insufficient for max (ultra) settings even at 1080p, but still enough for high settings. 12GB is enough for 1440p/max and 16GB is enough for 4K/max.
Can someone tell me if I should upgrade my RTX 3070 to a 4070 Ti Super, or should I wait and see what's coming with RTX 50?
definitely should wait at this point
I have a 3080 and I am waiting for the 5080 or 5090
I would say you should wait at this point. Your current card isn't exactly a slouch, so you can game just fine. If you were running Pascal, and you couldn't realistically play some titles you wanted to, it might be more understandable to buy now. Why did you wait until this late into 40 series to consider buying, and what changed?
I have a 3070 and I'm waiting. I don't see the point in buying something that will soon be superseded *and* should drop in price when the next gen is out.
Thanks for all the advice, very helpful
I ran Ghost of Tsushima at Very High preset (motion blur off/chromatic aberation off/depth of field off) on 3080Ti 12GB @5120x1440 (not 4k but nearly) and it was in the 80s-110s for the entire game. Worked fine.
Good choice of GPUs to use for the comparison. My opinion is that the 4070 TI variants are too powerful to make a comparison for these more commonly used resolutions.
Ratchet and Clank should have been in this testing as well; Ratchet is a VRAM eater lol
11GB+ at 1080p
Would be interesting to see the RTX 4070 Ti at 1440p, because it will often be used with max settings and frame gen, which adds more VRAM usage
6800 XT :) I run 1440p paired with a 5800X3D. I'll look forward to RDNA 5
Same combo, it's honestly faultless. I haven't had a single issue with framepacing, frametimes or framerates in any AAA title.
Keep in mind this is just one-time testing. If you do actual gameplay or longer gaming sessions, it goes nuts, especially in games where the memory leak issues are horrible, if you know what I mean. 🤷🏻♀️
One other thing to consider is that recent poorly optimized games either used more VRAM when they first launched or managed VRAM poorly and ran out in longer sessions.
This usually gets fixed in a patch, but it's still a hassle.
45C on the 7700XT... frick, is that card liquid cooled???
I have an RTX 3080 10GB and I'm playing at 1440p. I haven't ever had it run out of VRAM with what I play, but I'm sure there's probably a game out there where I could get it to overflow
As a 6800 user who plays less demanding non-AAA games at 4K, it's also interesting to me that the 6800 appears to beat the 7700 XT in performance per watt by 30-50 watts. Unless you require AV1 encoding, the 6800 seems like the sleeper GPU for 2024...
The wattage reports differently between rdna 2 and 3. Pretty sure the 6800 is only showing GPU power but the 7700xt is showing total board. I think in reality there isn't a meaningful difference in efficiency.
Nah, the RDNA 2 GPUs don't show the full wattage that the RDNA 3 and Nvidia GPUs do.
I have a 6800 but can't take advantage of it because i had to spend all my money on a 4060 laptop for school, instead of upgrading the 5 year old i5 that's paired with the 6800. It's painful to watch these videos and see how good the 6800 is, because i literally have it but i also kind of don't.
Daniel is correct. 6800 doesn't show the total board power usage. I think it's another 40-50 watts added onto the 6800 to get the total board wattage, so they basically have the same efficiency.
@@ninjafrozr8809 For what school do you need a 4060 laptop?
It may matter if you use 4K upscaling and frame generation.
Can anybody using an RX 6700 XT tell me how the frame rates are?
Pls tell me I'll get a couple more years of 1440p with my 3080 10gb. Bought it 5 months ago for 350 and I've been loving it
You should be fine if you don't go crazy with RT and don't use FSR frame gen.
I have an 8GB RTX 3070, and in Horizon Forbidden West I had to change textures to High instead of Very High just because I didn't have enough memory and my FPS dropped from 90-110 to as low as 36 in cutscenes... It's fair to say that 8GB is definitely not enough in 2024 for 1440p.
Buddy, I just completed Ghost of Tsushima on my laptop's GTX 1650 4GB card, averaging around 40-50fps at medium settings with FSR on Quality 😂
Diablo 4 used a lot of VRAM. @1440p UW it's between 15 and 16gb on my 6950xt
I do wish you’d used an Nvidia GPU for this test, because their drivers both allocate and use less VRAM. But still a good test, about as expected. Another thing to note is that if you’re struggling with VRAM and have a secondary monitor, try running your side monitor off the motherboard. That usually cuts off about half a GB for me.
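On the "second monitor eats VRAM" point above: the raw framebuffer cost per display is easy to estimate, though the real-world figures people report are much higher because the compositor also caches window surfaces. A rough lower-bound sketch (buffer count and bytes-per-pixel are assumptions):

```python
# Lower-bound VRAM cost of driving a display: width * height * bytes-per-pixel,
# times however many buffers the compositor keeps (assumed triple-buffered
# 32-bit color here). Cached window surfaces push real usage well above this.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Raw framebuffer memory for one display, in MiB."""
    return width * height * bytes_per_pixel * buffers / 1024**2

print(f"1080p: ~{framebuffer_mb(1920, 1080):.0f} MB")  # ~24 MB raw
print(f"4K:    ~{framebuffer_mb(3840, 2160):.0f} MB")  # ~95 MB raw
```

This is why moving a side monitor to the motherboard's iGPU output frees VRAM: both the buffers and the compositor's cached surfaces for that display land in system RAM instead.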
Bought a 4070 Ti Super for the VRAM. Right now those extra 4GB aren't really needed, but playing AAA games at 1440p, it is borderline
one game that uses an absurd amount of vram is Escape From Tarkov. I have a 3080 12gb and I have had to turn DLSS to performance when I have textures on high. When I turn textures to medium, I still have DLSS on quality to guarantee the VRAM usage stays under 12gb.
If you can't benchmark for 1-2 hours on each game to make sure your results are accurate, maybe include fewer games instead? Also, AMD VRAM usage and Nvidia VRAM usage are often not the same: a game that uses more than 12GB on an AMD card might stay below that on an Nvidia card, but I'm sure you already know that. Your video is titled as if AMD VRAM usage represents all GPUs, though.
23:45 - this. You should aim for 16GB even though PC has VRAM separate from system RAM.
We simply run higher settings and resolutions, and sometimes bad ports. We need the headroom.
It's mostly games that were designed for the Xbox Series X and PS5 that need more VRAM. Object density and textures were heavily optimized on consoles for a little over 8GB of shared memory. Hardware Unboxed did a good video on Harry Potter and Resident Evil needing more VRAM.
3070 user here with a 4k monitor. I can get away with 1440p and med/high textures upscaled to 4k on top of frame gen and I usually have enough VRAM for that. Otherwise I'll try to run 4k DLSS Performance + Lossless Scaling as well.
Have to say though, your 6800 has some really low clockspeeds. My 6800XT red dragon goes to 2440 out of the box, 2700 if you press it.
I think you have it in "quiet" mode, which is TDP limited. That would also explain the equally low power draw of 160W in Cyberpunk.
I'm quite happy with my 12 GB 6700xt playing on 1440, but to be fair, I tend to play games that aren't too demanding, either. I'm confident the card will serve me very nicely for several more years at least.
It'll be a long ahh time before I get my hands on a 4K display. A shame, really
The 10GB 3080 vs 12GB 3080 comparison is crying out... can you borrow one, Daniel, just to push this comparison?
my 7900gre and 5800x are just handling runescape which is the only game i play at the moment, it really is a struggle
I had VRAM issues on my 12GB 4070 Ti when using path tracing in Cyberpunk at DLSS Quality 1440p. I eventually upgraded to a 4090, and my takeaway is that 12GB is not an issue until path tracing and high-end ray tracing come into play, which is not something that matters much with AMD right now
Hi Owen, when testing VRAM you should choose VRAM-heavy games:
The Last of Us Part 1, Star Wars Jedi, Hogwarts, Avatar, etc.
Why test on poorly optimized games? Those should not be a deciding factor for people
RDNA 3's RT advantage is not due to drivers; it's hardware tweaks to improve BVH handling and instructions for discarding data. WTH lol
Great video Daniel! Is a 7800 XT too much for 1080p max settings, then? Is the 7700 XT enough for at least 5 years at that resolution? I want to buy one of those, but I'm pretty concerned about how it will perform in the future if I save some bucks now.
7700XT is fine for 1080p
In my opinion, memory issues in games come down to devs not properly managing memory, because it's hard and time consuming.
They have 4090s in their dev PCs, so if it runs fine there, they call it a day.
Also, Nvidia is aware of this. Nvidia limits VRAM on the mid to low end to drive players to spend more via planned obsolescence.
I have the RTX 4070 Super, and considering that where I am computer parts are insanely expensive (my pure-performance build came to $2000, which is outrageous for just a 4070S), I will have to make it last at least a decade. With that said, I've settled on a 1080p monitor, hoping it'll be able to play all games at 1080p max at a smooth 60FPS for the upcoming decade.
I went with the 7700 XT Steel Legend instead of the XFX 6800 because I have a white build. 🤣 That said when I saw reviews the 6800 generally beat out the 7700 XT but I figured driver development would hopefully keep me going in the long run.
It really depends on how long you're using your GPU - all the 12GB discussion reminds me of the NV 3000 generation and cards like the otherwise great 3070(ti) that are now crippled by VRAM only. I tend to skip 2 generations, sometimes 3, and in the past I was lucky that I got the card with more VRAM because the competition became obsolete much faster - when I got my 8GB RX580, the competitors at the same if not usually higher price were the 1060 3GB and 6GB which wouldn't have lasted half that long, the competitor to my 6800 was the 3070 which has VRAM problems now and is already due for an upgrade. You can turn down graphics, sure, but if that means I have to reduce textures (which is still the main thing for enjoying nice graphics but also takes up most VRAM) I'm getting cranky, and given how much I enjoy texture mods for e.g. Bethesda games I want my headroom there - effects or RT are much less of a concern.
BTW, did you compare FSR with XeSS in Hellblade? If you're just in need of a small bump to get over that 60 like me, Intel XeSS could be the option for you; it's not as effective as FSR Quality but looks nicer and crisper. Something maybe to test and recommend when available, so an AMD recommendation doesn't always depend on FSR.
Should try RoboCop; I've seen that hit 14GB
Consider the fact that even if there are no performance problems on cards with less video memory, this doesn't mean there are no quality problems, for example with textures.
Ratchet and Clank Rift Apart is the only game I have spilled over VRAM on so far at max settings 1440p. 4070 12GB