The Avatar devs had a presentation discussing how they use a dynamic texture streaming method to prevent performance issues with low VRAM cards, this allows even 4GB cards to work. DF has a video on it called " Avatar: Frontiers of Pandora Tech Deep Dive - GDC Reaction Special" at 15:00 min in, they explain it
yeah who cares, just dont buy it, then they will eventually stop relasing it. Nobody HAS to buy a 8gb 5060 or 5060ti. Ignore those products. If people always just cry about it but then still buy it, Nvidia wont change anything. I personally dont care, I will buy a 5090, and even if that one would stay at 24gb I wouldnt care bc its way more than enough. (it wont stay there but likely be 28gb or 32gb; more likely 28gb tho). If 5060 is out, people can buy a 4070 Super if they want more than 8gb VRAM and if they cant afford a higher class GPU of the 50 series. Or they can buy an AMD card if they want to do that.
And likely a 6gb version at some point like the 3050 6gb lol. They’re probably going to release 12gb 5070 and also slap 16gb onto a 5060ti in clamshell and charge extra $100
@@Herr_AffeIg it depends on availability in the end. Where I'm from I had to choose between the 7800xt and the 4070 super. The 4070 super was around $100 more expensive. I went with the 7800xt because I didn't think 8% more performance with less vram was worth $100. If they the same price, it would've been a tougher pick. But welp, Nvidia tax is an actual thing.
@@Herr_AffeIntel is actually really good and heckin' cheap, got one as an "inbetween" because my 3070 broke and honestly... Mostly everything just works and performance is good enough :D especially for like 150€ 💀
I see a lot of comments saying that the video just proves that 8 GB is for 1080p. Let's put one thing clear from the video: the 4060 TI chip can deliver more than what 8 GB is able to handle, and it's a fair assumption that someone tries to run on 4K using DLSS on performance with frame-generation. The world gaming community is not only US, there are countries where a 4090 costs the same price of a car, so buying a 60 card is usually the affordable way to go (and at least where I live, it is still more than 2K). The time for giving a little more of VRAM is beyond past now, 8 GB for a 60 class GPU should be unacceptable.
@@xerxeslv All honesty, gamers don't really have that much contribution when it comes to the overall sales. The reason why NVidia or AMD prioritizes buzzwords like crypto or AI now is because the industry level corpos buy them in bulk. Then again, we are still a demographic to consider since we still pump money for them. The reason why things seem to have stagnated is because people still buy them regardless of complaint. Voice has no say in sales, and until people stop buying these products, nothing will change. The fanboys simply propagate the mindset of buying without thinking, which leads to where we are.
Saw this coming with my 3070 and sold it. Got myself a 7800xt instead and couldnt be happier with my decision, Im gaming at 1440p without much limitations rn.
Well, yes, 24 is ridiculously overkill (except for ai or productivity applications). Of course, in 10 years that will change but that's how it's always worked. @brunoutechkaheeros1182
90% stutter and lag complains I read actually end up with just VRAM overflows. People are very uneducated when it comes to VRAM, and how it affects your games.
Don't forget to mention that other software in background is also using VRAM and its usage grows with newer software versions. I can use up to 3 GB even without any game: browser, steam, vs-code, image-editor, video-player on 2nd monitor, discord, tg, etc.
Apps use SRAM not VRAM, VRAM is used for only the monitor output and textures so only a 2nd monitor can make you use up VRAM hence 32gb of SRAM is recommended
@alargecorgi2199 artificial limitation. The fact that 16 GB variants can do fine on 4k shows that these gpus are unnecessarily getting limited to lower visual fidelity. Also the the bigger problem is not being able to match console quality textures and probably resolution too in future.
@alargecorgi2199 99% won't use 4k because they can't use 4k for the artificial limitation. None of these changes the fact that these GPUs are basically garbage compared to consoles. And for the price they're being sold, that's not a good sign. Maybe if it was sold for 100$, then maybe 8 GB would have been acceptable for a potato product.
The same way a lot of youtubers started copying Josh Strife Hayes' quip of holding a mug while talking and General Sam standing up with a big mic in his hand narrating things in the background. It's only a matter of time until it starts catching on, really.
🙂 the rx 6700 10gb is pretty decent, it should be enough for 1080 high, and it only uses a single 8pin plug so all u need is any 450w power supply, its a bit faster than a 5700XT
@@anitaremenarova6662 Oh boy you are wrong. The same comportament that he showed here with the 8GB card never going close to 8 GB in use is happening with 12 GB too. My 4090 has enough VRAM for games, and in most modern games I played used 13-15 GB.
I have 12g vram rn, I'll wait for 4 years before finally biting the dust and upgrading, one thing for sure, I'll get the top of the line shit next time whenever i upgrade : )
@@lupintheiii3055 i feel like at 1440p i will survive 12g, especially coming from 4gb vram up until two months ago, i wont mind lowering down the settings ykwim ? + my gpu is very capable (4070 super)
thats what i do. I currently have 8gb - a 1070. I wanted to buy a 4090 but I decided to just wait for the 5090. 24gb of the 4090 or the probably 28gb of the 5090 are more than enough for YEARS. I will play in 4k resolution but even then. Developers wont make games where you need more than 24gb VRAM anytime soon. Hell more than 16gb isnt something Devs will ask for anytime soon bc the majority of users just doesnt have that much. Let alone 24gb+, only a minority has so much VRAM. So I plan to keep my 5090 for a bit and maybe look at the 7090 and how good it is. When I can comfortably drive my upcoming 240hz 4k OLED monitor, Im extremely happy. A 5090 will be able to do it in some games with DLSS, but there are already UE5 games where this probably wont be possible in 4k, even with DLSS Performance. So if anything this will be the reason I like to upgrade to a 7090 or 8090, but certainly not bc I run out of VRAM. I firmly believe VRAM wont be even a single thought for even just a millisecond during the lifetime of my 5090.
1440p is going to be viable for that much time with 12 gb vram. Unless they make more unoptimised shit. Also I saw huge difference previously when I maxed out texture settings in games but these days high vs very high / ultra is barely noticeable.
@@DELTA9XTC The 90 class cards just don't seem worth it to me. Maybe I might get the 6090 because of my immaturity and poor sense of humor I don't know. But is a computer part in the range of $2000 worth it when you can get a whole very capable computer for the same price? If you want to upgrade from a 1070 to a 5090 then go for it it's not my money. But I just think there's more reasonable options.
The results seem like if you're buying a 4060 thinking it's a native 4k beast machine, you will have bad results with 8 or 16 gb because your card blows for native 4k no matter how much VRAM you give it. You could struggle at native 4k even on 16gb (Like how Ghosts barely hits 30fps on a game with PS4-tier visuals) or you could play at something realistic for bottom tier hardware and get 60 on 8gb or 16gb.
I dont think anyone should be buying 8gb card for 4k. 1440p is still fine on most games that are actually optimized with 8gb cards specially if you turn down solw settings
@@BlackJesus8463 This video literally proves that 8GB is enough for 1440p. When you're buying a 4060 series card you should not be expecting MAX SETTINGS EVERY GAME NO MATTER WHAT kind of performance. This is nothing new. 60 series has always been the card where you may need to lower some settings here and there.
Daniel, do this test again with only 16gb of system memory. When the 8gb card runs out of memory, it spills over to system memory and uses over 16gb. I'll assume people buying a 8gb 4060 wont be buying more than 16gb of system memory and therefore would probably cause even worse results.
Just lower texures form very high to high and problem solved. It also have almost no impact in image quality in most games It is not big deal as you make it
Like literally it’s that simple smh or just don’t use FG. I like Daniel’s content but damn this VRAM discussion will always be the same. And it’s just common sense to get more VRAM when you upgrade your gpu.
Infact it's actually worst than what he make it since textures have zero impact on performance, so have to lower that because you lack Vram is the dumbest thing ever
@@hircine92h 4060ti is 1080p gpu and u are crying u can see diff in 4k both gpus suck soft dick in this res no diff 8 or 30fps u wont play like this either way - and if game will use more than 8gb at 1080p 4060ti will have to slow die to work at this res either way as every feature cost performance not only vram - even framegen cost u real fps and u can easly see that with any program
You do not need to drop ALL game settings to fix such issues. Often reducing texture quality one step from Very High to High (or Medium) is already enough to make the game playable if the GPU itself is fast enough. Yes this is a downgrade in image quality, but if you are buying lower Midrange cards you should accept Medium Settings as your friend anyway.
That being said, medium in new games looks good most of the time. Its not like the early 2000s when shadows or meshes just went missing when turning the settings a bit down.
i feel like 8GB is less than a problem than some think it is, yes i think 4060 and 4060TI should have had at least 12GB but is 8GB really a problem ? if u buy a 4060TI u are NOT aiming for 4K ultra that is a fact, that is more of a "1440P high with DLSS Q" card, and for that 8GB prove to be just (barely) enough.
Its kinda okay, just the fact that games cant run smooth because of VRAM not GPU’s power sucks :(((( image having a decent GPU and it easily can handle ultra settings, but its limited by VRAM 🫠
@@blondegirl7240 nah, 28 fps is already a crappy frame rate for any game, it does not matter if 8gb sends you to 8 fpsinstead because you start from crappy performance anyway.
the point is that as the AAA games (games being unoptimized dramas aside) advance even 1080p can demand close to 8GB of vram already, with that kind of thinking everyone should be stuck with using just iphone 5 or Samsung S5 etc
My god I learn so much from you Daniel you're honestly excellent with your content to empower the ability to buy the best gpu performance for your money. Going in details about vram, RT, DLSS vs FSR, etc, all of it really helped me understand everything and be able to make the right the decision for what I want. I'm really not confident in how long it will take AMD to improve FSR, so I will be aiming for a good Nvidia priced card. But this info is so important for even choosing the correct Nvidia card! Thanks a lot.
In my opinion 60fps should be the mininum. 24-30fps in 4k high with 16gb vs 8gb. The 4060 Ti just doesnt have enough power to make use of that vram. 30fps isnt enjoyable and you are going to get much better experience if you just lower your settings little bit to get 60fps and by that point the difference between 8gb and 16gb is 5-15fps. and for the money you could get a card that has more power and you would end up getting more fps with that.
@jumbob That's what I was noticing too. The settings he used to create vram issues are so far beyond what those cards are suited for to begin with that it doesn't even really matter. The game is going to run like garbage at those settings either way, just marginally less so with more vram. Do people really think they're supposed to crank everything to the max on a midrange card? Higher settings don't necessarily even look better depending on what it is. Stuff like depth of field, motion blur etc I turn off anyway regardless of performance because I just don't like the way they look. A lot of stuff like that is a matter of personal taste. And when you dig a little deeper there are always settings that have a heavy performance impact for miniscule difference in visuals.
You wouldn't use some insane 4k config but the video literally shows it loses frames in Forbidden West at 1440p with DLSS quality. You're rendering a base 1080p image at that point and the FPS is still over 60 but it's less than it should be for that hardware. I don't think 1440p DLSS quality is some insane setting in a game that first came out in 2022 on freaking consoles.
I don't get it, if you are tight on budget and buying a 1080p capable gpu like 4060 or rx7600 , what's the point of running and comparing them on 4k, most of the people use 1080p either way if they have a budget build
@@nicholasxamotainiumgilgamesh not engine related, most "issues" are not related to the engine at all. it's just a catch all people who have zero idea of what they are talking about use to vent that their hardware sucks. anyway the main difference is texture size and how many textures the game has in use at any given point in time (what's needed to draw the current scene). however it isn't just "textures" either, some games utilize the GPU for heavy calculations and can flood it with large amounts of data which can gobble up vram as well. any engine, custom or not can utilize the gpu in this way. all 3d games use the first bit but only a handful use the second because it's more work to plan around it.
It ALLOCATES too much VRAM, but it does not really need that much. At native 4K/Maxed out/RT/High textures (8GB) it runs fine on the 12GB 4070 Ti. This card can barely offer locked 60 fps experience though, so I personally played this game with modded DLSS set to Quality. That way the game looks better than at native and provides 90+ fps. 12GB for 4K + RT for a game with decent quality textures is pretty reasonable, I'd say.
It's because UA-cam tech channels push people to buy more expensive cards with more VRAM, so game developers can release games that don't run well with less VRAM. Meanwhile, there are more people with less than 8 GB VRAM in the Steam hardware survey than there are people with more than 8 GB. I'm getting the feeling that channels like this one don't really reflect what typical gamers are buying and playing.
@@IcyTorment that's not the reason, games are made with specific hardware in mind. they don't care what tech channels are doing/saying lmao. these 8gb cards are not being bought, they are relics of the past that people are riding until they feel like they got max value out of them. how do i know this? i know it because i was one of those people. i was on a gtx 970 (3.5gb vram) until about half a year ago. now i have a 7900xt (20gb vram). that's a decade long gap. plenty of value, people spend more on fucking coffee it's crazy!
Had a 12Gb 3060 and I had no problem in terms of running some games at 4k with upscaling , at no point did the Vram become an issue , upgraded to 4070 super recently no problems so far either, it gets close to that 12Gb but 99% of games i played don't go over it. but it certainly is clear that at some point in the near future even 12Gb won't be cutting it anymore.
Eh, i have the same card and it still works out well enough for me on 1080p. Just manage your expectations and perhaps be like me and play older games that don't suck as much.
@@X_irtz bro a 3070ti isnt a 1080p card, GPU wise. I mean ppl use(d) 3060's for 1440p sometimes and that is a bit of a stretch imo but 3070ti is really a classic 1440p card. Or lets say it would be one, if it had more than 8gb VRAM. and thats exactly the problem. Performance wise its quite strong, its around an RX 6800 non XT. A 3080 is like 20% stronger and ppl used the 3080 as 4k card when it got released. A 4070 non super is only like 15% faster than a 3070ti according to techpowerups relative performance chart. thing is you cant even use the full 3070ti at higher resolutions bc of the VRAM limitation. And its not about all new games sucking, its that they often want more VRAM bc of new graphically intensive features and the 3070ti doesnt have that. Its what we can see here with Ghost of Tsushima at 4k Ultra.
@@X_irtz Why 1080p? I have a non-TI 3070 and play everything at 1440p fine. Both our cards should absolutely have more Vram but I dont understand why you are playing at 1080p.
These GPUs are targeted for 1080p, and VRAM management isn't even a big deal in game development. 8GB is enough for 1080p and will stay that way for quite a while.
That may be true, and Nvidia may want us to see the RTX 4060 Ti 8GB as a 1080p GPU... However the price says otherwise. There are no bad GPUs, only bad prices, and holy shit is the RTX 4060 Ti 8GB expensive for a 1080p GPU. For ~$400 price I'd hope to get decent 1440p performance without any kind of issues or guaranteed +60 fps at max settings in every game at 1080p, and the RTX 4060 Ti 8GB isn't that.
@diego_chang9580 It's true and I agree 100%, I can't even imagine why we don't have a GTX 4060 instead of a RTX 4060, ray tracing is too heavy for cards below a RTX 3080 in "RT" cores.
it doesn't matter what nvidia says they are targeted for 4060 is capable of very high settings and even 1440p with dlss, so why bottleneck it with low vram if it can do much more, and you can clearly see that from the video literally its bottlenecked even on 1080p with raytracing and fg, not because its weak but because of vram only,
@@Rasingard Probably something to do about the architecture and because it being an RTX card it helps it with the price. If it was a GTX 4060 people wouldn't even consider paying above $300, which is exactly why the card is overpriced. The RTX 4060 can do some RT, sure, and it can use Frame Gen... Most of the time. But honestly? The only reason I'd buy it right now is DLSS.
The 4060 Ti is in a very weird spot of being too weak to handle a lot of games at maxed 4k, while also having that be really the only area you actually begin to see a huge difference in VRAM performance. Yes, 20-30 frames at maxed 4k no DLSS is a lot higher than 5-8 frames, it's still essentially unplayable.
yep. in most scenarios whenever you have steady 60+FPS with 4060 12GB, you will also get 60+FPS on 8GB as well. so it's not such a big deal. 8GB is kinda doing the job. the problems start when you have 3070 ti with just 8GB, or 3080 with just 10GB. VRAM limits those much more.
This is why most people whining about lower vram weren't really completely justified. While 8 was a downgrade from what the 3060 offered, if you treat it as it's own thing there's not really much of a reason to justify that much more than 8, because it's only going to handle 1080p and light raytracing workloads anyway, what would you actually need more vram for? The 4060 ti definitely should have at minimum had 10 but the last minute doubling of vram offering to appease complainers made no sense because it's completely overkill at its performance level. It might be worth it for some ai or productivity workloads on a budget but other than that it's kind of silly
@@jayceneal5273 No, just no to everything you just said. You shouldn't even have this conversation in the first place, having 8GB on anything more expensive than $200 in 2024 is just miserable, the fact you justify that is the exact reason why that cards exist.
@@lupintheiii3055 miserable at over 1080p, sure. It's perfectly fine at that resolution level and lighter graphical settings which you should expect from a budget card. I have a 4060 laptop and have no problems when graphics settings are adjusted for its performance level. Why do you expect vram for full 4k path tracing levels on a card that is barely above console performance?
I would be interested to know how many games ran differently in 1080p which is the target resolution of the 4060 ti in the first place. All the other resolutions are for 4070 up.
This is the problem with 8GB of VRAM on a relatively more powerful GPU. The GPU has the ability to run at max quality or higher res or even now with FG, but you ended up needing to lower the quality or run at a lower res just not to hit that VRAM capacity bottleneck. Also on some games where the difference is only 5% to 10%, remember that this is just a snap shot, only a tiny part of the game. Normally as you play along, the VRAM usage will creep higher thus over time you see the performance on 8GB getting worse. One thing that a lot of people doesn't mention but I personally experiencing it, NOT ONLY GAMES USES THE VRAM! Owen uses 2 PC to capture the game, thus the game can basically have all the VRAM to itself. In reality, some people play while leaving their browser window open. Some uses 2 monitor, so while a game might be running on a single monitor at 1440p, just by having 2 monitor means the other monitor will also use a bit of VRAM and that little bit might just push that 8GB GPU from being playable to having stutters. Some people also like to stream thus might run OBS in the background, thus more VRAM usage. Most of the example I showed normally only increase the VRAM usage by a bit and might only affect the game if it is already borderline on VRAM usage, but if you want to be a vtuber, it might be a good idea to buy a 3060 12GB instead of 4060 8GB if you only have that amount of money for your GPU budget. Whether you use 2D avatar or 3D avatar, both will use a noticeable amount of VRAM, thus if you have 8GB VRAM, prepare for the avatar to ended up stuttering and even the game also stuttering or just accept that you only have 8GB VRAM thus play on low setting. This doesn't mean 8GB GPU is useless. Like Owen said in the video, basically you just need to be more careful. Something like 3070, 4060, 4060Ti 8GB are fast GPU that can play any modern games at good frame rate. It just that in some games with a certain settings a game might not perform right and it isn't because the GPU is slow but it is because there isn't enough VRAM. Lower the texture setting, play at lower resolution (and yes, lower resolution and not just lower the internal rendering resolution), use lower RT setting or turn it off, and usually some setting can use substantial amount of VRAM like higher shadow setting. Find the compromise that work best for you. I'm currently using 7800XT and my previous GPU was RX580 8GB. While I was on 580 I made a promise to myself that my next GPU will have at least 12GB of VRAM mainly because like I said much earlier, I had experienced running out of VRAM. The thing is that I was just playing Uncharted! I have a 4K TV as my monitor (thus bigger frame buffer size compared to 1440p or 1080p) and I also like to open browser and leave some app open in the background (not running, just open, usually work related app). Basically the game run like a slide show at a setting I knew it can handle (which is basically max setting). Need to set the texture to low to be able to run it. I did try to run the game with everything closed and yes, it ran normally. But if every time I play I need to close everything then it is a hassle that I prefer not to have.
Yeah background apps, especially ones that are "browser based" like web browsers themselves, discord etc can quickly start eating up vram unless you manually turn off hardware acceleration (but then they can be sluggish to use instead) - was very noticeable when gaming with a 2gb vram gpu lol - ideally the operating system should be able to reduce the impact of it but in practice it doesn't always to the best job at it
The problem is expection. Nobody should expect games to run smoothly at 4K very high on 8Gb vRAM....I see 1440p very high seems to run fine in Ghost - that's acceptable in my view. I'm not defending nVidia's practise here, but in 2024 an 8Gb card is a 4060(Ti) or 7600 tier card, which is around £339 here in the UK. 1440p very high on a £340 card seems okay to me?
@@cajampa refurbished ex-cryptomining units from China probably? They started flooding aliexpress when China tried to ban crypto on 2021. Those ex-miners 3060 Ti's are sold at my country at around 200 USD.
I think these are entry level cards and still have a place in the market. If you are not at maxed or ultra settings or being ambitious with resolution on a low end card I think a 8GB card is enough. I think $250-350 is where 8GB belongs
@@haukikannel yeah the die is too but supposedly they are cutting down its core count as well as keeping it gddr6. If I had to guess it would most likely be maybe 10% faster than a 4060, and either be between $280 (if Nvidia ever threw gamers a bone) and maybe $330 until 40 series dropped in supply or $299 and letting 40 series drop under msrp. 40 series is sitting around msrp still with about a year left until a 5060 release
price is key though, I got a 6700xt 12gb for a secondary build, it was $245 on eBay +free shipping. 8gb shouldn’t be more than that ever. Personally 3060 12gb or 6700 non xt 10gb are the lowest end cards I’d even recommend for entry.
@@puffyips comparing used market is a different ball game entirely but I agree with you in terms of price point to an extent. A 3050 was 8GB for about $250 msrp but a 4060(which should have been 4050) probably wouldn’t be able to use more than 8 regularly. They def should’ve kept it around $280 with the 4060ti 8GB around $329-350 & 16GB at $399. It would’ve been better received.
8 GB cards are for 1080p. At that resolution I can’t see any difference. (Who would play 4k with 4060 TI? I mean yeah you would try but would not play regularly. 4k is for 80 and 90 cards)
I have no idea why people buy 4060 and 4060Ti 8GB. If you want an Nvidia GPU and you can't afford a 4070, get a used 3060, 3060Ti, or 2080Ti. Spending more than around $200 for an 8GB card is stupid.
Why? If you have a 1080p monitor there's absolutely no need for a 4070+ gpu. Different story if you plan to play in 1440p or 4k, but 4060 can't handle those resolutions well anyway (it can 1440p with upscalers, but still)
@@xviii5780 exactly this is the issue!!!... People like you offering up excuses and defending a billion dollar company. You pay hundreds of $ for a planned obsolescence product, you literally buy it and self impose limitations in how to use it. I love how Nvidia was showing 1080p benchmarks a few months ago like it's the standard 🤣 when years before they were pushing 1440p and 4k. Now they realized same as Apple they can dry up their consumers by selling them junk.
The current "tipping point" for AAA games seems to be around 10~12 GB, so a 12 GB card will probably still be able to cache all textures / models, even at the the highest quality setting. But, in a couple of years, I suspect that won't be the case. So either get a card with 16 GB now, or be prepared to reduce texture / model quality by next year.
For anyone curious, Vram amount is mainly a concern when you increase your resolution. 1080p 6-8 is probably all you'll ever need, 1440p 8-12, you should probably stay away from 8GB though it'll work fine for most games, modern games will run better with 10+ 4k 12-16+
This is exactly my experience. It is fairly rare for 4k to exceed 12GB even, except for the most demanding games. I can downscale from 8k maxed settings most of the time without hitting 16GB, but occasionally, I can exceed it with the most demanding games. 4090 is my reference. NVIDIA does have a very good memory compression algorithm, so results can vary against AMD or Intel
*Bro thank you for saying it, I've been saying this for so long and people while quite literally still don't get it, it also depends on the settings too, you can max out most settings and turn VRAM heavy settings like shadows or reflection down if you're worried about VRAM. But it all comes down too buy the right card for the right resolution, not that hard.*
I am very pleased to choose Rx 6700xt instead of RTX 3070 because at that time the price of Rx 6700xt was much cheaper. It's been 2 years of use now and its performance is still very good with 12gb vram.
@stangamer1151 I have 32gb of fast ddr5 ram too I think the vram is swapping things to the ram I didn't face any stuttering issues with my 3070 in most of the games I always playing at 1440p and sometimes 4k if I can get good fps with dlss on Because with 4k there is no aliasing anymore in the games
@berkertaskiran what I meant When at 1440p there is noise and little shimmering in the character hair and edges even with the best anti aliasing solution that game offer But at 4k there is nothing the game is so detailed and sharp And at 5k it's so so clean and no need to activate anti aliasing too But at 5k my poor gpu can do many games Only not very intensive games
Note that most modern games don't have much a visual difference on 1440p and 1080p when changing textures from very high to high. If your not running a 4k monitor than you can sacrifice a little and get more performance for the same look as a larger vram size of the same card. Learn what you like in games (settings wise) and then decide what brand and gpu you actually want.
This is pretty much what I'm running into with a RTX 3080M 8GB laptop. It can boost all the way up to 140W so it has the performance but at 1440P in newer games, I have to drop the textures to high or even medium to avoid performance issues. Thankfully, once I do that, I can game at mostly high settings in both Ghost of Tsushima and Horizon Forbiden West with DLSS at Quality. My main rig has the 3080 10GB and it's similar but those extra few gb of VRam make the issue less pronounced. This is why Losssless Scaling has been a game changer for me as I just lock the games at 40fps and I'm able to get 120fps with their 3X FG.
@@BlackJesus8463 Yeah. The point is you can connect it with a cable or two to your monitor (which probably has a USB hub for keyboard/mouse/other accessories) and unplug it and take it wherever you go. Also, that way one has to replace only one device (gaming laptop) rather than two (a Macbook/XPS 13 like ultrabook + gaming PC) every 3-4 years and always has all their data in one place. Not everybody's trying to get 240+ FPS on a maxed out gaming PC because they have godlike aim and have to extract every single FPS on a high refresh rate monitor to be a pro, or a mediocre streaming career, or want to show off on Reddit/YT. Why do people act retarded when it comes to setups based on gaming laptops?
Funny how Avatar, runs just fine on 8 gb GPU, with no massive visual quality downgrade vs 16g GPU. At GDC they explained how they integrated techniques to manage memory efficiently (something akin to sampler feedback). 8 gb should be just fine for resolutions
The market will still dictate on how devs will optimize their games, steam just conducted a hardware survey last month for developers and 8gb vram are still the largest chunk of the consumer base. Once people moved on and 12gb is the standard, we will have games that will peg 12gb and videos saying how 12gb vram is doomed from the start.
@@MrAnimescrazy I just want basic rasterized 60 fps at native 1080p for atleast 6-7 years because those fps will be enough for the features like framegen or only for normal smoothness of the gameplay
@satyamwathrey7704 ok then yeah either card would be overkill for 1080p 60 fps but I would get the 4080 super since it performs better then the 4070 ti super and it will last a bit longer. My pc is in my profile picture with my first all white build with a 4090/ 7800x3d/ 64 gigs of ddr5 pc build so my pc will last a very long time but I am looking toward the next gen pc parts so I may upgrade.
@@MrAnimescrazy Actually I want an overkill card because I want it to last long for the 1080p resolution because the normal performing card won't last long for the gaming I will wait 6-7 more months to see what 50 series have to offer then I'll decide accordingly to my budget
@satyamwathrey7704 ok and it will be hard to get a 5,000 series card but if you can get one compare the performance to the 4070 ti super and the 4080 super of course depending on your budget.
Don’t play 4K Very High if your GPU can’t handle it VRAM wise. 8GB isn’t enough for 4K typically. Stick to 1080p or 1440p and lower some settings. It also depends on what games you’re playing. But yeah it sucks that gaming companies are still selling 8GB cards. Base should be 12GB at least.
The 4060's crippled PCIe interface makes the spillover issue even worse. One of the main benefits of PCIe 4.0 X16 was that while it only made a small difference over PCIe 3.0 x16 in normal gaming, when there was spillover to system RAM, the doubling of throughput made it far more capable. For example, an 8GB card could handle 1-1.5GB of spillover to system RAM and only suffer a 15-20% performance hit. While with PCIe 4.0 x8, even 500MB of spillover starts to lead to far larger performance hits. This is why an RTX3060 8GB handles spillover to system RAM far better than the 4060 8GB, even though the 4060 is around 15-20% faster in terms of compute performance.
Future proofing is just a coping mechanism, you always upgrade before the hardware becomes a problem unless you are poor and if you can't upgrade then you shouldn't be looking into buying new GPUs.
Realty is that 8 GB is more than enough to fill plenty of high Quality textures, and if games cannot run well with it that's because developers are lazy to optimize better, or manufacturers push developers to use more memory to make them sell more expensive cards.
That's not reality at all. Working on a PS5 the developers have access to 12.5GB of memory and they use about 10GB for graphics memory. The Xbox Series X has 10GB of memory intended for graphics and 6GB intended for everything else. The developers simply won't cripple their games too much to run on PC.
It took this video 8mins to actually show me the true difference, which is why I tell people that 8GB is not the end of the world at all. Who? In there right mind would be buying a 4060 and playing at 4K, come on man… there shouldn’t even be any testing done at such a ridiculous resolution for a budget card. Make a new video with both cards side to side at 1080p only and now let’s see how unnecessary 16GB is for what? Twice as much money.
@@arenzricodexd4409 Nah, zWormz Gaming did a bunch of test using various gpus, using high end cpu, and he also came to the conclusion that vram usage is higher in that area.
Fleet's End is extremely CPU demanding area. My tuned R5 5600 can barely hold 60 fps here. While VRAM-vise the DLC does not use more than the main game. Played both the main game and DLC at 1620p/DLAA/Very High on my 12GB GPU w/o any issues. While at 4K + DLSS Quality this game needs more like 13-14GB.
why is the vram usage less on the 16gb tho? forbidden west 4k high, its showing 7.2 for the 8gb card and 5.8 for the 16gb? doesnt make any sense Cyberpunk 1080 RT overdrive also has the 16gb card at 5.6gb and the 8gb is at 8.2 with spillover?
@@Zombie1014060ti 8gb💀 at what cost? I payed $400 for a 6800xt 16gb last year, even 3080’s were $450 at that time. I wouldn’t even take a 4060ti 8gb for free.
That was actually my takeaway too. But I didn't see that as a bad thing, from the results I just thought well I guess I should just default to 1080p on modern games on my 3060ti.
@@userblame632 HFW gives me 60+ fps in most areas at 1440p high settings (texture, 4x anisotropic, level of detail, etc.) with DLSS Quality. It drops below 60 in some high populated areas sometimes. This is a mid tier card, not meant for ultra 1440p gameplay.
Reduce textures and problem is fixed. No idea why its so difficult. Seems many just want to play at Ultra or the game is not worth it. Better yet, don't play new AAA titles and save money on both games and hardware. You may even avoid all the bugs new games are released with these days. What this shows is that the slower cards don't really need a lot of VRAM. They are to slow anyway. A game with Ultra textures and the rest at medium to low don't look that good anyway.
people basically gift money to Nvidia, buying over-expensive GPU's.. and you expect them to play a terrible optimized ass game with HIGH settings? how dare you!1!1!1 is ultra or ultra, and 4K since I NEED to see that little wood texture of that tree that is 10 meters away of my character, or that water reflextion, I pay money to see that, dont care about gameplay
13:55 I don't think it's a run to run variance, the 8gb card is consistently pulling 10% less power so it's still being throttled somehow (and it's not temps). Also, if you look at both scenes, you can actually notice more details on the right in some cases. I think it's doing some compression to 'keep up' with the other card, if you look carfully you see a 1gb allocation difference in vram after all so there's definitely something happening. With avatar, some of the biggest differences I see are coming from the dust clouds. I also see some measure of difference in soil/plant textures, but those didn't feel as pronounced.
That's just to illustrate the issue, but it gets a lot worse than that, many games downgrade the texture packs, so you're not getting the visual quality that you're supposed to get, you just don't notice a performance dip, in fact since most reviewers are looking at frames per second only, 1% lows etc, they miss the fact that the game looks like s***. And that's at 1080p/1440p... In today's and yesterday's 2023 games, you shouldn't be paying 500 to $600 for 8 GB of RAM that will end up giving you a subpar experience. We're not talking about just framerates here, we're talking visual quality. In addition, you shouldn't be paying a used car price for video card that the company wants you to throw away in 2 years That is absolutely on the manufacturer There's an old word for that, it's called usury Don't blame the customer just because they're the ones who are trying to have a better time with their very hard-earned dollars, in a market where they cannot have what they actually want. This is monopolist behavior
I was worried about vram when the 40 series released but now.... All I care is gameplay, I dont want eye candies and lower my settings even if it is too overkill so that I can actually see the important things happening and react on time. (4070 ti)
That was a really well made analysis. 👍👍 Regarding the screen mode used for these tests, were you using Fullscreen or Borderless? There are some games that can stutter in either of those modes, depending on the game itself and the engine they are running on. One of the weirdest examples that I know of is Dead Island Remaster along with DI: Riptide Remaster. On 3090 RTX 24GB VRAM, both games have stutters in Fullscreen, but run super smooth in Borderless or Windowed, which is kinda odd. For comparison, they both run flawlessly on a 680 2GB VRAM and 1080 8GB VRAM, regardless of the screen mode. 🙂
This is showing that 8gb vram is only for 1080 with medium or lower 1440. Need 12gb for higher settings in 1440. 1440 is what games should being run since even 6yrs ago. The AMD 7000 and the Nvidia 40xx should have had 12gb vram as the base line.
AMD cards have more Vram because that's about the only thing going for them against Nvidia. If Nvidia gives the same amount of Vram as generously as AMD does, no one would be buying AMD cards.
Even for 1080p in some demanding games if u put textures and settings in higher quality, ur gonna run into problems. 8GB Vram has been there since about 2015 and its showing age now. 11GB Vram should be minimum by now.
I can attest that my RTX3080 10GB would run out of VRAM with Horizon Forbidden West if use any texture setting beyond medium if I have the Framegen mod turned on even with DLSS Quality on just 1440P.
My almost 10 years old R9 390 has 8gb vram. Only scumbag Nvidia uses vram to limit the potential of their GPUs to run at higher resolutions. Sadly the future 5060 will never be a 1440p gaming GPU. We will never move to 1440p standard because of Nvidia.
nVidia sure is hilarious I want to upgrade from my it-set-itself-on-fire-and-this-is-not-a-joke GTX 1080, but I don't want to downgrade VRAM and I'm sure they'll figure out how to do that. _Introducing the RTX 5090 Plus Ti Super X-treme (3.2 GB Edition)_
@@eclxysThey do, it’s meme’d because the 4070 makes absolutely more sense once you look at the price/performance. Base model 4060 is a good card, could still be $20 cheaper IMO.
I'm playing GoT on my 6gb 1660 super game actually runs great 70-80 fps on med-high with FSR. I was on a 1060 3gb before and it was impossible to load textures on majority of newer games even the going from 3gb to 6gb was pretty big.
@@joshmonus ive only had crashes with my 7900xtx in two games that i play quite a bit since swapping from an nvidia card thats counter strike 2 (driver timeout issues) and team fortress 2 (which just randomly closes sometimes when connecting to a server) which im trying to figure out if its my ddr5 - 6000mhz ram or something else but other than that the software and amd seems pretty good no stutter when paired with my 7950x3d cpu
Great video, thanks for the information overload. My personal takeaway is that for my 8GB 3060ti I should think about trying out 1080p with higher settings besides what I usually do (what I usually try first now is 1440p with high settings, and then DLSS Q if that does not give me 60fps, but 1080p@60fps is fine for me as well, and as this video points out DLSS takes up memory too).
Having enough Vram is not about framerate, it never was about framerate, it's about immage quality. You can have a game playing fine at 60fps while looking like trash because there's not enough Vram to load assets... wich is maybe worst of having lower framerate.
@@WrexBFOn any modern game engine it will affect immage quality before making any difference in framerate, that's my point. Making an entire video just looking at framerate is basically useless.
1080p 12gb is enough in 99,9% of cases. If you play at 1440p or even 4k with a card that has 12gb VRAM, its obviously not a high end card bc then it wouldnt have 12gb VRAM. So what that means, is that you can use DLSS. You SHOULD use it bc you get a massive amount of performance for almost no impact to image quality, sometimes DLSS upscaling is even better than native rendering with TAA anti aliasing. dont forget, as soon as you upscale, the actual rendering resolution is what dictates how much VRAM you use, not the upscaled output resolution. And in new games, where you can run into problems with 12gb, you will very likely have DLSS. At 1440p and 4k resolution DLSS is really a godsend. this doesnt excuse low VRAM offerings but its a realistic outlook of real world usage. UE5 gams with nanite, lumen, path tracing etc. are so freaking intensive, you will need upscaling anyways, so VRAM is less of a concern than when rendering native 1440p or especially 4k.
@@mrman6035 those games ain't even fun and just milk gamers for money in the way of DLC. Get a game from 5 to 10 years ago it is just way more fun to play.
BS. I tried Hogwarts Legacy on 1080p with an RX 6700XT 12GB. The game was eating around 9-10GB of Vram. Also Resident Evil games would eat huge Vram if textures and settings are pumped up. Ye 8GB Vram in 2024 is just garbage.
@@hircine92h yeah, that's called terrible optimization, there's even people with 24Vram GPUs that say the game consume almost all of it and yet people playing this game with a rx6600 1080p ultra settings at 54-60 fps no problem so no, 8GB Vram in 2024 is not garbage, is the AAA industry and their lack of optimization in their games... or they did that on purpose so people waste in more GPU's, since companies dont find it profit if someone stay with his GPU for more than a decade
@@hircine92h It had memory leaks which was fixed + you just do not need to use max settings every time... BTW. Nividia has better compression for vram so 8gb =/=8gb. I am playing cyberpunk 2077 with max texture settings without problems on 6gb vram 🤣 Nowadays every game should have automatic texture swap, when vram is full like in the Avatar game, to bad it is not a case.
@@hircine92h allocation is not equal to usage, for example I've had 16gb ram for thr longest time and the game I always play uses around 12gb then I upgraded to 32gb and the game now uses 17gb of ram but still performs the same.
Cyberpunk test is actually misleading. 8gb cards easily run out of vram when playing even with simple raytracing, overdrive hits even harder in some areas or after some time, but thats for thw NC without PhantomLiberty. Dogtown just destroys 8gb cards with raytracing, overdrive is unplayable mostly (tested with my 3060ti). Even 12gb is not really enough for dogtown (my 4070super runs out of vram there sometimes). They shouldve added a demo run for Dogtown really.
if you dont care about ray tracing, 8gb is fine. i have a vanilla 3080 (10gb) and i often have performance drops in rt games due to vram the longer i play, restarting the game brings my fps back. I have 1440p monitor... and it doesnt matter if i use dlss quality, balanced, performance, because the final resolution is the same so the presets use the same 1440p assets... i think 12gb is the bare minimum these days if you want a little bit of every graphic tech this gen has to offer (ray tracing, upscales, frame generation, high or ultra textures, 1440p or 4k etc). 1080p is the new 720p.
Funny thing about games is that many play old titles, and your 1060 will be perfectly fine for older titles like Battlefield 4 and will perform pretty incredible I don't judge people for having old gpus, some don't even care about modern titles, and play DOTA 2 and old RTS style of games Having a powerful GPU may actually cause issues in older titles I have 4090 and Battlefield 4 never utilizes 90% of my GPU, which in turn forces my CPU to work more. Because CPU is not fast enough for the gpu, and the title not being demanding enough to drive the gpu creating some issues in frame times. You want that 90% sweet spot somewhere, with low CPU usage I play older titles, and yet I have 4090 and 13900k. I was doing just fine with RTX 3070 in BF4 that I play to this day It's the Unreal 4 and Unreal 5 engines that take a toll on your gpu. They are not optimized for responsiveness, but mostly an engine that looks good and has potential for beauty, at the cost of the performance. Unreal 5 is brutal on GPU, even 4090 struggles to keep 160 fps in most titles consistently
Sell your old GPU to fund your upgrade at Jawa! jawa.link/OwenGPUJune24 Use code OWEN10 to fund your upgrade!
The Avatar devs had a presentation discussing how they use a dynamic texture streaming method to prevent performance issues with low VRAM cards, this allows even 4GB cards to work. DF has a video on it called " Avatar: Frontiers of Pandora Tech Deep Dive - GDC Reaction Special" at 15:00 min in, they explain it
Went and the price is a rip off, I can sell it on eBay at a much better price
No need to upgrade. My 6900 xt with 16 gb vram does a great job and I've yet to find a game that taxes it. And I play Vr games.😁
no thanks its offering $40 for my 5500XT mech 8gb OC
Could you compare faster 8 GB gpu to 4060 Ti 8GB? Like RTX 3080 vs 4060 Ti 8GB.
Remember folks nvidia is still going to release a 8gb vram gpu with the rtx 50 series and amd will do the same 🤔
The more you buy the more you save
yeah who cares, just dont buy it, then they will eventually stop relasing it. Nobody HAS to buy a 8gb 5060 or 5060ti. Ignore those products. If people always just cry about it but then still buy it, Nvidia wont change anything.
I personally dont care, I will buy a 5090, and even if that one would stay at 24gb I wouldnt care bc its way more than enough. (it wont stay there but likely be 28gb or 32gb; more likely 28gb tho). If 5060 is out, people can buy a 4070 Super if they want more than 8gb VRAM and if they cant afford a higher class GPU of the 50 series. Or they can buy an AMD card if they want to do that.
nVidia are banking on their new compression (NTC) which has to be implemented into games... greed or short slightness's? I leave that up to you
And likely a 6gb version at some point like the 3050 6gb lol.
They’re probably going to release 12gb 5070 and also slap 16gb onto a 5060ti in clamshell and charge extra $100
Just download more VRAM
How bad is 8GB of VRAM in 2024...
Me watching this video with 4GB of VRAM in 2024
Dodged a bullet right there. Good thing you didn't get an 8GB GPU.
Yeah i feel you. I am watching on a gtx 1050ti
i think its funny that im watching this video about how 8gb isnt enough and read this comment about you having 4, whilst i have 512 MEGABYTES of vram
😂😂 same here bud
Me with my 4GB 3050 laptop.
Meanwhile me with 6GB: Chuckles, I'm in danger. 💀
at minimum only the RTX 5050 should be 8GB, everything else needs to be 12GB+
Sadly people keep buying so they have no reason to up the VRAM.
theres another company offering higher vram capacities, look em up
@@Herr_AffeIg it depends on availability in the end. Where I'm from I had to choose between the 7800xt and the 4070 super. The 4070 super was around $100 more expensive. I went with the 7800xt because I didn't think 8% more performance with less vram was worth $100. If they the same price, it would've been a tougher pick. But welp, Nvidia tax is an actual thing.
rtx 5030 4gb
@@Herr_AffeIntel is actually really good and heckin' cheap, got one as an "inbetween" because my 3070 broke and honestly... Mostly everything just works and performance is good enough :D especially for like 150€ 💀
I see a lot of comments saying that the video just proves that 8 GB is for 1080p. Let's put one thing clear from the video: the 4060 TI chip can deliver more than what 8 GB is able to handle, and it's a fair assumption that someone tries to run on 4K using DLSS on performance with frame-generation. The world gaming community is not only US, there are countries where a 4090 costs the same price of a car, so buying a 60 card is usually the affordable way to go (and at least where I live, it is still more than 2K). The time for giving a little more of VRAM is beyond past now, 8 GB for a 60 class GPU should be unacceptable.
1080p or 2k with some upscale may be, still enough. Not talking about new cards tho...
As long as there are fanboys who defend the skewed price to value ratio, it will never change.
@@Nif-kun I would presume companies dont care about fanboys or even opinion, only sales.
Question is: how much fanboyism really contribute to sales.
@@xerxeslv
All honesty, gamers don't really have that much contribution when it comes to the overall sales. The reason why NVidia or AMD prioritizes buzzwords like crypto or AI now is because the industry level corpos buy them in bulk. Then again, we are still a demographic to consider since we still pump money for them. The reason why things seem to have stagnated is because people still buy them regardless of complaint. Voice has no say in sales, and until people stop buying these products, nothing will change. The fanboys simply propagate the mindset of buying without thinking, which leads to where we are.
Rx 6600 8gb is pretty popular GPU
Saw this coming with my 3070 and sold it.
Got myself a 7800xt instead and couldnt be happier with my decision,
Im gaming at 1440p without much limitations rn.
Saw a Rx 6800xt on jawa for 390$
Still hanging with a 3070 because a 7800xt would cost me 3x what I paid my 3070. And I'm not planning to upgrade before GTA 6 hits steam on PC
Did the same with my 3070, got a RX 6950 XT on sale and could not be happier. CPU is the current bottleneck so will be the next upgrade.
@@jacomoolman6503 6950xt is also apparently better than the 3090 without rt. Fricken insane ppl still sleep on amd.
im using 3070ti its still fine but maybe next year is when it really starts to struggle
12gb vram is the sweet spot..16gb vram is ideal..8gb vram is shoot on sight..
For now.
@@sandboy5880 Thanks Captain Obvious.
Well, yes, 24 is ridiculously overkill (except for ai or productivity applications). Of course, in 10 years that will change but that's how it's always worked. @brunoutechkaheeros1182
12GB is barely enough if you want to play at console settings...
@@lupintheiii3055 not even close to being true lol
Task Manger also shows when VRAM overflows in to system RAM, handy to keep an eye on.
CapFrameX will show you as well
90% stutter and lag complains I read actually end up with just VRAM overflows. People are very uneducated when it comes to VRAM, and how it affects your games.
Don't forget to mention that other software in background is also using VRAM and its usage grows with newer software versions. I can use up to 3 GB even without any game: browser, steam, vs-code, image-editor, video-player on 2nd monitor, discord, tg, etc.
Apps use SRAM not VRAM, VRAM is used for only the monitor output and textures so only a 2nd monitor can make you use up VRAM hence 32gb of SRAM is recommended
@alargecorgi2199 artificial limitation. The fact that 16 GB variants can do fine on 4k shows that these gpus are unnecessarily getting limited to lower visual fidelity. Also the the bigger problem is not being able to match console quality textures and probably resolution too in future.
@alargecorgi2199 99% won't use 4k because they can't use 4k for the artificial limitation. None of these changes the fact that these GPUs are basically garbage compared to consoles. And for the price they're being sold, that's not a good sign.
Maybe if it was sold for 100$, then maybe 8 GB would have been acceptable for a potato product.
Someone is going to make a meme about you moving your whole person around to point at things 😂
This is why we love Daniel. He is the real deal, a legend, and to some.. a mouse pointer
Cursor Daniel is probably my favorite edition of Daniel
bout to make my windows mouse cursor into a transparrent Daniel pointing his finger at anything i want him to point his finger at
Daniel Owen The UA-cam Pointer Channel. Sounds about right.
The same way a lot of youtubers started copying Josh Strife Hayes' quip of holding a mug while talking and General Sam standing up with a big mic in his hand narrating things in the background.
It's only a matter of time until it starts catching on, really.
Would love to see a How bad is 10gb vs 12gb video.
Same as here, in the vast majority of games you'll never go past 12GB in 1440p or upscaled 4K. 8-10GB is an automatic no-buy in 2024.
he did a 4070 vs 3080 months ago, so you have that, its still from 2023 or early 2024.
🙂 the rx 6700 10gb is pretty decent, it should be enough for 1080 high, and it only uses a single 8pin plug so all u need is any 450w power supply, its a bit faster than a 5700XT
@@anitaremenarova6662 Oh boy you are wrong. The same comportament that he showed here with the 8GB card never going close to 8 GB in use is happening with 12 GB too. My 4090 has enough VRAM for games, and in most modern games I played used 13-15 GB.
@@Just4Games2011 Allocated VRAM isn't the same as the one being used.
My 1080 ti with 11gb of vram:
Best purchase you ever made if you got that thing at launch.
what about your 1080 ti with 11gb of vram?
that card is to old..
@@groenevinger3893 and yet it plays newer titles just fine
Is weaker than an RTX 4060.
It would have been nice to add the 3060 12GB and see if it beats the 8GB one in some cases
It does, none of these examples would brick a 12GB card.
Accademically: yes. Practically: it probably doesn't have the horsepower to lead to any desirable outcome.
I still have 8gb (built in feb 2021). When I upgrade from 1440p to 4k and/or play newer titles, I'll upgrade to 16gb+.
I have 12g vram rn, I'll wait for 4 years before finally biting the dust and upgrading, one thing for sure, I'll get the top of the line shit next time whenever i upgrade
: )
Just get enough Vram, RX 6800 was $400 two years ago
@@lupintheiii3055 i feel like at 1440p i will survive 12g, especially coming from 4gb vram up until two months ago, i wont mind lowering down the settings ykwim ? + my gpu is very capable (4070 super)
thats what i do. I currently have 8gb - a 1070. I wanted to buy a 4090 but I decided to just wait for the 5090. 24gb of the 4090 or the probably 28gb of the 5090 are more than enough for YEARS. I will play in 4k resolution but even then. Developers wont make games where you need more than 24gb VRAM anytime soon. Hell more than 16gb isnt something Devs will ask for anytime soon bc the majority of users just doesnt have that much. Let alone 24gb+, only a minority has so much VRAM.
So I plan to keep my 5090 for a bit and maybe look at the 7090 and how good it is. When I can comfortably drive my upcoming 240hz 4k OLED monitor, Im extremely happy. A 5090 will be able to do it in some games with DLSS, but there are already UE5 games where this probably wont be possible in 4k, even with DLSS Performance. So if anything this will be the reason I like to upgrade to a 7090 or 8090, but certainly not bc I run out of VRAM. I firmly believe VRAM wont be even a single thought for even just a millisecond during the lifetime of my 5090.
1440p is going to be viable for that much time with 12 gb vram. Unless they make more unoptimised shit. Also I saw huge difference previously when I maxed out texture settings in games but these days high vs very high / ultra is barely noticeable.
@@DELTA9XTC The 90 class cards just don't seem worth it to me. Maybe I might get the 6090 because of my immaturity and poor sense of humor I don't know. But is a computer part in the range of $2000 worth it when you can get a whole very capable computer for the same price? If you want to upgrade from a 1070 to a 5090 then go for it it's not my money. But I just think there's more reasonable options.
Interesting that even when you can adjust things to get it to work, there are certain modes that just don't behave in newer games on the 8 GB version.
Nvidia wont change the amount of vram if ppl buying 8gb cards at 400 dollars
Daniel chose an excellent batch of games to test.
Fr💀
These are the games which I also want to play
On my 16GB 4060Ti
The results seem like if you're buying a 4060 thinking it's a native 4k beast machine, you will have bad results with 8 or 16 gb because your card blows for native 4k no matter how much VRAM you give it.
You could struggle at native 4k even on 16gb (Like how Ghosts barely hits 30fps on a game with PS4-tier visuals) or you could play at something realistic for bottom tier hardware and get 60 on 8gb or 16gb.
I dont think anyone should be buying 8gb card for 4k. 1440p is still fine on most games that are actually optimized with 8gb cards specially if you turn down solw settings
8GB is not fine enough for 1440p.
@@BlackJesus8463 it is in many games if you turn down some settins
@@BlackJesus8463 This video literally proves that 8GB is enough for 1440p. When you're buying a 4060 series card you should not be expecting MAX SETTINGS EVERY GAME NO MATTER WHAT kind of performance. This is nothing new. 60 series has always been the card where you may need to lower some settings here and there.
@@OneDollaBillNo it's not, try using a GPU with more than 12GB then compare immage quality
@@lupintheiii3055 you a bot, look at the video. Stop posting bs
The man doing his duty to say "nice" when he runs into the number 69. Salutations, sir!
Daniel, do this test again with only 16gb of system memory. When the 8gb card runs out of memory, it spills over to system memory and uses over 16gb. I'll assume people buying a 8gb 4060 wont be buying more than 16gb of system memory and therefore would probably cause even worse results.
Just lower texures form very high to high and problem solved. It also have almost no impact in image quality in most games
It is not big deal as you make it
Like literally it’s that simple smh or just don’t use FG. I like Daniel’s content but damn this VRAM discussion will always be the same.
And it’s just common sense to get more VRAM when you upgrade your gpu.
Infact it's actually worst than what he make it since textures have zero impact on performance, so have to lower that because you lack Vram is the dumbest thing ever
ultra and high settings may not look to different but to think that games moving forward aren’t going to keep using more n more vram is delusional
ye lower textures even tho u bought an expensive GPU +$400. Seems logical. Great times.
@@hircine92h 4060ti is 1080p gpu and u are crying u can see diff in 4k both gpus suck soft dick in this res no diff 8 or 30fps u wont play like this either way - and if game will use more than 8gb at 1080p 4060ti will have to slow die to work at this res either way as every feature cost performance not only vram - even framegen cost u real fps and u can easly see that with any program
You do not need to drop ALL game settings to fix such issues. Often reducing texture quality one step from Very High to High (or Medium) is already enough to make the game playable if the GPU itself is fast enough.
Yes this is a downgrade in image quality, but if you are buying lower Midrange cards you should accept Medium Settings as your friend anyway.
That being said, medium in new games looks good most of the time.
Its not like the early 2000s when shadows or meshes just went missing when turning the settings a bit down.
Rules of happy gaming:
1. Turn off fps counter
2. Use recommended settings
3. Enjoy the game
If your playing single players games yes
I dont need FPS counter to see the texture is not loading properly because my GPU is running out of VRAM.
I'm glad that I sold my 3070 and got a 4070tiS when it launched, 8GB definitely became a hindrance for 4K in 2024.
dlss brother
@@xxNiceLeaderxx Isn’t always enough like in CP77 RT.
@@sapphyrus I don't use RT in my games so i'm good on that.
so i guess people buy 4060 to game 4k? lmfao idiot
4k cards already have 8+gb vram
4070ti is shit for 4k only legit 4k card is 4090
i feel like 8GB is less than a problem than some think it is, yes i think 4060 and 4060TI should have had at least 12GB but is 8GB really a problem ?
if u buy a 4060TI u are NOT aiming for 4K ultra that is a fact, that is more of a "1440P high with DLSS Q" card, and for that 8GB prove to be just (barely) enough.
Its kinda okay, just the fact that games cant run smooth because of VRAM not GPU’s power sucks :(((( image having a decent GPU and it easily can handle ultra settings, but its limited by VRAM 🫠
@@blondegirl7240 nah, 28 fps is already a crappy frame rate for any game, it does not matter if 8gb sends you to 8 fpsinstead because you start from crappy performance anyway.
@@blondegirl7240 me who is having a 16GB 4060Ti but limited due to its power 🥲
the point is that as the AAA games (games being unoptimized dramas aside) advance even 1080p can demand close to 8GB of vram already, with that kind of thinking everyone should be stuck with using just iphone 5 or Samsung S5 etc
My god I learn so much from you Daniel you're honestly excellent with your content to empower the ability to buy the best gpu performance for your money. Going in details about vram, RT, DLSS vs FSR, etc, all of it really helped me understand everything and be able to make the right the decision for what I want. I'm really not confident in how long it will take AMD to improve FSR, so I will be aiming for a good Nvidia priced card. But this info is so important for even choosing the correct Nvidia card! Thanks a lot.
In my opinion 60fps should be the mininum. 24-30fps in 4k high with 16gb vs 8gb. The 4060 Ti just doesnt have enough power to make use of that vram. 30fps isnt enjoyable and you are going to get much better experience if you just lower your settings little bit to get 60fps and by that point the difference between 8gb and 16gb is 5-15fps. and for the money you could get a card that has more power and you would end up getting more fps with that.
@jumbob That's what I was noticing too. The settings he used to create vram issues are so far beyond what those cards are suited for to begin with that it doesn't even really matter. The game is going to run like garbage at those settings either way, just marginally less so with more vram. Do people really think they're supposed to crank everything to the max on a midrange card? Higher settings don't necessarily even look better depending on what it is. Stuff like depth of field, motion blur etc I turn off anyway regardless of performance because I just don't like the way they look. A lot of stuff like that is a matter of personal taste. And when you dig a little deeper there are always settings that have a heavy performance impact for miniscule difference in visuals.
You wouldn't use some insane 4k config but the video literally shows it loses frames in Forbidden West at 1440p with DLSS quality. You're rendering a base 1080p image at that point and the FPS is still over 60 but it's less than it should be for that hardware. I don't think 1440p DLSS quality is some insane setting in a game that first came out in 2022 on freaking consoles.
I don't get it, if you are tight on budget and buying a 1080p capable gpu like 4060 or rx7600 , what's the point of running and comparing them on 4k, most of the people use 1080p either way if they have a budget build
All this aside, RE4 remake uses wayyy too much vram for the graphics its putting out and i dont know why its the way it is
game engine
@@nicholasxamotainiumgilgamesh
not engine related, most "issues" are not related to the engine at all. it's just a catch all people who have zero idea of what they are talking about use to vent that their hardware sucks.
anyway the main difference is texture size and how many textures the game has in use at any given point in time (what's needed to draw the current scene).
however it isn't just "textures" either, some games utilize the GPU for heavy calculations and can flood it with large amounts of data which can gobble up vram as well.
any engine, custom or not can utilize the gpu in this way. all 3d games use the first bit but only a handful use the second because it's more work to plan around it.
It ALLOCATES too much VRAM, but it does not really need that much. At native 4K/Maxed out/RT/High textures (8GB) it runs fine on the 12GB 4070 Ti. This card can barely offer locked 60 fps experience though, so I personally played this game with modded DLSS set to Quality. That way the game looks better than at native and provides 90+ fps.
12GB for 4K + RT for a game with decent quality textures is pretty reasonable, I'd say.
It's because UA-cam tech channels push people to buy more expensive cards with more VRAM, so game developers can release games that don't run well with less VRAM.
Meanwhile, there are more people with less than 8 GB VRAM in the Steam hardware survey than there are people with more than 8 GB. I'm getting the feeling that channels like this one don't really reflect what typical gamers are buying and playing.
@@IcyTorment
that's not the reason, games are made with specific hardware in mind. they don't care what tech channels are doing/saying lmao.
these 8gb cards are not being bought, they are relics of the past that people are riding until they feel like they got max value out of them. how do i know this? i know it because i was one of those people.
i was on a gtx 970 (3.5gb vram) until about half a year ago. now i have a 7900xt (20gb vram). that's a decade long gap. plenty of value, people spend more on fucking coffee it's crazy!
Had a 12Gb 3060 and I had no problem in terms of running some games at 4k with upscaling , at no point did the Vram become an issue , upgraded to 4070 super recently no problems so far either, it gets close to that 12Gb but 99% of games i played don't go over it. but it certainly is clear that at some point in the near future even 12Gb won't be cutting it anymore.
Bought a 3070ti before the 40 series and the 12gb 30 series came out. Feels real bad now
always me everytime i upgrade... like a month later a Ti version or so comes out with extra gigs or speed 😭
Eh, i have the same card and it still works out well enough for me on 1080p. Just manage your expectations and perhaps be like me and play older games that don't suck as much.
@@X_irtz bro a 3070ti isnt a 1080p card, GPU wise. I mean ppl use(d) 3060's for 1440p sometimes and that is a bit of a stretch imo but 3070ti is really a classic 1440p card. Or lets say it would be one, if it had more than 8gb VRAM. and thats exactly the problem. Performance wise its quite strong, its around an RX 6800 non XT. A 3080 is like 20% stronger and ppl used the 3080 as 4k card when it got released. A 4070 non super is only like 15% faster than a 3070ti according to techpowerups relative performance chart.
thing is you cant even use the full 3070ti at higher resolutions bc of the VRAM limitation. And its not about all new games sucking, its that they often want more VRAM bc of new graphically intensive features and the 3070ti doesnt have that. Its what we can see here with Ghost of Tsushima at 4k Ultra.
Should've had the foresight, people who bought a 4070ti for 1440p and 4080 for 4K will feel the same soon as well.
@@X_irtz Why 1080p? I have a non-TI 3070 and play everything at 1440p fine. Both our cards should absolutely have more Vram but I dont understand why you are playing at 1080p.
These GPUs are targeted for 1080p, and VRAM management isn't even a big deal in game development. 8GB is enough for 1080p and will stay that way for quite a while.
That may be true, and Nvidia may want us to see the RTX 4060 Ti 8GB as a 1080p GPU... However the price says otherwise.
There are no bad GPUs, only bad prices, and holy shit is the RTX 4060 Ti 8GB expensive for a 1080p GPU.
For ~$400 price I'd hope to get decent 1440p performance without any kind of issues or guaranteed +60 fps at max settings in every game at 1080p, and the RTX 4060 Ti 8GB isn't that.
@diego_chang9580 It's true and I agree 100%, I can't even imagine why we don't have a GTX 4060 instead of a RTX 4060, ray tracing is too heavy for cards below a RTX 3080 in "RT" cores.
it doesn't matter what nvidia says they are targeted for
4060 is capable of very high settings and even 1440p with dlss, so why bottleneck it with low vram if it can do much more, and you can clearly see that from the video
literally its bottlenecked even on 1080p with raytracing and fg, not because its weak but because of vram only,
@@Rasingard Probably something to do about the architecture and because it being an RTX card it helps it with the price.
If it was a GTX 4060 people wouldn't even consider paying above $300, which is exactly why the card is overpriced.
The RTX 4060 can do some RT, sure, and it can use Frame Gen... Most of the time. But honestly? The only reason I'd buy it right now is DLSS.
The more you buy, the more you slave. strong copium in this section here.
Imagine spending money on an obsolete 8gb 5000 series card that would probably cost $300. "The more you buy, The more you SLAVE"
waiting for the gddr7 dweebs lol
Just turn down texture settings or get Radeon.
Nvidiot Tax
Those will sell a lot!
Slave lol
The 4060 Ti is in a very weird spot of being too weak to handle a lot of games at maxed 4k, while also having that be really the only area you actually begin to see a huge difference in VRAM performance. Yes, 20-30 frames at maxed 4k no DLSS is a lot higher than 5-8 frames, it's still essentially unplayable.
yep. in most scenarios whenever you have steady 60+FPS with 4060 12GB, you will also get 60+FPS on 8GB as well. so it's not such a big deal. 8GB is kinda doing the job. the problems start when you have 3070 ti with just 8GB, or 3080 with just 10GB. VRAM limits those much more.
Its for 3d and creative work. U can do alot with that vram and power
This is why most people whining about lower vram weren't really completely justified. While 8 was a downgrade from what the 3060 offered, if you treat it as it's own thing there's not really much of a reason to justify that much more than 8, because it's only going to handle 1080p and light raytracing workloads anyway, what would you actually need more vram for? The 4060 ti definitely should have at minimum had 10 but the last minute doubling of vram offering to appease complainers made no sense because it's completely overkill at its performance level. It might be worth it for some ai or productivity workloads on a budget but other than that it's kind of silly
@@jayceneal5273 No, just no to everything you just said.
You shouldn't even have this conversation in the first place, having 8GB on anything more expensive than $200 in 2024 is just miserable, the fact you justify that is the exact reason why that cards exist.
@@lupintheiii3055 miserable at over 1080p, sure. It's perfectly fine at that resolution level and lighter graphical settings which you should expect from a budget card. I have a 4060 laptop and have no problems when graphics settings are adjusted for its performance level. Why do you expect vram for full 4k path tracing levels on a card that is barely above console performance?
I would be interested to know how many games ran differently in 1080p which is the target resolution of the 4060 ti in the first place. All the other resolutions are for 4070 up.
99.9% will work just fine with 8gb… even with 4gb! When running 1080p
This is the problem with 8GB of VRAM on a relatively more powerful GPU. The GPU has the ability to run at max quality or higher res or even now with FG, but you ended up needing to lower the quality or run at a lower res just not to hit that VRAM capacity bottleneck. Also on some games where the difference is only 5% to 10%, remember that this is just a snap shot, only a tiny part of the game. Normally as you play along, the VRAM usage will creep higher thus over time you see the performance on 8GB getting worse.
One thing that a lot of people doesn't mention but I personally experiencing it, NOT ONLY GAMES USES THE VRAM! Owen uses 2 PC to capture the game, thus the game can basically have all the VRAM to itself. In reality, some people play while leaving their browser window open. Some uses 2 monitor, so while a game might be running on a single monitor at 1440p, just by having 2 monitor means the other monitor will also use a bit of VRAM and that little bit might just push that 8GB GPU from being playable to having stutters. Some people also like to stream thus might run OBS in the background, thus more VRAM usage. Most of the example I showed normally only increase the VRAM usage by a bit and might only affect the game if it is already borderline on VRAM usage, but if you want to be a vtuber, it might be a good idea to buy a 3060 12GB instead of 4060 8GB if you only have that amount of money for your GPU budget. Whether you use 2D avatar or 3D avatar, both will use a noticeable amount of VRAM, thus if you have 8GB VRAM, prepare for the avatar to ended up stuttering and even the game also stuttering or just accept that you only have 8GB VRAM thus play on low setting.
This doesn't mean 8GB GPU is useless. Like Owen said in the video, basically you just need to be more careful. Something like 3070, 4060, 4060Ti 8GB are fast GPU that can play any modern games at good frame rate. It just that in some games with a certain settings a game might not perform right and it isn't because the GPU is slow but it is because there isn't enough VRAM. Lower the texture setting, play at lower resolution (and yes, lower resolution and not just lower the internal rendering resolution), use lower RT setting or turn it off, and usually some setting can use substantial amount of VRAM like higher shadow setting. Find the compromise that work best for you.
I'm currently using 7800XT and my previous GPU was RX580 8GB. While I was on 580 I made a promise to myself that my next GPU will have at least 12GB of VRAM mainly because like I said much earlier, I had experienced running out of VRAM. The thing is that I was just playing Uncharted! I have a 4K TV as my monitor (thus bigger frame buffer size compared to 1440p or 1080p) and I also like to open browser and leave some app open in the background (not running, just open, usually work related app). Basically the game run like a slide show at a setting I knew it can handle (which is basically max setting). Need to set the texture to low to be able to run it. I did try to run the game with everything closed and yes, it ran normally. But if every time I play I need to close everything then it is a hassle that I prefer not to have.
Yeah background apps, especially ones that are "browser based" like web browsers themselves, discord etc can quickly start eating up vram unless you manually turn off hardware acceleration (but then they can be sluggish to use instead) - was very noticeable when gaming with a 2gb vram gpu lol - ideally the operating system should be able to reduce the impact of it but in practice it doesn't always to the best job at it
Still at 1080p, my 3060Ti will las for a long time...
The problem is expection. Nobody should expect games to run smoothly at 4K very high on 8Gb vRAM....I see 1440p very high seems to run fine in Ghost - that's acceptable in my view. I'm not defending nVidia's practise here, but in 2024 an 8Gb card is a 4060(Ti) or 7600 tier card, which is around £339 here in the UK. 1440p very high on a £340 card seems okay to me?
Ppl blame but still byuing nvidia. Im at 3060ti from aliexpress, payd 200$ my next will be a amd
True! I crossed the fence - upgraded from a 2060S 8gb to a 7900GRE 16gb
What are you talking about dude?
The Ali prices for the RTX 3060ti i see is around 400$. So where do you find them for only 200$?
@@Tiber234 same with sapphire nitro+ 7900 gre
@@cajampa refurbished ex-cryptomining units from China probably? They started flooding aliexpress when China tried to ban crypto on 2021.
Those ex-miners 3060 Ti's are sold at my country at around 200 USD.
@@cajampa i got mine for 320$ on ebay 2 years ago now i think. Its used stuff Ig.
honestly depends on the resoultion... for 1080p 8gb vram is alright, and hence i dont see a need to upgrade my 3070
I think these are entry level cards and still have a place in the market. If you are not at maxed or ultra settings or being ambitious with resolution on a low end card I think a 8GB card is enough. I think $250-350 is where 8GB belongs
I expect that next gen 8gb will be close $500… But who knows. Gddr7 will be really expensive in the beginning.
@@haukikannel yeah the die is too but supposedly they are cutting down its core count as well as keeping it gddr6. If I had to guess it would most likely be maybe 10% faster than a 4060, and either be between $280 (if Nvidia ever threw gamers a bone) and maybe $330 until 40 series dropped in supply or $299 and letting 40 series drop under msrp. 40 series is sitting around msrp still with about a year left until a 5060 release
price is key though, I got a 6700xt 12gb for a secondary build, it was $245 on eBay +free shipping. 8gb shouldn’t be more than that ever. Personally 3060 12gb or 6700 non xt 10gb are the lowest end cards I’d even recommend for entry.
@@puffyips comparing used market is a different ball game entirely but I agree with you in terms of price point to an extent. A 3050 was 8GB for about $250 msrp but a 4060(which should have been 4050) probably wouldn’t be able to use more than 8 regularly. They def should’ve kept it around $280 with the 4060ti 8GB around $329-350 & 16GB at $399. It would’ve been better received.
This is regarding 5060 and 4060 though, I believe 5060ti will be 12gb
8 GB cards are for 1080p. At that resolution I can’t see any difference. (Who would play 4k with 4060 TI? I mean yeah you would try but would not play regularly. 4k is for 80 and 90 cards)
I have no idea why people buy 4060 and 4060Ti 8GB. If you want an Nvidia GPU and you can't afford a 4070, get a used 3060, 3060Ti, or 2080Ti. Spending more than around $200 for an 8GB card is stupid.
I also forget sometimes, but majority of PC gamer have no clue about hardware.
Nvidia is hoping that you don't know what you're doing and have to upgrade more frequently.
Why? If you have a 1080p monitor there's absolutely no need for a 4070+ gpu. Different story if you plan to play in 1440p or 4k, but 4060 can't handle those resolutions well anyway (it can 1440p with upscalers, but still)
@@xviii5780 exactly this is the issue!!!... People like you offering up excuses and defending a billion dollar company. You pay hundreds of $ for a planned obsolescence product, you literally buy it and self impose limitations in how to use it. I love how Nvidia was showing 1080p benchmarks a few months ago like it's the standard 🤣 when years before they were pushing 1440p and 4k. Now they realized same as Apple they can dry up their consumers by selling them junk.
@@xviii5780 Wrong.
The current "tipping point" for AAA games seems to be around 10~12 GB, so a 12 GB card will probably still be able to cache all textures / models, even at the the highest quality setting. But, in a couple of years, I suspect that won't be the case. So either get a card with 16 GB now, or be prepared to reduce texture / model quality by next year.
For anyone curious, Vram amount is mainly a concern when you increase your resolution.
1080p 6-8 is probably all you'll ever need,
1440p 8-12, you should probably stay away from 8GB though it'll work fine for most games, modern games will run better with 10+
4k 12-16+
This is exactly my experience.
It is fairly rare for 4k to exceed 12GB even, except for the most demanding games.
I can downscale from 8k maxed settings most of the time without hitting 16GB, but occasionally, I can exceed it with the most demanding games.
4090 is my reference. NVIDIA does have a very good memory compression algorithm, so results can vary against AMD or Intel
*Bro thank you for saying it, I've been saying this for so long and people while quite literally still don't get it, it also depends on the settings too, you can max out most settings and turn VRAM heavy settings like shadows or reflection down if you're worried about VRAM. But it all comes down too buy the right card for the right resolution, not that hard.*
I am very pleased to choose Rx 6700xt instead of RTX 3070 because at that time the price of Rx 6700xt was much cheaper. It's been 2 years of use now and its performance is still very good with 12gb vram.
That's weird
On my rtx 3070 at 4k ghost of tsushima it's run like the 4060 ti 16gb
Why?
It is because the more bandwidth or the resizable bar?
Probably both. Higher memory bandwidth may help here a little bit. And enabled ReBar uses a bit more VRAM.
@stangamer1151 I have 32gb of fast ddr5 ram too
I think the vram is swapping things to the ram
I didn't face any stuttering issues with my 3070 in most of the games
I always playing at 1440p and sometimes 4k if I can get good fps with dlss on
Because with 4k there is no aliasing anymore in the games
@@GameRTmasterNo aliasing? Haha. If only that were true. There's HEAVY aliasing in 4K.
@berkertaskiran what I meant
When at 1440p there is noise and little shimmering in the character hair and edges even with the best anti aliasing solution that game offer
But at 4k there is nothing the game is so detailed and sharp
And at 5k it's so so clean and no need to activate anti aliasing too
But at 5k my poor gpu can do many games
Only not very intensive games
@@GameRTmaster I always need AA at 4K. Either DLSS or MSAA. Even 4x doesn't look good.
Note that most modern games don't have much a visual difference on 1440p and 1080p when changing textures from very high to high. If your not running a 4k monitor than you can sacrifice a little and get more performance for the same look as a larger vram size of the same card. Learn what you like in games (settings wise) and then decide what brand and gpu you actually want.
This is pretty much what I'm running into with a RTX 3080M 8GB laptop. It can boost all the way up to 140W so it has the performance but at 1440P in newer games, I have to drop the textures to high or even medium to avoid performance issues. Thankfully, once I do that, I can game at mostly high settings in both Ghost of Tsushima and Horizon Forbiden West with DLSS at Quality. My main rig has the 3080 10GB and it's similar but those extra few gb of VRam make the issue less pronounced. This is why Losssless Scaling has been a game changer for me as I just lock the games at 40fps and I'm able to get 120fps with their 3X FG.
Youre using an external monitor with a gaming laptop.
@@BlackJesus8463 I've done it before, big deal
@@BlackJesus8463 Yeah. The point is you can connect it with a cable or two to your monitor (which probably has a USB hub for keyboard/mouse/other accessories) and unplug it and take it wherever you go. Also, that way one has to replace only one device (gaming laptop) rather than two (a Macbook/XPS 13 like ultrabook + gaming PC) every 3-4 years and always has all their data in one place. Not everybody's trying to get 240+ FPS on a maxed out gaming PC because they have godlike aim and have to extract every single FPS on a high refresh rate monitor to be a pro, or a mediocre streaming career, or want to show off on Reddit/YT.
Why do people act retarded when it comes to setups based on gaming laptops?
Funny how Avatar, runs just fine on 8 gb GPU, with no massive visual quality downgrade vs 16g GPU.
At GDC they explained how they integrated techniques to manage memory efficiently (something akin to sampler feedback).
8 gb should be just fine for resolutions
The market will still dictate on how devs will optimize their games, steam just conducted a hardware survey last month for developers and 8gb vram are still the largest chunk of the consumer base. Once people moved on and 12gb is the standard, we will have games that will peg 12gb and videos saying how 12gb vram is doomed from the start.
If i buy 16gb vram card like 4070ti super or 4080super for 1080p gaming would it last for atleast 6years for gaming???
Yes either card would last for a very long time at 1080p. What frames are you trying to get?
@@MrAnimescrazy I just want basic rasterized 60 fps at native 1080p for atleast 6-7 years because those fps will be enough for the features like framegen or only for normal smoothness of the gameplay
@satyamwathrey7704 ok then yeah either card would be overkill for 1080p 60 fps but I would get the 4080 super since it performs better then the 4070 ti super and it will last a bit longer. My pc is in my profile picture with my first all white build with a 4090/ 7800x3d/ 64 gigs of ddr5 pc build so my pc will last a very long time but I am looking toward the next gen pc parts so I may upgrade.
@@MrAnimescrazy Actually I want an overkill card because I want it to last long for the 1080p resolution because the normal performing card won't last long for the gaming I will wait 6-7 more months to see what 50 series have to offer then I'll decide accordingly to my budget
@satyamwathrey7704 ok and it will be hard to get a 5,000 series card but if you can get one compare the performance to the 4070 ti super and the 4080 super of course depending on your budget.
Don’t play 4K Very High if your GPU can’t handle it VRAM wise. 8GB isn’t enough for 4K typically. Stick to 1080p or 1440p and lower some settings. It also depends on what games you’re playing. But yeah it sucks that gaming companies are still selling 8GB cards. Base should be 12GB at least.
The 4060's crippled PCIe interface makes the spillover issue even worse. One of the main benefits of PCIe 4.0 X16 was that while it only made a small difference over PCIe 3.0 x16 in normal gaming, when there was spillover to system RAM, the doubling of throughput made it far more capable. For example, an 8GB card could handle 1-1.5GB of spillover to system RAM and only suffer a 15-20% performance hit. While with PCIe 4.0 x8, even 500MB of spillover starts to lead to far larger performance hits.
This is why an RTX3060 8GB handles spillover to system RAM far better than the 4060 8GB, even though the 4060 is around 15-20% faster in terms of compute performance.
Future proofing is just a coping mechanism, you always upgrade before the hardware becomes a problem unless you are poor and if you can't upgrade then you shouldn't be looking into buying new GPUs.
Realty is that 8 GB is more than enough to fill plenty of high Quality textures, and if games cannot run well with it that's because developers are lazy to optimize better, or manufacturers push developers to use more memory to make them sell more expensive cards.
That's not reality at all. Working on a PS5 the developers have access to 12.5GB of memory and they use about 10GB for graphics memory. The Xbox Series X has 10GB of memory intended for graphics and 6GB intended for everything else. The developers simply won't cripple their games too much to run on PC.
It took this video 8mins to actually show me the true difference, which is why I tell people that 8GB is not the end of the world at all. Who? In there right mind would be buying a 4060 and playing at 4K, come on man… there shouldn’t even be any testing done at such a ridiculous resolution for a budget card. Make a new video with both cards side to side at 1080p only and now let’s see how unnecessary 16GB is for what? Twice as much money.
Test Forbidden west on Burning shores area, i find my 8gb card really struggling in busy areas such as the main town even at 1080p.
that's sounds like more about CPU bottleneck
@@arenzricodexd4409 Nah, zWormz Gaming did a bunch of test using various gpus, using high end cpu, and he also came to the conclusion that vram usage is higher in that area.
Fleet's End is extremely CPU demanding area. My tuned R5 5600 can barely hold 60 fps here. While VRAM-vise the DLC does not use more than the main game.
Played both the main game and DLC at 1620p/DLAA/Very High on my 12GB GPU w/o any issues. While at 4K + DLSS Quality this game needs more like 13-14GB.
watching this with a RX 580 bought back in 2017 and still rocking! Playing GoTS with FSR3 1080p at 60fps fine here
its a great video, yes think that one better first
why crysis again though
why is the vram usage less on the 16gb tho? forbidden west 4k high, its showing 7.2 for the 8gb card and 5.8 for the 16gb? doesnt make any sense
Cyberpunk 1080 RT overdrive also has the 16gb card at 5.6gb and the 8gb is at 8.2 with spillover?
i run 8gb on 1440p and its all good
It’s alright but not ideal, it’s so much easier to enjoy a game when you can just play on maxed out settings from the get go
@@puffyips it's the 4060ti, gonna wait for the 5090 to drop
@@Zombie1014060ti 8gb💀 at what cost? I payed $400 for a 6800xt 16gb last year, even 3080’s were $450 at that time. I wouldn’t even take a 4060ti 8gb for free.
I hope you have a capable monitor for a 5090 or else that’s an easy way to waste $2000 for just a gpu.
@@puffyips Bought for 379£ and its a temporary gpu. Monitor oled 240hz 2k
Does FSR produce the same frame rates with 8gb or 16gb given that video memory wouldn't be an issue?
Thank you for showing us 8gb cards are 1080p cards.
That was actually my takeaway too. But I didn't see that as a bad thing, from the results I just thought well I guess I should just default to 1080p on modern games on my 3060ti.
@@sanderbos yeah, I wish you could push ULTRA on 1080p tho. That way you can drop to high and enjoy 70+
@@sanderbos 3060 Ti does fine on 1440p on medium/ high settings. Why expect top of the line performance from a mid range card?
@@oliversmith2129 i mean i had a 3070 and had to lower settings in hfw in 1440p cause it was using too much vram, even on dlss balanced
@@userblame632 HFW gives me 60+ fps in most areas at 1440p high settings (texture, 4x anisotropic, level of detail, etc.) with DLSS Quality. It drops below 60 in some high populated areas sometimes. This is a mid tier card, not meant for ultra 1440p gameplay.
Reduce textures and problem is fixed. No idea why its so difficult. Seems many just want to play at Ultra or the game is not worth it.
Better yet, don't play new AAA titles and save money on both games and hardware. You may even avoid all the bugs new games are released with these days.
What this shows is that the slower cards don't really need a lot of VRAM. They are to slow anyway. A game with Ultra textures and the rest at medium to low don't look that good anyway.
people basically gift money to Nvidia, buying over-expensive GPU's.. and you expect them to play a terrible optimized ass game with HIGH settings?
how dare you!1!1!1
is ultra or ultra, and 4K since I NEED to see that little wood texture of that tree that is 10 meters away of my character, or that water reflextion, I pay money to see that, dont care about gameplay
Ghost of Tsushima/Forbidden West are both terribly optimised games when it comes to Vram utilisation. I think its more of a Nixxes problem.
can you do a 4060 8gb vs 306012gb comparison?
I just bought an XFX Speedster Qick 7800 XT to upgrade from my 3070 Ti. The VRAM was the primary reason for doing it.
weird ´´upgrade´´
@@groenevinger3893 depends on the prices for both. used 50 $ more wouldnt be bad
There is no sense upgrade. 7900gre would be better or Evan higher 7900xt or IMO The best 4070Ti Super
13:55 I don't think it's a run to run variance, the 8gb card is consistently pulling 10% less power so it's still being throttled somehow (and it's not temps).
Also, if you look at both scenes, you can actually notice more details on the right in some cases. I think it's doing some compression to 'keep up' with the other card, if you look carfully you see a 1gb allocation difference in vram after all so there's definitely something happening.
With avatar, some of the biggest differences I see are coming from the dust clouds. I also see some measure of difference in soil/plant textures, but those didn't feel as pronounced.
I got 7900 xtx it's always interesting seeing how much vram being used at 1440p maxed out. 14 gb in ghost of tsushima maxed for me.
Tried to get my hands on 4060ti 16gig Unfortunately the price was way above msrp and only 8gig was available. So I got a new 3060 12gig.
i mean if u playing 4k at ultra with 8 gb vram card dont blame the company blame ur self for being stupid
That's just to illustrate the issue, but it gets a lot worse than that, many games downgrade the texture packs, so you're not getting the visual quality that you're supposed to get, you just don't notice a performance dip, in fact since most reviewers are looking at frames per second only, 1% lows etc, they miss the fact that the game looks like s***. And that's at 1080p/1440p...
In today's and yesterday's 2023 games, you shouldn't be paying 500 to $600 for 8 GB of RAM that will end up giving you a subpar experience. We're not talking about just framerates here, we're talking visual quality.
In addition, you shouldn't be paying a used car price for video card that the company wants you to throw away in 2 years
That is absolutely on the manufacturer
There's an old word for that, it's called usury
Don't blame the customer just because they're the ones who are trying to have a better time with their very hard-earned dollars, in a market where they cannot have what they actually want. This is monopolist behavior
I was worried about vram when the 40 series released but now....
All I care is gameplay, I dont want eye candies and lower my settings even if it is too overkill so that I can actually see the important things happening and react on time. (4070 ti)
Why would anyone spend top dollar for 4k monitor then buy a 60 or 60 ti card? 🤡
That was a really well made analysis. 👍👍 Regarding the screen mode used for these tests, were you using Fullscreen or Borderless? There are some games that can stutter in either of those modes, depending on the game itself and the engine they are running on. One of the weirdest examples that I know of is Dead Island Remaster along with DI: Riptide Remaster. On 3090 RTX 24GB VRAM, both games have stutters in Fullscreen, but run super smooth in Borderless or Windowed, which is kinda odd. For comparison, they both run flawlessly on a 680 2GB VRAM and 1080 8GB VRAM, regardless of the screen mode. 🙂
This is showing that 8gb vram is only for 1080 with medium or lower 1440. Need 12gb for higher settings in 1440. 1440 is what games should being run since even 6yrs ago. The AMD 7000 and the Nvidia 40xx should have had 12gb vram as the base line.
4050 series should have been the only one with 8GB
4060 12
4070 12-16
4080 16
4090 24+
AMD cards have more Vram because that's about the only thing going for them against Nvidia. If Nvidia gives the same amount of Vram as generously as AMD does, no one would be buying AMD cards.
Even for 1080p in some demanding games if u put textures and settings in higher quality, ur gonna run into problems. 8GB Vram has been there since about 2015 and its showing age now. 11GB Vram should be minimum by now.
I can attest that my RTX3080 10GB would run out of VRAM with Horizon Forbidden West if use any texture setting beyond medium if I have the Framegen mod turned on even with DLSS Quality on just 1440P.
My almost 10 years old R9 390 has 8gb vram. Only scumbag Nvidia uses vram to limit the potential of their GPUs to run at higher resolutions. Sadly the future 5060 will never be a 1440p gaming GPU. We will never move to 1440p standard because of Nvidia.
nVidia sure is hilarious
I want to upgrade from my it-set-itself-on-fire-and-this-is-not-a-joke GTX 1080, but I don't want to downgrade VRAM and I'm sure they'll figure out how to do that.
_Introducing the RTX 5090 Plus Ti Super X-treme (3.2 GB Edition)_
people laughs and meme on 4060 ti 16gb for 4k gaming. but its actually pretty capable with DLSS and frame generation.
i thought ppl meme on it for its price
@@eclxysThey do, it’s meme’d because the 4070 makes absolutely more sense once you look at the price/performance. Base model 4060 is a good card, could still be $20 cheaper IMO.
Your grammar and opinions are laughable.
I'm playing GoT on my 6gb 1660 super game actually runs great 70-80 fps on med-high with FSR. I was on a 1060 3gb before and it was impossible to load textures on majority of newer games even the going from 3gb to 6gb was pretty big.
Nvidia is really ripping their customers off with these low VRAM cards.
and amd with those drivers and stutters pick ur poison
@@erisium6988 Drivers and performance are fine. Fanboys are really childish and don't contribute anything meaningful.
@@erisium6988 "me when I lie" :P
@@joshmonus ive only had crashes with my 7900xtx in two games that i play quite a bit since swapping from an nvidia card thats counter strike 2 (driver timeout issues) and team fortress 2 (which just randomly closes sometimes when connecting to a server) which im trying to figure out if its my ddr5 - 6000mhz ram or something else but other than that the software and amd seems pretty good no stutter when paired with my 7950x3d cpu
Great video, thanks for the information overload. My personal takeaway is that for my 8GB 3060ti I should think about trying out 1080p with higher settings besides what I usually do (what I usually try first now is 1440p with high settings, and then DLSS Q if that does not give me 60fps, but 1080p@60fps is fine for me as well, and as this video points out DLSS takes up memory too).
Having enough Vram is not about framerate, it never was about framerate, it's about immage quality.
You can have a game playing fine at 60fps while looking like trash because there's not enough Vram to load assets... wich is maybe worst of having lower framerate.
What are you talking about? Insufficient VRAM can affect both the framerate and the image quality.
@@WrexBFOn any modern game engine it will affect immage quality before making any difference in framerate, that's my point.
Making an entire video just looking at framerate is basically useless.
Daniel: How bad is 8GB of VRAM in 2024
Me: enjoying the video with a GTX 1650
Now 16GB is a must or you will hit easy 12GB in all new games and 8GB is not sufficient.
I hate that modern gaming makes that a necessity. Too many just throw in 4k textures so we can see every mole on a character's face.
1080p 12gb is enough in 99,9% of cases. If you play at 1440p or even 4k with a card that has 12gb VRAM, its obviously not a high end card bc then it wouldnt have 12gb VRAM. So what that means, is that you can use DLSS. You SHOULD use it bc you get a massive amount of performance for almost no impact to image quality, sometimes DLSS upscaling is even better than native rendering with TAA anti aliasing.
dont forget, as soon as you upscale, the actual rendering resolution is what dictates how much VRAM you use, not the upscaled output resolution. And in new games, where you can run into problems with 12gb, you will very likely have DLSS. At 1440p and 4k resolution DLSS is really a godsend.
this doesnt excuse low VRAM offerings but its a realistic outlook of real world usage. UE5 gams with nanite, lumen, path tracing etc. are so freaking intensive, you will need upscaling anyways, so VRAM is less of a concern than when rendering native 1440p or especially 4k.
8gb of vram has been DOA for many years
Well, blame nvidia that refuses to give more vram even though people have been demanding it.
@@mrman6035 those games ain't even fun and just milk gamers for money in the way of DLC. Get a game from 5 to 10 years ago it is just way more fun to play.
Great video as always, have you ever tried using Nvidia Broadcast to prevent annoying background audio on your mic during recording?
1080p and no point to complain about 8gb vram most of the time...
BS. I tried Hogwarts Legacy on 1080p with an RX 6700XT 12GB. The game was eating around 9-10GB of Vram. Also Resident Evil games would eat huge Vram if textures and settings are pumped up. Ye 8GB Vram in 2024 is just garbage.
“most of the time…”
@@hircine92h yeah, that's called terrible optimization, there's even people with 24Vram GPUs that say the game consume almost all of it
and yet people playing this game with a rx6600 1080p ultra settings at 54-60 fps no problem
so no, 8GB Vram in 2024 is not garbage, is the AAA industry and their lack of optimization in their games... or they did that on purpose so people waste in more GPU's, since companies dont find it profit if someone stay with his GPU for more than a decade
@@hircine92h It had memory leaks which was fixed + you just do not need to use max settings every time... BTW. Nividia has better compression for vram so 8gb =/=8gb.
I am playing cyberpunk 2077 with max texture settings without problems on 6gb vram 🤣
Nowadays every game should have automatic texture swap, when vram is full like in the Avatar game, to bad it is not a case.
@@hircine92h allocation is not equal to usage, for example I've had 16gb ram for thr longest time and the game I always play uses around 12gb then I upgraded to 32gb and the game now uses 17gb of ram but still performs the same.
Cyberpunk test is actually misleading. 8gb cards easily run out of vram when playing even with simple raytracing, overdrive hits even harder in some areas or after some time, but thats for thw NC without PhantomLiberty. Dogtown just destroys 8gb cards with raytracing, overdrive is unplayable mostly (tested with my 3060ti). Even 12gb is not really enough for dogtown (my 4070super runs out of vram there sometimes).
They shouldve added a demo run for Dogtown really.
Nvidia will add VRAM generation for 50 series
Deep learning video memory.
Lmao
Yup people were complaining about the regular 4060 for almost no reason. It’s a 1080p gpu and only has issues if you try to game in 4k with it
Frame gen was such a scam.
if you dont care about ray tracing, 8gb is fine. i have a vanilla 3080 (10gb) and i often have performance drops in rt games due to vram the longer i play, restarting the game brings my fps back. I have 1440p monitor... and it doesnt matter if i use dlss quality, balanced, performance, because the final resolution is the same so the presets use the same 1440p assets... i think 12gb is the bare minimum these days if you want a little bit of every graphic tech this gen has to offer (ray tracing, upscales, frame generation, high or ultra textures, 1440p or 4k etc). 1080p is the new 720p.
using a 4060 ti for 4k gaming...
It is like using 800kg car to town 4000kg trailer…
depends on what game
If you only have a 24" monitor 1080p, what is the best GPU and CPU?
Who in their right mind would buy a 4060ti 8gb to play at 4k very high settings and then complain about vram issues?
People expecting at least console-like performance from a $400 in 2024 (consoles are past mid-cycle BTW).
The 16 gb version is doing just fine. That's the point. There shouldn't be a 8 gb version.
@@lupintheiii3055 consoles dont run these games at 4k. in some its sub 1080p.
Did you watch the video? He adressess this very early.
@@lupintheiii3055 Are consoles running at 4k very high settings native?
The 3070 would be my last 8gb card, won't upgrade until I got atleast double the vram in the 70 series.
Nahh bro, my 1060 with 3gb is enough!!!!
Literally not a big deal. You can't even tell texture settings UNLESS you sit up close to your panel and nitpick.
Maybe enough for Gta San Andreas.
Funny thing about games is that many play old titles, and your 1060 will be perfectly fine for older titles like Battlefield 4 and will perform pretty incredible
I don't judge people for having old gpus, some don't even care about modern titles, and play DOTA 2 and old RTS style of games
Having a powerful GPU may actually cause issues in older titles
I have 4090 and Battlefield 4 never utilizes 90% of my GPU, which in turn forces my CPU to work more. Because CPU is not fast enough for the gpu, and the title not being demanding enough to drive the gpu creating some issues in frame times.
You want that 90% sweet spot somewhere, with low CPU usage
I play older titles, and yet I have 4090 and 13900k. I was doing just fine with RTX 3070 in BF4 that I play to this day
It's the Unreal 4 and Unreal 5 engines that take a toll on your gpu. They are not optimized for responsiveness, but mostly an engine that looks good and has potential for beauty, at the cost of the performance. Unreal 5 is brutal on GPU, even 4090 struggles to keep 160 fps in most titles consistently
i have a budget rig with a 780 ti and it still amazes me in some games. 3gb gddr5
@@4m470 depends on the game, in cyberpunk and rdr2 there is very big diff, but there are some tricks to improve that
I think the better way to do rhis would be to change just the texture/VRAM intensive settings