VRAM - is 8GB or 12GB really enough to play Games in 2023?

  • Published Oct 2, 2024

COMMENTS • 1.2K

  • @MarcoGPUtuber
    @MarcoGPUtuber 1 year ago +935

    I still run 4 GB of VRAM. It works so well. All my games run flawlessly on my 800x600 CRT!

    • @philscomputerlab
      @philscomputerlab 1 year ago +143

      4MB Voodoo is all you need ☺

    • @MarcoGPUtuber
      @MarcoGPUtuber 1 year ago +52

      @@philscomputerlab that's right! My Voodoo 3 2000 should be more than enough!
      Now time to play some DOOM!

    • @Po5itivemind5et
      @Po5itivemind5et 1 year ago +5

      @@MarcoGPUtuber loooool

    • @pf100andahalf
      @pf100andahalf 1 year ago +37

      640x480 is where it's at.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago +18

      @@philscomputerlab Yeah, mine served me well playing the original Half-Life at 640x480 on a 14 inch monitor. 🤩

  • @MovoSt
    @MovoSt 1 year ago +178

    A follow-up test at 1080p Ultra and 1440p Low/Ultra would be great.

    • @matthewIhorn
      @matthewIhorn 1 year ago +6

      Exactly!

    • @dgonsilver-fs5gf
      @dgonsilver-fs5gf 1 year ago +3

      100

    • @SomeSloan
      @SomeSloan 11 months ago

      Would be great to see a vid on VRAM usage at these resolutions! Based on this video's results, I think it would be safe to assume that if you want to play games at high-ultra settings at 1080p without worry, 10-12GB of VRAM should be a safe bet.

    • @upfront2375
      @upfront2375 8 months ago

      @@SomeSloan For today, 8 is enough at 1080p. For years to come, you'd need to either turn down the textures OR upscale anyway to stay at 60fps, both of which will greatly lower the VRAM usage. There are 2 reasons why the 3060 Ti-4060 have only 8GB. 1. It's enough for their performance in the lower-res, high-fps use case. 2. So people who need more go buy much more expensive cards. The 12GB on the 3060 and 16 on the 4060 Ti aren't useless overall for some creative scenarios, but for gaming?? It's gonna be exactly like the 4GB 750 Ti... like a big D on a catholic priest! lol

  • @wireless1235
    @wireless1235 1 year ago +121

    I think there needs to be a part 2, which includes the impact of specific settings on VRAM usage and also includes the use of lower-tier cards.

    • @dreamcat4
      @dreamcat4 1 year ago +5

      If there is a part 2, it would be nice if Bryan could bring up a table showing which of the games being tested are re-releases from the latest console generation (which has *almost* 16GB of addressable VRAM, minus the game and the operating system... so more like 14GB max),
      versus modern game releases which are either originally last-generation console releases or PC-only releases. Do such games even exist anymore? I mean, ever since the Xbox division purchased pretty much the entire AAA games industry (near enough).
      And how much addressable VRAM did the last-gen consoles have? The PS4... ah, only 5.5GB, about the same for the Xbox One.
      Then the Xbox One X is 12GB max and the PS5 is 16GB max, minus the shared memory overheads.

    • @gozutheDJ
      @gozutheDJ 1 year ago +2

      bro, it's called do it yourself.

    • @simsdas4
      @simsdas4 1 year ago

      I second this; sure, you can run high settings, but expect to drop textures and shadows, for example.

  • @DJ_Dopamine
    @DJ_Dopamine 1 year ago +39

    I game on a 1080p display. Not having issues with 8GB. Anyway, I'm always happy to turn things down a notch (or two) from Ultra if necessary. The visual difference is usually marginal.

  • @sc337
    @sc337 1 year ago +136

    Hi Bryan, hope to see a follow-up video using real 6GB, 8GB and 12GB cards on the exact same games. I bet the VRAM utilization will be much different from the results of this video. Love your vids. Peace

    • @techyescity
      @techyescity 1 year ago +46

      No worries man, will definitely be doing that for you! This for me personally is a whole journey that I want to uncover and learn about. I will make this a whole series.
      However, starting out with the two 'unlimited' VRAM cards is for me a base case to then infer further data against.

    • @naturesown4489
      @naturesown4489 1 year ago +1

      There are a lot of channels and sources that have done those comparisons; the results are very similar.

    • @sc337
      @sc337 1 year ago +3

      @@techyescity Much appreciated Bryan. Keep up the good work! 👍👍

    • @laszlodajka5946
      @laszlodajka5946 1 year ago +3

      Yeah. I have a 10GB 3080 and The Last of Us warns me about it when I push all the settings up to Ultra, but it still runs well. So 10GB may still fall into the OK zone for now. It may be interesting to see what settings you can get away with on less VRAM.

    • @Peter.H.A.Petersen
      @Peter.H.A.Petersen 1 year ago +2

      @@techyescity Also, I don't think even a 3070 Ti could run 4K Ultra with ray tracing on at proper frame rates even if it had 24GB, so isn't it irrelevant whether it has enough VRAM for that if it can't do it anyway?

  • @SuperConker
    @SuperConker 1 year ago +40

    This is what I think nVidia should have done for VRAM on the entire 4000 series:
    -4050/4050 Ti: 12 GB
    -4060/4060 Ti: 12 GB
    -4070/4070 Ti: 16 GB
    -4080: 16 GB
    -4090: 24 GB
    Basically not a single model with under 12GB of VRAM.

    • @Terry1212
      @Terry1212 1 year ago

      The 4070 and 4070 Ti have 12GB of VRAM

    • @SuperConker
      @SuperConker 1 year ago +10

      @@Terry1212 I know, I'm just saying that's what nVidia SHOULD have done with the VRAM.

    • @SherLock55
      @SherLock55 1 year ago +4

      What's the point of 12GB on a 4050? LMFAO, it's not even fast enough to run at the higher resolutions and settings where it would be needed.

    • @SuperConker
      @SuperConker 1 year ago +3

      @@SherLock55 The 4050 would perform about the same as the 3060, which has 12 GB of VRAM. The 3060, in turn, performs about the same as the good old 1080 Ti (with its 11 GB of VRAM). There are already games out in 2023 that can eat through 12 GB of VRAM at 1080p (and 1080p is not even a high resolution). So releasing new cards in 2023 with as little as 8 GB of VRAM is a joke. Starting the lower-end models at 12 GB is perfectly fine.

    • @SherLock55
      @SherLock55 1 year ago +3

      @@SuperConker The only games eating 12GB of VRAM at 1080p are unoptimized pieces of trash not worth playing, don't get it twisted. Just because some devs are lazy or incompetent doesn't mean you actually need that much VRAM at such a low resolution.

  • @philscomputerlab
    @philscomputerlab 1 year ago +22

    For Windows XP retro gaming, less is more. Some games have issues with large VRAM; best to have 1 GB (GT 710 FTW) 😅

    • @Rabbit_AF
      @Rabbit_AF 1 year ago +1

      What if video card companies made cards that shared system memory again? I was a bit thrown off when an S3 Graphics card I was testing was doing this. ATI had a feature like this called HyperMemory.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago

      I've still got a 256MB 7600 GT for all that retro malarky! 🥰

    • @devilzuser0050
      @devilzuser0050 1 year ago

      I sold a GTX Titan (6GB VRAM) because Heroes 2 & NFS2SE wouldn't start on it under XP.

    • @necuz
      @necuz 1 year ago +1

      @@Rabbit_AF That's exactly what the Windows Video Memory Manager is doing; that's why games typically just run really poorly instead of crashing when you run out of VRAM.

    • @bryanwages3518
      @bryanwages3518 1 year ago

      @Rabbit_AF AMD Vega cards can do this. It's called HBCC. You can expand your VRAM with your system RAM.

  • @f.ferenc88
    @f.ferenc88 1 year ago +5

    1080p, then 4K? Where the fuck's 1440p??? That is today's golden standard...

  • @pkpnyt4711
    @pkpnyt4711 1 year ago +82

    I think what we're kind of missing with this test is that we're using the top 2 high-end cards from red and green. These cards have the highest bandwidth and throughput compared to the mid and lower-end cards. They might be fast enough not to have to keep as much in VRAM, because they are fast enough to deal with it. How would VRAM usage look with a 4070 Ti and a 6950 XT, or even a more midrange offering? I'm not so sure, but it's a legit question I have in mind.

    • @pf100andahalf
      @pf100andahalf 1 year ago +10

      Faster cards don't use less VRAM.

    • @SirJohnsonP
      @SirJohnsonP 1 year ago +8

      Yeah, it can only get worse with those. But Nvidia is now focusing on AI GPUs: Musk ordered 10,000 units for Twitter, and OpenAI ordered more than 30,000. So now they have a perfect excuse to lower PC GPU production and focus on the AI GPU market... Too bad latex and leather fans are still buying Nvidia GPUs, giving them one more reason to produce 8GB $500 GPUs in 2023.

    • @standarsh8056
      @standarsh8056 1 year ago +6

      Not how it works. Faster memory = more frames, but if you lack the memory buffer in the first place it will still tank performance.

    • @sc337
      @sc337 1 year ago +3

      From what I observed, some games use less VRAM on lower-VRAM cards. For example, in the same game with the same settings, a 4GB card may show 3.5GB utilization while an 8GB card may show 4.5GB utilization. I think it makes more sense to test the games with real 6GB, 8GB & 12GB cards. Anyway, still appreciate Bryan's effort.

    • @pf100andahalf
      @pf100andahalf 1 year ago +5

      @@sc337 VRAM will overflow into system RAM. In your example of a lower-VRAM card using less VRAM, it's using a hell of a lot more system RAM.
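
A quick way to sanity-check that overflow claim on an NVIDIA card: a minimal Python sketch, assuming the nvidia-ml-py package, that logs dedicated VRAM usage once a second while a game runs. Spill into system RAM shows up separately, e.g. as "Shared GPU memory" in Windows Task Manager.

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"dedicated VRAM: {info.used / 2**30:.2f} / {info.total / 2**30:.2f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```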

  • @CameraObscure
    @CameraObscure 1 year ago +5

    This test was almost meaningless. Using cards with plenty of VRAM only shows the maximum VRAM the game would load into the VRAM buffer. A better test would be two equivalent cards at each resolution; a good comparison would be an RTX 3070 vs an RX 6700 XT, an 8GB versus a 12GB card, to see how frame times and texture quality differ at 1440p and how the VRAM limitation actually affects what each card can do with Ultra/High textures. That is where you will really see the difference for the higher requirements of newer games going forward. It seems that 4K textures are going to be the norm for many newer games, with no lower texture packs for lower resolutions. Upcoming OS changes, such as streaming textures directly from storage, will supplement lower VRAM capacities when they eventually make it into the OS. It's not as simple as you're making out in this video.

  • @Kapono5150
    @Kapono5150 1 year ago +278

    So happy to see Nvidia users stand up for themselves on the 4070. Even fake frames don't get them to open the purse.

    • @LeJimster
      @LeJimster 1 year ago +74

      Honestly, the frame generation and even the upscaling tech feel scammy to me, especially when they're advertising it in their benchmarks. I much prefer running my games at native resolution.

    • @Verpal
      @Verpal 1 year ago +55

      @@LeegallyBliindLOL TBH I do know some people prefer native with jaggies, but I feel most people who claim upscaling is a scam are simply saying that because FSR 2 isn't remotely competitive for now; they don't hate the tech, they just hate NVIDIA.

    • @LeJimster
      @LeJimster 1 year ago +14

      @@LeegallyBliindLOL Well, I can't use DLSS because I'm on AMD. But I'm pretty sure both DLSS and FSR have weird ghosting issues in motion and also strange artifacts. I only use FSR for performance reasons, and at the resolutions I'm using it at I notice a big visual fidelity drop (edited, because of brain performance failure). DLSS may be better, but I still think it's getting to the point where they aren't producing faster GPUs but faking performance through these techs and artificially locking the software to newer cards. The only way I would like to use this tech is in reverse, since taking a higher resolution and downscaling it produces a crisper image.

    • @mr.obeydoge5266
      @mr.obeydoge5266 1 year ago +1

      Only one thing I can say: these companies don't care about us, and I don't know why people bother defending them, but I digress. Here is the main thing, people: fake frames are fake frames. Native is definitely the way to go, since it truly measures the raw capability of the component. There, I said it. Don't allow them to control the market for so long, and don't support their bad habit of putting insane prices on products that should be sold at reasonable prices.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago +18

      @@LeJimster So, in reality, you don't have a real-world reference point. From my experience, even at 4K, FSR is noticeably worse (usually blurry), and I don't notice any artifacts with DLSS unless I use Performance mode in some titles. I modded DLSS into RE4 and it was a night-and-day difference. You can believe me or not, but in the end YouTube doesn't convey the differences well enough.

  • @Beezzzzy_
    @Beezzzzy_ 1 year ago +37

    Any reason 1440p is left out? I think this will be a good database to put together. Nobody really discusses VRAM. The 1080 Ti, released 6 years ago, is still a solid card because of its 11GB of VRAM, and we still have cards coming out with 8GB or less. 12GB should be the baseline sold in 2023, not 6GB-8GB anymore.

    • @HxR-eSports
      @HxR-eSports 1 year ago +1

      Yeah, there's a reason. It would have shown results that went against his agenda.

    • @edeka3
      @edeka3 1 year ago +1

      @@HxR-eSports Do you think 8GB is enough to run at 1440p or 1600p? A little future-proof?

  • @peterkeller7880
    @peterkeller7880 1 year ago +25

    It's great to see you do this; it was needed. Not everyone games at 4K Ultra. This helps people make an informed decision when getting a new GPU, especially when coming from an older generation of cards. Thank you for your hard work.

    • @thejollysloth5743
      @thejollysloth5743 1 year ago

      I'm gonna grab a 16GB RX 6800 off eBay for £350. Since I don't give a toss about RT, it will last me at 1080p until the next console generation comes out.
      I don't mind having to turn down a couple of settings like shadow quality, fog and other weather effects in 5 years, as long as I get the top levels of anti-aliasing, render distance, and texture quality.
      I've got a feeling a used 16GB RX 6800 will be fine for that at 1080p for many years to come.
      I also think that a Ryzen 7 5800X will be more than good enough for 1080p ultra or high settings for 5 years or so, and they are so cheap now with a decent B550 board and 32GB of 3600 CL16 RAM.
      1080p is fine for me. I only use a 24 inch screen, so I don't really notice the pixelation I would with a 27 inch or larger screen.
      A lot of my friends still play CSGO at 900p to get the most FPS they can and the lowest latency. And these are pro-level players with 360Hz or higher monitors, downscaled from 1080p to 900p.
      And I really can't notice much of a difference between 1080p and 1440p, but that could just be me... or the fact I'm so used to that resolution.

  • @Rizzlas
    @Rizzlas 1 year ago +13

    I have an 8GB 3070, and I can confirm that with Hogwarts Legacy at 1080p Ultra settings I have some trouble keeping gameplay smooth all the time (I have to use an optimization tool to do it), but my friend has a 3060 with 12GB of VRAM; he runs it at high-ultra settings and gets a much smoother experience than me.
    Pretty sad to be honest :/ I'm thinking about maybe selling my 3070 to get an AMD card of the same grade with more VRAM :)

    • @95928225
      @95928225 1 year ago +5

      I sold my 3070 for $400 and bought a new 6700 XT and it is way better. No VRAM crashes like in RE4, Deathloop, Forza Horizon, Hogwarts, The Last of Us.

    • @Rizzlas
      @Rizzlas 1 year ago

      @@95928225 Yeah, but I do a lot of rendering in Adobe Premiere, and losing CUDA acceleration is not an option.

    • @simon89oi
      @simon89oi 10 months ago

      @@Rizzlas A 2080 Ti should fit your needs then.

  • @Thezuule1
    @Thezuule1 1 year ago +42

    I use my GPU for VR, and even with the relatively low memory requirements of most VR games, you still need to render above 4K; even 12GB is likely not enough now, and it certainly won't be in a few years.

    • @gozutheDJ
      @gozutheDJ 1 year ago +5

      VR is its own thing.

    • @darkkillex7220
      @darkkillex7220 1 year ago +1

      Same here. I got a 3080 thinking I would be able to easily run VR with it, except I bought it back when they still had only 10GB of VRAM, and as soon as I try to run any game that's not a dedicated VR game in VR, I'm just limited by the VRAM.

    • @anthonylong5870
      @anthonylong5870 1 year ago +2

      Bro, VR is at best 1080p lol, most is only 720p per eye... You're not rendering 4K.

    • @Bos_Meong
      @Bos_Meong 1 year ago +4

      @@anthonylong5870 Actually it's very close to 4K: the Quest 2 has a resolution totaling 3664x1920 vs 4K's 3840x2160. Despite all of this, HL: Alyx only consumes 7GB of VRAM with every setting maxed out. It's all about how optimized the game is; we need to stop supporting shitty ports.
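
The pixel math behind that comparison, for reference (panel resolutions only; VR compositors typically render at an even higher internal resolution than the panels to compensate for lens distortion):

```python
quest2 = 3664 * 1920  # Quest 2 combined panel resolution, both eyes
uhd4k  = 3840 * 2160  # 4K UHD
print(f"{quest2} vs {uhd4k} pixels -> {quest2 / uhd4k:.0%}")
# 7034880 vs 8294400 pixels -> 85%, so the panels carry ~85% of 4K's pixel count
```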

    • @Thezuule1
      @Thezuule1 1 year ago +2

      @@anthonylong5870 Each eye is more than 1080p, dude.

  • @timberwear369
    @timberwear369 1 year ago +54

    I definitely would like you to include 1440p High settings. You only tested the two extremes, 1080p vs 4K and Low vs Ultra. 1440p High makes much more sense to me, though maybe with the highest texture settings.

    • @MsNyara
      @MsNyara 1 year ago +2

      Aiming for a high frame rate at 1440p Ultra tends to end up the same as the 4K UHD High/Ultra 60FPS discussion.

  • @TheSakrasta
    @TheSakrasta 1 year ago +5

    I did not expect the impact of resolution on VRAM usage to be that small compared to the quality settings.
    It might be very interesting to have a bunch of tables for some of the newest titles showing VRAM usage at 1080p/1440p/4K and low/medium/high/ultra settings, because a lot of games look very similar at High instead of Ultra. With such a table you could find the sweet-spot settings for your personal VRAM amount. If dropping from 1440p to 1080p only saves you 1GB of VRAM, but going from Ultra to High saves you 2GB, you would most likely have a better-looking game at 1440p High compared to 1080p Ultra, while also using less VRAM.
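
One likely reason resolution moves the needle less than the quality preset: only the screen-sized buffers scale with resolution, while the texture pool (usually the biggest consumer) scales with the settings. A back-of-envelope sketch, assuming a deferred renderer with five render targets averaging 8 bytes per pixel (illustrative numbers, not measurements from the video):

```python
def render_target_mib(width, height, bytes_per_pixel=8, num_targets=5):
    """Memory for the resolution-dependent buffers (G-buffer, depth, etc.)."""
    return width * height * bytes_per_pixel * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mib(w, h):.0f} MiB")
# 1080p: ~79 MiB, 1440p: ~141 MiB, 4K: ~316 MiB -- a few hundred MiB at most,
# while an Ultra texture pool can run to several GiB regardless of resolution.
```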

    • @angrydragonslayer
      @angrydragonslayer 1 year ago +3

      The textures are still the same quality; you just show less of them on the screen.

    • @bigturkey1
      @bigturkey1 1 year ago

      Just use DLSS. I never have to turn down settings; I just change the DLSS settings.

    • @dreamcat4
      @dreamcat4 1 year ago +1

      Yeah, I agree. If somebody out there is making tables, include the primary platform each game was targeted for: whether a console (and what usable VRAM that console had), or whether it was a PC-only game. That is clearly useful information to include in such tables, since each new console generation is the common underlying reason for these escalating VRAM requirements.
      [edit] And let's hope the Sony PS6 will not come with more than 24GB of GDDR... or we will all be in trouble! Ha

  • @kasmidjan
    @kasmidjan 1 year ago +9

    Ngreedia can pay their investors with cheap leather jackets
    if they keep being stingy with VRAM.

  • @RobertJianu
    @RobertJianu 1 year ago +39

    I still run 4GB of VRAM, both on my GTX 1050 Ti PC and my RTX 3050 Ti laptop. The 3050 has decent performance (like the watercooled and overclocked GTX 1070 desktop a friend of mine has), but the lack of VRAM is starting to show. At least I don't game that much anymore. My next GPU will probably be from AMD though.

    • @Verpal
      @Verpal 1 year ago +5

      Desktop Ampere is already suffering from borderline insufficient VRAM, and yet for some reason NVIDIA decided to squeeze low-end laptop Ampere even harder. People who buy low-end stuff need it to last longer, yet NVIDIA decided to screw them in particular.

    • @sergeleon1163
      @sergeleon1163 1 year ago

      Yeah, I was on a GTX 1050 Ti and the 4GB really started to limit me. I upgraded this week for €250 to an 8GB RTX 3070, and even though in specific games like those shown here it could be limited, I will drop settings down, as I'm aware the 8GB can be hampering (in the future), while in many games it will still be okay. But for those on a budget, both NVIDIA and AMD are charging gamers too much and forgetting about people on a budget.

    • @Killersnake432
      @Killersnake432 1 year ago

      I upgraded last month from a 1050 Ti to an RTX 3060. I was GPU- and VRAM-limited; now I can play the stuff I used to play far better and have the VRAM space for games like RE4 Remake, where I can use crazy high settings. I would have gone for the 3060 Ti, but that 8GB VRAM buffer turned me away from it.

    • @TechHarmonic
      @TechHarmonic 1 year ago +4

      I remember briefly having a 3050 Ti laptop and it got on my nerves fast. Even with older games, maxing them out at higher resolutions I would get horrible frame drops because the VRAM ran out. I returned the Legion S7, since it was $1k and felt super overpriced for that performance.

    • @RobertJianu
      @RobertJianu 1 year ago +1

      @@TechHarmonic Damn, I know how it feels. The 3050 Ti is decent for 1080p even with most recent games, but you can't go higher than 1080p or bump the graphics too high, because the VRAM will make your experience horrible. I kept it because I needed portability, and the good part is that I got it for around $500 at the time and it has a Ryzen 7 5800H, 16GB RAM, a 512GB NVMe SSD and a 10-bit 165Hz display. It's pretty good for my not-so-demanding games (FH5, RDR2, God of War, Sons of the Forest etc.) and media creation (mostly editing in Photoshop, since the screen has excellent colors, plus Sony Vegas and making documents), but I wouldn't recommend this GPU to a true gamer; 4GB of VRAM is just unacceptable. Always aim for at least a xx60-series card, since they age pretty well, or just go with AMD (lower prices and more VRAM than Nvidia).

  • @altun8310
    @altun8310 1 year ago +20

    Hi from Canada. Thank you for the video: a timely analysis, and I'll keep an eye out for your future ones on this topic. My suggestion is to add 1440p High settings as a benchmark; that represents the upgrade path for the majority of people who still play at 1080p. I've also noted in YouTube videos the difference in VRAM usage/allocation between Nvidia and AMD. It would be interesting if you could investigate and explain that!

  • @Art_Vandelay_Industries
    @Art_Vandelay_Industries 1 year ago +66

    What's crazy to me is that the graphical fidelity doesn't actually look that good, considering the requirements. I think optimization should be more of a focus nowadays. That would also help with the insane prices for hardware at the moment.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago +19

      The thinking seems to be that they develop for the hardware available on the new consoles and then just port it to PC... and hopefully patch it later!

    • @sven957
      @sven957 1 year ago +15

      They optimize for consoles, which have 16GB of combined memory.
      Yes, they COULD optimize it better, but that costs a lot of dev time and money, which they would rather invest into other parts of the game. That makes total sense considering consoles make up most of their revenue, although there are titles like TLOU that are actually REALLY badly optimized.
      But other than that, the only party to blame here is Nvidia, who decided to build planned obsolescence into their cards.

    • @grlmgor
      @grlmgor 1 year ago +3

      @@sven957 Well, if they don't optimize, then don't buy their game.

    • @sven957
      @sven957 1 year ago +4

      @@grlmgor Sure, if you don't want to play, uh, pretty much all of the major upcoming titles. Again, you can't blame the devs in most cases (you can in TLOU); 8 GB was first seen on a GPU in 2015. Nvidia did this to make you upgrade when their next overpriced generation drops.
      12GB right now is barely enough to max out games (if I'm paying that much for a fucking GPU, you'd better let me max those settings!), just like 8GB two years ago was barely enough. The 40-series cards will run into the same issues in 2 years at most.

    • @peterpan408
      @peterpan408 1 year ago +1

      At 1080p there is certainly a fidelity limit set by the pixel count that could be optimized for in the engine.

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +18

    I know I didn't spend hundreds on a new GPU to play at low settings, which is why I bought an RX 6900 XT. Reasonable price, 16 GB of VRAM; it was the best choice in my price range.

  • @Cogglesz
    @Cogglesz 1 year ago +8

    I'm rocking 8GB on my 3060 Ti; performance is fantastic for the price. The only cap I've noticed seems to be at 4K: Doom Eternal with the Ultra Nightmare texture pool (everything else can be 1440p Ultra Nightmare), and Forza 5 eats it all up despite being able to run at 120 v-synced. The game would pause and complain of low bandwidth (much more than on the Series X, funnily enough); I've wanted to just play at 60 with higher geometry to match the X, but it turns out I'll always bump into this issue. It's annoying that my 64GB of RAM is basically doing nothing (The Last of Us managed to use 21GB of it though, so GGs).
    Honestly, I think 12GB is the new 8GB card; midrange VRAM amounts seem to be inflating like our currencies. I feel some blame has to be placed on porting: RE4's medium textures are somehow worse than the PS4's, and it'll happily eat up an 8GB card.
    It's kind of sad when you have a lot of resources but what holds you back is 8GB of VRAM. Call me a boomer, but I always felt 8 would be perfect for gaming; until a few years ago we only really saw 12+ in professional cards.

    • @Toulkun
      @Toulkun 1 year ago +1

      It comes down to trash optimization too.

  • @masterkalel06
    @masterkalel06 1 year ago +11

    Hey Bryan. Any chance, in the VRAM tests, you can run the 1080 Ti and 2080 Ti against similarly performing 8 GB cards? Since you're all about used price-to-performance, I'm curious whether the 11 gigs make those cards perform better going forward.

  • @jqwright28
    @jqwright28 1 year ago +4

    I'd say you're right about games being designed for 4K and then scaled down. Also, games like the TLOU remake that are next-gen PS5-only titles are probably also designed for 16GB of unified system memory, or whatever it's called, so they will probably run best on anything that can give them 12-13GB on the GPU.

    • @gozutheDJ
      @gozutheDJ 1 year ago +3

      ALL games have been this way for a while. Doom 3 had a tier for the maximum-quality, uncompressed textures, and then all the lower quality settings were scaled down from there. That's why games don't look like literal mud on low settings these days.

  • @maxdema115
    @maxdema115 1 year ago +9

    Too few games tested (and one, TLOU, very broken) to have a reliable analysis. And it would have been great to see 1440p results, since the RTX 4070 has been designed for that target.

  • @bctoy2779
    @bctoy2779 1 year ago +6

    DLSS 3 Frame Generation also requires more VRAM. So with a 4070 Ti you can already run into a situation where the card can do 4K60 or better but runs out of VRAM and stutters.

    • @JustGaming24
      @JustGaming24 1 year ago

      It's not a 4K card though.

    • @brunoutechkaheeros1182
      @brunoutechkaheeros1182 1 year ago

      @@JustGaming24 So why the hell do people say the 4070 Ti beats the 3090? Wasn't the 3090 a 4K card? lmao

    • @JustGaming24
      @JustGaming24 1 year ago

      @@brunoutechkaheeros1182 More or less the same performance, but it's not considered a 4K GPU because of the 12GB of VRAM; the 3090 has double the amount.

  • @RFLCPTR
    @RFLCPTR 1 year ago +2

    VRAM usage, and how much is reserved by the game, adjusts to the amount of VRAM present on your GPU.
    You would notice that when testing with an actual 4 GB card instead of a 24 GB card...

  • @Hostile2430
    @Hostile2430 1 year ago +3

    I upgraded my GPU to a 6GB 1660 Super only recently and I already feel outdated: trying to run some current games at high settings, I exceed or consume most of my VRAM and suffer from stuttering.
    Crazy to think that just a few years ago 8GB of VRAM was considered overkill, and now it's becoming the bare minimum requirement to run most modern AAA titles at acceptable framerates.

    • @zicksee0
      @zicksee0 9 months ago

      Dawg, a 1660 Super isn't meant to run games at high settings lol.

  • @ChiekoGamers
    @ChiekoGamers 1 year ago +4

    I'm still enjoying video games at 1080p high settings. I don't see the point of Ultra graphics.

  • @IamMarkSmith
    @IamMarkSmith 1 year ago +12

    In my opinion, Nvidia is using tech like DLSS to be able to pinch on their physical hardware specs, to not only charge more relative to the mindshare they have in the GPU market but also increase their profit margins all around. AMD is the closest they have ever been as competition to Nvidia with their current crop of RDNA 3 cards. If they can get the price-to-performance right on the forthcoming 7800 and 7700 models, we will see them make inroads into Nvidia's market share. We all win when there's competition in the marketplace. I'm not a fanboy of either company, but I am a fanboy of better value for my money.

  • @prosecanlik4296
    @prosecanlik4296 1 year ago +5

    This is ONLY for those singleplayer titles where you aim for 60 fps at high or ultra at 1080p or higher. I usually don't play those games, only multiplayer shooters and esports titles like CSGO, so for me 8GB of VRAM would do just fine. Planning to get an RX 6600 for that matter.

    • @danielkowalski7527
      @danielkowalski7527 1 year ago +2

      An undervolted RX 6600 eats only 80W ^^
      Idk why, but colours are way better on the RX 6600 than on my old 1650.

    • @prosecanlik4296
      @prosecanlik4296 1 year ago +1

      @@danielkowalski7527 Will try to undervolt it if I get it one day.

  • @YouOnlyLiveOnce...
    @YouOnlyLiveOnce... 1 year ago +5

    Good data. Please include 1440p settings next time.

  • @sebastienhebert6457
    @sebastienhebert6457 1 year ago +1

    New to your channel and I love it. You hit the sweet spot: pragmatic, technical, useful information. Thanks for that last part on 1080p high settings.

  • @HyperBawl
    @HyperBawl 1 year ago +12

    Amazing content as always! I'm sure my 6700 XT will last loooooong.

  • @BigLadGreen
    @BigLadGreen 1 year ago +5

    New games are made to be unoptimised so you keep buying expensive upgrades. Gaming in 2023 is a scam. Old games are the future.

  • @barrysloas277
    @barrysloas277 1 year ago +4

    1% of gamers are playing at 4K. Where are the 1440p stats, the resolution more people are actually gaming at?

    • @Willbme4EVA
      @Willbme4EVA 1 year ago +1

      The gamer side of me really does not want to see grass swaying in the wind; I want to spot my opponents from a distance before they see me, take the shot and move on. The only time I would like to see a shadow is when an opponent is on a roof and casts a shadow on the ground below. Higher Hz, not the p's or K's, is my thing.

  • @IMUSTHCOOSEANEWNAMEB
    @IMUSTHCOOSEANEWNAMEB 1 year ago +2

    Yup, 12GB of VRAM at 1080p, and it doesn't even look that much better compared to games from 5 years ago. It's almost like AAA games these days are trash and severely unoptimized. Shocking.
    No big deal anyway; I play competitive games that run well at 1080p with 3GB of VRAM.

  • @WTBMrGrey
    @WTBMrGrey 1 year ago +32

    Nvidia is charging top dollar for their products, pushing DLSS, RTX, AI, Reflex etc., but skimping on VRAM. The RX 470 came with 8GB of VRAM, and how old is that now?

    • @Kryptic1046
      @Kryptic1046 1 year ago +9

      It's a pretty counterintuitive thing Nvidia is pushing. On the one hand, they really want to sell you resource-intensive features like ray tracing/path tracing, but then they don't want to give you enough VRAM in the mid-range to actually use them along with decent textures. DLSS can only do so much. In the near future, you'll probably have to choose between high textures with RT off or lower textures with RT on. You simply won't get to do both, due to VRAM constraints in newer games.
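
To put rough numbers on that texture trade-off, a back-of-envelope budget (all illustrative assumptions, not figures from the video): a 4096x4096 BC7-compressed texture costs about 1 byte per texel, and a full mip chain adds roughly a third on top.

```python
def texture_mib(size=4096, bytes_per_texel=1, mip_overhead=4/3):
    """Approximate VRAM for one block-compressed texture with a full mip chain."""
    return size * size * bytes_per_texel * mip_overhead / 2**20

maps_per_material = 3     # assumed: albedo + normal + roughness
resident_materials = 200  # assumed: unique materials streamed in at once
total_gib = texture_mib() * maps_per_material * resident_materials / 1024
print(f"~{total_gib:.1f} GiB for textures alone")  # ~12.5 GiB: past any 8GB card
```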

    • @NostalgicMem0ries
      @NostalgicMem0ries 1 year ago +3

      Wanna compare 3060/3070 performance vs the RX 470?

    • @WTBMrGrey
      @WTBMrGrey 1 year ago +6

      @@NostalgicMem0ries Well, obviously the 3060/3070 are a lot more powerful. That is the point, though: the 3070 is a decent 1440p GPU, but it only has 8GB of VRAM, which is pathetic. Even the 3060 has 12GB.

    • @mikeymaiku
      @mikeymaiku 1 year ago +1

      @@WTBMrGrey I guess you don't understand "why" it has 12GB.

    • @mr.ihabissa8442
      @mr.ihabissa8442 1 year ago +1

      @@WTBMrGrey
      The 3060 has 12GB of VRAM because of its 192-bit bus; they can do either 6 or 12, not 8. Regardless, the 3060 is a shit card even at 1080p vs the 3060 Ti/3070.

  • @destrike702
    @destrike702 1 year ago +4

    I currently have an 8GB RX 6600; it can run games on mid to high settings and I'm quite happy with the performance.

    • @SlowHardware
      @SlowHardware 1 year ago

      I just sold my 6600 XT and bought a Radeon VII: similar performance, just with 16GB of VRAM. I did it because I got the Radeon VII for $250 NZD 😅 and sold the 6600 XT for $350 NZD.

    • @destrike702
      @destrike702 1 year ago

      @@SlowHardware I hope I can find one at the same price. In my country even the second-hand GPU market is overpriced.

    • @SlowHardware
      @SlowHardware 1 year ago

      @destrike702 Oh, for sure, it's usually way overpriced for what it is. I just started bidding and got a good deal :) I'd just save a search on a couple of sites and check occasionally; one may show up cheap :)

    • @evilleader1991
      @evilleader1991 1 year ago

      @@SlowHardware What about power draw?

    • @SlowHardware
      @SlowHardware 1 year ago

      @@evilleader1991 I have a 1000W PSU, it's fine.

  • @kilmor3151
    @kilmor3151 1 year ago +1

    Should have just shown the per-process utilization, as the game's own allocation will be prioritised and anything else will be purged if possible.
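
On NVIDIA hardware, NVML exposes exactly that per-process breakdown; a minimal sketch, assuming the nvidia-ml-py package (some driver/OS combinations report no per-process figure, in which case usedGpuMemory comes back as None):

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Graphics (e.g. game) processes and their dedicated VRAM footprint, per PID.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = "N/A" if proc.usedGpuMemory is None else f"{proc.usedGpuMemory / 2**20:.0f} MiB"
    print(f"pid {proc.pid}: {used}")

pynvml.nvmlShutdown()
```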

  • @telekarma
    @telekarma 1 year ago +4

    Devs can use more hardware resources with new/current-gen-only titles, and this is what we get. From what I've seen, the delta between 4K Ultra and 1080p Ultra VRAM usage isn't too big, about 1-2GB. That doesn't bode well for lower-VRAM cards, even if they are otherwise fast enough.

  • @gozutheDJ
    @gozutheDJ 1 year ago +1

    You can really see what The Last of Us is allocating that VRAM to, which is something no one ever points out.
    If you just take a second to look at a swath of wall or ground texture: where most games will reuse the same texture over large swaths of space, that's not the case with The Last of Us. It's one of the most insanely detailed games I've ever seen. No two sections of wall look the same; tons of grime and overgrowth and filth. It's really incredible.

  • @HairyScrambler
    @HairyScrambler 1 year ago +6

    I think that for the vast majority of games worth playing, 4 GB will hold up for the near future, as long as devs optimize their games. It's a shame the 1060 3 GB is already starting to become unplayable at 1080p in games like 2042.

  • @trr4gfreddrtgf
    @trr4gfreddrtgf 1 year ago +6

    I don't think 12GB is going to last long at all; I wouldn't be surprised if it runs out in 2 years or so. I think 16GB is a much safer bet. Most people want their GPUs to last 4-5 years, and 16GB should do that just fine.

    • @paranikumarlpk
      @paranikumarlpk 1 year ago

      Yeah, 16GB for 1440p and 20GB for 4K is fine for 2 to 3 years... I really hate the 3080 10G for these latest games; it sucks even for 1440p, but people still argue 8GB is enough for 1440p for 5 more years lol, and they mindlessly support greedy Nvidia. They don't understand the quality of high textures and their impact on VRAM. How can they expect top-quality visuals to run on potato GPUs with 12GB or less VRAM?

    • @bigturkey1
      @bigturkey1 1 year ago

      12GB of VRAM should last you until they start porting PS6 games.

    • @pdmerritt
      @pdmerritt 1 year ago

      It doesn't matter all that much, imho. If you're like me and coming from a 1070 Ti, even the 4070 would be a huge uplift. If it only lasted for 2 years because of VRAM issues, I could sell the card with 1 year of warranty left on it (so for a decent price) and buy from the newer generation, which would hopefully have a better price-to-performance ratio than this disappointing generation.

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 1 year ago

      @@pdmerritt True. I don't think warranties carry over if you sell the card used, though; pretty sure it's for the original owner only. Might be wrong though.

    • @pdmerritt
      @pdmerritt 1 year ago

      @@trr4gfreddrtgf How would anyone know? Even if I purchased with a credit card and my name is on the receipt, all the person would have to say is that it was a gift. I also don't have to register for the warranty; the receipt will suffice.

  • @bledboost
    @bledboost 1 year ago +7

    There is still one way to play all the latest games even with 4GB of VRAM: just play at 720p! In many cases you will get a better experience playing at 720p High than at 1080p Low. This is especially true on gaming laptops, since the screen is smaller, so the difference in resolution is less noticeable.

    • @roadrash2005
      @roadrash2005 1 year ago

      I have a 4K TV, I can't go backwards lol. I tried, it was painful.

    • @sololoquy3783
      @sololoquy3783 1 year ago +1

      But you've effectively gimped your card at that point... so yay, Nvidia!

    • @bledboost
      @bledboost 1 year ago

      @@InnerFury1886 Well, you wouldn't need to play pixel-art games or even the average game at 720p, because they don't have high VRAM requirements. I'm talking about the latest high-end games that choke when you don't have enough VRAM. I'm just saying 720p is still an option to make those games very playable.

  • @SlowHardware
    @SlowHardware 1 year ago +10

    Bryan, can you do a video on the Radeon VII vs the 2080 in 2023? I'm curious how it stacks up now, with it having 16GB of VRAM.

    • @mateyv
      @mateyv 1 year ago +4

      Or the 2080 Ti vs the 3070.

    • @grlmgor
      @grlmgor 1 year ago

      A 2080 only has 8GB.

    • @Kryptic1046
      @Kryptic1046 1 year ago +2

      @@mateyv I've seen at least one channel (I don't remember which) comparing the 2080 Ti vs the 3070 in some recent titles, and the 3070 was getting trounced in some of the tests by the 2080 Ti due to the 3070's VRAM limitation. The 3070 was as fast as the 2080 Ti, until it wasn't, due to the 8GB of VRAM.

    • @bryanwages3518
      @bryanwages3518 1 year ago

      I have a Radeon VII and my cousin has a 2080. In most new games I can play at higher settings than he can, and I don't get the frame drops he does. We play a ton of Warzone and his frame times are awful.

    • @SlowHardware
      @SlowHardware 1 year ago

      @bryanwages3518 Thanks for the info, looks like I made a good choice getting one :)

  • @tomtomkowski7653
    @tomtomkowski7653 1 year ago +6

    We have had 8GB for so long that I would say it is obsolete. I mean, would you buy a new $400 GPU like the 4060 Ti and already be forced to lower settings at 1080p?
    12GB will very soon be the standard for 1080p gaming, and if you want 1440p with ray tracing then you should have 16GB, which should be the standard right now.
    12GB should be standard for sub-$500 GPUs, 16GB should be standard for GPUs over $500, and 8GB should be for entry-level cards under $150.

    • @klanas40
      @klanas40 1 year ago +1

      It should, but that doesn't mean it will happen soon.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago +2

      Yeah, buying a brand new GPU at current prices and having to reduce settings is a joke. At 4K I could understand it, but not at 1080p/1440p.

    • @Willbme4EVA
      @Willbme4EVA 1 year ago

      If we are making requests to GPU makers, that says a lot: they do not seem to be listening. But if they are? Give me a supplemental plug-and-play alternative for VRAM. Preferably a slot stuffer for Xmas.

    • @bigturkey1
      @bigturkey1 1 year ago

      12GB of VRAM should last you until they start porting PS6 games.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago

      @@bigturkey1 Depends; if you want Ultra settings, then no. 12GB will be for medium to high settings.

  • @slc9800gtx
    @slc9800gtx 1 year ago +2

    It would be cool if you tested at 1440p instead of going from 1080p straight to 4K. Most people do not game at 4K.

  • @nukedathlonman
    @nukedathlonman 1 year ago +5

    I was thinking most games would be optimized for 2K these days... Now, I know it's only a snapshot, and its accuracy has been called into question numerous times, but Steam's hardware survey does indicate 1080p is the most common resolution and is on a very slow decline, with the next large chunk being 1440p and growing strongly.

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar No, 2K is 2560x1440

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar 1080p is HD (or as some companies call it, "FHD")

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar Oh, you're going by cinema resolutions for the 2K labeling. Monitor manufacturers will use QHD or 2K to describe 2560x1440.

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar No, that's HD... or FHD if you're going by manufacturers, since they insist on calling 720p "HD".

  • @BeniBalak
    @BeniBalak 7 months ago +1

    FYI, I had to buy a 4090 to be able to run my new 2x4K PCVR HMD (Bigscreen Beyond), because my 12GB 3080 Ti was choking on VRAM at high quality settings, e.g. Alyx at Ultra. The GPU was able to keep a reasonable FPS but would choke on large texture files, which I found out using a utility with a detailed data overlay.
    Just FYI for the tiny minority of PCVR users who do need at least 16GB today :).

  • @buda3d2007
    @buda3d2007 1 year ago +16

    I use Blender, where VRAM is king on larger scene files. Once you run out of VRAM, your card might as well be a great sports car spinning its tyres, working 10 times as hard to get the job done when it would only need to do it once had it had more VRAM.

    • @furynotes
      @furynotes 1 year ago +1

      Even for character portraits, 12GB is recommended.

  • @sapphyrus
    @sapphyrus 1 year ago +2

    If someone's at 4K, they can shave off about 1-2GB by using Performance DLSS. It's what I have been doing with my 3070 and it has worked alright so far (High, even if not Ultra, textures) with newer games without RT.

    • @Lordssr
      @Lordssr 1 year ago

      DLSS 3 can save 4.

  • @darkkillex7220
    @darkkillex7220 1 year ago +27

    I've definitely noticed the VRAM on my 3080 10G being a limiting factor in quite a few games recently...

    • @greenbow7888
      @greenbow7888 1 year ago +5

      The 10GB 3080 could not even run Far Cry 6's HD textures within a month of the 3080's release.

    • @thomassmith9362
      @thomassmith9362 1 year ago +3

      Well, it is now almost 3 years old; you shouldn't be expecting to max out games on it. I'm getting along nicely with that card: 1440p at High in The Last of Us works great.

    • @Bos_Meong
      @Bos_Meong 1 year ago

      Try running MSI Afterburner and see if your VRAM is actually being eaten up or not; don't just "notice" lmao.

    • @kaythree8302
      @kaythree8302 1 year ago +3

      @@Bos_Meong Any decently competent person would assume that's what he meant by "noticed".

    • @Bos_Meong
      @Bos_Meong 1 year ago

      @@kaythree8302 Decently competent? That's my line for you. I bet it was 100% placebo; he was just assuming and hadn't really tested it himself. Because I'm running Cyberpunk at Overdrive right now and it only eats 9GB of VRAM, so how is this a limiting factor? Also, a 3080 can't do Overdrive anyway, so at Ultra it's going to consume far less VRAM, probably 7GB. Maybe he was playing Trash of Us, which is a badly optimized game.

  • @surfx4804
    @surfx4804 1 year ago +1

    I have seen 15GB+ of VRAM used on a 4090 in Cyberpunk: Ultra settings, 4K, DLSS Quality, Path Tracing, Frame Generation.

  • @danieljayasiri7739
    @danieljayasiri7739 1 year ago +2

    Upgraded my old 6600K platform but am still holding onto my EVGA 1080 Ti 😅 Upgrading soon, but this time I'll probably go AMD and take advantage of SAM, as I've upgraded to a Ryzen 7 anyway... ❤ for all the 1080 Ti users still holding on.

    • @firexz4185
      @firexz4185 1 year ago +2

      I believe you shouldn't upgrade your GPU now, cuz this 40 series and the 7000 series are horrible.

    • @blinksone2768
      @blinksone2768 1 year ago

      @@firexz4185 The 7000 series is decent.

    • @Willbme4EVA
      @Willbme4EVA 1 year ago

      The 1080 Ti is no joke; it's really hard to upgrade without selling a great-great-grandchild at a 10% loss.

    • @danieljayasiri7739
      @danieljayasiri7739 1 year ago +1

      @Willbme4EVA It's even harder when you plan on passing it down to your kid and you need to part with >1.5k NZD for your next GPU 😆

    • @firexz4185
      @firexz4185 1 year ago

      @@blinksone2768 For the price, they are not worth it.

  • @soldier9927
    @soldier9927 9 months ago +2

    I've played titles like God of War, Elden Ring, Fallout 4, Atomic Heart and RDR2 at high to ultra settings, 4K with DLSS Quality, at a stable 40-50 fps on an RTX 2060 Legion 5 laptop (115W, 6GB VRAM) with a 6-core Ryzen CPU, fullscreen on an LG OLED TV.

  • @Verpal
    @Verpal 1 year ago +5

    Generally, NVIDIA seems to do texture/color compression more aggressively, hence the difference in utilization. It still doesn't justify how stingy NVIDIA has been with VRAM, though.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +2

      Delta color compression, to my knowledge, does not lower VRAM usage that much. In Nvidia's presentations, delta color compression helps reduce the bandwidth requirement quite significantly; that's why Nvidia cards usually have much lower bandwidth than AMD cards of the same performance tier. AMD, for their part, decided not to optimize this, because initially they thought HBM was going to totally replace GDDR memory on GPUs.
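
A toy illustration of why lossless compression of that kind helps bandwidth more than capacity (this is just the generic delta idea, not NVIDIA's actual DCC format): neighboring pixels are stored as differences, which usually take fewer bits to move across the bus, but the framebuffer must still reserve the full worst-case footprint, because lossless compression cannot guarantee a smaller size.

```python
def delta_encode(pixels):
    """Keep the first value, then store successive differences (usually small)."""
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def delta_decode(deltas):
    """Undo the encoding with a running sum."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

scanline = [100, 101, 101, 102, 104, 104, 103]  # smooth gradients encode well
print(delta_encode(scanline))                   # [100, 1, 0, 1, 2, 0, -1]
assert delta_decode(delta_encode(scanline)) == scanline
```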

  • @peterjansen4826
    @peterjansen4826 1 year ago +2

    It is well known that Nvidia has more aggressive compression; that is why the memory utilization is lower. Is that good or bad? Hard to tell. I know that in the past with Nvidia you had more visible artifacts due to that compression; I don't know if that is still the case. It's a matter of critically comparing the pictures against an uncompressed video recording.

  • @mattfarrar5472
    @mattfarrar5472 1 year ago +4

    It would have been good to see you run a 6GB or 8GB card in those titles, to see what it would do at different settings...

  • @techclub8528
    @techclub8528 1 year ago +1

    I would like to see a comparison, for when you hit that VRAM limit, of the difference in stutter/performance between a SATA SSD and NVMe Gen 2 vs Gen 3 vs Gen 4, and how improving SSD speed could help reduce problems when the graphics card needs to go back to storage to load more assets.

    • @DJ_Dopamine
      @DJ_Dopamine 1 year ago +1

      Agreed. Also, what magnitude of difference the main system memory configuration makes to all this
      (memory speed/number of channels/timings/etc.).
      Also, whether putting Windows/the OS on one SSD with the games on another SSD helps much.
      I do this myself, but I have never actually measured the difference.
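
For the storage side of that comparison, a minimal sketch that times sequential reads from a drive (the filename is a placeholder; use a multi-GB file that is not already in the OS page cache, otherwise you are benchmarking RAM rather than the SSD):

```python
import time

PATH = "big_test_file.bin"  # placeholder: any multi-GB file on the drive under test
CHUNK = 64 * 2**20          # 64 MiB reads, large enough to reach sequential speeds

total = 0
start = time.perf_counter()
with open(PATH, "rb") as f:
    while chunk := f.read(CHUNK):
        total += len(chunk)
elapsed = time.perf_counter() - start
print(f"read {total / 2**30:.1f} GiB at {total / elapsed / 2**30:.2f} GiB/s")
```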

  • @fVNzO
    @fVNzO 1 year ago +6

    I think it's important to note that these figures are indicative of what game developers *tolerate* in order to make their games work on popular graphics cards. Had the average been higher (Nvidia spending 20 bucks more on their GPUs, etc.), games would indeed look better, be grander, or load quicker - and more VRAM would be needed. The second the average VRAM count goes up to 16+, game devs will eat it up, as they can finally make more intricate game environments. So, as data points, these are fun benchmarks to run, but they are ultimately a product of the limitations involved in producing software for customers who have, quite frankly, been scammed for the past 6 years with no discernible increase in VRAM in the mid-range.

    • @TheAkashicTraveller
      @TheAkashicTraveller 1 year ago +4

      Except that's not what we're seeing. They're clearly targeting the consoles, and Ngreedia just isn't keeping up.

    • @soniofficial6017
      @soniofficial6017 1 year ago

      @@TheAkashicTraveller 😂😂😂

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +1

      @@TheAkashicTraveller The issue on PC is those Ultra settings (plus RT); that's what makes people think 8GB and 12GB are no longer enough because consoles have 16GB. But consoles most often use only what equals medium on PC. In reality, console developers most likely do not use as much of the memory for VRAM as many people think. Take a game like Returnal: the developers wanted 120FPS, so they actually render the game at 1080p and then use their own upscaling tech to upscale the final image to 4K. In a game like Spider-Man, the ray-traced reflections are rendered at 1080p instead of 4K, and in some comparisons the reflections on water puddles are completely disabled on console.

    • @fVNzO
      @fVNzO 1 year ago

      @@TheAkashicTraveller This is exactly what we are seeing; consoles are proving my argument. The PS5 has 16GB of shared memory with a basically infinitely large cache behind it; it's the cheapest way to just give developers more VRAM. Meanwhile they've been completely hamstrung on desktop, since Nvidia just decided to stop at Pascal. This video is showing you that usage is around 12GB max for a lot of games, which tells you precisely that developers are stuck: they have little ability to give us better games when there is no standard way to cache more data like on consoles.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago

      What is up with all these comments claiming apples-to-apples comparisons with consoles?
      A) The PS5, for example, uses GDDR6.
      B) Quite a bit of it is reserved.

  • @Psyko_Blood
    @Psyko_Blood 1 year ago +2

    TLOU Part 1 is just badly optimized. Look at RDR2: it looks (to me) better than this game and is muuuuuuch easier to run xD So IT CAN be done! Some devs are just lazy with optimization; there are games that prove it can be done (Red Dead, for example), so there's no excuse for bad performance/optimization, especially for the triple-A guys.

  • @projectc1rca048
    @projectc1rca048 1 year ago +3

    LOL! When you said "VRAM-ageddon" I literally laughed out loud, love it. Only on @Tech Yes City, man. I imagine that with all these latest and greatest AAA titles raising the minimum PC requirements to run their games, especially the games that will be using Unreal Engine 5, 12GB of VRAM will be the new minimum/standard. Of course, it will depend on the resolution and settings people play at. Great topic for a video, and I appreciate all the hard work, my guy. Keep up the great Tech Yes City content.

  • @tomaszgajewski6699
    @tomaszgajewski6699 1 year ago +1

    Great video.
    But where is the 1440p data? 😭

  • @kurilrick2207
    @kurilrick2207 1 year ago +9

    8 gigs of VRAM is plenty for 1080p, except of course when you run poorly optimized PC ports like The Last of Us or Hogwarts Legacy.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 1 year ago +3

      Or when you play well-optimized games with ray tracing (RE4 Remake, Doom Eternal). 8GB is not good for a card that costs more than $300.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago +1

      You do realise that this will become more prevalent now that games are built for next-gen consoles.

    • @kurilrick2207
      @kurilrick2207 1 year ago

      @@nombredeusuarioinnecesaria3688 Haven't played the RE4 Remake yet, but I played Doom Eternal with RT enabled (1080p) and faced no issues so far. But I agree that 8 gigs of VRAM is too little for an expensive card; I really hope Nvidia's next 70-class card (the 5070) will have at least 16 gigs of VRAM, or else it's going to be overpriced garbage.

  • @pcmaravilla
    @pcmaravilla 1 year ago +2

    Man, I play with 6GB (RTX 2060) at 1080p and I don't have major problems; even The Last of Us at 1080p Medium runs at 60 fps with no stuttering... I think people are just obsessed with 4K.

  • @puddingfoot
    @puddingfoot 1 year ago +4

    Hey! Love your content. I've also been living in Japan for the last 17 years or so and sometimes check out Janpara for deals, and I totally agree with your stance on buying used parts for viable builds for 95% of games.
    However, with VRAM requirements being 12GB for high settings in new AAA titles, I am torn between a few cards:
    RX 6800: ~48,000 yen, used
    RX 6700 XT: ~38,000 yen, used
    RTX 3070*: ~43,000 yen, used
    *I'm only considering the 3070 for the video AI upscaling feature in VLC/Chrome/Edge (Video Super Resolution). What a killer feature! Does AMD offer anything similar, or is it in the works? I'd go with AMD in a heartbeat if so.
    Nvidia recommends the 3070 for the highest level (setting level 4) of Video Super Resolution; however, some 3060 Ti users report using VSR at level 4 without issues. Do you have an opinion on this? Maybe the technology is too young.
    Hope your allergies aren't so bad now. This year has sucked for cedar allergies in Japan, but we are in the home stretch to Golden Week and less pollen in the air! Ganbare 🤘

  • @yetson
    @yetson 1 year ago +2

    A moment of silence for the 4-6GB VRAM Pascal GPU users.

  • @ruxandy
    @ruxandy 1 year ago +7

    Great video! I would say that 12 GB of VRAM at 1440p should be more than enough for the foreseeable future. Sure, in the next couple of years there will probably be the odd game that requires more than that for absolute Ultra settings (Ultra textures in particular), but I don't think we'll see a game where 12 GB of VRAM is unusable for High details anytime soon (it might happen when the next-gen consoles come out, but that's still a long way off). I, for one, have played The Last of Us at Ultra details @ 1440p on a Ryzen 7 5800X3D + RTX 4070 Ti, and the experience was absolutely flawless (no crashes and no stutters -> 65+ FPS for the 1% lows). So if this game runs great (and, as we all know, this title is the 'best' example of poor optimization), then I am not worried at all for the next 2-3 years. Fun fact: I've actually also played TLoU (in its entirety) on my backup PC, with an RTX 2060 @ 1080p/High details, and the experience, while not flawless, still wasn't bad at all and was very playable (and I didn't experience any crashes with this card either - must've been very lucky).
    On the other hand, 8 GB of VRAM is a whole different discussion. There were multiple signs throughout the past few years that... yeah, 8 GB of VRAM wasn't gonna cut it anymore (especially considering that the new consoles came with 16 GB of unified memory). It's unfortunate that a lot of people did not listen and, even worse, ended up spending 800+ euros on RTX 3070s and other cards like that.

  • @AndyBarber1981
    @AndyBarber1981 1 year ago +2

    I was on Warzone yesterday (I play on a 3440x1440 ultrawide). I put the game on Extreme on Al Mazrah to see what my 4070's performance was like, and it hit 11.2GB of VRAM.

  • @ellypsis603
    @ellypsis603 1 year ago +9

    16GB should be standard these days; Nvidia sold the 10 series with 8GB back in 2016!!
    And that's why those cards aged so well.

  • @RonBurrgundy
    @RonBurrgundy 1 year ago +1

    If you want to play without stuttering at 1080p, just set your desktop resolution to 1080p and set scaling to Display in the graphics driver, then run the game in a borderless window. This way both your desktop and the game run at 1080p, and you are not downscaling from 2160p to 1080p, which saves VRAM. Don't forget to set scaling to Display instead of GPU. Now I can run The Last of Us and Hogwarts with High textures.

  • @jamesbolho
    @jamesbolho 1 year ago +2

    Your video and analysis just confirm one thing: although many recent cards should have launched with more VRAM, the truth is that this sudden issue also comes down to a lack of optimization.

  • @Tubes78
    @Tubes78 1 year ago +4

    12GB is enough for most high-end gaming for now... but so was 8GB just a few months ago. I'm not sure what will be needed in 2-3 years. I can imagine that people spending US$800+ don't want to start lowering a lot of settings that fast.

  • @majcyl
    @majcyl 1 year ago

    It's all about how much texture and detail you can load, or need to. Buying any new card, I would only accept 16GB or above.

  • @ABaumstumpf
    @ABaumstumpf 1 year ago +8

    With VRAM it also heavily depends on the engine and the game itself how they treat it.
    Some engines by default just try to keep more stuff in memory if more is available, some games (Hogwarts, The Last of Us) just waste memory left, right and center. Some games dynamically adjust LoD to stay within their memory budget.
    Hogwarts can be run on a GTX 970 with medium settings at 1080p - just not with how the game is delivered. They really need to fix that crap when even the community has already fixed it with mods.
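    The "keep more in memory if more is available" behaviour is easy to picture as a budget-driven cache. A minimal sketch of a VRAM-budget-aware LRU texture cache follows; all names are hypothetical and this mirrors no particular engine's API.

    ```python
    from collections import OrderedDict

    class TextureCache:
        """Toy model: textures stay resident up to a VRAM budget, and the
        least recently used texture is evicted first. A card with more VRAM
        simply gets a bigger budget, so measured "usage" grows with card size."""
        def __init__(self, budget_mb):
            self.budget_mb = budget_mb
            self.resident = OrderedDict()  # texture_id -> size_mb

        def request(self, texture_id, size_mb):
            if texture_id in self.resident:
                self.resident.move_to_end(texture_id)  # mark as recently used
                return
            # Evict least recently used textures until the new one fits.
            while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
                self.resident.popitem(last=False)
            self.resident[texture_id] = size_mb  # "upload" to VRAM

    cache = TextureCache(budget_mb=6144)  # e.g. ~6 GB left for textures on an 8 GB card
    ```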

    • @winebartender6653
      @winebartender6653 1 year ago

      "Fixed with mods" lmfao no. The only thing any of those mods, that even "worked", did was adjust the culling distance, which made texture pop in and texture resolution hilariously bad.
      Let's also stop pretending that they are "hogs" when they are developed with a vram buffer of 12gb for consoles.
      And the reality of the situation is that you shouldn't be limited by vram, you should be limited by the chips performance to then adjust settings that fit your fps needs. You shouldn't need to crank down settings because your vram buffer is too small.

    • @ABaumstumpf
      @ABaumstumpf 1 year ago

      @@winebartender6653 ""Fixed with mods" lmfao no. The only thing any of those mods, that even "worked", did was adjust the culling distance, which made texture pop in and texture resolution hilariously bad."
      nah, people have shown that the game keeps full-resolution textures loaded for background objects that are only rendered at the lowest LoD.
      "Let's also stop pretending that they are "hogs" when they are developed with a vram buffer of 12gb for consoles. "
      aka - "we know it runs with 12 GB so just cram in everything even if that degrades performance on all hardware cause it still runs okish".
      "You shouldn't need to crank down settings because your vram buffer is too small."
      And you shouldn't need to crank down settings because a tree 760 meters away, rendered at the lowest LoD, is using more VRAM than an NPC standing right next to your character.

  • @ilkerdemirci3720
    @ilkerdemirci3720 1 year ago +1

    Not even 12 is enough; 16 should be the minimum for future-proofing. The 7900 XT is actually at the sweet spot.

  • @vaggeliskosiatzis5487
    @vaggeliskosiatzis5487 1 year ago +4

    It's simple. The developers on current consoles have access to 13.5GB of VRAM, while on the previous gen they had 5.5GB. When the PS4 Pro and Xbox One X were out, you could buy a brand-new RX 580 8GB for $240 - a lot more VRAM than the total the consoles could use back then - with the same memory bus and bandwidth as the consoles, faster than the Pro and matching the One X. Now that the consoles can utilise 13.5GB in total, where are the $300 16GB GPUs with the same memory bus and bandwidth, providing similar or slightly better performance than the consoles? That's the question PC gamers should be asking, rather than making bad decisions by buying $450 8GB or $800 12GB cards and then blaming the games for needing MORE VRAM than they supposedly should to look good. That is a given when a $500 console can use up to 13.5GB of VRAM while playing a video game. Accept reality and make better purchasing decisions by looking at the used market too, and consider both companies, Nvidia and AMD, before you buy a GPU.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago

      They should have bought a card from AMD. You could tell from the next-gen specs what would happen. 16GB will cover this whole gen.

    • @bigturkey1
      @bigturkey1 1 year ago

      The PS5 has 12GB of VRAM, the Xbox has 11.

    • @vaggeliskosiatzis5487
      @vaggeliskosiatzis5487 1 year ago

      @@bigturkey1 No, that's incorrect. Both consoles have 16GB of VRAM in total, and 2.5GB is reserved for the OS they run. The rest can be used by the devs.

    • @bigturkey1
      @bigturkey1 1 year ago

      @@vaggeliskosiatzis5487 I think it's 4 for the OS.

  • @NostalgicMem0ries
    @NostalgicMem0ries 1 year ago +1

    The two most broken games don't represent the other 99.9% of the gaming industry. Also, playing at 4K on a 3070, 3070 Ti, 4070 or 4070 Ti - which are pure 1440p GPUs - is stupid; for 4K there are the 3090, 4080 and 4090 (with their Titan versions coming too), or AMD's 7900 and up. Only The Last of Us even comes close to 12GB of VRAM at 1440p; neither Hogwarts nor Cyberpunk nor others break 10GB at 1440p. Buy a GPU according to your resolution. Also, high vs. ultra textures look identical in most games - Hardware Unboxed did a great video on how to stop wasting VRAM. 8GB for 1080p, 12GB for 1440p, and 16GB or more for 4K.
    Also, testing VRAM usage on a 4090 or 7900 is misleading compared to lower GPUs: the more VRAM a GPU has, the more it uses, especially on the AMD side.

  • @vmpere2637
    @vmpere2637 1 year ago +2

    If developers can’t be assed to fit a scene’s textures into 8 GIGABYTES of memory then they’re just shit developers. 8 gigabytes is enough for gaming, it’s on developers to utilize it effectively.

  • @necuz
    @necuz 1 year ago +8

    Much better VRAM management than what these games are doing is possible on more recent hardware, but PC is lagging behind the upgrade cycle a lot, for obvious reasons. Using a DX12 Ultimate feature called Sampler Feedback, you can figure out which parts of which textures need to be loaded, and at what quality, in order to render a scene; this would massively cut down on VRAM usage, especially in open-world games. That could further be combined with DirectStorage 1.1 to quickly load textures on demand. The kicker is that you then need to set your minimum requirements to RDNA2 or Turing, since that is when support for these was introduced. That would be a bold move, but I do wonder how many of the people still rocking their old 1060 are actually buying $70 AAA releases in 2023.
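    To get a feel for the savings Sampler Feedback enables, here is a minimal sketch of the residency arithmetic (the real feature is a D3D12/HLSL API, not Python, and the figures are illustrative):

    ```python
    def resident_mb(base_mb, finest_mip, mip_count=12):
        """VRAM needed if only mips [finest_mip .. mip_count) stay resident;
        each mip level is a quarter of the size of the one above it."""
        return sum(base_mb / 4**m for m in range(finest_mip, mip_count))

    # A 4096x4096 BC7-compressed texture is roughly 16 MB at mip 0.
    print(f"full chain resident:  {resident_mb(16, 0):.2f} MB")  # ~21.33 MB
    print(f"feedback wants mip 3: {resident_mb(16, 3):.2f} MB")  # ~0.33 MB
    ```

    If feedback shows a distant object only ever samples mip 3 and coarser, that texture's resident footprint drops from ~21 MB to well under 1 MB, which is where the big open-world savings would come from.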

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      AFAIK RDNA 1 did not support any DX12 Ultimate features.

    • @necuz
      @necuz 1 year ago

      @@arenzricodexd4409 Ack, you're right. For some reason had the impression RDNA1 also had rudimentary support for this.

    • @MrMeanh
      @MrMeanh 1 year ago

      The issue is that DirectStorage 1.1 will increase the load on the GPU (asset decompression done by the GPU, etc.); this will 100% reduce the compute available for rendering the game. From what I've heard, it's something like at least a 10-20% performance hit if you want to use DS + Sampler Feedback while rendering. All this means I'm sceptical of Sampler Feedback being a good solution for reducing VRAM usage at the moment.

    • @necuz
      @necuz 1 year ago +1

      @@MrMeanh It certainly isn't going to be free; however, 20% sounds more like a situation like trying to run TLoU at Ultra on an 8 GB card, where you're constantly running out of memory. So the question becomes: would you rather have smooth gameplay at 20% lower fps, or the current stuttery mess?
      Additionally, almost all of the usual suspects among released games that have triggered this debate struggle to be GPU-bound, so in many cases it might actually end up being essentially free...

  • @PotatMasterRace
    @PotatMasterRace 1 year ago +2

    12:47 Basically, games use the same 4K texture packs for both resolutions.

    • @ms3862
      @ms3862 1 year ago

      Exactly. This was even confirmed by two developers on MLID's podcast - games aren't even using proper high-resolution textures yet. Enabling 4K just outputs a higher-resolution image, but the underlying asset textures are the same at all resolutions, and they are low quality. Take even the best-looking game you know of, move the camera right up against a texture, and it will look very blurry and low quality.
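      Which is also why output resolution barely moves the asset side of the budget. A toy comparison, reusing the render-target math from the earlier sketch; the 6 GB texture-pack figure is an assumption, and real engines do also scale shadow maps, history buffers and post chains, so the true 4K delta is larger:

      ```python
      TEXTURE_PACK_MB = 6144  # same assets loaded at every output resolution (assumed)

      def frame_buffers_mb(width, height, bytes_per_pixel=4, num_targets=6):
          return width * height * bytes_per_pixel * num_targets / 1024**2

      for w, h in [(1920, 1080), (3840, 2160)]:
          print(f"{w}x{h}: ~{TEXTURE_PACK_MB + frame_buffers_mb(w, h):.0f} MB total")
      # 1920x1080: ~6191 MB total, 3840x2160: ~6334 MB total
      ```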

  • @TheSleppy
    @TheSleppy 1 year ago +7

    I think if this type of test were done for longer - for example, 10 minutes of play vs. 30 minutes - the VRAM utilization would be even higher.
    Sometimes a 5 minute benchmark doesn't give the whole story.
    I agree overall that 12GB is the new minimum, great video.
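    One way to check this yourself on an NVIDIA card is to log VRAM over a long session with the NVML Python bindings (pip install nvidia-ml-py). A minimal sketch follows; note that NVML reports memory allocated, which is not necessarily memory actively needed.

    ```python
    import time
    import pynvml  # pip install nvidia-ml-py

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        # Sample every 5 s while playing; compare minute 5 against minute 30
        # to see whether usage keeps creeping upward over a long session.
        while True:
            info = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"{time.strftime('%H:%M:%S')}  "
                  f"used {info.used / 1024**2:.0f} / {info.total / 1024**2:.0f} MB")
            time.sleep(5)
    finally:
        pynvml.nvmlShutdown()
    ```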

    • @mnemonic8757
      @mnemonic8757 1 year ago +1

      Exactly. I don't think everyone has time for such tests. I remember FFXV's open world. With the best textures it ran smoothly right after starting, but once you drove around the land and visited a city, the VRAM got clogged. I think this problem can occur in other titles too, especially open-world ones, where a huge amount of data has to be loaded.

    • @greggmacdonald9644
      @greggmacdonald9644 1 year ago +1

      Hardware Unboxed addressed this recently; their vid about it is well worth watching. I agree that 12GB is the new minimum, and I'd also say that 16GB is better.

    • @RicochetForce
      @RicochetForce 1 year ago

      @@greggmacdonald9644 Yeah, I'd say 16GB is what mid range cards should have. 8GB VRAM is stone dead and 16GB of system RAM is much the same.

  • @fatidicusaeternus6498
    @fatidicusaeternus6498 1 year ago +1

    Newer engines use VRAM for much more than just textures, and that's the underlying reason for the increased VRAM requirements.

  • @inmypaants
    @inmypaants 1 year ago +13

    Thanks for testing these on GPUs with sufficient VRAM, Brian. People arguing that you don't need more than 8 and using 8GB cards to test don't realise games dynamically scale down textures. It's fine if you have 8 and don't notice or mind, but don't argue that 8 is enough for new midrange GPUs; it simply is not.

    • @adriancioroianu1704
      @adriancioroianu1704 1 year ago +1

      It is, if you adjust some video settings here and there, because only at 4K do you see 8GB+ required, and the mid-range is not targeted at 4K - people don't buy a 3070 to play at 4K, it's ridiculous. On the other side there are people (mainly RX 580 users) who try to convince new buyers that 12GB cards are trash and that they should go for 16GB or more even at 1440p, because they saw a 4K max-settings benchmark where the VRAM went over 12. It's funny and sad at the same time.

    • @FenrirAlter
      @FenrirAlter 1 year ago

      @@adriancioroianu1704 Yes, I'm most definitely buying a $600 GPU so I can just barely scrape by at 1440p while playing on high settings with RT on.

    • @inmypaants
      @inmypaants 1 year ago +1

      @@adriancioroianu1704 Brian didn’t show 1440p, plenty of people buy 3070 class GPUs to game at 1440p. That resolution also runs into issues at 6GB and it won’t be long until 8GB is saturated too. I don’t think people should sell and buy bigger GPUs mind you, I just think people should be vocalising to these companies that 12 is the minimum for midrange and really 16 is the value Nvidia and AMD should offer to really entice buyers.

  • @Carlos-wl5fn
    @Carlos-wl5fn 1 year ago

    2060 Super user here with 8GB of VRAM. I support your comment about not going crazy and selling your GPU, and the PC environment is great. I don't have a lot of time to play, so I will be picking my games carefully - they have to run great - and it's also good to know that if I still want to play these few games, I can at least set the textures to medium and have a good time. Of course, people obsessed with the latest and greatest, with money to spare, can chase that; I would not get a 4070 unless you plan to get rid of it in 2 years.
    Looking forward to what AMD is offering. I'm tempted by a 6800/6800 XT, but I want to know what is next; I am looking for 16GB of VRAM and a 200% uplift from my 2060 Super.

  • @MinosML
    @MinosML 1 year ago +3

    Bryan yet again listening to what the community is preoccupied with and giving us a banger video with tons of useful info! Hope to see more vids on this subject so people buying GPUs at this point in time are aware of the compromises they'll have to make with lower amounts of VRAM. Devs are definitely focusing more and more on the current consoles/4K, and it shows. Thanks again for the quality content!

    • @Willbme4EVA
      @Willbme4EVA 1 year ago +1

      Totally agree - more, more, more. Just one vid cannot hit the full spectrum, but he does try to jam a lot into this vid. Tech Yes City series, pls

  • @lsnderick
    @lsnderick 1 year ago +1

    How come Cyberpunk looks better but uses less VRAM, even with full path tracing? That's just lazy developers and poor optimization, I say.

  • @hartsickdisciple
    @hartsickdisciple 1 year ago +3

    I don't see any 1440p results, but based on the 1080p and 4k numbers, it looks like 12gb should be enough for 1440p high/ultra.
    There's no solid reason to believe VRAM requirements will increase substantially from where we're at now during this console generation. 8gb is the new comfortable minimum, 12gb for 1440p, and 16gb for 4k if you want a little headroom.

  • @garipoter6336
    @garipoter6336 1 year ago +1

    Next time you're testing, please include 1440p low/high/ultra and 1080p low/high/ultra. Why am I asking for high and ultra separately? Because a lot of games show a minimal difference in visuals between high and ultra but a big gap in FPS, so it would be nice to know what the difference is there. Also, 1440p is a pretty common resolution - even more mainstream than 4K...

  • @oldmanwithers4565
    @oldmanwithers4565 1 year ago +3

    Test a 4gb card to see how much it actually affects things in the real world.

  • @Zel912
    @Zel912 7 months ago +1

    For me, the 3070 8GB can still serve me for the next 3 years. How many AAA titles require a huge amount of VRAM? Just a few, and some of them are also broken. And there will not be many released in the next few years.
    With high/medium settings and DLSS, I can play RDR2, Hogwarts and Cyberpunk at 4K at a stable 60fps. VRAM stays under 8GB. There is no need to set all graphics to ultra - just tune down the settings.
    Ultra is for benchmarks, not for playing; developers never optimize games for ultra. And I dare say no one can spot the difference between ultra and high in real gameplay.

  • @benjaminmaher8896
    @benjaminmaher8896 1 year ago +4

    My 1070 is still pushing on, but not for long, I feel, with the trend games are adopting of being wildly unoptimized.

  • @clearskiesastro1028
    @clearskiesastro1028 1 year ago +1

    Really wish you had included 1440p.

  • @SKHYJINX
    @SKHYJINX 1 year ago +4

    Wouldn't it be more informative to use different generations and compare baselines, as different architectures buffer more than others in various engines?
    A 4090 on low textures still seems to buffer more than, say, a 6GB 980 Ti at the same settings... so differing architectures might hold revelations, where just using flagship GPUs might skew results with their huge caching buffers.
    I hope for another round using 8GB GPUs, not the flagships, where game engines might buffer more just because they see a higher-power GPU device ID.

  • @KOT-ANGRY
    @KOT-ANGRY 1 year ago +1

    Linus father, you tested the 4090 at 4K maxed out and got 60-70 fps, mhm! That's a 4090 using 13GB of VRAM, but a 3070, 3080, etc. CAN'T handle this game at 4K maxed out, so you will reduce graphics settings to get normal FPS - and that means your GPU will use LESS VRAM at normal framerates 😊!

  • @6n100-ent
    @6n100-ent 1 year ago +1

    Why not test 1440p high? On the Steam survey hardly anybody plays at 4K, but people do play 1080p high and 1440p high.

  • @GameplayUnboxed
    @GameplayUnboxed 1 year ago +1

    IMO, an xx50 Nvidia card or lower = 12GB VRAM
    xx60 and xx70 = 16GB VRAM
    xx80 and xx90 = 24GB VRAM

  • @martytube821
    @martytube821 1 year ago +1

    It just seems the vast majority of gamers worldwide won't be able to play some newer games, as most people have 8GB of VRAM or lower - so who are these game companies selling to?