Why VRAM Is So Important For Gaming: 4GB vs. 8GB

  • Published 3 Oct 2024

COMMENTS • 1.5K

  • @JarrodsTech
    @JarrodsTech 7 місяців тому +2245

    It's a good thing you can always just download more VRAM.

    • @EyesOfByes
      @EyesOfByes 7 місяців тому +88

      Yeah, I remember when we used to buy RAM on CD but then Napster came along...
      (I'm also joking)

    • @JarrodsTech
      @JarrodsTech 7 місяців тому +139

      @@EyesOfByes Now RAM is all subscription!

    • @Remi_Jansen
      @Remi_Jansen 7 місяців тому +22

      Usually for free too! No idea why people spent money on physical RAM, it's ridiculous.

    • @Splarkszter
      @Splarkszter 7 місяців тому +35

      NVidia is worse than Apple

    • @chadbizeau5997
      @chadbizeau5997 7 місяців тому +39

      ​@@JarrodsTech RAM as a Service!

  • @AthanImmortal
    @AthanImmortal 7 місяців тому +865

    I never understood why Hardware unboxed caught so much flak originally for suggesting that GPU VRAM was not moving on as fast as the gaming industry was, and that 8GB cards were going to age a lot faster. Instead customers should be angry that Nvidia was selling us the same 8GB of VRAM on the 1070 (2016), 2070 (2018) and 3070 (2020), and STILL wanted $400 for an 8GB card in 2023 in the form of the 4060Ti.
    Go back 4 years from 2016 and in 2012 you have the 670 with 2GB. Memory *quadrupled* in the same time frame in the mid range. Yet everyone was barking "game devs need to optimise", sure if you want graphics to stay the same, why don't they optimise for 2GB then? Because at some point we need to up and move on.
    The fact is that someone that bought a 1070 in 2016 can still play at the same level they did 7 years ago, but at the point they run into VRAM issues, those same issues are going to affect someone that bought a 3070 just 3 years ago, regardless of the difference in 3d capability between the cards.
    I'm glad to see HUB still championing this point.

    • @lookitsrain9552
      @lookitsrain9552 7 місяців тому +156

      They bought cards with 8gb of ram for 800 dollars and have to argue about it to make their purchase seem good.

    • @vmafarah9473
      @vmafarah9473 7 місяців тому +98

      1070 8GB, 2070 should be 10GB, 3070 should be 12GB, 4070 should be 16GB, in my opinion.

    • @RobBCactive
      @RobBCactive 7 місяців тому +34

      Ironically I remember HUB doing the opposite when many of us pointed out Nvidia were skimping on VRAM in 2020/21 and valuing the 12/16GB configurations RDNA2 offered.
      All you needed to do was listen to game devs.

    • @highlanderknight
      @highlanderknight 7 місяців тому +27

      I agree about NVIDIA not putting enough VRAM on their cards. HOWEVER, we are not forced to buy those cards. If you buy one, you have little right to complain.

    • @gamingunboxed5130
      @gamingunboxed5130 7 місяців тому

      ​@RobBCactive that didn't happen 😅

  • @Phil_529
    @Phil_529 7 місяців тому +340

    First 8GB card was the Sapphire Vapor-X R9 290X in Nov. 2014.

    • @magikarp2063
      @magikarp2063 7 місяців тому +34

      MSI had a 6GB version of the 280X, a midrange card from 2013.

    • @lennartj.8072
      @lennartj.8072 7 місяців тому +20

      Sapphire da goat fr

    • @Pixel_FX
      @Pixel_FX 7 місяців тому +71

      Sapphire also created the first blow through cooler. Then almost a decade later Jensen made nvidiots believe they invented that with 30 series cooler lmao, it only took some 10 mins of bullshitting with AI, aerodynamics, thermal dynamics yada yada.

    • @mitsuhh
      @mitsuhh 7 місяців тому +2

      280X was high end, not mid range@@magikarp2063

    • @tomstech4390
      @tomstech4390 7 місяців тому +15

      @@magikarp2063 The 6GB HD 7970 (same card) existed before that.
      There was no architectural change from the HD 7000 to the RX 200 series (except the 260X Bonaire, which added TrueAudio).
      In fact the shaders were unchanged from HD 7000 to Polaris RX 400; even then it was a switch to 2x FP16 instead of 1x FP32 units, and Vega added rapid packed math, but again not *that much* changed.
      From the Fermi 200 series to RTX 20... GCN was amazing.

  • @crazylocha2515
    @crazylocha2515 7 місяців тому +28

    Surprised me how well Steve put things and it kept becoming interesting at every level. Great piece 👍 (Thx Steve)

  • @chriscastillo291
    @chriscastillo291 7 місяців тому +104

    I love these VRAM videos! No one talks about this, let alone tests it... thanks for the vids! ❤

    • @bns6288
      @bns6288 7 місяців тому +7

      YouTube is full of videos like this. ^^ The only rare comparison is 4GB vs 8GB imo. idk why we still talk about 4GB while 8GB is already too low.

    • @zodwraith5745
      @zodwraith5745 7 місяців тому +1

      Lolwut? There's a ton of videos chasing the vram boogieman.

    • @tyre1337
      @tyre1337 7 місяців тому

      @@zodwraith5745 tons of vram drama clickbait videos, very few actual deep dive testing like this, the only one i can think of is daniel owen

    • @dave4148
      @dave4148 6 місяців тому +2

      No one? Really?

    • @ElysaraCh
      @ElysaraCh 6 місяців тому +1

      "No one talks about this"? It's been talked about by all the major tech channels since the 30 series launched lmao

  • @andrexskin
    @andrexskin 7 місяців тому +49

    I guess we should already be looking at 8GB vs 12GB comparisons of "similar cards".
    An example would be the 3060 12GB vs the 3060 Ti 8GB, trying to spot whether there are already cases where the raster performance of the 3060 Ti isn't enough to outweigh the quality a 3060 12GB would achieve with higher texture quality.

    • @TheKims82
      @TheKims82 7 місяців тому +14

      HUB did test this earlier, where the 3060 actually outperformed the 3080 10GB. I believe it was in Hogwarts Legacy just when the game came out.

    • @andrexskin
      @andrexskin 7 місяців тому +10

      @@TheKims82 I think that unoptimized RTX is kinda a niche case.
      It would be better to extend the tests to more cases

  • @ojassarup258
    @ojassarup258 7 місяців тому +156

    Would be interested in seeing 8v12v16 GB VRAM comparisons too!

    • @vmafarah9473
      @vmafarah9473 7 місяців тому +12

      As the 30 series sucks for VRAM, developers are forced to reduce game quality to accommodate the handicapped 30 series, so 12GB is going to be enough for 1440p. Fc Ngreedia. The same company released 6GB 3060 laptops and 8GB 4070 laptops.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 7 місяців тому +8

      It's possible to use 18GB in modern games at 4K. The game will use what it can. If you have less than ideal it will just stutter more and look worse, but it won't run out, because safety features keep you from running out. My friend had a 3080 with usage at 8.5GB, then got a 7900 XTX and saw those same games actually take advantage of more like 16-20GB. More than double the fps.

    • @c.m.7692
      @c.m.7692 7 місяців тому +1

      ... and Vs 20gb!

    • @АндрейШевченко-к6й
      @АндрейШевченко-к6й 7 місяців тому +6

      There are no cards that offer 8/12/16GB variants at the same time. But you can re-watch the RTX 3070 vs RX 6800 comparison, or look for videos like "RTX 3070 8GB vs RTX 3070 16GB (modified)".

    • @zfacersane272
      @zfacersane272 7 місяців тому +1

      @@vmafarah9473what are you even talking about not true… but i agree with the laptop part

  • @Ale-ch7xx
    @Ale-ch7xx 7 місяців тому +42

    @24:29 Correction: The first Radeon card that had 8GB was the Sapphire Vapor-X R9 290X 8GB, released on Nov 7th, 2014.

    • @Hardwareunboxed
      @Hardwareunboxed  7 місяців тому +17

      We're talking about the first official AMD spec.

    • @guille92h
      @guille92h 7 місяців тому +7

      Sapphire knows how to download VRAM 😂

    • @Ale-ch7xx
      @Ale-ch7xx 7 місяців тому +20

      @@Hardwareunboxed The way it was worded I didn't know you were referring to the official AMD spec

    • @OGPatriot03
      @OGPatriot03 7 місяців тому +1

      That was an epic GPU

    • @andersjjensen
      @andersjjensen 7 місяців тому

      @@Ale-ch7xx They always are. Cards are set to official spec clocks for core and memory if a Reference or Founders Edition is not available, etc, etc. What individual board partners do will be put into specific model reviews.

  • @pvtcmyers87
    @pvtcmyers87 7 місяців тому +8

    Thanks for creating the video for this and for taking a different look at the vram issue.

  • @sergiopablo6555
    @sergiopablo6555 7 місяців тому +83

    I don't know why, but this is the only channel where I click "like" even before the videos start. It may be the absolute lack of clickbait on the titles, how useful all of them are, or how relaxing it is to watch someone speak without yelling or jumping around.

    • @GankAlpaca
      @GankAlpaca 6 місяців тому +4

      Haha same. Just checked and I already liked the vid instinctively. Maybe it's because the topic is so important to me and generally a thing that a lot of people think about.

    • @TheRealEtaoinShrdlu
      @TheRealEtaoinShrdlu 3 місяці тому

      You don't know why? Clearly not a very bright one....

  • @BUDA20
    @BUDA20 7 місяців тому +142

    Also, some of those cross the 16GB RAM limit because of the low VRAM capacity, so... people with 4GB cards are likely to only have 16GB of RAM, so it will tank even more.

    • @kaznika6584
      @kaznika6584 7 місяців тому +16

      That's a really good point.

    • @DivorcedStates
      @DivorcedStates 7 місяців тому

      How is it that 16gb ram instead of 8 for example is bad for a setup with a 4gb vram card? I dont understand.

    • @Pasi123
      @Pasi123 7 місяців тому +10

      @@DivorcedStates Some of the games used more than 16GB of system memory with the 4GB VRAM card, so if you only had 16GB of RAM you'd see an even bigger performance hit. With the 8GB card that wouldn't be a problem because the system memory usage was below 16GB.
      8GB single-channel RAM + a 4GB VRAM card would be straight from hell.

    • @Apollo-Computers
      @Apollo-Computers 7 місяців тому +1

      I have 16gb of ram with 24gb of vram.

    • @gctypo2838
      @gctypo2838 7 місяців тому +7

      ​@@DivorcedStates The point is that if you're using a 4GB VRAM card, the extra VRAM demanded gets "spilled over" into RAM, taking a game that might require 12GB of RAM to requiring 17GB of RAM. If you only have 16GB of RAM, this spills over into swap/pagefile which will tank performance _exponentially_ more. Very few people using a 4GB VRAM card will be running with 32GB of RAM, which makes that 16GB RAM threshold so significant.

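      A rough way to picture the spillover @gctypo2838 describes, as a minimal sketch (all figures below are illustrative examples, not measured data):

      ```python
      # Hypothetical illustration of VRAM spillover into system RAM and then the pagefile.
      def memory_pressure(vram_needed_gb, vram_capacity_gb, base_ram_gb, installed_ram_gb):
          spill = max(0.0, vram_needed_gb - vram_capacity_gb)     # data that no longer fits in VRAM
          ram_needed = base_ram_gb + spill                        # the game's RAM footprint grows by the spill
          pagefile_gb = max(0.0, ram_needed - installed_ram_gb)   # anything beyond installed RAM hits the pagefile
          return ram_needed, pagefile_gb

      # Example: a game wanting ~10GB of VRAM-class data and ~11GB of ordinary RAM, with 16GB installed.
      print(memory_pressure(10, 4, 11, 16))   # 4GB card -> needs ~17GB of RAM, ~1GB spills into the pagefile
      print(memory_pressure(10, 8, 11, 16))   # 8GB card -> needs ~13GB of RAM, stays within 16GB
      ```
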
  • @Pandemonium088
    @Pandemonium088 7 місяців тому +106

    Next up, 24GB of VRAM and the Unreal 5 engine still has traversal stutter 😅

    • @tenow
      @tenow 7 місяців тому +19

      recent UE5 titles lack good VRAM settings. Robocop for example always tries to fit into 8 Gb VRAM and as a result always has texture pop-in. Saw it in some other recent titles as well.

    • @residentCJ
      @residentCJ 7 місяців тому +4

      @@tenow I thought Nanite was the holy grail in UE5 to eliminate texture pop-in.

    • @ROFLPirate_x
      @ROFLPirate_x 7 місяців тому +17

      @@residentCJ Only if you have enough VRAM to use it. Nanite doesn't stop texture pop-in, it better adjusts LOD over distance, i.e. the closer you get the more detail it renders. If you go past the VRAM buffer, you are still gonna have pop-in, as your system is forced to use system RAM, which is much slower for the GPU to access. Also, devs have the ability to not use Nanite, it is quite resource intensive. So not every dev is gonna implement it.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 7 місяців тому +1

      Because not even 24GB is enough for an 80-series card to use its power properly. I'm having trouble playing even Beat Saber on my 3080 with an Index.

    • @TheCgOrion
      @TheCgOrion 7 місяців тому

      No kidding. I have NVMe, X3D, 96GB of RAM (for other work), and 24GB of VRAM, and they still manage not to take advantage of that scenario. 😂

  • @9r33ks
    @9r33ks 7 місяців тому +28

    Yes, textures make all the difference. Back in the day with my GTX 1080 I'd always try to pump textures up as high as I could, while lowering everything else as much as reasonably practicable. Textures make all the difference.

    • @viktorianas
      @viktorianas 7 місяців тому +3

      Exactly, I put everything on low and then set high/ultra textures; next goes lighting, effects, then shadows, draw distance and other bells and whistles, with ray tracing at the very bottom of the priority list.

    • @9r33ks
      @9r33ks 7 місяців тому +9

      @@viktorianas indeed. I hate when developers try "optimising" and "balancing" graphical presets. Devs always put textures resolution and details way too low on their presets, making the game look like ass, while it could look so much better with some useless particle effects disabled or lowered, and decent textures raised to reasonable level. Vram capacity and its importance really opened my eyes on the subject. I won't let nvidia and AMD mislead me with "great deals" of gpus having reasonable price but low vram.

    • @Rapunzel879
      @Rapunzel879 7 місяців тому

      That's incorrect. Resolution makes the biggest difference, followed by textures.

    • @defeqel6537
      @defeqel6537 7 місяців тому

      @@Rapunzel879 nah, I often rather crank up shadows than resolution

    • @viktorianas
      @viktorianas 7 місяців тому

      @@Rapunzel879 lol, resolution is obviously #1, yet it mostly depends on your monitor, so this setting is pretty much determined by its native resolution.

  • @DJackson747
    @DJackson747 7 місяців тому +190

    I can't imagine getting any less than 12GB these days. Granted, most of what I play are older games or low-spec games, but it still lets me play newer software at med/high settings comfortably. A 4GB card should be like $99 USD at this point as it's gonna age like milk.

    • @erikbritz8095
      @erikbritz8095 7 місяців тому +13

      should be $75 max for a 4gig gpu in 2024

    • @ricky4673
      @ricky4673 7 місяців тому +9

      For me, I cannot do without at least 16. I prefer 24 to 32. Next year, I want 64.

    • @lorsch.
      @lorsch. 7 місяців тому

      @@ricky4673 32GB would be quite nice, really. A little more future proof than 24GB

    • @mitsuhh
      @mitsuhh 7 місяців тому

      you want 64gb vram next year ?@@ricky4673

    • @MrSeb-S
      @MrSeb-S 7 місяців тому +18

      Only 128gb vram

  • @Code_String
    @Code_String 7 місяців тому +32

    It sucks that some nice GPU chips will be held back by the VRAM buffer. The 12GB of my 6800M has been surprisingly more useful than I thought it would be.

    • @roqeyt3566
      @roqeyt3566 7 місяців тому +5

      On mobile, that card was awesome for the generation.
      There wasn't a lot of VRAM to go around in affordable laptops, which the 6800M fixed.

    • @timalk2097
      @timalk2097 6 місяців тому +2

      @@roqeyt3566 bruh you talk as if it's a decade old gpu, it's still by far one of the best options for laptop gpus atm (especially if you're on a budget)

  • @CharcharoExplorer
    @CharcharoExplorer 7 місяців тому +41

    The R9 390 was not pointless. Modders had a blast with texture mods and LOD/Model mods with it.

    • @pumcia718
      @pumcia718 6 місяців тому +1

      Oh I had one of those from Sapphire. The thing didn't care what game I threw at it at the time.
      Eventually it started thermal throttling and some other thing that I couldn't fix. In early 2019 I got the 16GB Radeon VII to replace it, dude's still going strong.

    • @infinitycovuk699
      @infinitycovuk699 6 місяців тому +1

      Had the Sapphire 390 Nitro, it was a beast of a card for the price.

  • @theslimeylimey
    @theslimeylimey 7 місяців тому +251

    Sitting here still happy with my "ancient" 1080 Ti with 11GB of RAM and "only" 484 GB/s memory bandwidth.

    • @N.K.---
      @N.K.--- 7 місяців тому +21

      Cmon bruh that's pretty legendary for 1080p and works decent on 1440p on any game except for games with horrible optimization

    • @shagohodds
      @shagohodds 7 місяців тому +19

      That's BS. I own a 1080 Ti, a 3080 and a 4080; the 1080 Ti is pretty much dead in the water for most recent games at 1080p and certainly at 1440p @@N.K.---

    • @KnightofAges
      @KnightofAges 7 місяців тому +81

      @@shagohodds You're coping hard for the fact you spend tons of cash on GPUs every gen. Except for Ray and Path Tracing, the 1080ti runs pretty much everything at 50-60fps at 1440p, and is much faster at 1080p. Even in Alan Wake 2, the one game the 1080ti could not run above 30fps due to the mesh shader technology, they are going to put out a patch that will allow it to run the game at around 50-60fps. Now, you're free to spend thousands on a GPU every gen for small gains, but don't try to gaslight owners of the 1080ti, who know very well the resolutions and settings they're gaming at, as well as the fps they get.

    • @Phil_529
      @Phil_529 7 місяців тому +22

      @@shagohodds Not really. 1080 Ti is pretty much PS5/Series X power but lacks proper DX12 support. It's still a fine 1440p card if you're using upscaling. Avatar medium settings with FSR quality mode averages 56fps.

    • @kosmosyche
      @kosmosyche 7 місяців тому +6

      @@Phil_529 Well, depends on what games you play really. And it's not mainly a VRAM-related problem; rather, in general it's getting really, really old for modern games. I had a GTX 1080 Ti for many, many years (too many, because of the crypto craze and my refusal to buy a new card at idiotic prices) and I'll be honest, I'll take an RTX 3070 8GB over it any day of the week and twice on Sunday, just because it works substantially better with most modern DX12 games, despite the lower amount of memory.

  • @Владимир-я1з5ш
    @Владимир-я1з5ш 7 місяців тому +7

    Great job with the testing, Steve, thanks for the update on the VRAM matter.

  • @xkxk7539
    @xkxk7539 7 місяців тому +31

    Some notable past examples are the 980 Ti (6GB) vs Fury X (4GB) and the 780 Ti (3GB) vs 290X (4GB). VRAM helped push out more performance for those cards.

    • @Pixel_FX
      @Pixel_FX 7 місяців тому +4

      The 390X came with 8GB and released the same month as the 980 Ti. Typical Nvidia planned obsolescence.

    • @yasu_red
      @yasu_red 7 місяців тому +1

      @@Pixel_FX That was also at a good price on a 512-bit bus. Those were better days.

    • @PhAyzoN
      @PhAyzoN 7 місяців тому +1

      @@Pixel_FX I loved my 8GB 290X (essentially identical to a 390X) but let's be real here; the 980Ti was better across the board at everything and for a far longer period of time. That extra 2GB didn't do a damn thing for the 390X.

    • @OGPatriot03
      @OGPatriot03 7 місяців тому

      @@PhAyzoN Sure, but the 980 ti was considerably newer than the Hawaii architecture. The fact that it was rebranded for so long was thanks to AMD's "fine wine" over the years.
      Could you imagine if the 290x had that later spec performance all the way back in 2013? It was purely a software holdup all that time.

    • @defeqel6537
      @defeqel6537 7 місяців тому +1

      @@PhAyzoN GTX 980 Ti was also about 50% / $200 more expensive

  • @gamingoptimized
    @gamingoptimized 7 місяців тому +210

    Well, I know I made a mistake when I didn't spend 50 dollars more for a GTX 1060 6GB... (I have the 3GB one...)

    • @Radek494
      @Radek494 7 місяців тому +48

      Ouch

    • @kingplunger6033
      @kingplunger6033 7 місяців тому +59

      yeah, that is like one of the worst gpus ever

    • @GloriousCoffeeBun
      @GloriousCoffeeBun 7 місяців тому +45

      I bought a 3070 over a 6800 just for the CUDA cores.
      Guess I'm next 🫠

    • @baophantom6469
      @baophantom6469 7 місяців тому +6

      Bruh ..

    • @PandaMoniumHUN
      @PandaMoniumHUN 7 місяців тому +19

      @@kingplunger6033 Hard disagree, it was a great GPU for its time in 2016. Sure, even back then it made more sense to buy the 6GB model, but I was using the 3GB variant of that card between 2016-2019 without any issues. Every game having 4K and 8K textures only started a few years ago; that's when having way more VRAM started to become necessary.

  • @TheIndulgers
    @TheIndulgers 7 місяців тому +10

    I don't know why people defend Nvidia (a trillion-dollar company btw) for skimping on VRAM.
    This same company gave you 8GB for $379 EIGHT years ago with the GTX 1070. People out here coping over their $800 4070 Ti purchase.
    50% more VRAM for over double the price 3 generations later doesn't sound like progress to me.

  • @807800
    @807800 7 місяців тому +489

    Any GPU over $300 should have at least 16GB now.

    • @TTM1895
      @TTM1895 7 місяців тому +9

      ikr?

    • @Radek494
      @Radek494 7 місяців тому +72

      The 4060 Ti 16GB should be $350, the 8GB version should not exist, and the 4060 8GB should be $250.

    • @MasoMathiou
      @MasoMathiou 7 місяців тому +66

      12GB is acceptable in my opinion, but definitely 16GB above $400.

    • @N.K.---
      @N.K.--- 7 місяців тому +26

      Even 12gb will make sense

    • @Eleganttf2
      @Eleganttf2 7 місяців тому +40

      lol stop being DELUSIONAL wanting 16GB of VRAM on a measly $300 card, and besides, why would they need to put a massive 16GB of VRAM on it? Just take a look at how the RX 7600 XT, Arc A770 16GB or 4060 Ti 16GB do, lol, especially at just $300 where I don't expect the GPU to perform well even at 1080p. Why would they wanna put 16GB? For "future proofing" BS? You do realize that if your GPU is getting weaker, especially as it's getting older, having more VRAM won't help, right? 😂 What we need is the right VRAM for the right PERFORMANCE SEGMENT, but for a $350-400 GPU I would agree that it needs 12GB at a bare minimum.

  • @Kiyuja
    @Kiyuja 7 місяців тому +14

    I think what many people don't realize is that modern games don't just crash with insufficient VRAM but rather dynamically lower assets in order to prevent crashes or stutter. This doesn't mean you don't "need" more. Especially these days where DLSS and RT are more and more important, these techniques store temporal data in VRAM, and this scales with your resolution. I consider 12GB to be low end these days.

  • @Kelekona_808
    @Kelekona_808 7 місяців тому +1

    Highlighting the visual differences in the Vram images was very helpful for me. Usually I fail to see the differences between cards when going over visual comparisons.

  • @Jamelele
    @Jamelele 7 місяців тому +29

    I have a 3080 10GB, great thing they released a 3080 12GB

    • @Phil_529
      @Phil_529 7 місяців тому +39

      10GB was a terrible choice by NVIDIA. They marketed that as a "flagship" but it had less VRAM than the 11GB 1080Ti from years earlier for the same $700.

    • @mertince2310
      @mertince2310 7 місяців тому +9

      @@Phil_529 It's a great choice for Nvidia*, bad for users. If they can get away with less VRAM and people even defend this choice, why would they put more?

    • @Phil_529
      @Phil_529 7 місяців тому +4

      @@mertince2310 Well clearly they didn't have to. Ampere was peak crypto, and the day I got my 10GB for $700 I could've sold it for $1500, followed by it being sold out for almost 2 years. To be fair the 10GB at 4K was mostly OK for the first 2 years, but it fell off a cliff once "next gen" games started to come out. The Dead Space remake and RE4 were the final straw for me and I upgraded to a 4090.
      Pretty sure the lack of VRAM is directly related to NVIDIA wanting to push AI users into more expensive GPUs. At least now they offer a 16GB model for $450, so next generation shouldn't be so outrageous. These overpriced mid-range cards shouldn't be limited to 12GB while asking $600+.

    • @BitZapple
      @BitZapple 7 місяців тому +5

      @@Phil_529 Even with 10GB I personally never actually ran into problems, even playing games like Hogwarts Legacy at 1440p. I guess I just didn't reach that area yet, but even if I did I can always go from Ultra textures to High (which looks the same), problem solved. There's also even a mod that reduces VRAM usage.

    • @Phil_529
      @Phil_529 7 місяців тому +3

      @@BitZapple Hogwarts Legacy ate VRAM like no one's business at launch if you were using ray tracing (14.2GB at 1440p). That was another game that choked up my 10GB card. I also had problems with Need for Speed Unbound (easy fix, just go from ultra reflections to high and it's fine).

  • @Rexter2k
    @Rexter2k 7 місяців тому +177

    Boy, we are so lucky that current generation GPUs have more VRAM than the previous gen, right guys?
    Guys?...

    • @shadowlemon69
      @shadowlemon69 7 місяців тому +30

      Double the VRAM, Double the price

    • @Azhtyhx
      @Azhtyhx 7 місяців тому +4

      I mean, this is not exactly something new. Let's take a look at the Nvidia side of things, going back to the 200 series in 2009, using data related to the '70 and '80 model cards:
      * Two generations saw a 0% increase in VRAM compared to the previous generation. The GTX 500 Series as well as the RTX 2000 Series.
      * The '70 has had an average amount of VRAM of 87.5% compared to the '80, in the same generation.
      * The '80 has had an average increase in VRAM of 38.5% compared to the previous generation.
      * The '70 has had an average increase in VRAM of 39.3% compared to the previous generation.

    • @shadowlemon69
      @shadowlemon69 7 місяців тому

      @@Azhtyhx I'm talking about the price increase that came with more VRAM though
      GTX 480 1.5GB - $500
      GTX 580 1.5GB - $500 (590 3GB - $700)
      GTX 680 2GB - $500 (690 4GB - $1,000)
      GTX 780 3GB - $650, 780 Ti 6GB - $700 (TITAN 6GB - $1,000)
      GTX 980 4GB - $550, 980 Ti 6GB - $650 (TITAN X MAXWELL 12GB - $1,000)
      GTX 1080 8GB - $600, 1080 Ti 11GB - $700
      RTX 2080 8GB - $700, 2080 Ti 11GB - $1,000 (TITAN RTX 24GB - $2,500)
      RTX 3080 10GB - $700, RTX 3080 Ti 12GB - $1,200, 3090 24GB - $1,500
      RTX 4080 16GB - $1,200, RTX 4090 24GB - $1,600
      Remember that the 10-series gave a massive jump in performance, while the 20-series increased prices with insignificant gains. The price hike that started in 2018 wasn't really justifiable, and it kept on increasing after 2 generations. So, it's something new lol.
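
      Taking just the x80-class entries from that list (launch prices and capacities as quoted above, not independently re-verified), a quick back-of-the-envelope script shows dollars per GB of VRAM falling steadily up to Pascal and then flattening out rather than continuing to drop:

      ```python
      # $/GB of VRAM for the x80-class cards, using the launch prices and capacities quoted above.
      cards = [
          ("GTX 480", 1.5, 500), ("GTX 580", 1.5, 500), ("GTX 680", 2, 500),
          ("GTX 780", 3, 650), ("GTX 980", 4, 550), ("GTX 1080", 8, 600),
          ("RTX 2080", 8, 700), ("RTX 3080", 10, 700), ("RTX 4080", 16, 1200),
      ]
      for name, vram_gb, price_usd in cards:
          print(f"{name}: {price_usd / vram_gb:.0f} $/GB")
      # GTX 480/580 ~333, GTX 680 250, GTX 780 ~217, GTX 980 ~138,
      # GTX 1080 75, RTX 2080 ~88, RTX 3080 70, RTX 4080 75
      ```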

    • @upon1772
      @upon1772 7 місяців тому +1

      ​@@shadowlemon69The more you spend, the more you save!

    • @OGPatriot03
      @OGPatriot03 7 місяців тому +2

      I've had an 8GB GPU since 2014... 8GB is absolutely out of modern spec these days; it would've been like running a 2GB GPU back in 2014, which would've been low end for sure.

  • @vulcan4d
    @vulcan4d 7 місяців тому +10

    Just add sockets on the back of the GPU to upgrade the memory and create a new industry standard of upgradable vram modules.

    • @kenshirogenjuro873
      @kenshirogenjuro873 7 місяців тому +4

      I would SO love to see this

    • @klv8037
      @klv8037 6 місяців тому +5

      Nvidia's not gonna like this one ☠️☠️🙏🙏

    • @kajmak64bit76
      @kajmak64bit76 5 місяців тому +1

      @@kenshirogenjuro873 That is actually possible.
      I saw a video of some dude testing out a modified RX 580 from China, it had like 16GB of VRAM (or was it 32?).
      And it's real... it worked and used more VRAM and everything detected the extra VRAM.
      So it's entirely possible to just add more VRAM via soldering, but it may vary from card to card.

  • @axlfrhalo
    @axlfrhalo 7 місяців тому +2

    Halfway through the vid but I love this, it gives a very good understanding of how VRAM makes a difference, exactly when textures do and don't impact performance, and just how drastic it can get.

  • @Zayran626
    @Zayran626 7 місяців тому +10

    would love to see a video like this with the 4060 Ti 8 GB / 16 GB cards

    • @vvhitevvizard_
      @vvhitevvizard_ 7 місяців тому +4

      The 4060 Ti 8GB should not exist at all, and the 4060 Ti 16GB should be priced at $350. Well, $400 at max.

    • @Rachit0904
      @Rachit0904 7 місяців тому +4

      There already is! Look at the 4060 Ti 16GB review

  • @teardowndan5364
    @teardowndan5364 7 місяців тому +14

    The difference between 4GB-to-8GB and 8GB-to-12+GB is that the baseline image quality has a much higher starting point, which makes lowering details to save VRAM and VRAM bandwidth far more palatable with 8GB when shopping in the $150-250 new range today. Anything much over $250 really should have 12GB at a bare minimum.

  • @metallurgico
    @metallurgico 7 місяців тому +10

    My 2014 980 Ti has 6GB of RAM... 10 years later games still look like Half-Life 2 and we still have 4-8GB cards. That's real innovation.

    • @chillhour6155
      @chillhour6155 7 місяців тому +5

      Yeah, these Unreal 5 games look like boring, unappealing garbage but everyone seems to have drunk the Kool-Aid; they REALLY must've liked that Matrix demo.

    • @mitsuhh
      @mitsuhh 7 місяців тому

      Robocop is fun@@chillhour6155

    • @TheBoostedDoge
      @TheBoostedDoge 6 місяців тому +7

      They actually look worse, because devs rely more and more on upscalers instead of actually optimizing their games.

    • @Grandmaster-Kush
      @Grandmaster-Kush 6 місяців тому +1

      Upscaling crutch + TAA + poor optimization + high development cost + lack of interest in their work / homogenization of coding due to thousands and thousands of programmers and developers being churned out in "bideogame skewls", and you have uninspired, low-risk modern AA and AAA games as a result.

    • @metallurgico
      @metallurgico 6 місяців тому +2

      @@Grandmaster-Kush that's why I started studying bass guitar instead of gaming.

  • @MrSmitheroons
    @MrSmitheroons 7 місяців тому

    I had been meaning to do some testing like this myself but found it too daunting. You deserve massive props for doing this, and doing the subject justice. The conclusion section at the end going over the timeline of "how we got here", along with the context of the multi-preset, multi-texture-quality and visuals comparisons, it is just *chef's kiss*. So rounded and complete.
    The only way to give more context would be to show more "negative results" where nothing interesting happened, to contextualize which games it doesn't matter in. But I imagine in 2024 this is getting to be not that many recent games, for one thing, and would arguably bloat an already long video at half an hour.
    But this was just a really great video, and I really do agree it adds good context for this moment we're in, trying to see how well 8GB will do not just today (we already see it *start* to struggle in certain games), but in a few years down when 12-16+GB cards will be more normalized and devs will want to throw more "bang" into the higher presets to really give gamers something nice to look at in those presets.
    As you've shown it's not just about visuals, not just about performance, but often both. It either makes no difference, or you're starting to trade quality for performance despite the compute part of the card (actual GPU chip) being fast enough. The question is when it's going to be more relevant, but you make a strong case that it's "starting basically now," and that "the transition will probably be pretty complete within a year or three" to where 8GB will be felt badly as a limit.
    I know games will tend to be *launchable* with 8GB VRAM, but poorly optimized ports and AAA launches are still a thing, for those that jump on a title on day one (or worse, pre-order)... and you're still going to be leaving some performance *and* visuals or both on the table, that the GPU could have handled otherwise.
    I think it's high time we understood NVidia and AMD are cheaping out if they give less than 12GB on a multi-hundred-dollar piece of hardware when it's not costing them nearly as much as they're upselling us for 12GB+ VRAM. I don't consider intel to be a trend-setter in this area, but at least they have leaned into their VRAM performance a lot of the time, so I don't suppose they're too egregious, but I do hope they're listening as well.
    Sorry for the long post, this has been a topic on my mind for some time now. Thanks much again, for all the testing and well considered conclusions. Cheers to you and the team!

  • @mopanda81
    @mopanda81 6 місяців тому +2

    Once again doing tests on lower spec hardware at 1080p gives us a lot of perspective that doesn't exist with new card reviews and ranked framerate lists. Thanks so much for this work since it really fleshes out the whole picture.

  • @menohomo7716
    @menohomo7716 7 місяців тому +21

    "no you don't understand you NEED to preload 8 gigs of texture in dedicated memory" -the devs probably

    • @Splarkszter
      @Splarkszter 7 місяців тому +1

      To be fair, textures do the majority of the job on making a game look good.
      Yes 4K(or more) textures are absolutely dumb. But oh well.

    • @sush7117
      @sush7117 7 місяців тому +8

      well, yes. You can see what happens if textures are loaded in RAM instead of VRAM in this video

    • @Splarkszter
      @Splarkszter 7 місяців тому +3

      @@noir-13 Look up the storage space difference. It may seem like a small number but the growth is exponential.
      More resolution doesn't necessarily yield higher detail.
      It's an incredibly widespread issue.
      The whole dev market is filled with people that don't know what a 'job well done' is or even means.
      2K is fine I guess, 1K still is. 4K or more is just unnecessary because of the exponential storage space consumption (which also applies to texture loading speed, world loading speed and VRAM consumption).

    • @DarkNia64
      @DarkNia64 7 місяців тому +1

      Sounds like someone doesn't care to do their own research ​@noir-13

    • @mikehawk6918
      @mikehawk6918 7 місяців тому

      @@noir-13 Sounds like you're 12 years old. Also seems like it judging by your profile picture.

  • @TheGarageGamer86
    @TheGarageGamer86 2 місяці тому

    Not having a go at you guys here, I love the content!! But I noticed with the Jedi Survivor benchmark that the VRAM usage decreased on the 8GB card when enabling the Epic textures, which seemed a little weird. But otherwise, thank you for the video! I'm currently on the market for a gaming laptop and this video has helped me a lot! Cheers.

    • @Hardwareunboxed
      @Hardwareunboxed  2 місяці тому

      Do you mean the RAM usage?

    • @TheGarageGamer86
      @TheGarageGamer86 2 місяці тому

      Definitely VRAM, on Low Textures it was 7.3GB, but on Epic Textures it was 5.9GB, I was thinking maybe it was supposed to be the other way around?

  • @adink6486
    @adink6486 7 місяців тому +9

    How is it so hard for the fanboys to understand? Obviously you want to max out most of the graphical settings when you spend $600+ on a GPU. That's the whole point. We should never need to worry about running out of VRAM if the company gives us enough in the first place. Why would I want to lower my graphical settings just so I have enough VRAM to run the game? We want NVIDIA to treat us better. That's it.

    • @innie721
      @innie721 7 місяців тому +1

      I want AMD to treat us better too. Imagine buying their highest end offering like the 7900 XTX only to get 25fps at 1440p with RT on medium in Alan Wake 2, an AAA title and one of the best titles of 2023. Scammed.

    • @defeqel6537
      @defeqel6537 7 місяців тому +1

      @@innie721 AMD is irrelevant in the market, and memory subsystems are a "solved" problem: it would be trivial for nVidia to increase VRAM amount. Heck, with the 3080 they had to put in work to cut the VRAM amount from 12 to 10 GB, by disabling parts of the chip.

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 6 місяців тому

      @@innie721 Again, it isn't raytracing, it's raytraced lighting and shadows. Nvidia may as well call it RTLS. Were it true raytracing the game would be unplayable. It's bad enough that "RT" has as bad a performance hit as it does on Nvidia, let alone AMD.

    • @Shockload
      @Shockload 3 місяці тому +2

      @@innie721 Bro, no one is buying AMD for RT, what are you on about? People buy AMD because of rasterization performance per dollar and the VRAM.

  • @Celis.C
    @Celis.C 7 місяців тому +1

    Interesting video idea: statements/stances you've made in the past that you might reconsider now, as well as the why of it (new insights, new developments, etc).

  • @ivaniliev2272
    @ivaniliev2272 6 місяців тому +4

    I will tell you why. Because game developers are too lazy to optimize their games. VRAM should not affect the framerate much, only draw distance, texture resolution and mesh LODs (level of detail). If the game is overfilling the GPU memory, it means that someone, somewhere did a really bad job, and I am saying this as a game developer.

  • @ravanlike
    @ravanlike 7 місяців тому +4

    Speaking of VRAM, yesterday a Polish tech YouTuber compared a 3070 8GB vs a 3070 16GB (modded, all memory dies were replaced). The findings were very similar to yours (when you compared the 3070 8GB with a professional GPU that is in reality a 3070 chip with 16GB). 1% low fps was higher with more VRAM, especially when running games with RT enabled.
    ua-cam.com/video/9mWJb25pU8A/v-deo.htmlsi=er0AK11pJyAEeMNC&t=172

  • @MrHamof
    @MrHamof 7 місяців тому +11

    So what I'm getting from this is that the 6500 XT should always have been 8GB and would have been much better received if it was.

    • @Hardwareunboxed
      @Hardwareunboxed  7 місяців тому +11

      Yes

    • @TrueThanny
      @TrueThanny 7 місяців тому +5

      Yes and no. Yes, it would have been better with 8GB. No, it would not have been better received, because it was a pandemic card, created explicitly to reach the $200 point in a time of drastic supply line disruption. Doubling the VRAM would have notably increased its price at the time, and it would still have received negative reviews on that basis, even from HUB, which ignored the effect of the pandemic on pricing of all cards for some bizarre reason. Specifically, they compared AMD's MSRP values for cards released in the midst of the pandemic, which accounted for supply line disruption, to nVidia's fictional MSRP values for cards released before the supply line disruption.
      The 6500 XT was only ever supposed to be a stop-gap measure that allowed people to get a functional graphics card for $200, when used versions of much slower cards were selling for a lot more.
      AMD should have at the very least ceased production of the 4GB model after supply lines cleared up.

    • @defeqel6537
      @defeqel6537 7 місяців тому

      @@TrueThanny Indeed, pandemic pricing for GDDR6 was about $15/GB (while it is around $3/GB now); an extra 4GB would have cost about $60 more, plus OEM/retail margins, which are often based on the price of the product (so about $80 more total).

  • @TheZoenGaming
    @TheZoenGaming 7 місяців тому +75

    I always laugh when someone claims that more than 8GB isn't necessary on a card just because it isn't a high-end card.
    Not only does this show that they don't understand how VRAM is used, but mods can really crank up the amount of VRAM used by a game.

    • @Dragoonoar
      @Dragoonoar 7 місяців тому +8

      Yea, but let's be real here, the only game you'd mod the shit out of until it utilizes more than 8GB of VRAM is Skyrim. There are strong arguments against 8GB of VRAM, but mods ain't it.

    • @nimrodery
      @nimrodery 7 місяців тому

      Sure, if you tweak your settings you can sometimes find a game where you can run out of VRAM on one GPU while the equivalent GPU with more RAM keeps running fine, but in most cases you'll get bad performance with the higher VRAM GPU as well. Basically lower end cards don't need as much VRAM because they won't see as much benefit. I wouldn't say "no" to extra RAM but I'm not going to spend an extra $150 or $200 on it when buying a low end GPU (not a dealbreaker). For the midrange and up everything should be over 10/12 GB.

    • @TheZoenGaming
      @TheZoenGaming 7 місяців тому +21

      @@nimrodery I also laugh when people show that they failed to read the comment they are replying to, or, for that matter, that they didn't watch the video that comment is posted to.

    • @morgueblack
      @morgueblack 7 місяців тому +27

      ​@@nimrodery"lower end cards don't need as much VRAM".....Bruh, did you even watch the video? Hardware Unboxed literally used a 6500XT (that's a lower end card, don't you know?) to disprove your point. What are you on about?

    • @nimrodery
      @nimrodery 7 місяців тому +1

      @@morgueblack No, I was talking about 8 vs 12 or 16, HUB already has a video about 8 vs 16 in the form of the 4060 ti, which shows that for the most part there's no performance gains to be had. For instance you can enable RT and ultra settings on both, and while you may run out of VRAM on one, the other GPU isn't putting out high FPS because of the settings and the fact it's a 4060 ti. No GPU should have less than 8 at this point.

  • @lencox2x296
    @lencox2x296 7 місяців тому +6

    @HUB: So the lesson to learn from the past: in 2024 one should recommend 16GB cards only, or at least 12GB as the bare minimum for mid-range GPUs.

  • @mxyellow
    @mxyellow 7 місяців тому +4

    I sold my 3070 as soon as I saw HUB's 16GB vs. 8GB VRAM video. Best decision ever.

  • @sirdetmist3204
    @sirdetmist3204 6 місяців тому +3

    No modern GPU should be releasing with less than 12GB of VRAM. All the high end should be 24+.

  • @Pamani_
    @Pamani_ 7 місяців тому +4

    20:48 I think there is an error here. The perf shown for the 4GB is at high textures while the script says it's low textures

    • @adnank4458
      @adnank4458 7 місяців тому +2

      Agree. Noticed the same error. Don't know if it's the only one.

    • @Shockload
      @Shockload 3 місяці тому

      Errors can't be avoided when doing hundreds of tests with hundreds of quality presets.

  • @happybuggy1582
    @happybuggy1582 7 місяців тому +19

    1060 3GB owners are crying

    • @highwayvigilante
      @highwayvigilante 6 місяців тому +2

      7800GTX 256MB owner here. Literally all you need

    • @kajmak64bit76
      @kajmak64bit76 5 місяців тому +2

      The 1060 3GB was a mistake and should never have been made.
      Or if it was to be made it should at least have had 4GB of VRAM, since the GTX 1050 Ti has 4GB, like wtf.

    • @Burago2k
      @Burago2k 28 днів тому

      @@kajmak64bit76 Yep, and I'm pretty sure the 3GB version of the 1060 wasn't much cheaper, and the difference is massive.

    • @kajmak64bit76
      @kajmak64bit76 27 днів тому

      @@Burago2k Literally pointless existence for the 3GB variant.
      Why couldn't they at least make it 4GB so it has the same amount as the GTX 1050 Ti lol.

  • @vasoconvict
    @vasoconvict 6 місяців тому

    These videos are so nice to listen to. Just get a good dose of tech info so I can stay updated (i haven't been in a few months) with a relaxing not-jumpy voice unlike other tech channels. Good job HU team

  • @OutOfNameIdeas2
    @OutOfNameIdeas2 7 місяців тому +4

    Nvidia was the dumbest buying decision I made in years. I bought the 10GB 3080. It lasted me 1 month before it got limited by its VRAM. 4K max settings are undoable. It's not even good enough to run Beat Saber with an Index at 120fps.

    • @lorsch.
      @lorsch. 7 місяців тому +2

      Same, replaced it with a 3090.
      For VR 24GB is definitely needed.

  • @TheHurcano
    @TheHurcano 7 місяців тому

    I really appreciate how clear this comparison illustrated the differences. Now I am really curious about performance and quality differences in similar gaming titles when going from 8GB VRAM to 12GB (or 16GB) on the next tier up in cards, especially when comparing 1080P to 1440P along with varied quality settings at both resolutions. Seems like that comparison could end up being a little less obvious on what the "correct" breakpoints of acceptability/desirability are, but might be more relevant to buyers in the $300-$400 market.

  • @toad7395
    @toad7395 7 місяців тому +4

    It's so scummy how NVIDIA refuses to give their cards more VRAM (and if they do, it is absurdly overpriced).

  • @Ariane-Bouchard
    @Ariane-Bouchard 7 місяців тому

    One thing I wish you'd done more of is zoomed in/slowed down scenes for visual comparisons. Most visual differences I wasn't able to see on my phone, even though I tried zooming in manually. Making things bigger, focusing on the important details, would've probably been a relatively simple way to get around that issue.

  • @Jomenaa
    @Jomenaa 7 місяців тому +3

    My GTX 1070 Ti is still going strong, OC'd to GTX 1080 levels of performance, with those sweet 8GB of GDDR5 :)

  • @lake5044
    @lake5044 7 місяців тому +5

    One important thing to keep in mind is no matter which memory we're talking about (RAM, VRAM, etc), utilizing it close to its limits (say above 80%) will always have some extra overhead. The details can get very complicated (and I don't know them all) but it depends on many things like the controller and the algorithm used to allocate and retrieve the data. But basically, the issue boils down to having to move data around (either to consolidate unused space or to avoid random reads across different module subdivisions for performance reasons, etc). Those techniques sacrifice speed to be able to use as much as possible from the empty space, but the ideal for performance is to have so much empty space that you don't trigger those techniques that move data around.

    • @Citizenflaba
      @Citizenflaba 7 місяців тому +2

      4090 purchase vindicated

  • @hasnihossainsami8375
    @hasnihossainsami8375 7 місяців тому +3

    I think the biggest takeaway here is that using higher quality textures almost never hurts performance, and so GPUs with insufficient VRAM should be avoided like the goddamn plague. 4GB cards are absolutely dead at this point, 6GB doesn't have much longer left, and 8GB only falls short in some newer games and some edge cases in others. Buying an older 8GB card at discount/second-hand prices still makes sense, imo. But newer GPUs? Considering the vast majority don't upgrade for at least 3 years, they aren't worth it. 10GB is the minimum for a new GPU.

  • @alpha007org
    @alpha007org 7 місяців тому +2

    Thank you for not being upset, Steve. Mental stability is equally important as frametime stability.

  • @gregorsamsa555
    @gregorsamsa555 7 місяців тому +3

    I guess the 4GB RX 580 performs worse than the 8GB RX 570 in the latest modern games...

  • @itsyaboitrav5348
    @itsyaboitrav5348 7 місяців тому +3

    You should compare running GPUs on PCIe gen 3 vs gen 4 next, looking at whether people on gen 3 boards are being held back on their gen 4 cards.

  • @emmata98
    @emmata98 7 місяців тому +9

    It was really worth it to go for the 8GB version of the R9 290X instead of the 4GB version.

    • @hrayz
      @hrayz 6 місяців тому

      When my R9 290X-4GB card died I was able to get the RX 580-8GB card. That 8GB has allowed the card to survive to this day (good for 1080p gaming.)
      My friends and roommate are still using theirs, although I moved on to the RX 6900XT-16GB for 4k use.

  • @EmblemParade
    @EmblemParade 7 місяців тому +1

    To be honest, this increasing appetite for VRAM happened faster than most of us expected. Some devs were signaling that this would happen for a while, but we assumed that the baseline represented by PS5 and Xbox Series X would limit requirements for PC, too. The takeaway is that we can't assume that anymore. Devs are targeting a higher resource profile for PC gaming than for consoles, period.

  • @pivorsc
    @pivorsc 7 місяців тому +3

    Doesn't more VRAM prevent random stuttering? When I play CoD my GPU uses around 14GB of VRAM, which I believe is a data preload to prevent loading from the drive.

    • @imo098765
      @imo098765 7 місяців тому

      It's not so much that VRAM reduces random stuttering as that running out of VRAM introduces stuttering. CoD just asks for everything possible and you won't see a difference on a 12GB card; it's just an "in case we need it" thing.

    • @MrSeb-S
      @MrSeb-S 7 місяців тому

      Definitely! On 24GB GPUs everything works smoothly.

    • @andersjjensen
      @andersjjensen 7 місяців тому

      Very simplistically you can say "There are two amounts of VRAM: Enough and Not Enough". Having more than enough doesn't help with anything. Having less than enough hits 1% lows the hardest and the AVG progressively harder the bigger deficit you have.
      Until the whole Direct Storage thing is rolled out the system RAM is still being used as a staging area before dispatch to VRAM, rather than the VRAM being used as a buffer for things that are still (far) out of view. This means that game stutters can also occur if you're low on system RAM, despite having enough VRAM.

    • @andersjjensen
      @andersjjensen 7 місяців тому

      @@MrSeb-S Not if you chuck them in a system with 8GB of system RAM.

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 6 місяців тому +2

    Meanwhile in laptops, a 4070 costs $250-$300 MORE than a 4060 and both come with 8GB of memory.
    I remember buying an RX 480 with 8GB for just $250 over 7 years ago. Even back then we were saying that 4GB wasn't enough for 1440p.

  • @Azureskies01
    @Azureskies01 7 місяців тому +2

    Everyone with a 3070 or 3080 is now on suicide watch.

    • @Nurse_Xochitl
      @Nurse_Xochitl 6 місяців тому +3

      I still use a card with 4GB of VRAM.
      I'm pissed, not so much at NVIDIA for not including more VRAM (although I'm still pissed at them for different reasons, I use Linux and they have no open source drivers for the GTX series of cards)... but at the gaming industry as a whole.
      The only thing the gaming industry optimizes is monetization, not the games themselves. A prime example of this is "EOMM", a way to rig matchmaking to increase "engagement" which basically refers to how much people play and pay.
      Modern games do NOT use SBMM!
      SBMM is actually a good thing (game companies hate it because they can't milk players as well, and content creators/streamers do NOT tell the truth about it (along with toxic no-life "veteran" players) because they hate it that they can't "pubstomp" new players when SBMM is properly implemented.
      Content Creators/Streamer BTW are often paid shills, so when EOMM is brought up, they often act skeptical at best, or tell lies about it (and refer to it falsely as SBMM) likely to cover up the game companies' ass... because otherwise they could lose their partnership, get hit with lawsuits and DMCAs, etc.
      Combine that with grinding/progression/unlocks (and of course, "FOMO" limited-time events) and every player who doesn't spend a bunch of time grinding or spending money on "Pay To Progress" (which is Pay To Win) crap like boosters will always fall behind and be at a disadvantage, and will generally have a worse time gaming.
      Engagement Optimized Match Making:
      web.cs.ucla.edu/~yzsun/papers/WWW17Chen_EOMM)
      On top of that, I'd imagine there's probably some sketchy deals going on behind closed doors with hardware manufacturers and game companies.
      Perhaps game companies get new high-end hardware at a huge discount and/or are incentivized to "optimize" their games to run "well" only on the latest, highest end hardware (while not giving any older/weaker hardware any real optimization). (Optimized and well are in quotes because they don't mean to actually optimize the game and have it run well... just barely playable on only the newest, most powerful shit... so they can sell more hardware upgrades.)
      I would not be surprised if there was this much corruption in the gaming industry, as I have seen a lot of it personally as a gamer (and BTW, as a nurse - I can say the healthcare industry is also very corrupt with big government and big pharma lobbying) I'd imagine it's a somewhat similar deal here, as I follow the game industry somewhat closely. Heck, even the Australian government is defending microtransactions via state-owned media.
      In Defense of Microtransactions | THE LOOT DROP
      ABC Gamer
      ua-cam.com/video/rUnO-njvVZA/v-deo.htmlfeature=shared
      TBH, we really don't need more than 4GB of VRAM... if game companies would just optimize stuff.
      People could just say to buy a better card, but then it's only a matter of time before that card also becomes useless... which generates more and more e-waste. Not everyone can afford to do that anyway either.
      There has to be a point where people put their foot down and crack down on bad optimization.
      People need to stop buying new hardware, especially higher end hardware... and use stuff longer.
      They also need to stop supporting bad games (online-only/DRM'd games, games full of Grinding/P2W/Gambling, games without self hosted server support, games without modding support, etc.)
      Only then will we see a change in the industry.

  • @Starkiller12481
    @Starkiller12481 6 місяців тому

    Great video! Been looking for guidance on VRAM/RAM behavior under different texture loads; this was very informative.
    Thanks gentlemen 🙏🏾

  • @coganjr
    @coganjr 6 місяців тому +4

    I love how this community is always complaining about needing more VRAM while game developers can freely get away with releasing unoptimized games.
    What an awesome community LOL

    • @Hardwareunboxed
      @Hardwareunboxed  6 місяців тому +4

      Don't see how the two are connected, but okay. It would be very odd to expect modern games to work well on old 4GB graphics cards right?

    • @coganjr
      @coganjr 6 місяців тому

      @@Hardwareunboxed Yes, modern games will not run well on 4GB or even 8GB of VRAM.
      What I mean is, look at the recent unoptimized games that need a lot of VRAM but don't have the graphical fidelity to match, like Alan Wake 2 and Cyberpunk 2077.

    • @Hardwareunboxed
      @Hardwareunboxed  6 місяців тому +2

      Alan Wake 2 is pretty heavy on VRAM, especially if you use RT, while CP2077 has fairly poor texture quality.

  • @Petch85
    @Petch85 7 місяців тому +1

    Always check whether you can run the game with the highest texture settings. Sometimes you get a lot of quality for no performance loss at all, or very little performance loss.

  • @GENKI_INU
    @GENKI_INU 7 місяців тому +2

    *Cries in buyer's remorse after buying a brand-new "RTX" gaming laptop with 6 GB of VRAM*

  • @_Azurael_
    @_Azurael_ 5 місяців тому +2

    To be honest, I am just getting into all this VRAM importance thing now because I am buying a new PC.
    My old PC with a GTX 1070 OC (8GB) is still able to run most games at 2K... I played Elden Ring last year at 50~60fps. I then started BG3, and that game forced me to lower graphics quality to medium/low, but it still runs at 50~60fps.
    Because it's getting to the point where I have to lower graphics below Medium, I am upgrading, but I don't know if the problem is VRAM or simply raw power.
    The only time I see VRAM completely used is in very unoptimized games.

  • @owlmostdead9492
    @owlmostdead9492 7 місяців тому +5

    Same goes for RAM, I believe it's a crime against nature (as in creating e-waste) to pair a modern CPU with less than 16GB of RAM. More so if it's soldered and not upgradeable.

  • @ShaunRoselt
    @ShaunRoselt 7 місяців тому +1

    I'd love to see this, but at 4K. It's really interesting to see RAM usage.

  • @realizewave1513
    @realizewave1513 7 місяців тому +1

    I'd like to see testing at native 4K between 16GB, 20GB, and 24GB VRAM cards. I suspect that most games today don't really use much over 16GB of VRAM even when using ultra 4K textures.

  • @Matthias-f2j
    @Matthias-f2j Місяць тому +1

    Aside from the fact that the 4GB RX 6500 XT (my second graphics card, alongside my main card, a GTX 1070) is relatively limited, you have to keep in mind that this little card sometimes delivers incredible performance relative to its technical limitations. Absolutely equivalent to my GTX 1070 in 90% of all my games.

  • @KimBoKastekniv47
    @KimBoKastekniv47 7 місяців тому

    You're the first channel I go to for day-1 reviews, but these "investigation / for science" videos are the cherry on top.

  • @aaron_333
    @aaron_333 7 місяців тому +1

    Very nice! I wonder if you can do one which compares 10GB, 12GB and 16GB?

  • @gwagnsso
    @gwagnsso 7 місяців тому +1

    Hitting a VRAM bottleneck is not a subtle experience; you REALLY notice it when playing a game.
    It's functionally the same as when an X3D chip powers through a workload where a standard CPU struggles: the operation needed more local memory (cache in that case, VRAM in this one) to complete the necessary steps in one go (sketch below).
    All else being equal, it's better to have more than you need than not enough.
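
    A minimal sketch of that local-memory point, assuming purely illustrative latency numbers: average access cost climbs quickly once even a small share of accesses misses the fast local tier and has to go out to slower memory.

        # Minimal sketch: average access cost rises sharply once part of the
        # working set spills out of the fast local tier (cache or VRAM).
        # The latency figures are illustrative assumptions, not measurements.
        def average_access_ns(hit_rate, fast_ns, slow_ns):
            """Average access time for a given hit rate in the fast tier."""
            return hit_rate * fast_ns + (1.0 - hit_rate) * slow_ns

        FAST_NS = 1.0    # assumed: access served from the fast local tier
        SLOW_NS = 100.0  # assumed: trip out to the slower tier over the bus

        for hit_rate in (1.00, 0.99, 0.95, 0.80):
            avg = average_access_ns(hit_rate, FAST_NS, SLOW_NS)
            print(f"hit rate {hit_rate:.0%}: average access ~{avg:.1f} ns")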

  • @rcavicchijr
    @rcavicchijr 6 місяців тому +1

    It's worth noting that with a 4GB card quite a few games use right up to, or more than, 16GB of system RAM, which is much more likely to be the maximum installed in a system that has a 4GB graphics card. A rough sketch of that spill-over follows below.
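
    A back-of-the-envelope sketch of that spill-over, with every figure invented for illustration (a hypothetical game, OS overhead, and working-set sizes):

        # Rough sketch of how a VRAM shortfall spills into system RAM.
        # All figures below are illustrative assumptions, not measurements.
        def system_ram_needed_gb(game_ram_gb, gpu_working_set_gb, vram_gb, os_overhead_gb=4.0):
            """Estimate total system RAM demand when the GPU working set exceeds VRAM."""
            spill = max(0.0, gpu_working_set_gb - vram_gb)  # overflow backed by system RAM
            return game_ram_gb + spill + os_overhead_gb

        # The same hypothetical game on a 4GB card and a 16GB card:
        for vram in (4, 16):
            total = system_ram_needed_gb(game_ram_gb=10.0, gpu_working_set_gb=9.0, vram_gb=vram)
            print(f"{vram}GB card -> roughly {total:.0f}GB of system RAM in use")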

  • @jimmyjiang3413
    @jimmyjiang3413 7 місяців тому +2

    It reminds me why professional (Quadro) RTX and Radeon Pro cards use double the VRAM buffer. I am not sure whether something like the RTX 4000 SFF (Ada) is worth it for a given SFF build for VRAM-buffer reasons or simply for performance per watt. 🤷🏻‍♀️

  • @falcon6329
    @falcon6329 7 місяців тому +1

    I think it's partially because devs are not willing to invest resources into optimizing their games. Maybe if more GPUs had less VRAM, it would force devs to optimize. If every GPU released with 16GB or more, devs would see no reason to support 4GB or 8GB cards.

  • @Leelz2
    @Leelz2 7 місяців тому +2

    4GB of VRAM died in 2023; 8GB is the new mainstream.
    It's the same kind of difference as 16GB of dual-channel RAM vs 8GB single-channel. Huge...

  • @rxPACman
    @rxPACman 7 місяців тому +1

    Would've been good to hear how memory bus width and memory type (i.e. GDDR6X) play a part in all this, but good vid 👍

  • @BaggerPRO
    @BaggerPRO 6 місяців тому

    Very informative and descriptive. Thank you!

  • @Hamsandwich5805
    @Hamsandwich5805 7 місяців тому

    I really liked this video, HU! Love the work you do. You ask good questions and seek out the tough answers.
    Just wanted to add: in some of those games where RAM usage crept over 12GB, you'll be in a really tight spot with most budget builds. Assuming you opted for the 6500 XT for budgetary purposes, I'd reckon you likely also only have 8 or 16GB of system RAM. That's going to make it nearly impossible to rely on system RAM as a backup, even when your card's 8GB buffer is being fully consumed.
    Windows will still take several GB of RAM, so those games will likely be totally unplayable or run very poorly when loading or switching between apps (the challenges of alt+tab, like in the Windows XP days).
    I'd really like to see a system that matches the budget direction - maybe a 3600/5600 CPU or i3 alternative with 16GB of RAM (or maybe even 8GB) and a 6500 XT 8GB - alongside the 'ideal' scenario with the 7800X3D, to really compare how these will work for budget builds in years to come. People may not realize just how poorly the system will perform, forgetting you're testing on a 7800X3D with 32GB+ of RAM.
    For future technologies, I'm curious how DirectStorage can help users on tighter budgets take advantage of speedy, low-cost NVMe drives instead of relying on potentially slower system RAM (what a bizarre statement, considering how slow HDDs were!). It's possible most new games start rolling out DirectStorage alongside 1TB PCIe 4/5 NVMe drives that are relatively fast compared to budget GPU buffer speeds, meaning you won't need the upgrade to 16GB as quickly for 1080p gaming. The rough bandwidth comparison below puts those tiers in perspective.
    Thanks! Keep up the awesome work!
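
    A rough comparison of the storage and memory tiers mentioned above, using assumed round sequential-bandwidth figures (the exact numbers vary by drive, memory kit, and card):

        # Ballpark sequential-bandwidth figures (GB/s) for the tiers a game's
        # assets can live in. Rough, assumed round numbers for illustration only.
        tiers_gbps = {
            "SATA HDD":                  0.15,
            "SATA SSD":                  0.55,
            "PCIe 4.0 NVMe SSD":         7.0,
            "Dual-channel DDR4-3200":    51.0,
            "GDDR6 (128-bit @ 17 Gbps)": 272.0,
        }

        burst_gb = 2.0  # hypothetical burst of texture data to move
        for tier, gbps in tiers_gbps.items():
            ms = burst_gb / gbps * 1000
            print(f"{tier:<27} ~{gbps:>6.2f} GB/s -> {ms:8.1f} ms to move {burst_gb} GB")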

  • @cybernd6426
    @cybernd6426 7 місяців тому +1

    It is also interesting to me that there are a lot of games that use over 6GB on the RX 6500 XT while still running at playable fps. It just shows that the 6GB on a 3050 is really not enough.

  • @madarcturian
    @madarcturian 7 місяців тому

    We need more videos about this. Thanks so much, guys. It's sad how low on VRAM some cards are these days. I hope I live to see affordable cards with a lot of VRAM. VR and proper image upscaling really need a lot of VRAM.

  • @marksulloway5669
    @marksulloway5669 7 місяців тому +3

    Bothered by newer GPUs with narrower bus widths. My current 8GB RTX 3060 Ti has a 256-bit bus; my 12GB RTX 3060 has a 192-bit bus. Why would I want a narrower bus on most of the newer lower-end GPUs?

    • @Frozoken
      @Frozoken 7 місяців тому

      Exactly, it's awful, especially when Nvidia again didn't use GDDR6X on the 4060 Ti, making the problem even worse. 192-bit should be the minimum for anything with a "60" in the name, and 256-bit for anything with a "70". What's worse is that they're also pretty stingy with the cache that's meant to mitigate this: the 4060 only got 17Gbps memory (2Gbps more than the 3060) and had its 32MB of cache cut down to 24MB despite the mobile variant keeping all of it, and the 4060 Ti is just way too fast for 18Gbps on a 128-bit bus. It needs 192-bit, and preferably GDDR6X, when it only has 32MB of cache at most. Ironically, the narrow bus hinders the higher-capacity variants the most: they can far more easily use over 8GB of VRAM at higher resolutions, yet they perform disproportionately worse because of the bus width. The only GPU this generation whose weaker bus got properly compensated for was the 4080, which got the fastest memory at 22.4Gbps and 64MB of L2 cache, only 8MB less than the 4090 despite the 4090 having 70% more cores. (See the quick bandwidth math below.)
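
      The quick math behind those bus-width complaints, using the published bus widths and data rates as I understand them: peak bandwidth is simply bus width in bits, divided by 8, times the effective data rate (Ada's larger L2 cache offsets some of the deficit).

          # Peak memory bandwidth from bus width and effective data rate:
          #   bandwidth (GB/s) = bus_width_bits / 8 * data_rate_Gbps
          # Bus widths and data rates below are the published specs, to the
          # best of my knowledge.
          def bandwidth_gb_s(bus_bits, data_rate_gbps):
              """Peak memory bandwidth in GB/s."""
              return bus_bits / 8 * data_rate_gbps

          cards = {
              "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14),
              "RTX 3060    (192-bit, 15 Gbps GDDR6)": (192, 15),
              "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18),
              "RTX 4060    (128-bit, 17 Gbps GDDR6)": (128, 17),
          }
          for name, (bits, rate) in cards.items():
              print(f"{name}: {bandwidth_gb_s(bits, rate):.0f} GB/s")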

  • @Burago2k
    @Burago2k 6 місяців тому +2

    It's still an extreme view that 8GB isn't enough, even looking to the future, since very few games use up an 8GB card and cards with 10GB+ of VRAM are quite a lot more expensive.

  • @Hostile2430
    @Hostile2430 6 місяців тому

    2016 : We need more RAM
    2024: We need more VRAM

  • @RetrOrigin
    @RetrOrigin 7 місяців тому

    This also helps demonstrate that texture quality/resolution doesn't really affect framerates much, if at all, as long as you have enough VRAM.
    I often see people turning texture quality down even when they have a card with more than enough VRAM, thinking it will help performance when it usually doesn't (rough cost estimate below).
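
    A rough estimate of what individual textures cost, assuming standard block-compression rates and a full mip chain; the point is that the cost is almost entirely memory capacity rather than shading work, so framerate only suffers once VRAM runs out:

        # Rough VRAM cost of one texture: width * height * bytes_per_texel,
        # plus about a third for the mip chain. Compression rates are the
        # standard ones: BC1 = 0.5 B/texel, BC7 = 1 B/texel, RGBA8 = 4 B/texel.
        def texture_mib(width, height, bytes_per_texel, mips=True):
            """Approximate size of a single texture in MiB."""
            size = width * height * bytes_per_texel
            if mips:
                size *= 4 / 3  # a full mip chain adds roughly one third
            return size / (1024 ** 2)

        for res in (1024, 2048, 4096):
            print(f"{res}x{res}: BC7 ~{texture_mib(res, res, 1):.1f} MiB, "
                  f"uncompressed RGBA8 ~{texture_mib(res, res, 4):.1f} MiB")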

  • @ziokalco
    @ziokalco 7 місяців тому

    If I'm not mistaken, games such as Hogwarts Legacy dynamically adjust the actual texture quality when VRAM is saturated, so the results may be missing some of that.

  • @williamrutter3619
    @williamrutter3619 7 місяців тому +1

    Interesting video; it would be interesting to see this mixed up with PCIe 3.0 and 4.0, DDR4 and DDR5, and an RX 580 4GB and 8GB.

  • @Legitti
    @Legitti 7 місяців тому +1

    Would love to see how 8GB vs 12GB differs.

  • @solocamo3654
    @solocamo3654 7 місяців тому +1

    4GB failed me in 2018 with BFV and a 290X (I was on a 4K monitor but playing upscaled from 3200x1800). I couldn't run the textures I wanted. I swapped it for an RX 580 that was slower than my 290X (due to the OC I had on it and the memory bandwidth), yet performance was now stutter-free and the minimum-fps issues were solved. I learned this lesson in the early 2000s with a GeForce4 Ti 4200: all my friends bought the 64MB model, I bought the 128MB one. Guess which card lived an extra few years as being usable in newer games.

  • @SPPACATR
    @SPPACATR 7 місяців тому

    Dude sounds sooooo chill in this video lol.

  • @unclej3910
    @unclej3910 5 місяців тому +1

    I don't know why the VRAM discussion is such a divisive hot-button topic.

  • @mashedindaed
    @mashedindaed 7 місяців тому

    Great video, I didn't realise VRAM had such an impact on performance once it has effectively been saturated. One slight critique of the charts, especially when talking about percentage differences, is to always compare in the same direction, otherwise the numbers can be misleading (see the worked example below). For example, 15 is 50% more than 10, but 10 is 33.3% less than 15, so the direction of comparison matters a lot because the gap between 33.3% and 50% is potentially massive. Alternatively, an arrow on the charts to indicate which way you're comparing could help de-obfuscate the numbers.
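
    Working that example through, assuming nothing beyond basic arithmetic:

        # The same pair of numbers gives different percentages depending on the
        # direction of comparison, which is why charts should pick one direction
        # and stick to it.
        def pct_more(a, b):
            """How much larger a is than b, as a percentage of b."""
            return (a - b) / b * 100

        def pct_less(a, b):
            """How much smaller a is than b, as a percentage of b."""
            return (b - a) / b * 100

        print(f"15 is {pct_more(15, 10):.1f}% more than 10")   # 50.0%
        print(f"10 is {pct_less(10, 15):.1f}% less than 15")   # 33.3%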

  • @FreeWillMind
    @FreeWillMind 7 місяців тому

    Texture streaming is another place where a lack of VRAM is painful, since the texture pop-in it causes is such an obvious graphical glitch (see the sketch below for how streamers end up dropping detail).
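
    A simplified sketch of how a streaming system might cope with a VRAM budget it can't meet; the names, sizes, and priorities are invented for illustration, and real engines are far more sophisticated:

        # Toy mip-based texture streaming under a VRAM budget: when the requested
        # set doesn't fit, drop the top mip of the least important texture, which
        # is what shows up on screen as blurry textures and pop-in.
        def fit_to_budget(requests, budget_mib):
            """requests: list of (name, size_mib, priority); returns chosen sizes."""
            chosen = {name: size for name, size, _ in requests}
            by_priority = sorted(requests, key=lambda r: r[2])  # least important first
            while sum(chosen.values()) > budget_mib:
                for name, _, _ in by_priority:
                    if chosen[name] > 1.0:   # don't shrink below the smallest mip
                        chosen[name] /= 4    # dropping one mip level quarters the size
                        break
                else:
                    break                    # nothing left to drop
            return chosen

        demo = [("hero_character", 64.0, 10), ("terrain", 96.0, 5), ("distant_rocks", 64.0, 1)]
        print(fit_to_budget(demo, budget_mib=150.0))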

  • @nicktempletonable
    @nicktempletonable 7 місяців тому +1

    Loved this video Steve!

  • @timduck8506
    @timduck8506 7 місяців тому +1

    I'm so glad I bought an RTX 3080 16GB laptop version for my travels 3 years ago. Total specs are 32GB of RAM and a 5900HX CPU with 8TB of storage. 😃

  • @ImpreccablePony
    @ImpreccablePony 6 місяців тому +1

    RX580 8GB still serves me well today. Couldn't have said that if I had a 4GB version. My next upgrade will definitely be a 16GB card. No compromises here.

  • @greatwavefan397
    @greatwavefan397 7 місяців тому +2

    I'm surprised Dying Light 2 uses ~4GB of VRAM regardless of the GPU.

  • @IAmStillNotMatthew
    @IAmStillNotMatthew 6 місяців тому

    I ran a 750 Ti up until the end of 2021. I was fine with it; it played the games I liked at 60FPS. Even Destiny 2 was fine - it wasn't 60 at all times, but it was perfectly fine for me. Upgrading to a Chinese jet-turbine 2060 was a massive upgrade, but it was so loud it matched the AC in my car at full tilt, noise-wise. So I grabbed a 6700 XT and, holy mother of VRAM, I mostly used ~5GB of VRAM on my 2060, and seeing shit like 11GB of VRAM utilized was wild, especially coupled with how much better games run now.