8 GB VRAM is a Problem. Is 10 GB Any Better?

  • Published Jan 29, 2025

COMMENTS • 1.9K

  • @livedreamsg · 1 year ago · +1794

    If Intel can afford to put 16GB VRAM on a $350 card, I don't want to hear any excuses for Nvidia.

    • @BeatmasterAC · 1 year ago

      NVidia: "but...but...mUh TeNsOr CoReS...mUh DlLs...MuH FrAmE gEnErAtIoN...mUh iNfLaTioN...mUh HigHeR tSmC pRiCeS...mUh CuDa CoReS..."

    • @shanksisnoteventhatstrongbruh · 1 year ago · +291

      Agreed. I mean, Nvidia has a $350 card with at least 12GB (the 3060); the 3060 Ti, 3070, 3070 Ti and 3080 having less VRAM than the 3060 is so stupid. Ngreedia at its best.

    • @KermenTheFrog · 1 year ago · +21

      The difference is Intel was selling those cards at or near cost.

    • @sorinpopa1442 · 1 year ago · +47

      Exactly. Puck Ngreedya, they've kept the gaming industry frozen these last few years because their GPUs severely lack VRAM (the 3090 being the only exception).

    • @karlhungus545 · 1 year ago · +26

      Unfortunately Nvidia couldn't care less what you think... or anyone else on YT for that matter. They own the GPU market, and will for the foreseeable future. You don't need that VRAM anyway, unless you only play crap console ports at 4K with the 1% that have a 4K monitor 🙄😂 Buy AMD then (you won't), or better yet, have a brain and just get a console...

  • @Barryhick186 · 1 year ago · +670

    VRAM isn't the problem. Nvidia's VRAM is the problem.

    • @ForceInEvHorizon · 1 year ago · +11

      Lol more like AMD

    • @RationHans · 1 year ago · +52

      I thought it was the game devs that don't optimize xd

    • @ForceInEvHorizon · 1 year ago · +6

      @@RationHans If AMD hadn't released their FSR we wouldn't have this problem. Nvidia's DLSS isn't the problem, since it's exclusive to RTX cards, but once AMD released FSR, which is available on every card, the devs got lazy optimizing their games, since they know we can just use DLSS/FSR.

    • @V1CT1MIZED · 1 year ago

      @@ForceInEvHorizon You sound like a child.

    • @Rivexd · 1 year ago · +154

      @@ForceInEvHorizon That's like saying "if guns weren't invented, people wouldn't kill each other."

  • @LucidStrike · 1 year ago · +132

    I mean, the latest AAA games eventually become affordable Steam Sale games, and so the same problem eventually hits you even if you're not buying at launch.

    • @77wolfblade · 1 year ago · +20

      Remember, no preorders!

    • @SpinDlsc · 1 year ago · +10

      True. I also think his Steam argument only has a limited degree of validity, because if you also look at the Steam Hardware Survey and see what kind of graphics cards most people have, it's 50, 60 and 70-class cards, and a lot of those are still in the 10, 16 and 20 series. A big reason many people haven't wanted to upgrade in the last couple of years is because of the recent pricing problem in the GPU space and the current recession, so by that metric, most of those people aren't exactly going to try running any of the newer, shinier games.
      Also, if VRAM not being needed is the argument we were going to make, then we also have to ask why NVIDIA is going in so hard on marketing ray-tracing and now path-tracing to begin with when they aren't adding enough VRAM to help that feature run better on some of these cards in the long term. By the point of NVIDIA "not needing" to add more VRAM than is necessary for most users, we should also argue that they shouldn't be marketing ray-tracing to begin with.

    • @ZAGAN-OZ · 1 year ago · +2

      A lot of people bought and played Hogwarts.

    • @ZoragRingael · 1 year ago · +2

      Plus there are Steam sales.

    • @DeepfriedBeans4492 · 1 year ago · +1

      @@SpinDlsc idk, I think ray tracing is a good thing to push, but it's not there yet, so Nvidia should be taking a hit to their profit margins to keep the prices actually making sense, as opposed to doing the complete and utter opposite like they currently are.

  • @Tubes78 · 1 year ago · +605

    I remember choosing the 6800 XT 16GB over the 3080 10GB because of this.

    • @eldritchtoilets210 · 1 year ago · +95

      Same, I guess it's the "fine wine" effect starting to settle in.

    • @OffBrandChicken · 1 year ago · +47

      Same, starting to see my purchase was the correct one overall.
      While some people are saying "you don't need 8 gigs", they are also throttling their game, while I'm turning everything up to ultra.

    • @gruiadevil · 1 year ago · +72

      @@OffBrandChicken It's the same people that said "YoU dOn't nEeD a 4770K. What are you gonna do with 8 threads? An E8400, 2 cores/2 threads, can run any game."
      Meanwhile, over 10 years, they swapped 6 CPUs + mobos + RAM kits, while I held on to my i7.

    • @OffBrandChicken · 1 year ago · +19

      @@gruiadevil Same. It's not about what tech is "technically fast right now in this specific use case", but "is this going to fulfill my use cases for the next X years?"

    • @OffBrandChicken · 1 year ago · +27

      @@gruiadevil I just love how when Nvidia adds more than is needed at the time, they're seen as the almighty Jesus bestowing greatness, while when AMD does something extra that frankly is very beneficial, the Nvidia users are like, "yes, but I could get 10% more speed in games that used less than 8 gigs 10 years ago, and that's what truly matters", instead of thinking about the 20% they'll gain in the future.
      The copium is so hard that they don't even see it. Even the youtubers.

  • @madrain3941 · 1 year ago · +266

    As soon as I heard that the RTX 4060 was gonna release with 8GB of VRAM, I instantly went ahead and purchased the RX 6700 XT with 12GB, and honestly, it is a HUGE game changer, at least for me.

    • @weshouser821 · 1 year ago · +13

      What I don't understand is that we have systems with 64GB of memory, so I really don't see why we can't have a card with 32/64GB of VRAM instead of messing around with this. Why can't we just make cards with upgradable VRAM slots? I don't know... it's over my head, but I really think it's because of "planned obsolescence".

    • @LeoMajor1 · 1 year ago · +25

      @@weshouser821 Yes, it is a bit over your head, because GPU VRAM and system RAM are not the same... A card with 64GB of VRAM would be HUUUUUGE and would need heavy power draw and more cooling. It's even over my head, so someone else can add to that.

    • @weshouser821 · 1 year ago

      @@LeoMajor1 Would it really though?

    • @brkbtjunkie · 1 year ago · +4

      @@weshouser821 Have you seen the prices of DDR5? GDDR6X is a whole different ballgame as well. Apples and oranges.

    • @Elinzar · 1 year ago · +5

      Why we don't have 64GB cards in the consumer space is simply because GDDR6 is still not dense enough. The enterprise counterpart of the 3090 Ti had 48GB of VRAM, and I think this gen might have an 84GB one or something like that.
      Capacities like that were only achieved by HBM3 in the past gen.
      So yeah, I would say even 20GB+ midrange cards are still miles away. 16GB will become the norm though, and something I love about what AMD did with RDNA2 (absolutely underrated cards this gen) is that everything from the 6800 up to the top tier got 16GB. I mean, the 6950 XT should have gotten 24 at least, but you get the point: everything from the upper midrange to the high end got a fair bit of VRAM, and the 6700 XT got 12.
      Only the entry level got 8GB.

  • @MrSwallows · 1 year ago · +141

    I remember when 512MB was enough for gaming.
    Thank you for your service, GeForce 8800 GT.

    • @ro_valle · 1 year ago · +3

      I remember asking my dad for an 8800 GTS 320MB and he surprised me with an 8800 GTS 640MB. I was amazed by that amount of VRAM.

    • @davidrenton · 1 year ago

      My 1st PC had 4MB of RAM (yep, MB, not GB), and not on the GFX card; it didn't have one, 3D acceleration didn't exist.
      I think my hard disk was 20MB.
      4MB of system RAM total, but hey, that wasn't the problem; the problem was trying to get all the DOS drivers like CD-ROM, sound and mouse into the first 640K.
      Doom's final boss was stuttery, but then I spent an insane amount and went to 8MB, and the Doom final boss was smooth as butter.

    • @m.i.l9724 · 1 year ago

      @@davidrenton That means if you'd become a father when you were 18, your child could have a child my age, I guess. Damn.

    • @davidrenton · 1 year ago · +1

      @@m.i.l9724 Not yet; my child would have had to become a parent at 13, which is unlikely. I'm 49, so if I'd had a kid at 18, at 36 I would have had an 18-year-old, and hence my hypothetical grandkid could now be 13.
      That said, it's not impossible: people have kids at, say, 15, their children at 15, and they're a grandparent by the time they're 30.
      I recently watched a TV bit from the '80s about 6 generations alive in 1 family: baby, mother, grandmother, great-grandmother, great-great-grandmother, and the great-great-great-grandmother was still alive. They were all together in the studio.

    • @Rexperto6454 · 1 year ago

      In 2012 I thought the 2GB in the GTX 660 would be enough for a decade, but even 4-5 years later it was easily overwhelmed. Nvidia has often screwed us over with VRAM capacity. The popular GTX 970 only had 3.5GB of fast VRAM, and as soon as you touched the last 0.5GB, which was much slower, you started to see huge stutters.
      With Pascal and Turing they were quite generous, but after that it's downhill once more. The 3070 Ti and 4060 Ti with 8GB are such jokes.

  • @ConfusionDistortion · 1 year ago · +58

    Working adult here who buys AAA games, so yeah, this affected me. It was sobering to start up Company of Heroes 3 and find I couldn't max it out due to a VRAM limit on my 2070 Super. I've had this card for 3 years, and I still like it, but yeah, sign of the times. So now it sits in a secondary PC and I dropped the cash on a 7900 XT. Problem solved, and now I am back to running everything again and not sweating VRAM issues in The Last of Us, COH 3, etc.

    • @kirbyatethanos · 1 year ago · +6

      Same boat as me. Recently had an RTX 2060 6GB. Upgraded to an RTX 4070 Ti (the 7900 XT is more expensive where I live).

    • @xTurtleOW · 1 year ago

      Same, man. My 3080 was running out of VRAM very fast, so I dropped a 3090 Ti into my second rig and a 4090 into my main. Now, no VRAM issues anymore.

    • @xkannibale8768 · 1 year ago · +4

      So go from ultra to high? Like, lmao. There isn't even a difference you'd notice 90% of the time, and it uses half the VRAM 😂

    • @legionscrest143 · 1 year ago · +1

      @@xkannibale8768 Really?

    • @JonesBeiges · 1 year ago

      @@legionscrest143 Yes, really. I did a CPU upgrade and can still play the newest titles with my 6-year-old GPU above console graphics...
      Many people like you seem to fall for those YouTube salesmen... Only idiot game devs create games for that 1% of elitist PC nerds who need 4K ultra settings at 144 Hz because...

  • @liberteus · 1 year ago · +197

    I own a 3080 10GB and my favorite game has received tons of graphical updates, to the point where 10GB isn't enough anymore. I had to cut all settings from ultra to a medium/high mix to get it over 60fps, down from 90 two years ago.

    • @clownavenger0 · 1 year ago · +19

      So they added more demanding graphical settings and that hurt performance. Okay, cool.

    • @sirab3ee198 · 1 year ago · +112

      @@clownavenger0 The GPUs are limited by VRAM; it's nothing to do with demanding graphics. This is the same as the R9 290X, which had 4GB of VRAM, versus the 780 Ti, which had 3GB, and the R9 290X aged much better than its Nvidia counterpart. Nvidia is playing the planned-obsolescence game, forcing you to upgrade your GPU because it is RAM-starved, not because it is slow. I can bet you my RX 6800 with 16GB will run games smoother in 2 years than your 10GB RTX 3080. When we told people 2 years ago about the limitations of 8GB of VRAM on the RTX 3070, they called us Nvidia haters...

    • @nazdhillon994 · 1 year ago · +5

      Which game?

    • @weakdazee · 1 year ago

      Literally same.

    • @Tubes78 · 1 year ago · +2

      @clownavenger0 That's always going to have an impact, but I can't help thinking the card would lose less performance with more VRAM.

  • @sirab3ee198 · 1 year ago · +16

    So AAA games are a niche now????? :))) RE4 sold 5 million copies, Elden Ring sold 20 million, Witcher 3 sold 40 million, etc... I hate it when people misuse Steam charts to prove their point. Hogwarts Legacy is a singleplayer game, same as Elden Ring (kinda), but a month after launch people move on, because it is a single-player game!!! People finish it and move on. Nvidia giving you 8GB of VRAM for the x70 series was a slap in the face for consumers, and now they are doing the same with 12GB. People who bought the RTX 3070 and who will buy the RTX 4070 will want to play the latest AAA games.

  • @trr4gfreddrtgf · 1 year ago · +252

    This is also why I'm going to take a 7900 XT over a 4070 Ti; 20 gigs seems a lot more future-proof than 12.

    • @VaginaDestroyer69 · 1 year ago · +39

      Yeah, and with FSR 3.0 coming out soon, AMD is really closing the gap on Nshitia. RT is still not in a place where I would be willing to base my entire GPU purchase on ray tracing performance alone. I can see AMD and Intel offering outstanding value to gamers in the future compared to Nshitia's extortionate pricing.

    • @trr4gfreddrtgf · 1 year ago · +13

      @@VaginaDestroyer69 Even the 7900 XT and 7900 XTX have made massive improvements in ray tracing; the 7900 XT is only a little bit behind the 4070 Ti (with ray tracing), and I think we can expect the gap to close as the 4070 Ti runs out of VRAM over time.
      I can't wait to see how FSR 3 compares to DLSS 3. It probably won't beat DLSS in visual quality, but hopefully it gives a similar performance boost.

    • @CRF250R1521 · 1 year ago · +10

      The 7900 XT had low 1% lows. I returned it for a 4080.

    • @chickenpasta7359 · 1 year ago · +2

      @@VaginaDestroyer69 You're acting like AMD is the hero in this situation. They literally wait until Nvidia drops their MSRP and then undercut them by a little.

    • @WackyConundrum · 1 year ago · +4

      @@CRF250R1521 Interesting! Do you remember a particular benchmark with these results?

  • @ngs2683 · 1 year ago · +70

    I just want to say one thing. I got a 1060 6GB in 2016 and spent nearly 7 years with it. Then I finally took my hard-earned money and bought a 3080 Ti this past Black Friday in November. 12GB of VRAM. Then IMMEDIATELY new AAA games became this crazy demanding, and devs are saying that 12GB is the minimum. On top of that, NVIDIA is effectively implementing planned obsolescence. The 4070 Ti, the successor to my 3080 Ti, had no evolution in VRAM; it's a 12GB card. I just gotta say... it hurts to get an upgrade after 6.5 years only to immediately end up the new low tier for this future they speak of. And I do blame NVIDIA. No card above the 3060 should have only 8GB, and the 3080 Ti should have been a 16 or 20GB card. 3070 owners have every right in the world to be mad. NVIDIA KNEW this was an issue, but they don't care. They still don't.

    • @aquatix5273 · 1 year ago · +5

      The cards would be completely fine with this list of VRAM; this is how much each card would actually need to match its compute power:
      RTX 3050: 6 GB
      RTX 3060 + RTX 3060 Ti: 8 GB
      RTX 3070 + RTX 3070 Ti: 10 GB
      RTX 3080: 12 GB
      RTX 3080 Ti + RTX 3090 + RTX 3090 Ti: 16 GB
      RTX 4050: 6 GB
      RTX 4060: 8 GB
      RTX 4060 Ti: 10 GB
      RTX 4070: 12 GB
      RTX 4070 Ti: 16 GB
      RTX 4080: 20 GB
      RTX 4090: 24 GB

    • @dimintordevil7186 · 1 year ago · +1

      @@aquatix5273
      RTX 3070 = 12GB of VRAM. Let's be honest, it is cheap for the factory.
      RTX 3070 Ti = 12GB
      RTX 3080 = 16GB
      RTX 3080 Ti = 16GB
      RTX 3090 = 20 or 24GB
      RTX 4050 = 8GB
      RTX 4060 = 10GB
      RTX 4060 Ti = 12GB
      RTX 4070 = 16GB
      RTX 4070 Ti = 16GB
      RTX 4080 = $1200 / 20GB
      RTX 4090 = 24GB

    • @ozgurpeynirci · 1 year ago · +1

      @@aquatix5273 The 4060 Ti is WAY MORE capable than 8GB allows; that's why they made a 16GB version. The 4070 should be 16GB as well, if not more. As a 10GB 3080 owner, this hurts.

    • @aquatix5273 · 1 year ago · +1

      @ozgurpeynirci Yeah, I doubt it; the 4060 Ti is barely better than the 3060 Ti. Neither card has the performance.

    • @dimintordevil7186 · 1 year ago · +2

      @@aquatix5273 The 3060 Ti is faster than the 1080 Ti. The 1080 Ti was faster than the 2080 Super, and nowadays the 2070 is as fast as the 1080 Ti. Therefore, the 3060 Ti is a great card.

  • @veda9151 · 1 year ago · +56

    It is very true that most people aren't actually affected by the VRAM issue right now. The real controversy is Nvidia not providing enough VRAM while pricing their GPUs as high-end models. No one is complaining that the 3050 or the 6600 only got 8GB. It's the 3070 and the 3080 (10GB) that attract all the attention.

    • @MATRIX-ERROR-404 · 1 year ago · +1

      RTX 3070 / RTX 3070 Ti / RTX 3060 Ti = 8GB VRAM

    • @Iridiumcosmos · 1 year ago · +6

      Because the 3050 & 6600 are entry-level cards priced accordingly. The 3070/Ti is a whopping $500 or so with the same amount of VRAM as the entry-level GPUs. Hence why people are calling out Nvidia's stupidity.

    • @vectivuz1858 · 1 year ago · +3

      @@Iridiumcosmos That is exactly his point though. Price accordingly and people will understand.

    • @r3tr0c0e3 · 1 year ago

      System RAM like DDR4 or DDR5, or a fast NVMe drive, can easily be used to compensate for a lack of VRAM; devs just need to implement it, but they are lazy af.

    • @vectivuz1858 · 1 year ago · +1

      @@r3tr0c0e3 Uhm, yes, some games do that, and it causes major lagging.

  • @Obie327 · 1 year ago · +39

    Very good observation, Vex. The older Pascal cards with 8 gigs of VRAM utilize only the feature sets they have baked in. The problem now is that all these new advanced DX12 features plus higher resolutions become more taxing on limited VRAM buffers in niche situations. There's a car analogy here: the car is fast but runs out of gas (tiny tank), or it gets to sixty really quick but tops out at 80 mph (low gearing). I really think everyone wants something that performs great and has future potential, practicality and value, hoping their GPU will last a good while for their pricey investment. Limiting the RAM only limits the possibilities for game developers.

    • @user78405 · 1 year ago · +6

      Limiting RAM should force game developers to open doors... not milk them. That is John Carmack's philosophy: good-quality work ethic over quantity. A company always becomes sloppy when it bites off more than it can chew, and Ion Storm was a good example back then.

    • @Obie327 · 1 year ago · +1

      @@user78405 I totally agree with you. But as in Moore's Law is Dead's interview with the game developer, the modern consoles are using around 12 gigs of VRAM. I hate sloppy, lazy code, but I do like to experience everything that the developers have to offer. Maybe AI can help clean this up? I feel that if more games adopt higher RAM limits this issue won't be a problem going forward. I feel like we are in a weird transition period and Nvidia could be more generous with their specs. Have a great weekend and Easter!

    • @David_Raab · 1 year ago · +1

      Some people like to buy a newly released graphics card (a 3070) for $600-700 and like having problems with already-released games because of too little VRAM. People who criticize this are obviously AMD fanboys.

    • @Obie327 · 1 year ago · +2

      @@David_Raab We've had 8 gigs of VRAM on premium GPUs for years; my GTX 1080 is 7 years old, and AMD got there even earlier. The latest consoles were the warning sign that more RAM was going to be needed, and now they are using 12+ gigs for their new console releases. I just think it's a damn shame to pay $500+ for anything new with only 8 gigs and call it acceptable going forward. I think Nvidia just missed the mark with their current product stack. Also, Nvidia's new stuff still has the older display connector standard, which has me scratching my head, since they put the same display tech on the $1600 RTX 4090 as well. Intel's ARC A770 LE is only $350 and has the latest display connectors, the DX12 Ultimate/Vulkan/XeSS feature sets, and 16 gigs of VRAM. Is video RAM that expensive to put more of on an $800 4070 Ti? I just think the whole current rollout of GPUs is off on many levels. Time will rapidly tell how fast our cards become obsolete. Crossing fingers, peace!

    • @David_Raab · 1 year ago · +1

      @@Obie327 I agree with all of that. I find it a shame at the moment. I've been buying Nvidia for nearly 20 years, and now I'm at the point of buying AMD instead. Nvidia now sells overpriced cards; the 4070, in my opinion, should have been a 4060. I could live with such a card and 12GB if it cost 300€, but not 600€. And yeah, they can't tell me that 4GB or 8GB more GDDR6 RAM is that expensive. Any card over 300€ should have 16GB of VRAM at least.

  • @stratuvarious8547 · 1 year ago · +37

    When Nvidia released the 3060 with 12GB of VRAM, everything up to the 3070 Ti should have also had 12GB, with the 3080 and 3080 Ti getting 16GB. I just hope this is the straw that costs them enough market share to change their ways, instead of always thinking they can do whatever they want and people will just buy it.

    • @MarcoACto · 1 year ago · +1

      The thing is that the 12GB version of the 3060 was obviously aimed at crypto mining, which required a lot of VRAM and was hot at the time. It was never designed with gaming in mind.

    • @stratuvarious8547 · 1 year ago

      @@MarcoACto Yeah, it's true, but that doesn't change the fact that the SKUs above should still have been increased. Making GPUs obsolete 3 years after their release is inexcusable, and that's all that giving those cards 8GB of VRAM has done.

    • @naturesown4489 · 1 year ago

      @@MarcoACto Yeah, that crypto thing is a myth. NotABotCommenter has the correct reason.

    • @r3tr0c0e3 · 1 year ago · +1

      The 3060 will still get 30fps less than a 3070/80 regardless of how much VRAM it has, lol.
      You people are clueless.

    • @stratuvarious8547 · 1 year ago · +1

      @@r3tr0c0e3 Of course it'd have less FPS; it's a lower-class GPU. I was talking about the longevity of the purchase. Maybe before calling someone "clueless", look at the context of the conversation.

  • @Doric357 · 1 year ago · +27

    The 6800 XT with 16GB seems to be a sweet spot. I'm a casual enthusiast, so I won't claim to know the ins and outs of all this, but creators have been talking about VRAM forever and I always believed more is better. However, I don't believe it should come at such a high premium.

    • @tokki2490 · 1 year ago

      If you have a 6800 XT... you are not a casual enthusiast lol

  • @Sinflux420 · 1 year ago · +35

    Just got a 20GB 7900 XT. Being able to run RE4 at max with ray tracing and having 6GB left over is pretty nice, ngl. I didn't realize this was an ongoing issue until after getting the card; glad it's well-equipped!

    • @Drake1701 · 1 year ago

      Out of curiosity, what resolution do you play at?

    • @Austrium1483 · 1 year ago

      What did you pay?

  • @clownavenger0 · 1 year ago · +35

    Hogwarts was patched and works fine on a 3070 now. RE has a bugged implementation of RT, so if you turn that off, the 3070 does not have any issue in that game either. TLOU has PS1 textures on medium and other graphical issues regardless of hardware. If the question was "Is 8GB on the edge for the 3070?" I would say yes, but games are releasing completely broken, which increases the need for overpowered hardware. Some very good-looking 3D titles with high-quality textures use 4-6GB on ultra (Atomic Heart, for example), while TLOU uses 8 while looking like a PS2 game at medium settings. I run a 3080 10GB myself and play everything at 1440p or 1440p ultrawide, using DLSS Quality whenever offered, to push about 100 FPS. I have not had a single issue, but I only buy games on sale, so the game might be finished by the time I buy it. It seems like people just want to make excuses for game developers.

    • @DenverStarkey · 1 year ago · +2

      Well, these games were also designed around cards that have 16 gigs (the Radeon cards), so the devs got sloppy with VRAM usage.

    • @jorge86rodriguez · 1 year ago · +4

      Just buying the game on sale avoids a lot of headaches, hahaha. Early buyers are beta testers xD

    • @tyisafk · 1 year ago

      I played through RE on an RTX 2070 and an Arc A750, both 8GB cards. I agree that RT (and hair strands on the Intel) was the main issue with the game. To be fair, though, neither reflection implementation is good at all, so it's worth just having both RT and screen-space reflections turned off. I even used higher-quality textures than the game suggested with those disabled, as per Digital Foundry's suggestion, and the game ran flawlessly on both cards. I'm glad I don't often care for big AAA titles, and I have a PS5 if I'm that desperate to play one that isn't optimized properly on PC, but I do feel bad for AAA fans who exclusively play on PC. PC used to be the go-to for long-term savings if you didn't mind paying more up front, but now a current-gen console is definitely the better option if you just want to be able to play anything decently.

    • @arenzricodexd4409 · 1 year ago

      @@DenverStarkey Those are high-end cards. How many people actually own such GPUs? All the talk surrounding this "not enough VRAM" issue is mostly, if not entirely, about max settings. Years ago I read an interview with a dev (forgot which tech outlet did it); they said that on PC they would try to optimize their game even for Intel iGPUs because of how big that user base is. And back then Intel iGPUs were considered super crap.

    • @DragonOfTheMortalKombat · 1 year ago · +2

      10GB will soon not be enough. Dying Light 2 maxes it out easily with RT. You'll have to increase your reliance on DLSS and lower texture quality in the upcoming years. Even Flight Simulator maxes out 10GB of VRAM at 1440p. So...

  • @stratuvarious8547 · 1 year ago · +18

    I expected that when games designed for the current-gen consoles (Xbox Series, PS5) started releasing on PC, this was gonna start to be a problem. That's why, when I was looking to upgrade my 2070, I wanted something with a minimum of 12GB of VRAM. Since I couldn't get a new 3080 (or Ti) for a reasonable price, I went with the RX 6900 XT and its massive 16GB. At $650, it felt like the best price-to-performance in the range I was looking at.

    • @latlanticcityphil · 1 year ago · +1

      Man, I love my RX 6900 XT. I have no problems, and it was a great investment too. I can play all the games and have a great experience, even with Cyberpunk. 16GB of VRAM DOES MAKE A DIFFERENCE!

  • @KobeLoverTatum · 1 year ago · +112

    Nvidia: “Here gaming studio, $$ to use more VRAM”
    Also Nvidia: “Higher VRAM costs $$$$$$$$$$$$$”

    • @hardrock22100 · 1 year ago · +11

      You do realize The Last of Us and RE4 are AMD-sponsored titles, right?

    • @gruiadevil · 1 year ago · +29

      @@hardrock22100 You do realize they use more VRAM precisely because AMD packs their GPUs with more VRAM, and Nvidia doesn't.

    • @hardrock22100 · 1 year ago · +13

      @@gruiadevil
      1. This person was trying to claim that Nvidia is paying devs to use more VRAM in games that are sponsored by AMD.
      2. It's interesting that AMD-sponsored titles are running like hot garbage.
      3. The company that ported The Last of Us was the same one that ported Arkham Knight.
      4. The Last of Us crashes when you run out of VRAM. That should not be happening. I've seen it even BSOD some PCs.

    • @AntiGrieferGames · 1 year ago

      @@hardrock22100 The Last of Us is just a piece of shit port; that's why you're getting BSODs.

    • @vaguedreams · 1 year ago · +4

      @@hardrock22100 3. The company that ported Arkham Knight is also the same company that ported the Uncharted: Legacy of Thieves Collection.

  • @friendofp.24 · 1 year ago · +30

    Heavily regretting buying the 3070 now. I camped outside a Best Buy for 20 hours and had the option to choose any 30-series card. I didn't understand at the time how much VRAM mattered.

    • @HUNK__S · 1 year ago

      😂😂 Sucks to be you.

    • @soumen8624 · 1 year ago · +19

      It's not your fault; VRAM really didn't matter until very recently.

    • @Stephan5916 · 1 year ago

      @friend You live and you learn. VRAM always mattered.

    • @Stephan5916 · 1 year ago · +3

      @clockworknick9410 It's still Nvidia's fault. Before the 3080, their flagship card was the 2080 Ti. That had 11GB of VRAM. They should have at least matched or bettered that with the base 3080 model.

    • @naturesown4489 · 1 year ago · +1

      @Clockwork Nick There were people saying at the time of the 3070's release (Hardware Unboxed) that the VRAM wouldn't be enough in a couple of years. The 1070 had 8GB, the 2070 had 8GB... so why would they not have put more on the 3070?

  • @themadnes5413 · 1 year ago · +27

    I had a 1080 Ti, and the main reason I did not get a 20- or 30-series card was VRAM. The 3080 has less, and the 3080 Ti has 12GB; 1200€ for a 12GB GPU is kinda stupid, and in this regard a sidegrade. Now I have a 4080. 16GB is still a bit on the low side for a 1200€+ GPU, but I can live with that. I know AMD is an option too, and I was about to get a 7900 XTX, but the 4080 was only about 50€ more. So I chose the Nvidia GPU; I also like RT and DLSS a lot.

    • @dededede9257 · 1 year ago · +4

      I think 16GB is still fine. Yeah, it could be more for this price, but I don't think you'll be VRAM-limited.

    • @zdspider6778 · 1 year ago · +11

      1200€+ is the price of a "decent enough" second-hand car.
      The MSRP of the 1080 Ti was $699.
      Ngreedia is laughing all the way to the bank every time a schmuck buys one, lol. They're sitting comfortably on the shelves; not even scalpers are touching them. But enjoy it, I guess. LOL. You got a step down, btw, from a "Ti" to a non-Ti 80-class, for much more money.

    • @paranikumarlpk · 1 year ago · +3

      You could have easily chosen the 7900 XT, but you just made an excuse to stick with Nvidia, lol. GGs.

    • @dededede9257 · 1 year ago · +6

      @@paranikumarlpk He made the right choice. For almost the same price, the RTX 4080 is better than the XTX and doesn't have issues like 100W idle draw with multiple monitors.

    • @vaghatz · 1 year ago · +2

      ​@@paranikumarlpk DLSS

  • @denerborba4994 · 1 year ago · +29

    Digital Foundry recently reviewed the RE4 remake, and they tested a 2070 Super there; per their analysis, it seems RE4 will never crash with textures at 8GB unless you are using ray tracing. I have also been playing the game with an 8GB card and haven't faced any crashes so far either, about 13 hours in.

    • @viktor1master · 1 year ago

      I tested the RE4 remake demo with a 3070, a Ryzen 9 3950X and 16GB of RAM, with 8GB textures against a claimed budget of 12GB of VRAM. Funny though, in the settings it showed my card as having only 7GB of VRAM, so I don't know; in other games it's the normal 8GB. But I was impressed: it ran from 135 fps down to 70/80, with lows slightly over 60, across 30 minutes of testing. Now I know I must have it, the game looks so good 🤣

    • @JoeL-xk6bo · 1 year ago · +4

      It still has issues loading in high-res textures. Stop and look at certain surfaces and the textures will pop in and out.

    • @r3tr0c0e3 · 1 year ago

      RT doesn't look particularly accurate or appealing in this game anyway; besides, it's just another layer they slapped on a game that looks perfectly good without it, so it stayed off, just like in any other unoptimized garbage they've tried to sell us.
      Unless a game goes full path tracing it's simply not worth it, and as we can see, even a 4090 struggles to do that at playable fps.

  • @tech6294 · 1 year ago · +133

    Great video! We need more people talking about this. If Nvidia and AMD tomorrow made 24GB the standard for midrange and 48GB for the high end, games would look photorealistic overnight. And no, VRAM doesn't cost that much more; to go from 12GB to 24GB you're probably only talking about a $40 hike in price. These companies could easily make a $600 24GB card. They simply choose not to.

    • @Clashy69 · 1 year ago · +23

      AMD has already put enough VRAM on their cards, even the low-end 6000 series. Nvidia should do the same and add at least 12GB of VRAM to their lower-end cards. I'd even be fine if it was just 10GB, but we'll see, since they gave the 4070 12GB.

    • @bronsondixon4747 · 1 year ago · +27

      It'd make no difference if 24GB was the minimum. It just needs to be more VRAM than the current console generation has.
      Game developers wouldn't take advantage of more than 16GB, since that's all they have available in the PS5.

    • @66kaisersoza · 1 year ago · +8

      @@bronsondixon4747 The console RAM is shared with the OS.
      Around 10GB is for games and the other 6GB is dedicated to the OS.

    • @luisgarciatrigas3651 · 1 year ago · +21

      @@66kaisersoza 13.5GB for games, 2.5GB for the OS 👍

    • @retrofizz727 · 1 year ago · +13

      24GB is an overreaction, wtf. You won't need 24GB for 4K before like 2030.

  • @franciscoc905 · 1 year ago · +16

    I was definitely waiting to see what you had to contribute to the discussion. I definitely see this as a negative for people wanting to play AAA games in 2023 with high details, but it will mean a fire sale of great deals on second-hand graphics cards for competitive gaming.

  • @rebelblade7159 · 1 year ago · +39

    I remember buying the GTX 960 4GB in 2015 for the equivalent of about $200 brand new. That amount of VRAM was considered overkill by many, but it allowed me to use the card all the way up to 2021. VRAM matters a lot if you want to use a GPU for a long time.

    • @FatheredPuma81 · 1 year ago · +3

      At the time you gained almost nothing for whatever you paid extra, and now you're gaining around 15% extra performance for it.
      Just put the extra money you saved into a savings account, wait a few years, sell your 960 2GB, and get a 970 3.5GB for the exact same cost.

    • @jacobhargiss3839 · 1 year ago

      @@FatheredPuma81 That assumes the price actually does drop and you can find the cards.

    • @FatheredPuma81 · 1 year ago

      @@jacobhargiss3839 Always has, always will.

    • @FatheredPuma81 · 1 year ago

      @@DeepfriedBeans4492 Looking at the Wayback Machine, it was around $40 more. Toss that into a half-decent savings account (not your local garbage bank) and that turns into $44 minimum in 5 years.
      The GTX 970 was under $100 just before the mining craze, and I'd guess the GTX 960 2GB was at the very least above $55. I actually sold a GTX 960 that summer and bought an RX 580 8GB for $120, but can't remember how much I sold it for. (Sold the RX 580 for $120 a year later though, lol.)
      Sucks to be you, I guess, if you're too terrified of potentially being mugged at a McDonald's in broad daylight with people around over $60. Sounds like you live in Ghetto-Siberia or something. I'd suggest moving.
      P.S. Do Ghetto-Siberian shippers not let you pay $2 to print a label? Do Ghetto-Siberians not order things online and have loads of boxes and packing materials lying around? Does Ghetto-Siberian eBay not give you free shipping for broken items?

    • @DeepfriedBeans4492 · 1 year ago

      @@FatheredPuma81 Please tell me what savings account you use that gives 110% returns in 5 years, because the only kinds of accounts I know of with that much potential are not what I would call a 'savings account', most certainly do not come without large amounts of risk, and are also called Ponzi schemes and are illegal.

  • @theftking · 1 year ago · +1

    No, it's not. After playing RE4R at 1440p on my freakin' $700 3070 Ti, I won't buy a card with less than 16GB of VRAM ever again. I was hovering at 6.5-7.9GB of VRAM utilization basically the entire time.

  • @Einygmar · 1 year ago · +54

    VRAM is a problem, but optimization affects this issue as well. Better texture/asset streaming and more optimized BVH structures for ray acceleration would fix a lot of problems. I think the bigger issue is memory bandwidth on modern cards, which limits throughput and hurts streaming capabilities.
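
A minimal sketch of the texture-streaming idea above, in Python. The names and the greedy policy are hypothetical (real streamers weigh distance, screen coverage, and feedback buffers), but it shows the core mechanism: when resident textures exceed a VRAM budget, drop mip levels on whichever texture is currently largest until everything fits.

```python
# Hypothetical mip-based streaming budget; not taken from any real engine.

def mip_chain_bytes(width: int, height: int, bytes_per_texel: int = 4) -> int:
    """Approximate size of a full mip chain; the mip series adds about 1/3."""
    return int(width * height * bytes_per_texel * 4 / 3)

def fit_to_budget(textures: list[tuple[int, int]], budget: int) -> list[int]:
    """Return a dropped-mip count per texture (0 = full res) so the set fits."""
    drops = [0] * len(textures)

    def size(i: int) -> int:
        w, h = textures[i]
        return mip_chain_bytes(max(w >> drops[i], 1), max(h >> drops[i], 1))

    # Greedily halve whichever texture is currently largest until we fit.
    while sum(size(i) for i in range(len(textures))) > budget:
        i = max(range(len(textures)), key=size)
        if max(textures[i]) >> drops[i] <= 1:
            break  # everything is already at its smallest mip
        drops[i] += 1
    return drops

# Three 4K textures (~85 MiB each with mips) into a 160 MiB budget:
print(fit_to_budget([(4096, 4096)] * 3, 160 * 1024 * 1024))  # -> [1, 1, 0]
```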

    • @Vasharan · 1 year ago · +6

      Yes, but games will continue to be unoptimized as long as every developer isn't John Carmack* or Takahisa Taura (Platinum Games), and as long as studios have deadlines and cashflow constraints.
      As a consumer, you can either not buy unoptimized games, or not buy underprovisioned hardware, or some combination of both.
      * And even Carmack's Rage was a buggy mess for years as he tried to get texture streaming to work seamlessly.

    • @gagec6390 · 1 year ago · +4

      @@SkeleTonHammer That's just not true, though. 4K monitors and especially TVs have become very affordable. 1080p is only still the standard among e-sports players and extreme budget builds or laptops. Most people who are even somewhat serious about PC gaming have at least a 1440p monitor, and the fact is that anything under 12GB of VRAM just isn't enough for even the near future, much less the 4-5 years that most people keep their graphics cards. If you paid more than $300 for an 8GB card recently, then you got fucking scammed. (I would know; I unfortunately bought a 3060 Ti a year ago instead of a 6700 XT.)

    • @r3tr0c0e3 · 1 year ago

      @@gagec6390 1440p and even 1080p look playable on OLED; upscaling made it possible.
      I'm never going back to a garbage TFT panel.

  • @bladimirarroyo8513 · 1 year ago · +8

    Man, I'm about to buy my first GPU and all your videos are answering my doubts.
    Thank you sm 🤗

    • @Ghostlynotme445 · 1 year ago

      @@z3r009 If you got a console, buy another console.

  • @metroplex29 · 1 year ago · +8

    That's why I preferred to go for the 6800 XT with 16GB of VRAM.

  • @Saltbreather · 1 year ago · +23

    If you go into the settings for MSI Afterburner/RTSS, you can enable dedicated and allocated VRAM readouts. That'll give you a more accurate number when looking at how much VRAM a game is actually using.
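
For anyone who wants the same dedicated-versus-allocated distinction outside Afterburner, here is a minimal sketch assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings. The card-wide "used" counter is the allocated figure; the per-process numbers are closer to what a single game actually consumes.

```python
# Minimal NVML sketch (pip install nvidia-ml-py); assumes an NVIDIA GPU.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo,
                    nvmlDeviceGetGraphicsRunningProcesses)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU in the system

info = nvmlDeviceGetMemoryInfo(handle)      # card-wide counters
print(f"total {info.total >> 20} MiB | used {info.used >> 20} MiB")

# Per-process dedicated usage: closer to what one game really needs than the
# card-wide "used" figure, which includes every other app plus driver caching.
for p in nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_mib = (p.usedGpuMemory or 0) >> 20  # can be None without permissions
    print(f"pid {p.pid}: {used_mib} MiB")

nvmlShutdown()
```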

    • @r3tr0c0e3 · 1 year ago

      By that account, if you have 48GB a game will allocate that much if needed, so yeah, we need 64GB of VRAM now, lol.
      Funny how all the recent RE games supposedly use more than 8GB of VRAM, yet the games run smooth and without any stutters even while the settings screen indicates the VRAM limit is exceeded.
      The RE4 remake didn't crash because of that, and it was eventually fixed later. You will get small stutters and dips if you enable RT, even on a 4090, so VRAM is not the issue in this case. Far Cry 6, however, will destroy your performance if you enable ultra textures with only about 8GB of VRAM, and they kind of look just like high, lol. The RE games will use system RAM to compensate; many games actually do that, simply because recent consoles have shared RAM (GDDR6, but still). Lazy devs simply can't be bothered to port them properly, hence the VRAM rage.

  • @NootNoot. · 1 year ago · +54

    I have to agree with you on the whole 'VRAM niche' point. I don't usually play AAA games that tax my GPU, although I do run workloads other than gaming that need a lot of VRAM. Still, I do think this whole VRAM fiasco is a very important thing to discuss. Nvidia's planned obsolescence should be put to a stop; give consumers what they NEED for what they PAID for. Like you said, performance is what should scale with VRAM.
    The 1070 doesn't need any more VRAM because of how it performs, unlike the 3070, which should be able to handle 1440p+ and shouldn't be bottlenecked by memory, causing stutters, instability, or games not even booting. It's a business move, and it totally sucks. While these videos may seem 'repetitive' or 'controversial', I appreciate you making this.

    • @johnny_rook · 1 year ago · +5

      Define "niche".
      AAA games sell by the millions on PC, and the new consoles have at least 12GB of VRAM addressable by a GPU of 4-year-old RTX 2070 tier. People (both devs and players) will use high-res textures if they can, regardless of GPU power, and textures are the biggest contributor to VRAM usage.

    • @Lyka-clock · 1 year ago

      So far my 3060 Ti works well for mostly PC-type games like RTS and some RPGs. I'll use a console for AAA games, but these days most of it is trash really; the ports haven't been that great to begin with. I tried Returnal on PC and it was stuttering no matter what settings I used, and it wasn't a VRAM issue either. I already played Dead Space, RE4 and TLOU. Maybe there needs to be a focus on making a new IP with great gameplay. That would be a wonderful idea! Let's hope Diablo 4 is actually fun. The demo was good but not great, and it isn't a new IP or anything.

    • @arenzricodexd4409 · 1 year ago · +2

      @@johnny_rook Millions of those PCs do not have a GPU with 12GB of VRAM. In fact, more than half of them are probably only using an iGPU.

    • @johnny_rook · 1 year ago

      @@arenzricodexd4409 Yeah, not having enough VRAM is the issue, isn't it?
      How do you know that, exactly? Isn't it funny when people pull numbers out of their asses, without a shred of evidence to support them?

    • @alanchen7757 · 1 year ago · +1

      @@johnny_rook The proof is that AMD's GPUs competing against the 3070, 3080, etc. outlast Nvidia's due to having more VRAM.

  • @SoftExo · 1 year ago · +5

    5:52 During this interview the dev brings up stuff like "using VRAM to keep high quality on all the body parts, like the eyes, in case someone looks closer", and I'm thinking we're supposed to pay $500+ for the sake of having 4K resolution on 30 body parts of an NPC. You're not kidding when you say niche. How about these devs make good games with gripping writing and stop shipping crap ports that rely on DLSS/FSR to cover up laziness? 95% of the market is going to continue using 8-12GB products, and indies will thrive.

  • @terkiestorlorikin5958 · 1 year ago · +8

    7:39 The people who get affected by VRAM are the ones who like to run games on max settings. Seriously, in 90% of games the differences between High and Ultra are barely noticeable. Alex from Digital Foundry does amazing videos showing optimized graphical settings, and most of the time you have to zoom in 200% to spot the difference between High and Ultra. I understand that VRAM might be an issue in the future, but some people should chill a little bit and ask themselves, "Do I really need to run clouds at Ultra? Do I really need to run this specific setting at Ultra?"

  • @danielkatzman3406 · 1 year ago · +3

    VRAM is important only if you are playing games at a resolution higher than HD!

  • @Jerad2142 · 1 year ago · +7

    One bright side of laptop gaming is that a lot of these chips have way more VRAM on them; for example, my 3080 laptop has 16GB of VRAM.

    • @dededede9257 · 1 year ago · +4

      So for the first time in history, laptop gaming will age better than desktop.

    • @spaghettiupseti9990 · 1 year ago · +2

      @@dededede9257 Probably not; 3080 mobile GPUs don't perform like 3080 desktop cards, not even close.
      A 3080 mobile is about 40-50% slower than a 3080 desktop.

    • @whohan779 · 1 year ago · +1

      Correct, @@spaghettiupseti9990, their naming is hugely misleading. Even a 3080 Ti mobile may be trounced by a 3060 Ti desktop depending on clocks (it's realistic).
      This is mostly because the RTX 3080 mobile is almost identical to the 3080 Ti mobile, so they need the same memory bandwidth. While I'm sure Nvidia could explain away 8GB for all 3080 mobiles (as they do for some), this wouldn't fly for the Ti models, hence they always have 16GB on mobile.
      The mobile 3070 and up are (according to the somewhat unreliable UserBenchmark) just 20% apart, vs. 38% on desktop, so the only reason to pay up for an 80(👔) SKU (apart from a higher power limit) is the additional VRAM.

    • @UNKNOWN-li5qp · 1 year ago

      But a laptop with a 3080 will be like 3000 dollars, and at that point just buy a 3090 or 4090, lol.

    • @Jerad2142 · 1 year ago

      @@UNKNOWN-li5qp I don't even know if you can get "new" ones with a 3080 anymore; one with a 3080 Ti and a 12800HX is about $2,149.99 if you go Omen. My 4090 laptop was about $3,300. But yeah, you're definitely paying a premium for portability.

  • @hyxlo_ · 1 year ago · +2

    Devs are not optimizing their games and we are blaming GPU manufacturers 🤦‍♂️

  • @seaspeakss · 1 year ago · +11

    Tbh, I was expecting this. When Nvidia decided to put 8 gigs into the 1070, I was amazed and looked forward to the future. But after the release of the 1080 Ti, Nvidia got really comfortable and hasn't really come out with a great card considering the cost-to-performance ratio (the 90-series cards are powerful, but super expensive, unlike what the 1080 Ti was back in its day). The 3070 STILL having 8 gigs of VRAM is what holds it back, and the 3080 only having 10 is also a major dealbreaker for me.

    • @Mr.Stalin116 · 1 year ago

      Tbh, I feel like 10GB is enough for the 3080 given its performance. I recently got a 4070 Ti 12GB, and I'm playing at 4K just fine. It does run out of VRAM when I'm playing 4K ultra RT in some games, like Cyberpunk with RT Overdrive, but those games would run at 20-30 fps anyway with more VRAM, so there is not really any point in having more. And it sucks that there aren't many other options if you want to experience RT; AMD just doesn't run RT well. After trying RT in Cyberpunk I was amazed by how much better it looks.

    • @y0h0p38 · 1 year ago · +1

      @@Mr.Stalin116 Right now, 10GB is plenty. What about the future, though? It's a higher-end card; you should be able to use it for years without any issues.

    • @KillerBsan_ · 6 months ago

      @@y0h0p38 "Just buy our new product, u slave!!!" - Nvidia.

  • @ComradeChyrk · 1 year ago · +6

    I have a 3070 and I was concerned at first about the 8GB of VRAM, but so far I haven't had any issues. I play at 1440p, but I was never really interested in things like ray tracing or playing at ultra settings. As long as I can play 1440p at 100 fps, I'm happy with it.

    • @David-ln8qh · 1 year ago

      I bought my 3070 for $1000 deep into the card-pocalypse, when 3080s were in the $1500-$1600 range. I'm frustrated about the situation but still feel like I didn't really have many options and was probably better off pocketing that $500-600 for my next card, which I'm hoping is at least a couple of years out.
      For the record, I also play at 1440p, 80-120 fps, and haven't yet run into problems.

    • @ComradeChyrk · 1 year ago

      @@David-ln8qh I'm glad I waited, because I got my 3070 at sub-$600. I was holding out with a 970 until the prices dropped. I got my 3070 about a year ago.

    • @Trainboy1EJR · 1 year ago

      @@ComradeChyrk Still, wouldn't it have made more sense to go with a 16GB AMD card if you weren't going to bother with ray tracing?

    • @ComradeChyrk · 1 year ago

      @@Trainboy1EJR I wanted the DLSS. Plus it was in my price range. The AMD equivalent (6700 XT) was roughly the same price but didn't have as good performance.

  • @VDavid003 · 1 year ago · +7

    I'm just glad that the 3060 I bought used has 12GB of VRAM.
    I actually wanted as much VRAM for the money as possible, since last time I went with a 3GB 1060, and in the end that 3GB bottlenecked the card in some cases.

    • @0xEF666 · 10 months ago

      same

  • @gorytarrafa · 1 year ago · +2

    I have this setup: the CPU is an i7-10700K, the GPU an RTX 3070, with 16GB of RAM. The Last of Us Part 1 won't even start on my computer; an error message appears saying there isn't enough VRAM. "Modern gaming."
    The Last VRAM of Us!

  • @lukasbuhler1359 · 1 year ago · +8

    Planned obsolescence go crazy

  • @j.rohmann3199 · 1 year ago · +4

    I've actually never had problems so far with my 3060 Ti... it does amazing for me at 1080p and decent at 1440p. I will still be using it in like 4 years (if I live that long).

    • @j.rohmann3199 · 1 year ago · +1

      @@VulcanM61 Damn, I was going to do the same thing!
      From a 5600X to a 5800X3D... but maybe I will just go for a 5900X instead. I haven't decided yet.

    • @j.rohmann3199 · 1 year ago · +1

      @@VulcanM61 Epic!
      Yeah, the X3D versions are crazy good. And pretty future-proof!

  • @konstantinlozev2272 · 1 year ago · +3

    The real problem with high VRAM requirements, and even with ray tracing requirements, is not (!) that they tax your hardware. It's that the visual output is very underwhelming for the uptick in hardware requirements.
    You seem to be younger, but I do remember the Crysis WOW moment, when we saw what kind of visual fidelity was possible.
    I fired up Titanfall 2 yesterday, and on high it is a stunning game. Fake reflections and all, but you know what? It runs on 6-7-year-old mid-range hardware. And it looks just gorgeous.

  • @Skylancer727 · 1 year ago · +11

    I completely disagree that this is a niche issue. I do agree AMD advertising higher VRAM hasn't helped them, but this is an incredibly serious issue. People bought 3070s, 3080s, and 3060s only a couple of years ago and have the power to play newer games, yet it won't work. Remember that some even bought 3070s for over $800 during the mining gold rush. That's an incredibly rough place to be in, especially since most people keep a GPU for over 2 generations, and even the 40 series is low on VRAM. Even at MSRP the 3070 was $500, the same as the next-gen consoles that are running the games, and again, this GPU is objectively faster. This could scare people away from PC gaming just after it recently took off again.
    And yes, it's only AAA games, today. But when games start adding features like DirectStorage (which most likely will happen), even 12GB will be in a tough spot. Hell, even Satisfactory announced moving to UE5 for its new map loading systems and Nanite. More games are going to continue to do this. And many people play at least one AAA game. Did you see the player counts for Hogwarts Legacy? They announced over 3 million sales on PC alone in the first week. And games like COD will also likely become similar in the next 2 years with the dropping of PS4 and Xbox One, likely with this next COD game.

    • @DenverStarkey · 1 year ago · +2

      I just bought a used 3070 in October of '22, and I already feel like Nvidia is standing on my dick. RE2R and RE3R both go over the limit and crash when ray tracing is on.

  • @stephenpourciau8155 · 1 year ago · +4

    One little flaw is that you did not turn on the setting that shows "memory usage \ process". That one in Afterburner/RTSS will show the ACTUAL VRAM usage of the application, not what is allocated on the whole card.

  • @DeusVault · 1 year ago · +2

    Yes, it is difficult to optimize massive games to use limited VRAM, especially in a limited time frame.
    Yes, for 1440p, 4K and ray tracing, more VRAM is going to help you in the long run.
    BUT, if you are not an obsessed settings slider and can deal with your games being on Medium or High instead of Ultra (which is barely noticeable), then 8GB of VRAM is FINE.
    People running out and buying a 16GB card for 1080p drank the panic Kool-Aid. There is zero reason you need that much VRAM right now unless you are going to be doing 4K, tracing rays, or some very demanding workload.
    I'd recommend 12GB, just so you don't deal with issues in AAA games at launch, or if you want to do machine learning, ray tracing, or max out all the settings. (Except don't get a 3060; pretend it does not exist. The 6700 XT is just better value.)

  • @Ben256MB · 1 year ago · +12

    I don't think it's a problem, because games are graphically more realistic and the texture sizes are bigger too.
    Remember the Unreal Engine 5 tech demo on PS5? I knew then that 1440p might consume all 8GB or more.
    Let's keep it extra real: in most games there's very little difference between ultra and high settings.
    Just turn the settings down to high at 1440p or 4K on an 8GB card. People are too sensitive!!
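
The "bigger textures" point is easy to quantify with back-of-envelope math. The sketch below assumes uncompressed RGBA8 (4 bytes per texel) and a full mip chain; block compression (BC7 and friends) would divide these numbers by roughly 4.

```python
# Rough texture memory math: side x side texels, 4 bytes each, mips add ~1/3.
def texture_mib(side: int, bytes_per_texel: int = 4) -> float:
    return side * side * bytes_per_texel * (4 / 3) / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB with mips")
# 1024 -> ~5 MiB, 2048 -> ~21 MiB, 4096 -> ~85 MiB: a few hundred unique
# 4K-class textures can plausibly fill an 8GB card on their own.
```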

    • @OffBrandChicken · 1 year ago · +3

      Or just add more VRAM and don't have the issue to begin with.
      You see, I'm not even remotely worried about my card anytime soon, and I'll be able to run my games at higher settings because of it.
      I can crank up settings without issue.
      I just find it ironic that a supposedly worse-performing GPU is performing better now.

    • @gruiadevil · 1 year ago · +4

      You can't ask people to crank down settings.
      I paid the nVidia tax.
      I paid the 3000/4000-series tax.
      I expect to play a game at the highest possible settings at the resolution of its tier.
      You can't charge extra and deliver less, just so people buy another GPU in 2 years' time, because you want to sell more GPUs.
      It's the same mentality General Motors had in the '70s-80s when they started making cars that break down in 10-15 years. And everyone followed suit.
      If you buy a BMW made before 2005, it's a tank.
      If you buy any BMW made after, it's going to start breaking piece by piece.

    • @Ben256MB · 1 year ago

      @@OffBrandChicken Lol, VRAM can't just be added; it's soldered onto the PCB of the GPU board!!

    • @Ben256MB · 1 year ago

      @@gruiadevil Lol, bruh!! Because you bought a low-end or mid-range GPU, you don't get the same benefits as someone who paid $1700 for a 4090.
      You have the choice of buying an AMD GPU, which has more VRAM for less money, for slightly less performance.

    • @OffBrandChicken · 1 year ago · +1

      @@Ben256MB Are you serious? I'm saying add more VRAM to begin with. How was that hard to understand?

  • @vitor900000
    @vitor900000 Рік тому +1

    One thing people overlook is that more VRAM = higher cost.
    If you think Nvidia is overpricing their GPUs with suboptimal amounts of VRAM, imagine how they would price their GPUs if people keep asking for more VRAM.
    An extra 4GB of high-bus GDDR6X adds $20~$40 in cost alone. Add the profit + overpricing on top and you have an extra $60~$80 on the final price. (A rough version of that arithmetic is sketched below.)
    The only real solution would be to make GPU memory modular like it is with our CPUs. You would be able to add as much VRAM as you need instead of having a fixed amount per product tier. It would also increase the lifespan of GPUs, since they would become upgradable.
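
    As a back-of-the-envelope sketch of that pricing math (the per-GB cost and the retail multiplier are assumptions, not quoted prices):

      # Hypothetical: what 4GB of extra GDDR6X does to the retail price.
      COST_PER_GB_LOW, COST_PER_GB_HIGH = 5.0, 10.0  # assumed $/GB for GDDR6X
      RETAIL_MULTIPLIER = 2.0                        # assumed margin + channel markup

      extra_gb = 4
      bom_low = extra_gb * COST_PER_GB_LOW           # $20 added to the bill of materials
      bom_high = extra_gb * COST_PER_GB_HIGH         # $40 added to the bill of materials
      print(f"BOM increase:    ${bom_low:.0f}-${bom_high:.0f}")
      print(f"retail increase: ${bom_low * RETAIL_MULTIPLIER:.0f}-${bom_high * RETAIL_MULTIPLIER:.0f}")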

  • @DeadlyKiller54
    @DeadlyKiller54 Рік тому +7

    Just seeing this makes me super glad I got my 6700 XT for $360 with the 12GB of VRAM it has. Yeah, not an Nvidia card, but she still performs well and streams decently.

    • @Trainboy1EJR
      @Trainboy1EJR Рік тому

      Seeing this makes me even happier to have a $240 12gb RTX2060. XD Although it is almost certainly going to be my last Nvidia card, Intel is looking super good with ARC right now. Hopefully they teach Nvidia the lesson AMD hasn’t been able to. And with EVGA out of the scene, I completely understand not wanting to touch the 40xx series! Honestly I’m surprised more board partners didn’t “nope” out of this generation. XD

    • @GrainMuncher
      @GrainMuncher 9 місяців тому

      @@Trainboy1EJR VRAM doesn't matter if the card is too weak to even use it. There's a reason the 3060 Ti 8GB destroys the 3060 12GB

    • @Trainboy1EJR
      @Trainboy1EJR 9 місяців тому

      @@GrainMuncher Destroyed? HA! 65fps vs 71fps is just the 192-bit bus vs the 256-bit bus. I've learned to go for the card with the most VRAM and have never been disappointed! 1GB GT220, 2GB GT640, 4GB 1050 Ti (laptop), 12GB RTX 2060. Let me repeat that: NEVER DISAPPOINTED!!! I will never sacrifice textures to play a game, because textures have zero impact on performance if you have the VRAM for them.

  • @duskdrummer1667
    @duskdrummer1667 Рік тому +1

    What you are seeing in Afterburner is allocation, not actual usage!

  • @ItsFreakinJesus
    @ItsFreakinJesus Рік тому +4

    Adjust your settings and it's a manageable issue even with AAA games. Shadows and lighting have massive VRAM hits with little to no visual difference at the higher settings, for example.

  • @StraightcheD
    @StraightcheD Рік тому +4

    10:33 It obviously depends on what you want to play, so I think you're technically right. I think people worry because nobody likes to end up in a niche by accident and be told that the game you want to play next just happens to be one of the dozen titles to avoid on your $600 card.

    • @dashkatae
      @dashkatae Рік тому +1

      This. When you spend that much on a card, you kind of expect it to last you a few years and be able to handle new titles for a while.

    • @r3tr0c0e3
      @r3tr0c0e3 Рік тому

      @@dashkatae Problem is, it's not 2016 anymore, and big corps don't care (if they ever did); they will just double down at this point.
      Consoles might be the best option for triple-A players. It will run like a turd, but at least you didn't pay $1000 instead of $500 for a fake-4K 30fps experience; all this DLSS and FSR is fake resolution and fake frames at this point.
      The rest can be played on a potato anyway.

  • @ner0718
    @ner0718 Рік тому +5

    Spending a lot of time in Blender (3D modelling and rendering), being stuck on a 6GB card is incredibly frustrating, as I can't render most of my scenes on the GPU because my VRAM fills up. I can't upgrade, as I am still a student and don't have the money to buy a new GPU.

  • @bmo61950
    @bmo61950 Рік тому +1

    So, is no one going to talk about the absolutely terrible game optimization we get constantly? I wouldn't use The Last of Us as the evidence for how bad 8-12GB cards are.

  • @Sybertek
    @Sybertek Рік тому +6

    No regrets with the 6800XT.

  • @PlutoKam
    @PlutoKam Рік тому +2

    Lol it's possible to just lower some settings a tad and go below the 8gb target. It's kinda annoying how overblown this is

  • @ElladanKenet
    @ElladanKenet Рік тому +6

    I upgraded from a GTX 960 4gb to a 3060ti in early 2021, and went from 720p to 1080p. The improvements were staggering, and it's still mostly impressive two years later, but there are a few games, like HL, that punish my system.

    • @jayclarke777
      @jayclarke777 Рік тому +1

      Went from a 1050Ti to a 3060Ti. It was like going from VHS to Blu-ray

    • @Trainboy1EJR
      @Trainboy1EJR Рік тому

      @@jayclarke777 as someone who had a 2gb GT640, I gotta say that Textures high, shadows low, everything else off looked really good. Never went past 40% gpu usage because of a CPU bottleneck. XD Played Lost Planet 2 totally cranked at 17fps. Had the game memorized from PS3, looked AMAZING on PC. Just Cause 2 24fps in the city, only like 15%GPU usage. XD
      Upgraded to 12gb 2060. Most Vram minimum price, high textures are what matters. Can’t wait to finish going through all my favorites in 4K 120fps! XD

  • @voteDC
    @voteDC Рік тому +2

    The VRAM issue only exists for those who absolutely must run at top settings. I have a secondary PC I use just for media. It's an i7-4770K, GTX 970, and 8GB of DDR3 (in single channel), so not exactly a modern system. It runs Hogwarts Legacy at 1080p, all low settings (draw distance on high) with Quality FSR, and rarely drops below 60FPS. Sure, it doesn't look as good as it does on my main gaming PC, but it still looks and runs great.

  • @Hakeraiden
    @Hakeraiden Рік тому +6

    At this point I'm scared to buy the 4070, which will be released soon. Not sure if I should wait for AMD's 7800 XT or 7700 XT reviews instead. I recently did a complete upgrade of my PC after more than 9 years; I finally want to play games locally and not rely on streaming (which is still awesome as a fallback). For now I will play at 1080p, but I'm considering upgrading to 1440p in a year or less.

  • @laszlomiko9085
    @laszlomiko9085 Рік тому +2

    I'm running Hogwarts Legacy on an RX 6600 XT at 1080p. With no upscaling, the high preset, and RT off, VRAM usage was around 7.x GB. I changed the textures to medium and now it's usually under 7GB. At 1080p you can't notice the difference anyway; the game was made with 4K in mind. That's where you need ultra textures (and also where you need a high-end GPU); high is probably fine at 1440p.

    • @TheTeremaster
      @TheTeremaster Рік тому +1

      TBH i feel like if you're shelling out for a high refresh 4k monitor, you can afford an XTX or a 90

  • @admiralcarrot756
    @admiralcarrot756 Рік тому +10

    Meanwhile, AMD 6000 series offers you VRAM based on card tier like 6500 for 4GB, 6600 8GB, 6700 12GB, 6800 16GB, and lastly 6900 for 16GB too.
    Nvidia 3000 series be like... 3050 8GB, 3060 8GB, 3070 8GB, 3080 10GB, 3090 24GB. See the problem there?

    • @user78405
      @user78405 Рік тому +2

      Having 8GB can feel like having 12GB in some games, and 10GB can feel like 16GB. The difference comes down to development methodology: conservative memory usage that doesn't tax your system, versus lazy developers who grab more VRAM to cover up a game's flaws, like Forspoken, which I find embarrassing for a "quality" AAA title. The Doom brand, meanwhile, has always been carefully respected and still runs on every card today, even 4GB GPUs, thanks to the genius work Carmack put into idTech. I wish every developer had that kind of brains for making games. With something like Forspoken, you can tell the developers were careless, and one lazy developer can bring a whole team down with the bad energy they spread; it reads like the work of people who don't like their job.

    • @VDavid003
      @VDavid003 Рік тому +1

      Actually, the regular 3060 is 12gb which is even weirder.

    • @silvershines
      @silvershines Рік тому

      Overall the line-up isn't too weird once you realize the original plan was for there to be an RTX 3080 20GB. But then the crypto boom happened and Nvidia decided to chase sales volume for that sweet crypto cash. Better to produce more cards than a good product.
      The past few crypto booms (which were isolated to AMD cards) also showed that regardless of what you do, you will have to deal with a bunch of cheap second-hand used cards cannibalizing your new-card sales. So since your company is going to be the baddie no matter what happens, you might as well raise the price and condition people to accept higher prices.

  • @zdspider6778
    @zdspider6778 Рік тому +2

    Dude, what are you talking about? 16GB should be standard by now! Instead, we get GPUs that are way more expensive than they should be, with shitty memory buses and crappy VRAM capacity. You now get less hardware for more money. The whole GPU market is disgusting rn. Intel is the only one (God help us) that seems to make any sense, but only because they're trying to enter the market. If they had a better product, with better drivers, they would have fleeced us too.

  • @VisibleVeil
    @VisibleVeil Рік тому +10

    VRAM is not niche, because those triple-A titles will be on sale for the rest of us in 1-2 years' time. When that happens, how will we play them with the limited VRAM on these cards?

    • @gruiadevil
      @gruiadevil Рік тому +1

      You won't. Lol. You'll buy a new, better, much more expensive card and thank big Daddy nVidia for giving you another shite product.

  • @Ebilcake
    @Ebilcake Рік тому +1

    The issue is only really being highlighted because of The Last of Us. I'm on a 3080 FE / 5800X3D and it runs just fine: it stays just below the 10GB limit with DLSS and high textures, and 7 hours in since the hotfix it's been perfectly stable. 3440x1440 HDR with DLSS Quality, 80-120fps. No issues at all, butter smooth. The game looks stunning at times.
    If NVIDIA had got a day-one driver out that addressed the crashing, I wouldn't even know there was a VRAM issue, and that's been the case in all the games you mentioned.

  • @capnmoby9295
    @capnmoby9295 Рік тому +4

    It's probably the devs becoming more and more complacent and the games becoming more and more complicated

  • @karlhungus545
    @karlhungus545 Рік тому +2

    Exactly, it's not a problem for gaming at the resolution these cards were intended for: 1440p. Big thumbs up to this guy for having some common sense! All of these 'sky-is-falling' not-enough-VRAM videos use crap, unoptimized, poorly coded console ports as 'evidence'. First, almost NOBODY (less than 1% of gamers) has a 4K monitor. Of those, most have a 4090, which is the only truly capable 4K card on the market. Second, if you play nothing but crap, boring, run-around-with-a-wand/gun/stick console games... uh, GET A CONSOLE!! Why would you spend so much cash to try and play console games on a PC at 4K?! 🙄🤣 Also, learn the difference between 'allocated' and 'used' VRAM; Afterburner is NOT accurate. My sons have a 3060 Ti and a 3070 respectively at 1440p and have experienced ZERO issues playing at ultra settings in actual PC games (Total War: Warhammer 3, Sons of the Forest, Squad, etc.). DCS World is far and away the most demanding PC game you will ever play, and I didn't have any issues with it when I had the 3070 (gave it to my son), other than having to turn AA down because it's such a system hog. If you want so much useless VRAM, buy AMD! (you won't) Nvidia will do what they want and couldn't care less what YT thinks, because you'll keep buying no matter what...

  • @simon6658
    @simon6658 Рік тому +7

    It's always good to force GPU makers to add more VRAM.

  • @hasnihossainsami8375
    @hasnihossainsami8375 Рік тому +1

    The problem with the example at 8:35 is that most of these are either multiplayer games that, besides low memory, require very little power to run or are far older games that, again, require very little power and memory to run with modern hardware. This is the reason why these games top the steam charts, because the kind of GPUs that are needed to run these games at an acceptable quality are easily accessible and/or affordable for the vast majority of people. You don't need to spend an entire PC's worth money on just the GPU to get a decent experience out of these kinds of games.
    Yes, people who play AAA games are a niche; it's an expensive hobby. But then again, people who buy GPUs for $600-700 or more are just as much a niche. They buy these cards *because* they expect to be able to run games that the vast majority cannot. It isn't unreasonable to think that the majority of people who own a recent $600+ GPU would want to try out at least one AAA game during its relevancy, and this is where the argument falls apart.
    If I'm spending the kind of money on hardware that I expect to be able to comfortably play games on without resorting to unacceptable graphics quality, the hardware should meet those expectations. If the argument against this is "don't play AAA games," then one might as well say "don't buy expensive GPUs." Let's all just stick to our GTX1060s and play Apex and Dota forever.
    Or buy AMD.

  • @AntiGrieferGames
    @AntiGrieferGames Рік тому +10

    Stop with the VRAM drama and tell the devs to optimize their AAA games like indie games do!

    • @naipigidi
      @naipigidi Рік тому +5

      I'm putting you on the list of people with a brain.

    • @AntiGrieferGames
      @AntiGrieferGames Рік тому

      @@naipigidi I don't get what that means. lmao.

  • @ChusmaChusme
    @ChusmaChusme Рік тому +1

    9:00 I'm a pretty heavy DaVinci user and I usually cap out my 16GB card, with about 4GB more getting paged to system RAM. That's usually with something like 6K BRAW footage with frame interpolation, noise reduction, and a couple of other nodes applied. When it came to choosing between the RX 6800 XT and the RTX 3070 at the time (this was during the mining boom, so prices didn't make sense), the 16GB made sense for my kind of use.

  • @hyena-chase2176
    @hyena-chase2176 Рік тому +7

    I find 12GB of VRAM about right and would not go any lower if buying a new GPU in 2023. I play a few games, like Rust, that use 10+ GB at 1440p, so the more VRAM the better imo

    • @Pimp_Shrimp
      @Pimp_Shrimp Рік тому

      Jesus, Rust got chunky. I used to play it on a 970 (albeit at 1080p) just fine many years ago.

  • @StuffIThink
    @StuffIThink Рік тому +5

    Just got a 6650xt. Don't really care if I can run games on ultra in the future. As long as it holds out a few years playing on medium or higher I'll be happy.

    • @gruiadevil
      @gruiadevil Рік тому +4

      Yes. But you bought it cheap.
      Look at how much a 3070, 3070Ti, 3080, 3080Ti cost. Those are the ones discussed here.
      Not the cheap products. When you buy cheap, you say to yourself "If I can play new games at a combo of Medium/High Settings, so I can enjoy the game, I'm satisfied. If it lasts me for the next 2-3 years, I'm satisfied."

    • @StuffIThink
      @StuffIThink Рік тому +6

      @@gruiadevil he asked people with 8 gb cards what they thought. Just answering his question.

    • @vanquishhgg
      @vanquishhgg Рік тому

      Haven't had any issues with my 6650 XT Hellhound. It will last me another year until I do a full rebuild

  • @nixboox
    @nixboox Рік тому +2

    The issue isn't that deep. You're talking about needing... let me rephrase that... "needing"... to upgrade your graphics card when you have one from the LAST generation, because it doesn't "quite" run all the latest games at their maximum settings. But for most of the real world this idea is insane. I have a pair of Nvidia GeForce 980s running in SLI that I have used since new. They have run every. single. game. I've wanted to play for the last ten years. It is only in the last year that I've found games I am incapable of playing, because the 4GB of VRAM on the cards is too little. No one who has a 30-series card needs to upgrade for any valid reason. Those cards will run the latest games for the next 8 years with no problems. At this point I would consider upgrading to a 40-series card because my 980s are, what, five generations old? The problem you ALL have is that chasing the next newest thing will always leave you unfulfilled. Learn to be happy with what you have and be thankful you don't have less. That's a general rule of thumb for living your best life.

    • @3rd.world.eliteAJ
      @3rd.world.eliteAJ Рік тому

      Amazingly put! These people want all the latest and greatest RTX features with 20GB of VRAM for the lovely price of $200... while simultaneously ignoring the fact that developers are just milking consumers with terrible ports and remakes year after year.
      These same people say that developers are now targeting console VRAM budgets. Yet even new console games are running into performance issues: Redfall is running at 30fps on Series X without a 60fps performance mode available at launch. LOL... Somehow this makes 8-10GB obsolete? No, the developers are making PC gaming obsolete... Requiring 400W+ GPUs just to run games at 1440p 60fps is absolutely hilarious.

  • @LordAshura
    @LordAshura Рік тому +2

    The problem is that when people pay good money for 3070/3080, they expect it to do high resolution with RT. But with high resolution and RT comes high VRAM usage and it is increasingly clear that these cards will not perform to expectations in the future.

  • @Lev-The-King
    @Lev-The-King Рік тому +20

    Hope AMD takes the opportunity to clown Nvidia for this... They probably won't.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 Рік тому +8

      They did it at the time of the gtx 970 and its 3.5GB of Vram.

    • @lotto8466
      @lotto8466 Рік тому

      @@nombredeusuarioinnecesaria3688 My 6750 XT has 12GB and plays Hogwarts Legacy at ultra perfectly

    • @jay-d8g3v
      @jay-d8g3v Рік тому +1

      16gb on 6900xt, yum yum, newer 7xxx pushing 24

  • @MrFreeman1981
    @MrFreeman1981 Рік тому +1

    Still rocking my 11GB 1080 Ti, all fine at 1440p

  • @aeneasoftroy9706
    @aeneasoftroy9706 Рік тому +3

    Next gen is here, and that's that. Moving forward, you're just going to need high-end hardware to play the latest games on PC. That doesn't mean you can't PC game; it just means you need to have proper expectations for your hardware.

  • @Electric_Doodie
    @Electric_Doodie Рік тому +2

    Unpopular opinion: VRAM isn't the issue, the consumer is.
    We can all talk about how bad Nvidia's pricing is, and it is bad. But so is AMD's with the 7000 series currently.
    Back in the day, many more people thought of the different Nvidia 10xx/20xx cards in "tiers".
    And while the 3070 Ti was pretty powerful, and so was the 3060 Ti, they are all more low-to-mid-tier cards by that measure. We are now ~2 years past their release; playing a AAA game at ULTRA settings at 1440p/4K isn't their purpose anymore, even if that's what some influencers still test them at.
    Does the extra VRAM of AMD's 5000/6000 series do them any favors, though? Sure, unlike with Nvidia, you can probably set them to 4K/ultra and not run into the VRAM bottleneck in some games, but at what framerate?
    The 6750 XT, even with more VRAM, performs worse than a 3070 Ti in HUB's Hogwarts Legacy testing (average and 1% low FPS), and the 6800 XT performs worse than a 3080, their rough "counterparts" on the green side.
    There are obviously other things worth noting too: power consumption, performance outside gaming (DaVinci, Blender, etc.), drivers.
    12GB of VRAM on a 4070 Ti doesn't seem like plenty, but it's a 1440p card rather than the 4K card so many people hype it as, and they completely forget that.
    If you don't care about anything besides gaming, don't need the special features (DLSS, Reflex, ray tracing, etc.), and don't mind the power consumption (which means more $$$ spent over time), AMD might be the right choice for you. Especially if you can get the cards cheaper (like here in Germany, where a 7800 XT is ~100€ cheaper than a 4070 Ti).
    Always vote with your wallet, not for red or green because someone said so / influenced you.
    People just blindly follow their favorite influencer's choice and don't seem to think much anymore.

  • @OffBrandChicken
    @OffBrandChicken Рік тому +4

    You notice that the only people who try to even remotely justify 8GB nowadays, even on lower-end graphics cards, tend to be Nvidia users?
    I think some people are coping and hoping they didn't make a bad decision going with Nvidia this time.

    • @gruiadevil
      @gruiadevil Рік тому

      The amount of Copium is high in these comments :))
      I noticed that too.
      And they're not just nVidia users. They're nVidia users who bought a 3070-3080 card during the boom. Ofc they are going to lower settings. They are still paying the loan they took to purchase that card :)))

    • @OffBrandChicken
      @OffBrandChicken Рік тому +1

      @@gruiadevil It's crazy how predictable their responses are. Like, "most top Steam games don't even require it."
      As if the reason those games are still the most played isn't that the players' graphics cards can't handle more.
      Most "gamers" are gaming on laptops/prebuilts with lower-end graphics.
      People who are building/upgrading are doing so with the intent of playing modern games, because you wouldn't spend that kind of money otherwise.

    • @David-ln8qh
      @David-ln8qh Рік тому

      Don't you need 8gb to say 8gb works for you?

  • @FrankCrz-pg1mu
    @FrankCrz-pg1mu Рік тому +2

    I'm having this problem now with my RTX 3070 8GB: I have enough power to play at 1440p ultra/high settings but not enough VRAM for the game to be stable. Tbh, I'm starting to think even 16GB of VRAM won't be enough for next-gen games.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Рік тому +1

      16GB is going to be the new mainstream VRAM for 1440p, like 8GB became ~9 years ago. I'm going to sell my RTX 3070 under market price to be rid of it fast, around 440€ with free shipping here in Sweden, and buy a new RX 6750 XT for 550€ or a used RX 6800. The only game that struggles for me now is MW2: at 1440p with medium-high mixed settings it easily hits 8GB; I'm basically at 6-7GB of usage

  • @axxessdenied
    @axxessdenied Рік тому +10

    Seeing how things are unfolding makes me pretty happy that I picked up a 3090. I've managed to hit 23+gb usage in CP77 with a bunch of texture mods.

    • @scythelord
      @scythelord Рік тому +5

      Yep, same here, but I knew this was coming. You can't stagnate at VRAM levels that were available nearly 10 years ago. The Radeon R9 290X was available with 8 gigs of VRAM back in 2014. 8 gigs was good for 2016. Today it's practically minimum tier. Double or triple that is just more sensible.

    • @r3tr0c0e3
      @r3tr0c0e3 Рік тому

      @@scythelord Depends on what you play; 8GB is still enough for games from 2017/18, and most people play older games like CS:GO, MMOs, etc.
      So only a minority plays unoptimized garbage triple-A titles

  • @LagunaFox
    @LagunaFox Рік тому +1

    VRAM matters only relative to your resolution. Seriously, it is that simple. I have a laptop with a 1650 in it; that is 4 gigs of GDDR5 memory. I play RE4 just fine at 1080p. Do I have all the graphics settings jacked up? No, and you know what... the game still looks gorgeous. People seriously need to get off their high horse about jacking all the graphics settings up to ultra or epic or whatever the highest preset is. This isn't 2010 anymore, people; the difference between high and ultra/epic is almost imperceptible (as a preset anyway; certain settings like draw distance will make a difference), you really won't notice it. Take the next step down from high to medium and you will probably notice a difference, but is it a big one? Not really.
    People act like if you can't play a game with maxed-out settings it isn't even worth playing, and it really disgusts me.

  • @16xthedetail76
    @16xthedetail76 Рік тому +5

    My GTX 980 Ti will hopefully keep going for another 2 years...

  • @volkswagenmember
    @volkswagenmember 6 місяців тому +1

    Picked up an AMD RX 7800 XT with 16GB to avoid this problem, and I'm a 1080p gamer

  • @Verpal
    @Verpal Рік тому +9

    Considering there is only a single GPU that was ever released with 10GB of VRAM, if the devs have already decided "SCREW 8GB, let's put in more textures!", I don't see how or why they would stop at 10GB instead of the much more popular 12GB threshold.

    • @scythelord
      @scythelord Рік тому +6

      They literally didn't even consider PC graphics cards when making the game. It's made to use what the Xbox Series X and PlayStation 5 can use, which is a lot more than 8GB. This isn't a result of them upscaling anything for the PC release or intentionally screwing anyone. They're just straight-porting the same game over.

  • @nyrahl593
    @nyrahl593 Рік тому +2

    The EVGA 3090s had their VRAM in clamshell mode, and it's frustrating, because those cards clearly did not have issues running that way. So I really wonder how much more it would cost vendors to double the VRAM on current chips, since the x070 series must (JEDEC standards and all) support up to 16GB, and the x080s 20GB. (The capacity math is sketched below.)
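
    For reference, the capacity arithmetic is simple: a card gets one 32-bit memory channel per chip, and clamshell mode hangs two chips off each channel. A minimal sketch, assuming the 1GB-per-chip GDDR6X parts that Ampere actually shipped with:

      # VRAM capacity = memory channels x chips per channel x GB per chip.
      # One 32-bit channel per chip; clamshell puts 2 chips on each channel.
      def vram_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
          channels = bus_width_bits // 32
          return channels * gb_per_chip * (2 if clamshell else 1)

      print(vram_gb(256, 1))                  # 8  -> a 256-bit board like the 3070
      print(vram_gb(256, 1, clamshell=True))  # 16 -> same board in clamshell
      print(vram_gb(320, 1))                  # 10 -> a 320-bit board like the 3080
      print(vram_gb(320, 1, clamshell=True))  # 20 -> the rumored 3080 20GB
      print(vram_gb(384, 1, clamshell=True))  # 24 -> a 384-bit clamshell board like the 3090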

  • @soup-not-edible
    @soup-not-edible Рік тому +3

    When I bought a 16GB RX 6800, I wasn't thinking much about its VRAM, but this is a godsend.
    I prefer having more RAM that's slower than less RAM that's faster.
    (Definitely laughing at Nvidia for the backlash of this "supposed" planned obsolescence)

    • @r3tr0c0e3
      @r3tr0c0e3 Рік тому

      Ironically, a 3080 with less VRAM will still be ~30% faster than your 6800, and no amount of VRAM will change that lol
      30fps with stuttering is just as bad as 30fps without; when it comes to that, these cards will just be e-sports cards lol

  • @CryingWolf916
    @CryingWolf916 8 місяців тому +1

    They made the RTX 3080 a 1440p card with 10GB of VRAM, which means you can't play some new games at ultra without running out of VRAM. It's pretty smart of them, but scummy at the same time

  • @MrGalax00
    @MrGalax00 Рік тому +3

    I wanted a 3070 or 3060 Ti for my HTPC but ended up getting a 6700 XT because it was cheaper and had more VRAM. On my main PC I'm using a 3090 FE, so I don't have to worry about VRAM usage.

  • @jorge86rodriguez
    @jorge86rodriguez Рік тому +2

    Thank you, and no, you are not being controversial, you are being honest. The VRAM issue only affects "AAA" (whatever that means) games on high settings. There is a huge catalogue of games that do not demand much VRAM, and for the ones that do, turning off or lowering some of those graphically intensive settings alleviates the issue.
    That does not mean it is fine; it sucks that NVIDIA puts a limitation on a customer who spent so much money on a premium product. However, it is not the general PC gaming issue a lot of people have made it look like.

  • @AlchemyfromAshes
    @AlchemyfromAshes Рік тому +4

    I'm on the fence on this being a niche problem. I would agree saying it's a bit more of a niche that people buy AAA PC titles on day one or immediately after release vs. console. I think there are a fair number of PC gamers who are like me though. I won't pay the initial AAA price for a game. They are bound to have serious bugs and I'm a bit older and used to older pricing, so the current standard AAA price just seems crazy to me. I will look at them seriously 1-2 years from release though when the price is cut in half in a steam sale etc. If Hogwarts Legacy has an OK sale any time this year for example I'll be picking it up. Definitely at half off, but maybe even if it just drops down to 75% retail. I picked up CyberPunk as soon as it went to 50%.
    I upgrade my video card every 4-6 years which I think is relatively common, so there is a fair chance that VRAM issues are going to impact me within the next year, and before I upgrade again (picked up a 3080 about a year and half ago). So, to me, the problem isn't niche as much as it's just delayed a bit. AMD has shown that VRAM cost doesn't have to be a serious factor in being competitive pricing wise. NVidia is just making up their own rules at this point and testing what the market will bend over and accept it seems. AMD is happy enough to do the same. Just my opinion, but at the obscene prices that NVidia and AMD are charging for cards right now, RAM shouldn't have ever been an issue. It should be plentiful on anything but a bargain bin card. It's like buying a luxury car and the dealer trying to stick the crappiest tires and no accessories on the thing, all while telling you that you should just shut up and be happy they're selling it to you. You don't expect the dealer to skimp on bells and whistles when you're paying well. It seems the video card manufacturers have lost touch with the average consumer and don't realize or care that, for most of us, video cards ARE luxury items. Very few of us can treat them like throw away electronics and just upgrade every time a new model comes out to keep up.
    From experience with friends and acquaintances, I would venture that there are a fair number of people this already affects, even at the AAA price. For example, to people who are also console gamers, or coming to PC gaming from consoles, the AAA price is just the normal price of a game; it's not expensive. Most people probably aren't buying a AAA title a month, but it seems likely that a large number of gamers pick up a AAA title or two a year, especially ones they have been waiting on for a long time. I think this could at least hurt momentum for interest in PC gaming in the near future.

  • @malikgod
    @malikgod Рік тому +2

    8GB is enough; games are currently optimized terribly! It's also just a scam to sell new cards with more RAM, which for the most part is really not going to improve most games. RE2 and RE3 also showed over 12GB of reported usage on 8GB cards, and those games did not crash

  • @brkbtjunkie
    @brkbtjunkie Рік тому +4

    Something to be aware of is cached memory vs actual memory use. Many games load up the VRAM to the hilt, but only a portion of it is actively being used. 8GB was fine for me at 1440p/165Hz and 4K/60Hz on my 2070, and 10GB is fine on my 3080. I have zero issues with VRAM, and the hitching I do sometimes get is not a VRAM issue; it's a shader-compilation or engine frame-time issue.

  • @matd2100
    @matd2100 Рік тому +1

    Just don't run on ultra. High gives you 90% of the graphical fidelity and is much less VRAM-hungry

  • @c523jw7
    @c523jw7 Рік тому +5

    You bring up some good points here. All that matters is your personal experience and your card fitting your purpose. 10GB has been more than enough in the games that I play, and I'm really happy with my experience. Now, I do think Nvidia suck for not giving their cards enough VRAM; no excuse for that. A 4-year upgrade cycle seems about right; it's just a shame VRAM usage has really spiked in the last few games. Best to future-proof any card purchase moving forward though.

  • @dazdaz2050
    @dazdaz2050 Рік тому +1

    @Vex nice video, I was hoping for this, especially as I have the 3080 10GB.
    Sorry if this has already been mentioned below, but I don't have the patience to read all the comments to check, and I have a lot to try and get across, so this post might be a bit disjointed lol. The lack of VRAM is only half the story, and people with powerful cards like the 3080 and 3070 Ti should consider these facts well before upgrading:
    1. Memory bus/controller size on the GPU, and 2. PC system RAM speed.
    When you run out of VRAM, all that happens is that the extra data needed for game assets spills over into system RAM, THAT'S IT. Yes, system RAM is traditionally a lot slower than VRAM, but most cards with a large memory bus can swap data in and out of VRAM so fast that it somewhat compensates for the lack of VRAM capacity; tuning your PC's system RAM will also help dramatically, and it's free to try. (A rough bandwidth comparison is sketched below.)
    The 3080 has two things going for it: GDDR6X and a 320-bit bus. I personally overclocked my dual-rank, dual-channel DDR4 system RAM from 3200 CL16 to 3766 CL16 (the maximum my CPU's IMC could handle with four sticks), which I recommend everyone try well before upgrading something as expensive as a graphics card.
    If I were on an Intel system, my RAM speed could be 4000+. Faster RAM would set you back maybe $100 once you factor in the sale of your old kit, compared to an extra $600+ for a new high-end GPU with 20GB+ of VRAM.
    Finally, cap your fps with RivaTuner to 60, or at least 10fps under what you see your card hitting consistently, and you'll most probably find you're still good to go.
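
    To put rough numbers on that spillover penalty, here is a minimal peak-bandwidth sketch (textbook peak figures; real-world throughput is lower, and the spillover traffic is additionally capped by PCIe, roughly 32 GB/s for Gen4 x16):

      # Peak bandwidth = transfers per second x bytes moved per transfer.
      def ddr_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
          # DDR4/DDR5 moves 8 bytes (64 bits) per channel per transfer.
          return mt_per_s * 1e6 * channels * 8 / 1e9

      def gddr_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
          return gbps_per_pin * bus_width_bits / 8

      print(f"DDR4-3200 dual channel: {ddr_bandwidth_gbs(3200):.1f} GB/s")               # ~51
      print(f"DDR4-3766 dual channel: {ddr_bandwidth_gbs(3766):.1f} GB/s")               # ~60
      print(f"3080 GDDR6X, 19Gbps/pin x 320-bit: {gddr_bandwidth_gbs(19, 320):.1f} GB/s")  # 760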

  • @TheWretchedWorld
    @TheWretchedWorld Рік тому +7

    I said 10gb wouldn’t be enough for a 3080, considering the next Gen consoles have a pool of 16gb of vram. I bought a 3090 and was told by other gamers that I wasted my money and 10gb is plenty. The writing was on the wall, you can predict the future hardware requirements just by looking at the new consoles.

    • @TheEnd0117
      @TheEnd0117 Рік тому +4

      Well they don’t have dedicated VRAM they have 16gb of unified ram. Consoles share their ram with the system OS and other processes. So games don’t have access to 16gb of VRAM. Consoles are also easier to optimize so it’s not really an issues.

    • @albertoalves1063
      @albertoalves1063 Рік тому +1

      It's not 16GB of VRAM, it's 16GB of shared memory. But even if it were 16GB of VRAM, that shouldn't be a problem for any RTX 3000 card: we all saw cards like the GTX 1050 with 2GB and the GTX 1060 with 3GB and 6GB when the consoles of the day, the PS4 and Xbox One, both had 8GB of shared memory, and those cards could run every game. The GTX 1050 was weaker but ran games roughly like the consoles did, and the GTX 1060 was a lot better. The amount of VRAM is not the problem; the problem is optimization. If a game like Red Dead Redemption 2 can run on a GTX 1050 2GB with ultra textures at 1080p/30fps, these new games should run on any RTX 3000 card without problems. And to be clear, by "run without problems" I'm not saying an RTX 3050 4GB should run The Last of Us at ultra, but it should at least run it at medium/high.

  • @pinakmiku4999
    @pinakmiku4999 Рік тому +2

    I absolutely agree with the later part of the video. People are making a big mess of it, especially the AMD sheep who rant about this to justify their card purchase. Doesn't matter.

  • @user78405
    @user78405 Рік тому +3

    A lot of people have said that adding an extra 16GB of system RAM to a 3070 system fixed the majority of the frame stuttering in The Last of Us and RE4 Remake, and I noticed TLOU using 24GB of system RAM... what a shocker. Now I know why the 3070 dipped to 20fps: 16GB of RAM was bottlenecking the flow of data. With 32GB the stutters stop, there's no more shader loading during gameplay, and my minimum fps went up to 41

    • @OffBrandChicken
      @OffBrandChicken Рік тому +3

      When your GPU can't fit more into VRAM, it spills over into system RAM. It's still a VRAM issue at the end of the day.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Рік тому

      My system can get up to 16GB of combined VRAM usage, but only 8GB of that comes from my RTX 3070. Games crash or run at low fps when VRAM is not enough, and in your case the VRAM wasn't enough, so it spilled into system RAM xD

  • @Tom-sd2vi
    @Tom-sd2vi Рік тому +2

    The 20GB RX 7900 XT seems like the only reasonably priced, good GPU with a proper VRAM configuration this gen.