Nvidia's 16GB RTX 3070... Sort Of: 16GB A4000 vs 8GB RTX 3070

  • Published 21 Jul 2024
  • Gigabyte GeForce RTX 40 Series Laptops: www.aorus.com/en-au/laptops/l...
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 3050 - geni.us/fF9YeC
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3080 - geni.us/7xgj
    GeForce RTX 3090 - geni.us/R8gg
    Radeon RX 6500 XT - geni.us/dym2r
    Radeon RX 6600 - geni.us/cCrY
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6900 XT - geni.us/5baeGU
    Video Index:
    00:00 - Welcome back to Hardware Unboxed
    00:52 - Ad-spot
    01:48 - The RTX A4000
    04:32 - Test System Specs
    04:48 - The Last of Us Part 1
    05:56 - Hogwarts Legacy
    06:48 - Resident Evil 4
    07:21 - Forspoken
    08:04 - A Plague Tale: Requiem
    09:10 - The Callisto Protocol
    09:35 - Final Thoughts
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production and may have since changed.
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science & Technology

COMMENTS • 2K

  • @beigebox1990
    @beigebox1990 1 year ago +1222

    Nvidia cuts corners where it wants. VRAM has been a stain on their cards for a long time.

    • @existentialselkath1264
      @existentialselkath1264 1 year ago +105

      Yeah, even my 2060 Super was limited by its 8GB at 1440p 60fps. I don't understand how even the 3070 and 4060 still have the same limitation. What's the point of all that extra power if you can't use it?

    • @boingkster
      @boingkster 1 year ago +72

      @@existentialselkath1264 And 8GB is the same amount my 1070 Ti has... go figure, hey?

    • @PinHeadSupliciumwtf
      @PinHeadSupliciumwtf 1 year ago +35

      You can literally see the reason in this video. The A4000, which costs around $2K in Australia, is meant for productivity. If you sell a 16GB 3070, there's no reason for anyone using Nvidia for graphic design or whatever to buy an A4000.
      Edit: he said it himself, and I commented just a few minutes before watching that part.

    • @existentialselkath1264
      @existentialselkath1264 1 year ago +33

      @@PinHeadSupliciumwtf Power consumption? Extra productivity features and support? The A series isn't just about VRAM.

    • @christophermullins7163
      @christophermullins7163 1 year ago +19

      I wouldn't say they're "cutting corners".
      Nvidia is selling Lovelace by preemptively limiting Ampere's VRAM. This is the reason 3070 Tis are getting as cheap on my local market as 6800 XTs. 3070/Ti buyers got ripped off. RIP

  • @apoc341
    @apoc341 1 year ago +738

    The 8GB of VRAM on the RTX 3070 is typical of Nvidia's business model these days: making their cards hamstrung enough that you have to upgrade to a new card within a couple of years of launch. They don't want another GTX 1060 6GB lasting for years on their hands again.

    • @JackJohnson-br4qr
      @JackJohnson-br4qr 1 year ago +61

      The GTX 1060 6GB was garbage even 2 years ago. I have an RX 480 8GB, and lots of older games easily eat more than 6GB. Red Dead Redemption 2, Shadow of the Tomb Raider, Rise of the Tomb Raider... you name it.

    • @StatusQuo209
      @StatusQuo209 1 year ago +125

      I think the more solid comparison would be the 1080 Ti. That card is a mistake that Nvidia will never make again. That card was too good for too long.

    • @TRONiX404
      @TRONiX404 1 year ago +13

      100% Nvidia's intention; the 10 and 20 series cards had more VRAM.
      The 1080 Ti is still a solid 1440p GPU.
      God of War 4K 60fps HDR ua-cam.com/video/k09L0Wny7EQ/v-deo.html

    • @NothingXemnas
      @NothingXemnas 1 year ago +29

      @@JackJohnson-br4qr
      You say that, but I still own a 1060 6GB and it runs everything I want. It is all about keeping up with expectations (lowering graphics as needed). My sister fucking smashes in Rainbow Six Siege.
      Maybe it is just how we treat our hardware at my home?
      Edit: mind you, I say "lower graphics", but I still play Stellaris and GTA at Full HD with all graphics maxed out. R6S also runs on my PC with graphics on high at Full HD. Whatever you call "trash", I have no idea.

    • @KrisDee1981
      @KrisDee1981 1 year ago +36

      The 1070 was great too: 8GB of VRAM 7 years ago.

  • @ionutbagmuianu884
    @ionutbagmuianu884 1 year ago +795

    So the RTX 3070 is the spiritual successor of the 970: a good GPU, but limited by VRAM. History repeats itself.

    • @katyuuki2261
      @katyuuki2261 1 year ago +58

      So the 5000 series will be baller like Pascal?

    • @Herbertti3
      @Herbertti3 1 year ago +54

      @@katyuuki2261 Hopefully. Would be nice to get Nvidia for a change. I'm not willing to skimp on VRAM, and my price ceiling is €700.

    • @laorakaora1764
      @laorakaora1764 1 year ago +36

      @@katyuuki2261 No. By your calculation it should have been the 4000 series, and maybe it could have been, if not for the pricing and deceptive naming.

    • @chriswright8074
      @chriswright8074 1 year ago

      @@Herbertti3 You mean AMD

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 1 year ago +8

      @@katyuuki2261 That would be amazing, but I don't think Nvidia would ever make that mistake again.

  • @Thor_Asgard_
    @Thor_Asgard_ 1 year ago +332

    With the 40 series, Nvidia just tried to see how much they could insult us right to our faces without the stupid people noticing it.

    • @pk417
      @pk417 1 year ago +9

      If you don't know, the 4000 series isn't selling well.
      Wake up, crypto is over now.

    • @MistyKathrine
      @MistyKathrine 1 year ago +6

      Just get a 4090 or buy used.

    • @unruler
      @unruler 1 year ago +13

      They didn't realize that the crypto boom is over and that not just anything will fly anymore.

    • @Frigobar_Ranamelonico
      @Frigobar_Ranamelonico 1 year ago +3

      Poor people be like 😂. Have you ever had any other hobby? I assure you the PC one isn't that costly.

    • @prestige6390
      @prestige6390 1 year ago +36

      @@Frigobar_Ranamelonico Insecure people be like:

  • @nasko235679
    @nasko235679 1 year ago +407

    Buildzoid's brain is so massive he knows HUB's full inventory just from casually watching videos.

    • @vensroofcat6415
      @vensroofcat6415 1 year ago +66

      He knows what you did last summer. And more.

    • @MistyKathrine
      @MistyKathrine 1 year ago +30

      He knows their inventory better than they do.

    • @butifarras
      @butifarras 1 year ago +44

      Buildzoid is just built different

    • @Bajicoy
      @Bajicoy 1 year ago +2

      Lmao

    • @andersjjensen
      @andersjjensen 1 year ago +3

      @@butifarras And we're all grateful for it.

  • @GearheadK20C4
    @GearheadK20C4 1 year ago +306

    And yet some people will still defend the 3070 and 3070 Ti having 8GB of VRAM...

    • @Ausf
      @Ausf 1 year ago +15

      They offer what the market demands. If no one bought 8GB cards, they would stop trying to sell them. Devs can also do better with optimization scaling. Both HL and TLoU are running lazy junk code.

    • @existentialselkath1264
      @existentialselkath1264 1 year ago +116

      @@Ausf It's not just poorly optimised games. Forza Horizon is one of the better-optimised games out there. It's so well optimised that even my 2060 Super can run it flawlessly, with RT and everything, but I can't do it with max textures because I hit the 8GB limit.
      The fact that even a 3070 Ti is only capable of running Forza at the same settings as my 2060 Super is absurd, and there's no defence for it.

    • @Kryptic1046
      @Kryptic1046 1 year ago +80

      @@Ausf - The market isn't "demanding" 8GB cards; it's just that a lot of consumers didn't know any better, and Nvidia happily took advantage of that. HL and TLOU aren't the only games pushing to 8GB and beyond, and with UE5 set to become the engine of choice for many developers, games pushing well beyond 8GB at 1080p are going to become much more common. Nvidia knew all of this when they released the 30 series.

    • @Lionheart1188
      @Lionheart1188 1 year ago +1

      @@Ausf What a dumb-ass comment

    • @Ausf
      @Ausf 1 year ago +4

      @@Kryptic1046 They're two that have terrible scaling. As the comment above you mentioned, it is possible to get the scaling right. Obviously, lower VRAM means lower texture quality.
      Nvidia's new 70-class cards aren't selling well, so the market is waking up to not spending that kind of money on cards like that. Part of it is ignorance, sure, but I expect people to learn eventually and the market will sort it out. Once demand drops for 8GB, Nvidia will stop making them, except at the entry level. There are still people who just want to play on the cheap.

  • @TooBokoo
    @TooBokoo 1 year ago +421

    Just for pure hilarity, you should do a video on the 12GB version of the RTX 2060 that Nvidia, for some reason, decided to release. That might be the most random card variant of the last several years, given the VRAM amounts they gave the 3070 and OG 3080. It sure would be funny watching the 2060 do better in modern games than the 3070.

    • @troosimimimmi
      @troosimimimmi 1 year ago +18

      It's actually the same performance as a 2070.

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago +14

      Oh, so Nvidia GPUs are running out of memory when previously they had more than enough at the "same" settings... Yeah 😂 This is just manufactured outrage.

    • @montreauxs
      @montreauxs 1 year ago

      whatever

    • @turtle7389
      @turtle7389 1 year ago +7

      I do have one myself, because upon release last year it was the first and only real card around my region that became a somewhat reasonable purchase after the market slowly started to recover. I also use a 4K monitor and play graphically less intensive titles such as Deep Rock Galactic, Dirt 2.0 and Valheim, so I was confident it would still handle them well enough. And it did! DLSS definitely helped where available, but I have to admit that only Dirt 2.0 pushed over 6GB of VRAM, so I guess you could have just used a 6GB version anyway. But I agree, it would be interesting to see how it holds up in these newer titles.

    • @TooBokoo
      @TooBokoo 1 year ago +18

      @@turtle7389 Fire up RE4 Remake. My 3080 Ti was choking at 1440p, even with textures only at the medium setting (4GB) and ray tracing enabled.

  • @yetson
    @yetson 1 year ago +253

    For a moment I thought this video was about the modded 16GB RTX 3070 that had its memory modules replaced by Paolo.

    • @V1CT1MIZED
      @V1CT1MIZED 1 year ago +19

      Would be interesting to see a video on that

    • @zubairsaifi5850
      @zubairsaifi5850 1 year ago +3

      Bump

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +35

      I remember someone doing the same with an RTX 2070 Super. VRAM isn't expensive, yet Nvidia just wants to increase profits, even cheaping out on such a crucial component.

    • @alexusman
      @alexusman 1 year ago +7

      You mean by Vik-On?

    • @flimermithrandir
      @flimermithrandir 1 year ago

      Same, but this works just as well imo.

  • @larsbaer3508
    @larsbaer3508 1 year ago +303

    A 16GB 3070 or a 20GB 3080 would have been good for yeaaaars, so they cut it down to sell you a new card within only a couple of years...

    • @Electric_Doodie
      @Electric_Doodie 1 year ago +9

      Imagine, 3 years after release, setting one setting to High instead of Ultra so you don't bottleneck on VRAM. Crazy, I know.
      Back then people made videos about optimized settings for gaming (best FPS without losing any, or barely any, visual fidelity); now people just hate on VRAM all day.

    • @GewelReal
      @GewelReal 1 year ago +122

      @@Electric_Doodie Textures don't impact performance as long as you have enough VRAM, and they often provide the biggest visual difference, so yeah, we will be hating on too little VRAM.

    • @larsbaer3508
      @larsbaer3508 1 year ago +99

      @@Electric_Doodie Imagine spending $1,000 on a card like a 3080 two years ago and not maxing settings 2 years later... Imagine buying a "midrange" card like a 3070 for $700 and not being able to max settings at 1080p 2 years later...

    • @kimnice
      @kimnice 1 year ago +5

      Well, they need to sell cards to make money. If people buy a card and are happy with it for 5 years... then that's bad business and stupid for them.

    • @pk417
      @pk417 1 year ago +21

      @@Electric_Doodie I have a 3060 I bought for $208. It is weaker than a 3070, but I put all settings on ultra and I don't have to turn down settings.

  • @rg975
    @rg975 1 year ago +138

    Maybe Nvidia wanted this with planned obsolescence, but at the same time this is pissing off everyone who bought the 3000 series, and it will likely steer a lot of them to AMD next time they upgrade.

    • @sudd3660
      @sudd3660 1 year ago +8

      I felt that problem with the 3080 I got; I got it way too late, and it was way too little, too late.

    • @L1vv4n
      @L1vv4n 1 year ago +5

      I don't think it is planned obsolescence. If it was, they would have introduced the 4070 with 12 or 16GB of RAM to increase the chances that as many people as possible upgrade from a 3070 to a 4070, without waiting one more generation for the 5070.
      Now they have a situation where some are switching to AMD, if not for the memory then out of spite.
      As it is, I'm not sure what that was, except an attempt at saving costs at every opportunity.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +38

      Nah, they knew nobody buys AMD, so they went ahead and took full advantage of their monopoly.

    • @paulmememan508
      @paulmememan508 1 year ago

      You clearly don't know how normie brains operate if you really think that. I mean, look at Bose. Bose isn't an audio company; they are a "lifestyle company." Their home audio products sound like absolute dogshit (it's intentional, as they use equalization that lowers certain frequencies in the range of the human voice to let you speak over the music more easily in social situations), yet they are more expensive than technically and subjectively superior products. I was at my brother's last year, and he was playing music through his Bose Soundlink Revolve II speaker, and I thought "this sounds like shit", assuming it was some speaker he got at Five Below for $20. Even a $100 JBL Flip 4 sounded better than that $220 piece of garbage. Normie product choice is psychosocial value signaling first and foremost, not an analysis of the value proposition.

    • @EdU-od5ec
      @EdU-od5ec 1 year ago +23

      People will still buy Nvidia. People are sheep.

  • @StefandeJong1
    @StefandeJong1 1 year ago +84

    I sold my 3070 3 months ago for €450 to buy a 7900 XTX instead. Since the VRAM fiasco took hold, the second-hand price of a 3070 dropped to around €320 within 2 months.

    • @deleater
      @deleater 1 year ago +9

      The person who bought it must be cursing non-stop. Who would expect a €450 card to be incapable of playing at 1080p ultra within 3 months? LMAO xD

    • @NostalgicMem0ries
      @NostalgicMem0ries 1 year ago

      Should have gotten a 4080 or 4090 if you bought a 7900 XTX. It has way too high temps, on both core and RAM, not to mention driver issues.

    • @StefandeJong1
      @StefandeJong1 1 year ago +7

      @@NostalgicMem0ries Holy crap, when are people going to quit the 'driver issues' crap for AMD? That has been fixed for YEARS at this point, and I have not had a single issue. Also, the 7900 XTX was €1030, while the RTX 4080 was €1400 and the 4090 €2000. €1000 is already pretty ludicrous for a GPU in my opinion, but you do you. I undervolted my 7900 XTX and it runs full load at around 72°C at 280W.

    • @StefandeJong1
      @StefandeJong1 1 year ago +3

      @candyman I regretted not buying an RX 6800, but luckily I could sell my 3070 for 80% of the price I paid for it.

    • @giucafelician
      @giucafelician 1 year ago

      @@NostalgicMem0ries ua-cam.com/video/VbV7xWplOSI/v-deo.html

  • @Jas7520
    @Jas7520 1 year ago +80

    Just to clarify: you can use the Studio drivers with the 3070, but that is different from the Professional drivers the (fka) Quadros use. Studio is basically just a stable branch of the GeForce drivers, while the Pro drivers are a separate verified branch with a bunch of extra optimisations enabled (i.e., you're not artificially crippled in a bunch of pro workloads). Not justifying the Quadro tax, just making sure people know you can't just install the Studio drivers and get the same experience as the Pro drivers.

    • @MiGujack3
      @MiGujack3 1 year ago +17

      Good ol' segmentation.

    • @TheDravic
      @TheDravic 1 year ago +6

      @@MiGujack3 All that incredible research & development costs money, and Nvidia still needs to make a lot of profit or they'd be considered a failing business.
      People call it greed, but it's just the reality of how things are when you're the top dog. Just educate yourself and make an informed decision when choosing what tier of product you buy for your needs.

    • @andrei007ps
      @andrei007ps 1 year ago +4

      @@TheDravic Exactly. A lot of people just expect 2x performance every 2 years like it's some default rule (Moore's law), but each 2x of performance is more expensive than the last; it is harder and harder to squeeze more performance out of silicon.

    • @GeneralSouthParkFan
      @GeneralSouthParkFan 1 year ago

      @@andrei007ps Not even considering the rising costs of nodes, with TSMC 5nm being like $16K in total atm (SS8N being like $5K-7K in comparison? I think?). The AD102 chip + GDDR6X + GPU board alone probably accounts for like $600 in total BoM, probably leaning towards $700 counting everything else. But by no means does this justify what we're seeing at the low end atm; 8GB more of GDDR6(X) is ~$40 in total.

    • @PainterVierax
      @PainterVierax 1 year ago +6

      @@TheDravic Those prices are the result of shareholders' insatiable appetite and monopoly; they had plenty of surplus to cover R&D costs and things like certifications. Nvidia has milked both professionals and consumers for a long, long time, and there is no justification but greed.

  • @Amin52J
    @Amin52J 1 year ago +172

    I recently got a 7900 XT to replace my 3080, mainly because of VRAM issues, especially when playing at 4K. If it had more VRAM, I probably wouldn't have done it.

    • @BlackParade01
      @BlackParade01 1 year ago +20

      I got an RTX 4090 for the exact same reason (it's getting delivered today!), coming from an RTX 3070.
      The 4090 should last me the next 6-7 years with ease, I reckon.
      Edit: it just got delivered!

    • @mayonotes9849
      @mayonotes9849 1 year ago +41

      @@BlackParade01 Considering it requires you to sell a kidney (it's a joke), if it doesn't last you at least 6-7 years, that'll suck given the asking price.

    • @pk417
      @pk417 1 year ago +9

      @@mayonotes9849 It will last longer; the 4090 is a great GPU... with DLSS.
      In my country it sells below MSRP.

    • @theplayerofus319
      @theplayerofus319 1 year ago +25

      @@pk417 In 6-7 years it's barely a 1440p card, tbh, so no.

    • @toonnut1
      @toonnut1 1 year ago +4

      I've done the same

  • @solocamo3654
    @solocamo3654 1 year ago +32

    I learned about VRAM back in the GeForce4 Ti 4200 days. It's the one thing that has kept me on AMD for so long at this point, aside from price/performance. I bought a 128MB Ti 4200; my friends bought the 64MB version. Needless to say, mine lasted much longer. Same reason I went 6900 XT over 3080 by a landslide during the mining crisis.

    • @JustAGuy85
      @JustAGuy85 1 year ago +3

      Same reason I went 6700 XT 12GB over the 3060 Ti or 3070. At the time, the 3060 Ti was the competition for the 6700 XT. Now, with AMD's fine-wine driver improvements, the 6700 XT is a 3070 competitor. At the time, the 3060 Ti and 6700 XT were the two cards priced the same.
      I debated and debated because of ray tracing performance, and even DLSS vs FSR. FSR 2 was just a rumor at the time.
      BUT... I wasn't going to go from an RX 480 8GB from 2016 to a 3060 Ti/3070 8GB in 2022. All it took was 6-8 months for it to pay off and become the better card.
      I mean, I literally just got done playing Crysis 2 Remastered. I was seeing anywhere from 10500MB-11500MB of VRAM allocation, depending on the area. EVERY game I play pushes 9500MB of VRAM or higher. Every single one.
      So... I went with my gut instinct and chose the 6700 XT solely because of its 12GB of VRAM. I just wish prices weren't so high at the time, or I'd have gone up one more notch to the 6800 16GB. Don't get me wrong, I'd prefer a 6950 XT, but I'm just saying, I had a budget and I had a stance on these ridiculous GPU prices. I held out as long as I could, and eventually the XFX Qick 319 6700XT 12GB hit $500 on sale at Newegg. I bit the bullet and went with it.
      I've become nothing but happier as time has gone on with this card. And FSR 2 can be done VERY well. The Witcher 3 remaster is a prime example. I run it with all the RT goodies on, but with the main global illumination RT on performance, and at 2560x1440, all max settings, FSR 2 at Performance ACTUALLY LOOKS AMAZING. I never go below FSR 2 Quality in other games, but in the Witcher 3 remaster? It ACTUALLY looks beautiful, crisp and clean. Check it out.

    • @jd-foo
      @jd-foo 1 year ago

      The sad part is that even in those products that don't have a balanced amount of VRAM, Nvidia is still a better deal.

  • @TimmyJoePCTech
    @TimmyJoePCTech 1 year ago +9

    Man, 3:26! I've been waiting 5 years to get a glimpse of those GPUs! I always wondered how you could house so many. That's one way to do it!! Fun stuff, Steve!

  • @mikefize2279
    @mikefize2279 1 year ago +91

    The A4000 has been quite popular with the enthusiast small-form-factor crowd. The PCB is (apart from the power adapter) identical to the reference design of the 3060. You can retrofit a single-fan heatsink (3060 Palit/Gainward) to it and have an efficient, powerful ITX GPU... at a price ;)

    • @nadtz
      @nadtz 1 year ago +5

      It's actually been driving the second-hand price up, as I'm sure this video will as well. You used to be able to find them for under $500 fairly regularly; now it's more like occasionally for $500 and more usually $600.

    • @GeneralSouthParkFan
      @GeneralSouthParkFan 1 year ago +2

      @@nadtz I had to cancel an eBay order I had for $500 a month back due to a lack of funds, so you can imagine how pissed I was when pricing shot up. My best shot now is trying to find a good local listing and hoping for the best.

    • @mikefize2279
      @mikefize2279 1 year ago +3

      @@nadtz I don't think the SFF community is big enough to really make an impact. I think it has more to do with the fact that people really want an affordable card with lots of VRAM. Not only for gaming, but also for different kinds of production workloads, for ML, for AI purposes. A used A4000 is a lot of bang for the buck even at $600 for stuff like that.

    • @nadtz
      @nadtz 1 year ago

      @@mikefize2279 Possible/probable. Either way, it annoyed me that prices shot up over the last few months.

    • @GeneralSouthParkFan
      @GeneralSouthParkFan 1 year ago

      @@nadtz I'm in the works to try to get a good local deal in person, so who knows, maybe I'll be fine. Good luck to anyone else trying to fight for a good-priced A4000.

  • @introvertplays6162
    @introvertplays6162 1 year ago +32

    Hardware Unboxed: next to comparing the 3080 12GB with the 10GB, and maybe the 3080 Ti, I would really be interested in seeing tests of the Vega 56 with HBCC enabled. It would be really interesting to see if this technology can get you some much-needed performance increases these days.

    • @saricubra2867
      @saricubra2867 1 year ago

      The Vega 56 is slower than a GTX 1080 Ti, and the 3080 Ti is almost twice as fast as a 1080 Ti in raster, or way more if you include the RTX features.

    • @introvertplays6162
      @introvertplays6162 1 year ago

      @@saricubra2867 You misunderstood me. I want a separate video on the 3080 10GB, 3080 12GB and 3080 Ti, and another video for the Vega 56 with HBCC alone. I am well aware of where the Vega 56 stands vs other cards. I just want to know in how many games HBCC benefits you. And I want to know how much you benefit from 12GB on the 3080 vs the 10GB version in games where the 3080 already struggles today. The 3080 Ti would be a nice addition, to see how much the added bandwidth helps.

  • @richardx8202
    @richardx8202 1 year ago +24

    Hey Steve, thank you for revisiting this GPU... I actually own an RTX A4000, and it is a dream come true for an SFF enthusiast like me with its 140W power consumption.

    • @samk7500
      @samk7500 1 year ago +2

      Any issues gaming on the A4000? I have access to a lot of cheap refurbished mobile workstations with professional GPUs, due to my company replacing employee laptops every ~2 years. They seem OK from the few videos I could find.

    • @Nachokinz
      @Nachokinz 1 year ago +2

      @@samk7500 The mainstream GeForce drivers will install without issue on workstation cards, while potentially unlocking more professional-level features, as there is little difference between Nvidia driver branches these days.
      Otherwise, if you run an A4000, ensure it has plenty of airflow due to its single-slot design; if that is an issue, consider setting the fan speed to 80% while reducing the power limit.

  • @sleepzzz5619
    @sleepzzz5619 1 year ago +18

    Well, I am glad that I bought the 3060 12GB and not a 3060 Ti or 3070 for entry-level video editing and gaming. Hopefully many people can now see what it means that Nvidia releases the new 4070 (Ti) with just 12GB of VRAM, holding it back from its real performance, only to sell the same people a 5070 with too little VRAM again 2 years later. I really hope AMD will become stronger in professional work like video editing, because I don't know if I want to spend money on Nvidia again when I upgrade the card in the future.

  • @b_s_p_k
    @b_s_p_k 1 year ago +17

    Looks like the RTX 4060 Ti 8GB is already dead even before launch.

    • @kl0ness
      @kl0ness 1 year ago

      It should offer slightly better performance than the 3060 Ti, so I highly doubt it. These are entry-level cards (usually cheap); people buy them anyway. But I would personally skip the 40 series unless you have the budget to go for a 4080 or 4090.

  • @spitalul2bad
    @spitalul2bad 1 year ago +29

    Those drawers are amazing!

    • @skydream3022
      @skydream3022 1 year ago

      Yes bro,
      it's kind of a teaser in the present.

  • @serenedreemurr
    @serenedreemurr 1 year ago +52

    I sent my old 3070 to a modder and he turned it into a 16GB card. You need to set the power management mode in the Nvidia Control Panel to "highest performance" so it doesn't black-screen under light load. Other than that, it works great for gaming and productivity.
    EDIT: Moddable cards, as far as I know: RTX 20 series: 2070 and above. RTX 30 series: 3060 Ti and above (3060m or above for laptops).

    • @peterpan408
      @peterpan408 1 year ago +3

      Interesting! What does that cost?

    • @serenedreemurr
      @serenedreemurr 1 year ago +15

      @@peterpan408 Roughly $100. Second-hand 3070s cost $330 on average in my region, so it's kind of OK if you have one with an expired warranty and you need the CUDA and VRAM for your work; otherwise I suggest going for a second-hand 6800 XT ($400 in my region).

    • @jayclarke777
      @jayclarke777 1 year ago +3

      Well played. I wonder if I could do the same with my 3080 10GB.

    • @vmafarah9473
      @vmafarah9473 1 year ago +1

      What about the laptop 3070?

    • @serenedreemurr
      @serenedreemurr 1 year ago +1

      @@jayclarke777 There are modders doing that for around $260.

  • @LeDragoX
    @LeDragoX 1 year ago +7

    A Brazilian channel named Paulo Gomes modded the RTX 3070 to operate with 16GB, by physically adding the memory chips.

    • @ivanbardov
      @ivanbardov 1 year ago

      Dude @VIKon-msk did that 2 years ago already: ua-cam.com/video/14jzlR4yGCQ/v-deo.html

  • @forcinghandlesisdumb
    @forcinghandlesisdumb 1 year ago +19

    Really appreciate the work you've done on how bad the VRAM issue is. I feel like you were on the cutting edge of figuring out exactly how bad the problem is, and of discovering it's worse than previously thought. Now, thanks to your hard work, it's the talk of all of tech YouTube, and I think you can take most of the credit for steering people away from the coming 8GB 4060s, which is a win for all the consumers who aren't going to end up with bad cards.

  • @regardtvandermerwe2675
    @regardtvandermerwe2675 1 year ago +34

    Beautiful piece of work. Again, thanks for all the work you do for the gaming community. The only way to get the message across is education and telling Nvidia it's not acceptable.

    • @m_sedziwoj
      @m_sedziwoj 1 year ago

      So they'll stop selling gaming cards, if you think you deserve more ;)

  • @JE-zl6uy
    @JE-zl6uy 1 year ago +14

    This is why I got a 6750 over a 3070 Ti, and I've not looked back. 12GB might not be 16GB, but it's not 8GB.

    • @thepathnotfound
      @thepathnotfound 1 year ago +1

      Yeah, I picked up a 12GB 6750 XT for $420 a few months ago; it was a package deal with a 5800X3D from AMD. I love this card; the performance is everything I need.

    • @Scorpius165
      @Scorpius165 1 year ago +1

      12GB will likely be more than fine for the card's lifespan. Also, while not ideal, turning down the texture quality setting will prevent VRAM spill-over. Textures are still the single worst offender, with RT being second. Playing The Last of Us or Hogwarts on lowered settings with a 3070 is definitely more acceptable than playing at ultra with issues, although it doesn't "feel good", that's for sure...

    • @jose131991
      @jose131991 1 year ago

      @@Scorpius165 It doesn't help that texture quality is one of the biggest and most immersive visual settings you can tweak. It's a shame to have to play with worse textures on a card that cost the same as a PS5 all by itself.
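The point in this thread that textures dominate VRAM use is easy to sanity-check with back-of-envelope math. A rough sketch (the mip-chain overhead and BC7 rate are standard, but the texture count and per-texel formats are illustrative assumptions, not measurements from the video):

```python
# Rough VRAM footprint of a mip-mapped texture: the full mip chain adds
# about 1/3 on top of the base level (1 + 1/4 + 1/16 + ... -> 4/3).
def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

# Uncompressed RGBA8 is 4 bytes/texel; BC7 block compression stores
# 16 bytes per 4x4 block, i.e. 1 byte/texel (a 4:1 saving).
uncompressed_4k = texture_bytes(4096, 4096, 4)   # ~85 MiB with mips
bc7_4k = texture_bytes(4096, 4096, 1)            # ~21 MiB with mips

print(f"4K RGBA8 + mips: {uncompressed_4k / 2**20:.1f} MiB")
print(f"4K BC7   + mips: {bc7_4k / 2**20:.1f} MiB")

# Even compressed, a few hundred unique 4K textures resident at once
# eat most of an 8 GB card before geometry, buffers and RT structures.
n_textures = 300          # illustrative assumption
used_gb = n_textures * bc7_4k / 2**30
print(f"{n_textures} BC7 4K textures: {used_gb:.1f} GiB of an 8 GiB budget")
```

This is why dropping the texture setting one notch (halving resolution quarters the footprint) frees so much VRAM while other settings barely move it.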

  • @Navi_xoo
    @Navi_xoo 1 year ago +28

    Nice museum of GPUs there, lol. Also, great test! Glad I sold my 3070 for a 6800.

    • @leanlifter1
      @leanlifter1 1 year ago

      Red cards age like fine wine. Enjoy.

    • @samk7500
      @samk7500 1 year ago

      I'm seeing a ton of discounted RTX 3070/Tis on the used market in the US. The RX 6700/6800/XT is available in much smaller numbers, with only a slight discount on the used market. It appears that the overall market is also moving on from 8GB of VRAM for 1440p gaming.

  • @viktortheslickster5824
    @viktortheslickster5824 1 year ago +17

    The power usage on the RTX A4000 is on point!

  • @JinglesOnJuniper
    @JinglesOnJuniper Рік тому +25

    Your videos encouraged me to sale my 3080 10GB while it still has value and buy a used 3090 24GB only cost me $200 for the upgrade and now I don’t have any worries about VRAM. Been very happy with it! Now if I could just convince my friend that his 3080 10GB is about to be on life support for high res AAA gaming.

    • @awebuser5914
      @awebuser5914 1 year ago +8

      "...convince my friend that his 3080 10GB is about to be on life support for high res AAA gaming." Right, you want to encourage a friend to dump a perfectly adequate card simply because a minuscule handful of recently released *badly-optimized console ports* have VRAM issues? That's a smart idea...

    • @kaant21
      @kaant21 1 year ago

      I want to do the same thing, but I'm scared of buying second-hand

    • @xxovereyexx5019
      @xxovereyexx5019 1 year ago +12

      @@awebuser5914 Do you live under a rock? The insufficient-8GB-VRAM issue has also been problematic in actual AAA PC games for years, not only in those recent console ports.

    • @awebuser5914
      @awebuser5914 1 year ago +1

      @@xxovereyexx5019 "insufficient 8GB vram issue also have been problematic on actual AAA pc games for years" Really, do tell! *Every* GPU review I've seen in the last 3 years or so never mentioned a "problem" with VRAM and their test suites. Maybe some knuckle-dragger cherry-picked some oddball combination of obscure game and settings to try to make a point, but it's *never* been an issue. If anything, it was the opposite, where various testers proved that the 8GB 470/570 and others were a complete waste of time since the VRAM couldn't be usefully utilized: the card itself was not capable of the required processing power.

    • @TheDravic
      @TheDravic 1 year ago +6

      @@awebuser5914 RX 470 came out in 2016 though, that's 7 years and an entire console generation ago.

  • @valtarg1299
    @valtarg1299 1 year ago +22

    Nvidia : Promote RT
    Also RT : Need more VRAM

  • @taipeitaiwan10
    @taipeitaiwan10 1 year ago +6

    WTF. I still have no idea how this channel isn't at 2M subs already. Steve has been killing it for years. Top tier here.

  • @KeinZantezuken
    @KeinZantezuken 1 year ago +5

    Consider adding a PCIe bus usage graph to the VRAM limitation tests. RTSS has that sensor, I believe.

  • @riba2233
    @riba2233 1 year ago +36

    Haha, I was just reading some news about guys who soldered 16GB of VRAM onto a 3070, and was just wondering how you should test it :)

    • @alexusman
      @alexusman 1 year ago +1

      3070 16GB was tested 2 years ago, had lower 0.1% and 1% lows even back then in some games.

    • @hugovlsilva
      @hugovlsilva 1 year ago +5

      There is a guy in Russia that did just that in 2021, if I'm not mistaken. Also, some people here in Brazil put 12GB on an RTX 2060 some months ago, and 16GB on an RTX 3070 last week. The only issue they had is that the screen flickers, which is solved by preventing the GPU from entering its energy-efficient mode.

    • @hugovlsilva
      @hugovlsilva 1 year ago +2

      Here is the link to the video: ua-cam.com/video/W6uaUHBNFOU/v-deo.html. However, everything is very experimental and, obviously, the video is in Portuguese.

    • @theSeventhDruid
      @theSeventhDruid 1 year ago

      ua-cam.com/video/Oph2MKyimYY/v-deo.html 3070 16 gb vs 3070 8 gb. 2 year old test. Russian language.

  • @twinjuke
    @twinjuke 1 year ago +4

    Great video again... I'd love to see a similar comparison between the 4070 and 6800 (XT). And a 3090 vs 4070 Ti for the same issue...

  • @jayb2705
    @jayb2705 1 year ago +2

    Seeing you open those drawers with all those GPUs is like getting a glimpse into Fort Knox.

  • @WiltshireTutorials
    @WiltshireTutorials 1 year ago +2

    What timing! 😲 I just finished assisting in the development of a shunt mod for the A4000 that helps push its performance past a stock 3070 Ti. Have a look at my videos on the shunt-modded A4000. Would be cool to see you do a video on the A4000 shunt power mod I helped develop 😁

  • @ChrisPkmn
    @ChrisPkmn 1 year ago +4

    Steve & Tim, when possible could you show off the RTX 4000 SFF Ada? I don't think it's released yet, but with no external power & 20GB I am very curious to see it run, as this could be our new

    • @nadtz
      @nadtz 1 year ago

      I've seen it listed a few places (in the UK) for £1,500. At 7W I get why the sff builders out there are excited about it even at that price.

  • @ArnikoGiri
    @ArnikoGiri 1 year ago +26

    Someone actually modded Samsung 16GB memory onto an RTX 3070.

    • @user-fe2oh8oj2u
      @user-fe2oh8oj2u 1 year ago +1

      Damn. Would be great if someone benchmarked that "handmade" 3070.

    • @Nalguita
      @Nalguita 1 year ago +1

      Nvidia said the 3.5GB of VRAM should be OK without issues. Me with a 970 on Wolfenstein 2: like 50% slower FPS than a GTX 1060.

    • @chriswright8074
      @chriswright8074 1 year ago

      @@Nalguita duh 1060 on 980 performance

    • @user-mz1if8oe9k
      @user-mz1if8oe9k 1 year ago +3

      @@user-fe2oh8oj2u It was made 2 years ago on "PRO Hi-Tech" channel.

  • @TheBlueStinger55
    @TheBlueStinger55 1 year ago +2

    I wonder if the HBM2 on the Vega 56/64 and Radeon VII copes better at the limit compared to their equivalent cards from back in the day? Also, does the extra VRAM on the Radeon VII help it with The Last of Us, or is it just too slow a card to utilise it properly anyway?

  • @lvmarv1
    @lvmarv1 1 year ago +38

    I was looking at the list of graphics cards I have owned since I have been a PC gamer. It was something like 16 Nvidia cards to 1 ATi card. I did however buy a 6700xt for my wife's PC to see how a modern AMD card went and have been impressed with it. The kind of results in this video pretty much say to me that as soon as my 3080ti is done, I'll be off to AMD too.

    • @theplayerofus319
      @theplayerofus319 1 year ago +3

      Yeah, I got a 3080 Ti too, but I'm scared because my 12GB is barely enough at 4K atm

    • @pk417
      @pk417 1 year ago +1

      @@theplayerofus319 play at 1080p

    • @Dr1ftop1a
      @Dr1ftop1a 1 year ago +13

      @@pk417 Going from 4k BACK to FHD is just asking your eyes to bleed.

    • @SimonMedia666
      @SimonMedia666 1 year ago +3

      I am on a 3080 but I will probably go AMD next, as Nvidia is either extremely expensive or VRAM limited...

    • @Tc4ify
      @Tc4ify 1 year ago

      At the really high end it's still all Nvidia though - nothing comes close to the 4090.

  • @samserious1337
    @samserious1337 1 year ago +9

    I got an M4000 8GB, which is basically a GTX 970 with more VRAM but less clock speed. Thankfully, BIOS mod tools exist, and it now runs at 1100MHz core and 3200MHz VRAM @ 125W max.

    • @theplayerofus319
      @theplayerofus319 1 year ago +1

      Yeah, but in 2023 it surely is time for an upgrade, eh

    • @samserious1337
      @samserious1337 1 year ago +1

      @@theplayerofus319 That's just my backup card; the RX 6650 XT suits me fine

    • @hasnihossainsami8375
      @hasnihossainsami8375 1 year ago +3

      @@theplayerofus319 Wasn't that the point? They absolutely got more life out of that card than all 970 users in existence.

  • @andym975
    @andym975 1 year ago

    Great comparison vid, thanks!

  • @Roll_the_Bones
    @Roll_the_Bones 1 year ago

    For anyone wondering about the Outro tune, it's a nice version of the (very) old London music-hall classic "My old man said Follow the van, and don't dilly-dally on the way" by Marie Lloyd, circa approx. 1910??? She was gone by 1922, and I can only find one proper filmed performance of her on YT, but it's "A Little of What You Fancy Does You Good" from when she was knocking on a bit, but she's something else, if you like that sort of thing!

  • @CPowell133
    @CPowell133 1 year ago +9

    I like how last year I was stating that 12GB of VRAM would probably be a minimum for 1080p and that, if you played at resolutions above that, then 16-24GB of VRAM would be needed in a couple of years.
    Devs have been asking for more VRAM since around the time the latest-gen consoles started shipping with 16GB of memory.
    But I got berated in most video comments for being crazy, told that 12GB was perfectly fine for even 1440p. Could be people just trying to justify their inflated GPU price with its 12GB of VRAM or less.

    • @queenform
      @queenform 1 year ago +1

      Buyer's remorse from all the 8GB card buyers

    • @brams.
      @brams. 1 year ago +3

      The current-gen Xbox and PS5 can allocate a bit more than 12GB as VRAM from their 16GB of shared memory. It is obvious 12GB of VRAM will be the bare minimum until the next-gen Xbox and PS6, with an NVMe SSD to benefit from DirectStorage.

    • @erwinmatic5062
      @erwinmatic5062 1 year ago +2

      Could be the opinions of young PC gamers who haven't yet had the wisdom to appreciate more VRAM.

    • @CPowell133
      @CPowell133 1 year ago +1

      I am still on a GTX 1070 and held off last gen due to pricing. But, I can start to tell I may have to upgrade this gen even though pricing isn't great. I'll probably go back to team Red this round given the VRAM concerns.

    • @CPowell133
      @CPowell133 1 year ago +1

      That, or people tend to forget that we saw these issues before in the 1-2GB and 2-4GB generations. History repeats itself.

  • @Skazzy3YT
    @Skazzy3YT 1 year ago +6

    My PC is known as the "regret" system: a 10700F with no upgrade path beyond 11th gen, and an RTX 3070 with not enough VRAM

    • @theplayerofus319
      @theplayerofus319 1 year ago

      damn

    • @Ausf
      @Ausf 1 year ago

      Can always sell and buy something else. Not like keeping obsolete components is of any real benefit. All that matters is the difference between what you can sell your stuff for and what the upgrade costs.

    • @MiGujack3
      @MiGujack3 1 year ago +2

      Just get rid of the GPU while its value is high. The 10700F is a decent processor; it will last you a long time.

    • @Rodrigo38745
      @Rodrigo38745 1 year ago +1

      Your PC is still better than 99% of PCs, you're fine.

    • @DadGamingLtd
      @DadGamingLtd 1 year ago

      F

  • @gmcd1987
    @gmcd1987 1 year ago +2

    I'm here in Aus building my first PC in 12 years, and I have written off Nvidia due to the price of the new 40XX series and the VRAM on the 30XX series. You're spot on the money in my mind.

  • @Davidium84
    @Davidium84 1 year ago +1

    Great points raised here, and a really well-put-together video highlighting the necessity for at least 12GB of VRAM on Nvidia's graphics cards!

  • @jasonhowe1697
    @jasonhowe1697 1 year ago +3

    I would have to wonder what the capacity is with 24GB versions of each brand's card

  • @M3RC155
    @M3RC155 1 year ago +6

    It would be interesting to see how the 3070 compares to the 2080 Ti, since when the 3070 launched they were basically the same performance, and it may be a different story now in these newer titles

    • @erazzed
      @erazzed 1 year ago +1

      Upvoted and commented for visibility. I'd like to see if the 2080 Ti outperforms the 3070 just due to more VRAM.

  • @hankblah
    @hankblah 1 year ago

    Very good content! Keep up the good work!

  • @dudenukem1594
    @dudenukem1594 1 year ago +1

    The question that popped into my mind: does Resizable BAR (on) have more impact when the GPU is memory limited (like the 3070), or does it have no effect on the limitation?

    • @cosminmilitaru9920
      @cosminmilitaru9920 1 year ago

      From what I understand, on Nvidia there's not much of a difference up to 8GB; on AMD it's a good difference in performance, and on Intel dedicated GPUs it is mandatory to have Resizable BAR on for them to work well.

  • @robertt9342
    @robertt9342 1 year ago +3

    It's interesting how much, and how quickly, VRAM requirements are ramping up.

    • @shoego
      @shoego 1 year ago

      It's due to the end of cross-gen games.

  • @hassansalih7229
    @hassansalih7229 1 year ago +4

    I have a 6900XT which I will hold onto until the RX 7000 series prices drop, because it looks like my 6900XT will be able to run most games until then... I hope

  • @Owen-fn8ff
    @Owen-fn8ff 1 year ago +1

    Thanks, Steve. All of these will be worth a revisit as new games come out. We have entered a new age for gaming, I feel; it's like the late 90s or early 2000s again, when if you wanted to play the latest games, you just had to upgrade, simple as that. Maybe that's how it should be? We've been OK with the same hardware for years; maybe that's the real regression? Now is the new normal.

  • @cracklingice
    @cracklingice 1 year ago

    Haha, nice catch on having the Nvidia quad row there, BZ. I had completely forgotten.

  • @hasnihossainsami8375
    @hasnihossainsami8375 1 year ago +6

    Ah, yes. Where the "8GB is enough" stans at?

  • @javierpuerto4463
    @javierpuerto4463 1 year ago +4

    There are three factors that are key to understanding the GPU market and that are often overlooked by UA-cam reviewers: 1) the APU market, 2) the console market and 3) adaptive resolution technologies such as DLSS and FSR.
    Neither Nvidia nor Intel come even close to AMD's level of integration in APU design. Intel is trying hard to develop competitive GPUs (and hence APUs), but it's AMD who decides how efficient mainstream APUs can be, and that is the gold standard that drives the GPU market.
    AMD could launch an APU capable of delivering the same amount of graphics processing power as that of an Xbox Series X or PS5 if they wanted to: It just wouldn't make any sense for them since it would undermine their CPU and GPU business. That's the reason AMD APU's were restricted to the OEM market while also being designed to perform way behind their current GPU counterparts. Early Zen architectures had a lot of power to spare, especially when compared to Intel's competing architecture, and yet AMD decided to play it safe by launching CPU's that didn't even feature a GPU!
    As long as AMD enjoys the "exorbitant privilege" of selling 22 million "APUs" to Sony and Microsoft, their dominance in the GPU and CPU market will be uncontested. AMD gets to decide what architecture game developers adopt by designing their gaming console chipsets in a way that favors their market strategy in the CPU and GPU businesses. Game developers will always design their games in accordance with console hardware requirements. There is a push to drive more VRAM in new games, and Nvidia has been caught off guard. AMD is clearly pushing developers in that direction while producing GPUs that have double the memory of their Nvidia counterparts.
    Finally, adaptive technologies are having a huge impact in how Nvidia and AMD design their new GPU's. Any technology that provides "free" FPS, while helping your product stand out in direct contrast to your rivals, will also increase performance in your older/lower line of products hence preventing users from buying into newer products. Once again, more VRAM helps new cards compete favorably to older cards in the case of AMD in spite of FSR implementation.
    Where Sony and Microsoft decide to go with their new generation of consoles will dictate the future of the GPU market. There is no question that VRAM will be at least doubled but I believe the deciding factor will be how much FSR or DLSS or another similar technology they embrace with their hardware implementation.

  • @psychomccrazy
    @psychomccrazy 1 year ago

    Nice shoutout to MLID! Keep up the excellent work for consumer awareness

  • @dsirius1500
    @dsirius1500 1 year ago

    Great review Steve, thank you for your hard work.

  • @AlmightyGTR
    @AlmightyGTR 1 year ago +8

    What a video!! This is the final nail in the coffin of the VRAM debate. It is absolutely a bottleneck. Luckily I have a high-VRAM card, so it doesn't impact me directly, but I do feel the pain of those who are in this position.

    • @alargecorgi2199
      @alargecorgi2199 1 year ago

      Is it though? It really depends on how many more games have an 8GB VRAM issue, as HUB preferred the 8GB 3070 when it launched. VRAM only became a problem because of a string of badly optimized games that aren't releasing the frame buffer properly.

    • @Nachokinz
      @Nachokinz 1 year ago

      @@alargecorgi2199 Publishers of these AAA games that are pushing graphical fidelity will only let their developers optimize as much as makes financial sense within a given time frame, guided by either economic conditions and/or targeted profit margins.
      The console crossover period is ending; as the newest-gen consoles are equipped with 16GB of GDDR6 memory that can be used as either a VRAM or system RAM buffer, the 8GB/12GB cards are next on the chopping block. Enjoy what you have to the best of your abilities, as technology moves forward regardless of one's feelings.

  • @felixtag5121
    @felixtag5121 1 year ago +9

    Why not showcase the 3070 vs the 2080 Ti? They were pretty much the same apart from the difference in VRAM... would have been interesting to see how the once-mighty holds up..

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago

      Great idea.

    • @valenrn8657
      @valenrn8657 1 year ago

      Titan RTX (TU102) 24 GB vs RTX 3070 Ti (GA104) 8 GB GDDR6X.

    • @alexusman
      @alexusman 1 year ago +1

      Different uArches, so they can perform very differently.
      The 3070 vs A4000, on the other hand, scale mostly linearly with the frequency differences between them.

  • @adamwinterton152
    @adamwinterton152 1 year ago

    Great insight and a good shout-out to MLID

  • @Mach1048
    @Mach1048 1 year ago

    Steve, please tell me that the tool chest of GPUs actually has some sort of organisation system for those cards?
    It makes my eye twitch :D

  • @BrotherO4
    @BrotherO4 1 year ago +19

    Man.. it's almost like Nvidia planned this.

  • @gucky4717
    @gucky4717 1 year ago +5

    The A4000 is the best SFF card to date with an aftermarket cooler, similar to the Gainward 3050. The 4070 could have been the better choice, but there is no small cooler, and I don't know if the small one would fit.

    • @mikefize2279
      @mikefize2279 1 year ago +1

      The reference PCB of the 4070 is quite a bit larger than the A4000's. Even if you fit a small heatsink to it, it would most likely be a ~200mm card at least. Still small, but not as tiny as the SFF version of the A4000.
      Another option is the RTX 4000 SFF Ada: 20GB of VRAM, 167mm in length, low profile, 70W TDP and no supplementary power. Should deliver ~3070 Ti performance. And it's almost $1400.

  • @iancohn648
    @iancohn648 1 year ago

    Does anyone have a link to a full version of the outro song? It's such a bop and I've been looking all over for it

  • @RadmonGlass
    @RadmonGlass 1 year ago

    Hello, I wanted to know where you got these 8 GDDR6 chips?
    I want to do this mod myself but I can't find them anywhere online.
    Thank you 😁

  • @Umbongo_Fever
    @Umbongo_Fever 1 year ago +6

    Luckily for me, my 3070 is fine for the games I play, as I honestly don't play that many AAA games, and when I do I don't bother with RT. I am, however, currently looking for a new GPU as I'd like to future-proof my PC for longer. Whether I stick with Nvidia or jump ship to AMD will come down to research.

    • @vladtepes3508
      @vladtepes3508 1 year ago +1

      The comment that matters. This is the idea. I play mainly Apex and Escape from Tarkov, so I don't give a fk about the VRAM issue. I don't like SP triple-A games. They are too boring. I play only MP, so for me a 3070 is more than fine.

    • @oliversmith2129
      @oliversmith2129 1 year ago +2

      My 3060 Ti works fine at 1440p medium-to-high settings for most games I play. If that changes, I'll play the remaining 300 games in my backlog before spending more money for less value. It's a matter of principle.

  • @intraspec
    @intraspec 1 year ago +3

    I thought the RTX 3080 mobile with 16GB was going to be overkill for a good while. I've seen a number of games going over 8GB, so I'm glad I got this one

    • @LordAshura
      @LordAshura 1 year ago +1

      Same, I remember people saying that the 3070/3070 Ti Mobile was where it's at and the 3080/3080 Ti were not worth it. Looks like double the VRAM means we'll have to upgrade less often than the poor dudes who bought 8GB of VRAM.

  • @ntomnia585
    @ntomnia585 1 year ago

    Curious: you show GPU memory usage and GPU memory usage per process on some of the benches, but not others. Correct me if I'm wrong, but I think the higher usage number is what is being allocated, while the per-process figure is what is actually being used. If that is correct, why is the per-process usage so low in TLOU on the 3070 while it's very close to the allocated memory on the A4000 and RX 6800? In the following games, the per-process stats aren't shown on the 3070. What's the deal?

  • @HaRdCrAsS
    @HaRdCrAsS 1 year ago

    Nice eye-opener video!

  • @HenryTownsmyth
    @HenryTownsmyth 1 year ago +4

    I can't play Hogwarts Legacy and Requiem with RT on, even with an RTX 3080 Ti 12GB at 3440x1440 res.
    Horrible stuttering and FPS drops. So yeah, the 3080 and Ti version should have had 16GB of VRAM.
    The 3070 should have had 12GB.

    • @valenrn8657
      @valenrn8657 1 year ago

      Hogwarts Legacy was my trigger to dump RTX 3080 Ti for RTX 4090. NVIDIA's subscription model works.

    • @HenryTownsmyth
      @HenryTownsmyth 1 year ago

      @@valenrn8657 Yeah, if I had known about this VRAM issue in detail, I would have gone for the 3090, as I can't afford a 4090.

  • @EdU-od5ec
    @EdU-od5ec 1 year ago +4

    Almost bought a 3070 last year. Decided to go 6750 XT as it was cheaper and the non-ray-traced performance is the same IMO. Glad I went AMD.

    • @thepathnotfound
      @thepathnotfound 1 year ago

      I have this card too; it's great. I paid about $420 in a combo AMD deal with a 5800X3D.

  • @ChrisPkmn
    @ChrisPkmn 1 year ago +2

    I'll leave it to Buildzoid to figure out how the performance cost of VRAM ECC compares to system RAM ECC. I wonder if there's a specific spec they can't boost from

  • @mhoop1
    @mhoop1 1 year ago +2

    When EVGA was having their 23rd-anniversary sale, right before they stopped making Nvidia cards, I jumped on the discounted 3060 for a new build, simply because I feel as you do: that 12GB is the new bare minimum going forward. I have it in a 7600X build with 64GB of DDR5-6000, and it's probably the most solid build for gaming at 1080p ultra I've ever cobbled together.

  • @GraceMcClain
    @GraceMcClain 1 year ago +3

    The way NVIDIA has gone about things these last few years, especially stuff like this, means there's a very high chance I'll be moving over to AMD for my next GPU.

  • @M00NM0NEY
    @M00NM0NEY 1 year ago +18

    This is a fantastic video! Good job, HUB, and Buildzoid for recommending the A4000 for its larger 16GB of VRAM.
    However… we all know Nvidia will take note, and in the future they will probably showcase 9GB at the midrange 😂 So don't hold your breath for 16GB. Nvidia likes to troll the tech world, especially gamers and reviewers.

    • @TheDravic
      @TheDravic 1 year ago +2

      How would 9GB even work? What kind of Frankenstein memory bus would it be built with for 9GB to even be feasible?
      I don't think you know what you're saying.

    • @watermelon1221
      @watermelon1221 1 year ago +3

      @@TheDravic based on the current gen, it will be a 144-bit memory bus with 9GB of RAM :DDD

    • @shanent5793
      @shanent5793 1 year ago +2

      ​@@TheDravic DRAM speed is highly variable, even within one die. They could use oversized chips and harvest only the regions that meet the spec. It's how they can make 48 and 96GB DDR5 DIMMs

    • @Rachit0904
      @Rachit0904 1 year ago +2

      @@shanent5793 those DIMMs use actual 24Gbit chips. There aren't any mass-produced 32Gbit chips to cut down, afaik. That's why we have 48GB DIMMs and no 64GB DIMMs.

    • @shanent5793
      @shanent5793 1 year ago

      @@Rachit0904 oops I stand corrected
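
The bus-width arithmetic this thread is debating can be sketched out: each GDDR6/GDDR6X package presents a 32-bit interface, so with uniform chip densities the bus width is chips × 32 and the capacity is chips × density, which is why a uniform 9GB card would need an oddball bus. The configurations below are illustrative:

```python
def gddr6_config(num_chips: int, chip_density_gb: int):
    """Bus width (bits) and capacity (GB) for uniform-density GDDR6 chips.

    Each GDDR6/GDDR6X package has a 32-bit interface.
    """
    return num_chips * 32, num_chips * chip_density_gb

assert gddr6_config(8, 1) == (256, 8)   # RTX 3070: eight 1GB chips on a 256-bit bus
assert gddr6_config(8, 2) == (256, 16)  # RTX A4000: eight 2GB chips, same 256-bit bus
assert gddr6_config(9, 1) == (288, 9)   # a uniform 9GB card would need a 288-bit bus
```

As noted above, harvested or mixed densities could break the uniform-density assumption this sketch relies on.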

  • @WolFoxMark
    @WolFoxMark 3 months ago +1

    06:44, how come the floor is blurry? Is it the graphics card itself or the game settings?

  • @shahrukhwolfmann6824
    @shahrukhwolfmann6824 1 year ago +1

    Really looking forward to your review of the 8GB 4060 series. Keep up the good work, thumbs up!

  • @perfectdiversion
    @perfectdiversion 1 year ago +5

    Soon we will need 48GB Quadros to play console ports

  • @tahustvedt
    @tahustvedt 1 year ago +5

    I picked a single-fan PNY 3060 12GB for my ITX workstation. Underrated card when it comes to efficiency. Undervolted, it runs stock speed (1800MHz) at 100W.

  • @kardeon4503
    @kardeon4503 1 year ago +6

    I am glad you took the time to talk about the developers. At the end of the day, they are the ones who know more than everybody else about the VRAM topic.
    8GB GPUs have been slowing down the gaming industry for many years. Lower-quality textures, fewer new features, fewer effects and so on....
    16GB is indeed the minimum requirement for GPUs priced above $400.
    It blows my mind when I see gamers buying an RTX 4070/Ti in 2023 with only 12GB. These GPUs are already crushed in RE4 at 4K with RT because the game requires 14GB. It seems that some gamers never learn and keep buying GPUs like sheep lost at Microcenter.

    • @Vis117
      @Vis117 1 year ago

      Well not everyone is playing in 4K with ray tracing. 12GB will be alright at 1440p and below for a while, especially without ray tracing, which is still too costly anyway. I’m all for more vram, but let’s not get too carried away just yet.

    •  1 year ago

      In the video, games struggle even at 1080p. That is just a scam from Nvidia; 8 more gigabytes would cost them just 20 dollars more.

  • @vextakes
    @vextakes 1 year ago +2

    Yeah, great perspective. Saw news that ROCm (AMD's way of utilizing their GPUs with productivity software) is going to be coming to more apps. I wonder if this will put pressure on Nvidia's pro cards, especially since AMD's cards are already kitted out with more VRAM. I guess it depends how popular it gets and how much support it gets from those communities. Creative devs may well be into it, though, because it would allow more people to use their apps.
    Anyway, great vid Steve, I loved seeing how Nvidia segments their market. Hadn't thought of that before

  • @zellcarter
    @zellcarter 1 year ago +1

    I've been thinking about this. Shouldn't this issue fall on these newer games' developers? Like, shouldn't the texture packs scale to the hardware?

  • @p00ner
    @p00ner 1 year ago +8

    I'd be interested in knowing the real-world difference in cost of adding 8 more GB of VRAM. It can't add that much to the cost for the consumer. Problem is, nGreedia is nGreedia. Makes me very happy I decided not to go for a 3080; instead I'm happy with my 6900 XT.

    • @watermelon1221
      @watermelon1221 1 year ago +1

      it will cost them roughly $10-12/GB. Less if they don't go GDDR6X

    • @Yuri-xx2gi
      @Yuri-xx2gi 1 year ago +1

      The cost is that it'll take longer for us to upgrade, and nGreedia doesn't want that; the VRAM itself is cheap

    • @erwinmatic5062
      @erwinmatic5062 1 year ago

      Some commenters on other tech channels said it'd only cost Nvidia 20 dollars at most.

    • @gozutheDJ
      @gozutheDJ 1 year ago

      @@watermelon1221 right, cuz I'm sure u know

    • @gozutheDJ
      @gozutheDJ 1 year ago

      @@erwinmatic5062 imagine listening to YouTube commenters lmaoo

  • @GewelReal
    @GewelReal 1 year ago +3

    Sadly they are still expensive on the used market, unlike the RTX A2000

  • @nadtz
    @nadtz 1 year ago

    And don't forget the new A4000 SFF is going to come with 20GB. I was wondering about this too, since I've been looking at a used A4000 and ran into your old video; you can sometimes find the A4000 used for ~$500.

  • @titaniummechanism3214
    @titaniummechanism3214 1 year ago +2

    There are one or two RTX 3070 16G cards out there in the wild; maybe you could get your hands on one of them and benchmark it. I read an article about a Portuguese-speaking modder who put, as far as I understood it, the memory chips of the A4000 on the 3070.

    • @hugovlsilva
      @hugovlsilva 1 year ago

      Yes, one is from Russia and the other is from Brazil. Here is the video of the latter: ua-cam.com/video/W6uaUHBNFOU/v-deo.html

  • @bluedragon219123
    @bluedragon219123 1 year ago +3

    You know, despite Nvidia being stingy with VRAM on their desktop GPUs, they are, or were (I haven't checked recently), quite liberal with VRAM on their laptop GPUs. I have had a 9600M GT with 512MB of VRAM, a GTX 670M with 3GB, and a GTX 980M with 8GB. Some might say that's too much VRAM for a laptop GPU, but I prefer having too much rather than not enough. Still odd how Nvidia does things. :)

    • @LordAshura
      @LordAshura 1 year ago +1

      Sadly, they're being stingy with their 40 series. Their 4080 only has 12GB of VRAM; you must get the big-boy 4090 to get 16GB. Meanwhile the 3080/3080 Ti came with 16GB, granted the 3070/3070 Ti have 8GB.

  • @leroyjenkins0736
    @leroyjenkins0736 1 year ago +4

    I replaced the memory on my RTX 3080 to up it to 20GB. It's easy to do with a heat gun and a steady hand (only cost like 50 euro). I might start offering that as a service haha

    • @The23rdGamer
      @The23rdGamer 1 year ago +3

      You should

    • @Gelonyrum
      @Gelonyrum 1 year ago

      And no problems with drivers, crashes, etc.?

    • @leroyjenkins0736
      @leroyjenkins0736 1 year ago

      @@Gelonyrum Make sure it's Micron. You can even go cheaper if you don't want GDDR6X and go with normal GDDR6, about 30-40. Remove the 1GB ones with a heat gun (use a bit of flux if needed), place each new one at the same angle as the default ones, and you've got yourself an RTX 3080 20GB; it can be 24GB if you have the 12GB one. No problems at all, I've used it for about 5 days now

    • @Gelonyrum
      @Gelonyrum 1 year ago

      @@leroyjenkins0736 Nice, sounds great. Sad thing that it voids the warranty, but still a great thing)

    • @leroyjenkins0736
      @leroyjenkins0736 1 year ago

      @@Gelonyrum It's more than likely already expired for most haha

  • @moasto02
    @moasto02 1 year ago

    Bro. That little wiggle that Steve does with his head when saying 'RX 6800' right at the start of the video is why I'm subscribed.

  • @BlueberryJamPie
    @BlueberryJamPie 1 year ago +1

    Saw a Brazilian youtuber's video (Paulo Gomes) where they modded a 3070 to have 16GB of VRAM and showed it working, with much-improved performance and no hitching in RE4, where usage would easily go up to 10GB.

  • @MrMcp76
    @MrMcp76 1 year ago +2

    The RTX 4060 8GB cards are going to be a dumpster fire. Can't wait for the reviews!!
    My 3070 Ti is just trash on a 3440x1440p ultrawide. Which is sad, because the GPU has plenty of horsepower, but the 8GB of VRAM just turns everything into a stuttering mess.

  • @mightko502
    @mightko502 1 year ago +3

    The 4070 looks nice at the moment and I expect the 4060 Ti to be even better, but the VRAM limitation will likely push me away from Nvidia again, towards AMD.

    • @erwinmatic5062
      @erwinmatic5062 1 year ago +1

      In my opinion the only impressive part of the 4070 is the power consumption. It will do well in ITX builds.

  • @thattrainfan6344
    @thattrainfan6344 1 year ago +1

    I feel like I get the gist of the whole VRAM thing, but could you test the RTX 3070 vs the A770 16GB?

  • @WilliamOwyong
    @WilliamOwyong 1 year ago +1

    Would be interesting to see if/where the performance crossover happens between, say, a Ryzen 5700G with 32GB or 64GB of RAM (16GB given to the iGPU) and a Ryzen 5700 with a discrete 8GB GPU. I'd say we're probably not too far off if integrated graphics performance continues to improve.

  • @kobus_n
    @kobus_n 1 year ago +5

    Sounds like a 3060 12GB vs 3070 8GB should be the next video...

  • @muthegameonline
    @muthegameonline 1 year ago +3

    The question is whether the 12GB of the 4070 and 4070 Ti is going to be enough in two years

    • @brams.
      @brams. 1 year ago

      Maybe for 1080p or 1440p native. But probably not for UHD-native textures; at least 16GB, I think, for native 4K textures.

    • @yosifvidelov
      @yosifvidelov 1 year ago +1

      Well, for 1080p native and 1440p with DLSS it will be enough. For anything above that in AAA titles, not so sure. Even a 16GB buffer could be saturated at 1440p in 2 years' time.

    • @brams.
      @brams. 1 year ago

      @@yosifvidelov "Even 16GB buffer could be saturated for 1440p in 2 years time" Yes, but only with RT on and ultra presets.

  • @PQED
    @PQED 1 year ago

    I watched Broken Silicon 201 just yesterday, so I can only say this: you absolutely hit the nail squarely on the head with this video.
    Excellent!

  • @paulboyce8537
    @paulboyce8537 1 year ago +1

    FINALLY!!!!!!!!! I APPLAUD YOU for talking about frame time first, putting FPS in the background. Because that's what matters, and FPS, while good to have, means nothing if the frame time is bad.👍👍👍👍👍👍👍👍👍👍
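
The frame-time point above is easy to make concrete: two runs can share nearly the same average FPS while one stutters badly, which is exactly what frame-time graphs and 1% lows expose. A minimal sketch with made-up frame times (the numbers are illustrative, not measurements from the video):

```python
# Two hypothetical 60-frame runs with nearly identical average FPS.
smooth = [16.7] * 60               # every frame ~16.7 ms -> consistently ~60 FPS
spiky = [10.0] * 57 + [150.0] * 3  # mostly fast frames, but three 150 ms stutters

def avg_fps(frame_times_ms):
    """Average FPS over the run: frames rendered divided by total time."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    """FPS implied by the slowest 1% of frames (at least one frame)."""
    n = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n:]
    return 1000 / (sum(worst) / n)

print(avg_fps(smooth), avg_fps(spiky))                          # ~59.9 vs ~58.8: look identical
print(one_percent_low_fps(smooth), one_percent_low_fps(spiky))  # ~59.9 vs ~6.7: very different
```

The averages are within ~1 FPS of each other, yet the spiky run's worst frames are 150 ms hitches, which is what VRAM spillover on an 8GB card looks like in practice.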