Wait, Did They Fix 8GB GPUs?

  • Published Oct 3, 2024

COMMENTS • 921

  • @AlexCrMaReLiLukyanenko • 1 year ago +161

    The point is that games now don't look THAT much better to require THAT much more memory. And when you turn settings down, they start to look worse than some older games that consume 3-5 times less memory and computing power on HIGHER settings.

    • @frozby5973 • 1 year ago +33

      THIS. I say this all the time: yes, there should be more VRAM, it's not that expensive to put 4 gigs more, especially on new-gen cards... we had an 8-gig 1080 release in May 2016, THAT'S 7+ YEARS AGO. BUT there is no excuse for games to look worse than games from 5+ years ago, run so horribly, and use so many resources as well. It's just laziness and greed from both sides.

    • @BigHeadClan • 1 year ago +10

      Aye, Halo Infinite looks much worse than most of the remastered series and Halo 4-5 while performing awfully.
      Granted, that too is due to poor optimization from the developer.

    • @andremalerba5281 • 1 year ago +6

      On the latest Broken Silicon, Tom showed a tweet from someone blaming DX12, got in touch with some devs, and confirmed the situation.
      DX11 had a built-in feature that managed which textures should stay in memory and which shouldn't.
      On DX12 and Vulkan the dev has more access to the HW, since it's close to the metal, so the devs themselves have to build a system to manage which textures get flushed and which don't. They spend most of their time optimizing for the masses, which means PS4, PS5, and 12GB VRAM GPUs; that's what one of the devs told him. Everything else is an afterthought, and the devs themselves said "on PC you just lower settings or brute force and the game runs; we need to ship a game that runs! Optimizing for 8GB of VRAM nowadays would add 2 more years to game development." So that's the harsh truth: publishers push devs to release games in a working state ASAP, the devs only have time to optimize for so much, and now they have to manage which textures stay and which don't by themselves instead of leaving it to the API. That's the present and future situation.
      On the flip side, the devs said that DX12 and Vulkan let games scale better and run on even lower-end setups, but they may or may not improve VRAM usage for 8GB cards in patches after launch.
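
      A minimal sketch of the residency system described above: an LRU cache that flushes the least-recently-used textures once a VRAM budget is exceeded. Purely illustrative (names and sizes are made up), not any engine's actual code:

          from collections import OrderedDict

          class TextureResidencyManager:
              """Toy model of the bookkeeping DX11 drivers did automatically,
              which DX12/Vulkan engines must now implement themselves."""

              def __init__(self, budget_bytes):
                  self.budget = budget_bytes
                  self.used = 0
                  self.resident = OrderedDict()  # texture_id -> size, oldest first

              def request(self, texture_id, size_bytes):
                  """Make a texture resident, flushing old ones if the budget is hit."""
                  if texture_id in self.resident:
                      self.resident.move_to_end(texture_id)  # mark as recently used
                      return
                  # Evict least-recently-used textures until the new one fits.
                  while self.used + size_bytes > self.budget and self.resident:
                      _, evicted = self.resident.popitem(last=False)
                      self.used -= evicted  # conceptually flushed back to system RAM
                  self.resident[texture_id] = size_bytes
                  self.used += size_bytes

          mgr = TextureResidencyManager(budget_bytes=8 * 1024**3)  # an 8GB card
          mgr.request("rock_albedo_4k", 64 * 1024**2)

      Skipping or botching this layer is what turns "more access to the HW" into stutter and texture pop-in on small VRAM buffers.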

    • @tiagomorgado3798 • 1 year ago +10

      Yup, remember The Witcher 3: it still has some good textures and the game uses 1GB of VRAM.

    • @DenverStarkey • 1 year ago +7

      @frozby5973 The 70-class cards got screwed the worst. At least they slowly upped the VRAM on the 80 class going into the 2000s and 3000s, while the 2070 and 3070 both got stuck with 8 gigs.

  • @ocha-time • 1 year ago +188

    Also be careful with average frame rates. I guarantee you will not remember the averages; you will remember those 1% / 0.1% lows.
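
    Roughly how benchmark tools derive those numbers from a frametime log. Conventions differ between tools, but a common one averages the slowest 1% (or 0.1%) of frames; this sketch is illustrative, not any specific tool's exact method:

        def percentile_low(frametimes_ms, fraction):
            """Average FPS over the slowest `fraction` of frames (0.01 = 1% low)."""
            worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
            n = max(1, int(len(worst) * fraction))
            return 1000.0 / (sum(worst[:n]) / n)

        frames = [10.0] * 990 + [40.0] * 10  # mostly 100 FPS, plus 10 hitches
        print(round(1000 * len(frames) / sum(frames)))  # average: ~97 FPS
        print(round(percentile_low(frames, 0.01)))      # 1% low: 25 FPS

    The average barely moves, which is exactly why the lows are the number you feel.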

    • @Micromation • 1 year ago +30

      So much this. People are looking at the wrong numbers in benchmarks.

    • @arenzricodexd4409 • 1 year ago +10

      It depends on how frequently those 0.1% lows happen. And sometimes it is not a hardware issue but simply a game engine issue.

    • @CyberneticArgumentCreator • 1 year ago +21

      @arenzricodexd4409 They happen 0.1% of the time...? So a stutter, on average, every 10 seconds in a game running at a 100 fps average?

    • @arenzricodexd4409 • 1 year ago +5

      @CyberneticArgumentCreator Just look at the time frame. Sometimes it only happens once, or when the game loads a new area, but the on-screen display still reports it in the lows. Ultimately, even if the frametime graph shows the occasional long frame, it is not always felt and does not always affect game smoothness. That's why the numbers we get from this kind of measurement cannot be definitive proof of how good the experience is.

    • @xerxeslv • 1 year ago

      Daniel Owen just made a video about those, and somehow the 4060 Ti is doing fine...

  • @yasminesteinbauer8565 • 1 year ago +180

    If a €/$400 graphics card is not able to match a 2-year-old console in terms of texture resolution, that's not hysteria. With less than 12 GB you will simply get more and more problems in the near future. So the problem is by no means solved.

    • @terraincognitagaming • 1 year ago +29

      Or you can be smart and not play on Ultra. Reminder: PC gamers were supposed to be smarter and more educated about settings than their console counterparts. But these days, zoomers gonna zoom and must toggle everything to max and then complain WHHYYYY DOESN'T IT RUN WELL ON MY MACHINEE!?!?!? while getting ZERO visual gains from Ultra.

    • @DarkDorel • 1 year ago +36

      @terraincognitagaming Or you can be even smarter and not buy an 8GB GPU at $400 in 2023.
      It's clear that this current generation is just a refresh of the older 3000 series cards from 4 years ago... with a 5-10% performance uplift, and with 8GB don't expect to play new UE5 games in 2024.
      This 7000 & 4000 series feels like a total scam move from both companies.
      We don't get any real upgrade with these GPUs.
      And I think the real upgrade, the "real next-gen" GPUs, will come in 2024... Better to save money for when that day comes.

    • @yasminesteinbauer8565 • 1 year ago +8

      @terraincognitagaming In fact, in many last-gen titles High and Ultra often had barely visible differences. However, that is no longer true for current titles. In Diablo 4, for example, you can see very clear differences in texture resolution between High and Ultra. And if you want to achieve the same texture quality as on consoles, you need a 16GB card to play it smoothly. And again: we are not talking about homeopathic differences but clearly visible ones.

    • @headoverheels88 • 1 year ago +5

      That's not entirely fair, considering one benefit of consoles is that they have ONE model (give or take), which means game developers can optimize for a single platform with predictable hardware. It's a similar dynamic to developing apps for Apple vs. Android: the diversity of models makes Android more difficult to optimize for.

    • @terraincognitagaming • 1 year ago +6

      @yasminesteinbauer8565 The reason the texture resolution is different is due to poor optimization, something The Last of Us proved. Look at the 1.0 version of The Last of Us: the Medium textures looked absolutely horrible. Guess what, patch 1.0.5 drops and WHAT! The texture quality literally quadruples AND the VRAM usage goes down by as much as 2GB depending on the scene. CRAZY, right? Yeah, Blizzard is doing the same thing.

  • @PropaneWP • 1 year ago +133

    This is much the case of "when the only tool you have is a hammer, every problem looks like a nail". A lot of hardware enthusiasts have developed a bad habit of looking at the problem from the wrong end. The whole philosophy of brute-forcing the performance of PC games is extremely backwards. Just throwing more hardware at software problems is a huge waste of resources and very quickly yields diminishing returns. Most gamers seem to revert to "my hardware isn't fast enough" whenever they encounter problems originating from a badly coded game.

    • @frozby5973 • 1 year ago +15

      It's true on both sides, really: VRAM isn't that expensive and they should add more, but at the same time that's not an excuse for game devs to make very badly made games. Recent releases are a prime example of that: games done very quickly, greedily, and lazily for the most profit.

    • @veduci22 • 1 year ago +14

      When you buy a 32GB RAM kit for $30 more instead of 16GB, nobody is concerned about "brute-forcing", but when you want more VRAM on lower-midrange cards that cost $300-600, while 8GB of GDDR6 costs ~$27, then suddenly it's a problem...

    • @andremalerba5281 • 1 year ago +1

      @@veduci22 ah another fellow follower of Broken Silicon

    • @Trick-Framed • 1 year ago +3

      It's not badly coded. Computer Science 101: 35% of programming is done first; the other 65% is bug fixing. This was not a joke, it was an actual hard rule. Optimization is 65% of a project 9 times out of 10.
      The issue we have been seeing is that a game coming out at the same time on all systems matters more than it actually running well on all systems. Due to the install base and the nature of consoles, they get programmed first. And consoles use mostly the same code as PCs now, so finding talented programmers who aren't already in a studio cranking out console games is a long shot these days. So we wait for the PC patches. What do they do? Allocate resources correctly for PC instead of defaulting to what this gen's consoles do.
      In the past it was always about throwing more horsepower at it; right now it's about optimization. And the only people who should be worried about VRAM are those playing at 4K or higher, especially high-refresh 4K. I am maxed out at 8GB in 4K 120. I definitely needed a 3080 10GB but ended up with a 3070 when I saw the window of opportunity to buy a GPU at retail disappearing as I clicked links. I have had to live with reduced settings since, which is what this video recommends, which makes me wonder: is the real issue the glut of noobs who have hit the PC scene since COVID, trying to figure out something they have no relation to after being console fans for decades?
      But I digress. Conspiracy theories aside, what people should primarily be concerned with is the time it is taking to optimize for PC. Take that into consideration, then hope the industry will start releasing only when ALL ports are polished and ready to go. Thing is, they never did before. Just because consoles and PCs share the same tech does not mean it works the same, nor does it mean AAA software houses will start cranking out PC games FIRST unless they already do. All the old rules apply, and I do believe they are confusing people.

    • @mickmoon6887 • 1 year ago +3

      It's true that modern gaming software standards have gone lenient and bloated over the years, using more system resources like VRAM, RAM, and CPU, thanks to lazy developers and company culture.
      It's also true that modern games need more VRAM to push good quality at higher resolutions. It's been almost two decades and the majority of gamers are still stuck at 1080p because GPU manufacturers refuse to increase VRAM capacity on their most common low- and mid-range GPUs. That's simply inexcusable: decades of slow progress because of the GPU duopoly mafia.
      Just 2-3 generations ago, a console generation ago, you could get an 8GB VRAM GPU, the max at the time, for under $300. Now you can't get the same performance and VRAM capacity at that price, or even at double it; the worst part is you have to pay double to get what you could have gotten back then. Compare the recent RX 6600 to the RX 580. Shame, AMD.
      VRAM, aka DRAM, has always been cheap, only expensive during shortages. Someone mentioned here before that it's cheaper to increase your RAM capacity than your VRAM capacity; way back, you could even upgrade VRAM just like RAM.
      Both the lazy devs plus industry culture and GPU company greed are at fault here, as both benefit from this mess at the expense of consumers.

  • @michaelkennedy320 • 1 year ago +341

    In 2023, any GPU that costs $200 or more should have at least 10GB of VRAM.

    • @MrBlacksight • 1 year ago +40

      16

    • @Micromation • 1 year ago +21

      Snatched a 2nd-hand 6700 XT in perfect condition for 250 a few days ago for my secondary PC - sweet, sweet 4060 Ti performance and 12GB of VRAM :D

    • @curie1420 • 1 year ago +27

      Good luck getting a $200 GPU... the best you can get is a 4050, which is probably just a 3050 with a lower bus width.

    • @Mayeloski • 1 year ago +3

      @@MrBlacksight 20

    • @Taluvian • 1 year ago +12

      Low end 12GB and midrange 16GB.

  • @Jkend199 • 1 year ago +45

    For me this is very simple:
    1. On a low-end card (that's under $200 USD, IMHO), yes, absolutely I expect to turn down settings, even at 1080p 60 FPS.
    2. On a midrange card (that's $200-400), expectations are determined by price... when you're talking $399.99 USD I expect 1440p 60 fps with no compromises; the closer I get to $200 USD, the more compromises I am willing to accept.
    3. When you're talking high end (cards over $400 USD), I expect 1440p 60 fps with no compromises, and once you hit $600 USD I want 4K 60 fps, yes, with settings turned down, but I expect 4K 60 fps from a card that by itself costs more than a current-gen console... I don't think that's even slightly unreasonable.
    Expectations are set by the price. If Nvidia/AMD want outrageous price premiums, I want outrageous performance.

    • @divanshu5039 • 1 year ago +3

      Totally agree with you.

    • @Z3t487 • 1 year ago +1

      "If Nvidia/AMD want outrageous price premiums, I want outrageous performance." How about more performance for the same price as last gen? You know, I was maybe interested in an RTX 4080, but not at a $1200 minimum. Maybe the price scales well against the performance of a last-gen RTX 3080, but I don't care; I'm not going to submit to the new logic, which is basically: goodbye better performance per dollar, hello "the more you pay, the better the performance, scaling (almost) perfectly with price."

    • @Jkend199 • 1 year ago +1

      I was thinking more 4070 coming up to 4080 performance, but staying at 4070 prices, not that Nvidia will ever do that.

  • @badbasic • 1 year ago +131

    I think the biggest issue is that texture quality is one of the settings, if not THE setting, that impacts visual quality the most. And even though it has such a huge impact, if you have enough VRAM it does not affect your performance at all.
    It's simple: if you have enough VRAM, you can go for the highest texture pack and it won't affect your performance; if you don't, it will either kill your performance or mess up the visuals, like Hogwarts and Halo.
    Almost every other visual setting has a performance price. No matter the hardware, unless you are CPU bottlenecked, you are trading performance for higher settings.
    What you can essentially do, even with an older card that has enough VRAM, is turn down other settings to get good enough performance, crank up textures for free, and the game will still look very good because of how important textures are.
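
    Back-of-the-envelope math on why textures are "free" on compute but heavy on VRAM: a texture's footprint is roughly width x height x bytes per texel, plus about a third more for the mip chain (uncompressed here for simplicity; block compression like BC7 cuts it about 4x):

        def texture_bytes(width, height, bytes_per_texel=4, mips=True):
            """Approximate GPU memory for one texture; a full mip chain adds ~1/3."""
            base = width * height * bytes_per_texel
            return int(base * 4 / 3) if mips else base

        # Uncompressed RGBA8: 1K ~5 MB, 2K ~21 MB, 4K ~85 MB per texture.
        for res in (1024, 2048, 4096):
            print(f"{res}x{res}: {texture_bytes(res, res) / 1024**2:.0f} MB")

    Sampling a bigger texture costs the shader cores almost nothing; storing hundreds of them is what fills an 8GB buffer.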

    • @Zero939 • 1 year ago +25

      This. His Ultra settings reference is misplaced. Textures are unrelated to performance unless the VRAM limit is hit, which is the origin of the anger in the first place.

    • @diegochavira5815 • 1 year ago +28

      Totally agree; this whole debate is just trying to justify 8GB cards... The ultra settings HUB is referring to are the ones that actually impact performance, and as you said, textures have no impact on performance whatsoever, but if you don't have a big enough frame buffer, everything goes south.
      Given the prices Nvidia and AMD are asking for these "midrange" cards, they should get at least 10 to 12 GB of VRAM.

    • @arenzricodexd4409 • 1 year ago +4

      In the majority of games you can lower textures from Ultra and still not see the difference. Even with side-by-side still pictures you really need to look for it. In RE4R, for example, you can set textures to High (2GB/3GB) and it can still look the same as High (8GB) unless you really play with the camera angle and hunt for the difference.

    • @georgebessinski2122 • 1 year ago +2

      @arenzricodexd4409 Textures are not the only setting in the RE4 remake that eats up VRAM. If you want to max it out and use RT, you will be left having to use the 1GB texture pack.

    • @arenzricodexd4409 • 1 year ago

      @georgebessinski2122 With the RE Engine I have always thought the games ask for more than they need, but in reality you can still play smoothly even without the recommended amount of VRAM. In RE4R I can push everything to max except shadows and volumetric lighting at mid (since I did not see a difference above that), no motion blur, hair strands off, RT off, textures at High (6GB), and my 3060 Ti gets mid-60s to mid-70s fps in open areas. Indoors it's usually 80 to 90 plus, sometimes over 100. And this is at 1440p.
      Many people see the game tested on cards with more VRAM, see how much VRAM the game ends up using, and conclude that it must run poorly or have issues on cards with less VRAM.

  • @titan_fx • 1 year ago +163

    Nvidia thought 8 GB of GDDR6 costs $100. What a time to be alive.

    • @Micromation • 1 year ago +17

      They probably want to force power users to pay up and didn't really think about gamers at all. Like, a 4060 Ti with 16GB would be the go-to card for many 3D artists who in the past continuously invested in Titans and learned to optimize their scenes around VRAM limitations because Quadros were too damn expensive. Yeah, the 4090 is faster, but speed is not that big an issue for most projects (heavy projects you push to a render farm anyway, because it's faster and cheaper than rendering at home); the VRAM is, because it limits the complexity of your scene, and you want that viewport render preview and the ability to churn out quick sample renders before you commit to the render farm.

    • @BenedictPenguin • 1 year ago +15

      @Micromation Aren't you forgetting it only has a 128-bit bus, among other cut-down specs? The more of that VRAM buffer you use, the slower the performance will be; the memory bus simply isn't enough for such capacity.
      I do agree it will be a pretty suitable 3060 12GB replacement for budget-oriented artists until something better is out there.

    • @WayStedYou • 1 year ago +17

      No, they want to keep their 70%+ margin on what costs 30 bucks.

    • @Micromation • 1 year ago

      @@BenedictPenguin it matters very little for this particular application.

    • @arenzricodexd4409 • 1 year ago +5

      @WayStedYou If companies sell their hardware near cost, how are they going to finance their R&D and pay the salaries of their engineers?

  • @thestoicwhinger • 1 year ago +5

    In an age where developers release unfinished games, is it really that surprising that efficient memory management isn't exactly a priority?

  • @0nyx11 • 1 year ago +21

    The main problem is that my GTX 1070 already has 8GB of VRAM, so I feel scammed paying so much money for a new card with the same amount, with no guarantee it will be good for more than a few years.

    • @BigHeadClan • 1 year ago +6

      To be fair your 1070 came out in 2016. You still have another year or two most likely out of the card.
      That’s an excellent run for a gpu.

    • @mannyvicev • 1 year ago

      @@BigHeadClan yea on low settings

    • @aquatix5273 • 1 year ago +1

      @@mannyvicev this is entirely because the GTX 1070 is an old card without good throughput in the modern era, not because of its VRAM. GTX 1000 cards on modern games just simply don't perform well anymore unless you have the GTX 1080 Ti, which had amazing throughput for its time.

    • @aquatix5273 • 1 year ago

      @tomrobinson2914 Yup. Amazing the card was able to keep up for so long, though; now its performance is that of the low range for modern generations.

    • @masterlee1988 • 1 year ago

      @BigHeadClan I don't have another year for my 1070; heck, I'm on 3440x1440 and badly need a stronger GPU. Hope to get one by the end of this year.

  • @Sp3cialk304 • 1 year ago +39

    Games developed for the current consoles are going to need a lot of VRAM on PC. Not only do the consoles have a decent amount of VRAM themselves, they can also stream massive texture files directly from the SSD, saving on memory and CPU usage. PC needs to start using DirectStorage more and requiring decent NVMe SSDs.
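
    A toy sketch of that streaming idea: keep only a coarse mip of each texture resident and pull the full-resolution data off the SSD the first time it is actually needed, so peak VRAM stays low. The per-mip file layout and names here are hypothetical, just to show the shape of the technique:

        def load_mip(path, mip_level):
            """Hypothetical loader: each mip level stored as its own file on the SSD."""
            with open(f"{path}.mip{mip_level}", "rb") as f:
                return f.read()

        class StreamedTexture:
            def __init__(self, path, coarse_mip=4):
                self.path = path
                # Only a coarse mip resident up front (4 levels down = 1/256 the bytes).
                self.mips = {coarse_mip: load_mip(path, coarse_mip)}

            def touch(self, mip_level):
                """Stream in a finer mip the first time the renderer asks for it."""
                if mip_level not in self.mips:
                    self.mips[mip_level] = load_mip(self.path, mip_level)
                return self.mips[mip_level]

    What DirectStorage adds is making those on-demand reads fast and cheap enough (batched NVMe requests, GPU decompression) that the game never has to keep everything resident "just in case".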

    • @tohro6969 • 1 year ago +1

      Wasn't Windows 11 supposed to solve this issue? And then we probably need the games themselves to support it as well.

    • @zues287 • 1 year ago +9

      @tohro6969 DirectStorage has been out for a few months, but game developers need to design their games around it. Diablo IV is the first game to fully support it (Forspoken had half-baked partial support). When all new games start supporting DirectStorage, it should mostly fix the low-VRAM issues.

    • @Sp3cialk304 • 1 year ago +1

      DirectStorage will help. From interviews I've heard with game devs, they are using over 8GB even without the textures, but streaming the textures directly definitely would help. It will take a few devs saying you have to use a fast SSD for this game. But that's scary for a dev, because anytime they release something with any kind of hardware requirement, people lose their minds and start screaming about optimization.

    • @CyberneticArgumentCreator • 1 year ago +2

      You're not making sense.
      The consoles have shared memory, so the games made for them use... all of the memory. On PC, they are unoptimized, and instead of using system memory and VRAM together in an efficient way, like PC games do, they try to smash everything into both RAM AND VRAM. Think about a game that takes 10GB of system memory and 5GB of VRAM, which is under the PS5's 16GB of memory. Then imagine that game is programmed against a console's unified memory, and the port shoves the system-RAM-y stuff into system RAM and then the entire thing AGAIN into GPU VRAM, so it thinks it needs 15GB of VRAM. This is way oversimplified and the numbers are made up, but this is conceptually what is happening with many of these ports.
      There's some porting process these companies are using that shoves the square console peg into the round PC hole. Whatever it is takes fewer labor hours but has a shitty result. They save money and get what they pay for in the final product: a shitty game that runs poorly.
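
      The comment's made-up numbers as a worked example: on unified memory the working set lives once, while a lazy port keeps the RAM-side copy and mirrors everything into VRAM as well (figures are purely illustrative):

          # A well-behaved PC port splits the working set across the two pools:
          good_ram_gb, good_vram_gb = 10, 5              # 15 GB total, fits a PS5

          # A lazy port keeps RAM usage AND mirrors the whole set into VRAM:
          lazy_ram_gb = 10
          lazy_vram_gb = good_ram_gb + good_vram_gb      # "needs" 15 GB of VRAM
          print(lazy_ram_gb + lazy_vram_gb, "GB vs", good_ram_gb + good_vram_gb)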

    • @Sp3cialk304 • 1 year ago +1

      @CyberneticArgumentCreator You do understand that a lot of data that goes to VRAM also goes to DRAM on PC. So the consoles having shared memory means the CPU and GPU can pull from the same memory pool instead of putting the same data in two pools. That, along with direct asset streaming from their SSDs, makes them very efficient.

  • @Pidalin • 1 year ago +9

    8GB would be more than enough if games were not broken and developers had heard of optimization.

    • @ramtinbass • 3 months ago +1

      I don't know for sure, but I feel some developers intentionally de-optimize their games to underperform at Ultra, Very High, or even High (Halo Infinite, say) to push gamers toward GPUs with more VRAM, which are typically the ones from 2020 onward. They urge gamers, especially kids obsessed with extreme texture quality, to go tell their dads to buy high-end cards with more VRAM, and since Nvidia is much more popular, and Ray Tracing and DLSS have kept the brand on top of the market, they usually go for the cheaper 3080 12GB or at least a 3080 Ti. The 3090 and 3090 Ti are overkill for most gamers even at 4K, and with new releases like the 4070 many are lured to Nvidia by Frame Generation, which is still in its early stages (call it a pre-alpha test run), so they buy 4080s or a 4070 Ti Super, which are incredible but a serious budget.
      Turning down settings, especially texture quality, doesn't make most newer games look bad at all, except maybe a few, unless you are on a large 1440p monitor or at 4K, where it matters, or you play first-person games like Cyberpunk, where texture quality matters most and is evident even at 1080p; still, turning it down a notch doesn't massively impact quality. Yes, many devs don't optimize on first release, then do it a few months later. Also, Nvidia is planning an AI image-compression scheme to help sell the remaining 8GB 3000 and 4000 cards. You can go AMD and escape these issues, but at the cost of weaker ray tracing performance and the poorly regarded FSR. Nvidia cards also use at least 1GB less VRAM in newer titles, but that still won't make up for the lack of VRAM.
      You can still play with 8GB, but it's not advisable unless you're on a serious budget. You could also go for the slower 12GB 3060, but you'll need to turn down shadow quality and things like MSAA, and DLSS is required to play okay at Ultra settings. The best option is to wait for the 4000 series to drop in price: go for a second-hand 4070 Ti Super if you want to spend for 4K; otherwise a 4060 Ti 16GB might do for 1440p, but beware of its modest horsepower and 128-bit bus; a large L2 cache may not make up for that completely.

    • @Pidalin • 3 months ago

      @ramtinbass I don't think they necessarily do it on purpose, but today's game engines enable this laziness with all their automatic tools, so devs don't have to optimize everything manually. Automatic optimization tools don't work that well; the only way to do it really well is the old-school way, manually, but the head doesn't know what the heel is doing in today's massive studios with hundreds of people. Games look pretty much the same as 10 years ago, but you need 3x better hardware for no reason. I just don't like this trend. I remember better times when hardware requirements were related to how a game looks; today you just need better hardware for no reason and it still looks the same, and when I say this, most people just insult me: "buy a better computer, you have a potato PC." They absolutely don't understand what I am talking about. I want games like 2010-2015 and more FPS with today's hardware; I don't need Unreal Engine 5, where you need an RTX 4080 for 2015-level graphics for no reason.

    • @ramtinbass • 3 months ago

      @Pidalin Let me put it this way: what you say about how they could optimize their games is absolutely correct, maybe even down to 6-6.5GB of VRAM; a true expert would know, and I'm not one. But some games may intentionally skimp on optimization to hit two birds with one stone: they help Nvidia sell higher-end cards while putting in less effort and cutting their own costs, and then, when sales drop, both Nvidia's and their own pockets are full enough to put in the extra effort and sort things out. I suspect The Last of Us of this; the difference after patching is hilarious. The same goes for Watch Dogs: Legion. But in games like Halo Infinite, where 8GB struggles at High and is only okay at Medium even after patching, that's absolute laziness, which could be the case for most games released in the future.
      RE4 Remake crashed after filling over 5GB on my old RX 580; the same happened to people with any 8GB card, even 3070s, and after patching it was fixed, a huge jump. The strange thing is the graphics settings tab showing around 12GB at 1080p, which sounds suspicious to me: why 12GB, when the famous 2023 cards, all the 4070s besides the Ti Super, get 12GB?!!! Watch Dogs went beyond 10GB at max settings at 1080p as well. Even 3080 Tis and 3080 12GBs were harder-selling cards, and that would help move some of them. In my country (Iran) there are a lot of 12GB 3080s and 3080 Tis nobody will buy; they sit on shelves for years, since most people here struggle for basic needs right now, even though the PC market is going strong. A 10GB 3080 is at least 70% cheaper than a used 12GB or Ti, and at least half the price of a brand-new one, and second-hand 12GBs are very rare. A 3060 12GB costs the same as or more than a 3060 Ti here, since many people do rendering work; that is one reason the 10GB 3080, which runs 99% of games at 4K max with no issues, stays cheap. Another is that you need a powerful PSU (>850W), which costs a third of the price of the cheapest 3080 10GB on the market.
      Cyberpunk 2077 is a complete example confirming what you said; it's literally an Nvidia benchmarking tool. On my 3060 12GB I have never seen it use more than 8GB at 1080p, and it has some of the most detailed textures of any game from 2020 onward. How does it manage so well that no 8GB Nvidia 3000 card runs out of VRAM even with path tracing? I tested everything maxed out plus path tracing (not in photo mode), DLSS Quality: 30-35 FPS in the city, rarely 25-30 momentarily or when seated as a passenger in a car (12100F on my system; a 6800 XT performs between a 3060 and 3060 Ti with path tracing on, 10400F), 11.5-12GB allocated but only 8GB used at 1080p!!!! The 3060 Ti, 3070, and 3070 Ti get better FPS according to their strength. So if they want to optimize, they can optimize; they just don't put in the effort, and game devs behave worse than Nvidia. Look how badly optimized GTA V's vegetation is with high MSAA and high-resolution shadows, how poorly it loads assets, holds unnecessary stuff, deletes necessary stuff like cars from RAM, and puts unnecessary extra load on RAM for unknown reasons, plus the stupidly downgraded mechanics from GTA 4; still, people pay and play. Same with EA: lazier than ever, maximizing profit for the least offering.
      CDPR didn't do that; it was buggy but now it's incredible. The younger, lazier, easier-going generation just pays huge amounts when excited by trailers, graphics, and high-end cards; they buy without thinking economically. That lets Nvidia and most developers take advantage of them. Many still pay for 4060s because they heard "Frame Generation" without even knowing what it means, then complain about low VRAM they didn't think through before buying. This will happen with the 5000 series as well: they offer something exciting on the outside, people buy the expected 8GB 5060s hoping for Nvidia texture compression, then they ramp games up to 1440p, find 8GB struggling even with texture compression, and buy a 5080 a few months later. Just look at how many gamers did this with 4060s and 4060 Tis and went for 4070s after; Nvidia sold two graphics cards to many gamers in the last year. So it's the consumer's fault as well. Many need to learn to control their urges, gather enough money, and buy higher up; when the lower end doesn't sell, prices drop and VRAM increases. It's the hard truth and one of the factors of today's corrupt industry: devs like to take advantage of buyers, squeezing every penny out of consumers' pockets, then offering "budget" GPUs that look fancy on the outside, and people buy. (Sorry it's long, but it's worth the length, in the hope of giving any uninformed reader a clear picture of what's going on.)😑

  • @ImperialDiecast • 1 year ago +6

    It's funny how ALL cards are affected, not just the 8GB cards people like to dunk on.
    For instance, the 8GB-at-1440p issues of today are also the 12GB-at-4K issues of today's more expensive cards. You buy them because they are powerful enough for 4K, but at 4K, VRAM usage takes another jump and occupies between 9 and 11 GB as of today, and tomorrow it will exceed 12GB.
    That being said, it doesn't change the fact that the biggest culprits here are the games themselves, as the updates show: reducing the gap between allocated VRAM and actually used VRAM, and fixing VRAM leaks. Halo Infinite, Hogwarts Legacy, and Forspoken still have lazy devs, but Callisto Protocol and The Last of Us have been fixed, and Forza 5 also works fine despite high VRAM requirements.
    So at the end of the day, exceeding VRAM shouldn't automatically gimp your framerate or cause texture pop-in everywhere; it will depend on the game. Don't rush to buy a slower card with more VRAM over a faster card with less VRAM. There are enough benchmarks out there proving that, e.g., a 3060 Ti still outperforms a 3060 even in games that require more than 8GB of VRAM.
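
    One way to check the used side of that allocated-vs-used gap yourself (what the driver has actually handed out on the whole device, as opposed to a game's in-menu estimate) is NVIDIA's NVML, assuming an NVIDIA GPU and the nvidia-ml-py package:

        import pynvml  # pip install nvidia-ml-py; NVIDIA GPUs only

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # device-wide, in bytes
        print(f"VRAM: {info.used / 1024**3:.1f} / {info.total / 1024**3:.1f} GiB used")
        pynvml.nvmlShutdown()

    Sampling this while a game runs gives the device-wide figure reviewers usually quote, which is often well below what the in-game settings screen claims to require.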

    • @noway8662 • 1 year ago

      The 3060 is also cheaper. I don't really see any real argument there. The 3060 just has the extra perk of having more memory.

  • @Dulkh4n • 1 year ago +77

    The thing is not that you should play on High because it's the rational thing to do; it's that mid-tier cards in 2023 shouldn't force you to do it at 1080p for 60 fps... it's a $400 card... that would work far better with 12GB of VRAM.

    • @TheTerk • 1 year ago +1

      That's why I classify the 4060ti as a mainstream card. It's way too expensive where it sits in the market

    • @aaren117 • 1 year ago +12

      It's not just 2023, these cards are being reviewed now but they will be on sale for about two years. So someone could go "oh, it's tight but it'll do" before pulling up VRAM Punisher 2025 and just feeling cheated. This is especially problematic when memory right now is cheaper than it's really ever been.

    • @zushikatetomotoshift1575 • 1 year ago +3

      @TheTerk I am waiting to see how AMD does over the next year or two; an RX 7900 XTX for 500 US dollars.

    • @zushikatetomotoshift1575 • 1 year ago +9

      @TheTerk Also, the 4070 is a rip-off too.

    • @Magnulus76 • 1 year ago

      1080p as a resolution is still relatively demanding. It's over 2 million pixels rendered dozens of times per second. If you add in additional effect passes like screen space reflections or raytracing, it will be that much more demanding. That's all about having adequate numbers of shaders, not necessarily VRAM.

  • @27Zangle • 1 year ago +6

    I've always started with High settings and then fine-tuned each game. Playing Ultra is so hard on nearly any system, and over time I learned that dropping a setting with some fine-tuning not only increases the lifespan of the system but also saves some money on utilities, AND I could never really tell the difference graphically.
    I honestly think many people, even those who know how to build and throw tons of money at components, just set their game to Ultra and call it a day. They never really spend time in the game's settings or the system settings as a whole.

    • @masterlee1988 • 1 year ago +1

      Yeah I do agree with you on that, I normally just play on High settings for the most part along with setting textures to Ultra and call it a day.

  • @ForTheOmnissiah • 1 year ago +10

    I'll just be dead honest: as an indie game developer, I'll say that the VRAM usage issues are completely and utterly a problem of developers being lazy and not optimizing. Period. Consoles like the PS5 run games just fine with 16GB of total shared memory (both system and video), while on the PC side we now have minimum requirements of 16GB of VRAM. Why? Starfield will be released on Xbox Series X and Series S, and the Series S has a grand total of 10GB of memory, system AND video combined. If the Series S can run this game in 10GB of memory *total*, why can't a PC with 8GB of VRAM and 16GB of RAM? The serious issue here is developers not putting much time into PC ports. The minimum requirement for Starfield is 16GB of VRAM, which is more than the entire memory of the Xbox Series S. That's a developer problem, not a hardware problem.

    • @Thakkii • 1 year ago +1

      Hold on, where did you see a 16GB VRAM minimum requirement for Starfield? Are those requirements even out? Because the Steam requirements for the game are a placeholder and don't mention VRAM reqs, at least as far as I remember.

  • @Viking8888 • 1 year ago +17

    I read something interesting recently (I can't find the article again though). It was saying that with DX12, game devs would have to do a lot more coding when bringing assets into VRAM, which made dev time way longer. What the devs decided to do was just load as much into VRAM as possible to make things easier on themselves. This willy nilly take on things definitely would make things less optimized, especially if the wrong assets were loaded in VRAM when others are needed. Those assets would need to be purged and the correct ones then loaded.

    • @TheTerk • 1 year ago +8

      sounds like we should hold developers as accountable as the GPU manufacturers, no?

    • @Some-person-dot-dot-dot • 1 year ago +3

      @TheTerk No, not necessarily. Developers have been developing with 8GB for a while; it's acceptable to think they need a little extra headroom after 7-8 years. I believe at least 10 gigabytes of VRAM is the new appropriate minimum for any gaming card. Anything lower should be considered entry level. There will always be a market for 8GB cards, but you have to price those kinds of cards properly. Nvidia says "the more you buy, the more you save..." Why is that not allowed to be true for game developers?

    • @Viking8888 • 1 year ago +3

      @@TheTerk To a certain degree, I would say yes because for dx12 games, at least, they are taking an easier route that definitely could be a cause of unoptimized games. I'm sure there is a whole lot more to things than just what I mentioned though.

    • @Greenalex89 • 1 year ago +1

      I think this trend will continue, and more L3 cache will be beneficial too. Usually the publisher is to blame for not giving the devs enough time to optimize.

    • @stangamer1151 • 1 year ago +1

      The biggest problem here is that some modern games are way too demanding in terms of VRAM for what they offer in return. Texture quality has not really improved that much since 2020-2021, so why do these games work so badly on 8GB cards?
      And judging by the current state of the games that were patched, developers could have released them in a much better state right at launch. If that had happened, there would not be any big complaints about the 8GB buffer becoming obsolete.
      To play at console settings these days you really need a 10GB GPU, but 8GB ones should deliver a good gaming experience as well, without any major issues.

  • @donatedflea • 1 year ago +8

    The problem is that privileged people these days expect a midrange card to run ultra settings at higher resolutions.

    • @gozutheDJ • 1 year ago +1

      yeah

    • @Keloot • 1 year ago

      And they should.
      If you make something exclusive, fewer people buy it. Fewer people buying means less money for you...

    • @gozutheDJ • 1 year ago

      @@Keloot take me back to when pc gaming was niche and wasn’t full of entitled babies who know nothing about PC

    • @darshankumar2717 • 1 year ago +1

      Mid-range cards should be priced like mid-range. Imagine paying $400-500 for a card and still having to compromise on current games.

    • @gozutheDJ • 1 year ago

      @@darshankumar2717 imagine having your ego so fucking wrapped up in an ultra settings boner you refuse to adjust anything

  • @XieRH1988 • 1 year ago +16

    None of the Crysis games ever came close to needing 8GB of RAM. Honestly, I think game devs these days just don't know how to make a AAA game look good without brute-forcing it with insanely high-res textures or other unoptimized visual assets. Conveniently, it works in favour of the GPU cartel, who are more than happy to sell you all that VRAM you suddenly now need.

    • @HuntaKiller91 • 1 year ago

      True, even Crysis 3 looks better than some games and performs well, even on my Deck.
      Ryse is another good-looking game.

    • @TBKJoshua • 1 year ago +7

      True but in 2023, texture squeezing and vertex optimization should be much less of a burden. Devs shouldn't still have to do all of that manual optimization because Nvidia doesn't want to give GPUs basic modern vram specs. I see where you're coming from though.

    • @XieRH1988 • 1 year ago +2

      I definitely don't think Nvidia is off the hook. Their business practices are so shady, I'll bet they'd try to sell a 4GB VRAM card in this day and age if they didn't have any competition.

    • @gozutheDJ • 1 year ago

      crysis 3 is a DECADE OLD

    • @tumultoustortellini • 1 year ago

      GTA 5 came out a decade ago and still looks better than some games today, and runs far better than others. Metro Exodus came out in 2021 and same thing.

  • @goldfries • 1 year ago +1

    Yup, in many cases one can't even tell the difference unless you screenshot the scene and do a picture-by-picture comparison.

  • @nightlyfrost • 1 year ago +10

    If you are spending over £350 on a gaming card, you want to make sure it's somewhat future-proof. You also want to enjoy current and future games at their best settings. Graphics cards aren't cheap, and people aren't willing to spend on a new one every year. My 1060 6GB is still going strong; granted, the settings are at their lowest, but I can still play games on it. In this day and age, I would think 10 or even 12GB could justify a £350 price, but 8GB is definitely not worth it. It also doesn't help that gaming companies are rushing to release games that aren't fully optimized, which causes gamers to rant.

    • @christinaedwards5084 • 1 year ago

      You missed a trick here.
      What you should have done is upgrade to the 2060, then the 3060, then the 4060 when they released.
      You could have got a decent amount back; I'm seeing a 3060 going 2nd hand on eBay for £240. (Should be £220 used, imo.)
      Say you bought all those cards for £300 and sold for £220:
      going from a 1060 - 4060 or a 1060 - 2060 - 3060 - 4060 would still have cost you £300 either way, but you'd be far less worried about this whole situation you find yourself in.
      Trying to sell a 1060 in 2023 will be a tough sell.

  • @wahidpawana424 • 1 year ago +2

    The reason GPU makers are getting away with overcharging for high-VRAM GPUs is that we do not call out game developers for poor optimization of video games. Disproportionately diminishing returns when increasing graphics fidelity from Medium to High and then to Ultra are unacceptable.

    • @saricubra2867 • 1 year ago

      Meanwhile the memory usage increasing from almost 1GB to 3.5GB on 2007 Crysis is glorious. I finally managed to experience that game properly at 1080p (on Intel UHD 770 maxed out).

    • @gozutheDJ • 1 year ago +2

      WTF are you even talking about?
      Medium settings these days look a million times better than they did 10 years ago. "Disproportionately diminishing returns"? THAT'S WHY THE SETTINGS ARE THERE FOR YOU TO TWEAK YOURSELF.

    • @wahidpawana424 • 1 year ago

      @gozutheDJ Why are you arguing against the very point we agree on? Please read my comment again. Going from Medium to High is a diminishing return in most games. It does not justify the performance and VRAM cost.

    • @gozutheDJ • 1 year ago

      @wahidpawana424 I wouldn't say Medium to High is that diminishing, but if you think so, feel free to set your games to Medium then. No clue what you are trying to say; the configurable settings are there for a REASON.

    • @wahidpawana424 • 1 year ago

      @gozutheDJ The point is game developers need to optimize their games better.

  • @paintitblack4199 • 1 year ago +5

    The problem is that people in 2023 expect to play ultra-high settings at 4K on low and midrange cards, when it has never been that way before.

    • @WayStedYou • 1 year ago

      Yeah, it was: 2 years after the PS4 came out, you could run PS4 games at double the frame rate at max settings with a GTX 970 or 1060.

    • @Magnulus76 • 1 year ago

      @WayStedYou That was a long time ago. Modern games are a lot more graphically demanding, especially in terms of the number of shaders used.

    • @zdspider6778 • 1 year ago +1

      In 2017 you could run Titanfall 2 (one of the best looking games at the time) at 4k medium on a GTX 1070.
      Wtf happened?
      It seems like we're going backwards. Now Nvidia wants you to play at 1080p and use their shitty upscaler ("DLSS"). I blame ray tracing. 3 generations later and people still turn that shit off.

  • @richardfarmer6570 • 1 year ago +37

    I said this from the beginning: just because some recently released, terribly optimized games require over 8GB of VRAM doesn't mean you suddenly need 16GB to ever game again. The improvement from updates to these games shows that. Obviously, proper settings are equally important.

    • @ashkanpower • 1 year ago +1

      You are right, but as we progress, game development gets harder and we can expect more unoptimized games.
      TLOU1 looked terrible on High and Medium for a couple of months; the experience was frustrating. Only Ultra settings looked good, but my 3070 couldn't handle it.

    • @SweatyFeetGirl • 1 year ago +7

      That doesn't mean you won't need more than 8GB in the near future, though... 8GB VRAM cards shouldn't cost more than $200 in 2023.

    • @JABelms • 1 year ago +2

      @SweatyFeetGirl I played at 4K with my 980 Ti from 2014 to 2020 with just 6GB of VRAM; I played that unoptimized Kingdom Come: Deliverance on High just fine, even Skyrim with 500 or so mods. I own a 7900 XT now and even I wonder about all the VRAM screaming. Games are supposed to be more efficient now, and VRAM doesn't contribute much to GPU prices; it's cheap... NVIDIA is just pushing their DLSS agenda with all this expensive crap.

    • @SweatyFeetGirl • 1 year ago +1

      @JABelms 8GB of GDDR6 memory costs $25. Nvidia sells the 4060 Ti 8GB for $399 and the 16GB 4060 Ti for $499; they quite literally make 3x profit on that, lmao.

    • @Battleneter • 1 year ago +1

      Even though longer-term PC gamers, myself included, are very used to tweaking settings, the root of the argument is that you should not "need to" from a VRAM usage perspective. No, 8GB cards are not acceptable in 2023, especially given the price hikes of recent years and how dirt cheap VRAM now is.

  • @JustGaming24 • 1 year ago +3

    So 12GB of VRAM for 1440p should be fine for at least 3-4 years?

    • @GmodArgentina • 1 year ago

      12GB will likely be fine for all of the PS5's lifetime.

  • @determination9296 • 1 year ago +1

    The first gen of ray-tracing cards (2080 Ti) had 11GB of VRAM, and the counterparts, the 3080 and 3080 Ti, have 10GB and 12GB... no improvement AT ALL. The 4070? 12GB. Ti? 12GB. The 4060 Ti? 8GB!? The RTX 2060 SUPER had the SAME amount of VRAM, and you dare call it anything above a 4050/4040/4030? Pathetic. The 1080 Ti with 11GB of VRAM still proves Nvidia isn't evolving, and is devolving at times.

  • @TheSanien • 1 year ago +3

    90% of the time it is just bad coding and optimization. Most likely because developers have tight deadlines.

  • @navyjonny117tng • 1 year ago +1

    There's no problem with 8GB of VRAM. The main problem is that devs are just too lazy to optimize games to perform properly and fit entry-level GPUs such as 8GB cards.
    If games are optimized properly, there won't be issues with 8GB VRAM cards.
    Look at The Last of Us Part 1... that game hasn't been optimized, hence why it eats a lot of VRAM even on 12GB or higher GPUs.
    GPU manufacturers like Intel, Nvidia, and AMD still make 8GB GPUs. They are not the problem; the issue is with the devs.

  • @Simsationxl • 1 year ago +5

    1080p was cool when I was in high school.

    • @zdspider6778 • 1 year ago +1

      It became "mainstream" around 2009-2010, maybe even 2011.

    • @FilthEffect • 1 year ago

      @zdspider6778 Try 2008.

  • @mythicenchilada8371 • 1 year ago +1

    Shouldn't the mid-to-low range be at 1440p by now, and no longer 1080p? It's been at 1080p for way too long.

  • @kaptennemo1043 • 1 year ago +5

    Please don't show that Nvidia CEO, I hate his face. It makes my wallet scream.

    • @Kitten_Stomper • 1 year ago

      😂😂

    • @kaptennemo1043 • 1 year ago +3

      @saltee_crcker2363 Because I care about GPUs, so the worst goes to Jensen.

    • @Micromation • 1 year ago

      @@saltee_crcker2363 Always Bobby Kotick...

    • @zues287 • 1 year ago

      ​@@saltee_crcker2363 Zuckerberg is just creepy. Jensen is cringey as hell. His entire speech at Computex was full of cringe. On the upside, he gave us plenty of ammo for memes.

  • @Bill-lt5qf • 1 year ago +2

    I'm confused as to why even 8GB is being saturated. Depending on the size of the game, it would suggest that a huge chunk of the game is just sitting in VRAM. Say the game is 64GB; then 8GB is one eighth of the game, or 12.5% of it. That seems like a lot more than necessary.

  • @Sprngtime • 1 year ago +6

    People for some reason don't understand that 7-year-old cards like the RX 580 can still run The Last of Us at 40+ fps with fairly high-quality graphics... 8GB will always be enough, you just need to tweak the settings a bit.

    • @AAjax • 1 year ago +7

      Not very long ago, some people were saying that 4GB would always be enough for any game.
      TheTerk has it right: if you care about AAA games, you need to compare your GPU to consoles. 8GB GPUs will "always" be enough, until consoles are offering significantly more than 8GB.
      The Xbox Series S is the only thing keeping AAA developers from abandoning 8GB, and developers are complaining hard about having to support the S.

    • @damazywlodarczyk • 1 year ago +1

      @AAjax What you're saying is nonsense. I played Ghostwire: Tokyo (a PS5 exclusive) and The Last of Us remake, also PS5, with a GTX 1060 6GB at 50-60 fps. It's not the Series S; these consoles have 16GB of shared RAM, which is not much. PC ports are just weak.

    • @Sprngtime • 1 year ago +1

      @AAjax It will be at least 5 more years before they cut support for 8GB cards; new triple-A titles are also more or less CPU-heavy now, so personally I'm really not worried.

    • @curie1420 • 1 year ago +5

      @AAjax The point was that 8GB on a $400 card today isn't good, since we saw 8GB GPUs under $200 seven years ago!!

    • @JackieRipper_AKA_Legs • 1 year ago +2

      I guess the only argument for 8gb being too low is that you have to care about AAA games. Most people don't and for good reason, nothing good has come out in the past 5 years.

  • @DJ_Dopamine • 1 year ago +2

    I always use my own optimised graphical fidelity settings. Unless they are simply not needed to hit the fps cap.
    The difference versus Ultra/Max is basically unnoticeable to me anyway.
    Texture quality is something I try to keep as high as possible, that has the biggest effect on the look of a game.
    For the rest, I'll dial them down until I notice a significant degradation.

  • @nastytechniquez9685 • 1 year ago +5

    I've got 8GB on a 3070 OC and can get great frames with lowered settings at 1440p, especially when I turn ray tracing off or to the minimum if the settings allow it. I don't mind right now, but eventually I'll want a high-end card with lots of VRAM for 4K, when monitors are more affordable.

    • @DenverStarkey • 1 year ago

      I've had a 4K TV as my monitor since 2018. I suffered through 4 years of not being able to play new games at 4K, then in October 2022 I finally got a 3070 because all the benchmarks showed it could pull off 4K decently well at high or ultra settings (in some games), and less than 3 months later there were 4 games making that card's VRAM inadequate for 4K high settings. Now there are almost a dozen. Long before I get my 450 bucks' worth out of this card, it won't even be a good card for high settings at 1080p, let alone at the native 4K resolution of the TV I've owned for 5 years. You bet I'm pissed off and wish I had gotten a Radeon 6800 XT; they were both roughly the same price (used on eBay) when I got my used 3070.
      EDIT: Decent-quality 4K TVs can now be had for less than many 1080p-1440p video cards and much less than actual 4K-capable video cards (some Samsung models cost 330-450 bucks). When your 4K TV costs 2-3x less than the hardware to run it, that is batshit and just wrong. For that matter, good-enough 4K monitors aren't much more expensive than the TVs either.

    • @valorantpro-zi4yd • 1 year ago

      @@DenverStarkey amd is BULLSHIT'S SHIT

    • @frozby5973 • 1 year ago

      What do you play that you need lower settings? I play everything on max settings with a 3070 at 1440p, apart from RTX in some games where it doesn't make any difference, like Elden Ring, and get 70+ fps everywhere.

    • @DenverStarkey • 1 year ago +1

      @frozby5973 Yeah, I play everything maxed out as well, and I'm usually running at 4K on a 3070. There are just a few games I can't do this in, and a few others where I run DLSS to get 60 fps upscaled to 4K. Don't know what the OP is on about... "low settings" LOL.

    • @frozby5973 • 1 year ago

      @DenverStarkey The guy is probably CPU bottlenecked and blames it on GPU VRAM lol

  • @andresilvasophisma • 1 year ago +1

    How can you say it's fixed when you're paying more now for a 1080p graphics card than you were 6 years ago?
    You need more VRAM if you want to future-proof your graphics card.

  • @emanueldumea8217 • 1 year ago +4

    I have an RX 6600 and a Ryzen 5 3600, and I can play many games at High settings at 60 fps. For Red Dead Redemption 2 I use a mix of High and Medium settings with textures on Ultra to get 60 fps. There are a lot of great games that everyone can enjoy with 8GB of VRAM. Developers are just too lazy nowadays.

    • @valenrn8657 • 1 year ago

      The RX 6600's 8GB is below the PS5's GPU memory allocation. The RX 6700 XT is a good approximation of the PS5 and XSX.

  • @saricubra2867 • 1 year ago +2

    This VRAM fallacy has to end; by that logic, an APU would be the best GPU.

    • @TheTerk • 1 year ago +1

      I've got some fun videos coming up with desktop APUs :)

  • @eliadbu • 1 year ago +4

    The issue is more about what game developers will want to do in the future. I've heard two interviews already with devs saying that if they want to push graphical fidelity further, say more detailed hair or facial features (eyes, mouth, etc.), they will need more VRAM. There is a point where they can't optimize any further for low-VRAM cards, because it takes so much time it may not be worth it. Sure, ultra settings are stupid if they offer little improvement for a huge performance cost; I can't say that about texture quality, because if you have the VRAM, the performance penalty is usually not high but the gains are impactful. The issue is that Nvidia (and AMD, to some degree) standardized mid-range cards costing up to $400, or $270, on 8GB of VRAM. If a $500 console that came out 2.5 years ago has 16GB of VRAM, I'm sure they could match, or at least improve, the VRAM offerings on their mid-range cards. You can blame the users who use ultra settings, you can blame the devs for not optimizing their games, but the fact that a $230 card from 7 years ago has the same memory capacity as a new $400 GPU today points to an issue with today's cards.

    • @Serpic
      @Serpic Рік тому

      "console that came out 2.5 years ago have 16GB of VRAM", not VRAM, just a RAM (technically there is video memory), or Shared RAM between RAM/VRAM. If games on a PC consume about 6-8GB of RAM, then I can assume that in consoles this shared memory (16GB) is divided approximately equally between VRAM and RAM. For example, Hogwarts Legacy (PC port) uses about 10 GB of RAM and about 7 GB of the same RAM is allocated for cache. Total 17GB is only RAM "required" by this game, not including VRAM. And now let's remember how much TOTAL memory is in the console.

    • @eliadbu
      @eliadbu Рік тому

      @@Serpic Technically it does not matter. The point is there is 16GB of GDDR6 memory in these machines. It is about cost: what you pay and what you get. These cards should have had more memory, all things considered, both from a technical standpoint and economically.

    • @sparkythewildcat97
      @sparkythewildcat97 Рік тому

      I'm fine paying $250 (the current selling price for AMD's only 8GB GPU from the new gen) for a GPU that has 8GB. Sure, ideally it would be 10GB, but it's fine. However, 8GB on a $400 GPU (which is a completely different class of product) is just offensive, especially when you can buy a last-gen GPU that is nearly as strong and has 12GB for about $100 less.

    • @eliadbu
      @eliadbu Рік тому

      @@sparkythewildcat97 TBH I don't really get the point of the RX 7600 at an MSRP of $270; it would make sense at $200-230. You're getting a bit better performance vs. the 6650 XT and some other small improvements from RDNA3, and that's it. With the 7600 you get ripped off less, but it's not a great improvement anyway. Currently you can get it for under $260; I would not be surprised if it's available in several months for $230, maybe less.

  • @ShinesThroughYourEyes
    @ShinesThroughYourEyes Рік тому +2

    Me still having 3GB of VRAM and happy in 2023

  • @tapioorankiaalto2457
    @tapioorankiaalto2457 Рік тому +3

    I just bought a used RTX 3070 with a warranty for 300€ because of this panic, so I'm not complaining. They used to cost 450€ just a couple of months ago, and this one was 1050€ in 2021 when it was new... 😅

    • @ManchmalGaming
      @ManchmalGaming Рік тому

      A steal! Bought a used one a week ago for about $370, and I don't see 8GB as a big issue since I play at 1080p. Also don't need to play everything on max.

  • @fridaycaliforniaa236
    @fridaycaliforniaa236 Рік тому +1

    Meanwhile, if you pay 70 dollars for a game, you'd better be able to use *all* the graphics quality instead of staying on a lower preset just "because 8 gigs". I find it stupid asf that we now have to feel guilty for trying to get the most out of the games we buy. Like it's our fault if we're not happy to have a card that would have been considered a scam 2 years ago, and we have to pay full price for it... 😏🤦‍♂

  • @badass6300
    @badass6300 Рік тому +4

    What you forget is that consoles have unified shared memory, meaning the memory is used by both the CPU and GPU: the maximum allowed for the GPU is 8-10GB, and the CPU gets 6-8GB.

    • @belliebeltran4657
      @belliebeltran4657 Рік тому

      That doesn't make any sense.

    • @badass6300
      @badass6300 Рік тому

      @@belliebeltran4657 Yes it does... the consoles have an APU; their GPUs are integrated and share the memory with the CPU, just like a desktop APU would. Meaning that out of those 16GB of RAM, the CPU uses 6-8GB and the GPU 8-10GB. Sony's developer guidelines also state that devs shouldn't allocate more than 10GB of the memory to the iGPU.

  • @Exotic_Pate
    @Exotic_Pate Рік тому +1

    Me watching this knowing I will never upgrade my old 2080

  • @Fortzon
    @Fortzon Рік тому +3

    Digital Foundry also recently dunked on HUB's VRAM claims with their TLOU revisited video. Using bad PC ports as a basis for your thesis is pretty bad.

    • @TheTerk
      @TheTerk  Рік тому +3

      Yep, that single setting (texture streaming rate) fixed it for me

  • @mannyvicev
    @mannyvicev Рік тому +1

    People take this to an extreme, though, and act like these cards with lower VRAM will be unusable very soon. The reality is that it's a very gradual change that can largely be mitigated just by lowering VRAM-hungry settings. But of course that means a compromise in graphics quality.

  • @dancomando
    @dancomando Рік тому +3

    I'm happy with my 3070 Ti 8GB; it handles every new game really well, paired with my 10th-gen i5. Some games will allocate more than 8GB, but at any given moment they don't need that much to run, and the bus just replaces memory with new data when needed.
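
    What "replaces memory with new data" amounts to is residency management: the driver (or, under DX12/Vulkan, the engine itself) evicts the least-recently-used resources when a new allocation won't fit. A toy sketch of the idea in Python, with made-up texture names and sizes:

        from collections import OrderedDict

        class VramBudget:
            # Toy LRU residency manager: evicts the least-recently-used
            # texture whenever a new request would exceed the budget.
            def __init__(self, budget_mib):
                self.budget = budget_mib
                self.resident = OrderedDict()  # texture name -> size in MiB

            def request(self, name, size_mib):
                if name in self.resident:
                    self.resident.move_to_end(name)  # refresh LRU position
                    return
                while sum(self.resident.values()) + size_mib > self.budget:
                    evicted, _ = self.resident.popitem(last=False)
                    print(f"evicting {evicted}")
                self.resident[name] = size_mib

        vram = VramBudget(8192)  # hypothetical 8GB card
        for tex, size in [("terrain", 3000), ("hero", 2500),
                          ("city", 3000), ("terrain", 3000)]:
            vram.request(tex, size)
        # "city" forces "terrain" out; re-requesting "terrain" then evicts "hero".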

  • @mikeDeSales943
    @mikeDeSales943 Рік тому +2

    The interesting thing is that even if you have enough VRAM, if your GPU isn't powerful enough to make use of it, there isn't much of a difference. Look at the 1650 Super vs. the RX 580 with half the RAM. Same with the 6700 XT: in most of the cases where it does pull ahead of the 4060 Ti at 1440p and 4K, the result isn't playable anyway, though there is the occasional game where the 6700 XT does beat it.
    Of course there is a good point to be made about how future games will perform, but if they only benefit slightly from more VRAM, what's an extra 10 fps when you are moving from 30 fps to 40 fps? Of course it's not worth it for the company to add the extra VRAM if that's the case.

    • @prafullrahangdale666
      @prafullrahangdale666 10 місяців тому

      30 to 40 is exactly the same experience as going from 60fps to 120fps (both shave about 8.3 ms off the frame time)

  • @Hman9876
    @Hman9876 Рік тому +6

    The VRAM hysteria has caused people to recommend the 3060 12GB over the 3060 Ti. It's a mantra.

  • @danlt1497
    @danlt1497 Рік тому +2

    Optimization should be the hysteria

  • @Devilion901
    @Devilion901 Рік тому +3

    That's a pretty bad take considering more and more titles will consume more VRAM. 8GB has been the standard for half a decade now; time to move up.

  • @seanc6754
    @seanc6754 Рік тому +1

    If you don't realize that Nvidia put 8GB of VRAM in almost all the 30-series cards, and even in some of their 40-series cards, on purpose, then I would say you're not seeing the forest for the trees. Pretty convenient that you have to buy another video card to run new games at 1440p and max settings, just as more and more people are moving to 1440p monitors and games are getting bigger and more complex.

  • @HuntaKiller91
    @HuntaKiller91 Рік тому +6

    I won't pay more than $349 for a 4060 Ti with 16GB 🤣🤣
    My $199 6700 XT has done a better job for 6+ months now

    • @Micromation
      @Micromation Рік тому +3

      199? That's a steal; I thought I got a good deal recently at 250 :O

    • @curie1420
      @curie1420 Рік тому +1

      @@Micromation You guys got yours used? I got mine 3 months ago for $360 lol

    • @Micromation
      @Micromation Рік тому

      @@curie1420 Yeah, I grabbed it used. It's for my 2nd PC, for when my main is rendering (Blender). Sapphire Pulse in pristine condition :D

    • @HuntaKiller91
      @HuntaKiller91 Рік тому

      @@curie1420 Just did a quick check in my region: an ASRock Phantom Gaming 6800 XT is $329. Crazy deal, but I'd wait for Navi 32 and see how it fares.

    • @Steven_Daniel
      @Steven_Daniel Рік тому

      @@HuntaKiller91 That's an insane price; you wouldn't want to miss out on that before it sells out. My 6800 XT is really worth every penny at $375.

  • @nukez88
    @nukez88 Рік тому +1

    I'm watching this video with the captions turned on and laughing my ass off, because every time you mention A Plague Tale: Requiem it shows up in the captions as plate tail, piketail, plugtail...

  • @kaiservulcan
    @kaiservulcan Рік тому +8

    Hi, and thanks for this piece of information. So finally, if the games are well optimized, I can play them with my $500 4060 Ti at 1080p ultra; otherwise, I have to lower the settings presets. High runs better than ultra and medium better than high. BTW, I was already aware of these things, and game optimization has improved over time... Sorry to say it, but I don't understand what the video's purpose is.

    • @Greenalex89
      @Greenalex89 Рік тому +2

      The gist of the vid is: 1.) Peeps were ranting over 8 gigs not being enough. 2.) Games weren't optimized at launch but can be handled with 8 gigs once optimized. 3.) Ultra settings are stoopid anyways. Conclusion: Don't panic, tune down settings and wait for a patch (or buy AMD: much cheaper, same performance, but much more VRAM).

    • @gozutheDJ
      @gozutheDJ Рік тому

      lol well, pc gamers are generally dumb as fuck these days.

    • @TheTerk
      @TheTerk  Рік тому +1

      Neither of y'all watched the end of the video (13:53). You shouldn't spend $500 on a mainstream card; the $400 6700 XT should be the de facto GPU for this era of gaming.

    • @Greenalex89
      @Greenalex89 Рік тому

      @@TheTerk Friend, that's my GPU.

  • @Magnulus76
    @Magnulus76 Рік тому +1

    Very rarely do I use ultra settings. People seem confused about what "ultra" is about. It's about making screenshots, not necessarily what developers expect you to use to have a good play experience.
    Unless you are playing games at higher than 1080p, I don't think there is an issue with 8 GB of VRAM. It's more than sufficient for a well-written game.

    • @masterlee1988
      @masterlee1988 Рік тому

      Textures on ultra with the rest of the settings on high is normally how I play.

  • @vygantaspetrauskas911
    @vygantaspetrauskas911 Рік тому +2

    I believe the 6650 XT's 8GB of VRAM is sufficient. Ultra settings seem excessive to me regardless. I opted for the 6650 XT over the 6950 XT because I'm not a heavy gamer, and I'm hopeful something better might emerge in the future, like the 7800 XT.

    • @lalas3590
      @lalas3590 Рік тому +2

      The 6650 XT has pretty insane clock speeds for its price, TBH. I expect it to stay relevant for as long as you keep a game's settings low enough to stay below the full 8GB of VRAM.

  • @doomtomb3
    @doomtomb3 Рік тому +1

    Good job differentiating the mainstream vs. midrange vs. high-end cards.

  • @jeffphilp7430
    @jeffphilp7430 Рік тому +3

    Great reporting, balanced and well considered, subscribed.👏

  • @jimmyhughes5392
    @jimmyhughes5392 Рік тому +2

    I'm running an 8GB Sapphire Nitro RX 570 and it's still a 1080p beast; most games run at high settings with a 100 fps average.

  • @supremacy98
    @supremacy98 Рік тому +3

    I wholeheartedly feel that the problem is optimisation. Adding more VRAM to GPUs temporarily stalls the problem of running out of VRAM, but it doesn't solve it. There have been games that look amazing without consuming 10-12GB of VRAM. Games should rightly ship properly optimised, and it's not impossible either, so that's the best solution, though an unlikely one.

    • @Xenotester
      @Xenotester Рік тому +1

      It can be solved with good asset streaming, but... the new 8GB VRAM cards also have only x8 PCIe lanes, not x16, so it's not just a small amount of VRAM but also slower speeds for streaming new data into VRAM.

    • @redslate
      @redslate Рік тому +1

      Optimisation is one piece of the puzzle, as it allows games to run more efficiently. The other piece is better hardware (in this instance, more memory), as that allows game development to continue to advance.
      A lot of the features of higher-end cards aren't 'trickling down' as rapidly as they have in the past, leaving the mid-tier (core) market to stagnate.

  • @PackardKotch
    @PackardKotch Рік тому +1

    Nah. If I already paid almost as much for the GPU alone as a PS5 costs, and I can't match the texture settings the PS5 uses at 4K while I'm running at 1080p, then I sure af ain't paying 1.5x as much to have the same issue on another generation of GPUs.

  • @Brian-dd2df
    @Brian-dd2df Рік тому +4

    Glad to see others talking about this. UFD Tech did a similar video showing that, yeah, we aren't utilizing all of the available RAM in newer titles. I feel like a lot of this hyperfocus on VRAM started with the launch of the 4060 cards, which do appear to have serious VRAM-related issues. But most, if not all, cards more powerful than the 4060s don't seem to have trouble with VRAM bottlenecking.

  • @SeifSoudani
    @SeifSoudani Рік тому +1

    Testing frame rate alone is not enough; you also have to test frame times / frame pacing. That's another source of lag in VRAM-buffer-constrained setups.
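
    For anyone logging frame times themselves (from a CapFrameX or PresentMon export, say), "1% low" style metrics are just statistics over the frame-time series. A rough sketch, assuming a list of frame times in milliseconds; note that definitions vary between tools, and this one averages the slowest 1% of frames:

        def one_percent_low_fps(frame_times_ms):
            # Average FPS across the slowest 1% of frames.
            worst = sorted(frame_times_ms, reverse=True)
            n = max(1, len(worst) // 100)
            return 1000.0 * n / sum(worst[:n])

        frame_times = [16.7] * 990 + [50.0] * 10  # mostly 60 fps, ten 50 ms spikes
        print(f"average fps: {1000.0 * len(frame_times) / sum(frame_times):.1f}")
        print(f"1% low fps:  {one_percent_low_fps(frame_times):.1f}")
        # -> average ~58.7 fps, but 1% lows of 20 fps: the spikes you feel.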

    • @masterlee1988
      @masterlee1988 Рік тому

      Yep, these are important as well, especially if there's stuttering.

  • @MrAckers75
    @MrAckers75 Рік тому +3

    8GB at 1080p was never the problem... the problem is devs pushing out substandard software.

  • @TakaChan569
    @TakaChan569 Рік тому +2

    8GB cards are fine; it's the cost of a new one that's the issue. $400 is insane for that and should always be called out. At least AMD has their new 8GB card in a better $250-280 range, while Nvidia wants $400.

  • @photonboy999
    @photonboy999 Рік тому +3

    8GB VRAM - not enough now for some current and definitely some upcoming games
    12GB VRAM - likely mostly fine for the next couple of years, but will increasingly require some THOUGHT to manage at times
    16GB VRAM - rarely going to be an issue over the next couple of years
    GAME DEVS have spoken about how INCREASINGLY DIFFICULT it is to get games to work with 8GB of VRAM. Put simply, the NEW CONSOLES have a larger memory pool than the old ones, and newer game engines/techniques are using that VRAM in newer and more memory-intensive ways, and NOT just for textures. There's almost certainly a lot of OPTIMIZATION yet to be done with these newer techniques. Streaming TEXTURE ASSETS from an SSD (or copying those textures to SYSTEM MEMORY) is one huge, meaningful way to reduce the VRAM footprint, since you can stream them "just in time", but we're not there yet. So there are LOTS of reasons to believe VRAM usage will go UP for some games and lots of reasons to believe it will go DOWN for others. In the end, the PS5/XSX will largely drive what direction the PC market goes... I suspect we'll need 16GB+ VRAM cards in the near future, but that they'll get the software sorted out so that streaming from SSD/SYSTEM MEMORY drastically reduces the upward pressure on VRAM... and with diminishing visual returns we may simply see hardware requirements stagnate... I suspect we'll end up with something very SIMILAR to the current PS5 on PCs in the future: a SoC/APU with 64GB of shared (VRAM+SYSTEM) memory that doesn't even require a fast SSD, because games will just COPY their assets to shared memory first (so no need to STORE every frikking game on a fast SSD just to run well). By then a standard 4GB-per-second SSD can transfer 24GB of game assets in SIX SECONDS.
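
    The closing arithmetic checks out. As a quick worked sketch (idealized sequential reads only, ignoring decompression and per-file overhead):

        # Time to stage a game's working set into shared memory at a given
        # sustained SSD read speed.
        def staging_time_s(asset_size_gb, ssd_gb_per_s):
            return asset_size_gb / ssd_gb_per_s

        print(staging_time_s(24, 4.0))  # -> 6.0 s at NVMe speeds, as claimed
        print(staging_time_s(24, 0.5))  # -> 48.0 s at SATA-SSD speeds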

    • @damazywlodarczyk
      @damazywlodarczyk Рік тому +1

      It's complete nonsense. You will not need more than 8GB at 1080p until the end of this console generation. The PS5 has 16GB of shared RAM.

    • @JackieRipper_AKA_Legs
      @JackieRipper_AKA_Legs Рік тому +1

      Devs can't even make an original or fun game to begin with now; let's not get into how poorly optimized they actually are.

    • @aqwandrew6330
      @aqwandrew6330 Рік тому

      @@damazywlodarczyk Bullshit, 16GB of shared is still a minimum of 12GB purely for VRAM

    • @kealeradecal6091
      @kealeradecal6091 Рік тому +1

      @@aqwandrew6330 How are you sure? That would leave 4GB for game logic and the PS5's OS. It is just easier to create and optimize games for a console, as everyone has the same hardware.

    • @damazywlodarczyk
      @damazywlodarczyk Рік тому +3

      @@aqwandrew6330 You don't know that. Only devs know that. You think everything else runs on 4GB? When PC ports list 16GB of RAM as the minimum? Just think a little. It's 16GB of RAM vs. 24GB of RAM. PC ports are shit because they are badly ported, not because of low VRAM.

  • @gtx1650max-q
    @gtx1650max-q Рік тому

    4+ years and Nvidia still hasn't released a GTX 1650 replacement

  • @RandomUserName92840
    @RandomUserName92840 8 місяців тому

    $1000 being a budget gaming PC is ridiculous.

  • @jasonking1284
    @jasonking1284 Рік тому

    There is no "hysteria". Your own results show the games need more than 8GB to perform at maximum. GPU manufacturers NEED to increase VRAM to 16GB minimum. This is no longer 2016 and technology requirements have increased.

  • @jtnachos16
    @jtnachos16 11 місяців тому

    'Developers are designing games to use more VRAM'
    Meanwhile, the visuals don't match the additional resources they are demanding. That is the actual crux of the issue, especially when most of that increased resource cost comes from obscenely expensive volumetrics or ray tracing that just don't look good enough to justify their demands. The number of times I've blind-tested people with screenshots, or even in-game video, on high vs. ultra settings or RT on vs. RT off, and had people decide the lower settings are actually the higher ones, or that no RT looks better than RT, is quite disturbing.
    We've seen this problem CONSISTENTLY as time has gone on: the more 'resources' game devs are allowed to think they have, the less focus on efficiency they have. Which is why modern games can use 2-3x as many resources as older games with a similar level of visual quality and a similar overall world/gameplay structure. God forbid we talk about bloat in storage space requirements, often from obscenely bloated textures, or text/audio assets for languages we aren't even using.

  • @WayStedYou
    @WayStedYou Рік тому +2

    Remember when GPUs with half the VRAM of the consoles used to be able to play at higher settings, at double the frames of the PS4, for 2/3 of the cost?
    Now you pay as much as the console for less than console settings/frames, still with half the VRAM.

    • @gozutheDJ
      @gozutheDJ Рік тому +2

      may have found the dumbest comment on here

    • @Keloot
      @Keloot Рік тому

      @@gozutheDJ Explain, bootlicker. Because my GTX 1050 Ti is better than the PS4

  • @gamers-generation
    @gamers-generation Рік тому +1

    Another nice one, Terk! Great point bringing up the console VRAM as well, since it'd be easy to assume that's the metric/standard developers go by to "make sure things run properly"... although FFXVI's recent performance would suggest otherwise. I'm completely speculating at this point, but it does seem, as you put it, that "brute force" is the way devs are going, with higher VRAM on their development systems/kits, and then, well... hey, if it doesn't work day 1, they'll just patch it later, right? You already bought the game anyway... 💀

  • @NockDog.
    @NockDog. Рік тому +1

    For me, the only good thing about 8GB of VRAM is that some developers will be forced to deliver a playable experience on 8GB cards, which means they will have to optimize their games better.

  • @wiseass2149
    @wiseass2149 10 місяців тому

    It's that a lot of these tubers are playing their games on extremely high settings. No one needs to play on ultra settings, and most people play at 1080p.

  • @AxleLotl
    @AxleLotl Рік тому +2

    Hard to take HUB seriously when they class the RX 7600 as "mid-range"...
    But tbh, 99.95% of PC games are still 100% playable at 1440p max settings without RT on 8GB of VRAM; it then comes down to whether or not the GPU itself is powerful enough to actually put out decent FPS.
    Edit: Console devs are utilising every piece of the 16GB of memory that most new-gen consoles have available, which saves them a ton of money on optimisation but completely ruins the game if it gets ported to PC, and it ends up costing them money to optimise later on anyway. We've had plenty of amazing-looking games in the past that ran with no issues on 8GB, so I'd say greed is the biggest contributor to this massive push for more VRAM. Oh, and of course more money in the back pockets of the companies who make the cards, because you're also paying more for their larger-VRAM products.

    • @valenrn8657
      @valenrn8657 Рік тому

      The PS5 has an extra 512 MB of DDR4 memory besides the 256-bit 16 GB GDDR6 memory.

    • @AxleLotl
      @AxleLotl Рік тому

      @@valenrn8657 don't think 512MB would make too much of a difference...

  • @alexguild
    @alexguild Рік тому +2

    Thank you. Very well argued and substantiated results. Multiple causes go into all the noise about this.

  • @djanto98
    @djanto98 Рік тому +2

    If a new card comes out at $400 and is already struggling in some games to the point that you have to lower settings, then how will it perform with 2025 and 2026 games? It's just trash.

  • @gaijinkuri684
    @gaijinkuri684 Рік тому +1

    This was a great video, exploring the issue from many directions.

  • @pondracek
    @pondracek Рік тому +2

    6:00 Right on the money.
    Games like GTA 5 and Fallout 4, etc. used to ship extreme texture packs separately.
    Now they just come bundled under the Ultra preset. Even if you have 24 gigs of VRAM, ultra is very rarely worth it for anything but benchmark stress tests.

  • @bfish9700
    @bfish9700 Рік тому +2

    $350 for an RX 6800 late last year, zero regrets. It'll easily last me until next gen.

  • @tech4u2022
    @tech4u2022 Рік тому +2

    And on top of that, they (Nvidia) also dumped SLI,
    so the option of running 2x 8GB of VRAM is taken away.
    Seems they planned this all along...

    • @johnbilodeau6296
      @johnbilodeau6296 Рік тому +1

      For gaming (and in general), SLI or NVLink doesn't increase VRAM. The data is mirrored on the GPUs.

    • @tech4u2022
      @tech4u2022 Рік тому

      @@johnbilodeau6296 that's one way you CAN use it

  • @technologicalelite8076
    @technologicalelite8076 Рік тому +1

    Here's the problem: if these cards CAN'T play ultra settings NOW, then what about a few years from now, when games get even more demanding?

    • @masterlee1988
      @masterlee1988 Рік тому

      Yep, it's going to get worse for them. That's why I want a stronger GPU soon enough.

  • @alexandrostheodorou8387
    @alexandrostheodorou8387 Рік тому

    But the PS5 and Xbox use about 14GB. How the hell is 8GB going to compete with a lazy port from a system meant for 14GB?!

  • @MARTINREN1231
    @MARTINREN1231 Рік тому

    3 years ago, 3GB of VRAM was okay; now it's starting to get ridiculous.

  • @Razor2048
    @Razor2048 Рік тому

    One thing to also look at is shared memory use: Windows Task Manager will also display the amount of shared memory used by the card. Some games may allocate 500+ MB less than the available VRAM, yet allocate 1+ GB of system memory. Modern video cards assign priorities to different memory pools, so a game can allocate some shared memory without a large performance hit. The times you get massive frame-time issues, stutters, and other major problems are when bandwidth-intensive parts of the game cannot entirely fit into the dedicated VRAM. You can spot these scenarios from the PCIe bus usage (apps like GPU-Z list PCIe bus usage on the sensors tab). Games that only put non-critical data into shared RAM will often see the PCIe bus usage jump from the 10-15% range to the 20-30% range on an x16 4.0 card. Saturating the bus fully in one direction shows as a load of 50%. When a game actively needs the system RAM for bandwidth-intensive work, you will see bus usage jump to the 40-50% range, at which point the game will also begin to perform horribly.
    The solution thus far in many games, including Hogwarts Legacy, has been to simply not load in additional textures when you run out of dedicated VRAM.
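
    One minimal way to watch dedicated-VRAM pressure from a script on an NVIDIA card is to poll nvidia-smi; a sketch, assuming nvidia-smi is on the PATH. The PCIe bus load itself isn't exposed through this query, so a sensor tool like GPU-Z is still needed for that part, as the comment says:

        import subprocess
        import time

        def vram_used_mib():
            # Query dedicated VRAM use on the first GPU via nvidia-smi.
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=memory.used,memory.total",
                 "--format=csv,noheader,nounits"], text=True)
            used, total = (int(v) for v in out.splitlines()[0].split(","))
            return used, total

        while True:
            used, total = vram_used_mib()
            print(f"dedicated VRAM: {used}/{total} MiB ({100 * used / total:.0f}%)")
            time.sleep(1.0)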

  • @fridaycaliforniaa236
    @fridaycaliforniaa236 Рік тому +1

    I still don't understand why the hate for this RTX 3050 with DLSS 3.0 combo 🤔

    • @tietieno4890
      @tietieno4890 Рік тому +1

      For the price of an RTX 3050 you can get way better cards

  • @datguyoverdurr2604
    @datguyoverdurr2604 Рік тому +1

    lmao the amount of baby bitching I got for telling people that I don't have VRAM problems on my 3070, because I don't play unoptimized games day one at 4K max settings with ray tracing, is hilarious. "nO!!! yOu Must ReGRet YouR PurChasE bEcAusE i Say SO!!!1!!"

  • @valenrn8657
    @valenrn8657 Рік тому

    The RX 7600 8 GB *GDDR6-18000* has a $269 asking price, which is lower than the digital edition PS5 with 16 GB GDDR6-14000 at $399.
    The RX 6750 XT 12 GB *GDDR6-18000* has reached $355.
    NVIDIA's RTX 4060 Ti with 8 GB *GDDR6-18000* has a $399 asking price.
    The RX 7600's GPU chip area is 204 mm^2 on TSMC's 6 nm process node.
    The RTX 4060 Ti's GPU chip area is 190 mm^2 on TSMC's 5 nm process node.
    Both the RX 7600 and RTX 4060 Ti have similar BOM costs with different retail prices.

  • @luckskill6132
    @luckskill6132 9 місяців тому

    Something I want to mention that in my opinion was a mistake: they gave up too fast on 4GB cards for the low end. They could still use them for low-end cards (at low prices, of course), and then 8GB would not look so low. Check, for example, the 6500 XT, and even the 1650 Super or RX 580. These cards can play games like Forza Horizon, GTA, and Far Cry at 2K high; I tested it, so I know. A big factor is again optimization, but for low-to-medium 1080p they could live a couple more years. Now, when you buy an 8GB VRAM card in 2023/24, you know you got the lowest-VRAM card, and then you simply can't complain when games don't work well.

  • @bm5994
    @bm5994 Рік тому

    The stream of badly optimized games is so vast, and it has now come to a point where we as gamers accept this and even throw money at new hardware just to make them playable. If you think about it, it's completely backwards: we should be demanding properly coded and optimized games instead of forking out hundreds of euros/dollars for new hardware.

  • @heilaw7002
    @heilaw7002 Рік тому

    It's like telling people that turning on RT is dumb

  • @Gahlfe123
    @Gahlfe123 Рік тому +1

    If a game needs more than 8GB of VRAM, it's probably a mess of high-quality textures, game-breaking bugs, or no real substance. Chasing better graphics is what killed AAA gaming, and now add the subscription and DLC model on top. Nvidia and these AAA game companies are out of their minds trying to offer way less product for more money.

  • @satakrionkryptomortis
    @satakrionkryptomortis Рік тому

    Reducing quality so a product can claim to be top-notch for 1080p ultra gaming is cheating on a level beyond understanding.

  • @Xenotester
    @Xenotester Рік тому

    It's not "just a 8Gb vram" - it also only x8 PciE lines instead of 16 for streaming textures - what about budget cpu/mb without pcie 4.0 like 5500 ?
    And 8Gb can be tolerated for $150 -200 or maybe evenm 250 price range, but totally not for over $ 300
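
    To put rough numbers on the lane cut, a back-of-the-envelope sketch; the per-lane figures are the usual approximations (~1 GB/s for PCIe 3.0, ~2 GB/s for PCIe 4.0, per direction):

        # Approximate one-direction PCIe bandwidth: lanes x per-lane throughput.
        GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

        def pcie_bandwidth_gbps(gen, lanes):
            return GBPS_PER_LANE[gen] * lanes

        for gen, lanes in [("4.0", 16), ("4.0", 8), ("3.0", 8)]:
            print(f"PCIe {gen} x{lanes}: ~{pcie_bandwidth_gbps(gen, lanes):.0f} GB/s")
        # -> ~32, ~16 and ~8 GB/s: an x8 card dropped into a PCIe 3.0 board
        #    streams at a quarter of the bandwidth of a full x16 4.0 link.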

  • @Arkaskas1
    @Arkaskas1 Рік тому

    Have people even started questioning whether game companies increase VRAM requirements on purpose to push people into higher-priced GPUs? It doesn't make sense for 1080p to require more than 8GB in these games. They don't look that good.

  • @howardwilson3466
    @howardwilson3466 Рік тому

    Looks at my 6GB 2060 and sighs... we will figure something out.

  • @doc7000
    @doc7000 Рік тому

    I believe you missed the entire point: while the bad stuttering and crashing were abnormal effects of not having enough VRAM, other effects remained. When you look at RAM utilization, what you missed is that those games are spilling GPU data into system RAM, which results in a reduction in performance.

    • @TheTerk
      @TheTerk  Рік тому

      I've mentioned that in a short I posted on the channel. Long story short, you don't see performance truly tank due to PCIe traffic until about 30% utilization. The failures you see in this video occur far before that.

    • @doc7000
      @doc7000 Рік тому

      @@TheTerk In this video you pointed out VRAM allocation while completely missing that point; you devoted an entire section of the video to it without mentioning it. Yes, you see a performance reduction when that happens, and it would be even worse if you were low on system RAM. Because of this, the 8GB 4060 Ti not only suffers reduced performance in those situations; in some cases it even gets beaten by the 12GB 3060.

  • @parker4447
    @parker4447 Рік тому +2

    I got an RTX 3060 Ti and I always use only optimized settings, even if I can hit 1440p 60fps ultra (I'm on a 1440p monitor) with or without DLSS Quality. I think ultra/max settings are dumb; I'd rather take the extra 30+ fps with optimized settings and play at 90+ fps. In some games, ultra shadows, volumetrics, AO or whatever look almost the same as high or even medium but cost like 20 fps lmao.

  • @HeartOfAdel
    @HeartOfAdel Рік тому

    No one needed to fix 8GB GPUs, because they were never the problem.