Your GPU is Useless Now

  • Published 26 Jun 2024
  • JOIN THE DISCORD!
    / discord
    Your GPU is Useless Now. It's been a growing trend that more and more games are using your CPU in a PRETTY dramatic way. You would think that since graphics are getting better, your GRAPHICS CARD would have to work proportionally harder, but that doesn't seem to be the case. General-purpose computing on the CPU has become the go-to. Even a top-end CPU from just 3 years ago can't push 60 FPS in some games.
    On top of that, most people don't want to upgrade CPUs frequently, because it is a lot more expensive than just buying a new GPU: you might also have to upgrade your motherboard and RAM at the same time. Is this only going to get worse?
    VV RX 6800XT I tested as well (affiliate links) VV
    PowerColor Red Dragon rx 6800xt 16gb: amzn.to/3YhV3oU
    RX 6800xt (in general): amzn.to/3s05BNm
    HUB: • Insane Gaming Efficien...
    Nvidia: • DirectX 12 Ultimate on...
    BLAndrew575: • 2300+ Hour Satisfactor...
    Daniel Owen: • AAA Unreal Engine 5 ga...
    • Baldur's Gate 3 PC Opt...
    Digital Foundry: • Immortals of Aveum PC ...
    Unreal Sensei: • How to Create a Game i...
    0:00- WTH is going on??
    1:34- The Irony of CPU Utilization
    2:47- New Features are More "CPU Demanding"
    4:37- CPUs are for General-purpose Computing
    5:44- What Inefficient CPU Usage ACTUALLY Means
    8:20- We'll see if things get better
    10:04- CPUs are becoming MORE important than GPUs
    11:04- Silver-lining
  • Science & Technology

COMMENTS • 2.7K

  • @user-nq5hy7vn9k
    @user-nq5hy7vn9k 9 місяців тому +2079

    Based on the Steam Hardware Survey, the GTX 1650 is still the most used GPU. If developers want a wider audience to be able to play their games, they had better focus on optimization really soon.

    • @redshiftit8303
      @redshiftit8303 9 місяців тому +283

      Unfortunately, their target audience is now consoles, where they can get away with 30 fps and people still pay up...

    • @user-nq5hy7vn9k
      @user-nq5hy7vn9k 9 місяців тому +167

      @@redshiftit8303 I doubt with such crappy optimization even the consoles will be able to give 30fps for too long

    • @danavidal8774
      @danavidal8774 9 місяців тому +113

      @@user-nq5hy7vn9k If I remember correctly, Remnant II runs at 720p upscaled to 2K on consoles.
      It is wild.

    • @InnuendoXP
      @InnuendoXP 9 місяців тому +26

      @@user-nq5hy7vn9k nah this is where the optimisation starts. It takes time, and games are taking longer than ever to develop. When they hit the wall on performance, that's when they start pulling out tricks. Though if a Zen 3 3700 equivalent is limiting to 30FPS, you'll need 2x performance per clock to maintain 60, and we still don't have CPUs doing that.
      At the very least, Series S might keep a lid on VRAM requirements though.

    • @Leaten
      @Leaten 9 місяців тому +9

      We know this. In case you hadn't noticed, only AAA developers make demanding games, because gamers can justify a hardware upgrade for them.

  • @CaptToilet
    @CaptToilet 10 місяців тому +3815

    So it comes down to 2 things. Are CPUs just not good enough? Or are developers just not good enough? The answer should be obvious. Hint: it isn't the CPU.

    • @bigben9056
      @bigben9056 10 місяців тому +122

      It's the CPU; people just don't understand how heavy RT is on the CPU.

    • @upfront2375
      @upfront2375 10 місяців тому +178

      Exactly! I've been pc gaming for around 25 yrs now and I've never seen a time when CPU was nearly this important for gaming, except for old network EA games of course

    • @2528drevas
      @2528drevas 10 місяців тому

      @@upfront2375 Same here, I built my first gaming PC in 1998 to play "Half-Life", and it did fine with an AMD K6 300, because the 16MB Voodoo Banshee handled the load.

    • @Leaten
      @Leaten 10 місяців тому +105

      This seems to be a way deeper issue to me. Whenever I game, the CPU is only used at around 4%... but on the software side, I recently upgraded my GPU and it's not even fully utilized (no bottleneck in general), and I can't even reach my monitor's refresh rate 🤦 I have no idea what is causing this anymore. Why would a game not fully use my PC's hardware if it's available and instead let me see lower framerates LOL

    • @bl4d3runn3rX
      @bl4d3runn3rX 10 місяців тому +31

      Would be interesting to see how a game performs on a 5900X on release day versus two years later, fully patched, on the same system: has it improved or is it still the same?

  • @radioleta
    @radioleta 9 місяців тому +467

    As a graphics programmer: Vulkan and DirectX 12 shouldn't lead to more CPU utilization by themselves. In fact, the whole point of the modern APIs is to reduce CPU overhead. The new APIs do allow you to use multithreading, so multiple cores can record rendering commands, but that actually helps saturate the GPU! Yes, I think it might have something to do with a lack of optimization. I agree that the improvements in realism are not worth the performance impact most of the time :) (A small threading sketch follows at the end of this thread.)

    • @Mefistofy
      @Mefistofy 9 місяців тому +1

      Just a thought: could it also be memory bandwidth, now that everything is getting bigger and open-world?

    • @ayliniemi
      @ayliniemi 9 місяців тому +7

      As the cpus, gpus and game engines become more complex with each generation I would think it gets harder and harder to master optimization while at the same time creating a game and bringing it to market in a profitable timeframe.

    • @Mefistofy
      @Mefistofy 9 місяців тому

      @@ayliniemi Did not think about it, but complexity is definitely something that has exploded in the past decades. I work with ML, and getting the GPU to high utilization can be quite hard sometimes, depending on the architecture. All the new shiny libraries offer a lot of comfort but are sometimes badly documented. If you want to do something specific, finding a way around a framework can be cumbersome, and sometimes you waste a little processing time just to get the damn thing to work.
      I guess there might be similarities in games. Hardware has been developing fast for decades; software is barely keeping up.

    • @ayliniemi
      @ayliniemi 9 місяців тому +3

      @Mefistofy So you're saying that because of time constraints the code isn't as perfect/efficient as it could be?
      Like, you could go back and recode Mario Bros. on the Nintendo and keep seeing how you could make the game run more efficiently on the NES processor, am I right? You'll probably run into a dead end at some point.

    • @zuffin1864
      @zuffin1864 9 місяців тому +3

      You know what isn't realistic? Low frames dangit!
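
A minimal sketch of the multithreaded command recording @radioleta describes above, using plain C++ threads. `CommandList` and `record_draws_for_range` are hypothetical stand-ins for API objects such as D3D12 command lists or Vulkan command buffers; this shows only the threading pattern, not real graphics API calls.

```cpp
// Each worker records commands for its own slice of the scene into its own
// list, so recording needs no locks; the lists are then submitted in order.
#include <algorithm>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct CommandList {                               // placeholder per-thread command buffer
    std::vector<std::string> cmds;
    void draw(int objectId) { cmds.push_back("draw " + std::to_string(objectId)); }
};

static void record_draws_for_range(CommandList& cl, int first, int last) {
    for (int i = first; i < last; ++i) cl.draw(i); // pretend per-object recording work
}

int main() {
    const int objectCount = 10000;
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<CommandList> lists(workers);       // one command list per core
    std::vector<std::thread> threads;
    const int perWorker = (objectCount + int(workers) - 1) / int(workers);

    for (unsigned w = 0; w < workers; ++w) {
        const int first = int(w) * perWorker;
        const int last  = std::min(objectCount, first + perWorker);
        threads.emplace_back([&lists, w, first, last] {
            record_draws_for_range(lists[w], first, last);
        });
    }
    for (auto& t : threads) t.join();

    // A real renderer would now submit the lists to the GPU queue in order.
    std::size_t total = 0;
    for (const auto& cl : lists) total += cl.cmds.size();
    std::printf("recorded %zu commands on %u threads\n", total, workers);
}
```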

  • @HoD999x
    @HoD999x 9 місяців тому +90

    (Former game) developer here. You can usually get things calculated a lot faster (2x-10x in my experience) than a first prototype by thinking long and hard. The thing is that this process is usually very expensive and sometimes would mean you run out of funds while rewriting half your engine. On top of that, you become less flexible, because only the special cases you optimized your code for are fast. Your game would run better, but you would not have nearly as much content.

    • @richr161
      @richr161 8 місяців тому +6

      As far as Spider-Man goes: it's such a CPU hog because it streams a ton of data off the drive. It wasn't an issue on PS5 because the PS5 has hardware to assist with decompressing the data.
      When it was ported to PC, it functions the same way, but relies on the CPU to decompress the data, leading to high CPU usage. If you monitor the SSD usage you'll see it loading huge amounts of data off the drive. This is a game that could really use GPU decompression, like Ratchet & Clank on PC. (A rough sketch of this streaming cost follows at the end of this thread.)

    • @TraktorTarzan
      @TraktorTarzan 8 місяців тому +1

      Is it an engine issue? Because with modded Skyrim I'm playing a game that's essentially 500GB, and it runs decently on my 1080. But modern games, with way less content and similar graphics (I usually play them on medium/high), end up running at similar fps, even though they're less than 100GB.

    • @richr161
      @richr161 8 місяців тому +2

      @@TraktorTarzan I assure you that the graphics detail isn't similar. Skyrim's graphics weren't great back then and they're definitely outdated now.
      The size of the game doesn't have much to do with graphical fidelity. Cyberpunk is an open-world RPG and only comes in at 55GB. It's definitely the best-looking game out when you turn on path tracing and all the modern effects.

    • @TraktorTarzan
      @TraktorTarzan 8 місяців тому +2

      @@richr161 I said modded Skyrim, not the base game; that isn't even close. Look up "Skyrim in 2023 | Ray Tracing Shader" or "Skyrim Ultima". Also, I said compared to modern games, not THE best-looking modern game.

    • @QuandariusHuzz-bq1jn
      @QuandariusHuzz-bq1jn 8 місяців тому

      A lot of PC games these days have problems with asset streaming and resource management.
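
A rough sketch of the streaming cost @richr161 describes above, assuming every chunk read from the SSD must be decompressed by the CPU before the GPU can use it. `read_compressed_chunk`, `decompress_chunk`, and `upload_to_gpu` are made-up placeholders, not a real engine or the DirectStorage API.

```cpp
// CPU-side asset streaming: read, decompress, upload, once per chunk per frame.
// With GPU decompression the middle step would move off the CPU entirely.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Chunk { std::vector<std::uint8_t> bytes; };

static Chunk read_compressed_chunk(int id) {            // stand-in for an SSD read
    return Chunk{std::vector<std::uint8_t>(256 * 1024, std::uint8_t(id))};
}

static Chunk decompress_chunk(const Chunk& in) {        // stand-in for LZ decode (~4x expansion)
    Chunk out;
    out.bytes.reserve(in.bytes.size() * 4);
    for (std::uint8_t b : in.bytes)
        out.bytes.insert(out.bytes.end(), 4, b);         // the actual CPU work being discussed
    return out;
}

static void upload_to_gpu(const Chunk& c) { (void)c; }  // stand-in for a staging-buffer copy

int main() {
    std::size_t decompressedBytes = 0;
    for (int id = 0; id < 64; ++id) {                    // one frame's worth of streamed chunks
        Chunk raw    = read_compressed_chunk(id);
        Chunk assets = decompress_chunk(raw);            // this is what pegs CPU cores on PC ports
        decompressedBytes += assets.bytes.size();
        upload_to_gpu(assets);
    }
    std::printf("decompressed %zu MB on the CPU this frame\n",
                decompressedBytes / (1024 * 1024));
}
```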

  • @josephl6727
    @josephl6727 10 місяців тому +558

    It's time to quit high end gaming. Lol They are ripping everyone off.

    • @vextakes
      @vextakes  10 місяців тому +106

      Imagine without the highest end CPUs, it would be even worse

    • @theplotthickens6313
      @theplotthickens6313 10 місяців тому +30

      I play on 1080p ultrawide so I'm even more CPU bound with the same system lol

    • @Ay-xq7mj
      @Ay-xq7mj 10 місяців тому +12

      This. Me be playing allied assault pvp on a 3070ti.

    • @takehirolol5962
      @takehirolol5962 10 місяців тому +26

      Back to the 90s folks...you youngsters missed an awesome decade...
      But super expensive as a PC gamer...

    • @brucerain2106
      @brucerain2106 10 місяців тому

      Always have been

  • @bobsteven2363
    @bobsteven2363 9 місяців тому +406

    As a game dev (not programmer) I will lay out some fun facts. Making a high-poly mesh with loads of detail can take a day. You can also auto UV unwrap it and start texturing the next day. Super easy. But the poly count for a single character can easily reach one million with all the parts. Games can't handle that, so you need to spend a week manually retopologizing, baking, UV unwrapping and texturing that model so that it has a low poly count and still looks amazing. Unreal Engine releases auto LODs. Yay, now I can skip the manual retopologizing phase and just make LODs with the click of one button. Game sizes are now way bigger. Unreal Engine releases Nanite. Oh wow, I can just import models directly from ZBrush and Fusion 360? Cool, now I don't need to worry about optimizing at all since the game engine can handle it. Every year, making games becomes easier and less time consuming at the cost of performance. You can still optimize, but why would you if you already finished what you were tasked with?

    • @curvingfyre6810
      @curvingfyre6810 9 місяців тому +108

      More importantly, they're under pressure from their bosses to churn them out faster. The crunch is insane, and the quality suffers across the board.

    • @ralphwarom2514
      @ralphwarom2514 9 місяців тому +2

      Yup. Pretty much.

    • @MrThebigcheese123
      @MrThebigcheese123 9 місяців тому +17

      So, it is ok to release a half baked product and spend over 70% of the development time on pre planning while leaving real development until 2 years before release? Mkay then...

    • @curvingfyre6810
      @curvingfyre6810 9 місяців тому +33

      @@MrThebigcheese123 I think the point is to blame the direction and money side of things, not the average programmer. They have to get through the day and survive the frankly insane crunch time. It's up to the directors and producers to approve of the level of work that they have forced out of their engineers in the allotted time, and to choose whether more time is needed.

    • @Born_Stellar
      @Born_Stellar 9 місяців тому +1

      good to know, interesting to hear from inside the industry.

  • @Robwantsacurry
    @Robwantsacurry 9 місяців тому +9

    You've missed something important in the PC space: copy protection.
    Many PC titles are encrypted, sometimes with nested encryption. With Denuvo, for example, code is decrypted on the fly, because unencrypted code could be ripped from memory. This pushes CPU usage up massively.

    • @saurelius5217
      @saurelius5217 9 місяців тому +3

      More reason not to buy new AA games.

  • @JodyBruchon
    @JodyBruchon 9 місяців тому +4

    I talked about this in "everything is a f***ing prototype." Software is a gas that expands to fill its container. I own a bunch of old XP-era embedded hardware and netbooks. Most of them have one core. Look up the PassMark for the Atom N435 or VIA C7-M 1200MHz, then look up the most basic CPU for a Windows 10 laptop like the Celeron N3050. The difference in computing power is huge. The old wimpy chips could do all the basic tasks anyone needed in a laptop at a reasonable speed, but instead of the software getting faster as hardware has exploded in speed, everything is written in Python or JavaScript and everything uses huge frameworks and pulls in huge dependencies for small things. I recall an old article that disassembled and analyzed bloat in the Facebook app for iOS which was notoriously fat at the time and the most ridiculous piece of bloat was an entire library of functions pulled into the app just to use a single function that does something like getting a date. Any competent programmer could have spent a day at most writing it themselves but they opted to pull in a library to do that single thing. It's disgusting.

  • @nemesisrtx
    @nemesisrtx 10 місяців тому +534

    2023 has been the worst year for PC gaming, with games releasing unfinished and terribly optimized, etc... Nowadays people have very good systems, but not even current hardware is capable of keeping up with how demanding games are. Hopefully game devs understand that it is better to have a playable game on day 1 and not rush their projects because of hype, AND, most importantly, that most PC gamers don't have a high-end PC lol

    • @brucerain2106
      @brucerain2106 10 місяців тому +1

      Truuue

    • @Not_interestEd-
      @Not_interestEd- 10 місяців тому +24

      Actually, there's a surprising answer as to the issues we've seen recently.
      DX12 has been out for a while, and it's great. Has a lot of drivers to do a lot of the heavy lifting so your games are usually optimized, even with minimal effort put into actual optimization. The problem comes with DX12 "Ultimate", which theoretically is a better branch of DX12, but most of the "useless" drivers had been ripped out, and I guess in the lot removed, something was doing WAY more work than expected, thus, poor optimizations. This in conjunction with bad management and impossible deadlines (think about the Mick Gordon vs Bethesda incident) makes game development a harsh environment for AAA titles.
      I really want people to stop blaming the devs, blame the management that screws around with the millions of dollars they sit on, paying the actually working devs a fraction of what they get.

    • @Ghostlynotme445
      @Ghostlynotme445 10 місяців тому +43

      The RTX 4090 getting 40fps at 4K is real, what a $1600 experience

    • @Patrick-bn5rp
      @Patrick-bn5rp 10 місяців тому +3

      Pretty good year for emulation on PC, though.

    • @mimimimeow
      @mimimimeow 10 місяців тому +15

      Welcome to the transition period. It happens in every console generation except the 8th gen, because the PS4/XB1 were objectively very weak for their era.

  • @surnimistwalker8388
    @surnimistwalker8388 9 місяців тому +253

    The way I am seeing it is that it's not going to get better. Publishers are pushing game dev studios to pump out games as fast as they can. UE5 enables this as you described being able to pump out a game that looks pretty but relies on hardware to brute force programming "issues." This is a problem with the quick cash grab mentality in the gaming industry and it's not going to go away until the whole bubble that is getting created bursts.

    • @abeidiot
      @abeidiot 9 місяців тому +1

      Also, game dev pays peanuts and has insane working hours. The brightest computer science talent is no longer interested in working in game dev unlike the 90s

    • @onyachamp
      @onyachamp 9 місяців тому +13

      This is true.
      It's like an artist pumping out work for money rather than a love of their art.
      I personally think on a sinister note, that studios are running silent companies to sell cheats for the games they make.
      A person buying hacks for $10-20 a month are spending $120-240 a year on cheats and they would be so much easier to develop than the game itself.
      From a business perspective it is easy money, and that's exactly and almost entirely what the industry has become.

    • @surnimistwalker8388
      @surnimistwalker8388 9 місяців тому +8

      @@onyachamp You know Rockstar has been caught how many times now distributing pirated versions of their own games. I wouldn't put it past them to do that.

    • @BygoneT
      @BygoneT 9 місяців тому +3

      @@surnimistwalker8388 I've literally never heard of this? When?

    • @Jakob178
      @Jakob178 9 місяців тому +4

      The bubble is called normies who literally buy shit every year, no matter the quality.

  • @ZTrigger85
    @ZTrigger85 9 місяців тому +24

    I’m so glad I watched this. I have a Ryzen 5 5600X and I’ve been CPU bottlenecked lately. I was considering upgrading to a Ryzen 9, but seeing that it didn’t solve the problem for you saved me a lot of money.
    Are hardware manufacturers paying game developers to make their games as demanding as possible? I think about this often.

    • @MartinMaat
      @MartinMaat 9 місяців тому +11

      I don't think they struck a deal among each other but rather developers will utilize hardware to the max to make their game look better than the competitor's game. So "whatever the hardware guys will come up with, the software guys will piss it away".
      Their interests align nicely though.

    • @a7mdftw
      @a7mdftw 9 місяців тому

      Can i ask what is your gpu

    • @ZTrigger85
      @ZTrigger85 6 місяців тому

      @@a7mdftw Sorry, I just got the notification for this. No clue why. I’m running a 4070.

    • @xTheN8IVx
      @xTheN8IVx 6 місяців тому

      This is mostly a game development issue, the 5900x is still a great CPU for plenty of games. Paired with a 4070, that’s a great setup. There just needs to be more optimization on the game devs side.

    • @dangelocake2635
      @dangelocake2635 5 місяців тому

      I'm not a dev, but it seems more like an industry issue. What I mean is, in order to make a game run as smooth as possible, you need time to optimize. The more time you need, the more people you're keeping on the payroll, so it costs money. But a game doesn't generate most of its money until it is released, so you must balance your timeframe against funds/time/potential profit. Companies would usually rather push a poorly optimized game than a costly one, and fix it later.
      You could improve engines over time, but every game needs to step up in terms of graphics, so there's only so much work you can reuse.
      So I don't think it's a conspiracy, because you have games like RE4 Remake that ran smoothly from day one, because it's a Capcom engine they'd been developing for years, or even Rockstar games that are sometimes demanding, but you can see why they are, in terms of tech.

  • @Oni_XT
    @Oni_XT 9 місяців тому +2

    I'm not a game dev but this popped into my head since I'm constantly seeing texture streaming options. Is it possible more games are integrating this and it's effectively pooping on CPUs?

  • @XieRH1988
    @XieRH1988 10 місяців тому +189

    "Things aren't being optimised properly" sums it up fairly well.
    The current period is one of transition, where developers are fumbling and stumbling their way to master DX12 and other things. It'll probably get worse before it gets better.

    • @deality
      @deality 9 місяців тому +5

      I believe DX12 is not optimized yet, but it's the future and it should get better.

    • @he8535
      @he8535 9 місяців тому +4

      Poor optimization, more complex compression, and still missing all the features a good game would have.

    • @americasace
      @americasace 9 місяців тому

      Sadly, I vehemently agree with your assessment..

    • @borone1998
      @borone1998 9 місяців тому

      Todd Howard would beg to differ

    • @Gramini
      @Gramini 9 місяців тому

      I wonder how long the transition period will be, given that D3D12 is over 8 years and Vulkan over 7 years old now.

  • @imbro2529
    @imbro2529 10 місяців тому +89

    Tbh I think it's an optimization issue with the games themselves, not the hardware. Because really we have the 30 and 6000 series GPUs, Intel's 12th-13th gen and Ryzen 5000 CPUs, probably the best hardware ever, only for it to barely run shitty ports like The Last of Us, CyberBug, Hogwarts Legacy (it was quite buggy on release), Darktide (a shitstorm of poor optimization), Forspoken, etc. All of these came out poorly built, probably because the devs have difficulties with all this new software and are being pushed from the top to release a product. So I don't think it's a hardware issue; it's more a software issue that doesn't correctly utilize the true potential of our components.

    • @QQ-xx7mo
      @QQ-xx7mo 9 місяців тому +11

      This is just what happens when the masses get access to a medium (games, cinema, TV, internet): it becomes shit.

    • @dugnice
      @dugnice 9 місяців тому +4

      I think it's a concerted effort to destroy the PC gaming market so that only the wealthiest of people can indulge in PC gaming and everyone else gets a console, but I'm probably totally wrong. 🤷🏻‍♂️

    • @knasiotis1
      @knasiotis1 8 місяців тому

      @@QQ-xx7mo What?

  • @ErwinLiao
    @ErwinLiao 9 місяців тому

    Hey bro, thinking about upgrading. Do you know which one is cheaper, a prebuilt or building your own PC?

  • @Soraphis91
    @Soraphis91 9 місяців тому +13

    One issue easily neglected: UE and Unity are kinda general-purpose engines. Yeah, UE has a history in FPS games and is really good at that, but you can basically make any game with the engine.
    This means those engines have a lot of abstractions, and many game developers - when they choose one of those engines - usually don't have the manpower to go deep enough into the engine code to optimize it down to the last bit for the concrete game they are working on.
    Just check how many engine developers are hired for projects that are developed on in-house engines at AAA studios.
    So, with the comfort of taking a ready-to-go engine, you also lose some control.

    • @NeovanGoth
      @NeovanGoth 9 місяців тому +3

      Totally agree. UE and Unity are awesome as they allow even smaller teams to use state-of-the-art graphical effects they could never have written themselves (and usually perform really well), but they also seem to encourage using them in improper ways.

  • @ErockOnTech
    @ErockOnTech 10 місяців тому +78

    The take away for me here is that modern games aren’t optimized. I’ve said this in videos. So has HUB. But good job going in depth on CPU usage. You’re right about reviewers using higher end CPUs. I’m guilty of that myself.

    • @vextakes
      @vextakes  10 місяців тому +25

      I don’t think anything’s wrong with using fast CPUs, because the goal is to show the overall GPU performance. However, a lot of ppl might not be able to get that performance just because of the CPU they own. So it’s a mixed bag. It requires a lot of testing, but should prolly be pointed out depending on what games are reviewed. Mb in CPU reviews we could show if they give reasonable performance as well, compared to relative GPU power if we’re talking about gaming

  • @pliat
    @pliat 10 місяців тому +79

    DX12 can be optimised far better than DX11, but that requires time and, most importantly, skilled devs that understand low-level coding. The devs just are not good enough.

    • @_GLXC
      @_GLXC 10 місяців тому +2

      maybe those devs are actually around, they're just still working on a game

    • @thechezik
      @thechezik 10 місяців тому

      This has everything to do with how America is degrading: those developers are not paid fairly while the cost of living is skyrocketing, and this is happening across different industries. Basically, competition has been killed, there is no more loyalty to the consumer base, the focus now is on AI, and most importantly the divide between wealth, the middle class and the working class means the middle is gone!!!

    • @AwankO
      @AwankO 10 місяців тому +30

      Perhaps those developers are not given enough time to optimize properly.

    • @henryyang802
      @henryyang802 9 місяців тому +1

      I will not say that the devs aren't good enough; maybe the larger game-developer community is still figuring out how to use DX12 in the best way possible? Fine-grained control means more fragmentation and more to learn about what's going to be best. Probably there isn't only one optimal way of using it, there are many?

    • @s1p0
      @s1p0 9 місяців тому

      make game first

  • @SkyAnthro
    @SkyAnthro 9 місяців тому +2

    Have you tried turning on hardware accelerated GPU scheduling? It might help with relieving some of the work load on the CPU ^-^

  • @Nukeaon
    @Nukeaon 6 місяців тому

    thank you for this video! new sub :)

  • @beansnrice321
    @beansnrice321 9 місяців тому +76

    I think the main issue in most of these games is that their animation is all being handled by one thread on the CPU. Many 3D content creation programs also have the same issue, with Maya being one of the few multi-threaded animation engines in the industry.

    • @Gramini
      @Gramini 9 місяців тому +5

      I don't think that animations are that heavy. My guess is on the actual game logic (including AI) and maybe physics simulation.

    • @blindfire3167
      @blindfire3167 9 місяців тому +8

      @@Gramini AI and Shadows/Lighting are the heaviest pieces on the CPU (mostly AI), while Physics depends on what type of simulation you're doing (some can be handled by mostly GPU like rain/water or destruction).
      Ray Tracing (although very heavy on GPU) requires a very beefy CPU to handle it since you still need the CPU to calculate light, which also is needed to calculate shadows and reflections (though reflections are usually just handled by the GPU, it can still be kept waiting for CPU to finish the other processes) and I'm not 100% on this, but it always feels like games that use RT always have it running on multiple cores but not multiple threads. I know I'm just a lowly peasant with my 8700k (I just barely graduated and still waiting for that dream job to kill all my enthusiasm for any industry lol) but every game I've tried RT on my 3080 has run horribly from core 0 taking most of the load on thread 0 and then on cores 2 and 4 only the first thread being maxed out.

    • @Gramini
      @Gramini 9 місяців тому

      @@blindfire3167 Shadows/lighting are also done on the GPU. Or do you have some specific case in mind that doesn't?
      Good point with the physics one. Some pieces of it can be delegated to the GPU, but not all, and not all games do that. It also taxes the GPU, which might be more important for the rendering.
      Also, multiple cores = multiple threads. To be more specific: to do things in parallel, a program creates a new (software) thread. It's then up to the OS to schedule the thread onto a physical core/thread on the CPU. The program can also give a hint that it should be on another core. (A small example follows at the end of this thread.)
      The situation you described, with only every second physical thread/virtual core being used, is quite an interesting one. That _might_ make sense, because those two virtual cores are still only one physical core, which cannot do the same thing twice in parallel. So some programs hint that only every second CPU thread should be used. From what I know/was told by a consultant, it's usually best to not do that and just leave it to the OS to schedule/balance.

    • @toddwasson3355
      @toddwasson3355 9 місяців тому

      @@blindfire3167 Multiple threads means running on multiple cores. There's no difference.

    • @blindfire3167
      @blindfire3167 9 місяців тому +1

      @toddwasson3355 Nope, it *can* mean the same thing but you can have something running on multiple cores but not multiple threads, it would just run on the first thread on each core alone.
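
A small self-contained example of the software-thread vs hardware-thread distinction discussed in this thread: a program can create any number of `std::thread` workers, and the OS schedules them onto however many hardware threads the CPU reports. Standard C++ only; affinity hints are left out because they are OS-specific.

```cpp
// Oversubscribe the CPU on purpose: more software threads than hardware
// threads still works, the OS just time-slices them across the logical cores.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const unsigned hwThreads = std::thread::hardware_concurrency(); // e.g. 24 on a 12c/24t chip
    std::printf("hardware threads reported: %u\n", hwThreads);

    const unsigned softwareThreads = hwThreads * 2 + 2;             // deliberately more than the CPU has
    std::atomic<long long> counter{0};

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < softwareThreads; ++i) {
        pool.emplace_back([&counter] {
            for (int n = 0; n < 1000000; ++n)                       // some busy work per thread
                counter.fetch_add(1, std::memory_order_relaxed);
        });
    }
    for (auto& t : pool) t.join();

    std::printf("%u software threads finished on %u hardware threads, counter=%lld\n",
                softwareThreads, hwThreads, counter.load());
}
```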

  • @DankyMankey
    @DankyMankey 10 місяців тому +14

    Also, another issue is that most popular games are not well multi-threaded, so a single thread/core of your CPU ends up bottlenecking your GPU.

    • @mikfhan
      @mikfhan 9 місяців тому +5

      Yep, this is the issue we've had with games since the turn of the millennium. A 5GHz boost is needed because games rely so much on a main thread, but CPUs can't boost forever. We have plateaued around 4.5 GHz now and it will take miracles to go beyond that in a stable manner. Parallel programming is difficult. The best way to improve your gaming experience from now on is not hardware, but rejecting games that only use 4 cores effectively. Wait for performance reviews; make publishers care about optimization instead of microtransactions.

  • @F1Lurker
    @F1Lurker 9 місяців тому

    Very insightful and concise video, thank you for making it

  • @anthonyperez87
    @anthonyperez87 6 місяців тому

    Thank you for talking about the cpu. 😢I’m wondering if changing out my gpu or just cpu on my prebuilt would make a significant improvement?

  • @justarandomgamer6058
    @justarandomgamer6058 10 місяців тому +28

    If I recall correctly, data centers had the same issue, and they innovated by producing hardware specifically for handling the transfer of large volumes of data, instead of the CPU, which is designed more for general-purpose tasks.

    • @_GLXC
      @_GLXC 10 місяців тому +9

      you would think that with all this hubbub about "Tensor cores" or "raytracing cores" that the operation would be GPU bound, but no. v_V

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 10 місяців тому +8

      You know it's sad when the latest GPU is bottlenecked by a CPU that is newer than it, every CPU bottlenecks a 4090 with RT at some point in these newer games

    • @BlindBison
      @BlindBison 9 місяців тому

      Yeah, consoles took that route too. PS5 has dedicated hardware for asset streaming and decompression. PC is getting direct storage so maybe that’ll help but basically no games even use it yet.

    • @abeidiot
      @abeidiot 9 місяців тому +1

      Fun fact: this has been available on PCs for a long time. It just wasn't catered to spoon-feed game developers; Linux even has a system call to handle it. Now Microsoft is adding DirectStorage to DirectX to make it easier for game devs to implement.

  • @minnidot
    @minnidot 10 місяців тому +132

    You really nailed some good points. CPUs used to be one of the last things we had to worry about upgrading. I posted a vid on how to lower CPU usage in Battlefield 2042 and even users with 13900K systems have commented that it helped. That's one of the top pieces of hardware available and it's struggling with a game from 2021.

    • @ericimi
      @ericimi 10 місяців тому +16

      That video was yours? I literally just used it the other day for Battlefield. My 7700X was using about 50 percent, then I made the user.cfg file and it uses about 33 percent. Pretty big difference.

    • @ralkros681
      @ralkros681 10 місяців тому +13

      “But older games are not optimizing for new hardware”
      Most bs excuse I have ever heard. Had to say it before someone else did

    • @minnidot
      @minnidot 10 місяців тому +4

      @@ericimi that was mine. I'm so glad it helped you!

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 10 місяців тому +3

      It's BS that every new GPU generation you need a new CPU to be able to utilize it. The 5900X pushed a 3080 and 3090 really well in 2020, but it's virtually a potato in new games in 2023; in some cases at 4K my 4090 wasn't even an upgrade over the 3080, but now I can turn DLSS off, which barely makes a difference, because a 5900X is just a trash CPU in new games. I want to see what Ryzen 8000 and Intel 14th Gen are like. I would like to stay with AMD, but I really want to upgrade from a 12-core to a 16-core when a 12-core is being nearly fully used in some games. I want at least 20 or 24 cores, but I doubt AMD will increase core counts, whereas Intel probably will.

    • @RickyPayne
      @RickyPayne 9 місяців тому +5

      @@mttrashcan-bg1ro Unless you're a heavy, heavy multitasker like a game streamer, core counts for gaming isn't very important past 6-8 cores because most all games are designed with 8 core consoles in mind. Cores 9+ are only normally only helpful for doing extra non-gaming tasks while gaming. For strictly gaming you're better off with an 8 core AMD x3d model over anything else since, at 6+ cores, processor speed and cache are more important. Once you hit 8 cores/16 threads, unless you're a developer, game streamer, content creator, Gentoo user, etc, you're better off with more cache and single core processing power over more cores and threads.

  • @hermit1255
    @hermit1255 6 місяців тому +8

    I actually feel like the march AAA game devs are on towards more and more demanding games will eventually be forced to a stop as I think most people on mid to lower systems will just stop buying their games. A bright future for underdog indy stuff.

    • @italoviturino6386
      @italoviturino6386 5 місяців тому

      The "look at how real everything inside this game looks but pls ignore what is in it" will be replaced with games like Hi-Fi rush, where the art style will make it so it ages better while demanding less from the hardware

    • @metalface_villain
      @metalface_villain 4 місяці тому

      This has already begun tbh; everyone seems to be playing more indie stuff and ignoring big AAA titles unless they are something as great as Elden Ring, for example. This of course is not only because of how demanding the games are, but also because triple-A games are becoming just a money grab full of microtransactions, while the more indie stuff focuses on a great gaming experience.

  • @Thanatos2996
    @Thanatos2996 9 місяців тому +18

    Remnant 2 is actually less CPU bound than the first game, if you can believe it. The first was less visually impressive, but the shaders were so badly optimized on the CPU side that some areas struggled to stay above 70 on my 5800/3080ti rig at 1440 with shadows turned on. They’re both stellar games, but Gunfire could really stand to work on their optimization.

  • @jokerxd9900
    @jokerxd9900 10 місяців тому +28

    I think we are in a weird state: we just jumped to next gen, and there will be mistakes and bad optimizations, but after a while they will master it and things will get better.
    If they are open to learning and not lazy.

    • @deality
      @deality 9 місяців тому +3

      I have a theory that Unreal Engine and Unity are just trying to kill the gaming industry, given how unoptimized those applications are. Also, DirectX 12 is poorly optimized, which is why you get shitty fps using it.

    • @Juanguar
      @Juanguar 9 місяців тому +3

      @@deality DirectX 12 does not optimize anything, because it offloads the optimization to game developers.
      Some devs even said it themselves and explained why some games ran better on 11 than on 12.
      It's because devs were lazy af and over-relied on the auto optimizations that 11 offered.

    • @raskolnikov6443
      @raskolnikov6443 9 місяців тому +2

      Next gen has been out for 3 years….

  • @burnzybhoy9110
    @burnzybhoy9110 10 місяців тому +31

    Personally I feel like PC game optimisation has become an afterthought as of late. I don't feel like DX12 is stable enough in its current form, and we should have the option to run DX11 if we choose. Also, the strange memory leaks we are suffering these days make me ask a lot of questions: is DX12 the issue, or are game devs targeting RAM and CPU? All I know is I want more accessibility with DX options.

    • @KainniaK
      @KainniaK 9 місяців тому +3

      Because of FSR and DLSS

    • @jjcdrumplay
      @jjcdrumplay 9 місяців тому

      How do you monitor memory leaks?

    • @burnzybhoy9110
      @burnzybhoy9110 9 місяців тому

      @@jjcdrumplay In order to monitor memory leaks, not only do we have to do our research, but also monitor RAM usage via Task Manager, then Performance; from there you can see RAM usage.

    • @mihailmojsoski4202
      @mihailmojsoski4202 5 місяців тому

      @@jjcdrumplay valgrind helps, tho it makes your program/game run like shit while testing because it's technically an x86_64 emulator

  • @D1EGURK3
    @D1EGURK3 9 місяців тому +3

    I can relate a lot to this topic...I have a 4070 Ti and an Ryzen 7 3800X...I'm CPU limited in basically every newer game...but not so much in older games...

  • @brucerain2106
    @brucerain2106 10 місяців тому +10

    Remember ps3 with its 256mb of ram and somehow it ran TLoU, Gta5 and Uncharted?
    And now we have components that cost like a whole console and they can’t even be utilised to their full potential. Wow very cool

    • @slaydog5102
      @slaydog5102 9 місяців тому +1

      I thought we were "advanced" though?...

  • @danos5048
    @danos5048 10 місяців тому +121

    As I understand it, the CPU usage stat is actually based on the number of threads being used. Each thread in use contributes to a percentage based on how many there are. 80% on a 12-core/24-thread chip means that at that moment about 19 threads are being used (0.8 * 24 = 19.2). All 12 cores are actually being used and some of those cores are into hyper-threading/SMT. (A worked example follows at the end of this thread.)

    • @tadeuferreira5705
      @tadeuferreira5705 10 місяців тому +20

      Wrong bro, CPU utilization is about CPU time, not about threads. Any modern OS with programs and services running in the background has hundreds if not thousands of active threads at any given time.

    • @cagefury3789
      @cagefury3789 10 місяців тому +33

      @@tadeuferreira5705 You're talking about OS threads or process threads. Those are just concurrent instructions running within the same process that share memory. He's talking about hardware threads, which are also sometimes called logical cores. You're right in that it's about time, but utilization takes threads into account as well. You can have 1 thread constantly doing work 100% of the time, but your overall CPU utilization will be very low if you have a lot of threads/cores.

    • @maverikshooter
      @maverikshooter 10 місяців тому +13

      ​@@tadeuferreira5705 50% = 12 cores are used. And 100% = 24 threads.

    • @margaritolimon3683
      @margaritolimon3683 10 місяців тому +6

      @@tadeuferreira5705 No, it's about usage over time; that is why it goes up and down depending on the scene. For a normal CPU (no e-cores, which make the math harder) 50% being used means all the cores are in use; anything higher and it's now using hyper-threading. It's also trying to explain everything with one number: if a 12-core CPU is in use and only 6 cores (no hyper-threading) are at 100% while 6 are at 0%, it will show up as 25%, and all cores at 50% (no hyper-threading) will also show as 25%.

    • @VDavid003
      @VDavid003 10 місяців тому +8

      50% on a 24 thread cpu could mean 100% on 12 threads with 0% on the rest or 50% on all 24 threads or anything in between
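
To make the percentages in this thread concrete, here is a small sketch that treats overall CPU usage as busy time averaged across all logical processors over a sample window, which is roughly how task managers report it. The load patterns are illustrative numbers, not measurements.

```cpp
// Overall usage = average busy fraction over all logical processors, so very
// different load patterns can produce the same headline percentage.
#include <cstdio>
#include <numeric>
#include <vector>

// busyFraction[i] = share of the sample window logical processor i spent busy.
static double overall_usage(const std::vector<double>& busyFraction) {
    const double sum = std::accumulate(busyFraction.begin(), busyFraction.end(), 0.0);
    return 100.0 * sum / double(busyFraction.size());
}

int main() {
    const int logical = 24;                    // 12 cores / 24 threads

    std::vector<double> caseA(logical, 0.0);   // 12 logical processors pinned, 12 idle -> 50%
    for (int i = 0; i < 12; ++i) caseA[i] = 1.0;

    std::vector<double> caseB(logical, 0.5);   // all 24 at 50% -> also 50%

    std::vector<double> caseC(logical, 0.8);   // 80% overall = 0.8 * 24 = 19.2 "threads' worth" of busy time

    std::printf("case A: %.1f%%  case B: %.1f%%  case C: %.1f%% (%.1f of %d logical processors busy)\n",
                overall_usage(caseA), overall_usage(caseB),
                overall_usage(caseC), 0.8 * logical, logical);
}
```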

  • @jjcdrumplay
    @jjcdrumplay 9 місяців тому

    I remember finding second hand Core 2 Duo E7500 for a Linux/winblows build by shopping on ebay for like under 5 bucks. At the time it was only about 7 maybe years old, and the thing worked for 5 years plus! Would it be worth the same buying second hand AMD chips if you'll save hundred, just taking the risk it was overused?

  • @vidmantaskvidmantask7134
    @vidmantaskvidmantask7134 7 місяців тому

    Good voice talking. : ) You are skilled. Its interesting to listen.

  • @MarikoRawralton
    @MarikoRawralton 10 місяців тому +17

    Tim Sweeney once said that the main failing of Unreal Engine was that everything runs on one thread, at least back when he was discussing it. He actually praised EA's Frostbite for being able to multithread logic (but admitted it was a harder engine to use).

    • @kevinerbs2778
      @kevinerbs2778 10 місяців тому

      can the Frostbite engine use mGPU?

    • @progste
      @progste 10 місяців тому +5

      I believe that was UE3 though

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 10 місяців тому +1

      The Frostbite engine was good up until Battlefield 1, where it just rooted everyone's CPU, so the i7 6700K at the time was bottlenecking 1070s and 1080s. By Battlefield 5 there were some 6- and 8-core options around, but clock speed and IPC still mattered more, and that issue was amplified with 2042, which is able to push a 24-thread CPU to 100% at times despite the visuals not being improved at all over BF1. Frostbite is a gorgeous engine, but it's the second most CPU-intensive engine I can think of, after the one the newer AC games use.

    • @MarikoRawralton
      @MarikoRawralton 10 місяців тому

      @@kevinerbs2778 If you mean multi GPU, I have no idea. I've seen conflicting reports.

    • @MarikoRawralton
      @MarikoRawralton 10 місяців тому +2

      @@mttrashcan-bg1ro BF1 ran great on my old PC and I was running an 8320E back then. That was a terrible CPU. Targeted 60fps on the old PS4/Xbox One and those both had notoriously weak CPUs. I think it benefited more from core count.

  • @anarchicnerd666
    @anarchicnerd666 10 місяців тому +33

    Nice vid Vex :) I'm not a dev so I can't comment really, but I'm with you - things can only get better. The big thing to remember with DX11 and DX12 is the whole reason that transition happened in the first place - namely devs complaining about the anemic performance of the X1 back in the day and the API having too much overhead. That's a big reason WHY DX12 is so focused on stripping out guardrails and handing control back to developers. It's also worth noting we're in the middle of a weird transition period again, the move to AM5, Intel's paradigm of e-cores and p-cores, Windows 11 rollout, the move to DDR5 etc etc. The sheer breadth and scope of hardware available on the open market and used market for people to build systems for is awesome for consumer choice and value, but a nightmare for devs who need to optimize for an almost limitless combination of hardware.
    ...man I sure picked a hell of a time to build a PC and join the enthusiast community XD my poor little R7 5700X is gonna get completely outpaced by upcoming games
    How ya liking the 5800X3D? Very curious what your thoughts are going to be for editing with it versus the 5900X, but it's clearly a monster at gaming...

    • @mathewbriggs6667
      @mathewbriggs6667 10 місяців тому +2

      I got the Ryzen 5950X over the 5800X3D; I needed the extra cores and clock speed, but the X3D poops on it lol

    • @mv223
      @mv223 10 місяців тому

      @@mathewbriggs6667 It truly is a badass chip.

    • @handlemonium
      @handlemonium 10 місяців тому +1

      Lol my 10700 is gonna be bottlenecking so hard when I upgrade to the 8700XT or 5070 in 18 months.

    • @yonghominale8884
      @yonghominale8884 10 місяців тому +8

      As an ancient dinosaur who played the original DOOM on a 3dfx and the OG Pentium I can attest the things come in cycles. The issue is consoles and when transitioning from one generation to another, it's always rough. I still have nightmares about Crysis.

    • @mathewbriggs6667
      @mathewbriggs6667 10 місяців тому

      @@yonghominale8884 it's gonna be a while longer then 18months

  • @brunoperugini6299
    @brunoperugini6299 9 місяців тому

    Which software is that, monitoring the computer's performance? I've always wanted to know.

  • @BruceLee-qc2lm
    @BruceLee-qc2lm 5 місяців тому

    Thanks for the video. I have a 3080 with 5900X, and I was trying to figure out the bottleneck even after using Ryzen Master. Might upgrade to 5800X3D and maybe 4080ti when it gets released at the end of the month.

  • @arioch2004
    @arioch2004 9 місяців тому +6

    Regarding the cpu decompressing and handling textures, that is what resizable bar is for. The GPU can access storage to retrieve and decompress textures without involving the cpu. So if you have a recent graphics card and a motherboard that has the resizable bar feature in efi/bios, then enable it.

    • @imcrimson8618
      @imcrimson8618 9 місяців тому

      instructions unclear, my pc turned into a nuclear reactor and now its fallout 4

  • @dragons_advocate
    @dragons_advocate 9 місяців тому +12

    Here's one downside of these higher and higher detailed meshes and textures I have seen nobody talk about yet:
    The high graphical fidelity only really comes to shine when not in motion, or in very slow motion. Fast movements quickly turn all those details into noise, particularly with video compression (like, say, a YouTube video), but even 60 Hz can be low enough for our eyes to see a smearing effect. Meaning, paradoxically, with the insane level of detail games are capable of nowadays, you would need MORE fps (and a high refresh rate monitor, natch) to really enjoy it (again, mainly talking about fast-moving scenes and games here).
    And may god have mercy on your soul if you play with motion blur on. There is a specific circle in hell reserved for those people.

    • @DashMatin
      @DashMatin 9 місяців тому +1

      fr

    • @XxWarmancxX
      @XxWarmancxX 9 місяців тому

      In defense of motion blur: racing games. For them specifically it's hard on my eyes with motion blur off, making my eyes ache at worse on my desktop monitor.

    • @dragons_advocate
      @dragons_advocate 9 місяців тому

      @@XxWarmancxX out of curiosity, at which framerate?

    • @NeovanGoth
      @NeovanGoth 9 місяців тому

      Having good per-object motion blur should be a basic requirement for each game.
      When motion blur becomes problematic it's mostly camera motion blur. Many games for example don't scale the simulated shutter speed with the actual frame rate, causing it to look perfectly ok in 30 fps, but like a blurry mess in everything above.

  • @DiizRupT
    @DiizRupT 6 місяців тому

    Yes, this is an issue for sure. One thing I do is either go 4K or up the resolution scale.

  • @R4D4HN
    @R4D4HN 8 місяців тому

    I have a Razer Blade with the same specs as your PC, and I got really scared that the seller ripped me off and there was something wrong with the laptop while playing Miles Morales, but it really just came down to the Ryzen 9 5900HX not being able to push the 3080 😢

  • @livingthedream915
    @livingthedream915 10 місяців тому +84

    Honestly, it's possible to avoid the CPU utilization pitfall by simply not using any kind of ray tracing, and in addition it's well known that Nvidia drivers are more CPU-heavy than AMD's.

    • @skorpers
      @skorpers 10 місяців тому +20

      Yeah people are really stubborn about Raytracing. Ghostwire tokyo is a pretty good example of a game that looks absolutely fine without having raytracing on. And in some cases, you might not even know there's a difference because you have to point the camera at certain angles for there to be any shortcomings of Screen space reflections.

    • @chiari4833
      @chiari4833 10 місяців тому +16

      Yep, I don't get why people are buying into this expensive marketing trick. It may look good, but it's not ready. It needs massive optimization.

    • @livingthedream915
      @livingthedream915 10 місяців тому +5

      You fellas in the comments get it, this tech is simply too soon out the gate to be seriously considered, anyone who actually bought a 2000 series gpu for raytracing got royally exploited and we're still in a situation where it's almost always a better experience to have it off

    • @skorpers
      @skorpers 10 місяців тому +6

      @@livingthedream915 The main thing that tipped me off about RTX being BS is that the developers of the games promoting it the most stopped even attempting to use impostering techniques to add flavor to the scene in a cheap way. I.E the examples they used like reflections simply not existing on glassy surfaces if RTX was off when the PS2 could literally do it. Also how metro looked extraordinarily washed out with rtx off when colored lighting was done on old consoles as well.

    • @GDRobbye
      @GDRobbye 9 місяців тому +1

      Games without RT can look as good as games with RT, it just requires more work from the designers. So to some degree, RT is a time-saver and going forward, we'll probably see more focus on RT and less focus on traditional illumination/shadowing, which in turn means that, gradually, games without RT will simply look worse. Hopefully, by that time, RT won't be such a big performance hog.

  • @CyberJedi1
    @CyberJedi1 10 місяців тому +42

    One of the biggest problems, I think, is that CPU advancement is nowhere near GPU advancement... especially in single-threaded performance. From the 3090 to the 4090 we got almost double the performance; from the 5800X3D to the 7800X3D it's not much higher than 10%, and games don't utilize much more than 8 cores, so it doesn't matter how many you have in the CPU. Also, CPUs are kind of stuck at 6GHz; I heard somewhere that it will be really hard pushing past that.

    • @MitremTheMighty
      @MitremTheMighty 10 місяців тому +24

      3090 to 4090 double the performance? 😂It's closer to 60%, which is good, but nowhere near double

    • @korinogaro
      @korinogaro 10 місяців тому +11

      Yes and no. The main problem is that engines suck ass "out of the box" at thread utilization. Devs need to actively write code to utilize more cores and threads, but they don't give a fuck and go with default engine settings in as many places as possible. And the 2nd problem is that companies switched from the race for GHz to a masturbation contest over the number of cores (because after they figured out how to glue CPUs together it is easier and gives good results in synthetic benchmarks). So it is a combination of these two problems. CPU manufacturers give us more cores, but improvements in the *umpf* of every core are not impressive, and devs don't give a fuck about optimization for more cores.

    • @user-eq2fp6jw4g
      @user-eq2fp6jw4g 10 місяців тому +1

      The R7 5800X3D is still insanely good value for money for budget / somewhat more future-proof 1440p gaming, taking into account how expensive the AM5 platform still is. Especially good motherboards.

    • @nugget6635
      @nugget6635 10 місяців тому +2

      Actually the CPU has much better scalar performance than the GPU. The GPU is a vector processor (highly parallel). So for single-threaded work the CPU is actually better; for parallel work the GPU is thousands of times better.

    • @andreabriganti5113
      @andreabriganti5113 10 місяців тому +4

      At a certain point, just increasing the clocks will not lead to great improvements. It did happen already. Having tons of fast cache is a better solution for gaming.

  • @Nico-ci9qb
    @Nico-ci9qb 9 місяців тому

    As someone who doesn't know the exact specifics: if the CPU is struggling with these tasks, would a new specialized component fix the issue? Something like an MPU, a "mesh processing unit"?

  • @theslicefactor4590
    @theslicefactor4590 9 місяців тому

    What's the program you're using to show those stats?

  • @XScythexXx
    @XScythexXx 9 місяців тому +4

    Great video and points. I always played on low-end PCs years behind current games while growing up, in a country with a poor economy, but with years of work I could finally save up to upgrade my own PC to a decent level several years ago. Nowadays, whenever a major title releases, I feel like if you don't have the latest hardware, everything I've put into my PC just becomes obsolete; it's insanity. I feel like I'm back in 2010, playing random games at 800x600 just to get 30 fps, given the state of all recent titles.

    • @NeovanGoth
      @NeovanGoth 9 місяців тому

      I think that's connected to the current-gen consoles having much beefier CPUs than their predecessors. The PS4 and Xbox One were much more limited on the CPU side, so it soon became easy even for lower-end PCs to reach and surpass that baseline. A lack of GPU performance can easily be mitigated by just lowering the resolution or reducing the quality of graphical effects, but if the CPU can't keep up, there often isn't much that can be done.

  • @rollingrock5143
    @rollingrock5143 10 місяців тому +3

    I've noticed this a lot on flight sim 2020. It gets so bad sometimes. Other modern games as well. Gpu is an upgrade from my last one and it stagnates behind a 3 year old CPU. Great point to bring up Vex.

  • @dIggl3r
    @dIggl3r 13 днів тому +1

    Is there *any* other apps than Afterburner to see GPU/CPU usage in games?

  • @vairaul
    @vairaul 7 місяців тому +6

    The problem is most games depend heavily on one or two CPU cores, using the others as support; a situation where, for example, cpu0 = 100%, cpu1 = 80%, cpu2-X = 30% is happening. I believe today's problem is a software bottleneck in making the most of modern GPUs.
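
A toy model of the core imbalance described above: if the serial main-thread work costs a fixed time per frame, extra cores only shrink the parallel part, and the frame rate flattens once the main thread becomes the limit. The millisecond costs are invented for illustration, not measured from any real game.

```cpp
// Frame time ~= max(serial main-thread cost, parallel work / cores), assuming
// the parallel jobs overlap the main thread perfectly (a simplification).
#include <algorithm>
#include <cstdio>

int main() {
    const double mainThreadMs   = 20.0;  // game logic + draw submission stuck on one core
    const double parallelWorkMs = 24.0;  // total work that can be spread over worker cores
    const int    availableCores = 12;

    for (int cores = 1; cores <= availableCores; ++cores) {
        const double frameMs = std::max(mainThreadMs, parallelWorkMs / cores);
        std::printf("%2d cores -> frame time %5.1f ms (%5.1f fps)\n",
                    cores, frameMs, 1000.0 / frameMs);
    }
    // Output flattens at 20 ms (50 fps) once the single main thread is the limit,
    // which is why the remaining cores sit at low utilization.
}
```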

  • @dafyddrogers9385
    @dafyddrogers9385 10 місяців тому +12

    I'm glad you made a video on this, I was feeling a bit sad after learning the new 4080 I bought would be getting held back by my CPU in raytracing in games like Spiderman, Witcher 3 etc. because I bought a high power gpu so I could experience these new features and yet I can't do that now because I need a £500 brand new CPU and platform :(

    • @avanx7699
      @avanx7699 10 місяців тому +3

      I got the same problem. For me a new GPU basically means building a brand new Platform as well and the worst part is, that i bought some new fitting parts only a few month ago for my current GPU which decided to give up on me now. No matter how i spin the wheel, it sucks from every point of view right now.

    • @kevinerbs2778
      @kevinerbs2778 10 місяців тому +2

      The Witcher 3 is a port from DX11 to DX12, which doesn't work well. Ground-up DX12 engine builds work better.

    • @deality
      @deality 9 місяців тому +1

      You need it anyway

  • @georgeindestructible
    @georgeindestructible 9 місяців тому +6

    2:47 very well said; a lot of people think that just because they get the best CPU or GPU, especially the CPU, it's not gonna give them any issues. How inexperienced they are.
    The worst part is, we have more than enough horsepower in most modern CPUs to deal with almost everything at at least a constant 60 FPS, but "it's hard to code for that", devs usually say.

  • @eliasalcazar6554
    @eliasalcazar6554 7 місяців тому +3

    I think after Xbox One and PS4 reach EOL we'll see some great strides in optimization. I agree also that the cost/performance of graphics isn't scaling nicely. It seems that the move toward "realism" isn't worth the hardware tax when it comes to final presentation.
    But, I do see GPU manufacturers serious about gaming including more specialized hardware on their chips, like Nvidia and AMD with RT Cores. Unfortunately I see this raising the price of GPUs even further in the foreseeable future.
    It will definitely be interesting to see how a mid-gen "Pro" refresh of the consoles will shake up the PC landscape as well. I'm guessing the consoles will be aiming at 7800xt levels of performance.

  • @marcoferri716
    @marcoferri716 9 місяців тому

    What software is used in this video to check all of the stats?

  • @MrDabadabadu
    @MrDabadabadu 10 місяців тому +3

    The core-increase era is done. We are now in the frequency, IPC and cache era. A 16-core from 3 years ago is crushed by the R5 7600X. 8 cores can be beneficial vs 6, so you, the gamer, will buy every single generation of CPU from now on until something changes.

  • @Sp3cialk304
    @Sp3cialk304 10 місяців тому +3

    It's almost like the PS4 generation consoles had terrible CPUs so devs didn't have much to work with. Now we have the PS5 which has the equivalent of 16 zen 2 cores. 7 cores that devs can use from the CPU and the IO device to stream and decompress assets is equal to 9 zen 2 cores. The massive increase in geometry alone has bumped CPU requirements. Not to mention volumetrics, lighting, particle effects ECT. Also with ddr4 systems you also have a massive memory bandwidth bottleneck. The consoles use shared high speed gddr6 that can be streamed to and from the SSD in real time.

  • @rusudan9631
    @rusudan9631 8 місяців тому +1

    Cyberpunk puts a decent load on my Ryzen 3600. Normally it's fine, a stable 60 fps at 1440p with my 3060 Ti, but in big crowded city areas it drops to 45-55. Makes me wonder if I should upgrade to a 5700X or maybe even a 7700X (the latter is a bigger pain since I'd have to replace the mobo and RAM too).

  • @bulletwhalegames1575
    @bulletwhalegames1575 9 місяців тому +6

    Just wanted to drop this here: modern RT solutions are not poorly optimized. Ray tracing is extremely demanding on both the CPU and GPU, and the only way we can do it right now is to cleverly discard a lot of rays and bounces so that we get a somewhat decent framerate. The way we do this is by calculating a structure that contains information about where geometry is located (too much to really get into, but there are plenty of good sources). Building this structure is what costs your CPU performance: the more geometry you need to update in this structure per frame, the more work your CPU will be doing. For example, skinned meshes are extremely expensive and will most of the time be ignored (you can see this in games, where a lot of the characters will not receive bounce lighting, for example). Modern games do perform well on normal settings once you disable ray tracing, most of the time. Ray tracing just isn't far enough along at this point, since there still is no hardware to accelerate the construction of these structures (and this will likely stay that way for quite some time).
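    To make the "building this structure" cost concrete, here is a minimal sketch of a per-frame acceleration-structure refit (assumed data layout, illustrative only, not how any particular engine or API does it): every dynamic mesh that moved forces its bounding boxes to be recomputed on the CPU before the GPU can trace against them.

    #include <algorithm>
    #include <vector>

    // Illustrative only: nodes of a flattened bounding-volume hierarchy,
    // assuming children are stored at higher indices than their parent.
    struct Aabb { float min[3], max[3]; };
    struct Node { Aabb box; int firstChild; int childCount; };

    // Refit: recompute parent boxes bottom-up after dynamic meshes moved.
    // The cost grows with how much geometry changed this frame; this is the
    // CPU work that ray tracing adds on top of the usual game logic.
    void refit(std::vector<Node>& nodes) {
        for (int i = static_cast<int>(nodes.size()) - 1; i >= 0; --i) {
            Node& n = nodes[i];
            if (n.childCount == 0) continue;         // leaf: box comes from its triangles
            for (int axis = 0; axis < 3; ++axis) {   // merge child boxes into the parent
                n.box.min[axis] =  1e30f;
                n.box.max[axis] = -1e30f;
                for (int c = 0; c < n.childCount; ++c) {
                    const Aabb& cb = nodes[n.firstChild + c].box;
                    n.box.min[axis] = std::min(n.box.min[axis], cb.min[axis]);
                    n.box.max[axis] = std::max(n.box.max[axis], cb.max[axis]);
                }
            }
        }
    }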

    • @FreelancerND
      @FreelancerND 6 місяців тому

      So basically prebaked lighting still runs better and looks better in most cases? =)

    • @colbyboucher6391
      @colbyboucher6391 4 місяці тому

      Yeah... this channel is just a dude who has no idea what he's talking about, speaking as if he's authoritative, which people love because it validates what they already feel.

  • @Aryzon13
    @Aryzon13 10 місяців тому +22

    It will not get better with time.
    Games were more optimized when DX12 and Vulkan were just rolling in. And since then it went downhill as you mentioned,
    And it will only keep getting worse until people stop buying.
    And since people will never stop buying, it will keep getting worse indefinitely.
    And you will keep buying better hardware to compensate for the devs' incompetence or straight-up malicious intent.

    • @Leaten
      @Leaten 10 місяців тому +2

      AI generated code is here to save us lol

    • @user-dv5ts3de8e
      @user-dv5ts3de8e 10 місяців тому +7

      Graphics became the most important feature for marketing new games. Need better graphics? Buy new hardware. It's inevitable. DX12 and Vulkan are just tools for using new GPU features; they don't force devs to make ultra-high-res textures and a ton of post-processing effects. It's still possible to make low-poly models in a closed room and get 300+ fps. But marketing needs an open world, filled with people and ready for a nice-looking screenshot at any moment.

    • @kada0420
      @kada0420 10 місяців тому +5

      Vulkan was great. It gave my old PC more life at the time.

    • @Leaten
      @Leaten 10 місяців тому

      @@kada0420 Because it was still just as basic as OpenGL.

    • @JN-qj9gf
      @JN-qj9gf 10 місяців тому

      ​@@user-dv5ts3de8egraphics became the most important marketing feature for games 40 something years ago.

  • @user-eu5ol7mx8y
    @user-eu5ol7mx8y 10 місяців тому +4

    Does this mean future graphics cards should have special accelerators for tasks currently done by the CPU?

    • @roklaca3138
      @roklaca3138 9 місяців тому

      Direct storage comes to mind

  • @serg331
    @serg331 5 місяців тому

    I went to where you were in Cyberpunk: a ton of mods (none for performance), basically max settings except a few insignificant ones. I walked around and was getting 55-65% CPU utilization and 96% GPU utilization. I have a 5800X CPU and a 3080 GPU, and I also had my browser and other stuff open. It was at 1080p, so that's probably why, but my point is that maybe resolution plays a big part in this.

  • @arcadealchemist
    @arcadealchemist 6 місяців тому

    Currently the new development meta is streaming assets and persistence. The graphics side is as good as it needs to be for now; the CPU work is all about moving assets around to be loaded, which a lot of these open-world games need in order to be large enough for gameplay while still allowing the caching and offloading of assets.

  • @pablolocles9382
    @pablolocles9382 9 місяців тому +6

    It's called: bad optimization.

  • @dylanzachary683
    @dylanzachary683 10 місяців тому +3

    I had a 6700K up until this year; I wore that thing out and it was great. I upgraded to a 12700K and I already feel like I need another upgrade…

  • @agenerichuman
    @agenerichuman 10 місяців тому +50

    I've been really impressed with how you've been covering this. You're one of the few people who is sounding the alarm but without being all doom and gloom.
    You're right that there is hope but you're also right that this is a problem. And it's one not being talked about enough. I feel bad for all the people building PCs now who are going to be hit with a rude awakening.
    Upgrading a CPU sucks for someone who isn't familiar with the process (or even someone with shaky hands). It's not hard, but things can very easily go wrong. Part of the push for GPUs was to take some of the load off the CPU. Seems like we're moving backwards.
    Also, I think AI upscaling is great, but I don't like the trend of using it to optimize games. Upscaling causes visual distortions. In some games it's not bad, but in others it's awful, and some people will always notice no matter how good it is. It's clear we're at a crossroads in PC gaming. I hope you're right that good developers will rise to the top and things will get better.

    • @kenhew4641
      @kenhew4641 10 місяців тому +2

      "Part of the push for GPUs was to take some of the load of the CPU. Seems like we're moving backwards."
      The GPU and CPU have different uses, and even the way they process data is different. The CPU is generic while the GPU is task-specific. If the GPU is tasked with rasterization, calculating raytracing paths, or data simulation, it does them many orders of magnitude faster than the CPU: if rendering a single frame takes an hour on the CPU, the GPU can render that scene in just one second. The problem is that games are now getting more complex and more data-intensive, with much bigger file sizes, hence needing even bigger memory that can transfer data at much higher speeds. It's not just about rendering the visual output anymore: the CPU needs to process AI behaviour and logic, simulate physics, simulate soft-body collisions like clothing and hair, and simulate particle systems like smoke, clouds or fluids, ALL at the same time in a typical gameplay session of your typical AAA game. One single fabricated chip the size of a condom pack is not going to be able to carry all that load, especially now that gaming is going beyond 1080p. It might be enough if you scale down to 720p, but new games coming out now don't even have a 720p option anymore. We're not moving backwards, we're moving ahead so fast that the data has nowhere to go to get processed, and it ends up landing on the poor, already overworked CPU's doorstep, waiting in a long queue to be processed.

    • @taxa1569
      @taxa1569 9 місяців тому

      DLAA is the best thing to come out of the release of DLSS. Upscaling is hot garbage otherwise

    • @abeidiot
      @abeidiot 9 місяців тому

      @@kenhew4641 it has nothing to do with resolution

  • @realtonaldrum
    @realtonaldrum 9 місяців тому

    What framerate analyser etc. is he using in the top left?

  • @lukiworker
    @lukiworker 9 місяців тому +1

    I agree with the points made about video editing with clean footage. I'm not certain whether AV1 encoding and decoding for video streaming can be done on CPUs too.
    There are reasons why graphics cards have become so huge despite GDDR6, and why a lack of video memory causes problems even with the latest cards in current games at high and ultra settings.
    Has the memory bandwidth (traffic) demanded by games increased so much that the memory can't keep up? Or has a memory bottleneck been reached on the GPU that nobody notices with GDDR6? Let me explain:
    I think the load distribution is being analysed a bit off. As soon as the GPU hits a bottleneck due to limited memory bandwidth, it can no longer manage the high traffic, and the work goes to the next available problem solver, the CPU. But CPUs, while they can solve any problem, can't do it quickly or in the quantities a video game demands. That could explain the low GPU loads when the memory bandwidth isn't wide enough. It comes down to an architecture design choice made years ago that looks like a mistake today.
    The consequences are performance drops, because loads are shifted to the CPU, yet magically the game doesn't crash. The architecture also brings the high power consumption inherent to GDDR6, which certainly helps explain why the cards have gotten so huge.
    How could things have looked different before RDNA? Back at Vega:
    It was cheaper to rely on proven, well-understood GDDR memory instead of setting up a new infrastructure around HBM and continuously optimising it with each generation.
    AMD hoped that HBM memory would become the de facto new industry standard; it even shipped on two generations, the R9 Fury and RX Vega, and was then dropped in the next RDNA generation in favour of GDDR6, which has exactly the same scaling issues when adding more VRAM that Nvidia cards have. GDDR can't widen its memory bandwidth much further; that's a limitation of the GDDR architecture when trying to scale up. The bottleneck is memory bandwidth, which HBM doesn't (or wouldn't) have.
    I'm not sure whether it really is due to high traffic on the GPU's memory bus. I haven't found a metric, like fps, that could measure the traffic on the GPU's memory bandwidth.
    I recommend the following sources:
    ua-cam.com/video/se9TSUfZ6i0/v-deo.html
    www.makeuseof.com/tag/high-bandwith-memory-really-need/
    ua-cam.com/video/omjOI1nZToU/v-deo.html
    Linus tech tips:
    ua-cam.com/video/ZIjUMeFCtqg/v-deo.html
    3 Klicksphilips:
    ua-cam.com/video/UphbfBPKycg/v-deo.html
    Niktek:
    ua-cam.com/video/x7R7H4-_TSc/v-deo.html

  • @ProfRoxas
    @ProfRoxas 10 місяців тому +3

    Unfortunately ray tracing is yet another bottleneck, one that doesn't show up as GPU usage even though it runs on the GPU.
    The same can be said of CPU usage for more complex NPC AI, game logic, or physics, like a destroyed object falling to pieces and calculating the physics of each of the fallen chunks. DLSS won't help there because it's the same amount of chunks regardless of your resolution, be it 4K or 240p.
    For example, if you enable V-Sync (say capping 90-100 fps down to 60 fps), your GPU usage probably falls because it has to wait for the specific timing of your monitor (just as it would have to wait for ray tracing to finish).
    Using a general-purpose game engine like Unreal or Unity doesn't necessarily mean a game is more optimised; it's more that developers don't have to implement basic functions themselves.
    Using a custom engine can improve performance, but it would take years or a decade to develop one, so we sacrifice a small amount of performance (say 10-15% fps) for a greatly simplified and faster development process.
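    A minimal illustration of the "same amount of chunks regardless of resolution" point (hypothetical numbers and names): the simulation step below does identical work whether the game renders at 240p or 4K, which is why upscaling cannot relieve a CPU-bound frame.

    #include <cstdio>
    #include <vector>

    struct Chunk { float pos[3]; float vel[3]; };

    // One physics step: the cost depends only on how many chunks exist,
    // never on the render resolution the GPU is drawing at.
    void stepPhysics(std::vector<Chunk>& chunks, float dt) {
        for (Chunk& c : chunks) {
            c.vel[1] -= 9.81f * dt;              // gravity
            for (int i = 0; i < 3; ++i)
                c.pos[i] += c.vel[i] * dt;       // integrate position
        }
    }

    int main() {
        std::vector<Chunk> debris(10'000);       // hypothetical destroyed object
        // Whether DLSS renders 960x540 or native 3840x2160, this loop is the same.
        for (int frame = 0; frame < 60; ++frame)
            stepPhysics(debris, 1.0f / 60.0f);
        std::printf("simulated %zu chunks for 60 frames\n", debris.size());
    }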

  • @PedroAraujo-vg7ml
    @PedroAraujo-vg7ml 10 місяців тому +5

    Yeah, Starfield only gets up to 90 FPS max in demanding scenes with THE BEST CPUs, like the 13900K and the 7950X3D. That's actually crazy. You can't even get consistent FPS with the best hardware you can buy right now.

    • @deadhouse3889
      @deadhouse3889 9 місяців тому

      Are you running it on a SSD? It's a requirement.

  • @kennethnash598
    @kennethnash598 9 місяців тому

    Can you try forcing the integrated GPU with max RAM usage versus the discrete GPU and see if anything changes?

  • @latvian_fallen_angel
    @latvian_fallen_angel 5 місяців тому

    Feel you man...

  • @bl4d3runn3rX
    @bl4d3runn3rX 10 місяців тому +11

    I think AM5 is a good investment. Hoping for 3 CPU gens on it. So you buy a motherboard and DDR5, which is very cheap already and you can update twice...not bad if you ask me. Interesting comparison would also be 5900x vs 7800x3d with a 3080... can you do that?

    • @onomatopoeia162003
      @onomatopoeia162003 10 місяців тому +2

      At least for AM5. You'd just have to update the UEFI, etc.

    • @Nicc93
      @Nicc93 10 місяців тому +4

      Going from a 5900X to a 5800X3D I've seen almost 50% higher average fps in some games, but it really depends on the game. Higher clock speeds will benefit some games; others will benefit from the cache.

    • @ozanozkirmizi47
      @ozanozkirmizi47 9 місяців тому +1

      Hard pass on AM5! I say that as an extremely happy AM4 user...
      I'll check out the Ryzen 8000 and Ryzen 9000 series in the future.
      I may consider building an additional system if I see components worthy of my hard-earned money.
      Until then, "Thanks! But, No Thanks!"
      I am good...

    • @SrApathy33
      @SrApathy33 9 місяців тому

      I got the 5800x3D on launch day. It was a massive upgrade over my 3600, boosting my GTX1080 in Cyberpunk with 50% more fps. The jump to a 3070Ti, which should be over twice the GPU, didn't improve Cyberpunk's fps by 50%. That worries me for the longevity of my 5800x3D which I planned on keeping several years. The massive depreciation on that CPU doesn't help it. Same depreciation as my 3070Ti btw. I could buy a Ryzen 7600/7700 platform with 32gb ddr5 for the value I lost on my 5800x3D and 3070Ti in one year.

    • @UKKNGaming
      @UKKNGaming 9 місяців тому

      ​@@SrApathy33buy a 6800XT it'll work a lot better with the 5800X3D. 3070TI is a dying GPU. 6800XT is competing against a 3080TI right now for way less money.

  • @imnotusingmyrealname4566
    @imnotusingmyrealname4566 10 місяців тому +14

    This is one of your best videos. Upscaling can't make the CPU process games faster.

    • @saricubra2867
      @saricubra2867 10 місяців тому +3

      CPUs can't brute-force bad game optimization.
      Only now, with my i7-12700K, can I get over 117 fps in Crysis 1 (also higher frames than the 5800X3D), and that's DX10.

  • @PersonaArcane
    @PersonaArcane 9 місяців тому

    You can enable hyper-threading, but this is a parallel-processing issue. Only one core can be used by a game unless it's properly programmed to run in parallel across multiple cores (which can be HARD!).
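    A minimal sketch of what "properly programmed to run in parallel" can mean in practice (simplified, not how any specific engine does it): independent entity updates are split into slices and handed to as many hardware threads as the machine reports, instead of looping over everything on one core. The hard part in real games is that most work is not this independent.

    #include <algorithm>
    #include <functional>
    #include <thread>
    #include <vector>

    struct Entity { float x = 0, y = 0; };

    void updateRange(std::vector<Entity>& e, size_t begin, size_t end) {
        for (size_t i = begin; i < end; ++i) { e[i].x += 1.0f; e[i].y += 1.0f; }
    }

    int main() {
        std::vector<Entity> entities(100'000);
        unsigned cores = std::max(1u, std::thread::hardware_concurrency());

        // Split the entity list into one slice per core. This only works because
        // each slice is independent; shared state (physics islands, render queues)
        // is exactly what makes real engines hard to parallelize.
        std::vector<std::thread> pool;
        size_t slice = entities.size() / cores;
        for (unsigned c = 0; c < cores; ++c) {
            size_t begin = c * slice;
            size_t end   = (c + 1 == cores) ? entities.size() : begin + slice;
            pool.emplace_back(updateRange, std::ref(entities), begin, end);
        }
        for (auto& t : pool) t.join();
    }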

  • @sunfirehell
    @sunfirehell 9 місяців тому

    Hey, I've got a 10 GB 3080 with a Ryzen 5 PRO 5650G, and I'm trying to find the right 240 Hz monitor. I've run benchmarks etc. to figure out whether I'd also have to upgrade the CPU, RAM, motherboard, fans... My CPU has lower performance than yours, but do you have any idea what performance I'd get at 1440p?
    Also, I've been looking around trying to find the best IPS screen with great HDR (1000) but couldn't find a flicker-free one; if you have any clues, please share.

  • @jmporkbob
    @jmporkbob 10 місяців тому +13

    One of the biggest issues is consoles, I think.
    PS4/XB1 released with a laughably pathetic cpu, even at its time-much less several years later. So with that being not only the lowest common denominator, but also kind of the central hardware of the industry, the cpu requirements of games became a non-issue for essentially the past decade.
    PS5/XSX released with a solid cpu at the time (basically underclocked 3700X) and it's still respectable a few years later. Given that they are targeting 60 (and sometimes even 30) fps on that cpu, it's going to be pretty difficult to hit very high fps on cpus from around this time period.
    Can there be more cpu optimizations done? I strongly think so. But the generational leap as we move out of the crossgen period is driving a lot of it.

    • @JaseHDX
      @JaseHDX 10 місяців тому +1

      DF made a video on the series x cpu, performs similarly to a 1700x, not even close to an underclocked 3700x

    • @jmporkbob
      @jmporkbob 10 місяців тому +3

      @@JaseHDX it's an 8 core zen 2 design, just underclocked compared to the 3700x. dunno what to tell you, it's literally that architecture lol

    • @ppsarrakis
      @ppsarrakis 10 місяців тому +1

      1700x is still like 5 times faster than xbox one cpu...@@JaseHDX

    • @mimimimeow
      @mimimimeow 10 місяців тому +2

      @@JaseHDX You can't use the DF data really, Windows OS and PC games wouldnt be tailored for that specific hardware config and so the Xbox CPU may be bottlenecked by other areas that Windows didn't utilize. It's like how Android runs like ass on an overclocked hacked Switch compared to a Shield, despite both having the same chip.

    • @ImplyDoods
      @ImplyDoods 10 місяців тому

      ​@@mimimimeowxbox's literally run windows already just with modified gui

  • @hatchetman3662
    @hatchetman3662 10 місяців тому +5

    I know I've been struggling with a 3700X. It often doesn't even max out my 2070 Super anymore. I could only imagine how bad it would be with a 3080. You pretty much hit on everything I've been preaching for the past few years.

    • @alaa341g
      @alaa341g 10 місяців тому +1

      Try maxing out the features and graphics settings that are way heavier on the GPU; at least that way you can be sure it's working at 99% XD. Fuck the modern gaming market.

    • @CurseRazer
      @CurseRazer 10 місяців тому +1

      Don't even think about it xd. My 4070 Ti with a 3800X is only working at 60-70% maximum most of the time. There are instances where it is maxed out, but very few, sadly. Don't even know what to buy next... a 5800X3D or a 5900X... looks like it doesn't matter.

    • @hatchetman3662
      @hatchetman3662 10 місяців тому +1

      @@CurseRazer Well, 3700x and 3800x don't perform much differently in games. If I can afford it, my next upgrade is gonna be a 5600x3d or 5800x3d. It should help in games, regardless. But it is disheartening to see that nothing has gotten better and there's no solutions, currently.

    • @jordanlazarus7345
      @jordanlazarus7345 9 місяців тому

      I've got a 1070 and a 3900x and even I'm not topping out my GPU in some games lol, will probably go for the 5800x3d at some point.

    • @hatchetman3662
      @hatchetman3662 9 місяців тому

      @@aqcys6165 My PC is as "optimized" as it can be without a faster processor and GPU.

  • @AcuraAddicted
    @AcuraAddicted 9 місяців тому

    IDK, my 3800X from four years ago is still chugging away. Honestly, it doesn't make much sense to upgrade to the next gen, because of how small the gain is. Even the latest Ryzen gen is barely a 30% gain over mine, and that upgrade would be astronomical, as I'll have to buy a new MB and memory.

  • @kae0196
    @kae0196 9 місяців тому

    I noticed with the latest updates it's killing it.

  • @jakubgiesler6150
    @jakubgiesler6150 9 місяців тому +12

    Game engine dev here: it's hard to utilize the full potential of a computer simply because every PC is very different and you need to deliver somewhat comparable performance on all architectures.

    • @prashantmishra9985
      @prashantmishra9985 6 місяців тому

      How to become like you?

    • @TragicGFuel
      @TragicGFuel 5 місяців тому

      I always wondered why devs can't try to detect what the CPU, GPU, and RAM are, and let the game choose the settings that will be best for that amount of processing power.

    • @jakubgiesler6150
      @jakubgiesler6150 5 місяців тому

      @@TragicGFuel Absolutely, your curiosity touches on a fundamental challenge in game development. While it's true that developers can detect the hardware specifications of a user's system, ensuring optimal performance across a vast range of configurations is more complex than it may seem.
      Firstly, there's the issue of variability within a single type of hardware. For instance, two computers with the same CPU, GPU, and RAM might still have differences in other components such as the motherboard, storage speed, and cooling systems. These variations can affect performance. Secondly, user preferences also come into play. Some gamers may prioritize graphics quality over frame rate, while others may prefer the opposite. Developing a one-size-fits-all solution that satisfies everyone's preferences is challenging. Moreover, constantly adapting game settings based on detected hardware can introduce a level of complexity that may impact the overall gaming experience. Quick adjustments during gameplay could lead to interruptions or fluctuations in performance. Despite these challenges, many developers are actively working on solutions. Some games do employ automatic detection of hardware specifications to recommend optimal settings. However, striking the right balance between customization and simplicity remains an ongoing challenge in the dynamic world of game development. In essence, while the idea of dynamically adjusting game settings based on hardware is intriguing and has been explored to some extent, achieving a perfect and seamless solution for the diverse landscape of PC configurations is a complex and ongoing endeavor.

  • @macronomicus
    @macronomicus 10 місяців тому +9

    It's good to get a sense of the required hardware up front before making a game purchase, see what others are saying, and avoid badly optimized games vocally. That could give devs some crowd support to push back on management and get budget for some proper optimizing; otherwise they're throwing away millions of potential sales.

  • @Harry_crypto_investor3634
    @Harry_crypto_investor3634 9 місяців тому

    I felt something like this a year ago when I was trying to run Minecraft with shaders: my fps was locked at 30, but my GPU wasn't even at 55% usage and my CPU was at 10%.

  • @Voklesh85
    @Voklesh85 9 місяців тому +1

    Well done and very interesting video.
    I agree with almost everything you said but in my humble opinion one important element is missing.
    Modern gaming doesn't force you to change CPU every year but you have to predict when it's time to change it. For example, until September 2022 I had an Intel 7820X 8 core CPU. I have used this CPU for almost 8 years, without problems and I even managed to use it with an Rtx 3080TI without bottlenecks at 3440x1440. But then the new generation of consoles actually began and that's when we needed to change the CPU.
    Those who own an Intel 13 or a Ryzen 7000 today will not have to change their CPU in 2024 but will certainly have to do so when the next consoles arrive regardless of the market segment of the CPU they own.
    Then another matter is having a balanced PC.
    Obviously there is always the issue of optimization, but contemporary developers spend only a small part of the budget optimizing titles on PC, partly because of the sheer number of hardware combinations in circulation.

  • @thseed7
    @thseed7 9 місяців тому +3

    I think it's good practice to develop games that are accessible to more than just the top 2-5% of gamers with the highest-end systems. Optimization is important as well. If your new features tank high-end CPUs and GPUs, they probably aren't ready yet.

  • @Lust4Machine
    @Lust4Machine 9 місяців тому +3

    While I agree there's been a trend of poorly optimized games, I don't think the requirements to run games with good graphics should stay fixed to the same components as graphics technology develops. I would also consider that the massive leaps in GPU performance may have outpaced CPU development.

    • @knockbitzhd
      @knockbitzhd 6 місяців тому

      Yeah, I'm running Fortnite at medium/high settings on DX12 with a Ryzen 7 5700X (came out a year ago) and an RTX 4060, and the GPU utilization is at 88%; no matter what I do it won't hit 99% xd

  • @Marco_My_Words
    @Marco_My_Words 5 місяців тому +1

    I purchased a CPU with 24 cores, the Intel i9-13900KF, for this specific purpose. My objective was to ensure the CPU wouldn't become a performance bottleneck, and it's achieving that goal, but it's also overheating all the time. However, investing 600 euros in a CPU seems excessive, particularly when high-end GPUs are already priced above 1500 euros. Modern games have become increasingly resource-intensive. While I appreciate the advancements and realism in games, the escalating costs are becoming really challenging to handle. The amount spent on a top-notch gaming PC could be enough to buy a decent car.

  • @Mboy556
    @Mboy556 9 місяців тому

    It's soo good that games have started to use most if not all cores for gaming!

    • @df3yt
      @df3yt 9 місяців тому

      You are right - provided it's not the typical old-school Microsoft excuse for bad coding: "just throw more hardware at it." Multithreading doesn't always mean faster.

  • @astreakaito5625
    @astreakaito5625 10 місяців тому +9

    I'm building my new 7800X3D system tomorrow, and this is why. Although you've got to remember other bottlenecks can exist: it could be a cache issue, a memory issue, a GPU memory-bandwidth issue on the RTX 4000s, and sometimes the engine itself simply can't cope due to poor code and will fail to use hardware resources for seemingly no reason. Also, if a thread is maxed, it is maxed and no amount of extra cores will help; it's impossible for that same task to jump to another thread that's free. That's why even CP2077, which is very well multi-threaded for a video game, still won't use your 5800 at a full 100%.

    • @2528drevas
      @2528drevas 10 місяців тому

      I'm skipping this generation and riding my 5800X3D and 6900XT for at least another year. I'm curious to see what AMD has up their sleeve by then.

    • @NippyNep
      @NippyNep 10 місяців тому

      bro that can last u years@@2528drevas

  • @andreabriganti5113
    @andreabriganti5113 10 місяців тому +55

    Lowering some options such as "crowd density" can help a lot in games like Spider-Man and Cyberpunk. It isn't the "ultimate" solution but it helps. EDIT: It's also worth keeping an eye on the GPU control panel to see what kinds of workloads are assigned to the GPU. I had such issues in a few games with my 5800X3D alongside my 4070 Ti, and a few adjustments solved those minor issues I encountered. Hopefully this will help. Have a good day.

    • @CollynPlayz
      @CollynPlayz 10 місяців тому +2

      What settings do I change in the Nvidia control panel?

    • @andreabriganti5113
      @andreabriganti5113 10 місяців тому +2

      @@CollynPlayz Be sure PhysX is off or, at worst, is handled by the GPU instead of the CPU, then look into "manage 3D settings" and crank up the video settings. This will NOT benefit the CPU directly but will make the GPU do more work, helping to balance the usage between GPU and CPU. After that, lower the amount of CPU workload in Windows and be sure power management is disabled. This did help me, but again, it works fine when the performance difference between CPU and GPU is minor AND the CPU is almost always at 97/98%+. If, for example, you have a Phenom and a 3090, the gap can't be helped much.

    • @richardsmith9615
      @richardsmith9615 10 місяців тому +6

      @@andreabriganti5113 Would you recommend the 5800x3d as an upgrade path from a 5600g for 1440p gaming? Or do you think it's better to hold off for a future socket instead? Currently I'm using an Arc a770

    • @tomomei
      @tomomei 10 місяців тому

      Yes, use the 5800X3D; it will give you a massive improvement in your 1% lows and make games stable. Also, the 5600G is a PCIe Gen3-only CPU, and with the 5800X3D it will be Gen4. @@richardsmith9615

    • @mv223
      @mv223 10 місяців тому +6

      @@richardsmith9615 It's worth every bit if you don't want to revamp your whole system, especially with all the issues the new processors are showing. I have the 5800x3d and the 4090, and a lot of the times it will max out the 4090. No need for anymore for a WHILE. Also, I game at 3440x1440 @ 240hz

  • @ImJustStandingHere
    @ImJustStandingHere 7 місяців тому

    Good thing I picked the best CPU on the market for my next build

  • @GmanGavin1
    @GmanGavin1 9 місяців тому

    4:10 Higher-resolution textures use more VRAM on your GPU. The only thing the CPU would be doing is managing what is stored in VRAM or not.

  • @Bsc8
    @Bsc8 10 місяців тому +6

    It's called bad optimization by devs due to marketing pressure.
    They have far too little time to deliver a good game experience!
    UE5 is a CPU eater because it's not being optimized at all, simply because it's not well understood yet (just like early UE4 games). It's like running a demo game in the engine editor, not a final release.

    • @kizurura
      @kizurura 10 місяців тому +2

      Man, Unreal Tournament 2004 looks gorgeous, and it's a technical miracle that it looked that great while running on literally anything at the time. That's optimization.

    • @mimimimeow
      @mimimimeow 10 місяців тому +2

      Most new engines are CPU eaters because a lot of the logic is handled by abstraction layers, so devs don't have to program everything manually, which is precisely the purpose of a game engine. Games are also way more complex than they were 10 years ago. It's a tradeoff: when more things are optimized manually, cost and time go through the roof, money and time that would be better spent on game content and QC that will bring more revenue. Alternatively you make a simpler game. Take Elden Ring vs Armored Core 6, for example. They both run on the same engine, but Armored Core's linear design and non-organic graphics are easier to optimize and QC, so it runs way better than Elden Ring for a given dev cycle. The reality is game companies are corporations and they all have financial targets to meet.

    • @Bsc8
      @Bsc8 10 місяців тому

      @@kizurura yes very good example!

    • @Bsc8
      @Bsc8 10 місяців тому

      @@mimimimeow I know it works like that, but making content for a game that runs like ass isn't a good thing. Let's say I'm interested in a new game and I'm thinking about buying it because I built a beast of a PC two years ago, but the game can't be played smoothly on my hardware!? I'll pass and probably never play it.
      What's the point of making good ideas/content for new games that people can't enjoy at all? That's the main reason I can't get hyped for anything anymore, and when I'm wishlisting something I always have the fear of not being able to run it.
      _(edit) The only games I play now are the older ones I still have to play from my libraries, plus whatever comes free from the stores or heavily discounted._

    • @mimimimeow
      @mimimimeow 10 місяців тому +2

      @@Bsc8 heh, ask the genius executives that make those decisions. We should have this, we should have that, because the market research says so, here's the suboptimal budget, deliver it before the next fiscal year. It's ok if it's buggy at release, as long as we hit the revenue target first. If market analysis was spot on people will put up with it anyway. Job done.

  • @nimushbimush2103
    @nimushbimush2103 9 місяців тому +5

    I think things will get better because I don't see people going out of their way to buy new hardware for games, and I'm pretty sure game devs know that. I feel like the problem COVID caused hasn't been fully resolved, because we can see devs being pushed hard and not being able to deliver well. I know there's the publisher factor in that too, but I feel like the game industry hasn't fully recovered from COVID.

    • @96dragonhunter
      @96dragonhunter 9 місяців тому +1

      The key issue is that the GPU market had barely recovered from the crypto boom when COVID struck, and generally speaking a lot of people suddenly, and until now, couldn't afford a GPU upgrade.

    • @nimushbimush2103
      @nimushbimush2103 9 місяців тому

      @@96dragonhunter agreed

    • @peterfischer2039
      @peterfischer2039 9 місяців тому

      @@96dragonhunter Also, with how the GPU market is looking right now, you should not buy a GPU right now.
      Especially if you play at a resolution higher than 1080p, because that will need 12-16 GB of VRAM soon and in some cases already does.
      But the affordable midrange GPUs don't really have that amount of VRAM, so you are better off waiting for the next GPU generation and hoping they ship with more VRAM across the board.

    • @96dragonhunter
      @96dragonhunter 9 місяців тому

      @@peterfischer2039 not a problem for me. just gonna buy rtx 4090 and i9-13900kf

  • @YOSHELF
    @YOSHELF 9 місяців тому

    Ngl, I have some serious ways to get around this, but it has a lot to do with command lines and properly optimizing your PC:
    the priority of CPU usage and GPU usage,
    and also using DX11 via command-line arguments.

  • @matsv201
    @matsv201 8 місяців тому

    Some modern games have DirectStorage. A DX12 game can use it to read data from the drive directly to the GPU with no input from the CPU at all (apart from the northbridge inside the CPU), and to decompress that data as well.
    The general game data never needs to be compressed by the CPU (unless you are saving, in some games).

    • @portman8909
      @portman8909 7 місяців тому

      "no input from the CPU at all" is not true. It will just use less of the CPU, but the CPU will still have to allocate.

  • @user-ue3qt7gv1n
    @user-ue3qt7gv1n 10 місяців тому +2

    It's slightly more complicated. The new features genuinely require that computation.
    RT requires geometry behind the camera to be processed (otherwise there's nothing to reflect), while in older games that processing was cut entirely.
    Same thing with Lumen. It is software RT, and its usefulness is highly questionable overall.
    Nanite means both more complicated geometry (probably too complicated for the games in question) and constant recalculation of LODs.
    The only possible CPU optimization is to use simpler geometry or not use these features at all. Manual LODs and baked lighting are quite optimal for both CPU and GPU.
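    A minimal sketch of the "geometry behind the camera" point (illustrative math only, hypothetical types): a rasterizer can drop everything outside the view before doing any further work, but a ray tracer has to keep those objects in its acceleration structure, because a reflected ray can still hit them.

    #include <vector>

    struct Vec3   { float x, y, z; };
    struct Object { Vec3 center; float radius; };

    // Rasterization path: a crude "behind the camera" cull. Anything with a
    // negative view-space z can be skipped entirely, saving CPU and GPU work.
    std::vector<Object> cullBehindCamera(const std::vector<Object>& scene) {
        std::vector<Object> visible;
        for (const Object& o : scene)
            if (o.center.z + o.radius > 0.0f)   // assumes the camera looks down +z
                visible.push_back(o);
        return visible;
    }

    // Ray-traced path: every object stays, because a mirror in front of the
    // camera may bounce a ray back toward geometry that was "behind" it.
    const std::vector<Object>& keepEverythingForRT(const std::vector<Object>& scene) {
        return scene;
    }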

  • @phased3941
    @phased3941 9 місяців тому +3

    The thing is, people say a card is a "1440p card". Take my GPU for example: I have a 4070 Ti that's considered a 1440p card... But playing at 1440p just leaves me with a CPU bottleneck, with the exact CPU that's in the video... I don't see a reason not to just play at 4K and use your GPU fully, likely without losing any frames.

  • @mikelee9173
    @mikelee9173 6 місяців тому +1

    Nobody is really talking about the capitalistic aspect of this. They want you to upgrade to the latest hardware, people. If a GPU/CPU combo that's 3-5 generations old can give you high fps on ultra settings, companies like AMD/Intel/Nvidia are not making money off of you. Once optimization is at its peak, they will start charging subscriptions for your CPU/GPU instead of allowing you to buy it outright.

  • @colpanerdi
    @colpanerdi 9 місяців тому

    I don't get the part about upgrading the CPU. I still have a 4930K; I've upgraded my GPU numerous times, but to this day I've never felt any need to upgrade my CPU, nor had any CPU bottleneck in games.