I'm SO SICK of this misinformation about CPU Bottlenecks

  • Published Nov 21, 2024

COMMENTS • 1.8K

  • @mikopium258
    @mikopium258 Місяць тому +685

    It's pretty funny to see that at 1440p the 4090 uses 400 watts in the pause menu of Jedi Survivor but only 300 watts in the actual game.

    • @felipedeornelas8054
      @felipedeornelas8054 Місяць тому +49

      That stupid pause-menu power draw is what's been bothering me the most since I got my 4080 S. Terrible idle power in general though.

    • @George-um2vc
      @George-um2vc Місяць тому +204

      Said it once, saying it again, if you are buying trash games like Jedi Survivor, you are part of the problem, standards people, standards.

    • @tysage1473
      @tysage1473 Місяць тому +53

      @@George-um2vc true, speak with your money, that's your only real power as a consumer

    • @George-um2vc
      @George-um2vc Місяць тому +21

      @@tysage1473 needless to say I refunded Jedi Survivor after 20 mins of owning it, sadly, most accepted the bad performance and praised Respawn.

    • @jasonzhu9742
      @jasonzhu9742 Місяць тому +29

      Leaving the game unpaused is cheaper than pausing the game

  • @dalobstah
    @dalobstah Місяць тому +796

    That performance at those specs is a crime

    • @zylo999
      @zylo999 Місяць тому +83

      Jedi Survivor has abysmal optimization that still isn't fixed. Doesn't matter if you have an i9 and 4090, there will always be stuttering in the Koboh village area. Really put me off the game to be honest.

    • @HallowedError
      @HallowedError Місяць тому +45

      My first thought was that this feels much more like a software issue than a hardware issue. It doesn't matter how expensive your hardware is if the software doesn't utilize it efficiently.
      Why are developers making games that don't run correctly with top-of-the-line current hardware? If we aren't supposed to run at 'Epic' settings, make it an opt-in toggle to enable 'Shenanigans'-level settings or whatever.
      Blaming the hardware just seems silly at this point.
      Edit: I want to make it clear that I'm happy that games are slightly future proof. But either the settings aren't used correctly or gamer culture of using the highest settings is making things seem ridiculous.
      Developers shouldn't be designing their games with these kinds of bottlenecks, In My Very Unexpert Opinion

    • @b.s.7693
      @b.s.7693 Місяць тому +2

      @@dalobstah I heard Boba Fett is already heading for the guys who are responsible for this FPS mess...

    • @dungeonsiege9439
      @dungeonsiege9439 Місяць тому +6

      Some ps2 games looked better than this

    • @gamingcomputers7485
      @gamingcomputers7485 Місяць тому

      well, that's what it is, there's nothing we can do about it, either play other games or stay away from UE5

  • @Retanaru
    @Retanaru Місяць тому +175

    When the 1% lows happen to average just above 30fps you know exactly what they optimized for.

    • @Tiasung
      @Tiasung Місяць тому +25

      Unfortunately they usually get away with it because most people don't look at 1% lows, just averages.
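
      To make the averages-vs-lows point concrete, here is a minimal sketch (with made-up frame times, not data from the video) of how average FPS and the commonly reported "1% low" are computed:

```python
# Minimal sketch: average FPS vs "1% low" from a list of frame times (ms).
# The frame times below are made-up numbers, not measurements from the video.
frame_times_ms = [8.3] * 990 + [33.0] * 10   # mostly ~120 fps with a few ~30 fps spikes

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low" as commonly reported: the average FPS of the slowest 1% of frames.
worst_1pct = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps")
# Prints roughly "average: 117 fps, 1% low: 30 fps" - the gap a bare average hides.
```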

    • @saricubra2867
      @saricubra2867 Місяць тому +3

      @@Tiasung I go beyond 1% lows, I just target frametimes.
      If I run 8 player mode with bots for Smash Ultimate on Switch emulation, the frametime graph is perfectly flat at 60fps and 16.6ms without stutters while the main CPU thread is bottlenecked at 100%.
      After 12th gen Alder Lake, there's no reason to upgrade since the IPC stagnated. Overclocking RAM or adding more cache is artificially pushing fundamentally flawed chips. (now that Arrow Lake will remove Ring Bus, expect a bigger problem).
      Also a friend has a Ryzen 9 5900X, stutters are non existent and the windows experience is smoother than Ryzen 7 chips.

    • @jasonalston8125
      @jasonalston8125 Місяць тому

      Exactly right.

    • @Kenny-yl9pc
      @Kenny-yl9pc Місяць тому +1

      I don't think that has anything to do with it. It's about the optimization for PC, not consoles...

    • @inkredebilchina9699
      @inkredebilchina9699 Місяць тому

      nothing?

  • @PixelShade
    @PixelShade Місяць тому +610

    As a game developer myself I would argue that we currently have insane levels of CPU performance. It boils down to poor game optimization. On the PC side most gamers are literally brute forcing their way to acceptable performance levels, even though the core problem lies within the "optimization" efforts during development. Question is, what happens when you buy "the best of the best" and STILL can't get good frame delivery? The best strategy is actually to call out the companies, letting them know that 30fps is not an acceptable performance target on today's hardware (including current-gen consoles)...

    • @jtnachos16
      @jtnachos16 Місяць тому +50

      Yep. Way too much of the consumer market seems bound and determined to point anywhere but at the developers and publishers with regards to the issues. I've been yelled at by people on forums for making the statement that adjusting simple game mechanics (things like changing what time of day event triggers fire on, for example) should not take literal months to change a few values, should be EXCEEDINGLY simple and easy, and that the only way it takes that long, is if the underlying code for the whole structure is haphazard. Even when I explicitly don't blame the devs, but instead their time crunch, I still get people jumping down my throat claiming it's more complicated than it is. It may be more complicated, but it really shouldn't be.
      I would be shocked if most of the current western AAA publishers had a single team amongst them that could build something like a blackjack game without massive performance bloat issues from poor code. Tends to be what happens when you push your team to work with pre-established code structures instead of letting them make new ones better suited to the task, and that's before touching on the push from publishers to replace long-term workers with per-project contracted temps.

    • @PixelShade
      @PixelShade Місяць тому +25

      @@jtnachos16 Yeah, I mean, I can go anywhere from being super efficient at coding to basically "taking forever" depending on what the underlying architecture, structure and documentation look like. In many cases deadlines, code crunch etc. result in developers coding fast but not necessarily adhering to core architectural principles or forward-thinking structure. Further down the line other developers will build features on top of that already-shaky foundation. That's when things go south in terms of optimization, because once you want to optimize the foundation at a later stage, everything built on top of it will just come crashing down...
      I am not a master coder myself, but I often get praised for creating efficient solutions. And it's really not about writing the most efficient algorithms all the time, but rather about creating good foundations while only running code when it is absolutely necessary. Another good practice is to question how often certain features need to be polled, and at what accuracy level; add those things into your workflow and you end up with code that is fairly optimized.
      A perfect example is Cyberpunk on the Steam Deck. CPU performance is great when walking around, but once you start running, you can tell that a lot of "interactive systems" are being triggered, hammering the CPU - things like the crowd interaction system, resulting in a lot of ray casting and dynamic animation blending depending on player movement. A typical example of how some of the code isn't actually running all of the time. Personally I wish I could disable those systems on the Steam Deck and instead just slide past NPCs without any added interactivity. It would look less immersive, but if we are being honest with ourselves, the NPCs are just window dressing; they are not a part of the core gameplay loop and I'd much rather take the performance/battery life. :)
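
      As a rough illustration of the "only run code when it's absolutely necessary" idea above, here's a hypothetical sketch of a crowd-interaction system that throttles its expensive checks instead of polling every frame (the class, method names and numbers are all invented for the example):

```python
# Hypothetical sketch of throttled polling: the expensive NPC-interaction pass
# runs at 10 Hz instead of every frame, and only while the player moves fast.
class CrowdInteraction:
    POLL_INTERVAL = 0.1       # seconds between expensive updates (10 Hz)
    SPEED_THRESHOLD = 2.0     # m/s; below this, skip the raycast-heavy path

    def __init__(self):
        self.time_since_poll = 0.0

    def update(self, dt, player_speed, nearby_npcs):
        self.time_since_poll += dt
        if player_speed < self.SPEED_THRESHOLD:
            return                            # player is walking: nothing to do
        if self.time_since_poll < self.POLL_INTERVAL:
            return                            # not time yet: skip this frame
        self.time_since_poll = 0.0
        for npc in nearby_npcs:
            npc.react_to_player()             # the costly part: raycasts, animation blending
```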

    • @AnalogFoundry
      @AnalogFoundry Місяць тому +6

      You're right on the money.

    • @Psychotoxic86
      @Psychotoxic86 Місяць тому +16

      Mate, thanks a lot for your comment, we need more people like you.
      Today we see how disastrous the situation is.
      For example, the UE5 engine is a total mess. Every game (mostly) has traversal stutters, and that has become normal nowadays. Mostly we are playing half-baked products.
      Look at the Silent Hill 2 Remake: an amazing game, but with a crippling amount of stutters.

    • @THU31
      @THU31 Місяць тому +15

      The problem is that CPUs have evolved significantly in terms of multi-threaded performance, but not single-threaded performance. If you compare Raptor Lake to Core 2 Duo E8000, the IPC has just about doubled, in 15 years. We have high core counts and much higher clock speeds, but the architectures themselves are not that amazing.
      And games still rely mostly on single-threaded performance. A 6C/12T CPU from the newest generation always does better than any CPU from the previous generation (putting aside X3D chips). Developers always say it's hard to parallelize things in gaming, and Unreal Engine is definitely the biggest culprit in this regard. And considering everyone is moving to UE5, it's not a good situation.

  • @SyntaxDaemon
    @SyntaxDaemon Місяць тому +320

    I have been saying this for years. You can turn down graphics settings and resolution. You can't turn down bad optimization.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro Місяць тому +12

      The problem with these situations is that there's nothing you can do to help a CPU bottleneck other than getting faster RAM or replacing the CPU. Turning down CPU-intensive settings rarely helps; in a lot of cases, even if you lower, say, the draw distance, the CPU still can't deliver the extra fps it would get anyway.

    • @SyntaxDaemon
      @SyntaxDaemon Місяць тому

      @@mttrashcan-bg1ro Exactly

    • @laitinlok1
      @laitinlok1 Місяць тому

      DXVK and VKD3D

    • @saricubra2867
      @saricubra2867 Місяць тому

      @@laitinlok1 That, and buying Intel (Ring Bus) with usually higher IPC and more cores. Or a Ryzen 9 7900X and beyond.
      I have a 12700K; with DDR5 5200 I would easily get 80fps in Hogsmeade in Hogwarts Legacy.
      The 7800X3D is not that fast, it's still 8000 points on 3DMark; my 12700K reaches 10000 points.
      I always noticed that X3D chips look nice on the averages and max fps but the frametimes are kinda bad (?).

    • @SyntaxDaemon
      @SyntaxDaemon Місяць тому

      @@saricubra2867 Vulkan layers only improve performance in very specific and limited ways and are not able to fix a game that is overly broken. 3DMark heavily favors cores, much more than games do, so it's not a great data point. The 7800X3D does great with frametimes, often even better than something like a 12900K. Smaller cache and the growing ring bus latency are actually sources of frametime issues in their own right. Intel does not have an IPC lead; they have been trying to make up for it with higher clocks. I have a 7800X3D and also get excellent performance in Hogsmeade.
      Almost everything you said is either misleading or outright wrong, and I would encourage you to be more objective and thorough in your research in the future.

  • @heatnup
    @heatnup Місяць тому +155

    No amount of CPU processing power or GPU processing power will ever be able to overcome bad software. Hardware is software reliant and is essentially complex sand castles without software to tell it what to do.

    • @roklaca3138
      @roklaca3138 Місяць тому +1

      Tell that to the snobbish PCMR crowd

    • @Owen-np3wf
      @Owen-np3wf Місяць тому +1

      @@roklaca3138 PCMR is a meme for children and basement dwellers. Your average pc gamer doesn't even know what's in his pc let alone knowing or caring about bottlenecks or software issues as long as it makes the game go.

    • @roklaca3138
      @roklaca3138 Місяць тому +3

      @@Owen-np3wf but they feel the performance drop on lesser CPUs, you cannot deny that. No one can prove to me you need an $800 CPU to get high frames... the Doom series proved that

    • @saricubra2867
      @saricubra2867 Місяць тому

      @@roklaca3138 My 12700K can easily pull above 80fps in Hogwarts Legacy's Hogsmeade.
      With 12 cores and 20 threads, a monolithic Ring Bus and high IPC, what stuttering?

    • @Varil92
      @Varil92 Місяць тому +1

      Exactly

  • @konstantinlozev2272
    @konstantinlozev2272 Місяць тому +138

    Solution: don't play broken games.
    Maybe devs would learn how to optimise games.

    • @jagildown
      @jagildown Місяць тому +8

      🤝my man

    • @yumri4
      @yumri4 Місяць тому +1

      The big problem is that when you have to optimize for the lowest common denominator, hardware better than that tends to suffer. It shouldn't, as better hardware should run it better, but depending on how the optimization is done it can. Now how did people on PC get past this before? They just had hardware so far above the lowest that it didn't matter. I do think always above 60fps is a good target to aim for, as you have to draw the line somewhere. Yes, some consumers will have monitors able to do more, but most, knowingly or unknowingly, will have a 60Hz monitor or won't have set it in settings to run above 60Hz even if it is able to. So making the game look worse so a relatively few people can run their game at 120fps is not a good model. Having it run at 60fps in action scenes on the majority of target systems is the model to go with. Going by the September Steam hardware survey, most only have a 1080p monitor, at 55%, with 1440p at 20% of the survey; which do you think will be the target, knowing that 1440p can just run 1080p with most not noticing? The majority will aim for 1080p at 60fps as it is still the most common by a lot. Do you have that? As you are here, most likely not, nor do I, but we are in the minority, not the majority, and publishers want to sell as many copies as possible, so they optimize for the majority of systems.

    • @bjarnis
      @bjarnis Місяць тому +4

      @@yumri4 the reality is that the vast majority of PC gamers are on hardware around an RTX 3060/RX 6600 XT and mid-range CPUs. That's what developers should optimize games for, or else we will just keep playing older games.

    • @konstantinlozev2272
      @konstantinlozev2272 Місяць тому

      @@bjarnis that has been like, forever. The 60s cards have sold most.
      I guess Devs expect Nvidia to offer increasingly faster 60s cards.
      And instead we got stagnation.

    • @bjarnis
      @bjarnis Місяць тому +3

      @@konstantinlozev2272 what devs don't understand is that most of us don't care about "pretty graphics"; gameplay and performance are what matter. Native 1080p is still king and it doesn't matter how much "DLSS, FSR and frame gen" they advertise, we just want a crystal clear image with no artifacts when running around.

  • @Pawnband
    @Pawnband Місяць тому +479

    Shame on you guys for expecting devs to optimize their games. Just upgrade your CPU!

    • @arc00ta
      @arc00ta Місяць тому +51

      alright todd settle down

    • @hackintosh3899
      @hackintosh3899 Місяць тому +8

      You can't optimize the RT API as a game dev. You work with what you have and it's garbage. Turn on RT and expect stutter, or MUCH lower lows; to not notice the stutter you need a locked fps near those lows. A better CPU will give you better lows to work from.

    • @SageBladeG
      @SageBladeG Місяць тому +7

      Mfs be running a 40s series card on a potato cpu

    • @lukeludwig1055
      @lukeludwig1055 Місяць тому +30

      @sagebladeg like he said in the vid, he's literally using the fastest CPU available

    • @jeremysumpter8939
      @jeremysumpter8939 Місяць тому +20

      People like you have brainworms. You guys are always in the comments... screaming everything is unoptimized while knowing nothing about game development. Y'all expect Cyberpunk at 4K with full PT to run on an AMD dual core and a GTX 1070... it doesn't work like that.

  • @photonboy999
    @photonboy999 Місяць тому +181

    *"CPU Usage" confuses people...*
    Many people think that if a CPU shows "50%" usage you can't be CPU bottlenecked, because they simply don't understand enough about computers. You always go by "GPU usage" because, generally speaking, GPU code can be extremely parallel, so close to 100% usage, to oversimplify, means the GPU is fully saturated... and if it shows 50% (at maximum GPU frequency) then you're using about HALF of its potential and are thus bottlenecked by the CPU (assuming no software FPS cap is causing this).
    So... if you had a game whose code couldn't be split across threads and thus could only run on one core at a time, you could only use (1/x)*100% of the CPU. So if it was a 4c/4t CPU then in this scenario you could only use 25% of the CPU yet could still be bottlenecking the graphics card.
    *TLDR*
    GPU USAGE near 100% means a GPU bottleneck.
    GPU Usage below 100% means a CPU bottleneck.
    CPU USAGE near 100% means a CPU bottleneck.
    CPU Usage below 100% can't tell you where the bottleneck is.
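
    As a rough sketch, that rule of thumb could look like the code below; the thresholds are arbitrary, and the key detail is that the CPU side has to be judged per core/thread, not by the overall percentage:

```python
# Heuristic sketch of the TLDR above. Inputs are whatever your overlay reports:
# overall GPU usage, per-core CPU usage, and whether an fps cap is active.
def guess_bottleneck(gpu_usage_pct, cpu_core_usages_pct, fps_capped=False):
    if fps_capped:
        return "fps cap (neither part fully loaded, by design)"
    if gpu_usage_pct >= 95:
        return "GPU bound"
    # GPU has headroom, so check whether any single core/thread is pegged:
    # one maxed thread can starve the GPU even at low *overall* CPU usage.
    if max(cpu_core_usages_pct) >= 95:
        return "CPU bound (likely a saturated main or render thread)"
    return "GPU is waiting on something CPU-side (memory latency, scheduling, engine)"

# Example: GPU at 60%, eight cores with one thread pinned near 100%.
print(guess_bottleneck(60, [99, 40, 35, 30, 22, 20, 15, 10]))
```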

    • @Gofr5
      @Gofr5 Місяць тому +7

      Very enlightening, thanks.

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +9

      It could also be a memory bottleneck, but that's not typically too much of an issue. It's not just the CPU and GPU contributing to the performance.

    • @C-M-E
      @C-M-E Місяць тому +5

      If only Bernoulli's principle applied to processing power...

    • @LiveType
      @LiveType Місяць тому +2

      On the last one, CPU usage below 100% usually means there's a latency (usually memory) bottleneck somewhere in the chain. That's why cranking memory latency lower makes the fps number go up despite bandwidth staying the same. That's also the reason the X3D V-Cache AMD chips are "the best gaming CPUs": they have enough L3 cache to lower average memory latency by a meaningful amount.

    • @Gamevet
      @Gamevet Місяць тому +1

      @@LiveType Shaders. A lot of the more modern games have to compile shaders. That's why we see sudden judders as he's playing the game. It's not really a good take on CPU usage here.

  • @Zubie2000
    @Zubie2000 Місяць тому +65

    Intel's PresentMon utility and its "GPU Wait" metric are a great way to show how much time the GPU is spending waiting for the CPU.
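
    For anyone curious what that looks like outside the overlay, here is a minimal sketch of summarizing a PresentMon capture; the CSV column names ("GPUWait", "FrameTime") vary between PresentMon versions, so treat them as placeholders and check your capture's header:

```python
# Sketch: estimate what share of total frame time the GPU spent waiting on the CPU,
# from a PresentMon CSV capture. Column names differ between PresentMon versions;
# "GPUWait" and "FrameTime" here are placeholders - check your capture's header.
import csv

def gpu_wait_share(csv_path, wait_col="GPUWait", frame_col="FrameTime"):
    wait_ms, frame_ms = 0.0, 0.0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            wait_ms += float(row[wait_col])
            frame_ms += float(row[frame_col])
    return wait_ms / frame_ms if frame_ms else 0.0

# Usage (hypothetical file name):
# print(f"GPU waited {gpu_wait_share('capture.csv'):.0%} of total frame time")
```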

    • @andersjjensen
      @andersjjensen Місяць тому +7

      If the GPU is only 60% utilized I don't need a tool to tell me that it's waiting 40% of the time....

    • @markl1671
      @markl1671 Місяць тому +32

      @@andersjjensen- That’s not how it works.

    • @ArdaSReal
      @ArdaSReal Місяць тому +11

      @@andersjjensen that's just not what that means lol

    • @andersjjensen
      @andersjjensen Місяць тому +6

      @@ArdaSReal I'm perfectly aware that there's more to it than that if you want to get nerdy with frame pacing and dispatch calls. But the net result is about the same. PresentMon is a useful tool for developers as it allows them to see exactly what is happening, and when, during the entire frame rendering. But it doesn't help me much to know that the majority of the waiting happens between the geometry upload and the shader code dispatch.

    • @FiestaKing36
      @FiestaKing36 Місяць тому

      The data, graphs and charts in the Special K OSD and widgets are great too

  • @hjyryui
    @hjyryui Місяць тому +313

    a modern game being bottlenecked to 60 fps in some areas by a 7800x3d is just sad

    • @danielowentech
      @danielowentech  Місяць тому +119

      I'm not saying it isn't sad, but it can be somewhat explained by game devs that target 30fps on PS5. That means a CPU with double the gaming performance will only be around 60fps.

    • @Jasontvnd9
      @Jasontvnd9 Місяць тому +67

      ​@JayzBeerz and yet the fastest gaming cpu is an 8 core.....
      It's not the core count that matters.... Game engines only have so much parallelization that can be done , Certainly having enough cores is important but 8 cores is very much enough.
      Typically either the main thread or the render thread is the one holding everything else back.
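
      The "main thread holds everything back" point is essentially Amdahl's law; a tiny sketch with made-up numbers shows why adding cores stops helping:

```python
# Amdahl's law sketch: if the main/render-thread work can't be split up,
# extra cores only speed up the parallel share of the frame.
def max_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical frame where 40% of the CPU work is stuck on one thread:
for cores in (4, 8, 16, 32):
    print(cores, "cores ->", round(max_speedup(0.40, cores), 2), "x")
# Output climbs from ~1.8x to only ~2.4x and flattens: past 8 cores the serial
# 40% dominates, which is why "8 cores is very much enough" for most games.
```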

    • @Gornius
      @Gornius Місяць тому +51

      @@JayzBeerz Do you even know what you're talking about, mate? Dividing the game loop between more threads is just not possible most of the time. You can even see that the CPU usage during the parts where there is a CPU bottleneck is barely hitting 50%; half of the cores are doing nothing. The 7800X3D is the best gaming CPU there is right now because it has lots of cache, which actually improves performance by making the CPU not need to fetch data from RAM as often, because it can keep more of it in its own cache, which is 40+ times faster than fetching from RAM.
      You can even see the 7900X3D is not as fast as the 7800X3D, simply because the 7900X3D has only 6 3D V-Cache enabled cores, while the 7800X3D has all of them, which is 8.

    • @battlephenom8508
      @battlephenom8508 Місяць тому +8

      @@Gornius My 13900K is faster in Hogwarts Legacy. It is OC'd and the DDR5 has tight timings, 7200 MT/s CL32. So there's more to performance than just more cache. Some programs prefer core speed and some games can't fit all of their data into the 7800X3D's cache and have to go fetch it from RAM. In these scenarios it's easy to see a 13th/14th gen CPU be faster with its better DDR5 support.

    • @xoso599
      @xoso599 Місяць тому +1

      @@Gornius Aren't half the cores doing nothing because the CPU is parking those cores because they don't have access to the strap on cache? And wasn't the lack of handoff to the right cores a huge bottleneck for a while because CPU cache should be invisible to all software and AMD just dropped the ball on having the OS use the right cores?

  • @SurviveOnlyStrong
    @SurviveOnlyStrong Місяць тому +16

    There's nothing really happening in this scene to justify a 7800X3D dropping to 60fps.
    CPU bottlenecks can happen when many characters are spawned, or many objects are calculating physics, many AIs are calculating paths, etc.
    But this game area is literally empty, and it looks like it's doing some heavy, unnecessary calculations all the time on the main/game thread, which affects fps and smoothness so much.
    It's really a game problem here, not the CPU.
    You might see drops to 90 fps in this game even on a CPU from 2034; that wouldn't mean that we need a better CPU to be released...
    I presume in this particular scene all the interiors and characters inside buildings are spawned at the same time with zero optimization applied to them.

  • @HallowedError
    @HallowedError Місяць тому +32

    I don't like moving the issue to hardware. The devs designed these games with current and past hardware in mind and shouldn't expect consumers to need future hardware to make their game run correctly. I just find this to be completely ridiculous. Why are we, the consumers, moving the goalpost for a game to run well 2, 3, 4+ years after release. They DESIGNED THIS FOR CONSOLES and it doesn't run well on $3000 rigs.
    I'm just flabbergasted and feel like I'm an old man yelling at clouds with how this seems to be a perpetual and immovable issue. When the game was in development, the hardware they had was a generation or two behind what's out when it actually releases, so HOW does it not run well on release-day new hardware?

    • @TheTeacher80
      @TheTeacher80 Місяць тому

      But it's ok for hardware to stagnate, at the same or worse prices? Devs cannot do the same thing? Let's put the blame on everyone.

  • @LiterallyLozyl
    @LiterallyLozyl Місяць тому +27

    We DON'T need faster cpus. We need game developers who aren't lazy and so profit driven

    • @CeceliPS3
      @CeceliPS3 Місяць тому

      We don't?? I've heard that the percentage number for CPU usage is actually not meaningful because it's overall usage across all cores. Games are (probably) using 99% of a few cores and not actually utilizing all cores. Maybe engines don't work like that? So if this is correct, we actually do need faster CPUs.

    • @bricaaron3978
      @bricaaron3978 27 днів тому

      It's greed and Leftism.

  • @chiyolate
    @chiyolate Місяць тому +77

    11:58 I don't think you should be disappointed with AMD or Intel, because they can't just magically release a 7GHz cpu then 8, 9, 10GHz and so on. It's the game developers that you should be disappointed with, for designing a game that can't go beyond 60fps on the best gaming CPU on the market.

    • @MrHorst38
      @MrHorst38 Місяць тому +5

      It's not about general disappointment. It's just sad that a new cpu generation brings almost no performance improvements to the table. Zen 5 and arrow lake are mainly about power efficiency and that's not enough.

    • @RobFisherUK
      @RobFisherUK Місяць тому +1

      It might also be game engines and device drivers. Would be interesting to see which graphics settings most affect CPU usage. (Edit: also operating system stuff like the scheduler...)

    • @gerooq
      @gerooq Місяць тому

      Until Sony and Microsoft release consoles that are able to run these AAA CPU-bottlenecked games smoothly at a cap of 60fps or higher, I think we will always see CPUs bottlenecking our systems

    • @nostrum6410
      @nostrum6410 Місяць тому

      This "sweet spot" being really low am5 systems is the biggest problem

    • @bricaaron3978
      @bricaaron3978 27 днів тому +1

      @@gerooq *"Until sony and microsoft release consoles that are able to run these AAA cpu bottlenecked games smoothly at a cap of 60fps or higher, i think we will always see cpus be bottlenecking out systems"*
      That's incorrect. A 4770K from 2013 is at least 25% more powerful than an XBox Series X. A modern CPU is multiples faster than a 4770K. The problem is:
      1. Ever since the Great Consolization of 2008, AAA engines/games are coded for console HW. Console architecture is fundamentally different from PC architecture --- yes, even the PS5 and XBox Series X/S. Consoles use a shared memory architecture and further have _much_ less RAM than a mid-range gaming PC, to say nothing of the top-end.
      Series X: 13GB (Shared between CPU & GPU --- 3GB is reserved for system processes)
      Mid-range PC: 28GB (16GB RAM + 12GB VRAM)
      Top-end PC: 56GB+ (32GB+ RAM + 24GB VRAM)
      Disk/SSD streaming and DirectStorage are designed to compensate for a lack of RAM. PCs don't need this. A PC needs games that are coded to properly use the large amounts of _separate_ RAM and VRAM. Talking about optimization is a moot point, because the games aren't even properly _coded_ to begin with.
      2. As many others have touched upon, modern studios are very poorly managed. So now, not even the _console_ code is properly written, to say nothing of optimization. Until companies begin hiring based upon merit again instead of "social justice" agendas this ugly situation will continue to fester.

  • @i11usiveman97
    @i11usiveman97 Місяць тому +52

    What this also shows is game developers need to change how they're making games because there's no way that these games should be putting this sort of strain on the fastest current gaming CPU.

  • @notatechguy1209
    @notatechguy1209 Місяць тому +132

    I'm sorry, I'm not disagreeing with your testing and results but this speaks more to poor game optimization than just a CPU bottleneck. Some games are just designed poorly and will stutter no matter how powerful the hardware is. I've never been a fan of just throwing more money or more power at a problem. I remember the Nvidia Fermi days of GPUs.

    • @19CD91
      @19CD91 Місяць тому +10

      I remember playing Fallout 4 at 12% CPU, 40% GPU, sub-60fps. These games are def not designed well.

    • @b.s.7693
      @b.s.7693 Місяць тому +3

      Exactly

    • @Gyniany
      @Gyniany Місяць тому +5

      I agree, but games are going to continue getting worse when it comes to optimization.

    • @mryellow6918
      @mryellow6918 Місяць тому +1

      the thing is, a CPU isn't just built for games; there are large parts of a CPU that just won't be used.

    • @Spaceman69420
      @Spaceman69420 Місяць тому

      it's only going to get worse next year and in the near future. I'm always buying the best CPU available when I build a new PC now, idc.

  • @Ayliar
    @Ayliar Місяць тому +23

    There will always be some sort of bottleneck somewhere. Doesn’t have to be CPU or GPU, but there will be one

    • @OptimalMayhem
      @OptimalMayhem Місяць тому +13

      Yeah but if you’re gaming you usually want that to be your GPU.

    • @BlackJesus8463
      @BlackJesus8463 Місяць тому

      @@OptimalMayhem They should've had you make the game.

    • @bricaaron3978
      @bricaaron3978 27 днів тому

      *"There will always be some sort of bottleneck somewhere."*
      That's not true. If you use VSync, as you should, you will rarely have either CPU or GPU maxed. The goal is to have a stable, non-fluctuating framerate --- the highest stable, non-fluctuating framerate you can achieve given your HW and game settings.
      You don't ever want either your CPU or GPU to be maxed out (i.e. bottlenecking). That means there is no headroom, and thus there cannot be a stable framerate.

  • @Varil92
    @Varil92 Місяць тому +13

    So basically, to mitigate the incompetence of today's game developers at optimizing performance, we should upgrade even a beast like the 7800X3D? This is crazy. I'm gonna play older games instead if this is the trend.
    EDIT: typos

    • @bricaaron3978
      @bricaaron3978 27 днів тому +1

      I don't know why you wouldn't want to play older games as a matter of course! I play games all the way back to the 2D Adventure games of the '90s and '80s, and further back to the console/arcade games and Text Adventures of the '80s and '70s.
      But the AAA games from ~1998 to 2011 are some of the best video games ever made. _Especially_ the AAA PC games from ~2003 to 2007.
      The reason I say this is that unlike some older games which are beloved by many, but do not stand the test of time due to gameplay and controls that were in a state of experimentation, these games are older graphically but absolutely hold up to modern scrutiny, and in many ways are _better_ than modern games --- atmosphere, characterization, and most importantly gameplay.

  • @BaieDesBaies
    @BaieDesBaies Місяць тому +8

    There's no valid reason these games don't get more framerate with such a good CPU. No huge number of physics items being calculated, no enormous armies that need individual calculation like in Total War, yet the framerate can only go up to 60-ish or 100-ish FPS. That's absurd.

  • @teddyholiday8038
    @teddyholiday8038 Місяць тому +132

    With shoddy optimization becoming more frequent, and many open world games just crushing CPUs (especially UE games), CPU is becoming just as vital as GPU for high end gaming

    • @JoaoBatista-yq4ml
      @JoaoBatista-yq4ml Місяць тому +7

      I'm wondering if from now on I will have to get a much beefier CPU "just in case", even though 99% of the games will run just fine on a cheaper CPU

    • @teddyholiday8038
      @teddyholiday8038 Місяць тому +17

      @@JoaoBatista-yq4ml yeah. I'm on a 5800X3D still and it’s mostly great, but I'm running into more scenarios where it gets stomped by terrible optimization or overly ambitious scope. When I upgrade, it will be to the best X3D chip available (or Intel equivalent)

    • @thethiccdonut5257
      @thethiccdonut5257 Місяць тому +2

      @@JoaoBatista-yq4ml Ye, it seems the majority of the culprits are AAA games (no surprise) running on UE5 and sometimes UE4.

    • @KeiGGR
      @KeiGGR Місяць тому +2

      We can say we are still lucky CPUs are not getting insanely high price increases each generation like GPU's (or Nvidia specifically)

    • @mikopium258
      @mikopium258 Місяць тому +17

      You don't have to buy a badly optimized game though. If people aren't buying the game then they'll have to optimize it better. More games are pushing CPUs harder now, but you shouldn't feel pressured to shell out money for a new CPU for an unoptimized game. You'll only be encouraging devs to keep making games that way.

  • @Games_and_Tech
    @Games_and_Tech 12 днів тому +2

    Once I heard Tech from the channel Teach Deal explain things in a way that's easy for people. It was something like this: your CPU is creating/generating/rendering all the structures you see in games - all floors, walls, objects and characters; the CPU gives all of them their structure, and your GPU is just painting everything to make it look nice. So, if your GPU is NOT at 100% usage it means you are bottlenecked by something else in your system (CPU/RAM/SSD/software); usually it's your CPU that can't generate enough structures, and your GPU is just waiting for them to get "painted"...

  • @thethiccdonut5257
    @thethiccdonut5257 Місяць тому +65

    I agree with your points but I also think this is heavily modern developers' fault for having horrible optimization. Jedi Survivor doesn't look bad by any means, but there is no reason the strongest CPU and GPU combo on the market cannot run the game at max settings at 1440p at the bare minimum of a steady 100+ fps. Time and time again we see these current gen games releasing with piss poor optimization and it's getting annoying to the point that I don't even feel like playing those games until a gen later for better performance. Hell, I barely got into Cyberpunk and GTA V so I could run them at 4K (without RT) with high FPS on my new build.

    • @mannydcbianco
      @mannydcbianco Місяць тому +10

      100% this. This is an optimization issue much more so than it should be a CPU issue. But I fear that this is the new normal, shoddy and horrible optimization is here to stay because they will just shrug their shoulders and say "use upscaling" or "use frame gen" or "get a faster computer" instead of paying a team of devs hundreds of thousands of dollars to optimize the code.

    • @Pwnag3Inc
      @Pwnag3Inc Місяць тому +1

      Yeah man. Nvidia and AMD introduced all this new tech that is shifting the work from all-GPU to more of a 50/50 load.
      CPUs never needed this in the past and your performance was more dependent on the GPU.
      Now they have lessened the load on the GPU, and that will allow them to hold back on GPU development.
      This is my theory. I have not seen/heard this anywhere else.
      If I am wrong, I am wrong.

    • @Kryptic1046
      @Kryptic1046 Місяць тому +6

      It's because a lot of modern devs are inexperienced because studios don't want to pay experienced devs what they're worth, and they don't want to pay to train the newer devs either, so as a result, we get rushed, unoptimized slop that barely runs on the fastest hardware available. Then we get to wait for 40 patches to roll through before the game runs like it should've upon release, meanwhile some games never truly get fixed.

    • @MrBeetsGaming
      @MrBeetsGaming Місяць тому +5

      "No reason the strongest CPU GPU combo cannot run it max settings at 1440p 100+ fps" based on what exactly? Do you have some technical explanation or are you just pulling numbers out of nowhere based on how "good" you think the game looks? I find it weird that people who have zero experience in game development will make comments like this with such confidence.... PC gaming always has games that push past what even the current best hardware can run maxed out, it isn't a new thing....

    • @MrBeetsGaming
      @MrBeetsGaming Місяць тому +4

      @@Kryptic1046 That is complete nonsense.....

  • @mannydcbianco
    @mannydcbianco Місяць тому +7

    It is very game dependent, too. In Red Dead Online I went from 100 FPS at 4K with my 7900 XT and a 7700, to 101 FPS when I swapped the 7700 out for a 7800X3D. Literally 1% with the exact same settings. That's it.

    • @andersjjensen
      @andersjjensen Місяць тому +5

      But what's your GPU utilization? If that was already at 100% (which sounds rather plausible for a 7900XT at 4k) then a faster CPU helps bugger all.

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому

      Not all games will yield the same benefits. Many older games often don't even use all of your cores. It's not uncommon for many games from the early and mid 8th gen to really only use 4 cores

    • @thebaffman4898
      @thebaffman4898 Місяць тому +1

      RDR 2 in general is very optimized on the CPU front, even more than GTA V, so your results don't surprise me at all especially if you play at 4k. I think it's the least CPU intensive open world game I've ever played (also the fact that it can run on an old ass mechanical hard drive with zero issues is still a miracle for me).

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +1

      @@thebaffman4898 not to mention GTA 5 also falls apart when the frame rate hits 180fps

    • @Anon1370
      @Anon1370 Місяць тому

      I feel bad for you doing that... I've ended up that way in the past, where I upgraded a part only to be at the same fps. It sucks so bad; CPU and GPU are never gonna be equal... it looks like the GPU will always have the CPU bottlenecking it on high-end gear. We always seem to have problems with our PC gear...

  • @Fantomas24ARM
    @Fantomas24ARM Місяць тому +10

    Dragon's Dogma 2 is also heavily CPU bound in cities.

    • @BlackJesus8463
      @BlackJesus8463 Місяць тому

      People do not accept CPU limitations. 🤣🤣🤣

  • @Stragus_Macleod
    @Stragus_Macleod Місяць тому

    I remember an older video you did about bottlenecking... I think you were also talking about resolutions, and it blew my mind because I totally had not thought about how, since we're using lower resolutions, we're actually gaming at those lower resolutions, so CPU bottlenecking would definitely be a thing. Thank you for being an American hero!

  • @vindeiatrix
    @vindeiatrix Місяць тому +4

    I was one of those who said it doesn’t matter at 4k and I’m really glad someone finally made a clear video about this. I see it now. But what’s confusing me is that I don’t see CPU intensive things happening in these games you demo’d. They aren’t even simulation games. Makes me think developers are used to cutting corners on CPU optimization and that this is fixable with better coding.

    • @stonythewoke9921
      @stonythewoke9921 Місяць тому

      thumbs up for choosing not to be ignorant. I don't know the showcased games very well, but especially in RPGs a lot of CPU-intensive work can be done in the background without you really seeing it. For example, the logic of NPCs can be very elaborate, regularly checking various variables like the player's level and various skill levels, how they relate to various variables of every single NPC, and how they are supposed to react - like aggro range and other behaviour. Also stuff like what the NPC is doing: is it just running down a scripted path, or is there a realistic simulation of what the NPC is doing, like going to and from work depending on the time, and things like that. NPCs all have some algorithm that makes them choose their path and avoid obstacles, including the player and other NPCs.
      Additionally, a lot of graphics-related work can also be demanding on the CPU. Then you have various game assets constantly being loaded in the background, for example scripted events that trigger as soon as you cross a certain point or perform a specific action. There are tons of other things that can happen in the background without you seeing anything on screen. Without knowing the source code of a specific game it's really impossible to tell what the exact reason for frametime spikes is.

    • @Ottomanmint
      @Ottomanmint Місяць тому

      The game is only using 1 or 2 CPU threads at the highest clocks due to the way the software had to be written! Not many games are good at multi-threading 😊

    • @jose131991
      @jose131991 Місяць тому

      @@Ottomanmint why did it have to be written like that? UE4?

    • @Ottomanmint
      @Ottomanmint Місяць тому

      @@jose131991 Not sure where I wrote that, but sounds like Unreal Engine 4 to me.

  • @JukkaX
    @JukkaX Місяць тому +1

    Excellent video again, thank u.

  • @hamzaalhassani4154
    @hamzaalhassani4154 Місяць тому +6

    it's definitely because of how monitoring software shows CPU usage, and how games are reliant on single-core performance in most cases.

  • @QubaLG-n1o
    @QubaLG-n1o Місяць тому

    Thanks for this video. Even most enthusiasts aren't educated when it comes to the importance of CPU performance in gaming, it's really tiresome to argue against ignorance.

  •  Місяць тому +98

    God forbid these devs optimize their games. Much better to make i9/Ryzen 9 CPU the minimum requirement!

    • @heatnup
      @heatnup Місяць тому +1

      That's the intent.

    • @soniablanche5672
      @soniablanche5672 Місяць тому +2

      game devs and gpu makers are colluding with each other to force gamers to buy unnecessarily more powerful hardware

    • @heatnup
      @heatnup Місяць тому

      @@soniablanche5672 True

    • @terrylaze6247
      @terrylaze6247 Місяць тому +1

      The games are optimized for the ps4/ps5 which is why they run fine on lower PCs as well, demanding that every game should be capable of maxing out any CPU and/or GPU is hilarious.
      "Cinematic" 30FPS is what devs are after and you are trying to run this 30FPS game at 300FPS.

    • @heatnup
      @heatnup Місяць тому +3

      @@terrylaze6247 The Devs can take their "cinematic" performance targets and shove it. Games are for the players not the devs. If you design a game with no intent to maximize your player's experience then you shouldn't be a game dev.

  • @prsoon
    @prsoon Місяць тому

    Good video man. I always wondered if better CPUs would still get these types of stutters in these EXACT types of situations, and it's actually really hard to find footage of it, definitely in 4K.

  • @valentinosgsxr
    @valentinosgsxr Місяць тому +9

    Easy, there is always something to bottleneck every system, that's an undeniable fact. The way to play games on a PC is to lock the frames where the system is comfortable working, relax and enjoy.

    • @BlackJesus8463
      @BlackJesus8463 Місяць тому

      They should fix those lows tho

    • @bdha8333
      @bdha8333 29 днів тому

      Locking fps doesn't work on a CPU bottleneck, it only affects the GPU

    • @valentinosgsxr
      @valentinosgsxr 29 днів тому

      @@bdha8333 I don't know what you're talking about. When the frames are locked, the frame limit becomes the bottleneck.

  • @seeibe
    @seeibe Місяць тому +1

    You should create a poll one day if any of your audience actually plays the game you showcase, because every time you pull up a "modern" game to showcase some point about hardware limitations, it is something I would never dream of playing. These days I play mostly retro and indie games because modern AAA feels like it's moving backwards, but maybe that's just me.

    • @stonythewoke9921
      @stonythewoke9921 Місяць тому

      if you play retro and indie games you usually dont need any good hardware. there is no point in benchmarking a game that runs smoothly on 20 year old hardware.

    • @seeibe
      @seeibe Місяць тому

      @stonythewoke9921 Not really true. Take Pools for example, which is a very small indie game, but it has a focus on realistic rendering so it gives my 4090 a run for its money. Or Minecraft with shaders and other mods. Or any of the nvidia mods like Portal RTX.

    • @stonythewoke9921
      @stonythewoke9921 Місяць тому

      @@seeibe I am not sure what you are trying to say, are you confused why pools does not require a strong cpu compared to hogwarts or jedi?

    • @seeibe
      @seeibe Місяць тому

      @stonythewoke9921 No I'm saying from my perspective it seems artificial to pick out these games to make a point just because they're new. It's a fallacy to assume if someone buys the best hardware it's to play the latest AAA titles. And without a poll there's no telling what his audience is actually playing. Kind of like how compared to the general public, a much higher percentage of people who watch this channel use AMD.

    • @stonythewoke9921
      @stonythewoke9921 Місяць тому +1

      @@seeibe there is nothing artificial, arbitrary or anything about the games he picked. he simply picked games that prove the comments about cpus being fast enough wrong. if you think playing popular games is not a realistic scenario, I can't help you.

  • @dawienel1142
    @dawienel1142 Місяць тому +8

    Try without RT as well, I'm also starting to think that RT is hitting some weird bottlenecks that are not necessarily CPU related.
    Just think about it for a second, if RT is to blame for most of your issues then arguably it's the GPU's fault since its the component responsible for RT.
    Is it offloading too much onto the CPU? Yeah, maybe, but how do we know that the RT cores aren't the part that's bottlenecking the system, and that what you are seeing is rasterised performance left on the table?
    We kinda had this issue all the way from the 20 series, Nvidia could be processing RT inefficiently vs rasterised or the RT was badly implemented in the game causing this and everyone would blame the cpu and/or game anyways.
    One reason why I stay away from RT like the plague, since I get spiky performance almost every time I enable full RT in any game, though there are some that run better than others.

    • @Anon1370
      @Anon1370 Місяць тому +4

      RT, I think, adds to the CPU bottlenecking faster... it puts extra load on the CPU, I think, and so it stands to reason. However, one day games won't allow you to turn ray tracing off... there'll be no off switch for a lot of settings, and then will come more problems!!! The days of smooth PC gaming are either far behind us or far ahead in a future we haven't met yet, but at present it does suck...

    • @robot_0121
      @robot_0121 Місяць тому

      just think better - why does the video card have RT cores - why is the processor the one that bears the load? Shouldn't the video card do this? another stone in Nvidia's direction.

    • @dawienel1142
      @dawienel1142 Місяць тому

      @@robot_0121 this is basically what I said.

    • @robot_0121
      @robot_0121 Місяць тому

      @@dawienel1142 ok_hand

    • @jose131991
      @jose131991 Місяць тому

      @@robot_0121 it should, but it goes back to the devs because they can actually code it so that RT is primarily done on either the GPU or the CPU (wrongfully). It’s supposed to be done mainly on the GPU though.

  • @gerardw.7468
    @gerardw.7468 Місяць тому +1

    I love your point, but I think it also speaks to reviewers needing to add more "realistic in-world" gaming settings in their reviews, and not only the "in-lab" settings that maximize the performance differences. I understand that is better for actually getting the numbers right, but it doesn't speak to real-world scenarios, which is really what people should be expecting.

  • @florianb.1382
    @florianb.1382 Місяць тому +53

    Using a horribly performing game... You're not CPU limited, you're "bad code" limited...

  • @jayzn1931
    @jayzn1931 Місяць тому +1

    I built a PC from used parts last year and got a 3060 with 12GB, for photo editing and a little gaming. I was really surprised to see how crazy bottlenecked that system is. I can play Battlefield 1 smoother in 4K than 1080p :D. I want to upgrade to a cheap Ryzen at some point so I can use my 3060's potential, but as I‘m not gaming a lot it‘s not a high priority.

  • @oldtimebenchmarks5294
    @oldtimebenchmarks5294 Місяць тому +11

    Informative and on point as always thanks owen!

    • @Blafard666
      @Blafard666 Місяць тому

      Lots of people totally missing the point in the comments, unfortunately.
      Daniel was trying to demonstrate that a CPU bottleneck with a high-end GPU at high resolution can still happen; people think he is defending badly optimized games.

    • @jmangames5562
      @jmangames5562 Місяць тому

      @@Blafard666 We get that, but he is doing so using the small % of badly optimized games and hardware a very small % of gamers have. It is kinda stupid, as it makes no sense to use what 1% of PC gamers use to show PC gamers that these bad games can even make the best hardware bottleneck. What is the actual point? Everybody who played these games knows this, and it will do the same thing to the 5090 and 9800X3D because it is not a hardware problem.

    • @Blafard666
      @Blafard666 Місяць тому

      @@jmangames5562 The demonstration wasn't about the games. Again. It's about the persistent myth that a high-end GPU + high resolution can't generate a CPU bottleneck.
      For this demonstration, coupling the best GPU out there with the best CPU is the smartest combo. What percentage of gamers is using that hardware is irrelevant.

  • @michaellee2786
    @michaellee2786 Місяць тому +2

    Speaking as a developer, the entire idea of "CPU vs GPU bottleneck" is not a good one to use for analysis. It's a perspective that tries to blame one piece of hardware for the overall performance of the software. That's a useful perspective when you're trying to decide what parts to spend more money on when you have a budget and play particular games. But that's it! That's as useful as it is to think that way.

  • @clem9808
    @clem9808 Місяць тому +6

    Daniel, we desperately need your take on PCIe 4 x8 cards on PCIe 3 motherboards.
    Most budget gamers are only gonna be upgrading to RTX 60/RX 600 tier GPUs, and will still be using their old PCIe gen 3 mobos. So they'd be stuck with PCIe 4 x8 bottlenecks from this current gen to most likely the next gen of GPUs.
    We need to know how big the performance reduction is, especially when they run out of VRAM, because most of them are 8GB cards.

    • @virtual7789
      @virtual7789 Місяць тому

      It’s not a problem at all. It's PCIe x4 GPUs that show a significant decrease in performance. I run my 6600 on a B450 board and my performance is the same as what all the benchmarks say. At most it's like 2-5% slower

    • @clem9808
      @clem9808 Місяць тому

      @@virtual7789 but for the very little performance it offers, you'd need that 2-5% difference, don't you think?
      I'm running a 4070 (4.0 x16) on a B450M (3.0 x16) board so I wouldn't need to worry about it. But still, being aware of this would clearly help out a lot of budget gamers.
      That's why we need Daniel's take.

    • @virtual7789
      @virtual7789 Місяць тому

      @@clem9808 2-5% performance shouldn’t affect your purchasing decision. Also, why not just buy the used high end cards from previous gen instead of settling for the trash Nvidia and amd offer in the low end? Based on the leaks the generational gains aren’t gonna be there because the cards are so cut down. Just buy a rx 6800 for $250 or a 5700xt for $150. The used market is amazing right now

  • @cars291
    @cars291 Місяць тому +1

    Point taken. A very helpful video.
    Gaming non-competitively and not requiring DLSS at 4K resolution, I tend to see no such effects after I learned to turn it off to take pressure off my CPU 😂.
    Turning off ray tracing helps too - to me it is an upsell with mostly diminishing returns in most games I play. There are trends to get us to buy the latest and greatest driven by such tech. That's ray tracing.

  • @ultraviolet2497
    @ultraviolet2497 Місяць тому +3

    I have noticed that the lower your GPU usage is, the more frames you can squeeze out of frame generation. You can get close to 80% more frames if you are CPU bottlenecked vs 30% more frames if you're GPU bound. So if you're not that sensitive to latency, just turn on frame gen and everything will run at 120. I agree real 120 would be better, but ¯⁠\⁠_⁠(⁠ツ⁠)⁠_⁠/⁠¯
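
    A back-of-the-envelope sketch of why frame generation pays off more when CPU bound; the model and its generation-cost number are assumptions for illustration, not how any specific vendor implements it:

```python
# Toy model: frame gen inserts one generated frame per rendered frame, so output
# only roughly doubles when the GPU has idle time to spend on generation.
# gen_cost_frac is an assumed fraction of GPU time that generation needs.
def framegen_output_fps(base_fps, gpu_usage_pct, gen_cost_frac=0.35):
    headroom = 1.0 - gpu_usage_pct / 100.0
    if headroom >= gen_cost_frac:
        return 2.0 * base_fps                       # CPU bound: generation is "free"
    # GPU bound: generation takes time away from rendering before the doubling.
    return 2.0 * base_fps * (1.0 - (gen_cost_frac - headroom))

print(framegen_output_fps(70, gpu_usage_pct=60))    # ~140 fps: close to double
print(framegen_output_fps(70, gpu_usage_pct=99))    # ~92 fps: roughly the +30% case
```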

    • @jose131991
      @jose131991 Місяць тому

      Yes, that’s already well known about FG, which is another side-step around actual CPU optimization. FG is designed to bypass the CPU, so the bigger the bottleneck from that, the bigger the gain from FG. Doesn't help people on 60Hz TVs or monitors that regularly drop below 60 🤷🏾

  • @RobFisherUK
    @RobFisherUK Місяць тому +1

    Trying to do flight sims in VR (so ideally 4K per eye at 90fps) I can never have enough of anything!

  • @zaraizara2794
    @zaraizara2794 Місяць тому +4

    In the past, it was often said that games only utilized 4 CPU cores. Given that your CPU has 8 cores, but the game only uses 4, that's why Afterburner shows around 50% CPU usage. Instead of solely blaming the CPU, why not criticize the game for not taking advantage of all the available cores?
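
    The arithmetic behind that: the overall CPU percentage is just an average across all logical threads, so a fully pegged game thread barely moves it. A quick sketch:

```python
# The overall "CPU usage" number is an average across logical threads, so a game
# that saturates only a few of them reads as low usage on an overlay.
def overall_usage(busy_threads, total_threads):
    return 100.0 * busy_threads / total_threads

print(overall_usage(4, 8))    # 4 of 8 threads pegged -> "50%", like the comment says
print(overall_usage(1, 16))   # one maxed main thread on an 8c/16t chip -> ~6%
# Either way the game can be fully CPU-limited while the overall number looks idle.
```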

  • @ThreaT650
    @ThreaT650 Місяць тому +2

    I am not Daniel's particular demographic of viewer. I watch him occasionally, and I find his benchmark stuff useful, but I feel like his demographic is of people who are more new to the scene. That said, he's right. Another way to know this is that, these GPUs are far more powerful than their CPU counterparts and have been for a long time. That's why many much older CPUs didn't even bottleneck ADA when it came out and still doesn't. GPUs are progressing far quicker. I feel like that's what made X3D so exciting. Also competitive gamers want every single bit of fps they can get in 1080p, and it doesn't matter that 200 is a lot or 300 is a lot, they want THE MOST they can get. If he throws in the next tier down in CPU his fps will drop, and that's a perfect indicator that he is CPU limited. For most people this will not matter, but for the future of GPUs, when 70 class cards are as powerful as a 4090, we need to understand and be advocating for CPUs that can handle it. That means better flagship CPUs and the trickledown it will inevitably cause. Why buy a 5090 if the performance is the same as a 5080? It encourages Nvidia and AMD to give us worse GPUs to try and artificially keep a larger performance gap between the different GPU tiers.

  • @andred.2335
    @andred.2335 Місяць тому +3

    those who say cpu performance is secondary don't appreciate triple-digit frame rates even in single-player games. these comments will always exist. you can't save everyone.

  • @MrTernation
    @MrTernation Місяць тому +2

    so the evolution would be going from single-thread to multi-thread optimization, but that is up to the engine/devs, not CPU manufacturers, to implement

  • @PCrealitys
    @PCrealitys Місяць тому +37

    I'm SO SICK of reviewers' misinformation about CPU bottlenecks. IT'S POOR TO AWFUL GAME OPTIMIZATION that's to blame. The CPUs today kick butt

    • @gingerbread6967
      @gingerbread6967 Місяць тому

      What's fascinating is he doesn't get the obvious, and he has this many subscribers.

    • @I_Jakob_I
      @I_Jakob_I Місяць тому +10

      What? It's still a CPU bottleneck

    • @yumri4
      @yumri4 Місяць тому +1

      The problem is that for most games the solution to poor optimization has been "just brute force it". That worked until games began to use the newer features on PC GPUs; people could just throw better hardware at it to get the desired results, but now optimization is needed. Now that we have games pushing the limits of what GPUs can do, the "the CPU doesn't need to be optimized for" way of thinking isn't working so well.

    • @Varil92
      @Varil92 Місяць тому +1

      @@I_Jakob_I do you understand that it's useless to push the brute force of the hardware if we keep on letting software developers get away with shitty optimization? It's a zero-sum game, a capitalistic and braindead game. It's pathetic. CPUs nowadays are super powerful.

    • @I_Jakob_I
      @I_Jakob_I Місяць тому

      @@Varil92 yes I know

  • @Shobu
    @Shobu Місяць тому

    you're absolutely right.
    I am using triple-screen 1440p @240Hz monitors and the FOV affects the CPU... A LOT!!!!!!!!!!!!!!!!
    There are so many games I can't play with very good frames, and my overall CPU usage is much higher than most users'.
    I have had shit thrown at me for such a long time, and now you're explaining so well the stuff I've been explaining on reddit... great job Daniel!

    • @viamoiam
      @viamoiam Місяць тому +1

      That must be a very immersive setup and certainly demanding. I hope you really have some fun with that. It would be interesting if the FOV itself were affecting the CPU. I play at a higher FOV in games even on a single monitor because I want immersion in the scene, but I couldn't compare it directly since you are also pushing triple the resolution by using more screens.
      That is a lot of pixels you are pushing, which also works the GPU a lot. You are effectively talking about a higher resolution; you're running something equivalent to 6K (2K x3).
      My usage of FOV is not as demanding, but also not as nice. I'm sure there is confusion because FOV can refer to how wide an angle the camera covers, or even to how far the camera sits from a third-person character you control; in those cases the resolution stays the same. You are increasing the FOV, but with multiple monitors you are also changing the amount of rendering needed. You may or may not have changed the original camera; sometimes I think the game just adds another two cameras. It has got to be nice to have more vision. I think it would only be outdone by a good VR experience. I haven't ever tried VR, well, not at home.
      Odd that it wasn't understood on Reddit, perhaps because of effective resolution. I'd stick with pointing out that the same basic rule applies: if you are getting less than 95% GPU utilization, the CPU is generally the bottleneck. (IMHO it may actually be a RAM, VRAM, or bandwidth issue, as that is quite a demanding scenario and outside what I budget for gaming. As you probably know, the CPU is sensitive to RAM speed, and the GPU is sensitive to VRAM and bandwidth issues in some scenarios.)
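      To check that 95% rule of thumb in practice, here is a minimal sketch that polls GPU utilization once a second and flags moments that look CPU (or otherwise non-GPU) limited. It assumes an Nvidia card with nvidia-smi on the PATH and a single GPU; the threshold is just the heuristic above, not a hard rule.

      ```python
      import subprocess
      import time

      THRESHOLD = 95  # % GPU utilization below which we suspect a CPU/RAM/engine limit

      def gpu_utilization() -> int:
          """Ask nvidia-smi for the current GPU utilization in percent."""
          out = subprocess.run(
              ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
              capture_output=True, text=True, check=True,
          )
          # With a single GPU the output is one line, e.g. "87"
          return int(out.stdout.strip().splitlines()[0])

      if __name__ == "__main__":
          while True:
              util = gpu_utilization()
              tag = "GPU-bound" if util >= THRESHOLD else "likely CPU (or other) bottleneck"
              print(f"GPU utilization: {util:3d}%  ->  {tag}")
              time.sleep(1)
      ```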

  • @coffee7180
    @coffee7180 Місяць тому +3

    8:00 The CPU is asleep as well, at under 40% utilization.

    • @yavnrh
      @yavnrh Місяць тому +1

      That's because of the core count with SMT. The game would have to use all 16 threads fully to get to 100%, and games usually can't do that. So they end up bottlenecked by the single thread performance.
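      A quick way to see that effect is to watch per-core load instead of the overall number. Here is a minimal sketch, assuming the third-party psutil package is installed; the 90%/50% thresholds are purely illustrative.

      ```python
      import psutil

      def check_once(interval: float = 1.0) -> None:
          # One utilization value per logical core over the sampling interval
          per_core = psutil.cpu_percent(interval=interval, percpu=True)
          overall = sum(per_core) / len(per_core)
          hottest = max(per_core)
          print(f"overall: {overall:5.1f}%   hottest core: {hottest:5.1f}%")
          if hottest > 90 and overall < 50:
              print("  -> one thread is pegged; the modest overall % hides a CPU bottleneck")

      if __name__ == "__main__":
          while True:
              check_once()
      ```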

    • @coffee7180
      @coffee7180 Місяць тому +1

      @@yavnrh Shitty optimization. Battlefield 2042 had similar problems, but right now it's one of the best optimized games ever. It uses all 16 threads of my 5800X3D and it can run over 300 fps. A game with 128 players, destruction, wetter effects, big open maps and vehicles has better parallel CPU utilization than a single-player game. Unreal Engine 5 is in general a stutter mess.

    • @kevinerbs2778
      @kevinerbs2778 28 днів тому

      ​@@coffee7180you mean weather, lol at the wetter.

  • @thepirate4095
    @thepirate4095 Місяць тому

    best benchmark channel on youtube right now

  • @Son37Lumiere
    @Son37Lumiere Місяць тому +43

    Poorly optimized and coded games are not a CPU problem, they're a game problem. This also has a lot to do with ray tracing, which again was introduced far too early for the hardware.

    • @kiwibom1
      @kiwibom1 Місяць тому +2

      No one said that the hardware is really the problem. Poorly optimized games are the problem, you are right on that, but having a more powerful CPU would make it easier to run poorly optimized games. Otherwise you wouldn't see a difference going from an i3 to an i7, or from a non-X3D chip to one.

    • @Blafard666
      @Blafard666 Місяць тому +1

      You're right. Also, you're missing the point.

    • @UnsettlingDwarf
      @UnsettlingDwarf Місяць тому +2

      Facts. This is the majority of modern games in the last 4 years.

    • @Conumdrum
      @Conumdrum Місяць тому +6

      Ray tracing was introduced too early for AMD

    • @Son37Lumiere
      @Son37Lumiere Місяць тому

      @@Conumdrum If it were fine for nvidia they wouldn't have invented dlss and frame gen.

  • @eisenecke2060
    @eisenecke2060 Місяць тому

    Really good video. I hate how all the benchmark videos compare CPUs at 1080p mid settings with a 4090. A much more relatable benchmark would be whether there's an impact in these poorly optimized games at 4K. Well done.

  • @crestofhonor2349
    @crestofhonor2349 Місяць тому +13

    I believe Hardware Unboxed did a similar test on how important the CPU is even at higher resolutions

    • @Blafard666
      @Blafard666 Місяць тому +4

      They also explained they had to do that video because of that persistent myth that high end GPU+high resolution means no need for a good CPU cause "impossib' to CPU bottleneck boyz"

    • @ppsarrakis
      @ppsarrakis 25 днів тому

      Those people haven't played any MMOs at all

  • @DejayClayton
    @DejayClayton Місяць тому +2

    Man, those animation cycles really suck in Jedi Survivor. It's like the animators never saw a real human run before.

  • @the_monarch2172
    @the_monarch2172 Місяць тому +12

    Do you think this is why AMD is sticking to mid range???

    • @Anon1370
      @Anon1370 Місяць тому +6

      i think you might be onto something here.

    • @IndyMotoRider
      @IndyMotoRider Місяць тому +1

      It's why I paired my 7800x3d with a 7900xtx. 4090 is an extra 8-900 bucks for quality that is limited by shit optimization in most modern games.

    • @CS-pl8fc
      @CS-pl8fc Місяць тому +1

      What? Bad cpu optimization in a few games is why amd is sticking to mid range GPUs? What's the logic here?

    • @soniablanche5672
      @soniablanche5672 Місяць тому +1

      only 0.1% of gamers actually buy the top range GPUs, AMD is actually trying to make money. Nvidia gets away with it because they have the much bigger market share.

    • @mattk6827
      @mattk6827 Місяць тому

      Doesn't make much sense, since AMD is also making the CPUs. Unless they're limited in how well their CPUs can run, talking about on a per-core level; Intel has gone the same route by going wide with multiple cores, which is great for heavily threaded applications. A few games are, but most are still reliant on a single or just a couple of main threads, meaning 12 cores, 24 cores, 1024 cores doesn't matter if the 2-3 necessary cores are hamstrung. Single-core performance almost always matters (very few games have lightweight multithreaded loads, and those that are truly parallel typically run off GPU acceleration anyway).
      AMD's sticking to mid range because on the GPU front they can't compete. They briefly mentioned they could if they pushed the power (they didn't say by how much). Shoving more power in to boost clocks for diminishing returns shows they're up against a wall. Yes, Nvidia uses more power, but it's not power consumption alone stretching the 4090 well beyond anything AMD has right now. They know they can't catch up for now, so they're sticking with what they can do. It's not speculation; AMD has actually come out and said as much. They did try to say they could compete if they pushed more power, but 'pics or it didn't happen'.

  • @Hentirion
    @Hentirion Місяць тому +1

    11:54 So the point is: don't buy a 5080 or 5090, because you won't have a CPU that can keep up.

  • @lflyr6287
    @lflyr6287 Місяць тому +9

    Daniel Owen: this game, Star Wars Jedi Survivor, is NEVER GOING TO BE WELL OPTIMIZED. Why? Because it's using an iteration of the Unreal 4 engine released in 2013 and developed through the years 2008-2013, which was the era of Win XP and its DX 9.0c. That engine was then, for 2013, translated to the DX 11 API, which was the current Win 7 API back then. DX 11 is heavily based upon DX 9.0c and has one fundamental flaw: it cannot efficiently utilize more than 1-2 cores in parallel. So in practice it needs a core-cut-down i9 with a 2c/4t setup running at 6.0+ GHz to utilize its speed.
    That's why measuring any kind of CPU/GPU bottlenecking in this kind of game is incorrect, because it won't show proper and true utilization, due to the abundance of cores that aren't utilized and the lack of the very high single-core frequency that 2013-era UE4 needs.
    The only thing here that is correct is the usage of an RTX 4090, because Nvidia uses an older principle of its architecture design here, and it is very LINEARLY utilized, which such old graphics engines like.

    • @residentCJ
      @residentCJ Місяць тому

      Fantastic explanation. I wish a UE4 dev would tell hardware YouTubers that, but that will never happen, because it would hurt CPU sales. 😁

    • @lflyr6287
      @lflyr6287 Місяць тому

      @@residentCJ true....completely true.

    • @jagildown
      @jagildown Місяць тому

      UE4 has supported DX12 for years now, and many other games running on even older versions with DX11 don't have those issues... but you are right, they won't optimise it, because the selling cycle is over. Now that they've made millions, they are not going to spend resources that won't generate more sales anyway...

    • @lflyr6287
      @lflyr6287 Місяць тому

      @@jagildown Unreal Engine 4 doesn't support the DX12 API natively... it partially supports the DX11 API... and it translates through the DX11 API to DX12 in software, a.k.a. emulated.

    • @jagildown
      @jagildown Місяць тому

      @@lflyr6287 So what?... Why don't other games using older UE4 versions have those issues, then? The real issue is elsewhere; it's not that they can't do it, it's that they don't want to do it.

  • @SiberdineCorp
    @SiberdineCorp Місяць тому +2

    Hey Daniel, you'll never read this but:
    Can you do some testing regarding the CPU cost of having Ray Tracing and Path Tracing enabled? Almost every reviewer and gamer ignores the CPU cost of enabling RTX. But according to Digital Foundry's Cyberpunk testing, the 5800X can drop to the mid-40s with Ray Tracing Ultra, and the 12600K performs 43% faster under those conditions.

    • @iang3902
      @iang3902 Місяць тому

      It was commonly said only a few years ago that AMD CPUs didn't perform as well in games with RT on. Not sure if that still holds true, but no one talks about it anymore.

    • @meneldil7604
      @meneldil7604 28 днів тому

      @@iang3902 I think you mean GPUs

  • @FastGPU
    @FastGPU Місяць тому +14

    If I had a 4090 I wouldn't enable DLSS at 1440p. I get the analogy, but I run a 4070 Ti Super and a 165 Hz 1440p monitor, and I hit 165 FPS at High to Ultra settings without DLSS, with ray tracing enabled, in most games. My CPU is an i7-12700, which is considered not optimal for gaming. Poorly optimized games are the real problem.

    • @sausages932
      @sausages932 Місяць тому

      Yes, ray tracing is unrealistic, as this video demonstrates how slow it is. Turn it off and most CPUs should be fine?

    • @FastGPU
      @FastGPU Місяць тому +4

      I run ray tracing in my games. This video cherry-picks poorly optimized games. Frankly, it's misinformation.

    • @parowOOz
      @parowOOz Місяць тому +2

      @@FastGPU Please provide video proof of 1440p native @ 165fps with High-Ultra + RT (wherever possible) in: Wukong, Space Marine 2, SW: Outlaws, Starfield, Remnant II, Hellblade 2, Silent Hill 2 Remake, FF XVI, Ghost of Tsushima, GoW Ragnarok, Horizon Forbidden West, CP2077, Alan Wake II. I'll wait :) Why do you guys feel the need to lie so blatantly ? You know people can fact-check your BS and just look at benchmarks/reviews, right ? Most games my sweet behind :)

    • @TheTeacher80
      @TheTeacher80 Місяць тому

      @@parowOOz Good point. A lot of stupidity here in the comments. The discussion was technical; it wasn't about optimisation, but about the fact that Intel, AMD and Nvidia are not giving us better products with the new gen, yet they keep the same prices or worse. Also, the concept that you cannot get bottlenecked at 4K is just stupid.

    • @TheTeacher80
      @TheTeacher80 Місяць тому

      @@sausages932 How is this misinformation? You pay 2k for a GPU, or even the stupid price of an RTX 4070 Ti, and then don't enable RT? The point of the video was that you sometimes need a better CPU, and Arrow Lake + Zen 5 are not delivering that this generation, but I did not hear them say "We'll lower the prices"; on the contrary, they raise them.

  • @havok9525
    @havok9525 Місяць тому +1

    I hate that game graphics are starting to get out of hand and are being designed only for top-of-the-line hardware (and not even that, in cases like this one), especially when ultra 1080p gaming is becoming more affordable than ever (I think?) with cards like the RX 6600.

  • @hangyang6332
    @hangyang6332 Місяць тому +4

    Try turning off Nvidia reflex low latency.

  • @glubrix
    @glubrix Місяць тому +1

    Great video

  • @HunterTracks
    @HunterTracks Місяць тому +10

    If you wanna come at the argument from that point of view, sure, current hardware isn't fast enough to make sure games never get bottlenecked. That's not because the hardware isn't fast enough, that's because there's effectively no upper limit on how poorly optimized code can get.

    • @BlackJesus8463
      @BlackJesus8463 Місяць тому

      If you made the game it would've been perfect. 👍👍

    • @HunterTracks
      @HunterTracks Місяць тому

      @@BlackJesus8463 What a braindead take. Should I bake a perfect lemon pie before I set about criticizing someone who shat on a plate, put a candle in it and called it dessert?

    • @jose131991
      @jose131991 Місяць тому

      @@BlackJesus8463lol

  • @-Rizecek-
    @-Rizecek- Місяць тому +1

    And the main blame falls on Nvidia.
    Thanks to DLSS, they buried game optimization.
    And above all, games are not meant to be played on ultra details... Ultra details cause quite a bottleneck.
    It has happened to me many times that on ultra I got 80 FPS with the GPU at 70-80%, and on high it jumped to 150 FPS with the GPU at 99%.
    A beautiful example is the game Crysis Remastered. There you set the water quality to ultra and you immediately have a bottleneck and 50 fps worse performance.
    I don't understand why God of War Ragnarok can run absolutely perfectly and look like an Unreal Engine 5 game, and then games like Star Wars come out...

  • @FlyTimeRC
    @FlyTimeRC Місяць тому +2

    In Hogwarts Legacy at 4K I was CPU bottlenecked with a 4080 and an overclocked 5900X, so I went to a 7800X3D and now I'm GPU bottlenecked.

  • @AMD_7900
    @AMD_7900 Місяць тому +2

    Frame generation is the only solution for the CPU bottleneck today.

    • @BlackJesus8463
      @BlackJesus8463 Місяць тому

      X3D

    • @AMD_7900
      @AMD_7900 Місяць тому

      @@BlackJesus8463 10700K + Nvidia FG = ±90 fps in MFS 2020 over New York with a smooth frame graph.
      7800X3D without FG = ±60 fps in MFS 2020 over New York with spikes.
      Hardware is just nothing without software.

    • @bricaaron3978
      @bricaaron3978 27 днів тому

      So-called "frame generation" ** is not a solution. It's a cruddy workaround.
      ** "Frame generation" is a ridiculous term, because that's exactly what every game does --- generate frames.

  • @worlds_greatest_detective6667
    @worlds_greatest_detective6667 Місяць тому +126

    This is such a terrible take. It’s not a CPU problem, it’s a trash Engine problem.

    • @ZippDude
      @ZippDude Місяць тому +26

      Absolutely. The hardware is more than capable...complaining that shoddy code runs poorly and blaming the hardware. Lol

    • @andersjjensen
      @andersjjensen Місяць тому +15

      It's an RT problem. On my 7950X3D Jedi Survivor hits the ceiling at ~120FPS if I disable RT. I can set everything lower and it stays at 120FPS. Cranking on RT and it slices it basically in half. Same with Hogwarts, btw. I don't have Spiderman, so I can't confirm there... but I've seen this enough to know what's up.
      TL;DR: Nvidia pushed a crappy way of doing RT on the world and now we are stuck with it.

    • @gunzorkgaming7847
      @gunzorkgaming7847 Місяць тому +1

      Yeah absolutely.

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +8

      Not all the engines showcased here were CPU limited. Spiderman uses Insomniac's custom engine which has been shown to be well optimized

    • @cosmicnebula3023
      @cosmicnebula3023 Місяць тому +1

      @@andersjjensen It's not a crappy way of doing RT, it's moreso that optimizing the CPU side of RT is usually not well done in most games.

  • @C-M-E
    @C-M-E Місяць тому +1

    Outside of the trend of games being generally unoptimized because 'frame gen fixes all', it looks like you've genuinely got a serious leech on your system, unless you're capturing footage and pulling game capture at the same time while gaming. Over 20 GB of system RAM usage with 10-14 GB of VRAM usage consistently across multiple titles would make me start looking at what is sucking off the teat while I'm trying to cram frames.

  • @GameplayUnboxed
    @GameplayUnboxed Місяць тому +31

    When he switched to DLSS Performance it looked so bad, then I realised that at the same time my video had switched to 480p 😂😂😂

  • @thearenajanitor5017
    @thearenajanitor5017 Місяць тому

    Great video, thanks for clarifying this point.

  • @Monsux
    @Monsux Місяць тому +12

    I feel like you either read my mind or saw my posts :D My god, I'm losing my mind when people say how they are always GPU "bottlenecked" with some older CPU + top-tier GPU. This is so easy to find out if people would monitor their AAA titles for even a minute.

    • @PowellCat745
      @PowellCat745 Місяць тому +3

      Some of them are Intel shills. Ignore them.

    • @albert2006xp
      @albert2006xp Місяць тому +6

      Tech literacy seems to be dire among the PC gaming userbase now. I don't even think they know more about PCs than the average console gamer at this point. So many people can't even handle an options menu to tune their own performance, they just open it, leave it as is or adjust the general preset and call it a day. Understanding deeper things is out of the question. The worst part is thanks to social media they repeat wrong information to each other a lot.

    • @Monsux
      @Monsux Місяць тому +3

      @@PowellCat745 These were AMD users. I just think people try to tell themselves that they made a right choice, and what they have is perfect. When someone says anything that breaks their illusion, they go into defense mode. I just trust data.

    • @PowellCat745
      @PowellCat745 Місяць тому +3

      @@Monsux Yeah those too. I saw someone refusing to upgrade from their 3900X, a huge bottleneck for their 4080.

    • @Monsux
      @Monsux Місяць тому

      @@albert2006xp This seems to be the issue. It's weird when other PC users can't understand even simple concepts. We have more information available, but people don't even bother finding out things.

  • @AG-xz7ne
    @AG-xz7ne Місяць тому +1

    Yes, thank you. Especially with modern games being less optimized, this is an issue.

  • @mat4246
    @mat4246 Місяць тому +5

    2:00 my god, these frame time spikes are atrocious

    • @jetsp
      @jetsp Місяць тому

      With the best PC today. And they still say that the game is super playable.

    • @jmangames5562
      @jmangames5562 Місяць тому

      Turn off RT....

  • @thegreengator71
    @thegreengator71 Місяць тому +2

    Clearly the 7800X3D can't keep up with an RTX 4090 in the games you demonstrated. My question is, how far down the GPU stack do you have to go to be GPU bound in those same games? Thanks for your insights.

    • @andersjjensen
      @andersjjensen Місяць тому

      If you enable RT in Jedi Survivor or Hogwarts you'll always struggle to get much above 60FPS. When I disable RT on my 7950X3D I get just above 120FPS in Jedi Survivor.
      But to answer your question: He was at ~70% GPU utilization in the first example. A 3090Ti is about 70% the performance of a 4090, so that would about max out. A 4080 is about 80% the performance of a 4090 so that would still be slightly under utilized.

  • @he1go2lik3it
    @he1go2lik3it Місяць тому +6

    In Hogwarts Legacy turn off DLSS and also turn off Nvidia Reflex and the CPU bottleneck is gone.

  • @alienfunbug
    @alienfunbug Місяць тому

    Damn, thanks for this video: "Some people were spreading misinformation"....and now I know that I am "Some people"

  • @photonboy999
    @photonboy999 Місяць тому +32

    *I finally get it!!*
    The moral of all this is to play at 8K/24FPS for the ultimate "Cinematic Experience!"

    • @christophermullins7163
      @christophermullins7163 Місяць тому +2

      8K 60 with some upscaling is a reasonable way to maximize your setup with some of these games. You could also turn off RT and run native. (Yes, RT is heavy on the CPU.)

    • @cezarstefanseghjucan
      @cezarstefanseghjucan Місяць тому +1

      A true cinematic experience will always be superior to 420 FPS @ 720p blaze it... 1337 PC G4M3R 7153...

    • @photonboy999
      @photonboy999 Місяць тому

      @@christophermullins7163
      I'm not sure if you got that I was joking...
      Anyway, the deal with 8K screens is that it's stupid. In fact, I cannot think of a SINGLE benefit for an 8K screen.
      You physically cannot resolve more pixel density than 4K provides unless you're sitting unreasonably close. And if a game benefits from rendering at a higher resolution than 4K you can do that in a 4K monitor (i.e. NVidia DSR to render at higher than native then downscale to fit)... but...
      UPSCALING has a processing cost. So you'll get a lower FPS if you render at 4K then upscale to 8K than you would rendering at 4K for a 4K screen (using DLAA)...
      So...
      There's just no scenario where an 8K screen makes sense. Quite the opposite. That doesn't mean we won't get 8K screens so we need to upgrade. Of course it'll go that way.

    • @christophermullins7163
      @christophermullins7163 Місяць тому

      @@photonboy999 do you play on a 4k screen? I have several and in every normal scenario I play in.. I could most definitely see the difference in 4k and 8k. I see jaggies in every game. "8k has no benefits" is indicative of copying what you hear and having no clue about the reality of the situation. Respectfully of course. 😛

    • @photonboy999
      @photonboy999 Місяць тому

      @@christophermullins7163
      I clearly said 8K screens serve no benefit I can see.
      I did NOT say rendering at 8K served no purpose. It can, and that's why I discussed DSR.
      Do you have an 8K screen? If so, have you compared rendering at 8K on an 8K screen to rendering at 8K on a 4K screen?
      Unless you have an 8K screen to test then you can't dispute what I've said.

  • @TheGameBench
    @TheGameBench Місяць тому

    Faster memory would also help. While I agree that a lot of people do not understand bottlenecks, I feel like a lot of this is less of a CPU issue and more of a poor optimization issue. However, that isn't likely to change, and it definitely highlights that we're not at peak CPU performance, given how lazy many of these developers are.

  • @jetsp
    @jetsp Місяць тому +21

    People don't want to think outside the box. GPUs are flying and processors are falling behind.

    • @andersjjensen
      @andersjjensen Місяць тому +2

      Except this was all with RT. RT absolutely rapes the CPU. The FPS ceiling in Jedi Survivor is ~130 FPS on the 7800X3D and ~120 FPS on the 14900K as soon as you disable RT. Remember that Nvidia solely designed, and forced on everyone, how RT is done. They don't give a fuck if that standard tanks CPU performance. They just wanted a new gimmick that took up as little space as possible on the GPU die so they could tell gamers they needed an AI feature (DLSS).

    • @felipedeornelas8054
      @felipedeornelas8054 Місяць тому +2

      DLSS is amazing though. ​@@andersjjensen

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +4

      @@andersjjensen UE4 is pretty inefficient with how RT is done. The engine is known for being notoriously heavy on the CPU with ray tracing. That's why Hogwarts Legacy and Jedi Survivor get such a big impact on the CPU when ray tracing is enabled. Plenty of other games do not have such heavy CPU usage with RT on such as Cyberpunk, Guardians of the Galaxy, Minecraft RTX, Metro Exodus(Both versions), and others. Quit blaming Nvidia and ray tracing as the reason for the inefficient CPU usage with RT on when it's not even present outside of UE4 games, UE5 is by default pretty inefficient with the CPU even without the ray tracing done via Lumen or other traditional RT solutions. Also the standard, DXR, isn't just supported by Nvidia, it's supported by AMD and Intel as well who also tend to follow these examples.

    • @terrylaze6247
      @terrylaze6247 Місяць тому

      That's what he explained right at the beginning: GPUs always benefit from getting wider due to the nature of what they are processing, while CPUs don't benefit from getting wider, because most things the CPU has to process don't have that much data to chew through in parallel; all of that was moved to the GPU years and years ago.

    • @Anon1370
      @Anon1370 Місяць тому

      Yet when the 5090 comes out, people will be rushing to get it, ignoring that CPU bottleneck, and then wondering why the fps are no better with RT on, lmaoooooooooo. I'm not upgrading from my 4090, fk this, I'm just gonna go a few years only upgrading my CPU to try and even it out.

  • @stonythewoke9921
    @stonythewoke9921 Місяць тому

    I've got a good analogy for why parallel processing doesn't always make sense for gaming workloads: if you have to calculate the result of "1+1+1+1+1+1+1+1" in your head, would it be faster to find 4 other people, tell each of them to calculate "1+1", ask each of them what their result was, then tell two of them to each calculate "2+2", ask them for their results, and finally calculate "4+4" in your head to get the answer, or would it be faster to just do "1+1+1+1+1+1+1+1" yourself? The answer is that communicating the tasks to the other people and then fetching their results takes a lot longer than doing everything yourself. That's why not everything in game workloads is parallelized, and you usually have one main thread that does the most important state-dependent calculations by itself. Stuff that can easily be parallelized, like rendering the frame for your monitor to display, is actually highly parallelized and done by the GPU, which utilizes thousands of cores. But calculating the game logic, including e.g. what effect your inputs have on the state of the game and so on, cannot be effectively parallelized in the same way. So you end up with one main thread that fully utilizes the core it is assigned to at the moment, while a few support threads run on other cores but usually don't max out the computing potential of those cores.
    Parallelization in game logic can not only slow everything down substantially, it can also cause mistakes in the game logic that manifest as things like item duping, game crashes and all kinds of issues. So claiming that a game is badly optimized or the programmers are lazy only because it mainly utilizes one core fully while 4 other cores are only used slightly isn't really a valid argument. At most it shows that you don't really know how these things work.
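    A minimal sketch of that overhead argument, using Python's multiprocessing purely as an illustration (the pool of 4 workers and the eight 1s mirror the analogy, not any real game engine): with work this tiny, spawning and talking to worker processes costs far more than just doing the sum on one core.

    ```python
    import time
    from multiprocessing import Pool

    NUMBERS = [1] * 8  # the "1+1+1+1+1+1+1+1" from the analogy

    def add_pair(pair):
        a, b = pair
        return a + b

    def parallel_sum(values):
        # Hand pairs of numbers to worker processes, then combine their answers,
        # mimicking "ask four people to each do 1+1 and collect the results".
        with Pool(processes=4) as pool:
            while len(values) > 1:
                pairs = list(zip(values[0::2], values[1::2]))
                values = pool.map(add_pair, pairs)
        return values[0]

    if __name__ == "__main__":
        t0 = time.perf_counter()
        serial = sum(NUMBERS)
        t1 = time.perf_counter()
        parallel = parallel_sum(NUMBERS)
        t2 = time.perf_counter()
        print(f"serial   = {serial}, took {t1 - t0:.6f} s")
        print(f"parallel = {parallel}, took {t2 - t1:.6f} s  (coordination overhead dominates)")
    ```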

  • @Krisztian5HUN
    @Krisztian5HUN Місяць тому +53

    This is not a CPU bottleneck. This is bad code or a game engine/software bottleneck.

    • @virtual-adam
      @virtual-adam Місяць тому +2

      The CPU never goes above around 70% max in those games, so it seems you may be right. I'm no expert so someone could prove me wrong.

    • @thomasantipleb8512
      @thomasantipleb8512 Місяць тому +6

      You are completely missing the point of the video.
      There are going to be badly optimised games that are CPU bottlenecked, and the only way to deal with it is better CPUs.
      If you compare this CPU with lower-performance CPUs, in these badly optimised games and at these high resolutions, the better CPU will do better; therefore the point of the video is that better CPUs will run better even at high resolutions. So the people who say "yOu dOnT nEEd a hIGh eNd CpU fOr hIgH rEsolutIOn" are talking nonsense. Can't explain it more simply than that.
      Kindergarten basic logic, over.

    • @GreyDeathVaccine
      @GreyDeathVaccine Місяць тому +2

      @@thomasantipleb8512 Your point is only valid if you're ready to spend whatever it takes on expensive hardware.

    • @peterfischer2039
      @peterfischer2039 Місяць тому

      @@thomasantipleb8512 You are accepting game developers releasing badly optimized games as a fact of life.
      The sane thing to do is to not buy a game until it is proven to have at least decent optimization.
      Honestly, there are so many good games nowadays that you can skip all the terribly optimized ones and still have more games to play than there is time to play them.
      Especially if you have other things to do in life besides playing games.

    • @thomasantipleb8512
      @thomasantipleb8512 Місяць тому +4

      @@peterfischer2039 I don't disagree with you, but that's not the point.

  • @robertlawrence9000
    @robertlawrence9000 Місяць тому +1

    It would have been good if you had shown per-core utilization, to see how many cores it's actually using.

  • @wireless1235
    @wireless1235 Місяць тому +5

    This video is so misleading and does a disservice to gamers. What people are saying about CPU bottlenecks is generally true. In most games at 1440p or higher you will be GPU bottlenecked, and CPU improvements will have marginal impacts. Of course there are going to be outliers, but most games aren't poorly optimized like the examples you showed.

    • @ConnorH2111
      @ConnorH2111 Місяць тому

      exactly

    • @johnlong6197
      @johnlong6197 Місяць тому +1

      I wish that were true, but many studios are switching over to the mess that is Unreal, e.g. the new Silent Hill. For the way it looks, it runs very poorly.

    • @skinscalp222
      @skinscalp222 9 днів тому

      Misleading how? What he's trying to say is that some people are under the impression that you can offload from CPU to GPU by increasing resolution and thus get better frames. He's just trying to say that is not how it works.

  • @oppressorable
    @oppressorable Місяць тому +1

    On some types of games, there could be more optimization to use more cores. For example, Cities: Skylines 2 can use up to 64 cores, and that might very well be a soft Windows limit. I believe that the bulk of the CPU usage is pathfinding for the agents. If so, many strategy games could use multithreading much more aggressively. That's a small fraction of the game, but it would be a good start, as the sketch below illustrates.
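    A minimal sketch of why per-agent pathfinding spreads across cores so naturally; the toy grid, the BFS, and the agent count here are made up for illustration, not how the game actually does it. Each agent's search is independent, so a worker pool can fan them out without any coordination between tasks.

    ```python
    from collections import deque
    from multiprocessing import Pool

    GRID_W, GRID_H = 64, 64  # toy open grid; a real game adds obstacles, costs, etc.

    def bfs_path_length(job):
        """Breadth-first search from start to goal on an open grid; one independent job per agent."""
        start, goal = job
        frontier, seen = deque([(start, 0)]), {start}
        while frontier:
            (x, y), dist = frontier.popleft()
            if (x, y) == goal:
                return dist
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if 0 <= nx < GRID_W and 0 <= ny < GRID_H and (nx, ny) not in seen:
                    seen.add((nx, ny))
                    frontier.append(((nx, ny), dist + 1))
        return -1  # unreachable

    if __name__ == "__main__":
        # One (start, goal) pair per simulated agent; none of them depend on each other.
        agents = [((i % GRID_W, 0), (GRID_W - 1 - (i % GRID_W), GRID_H - 1)) for i in range(256)]
        with Pool() as pool:  # defaults to one worker per CPU core
            lengths = pool.map(bfs_path_length, agents)
        print(f"routed {len(lengths)} agents, longest path: {max(lengths)} steps")
    ```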

  • @andersjjensen
    @andersjjensen Місяць тому +19

    Try this without RT. Nvidia pushed an absolutely stupid standard on everyone. It costs an enormous amount of CPU power to calculate the BVHs... and that should have been done on the GPU (which, you know, is pretty good at geometry). The problem is that it would require dedicated silicon like ROPs and TMUs and Nvidia didn't want to take space from their precious compute... which is what the professionals pay big dollars for. So now CPUs have to handle that on top of doing the character animations, physics and building the scene geometry before dispatching it to the GPU.

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +1

      Ray tracing isn't the issue

    • @atiradeon6320
      @atiradeon6320 Місяць тому +4

      Blaming RT & Nvidia is a cope. Cyberpunk 2077 already proved you can have very advanced RT and have the game scale CPU performance beautifully. The only people at fault for the crummy game performance are the devs who made the game.

    • @andersjjensen
      @andersjjensen Місяць тому +8

      @@atiradeon6320 Cyberpunk also tanks massively on the CPU when you enable RT. It's not nearly as egregious as the UE5 games (which are particularly bad due to Nanite constantly recalculating the BVHs because polygon levels change all the time). Hybrid rendering just categorically sucks. It's not until you get to full scene path tracing that you get out of that pickle.
      And no, being on Nvidia's ass is not "cope". I've watched the motherfuckers for over 30 years now. I could write an entire book about how they've consistently tried to lock competition out.

    • @andersjjensen
      @andersjjensen Місяць тому

      @@crestofhonor2349 I have a 7950X3D. Both Jedi Survivor and Hogwarts Legacy drop to 60-75 FPS when I enable the highest RT at 720p native while my GPU is under 50%. If I disable it I get 120-130 FPS and my GPU is still under 50% (obviously, RT is GPU heavy).

    • @PhillipJermakian
      @PhillipJermakian Місяць тому

      ​@atiradeon6320 I only ran the cyberpunk benchmark a few times, never played it but we were turning RT on and off and I could not really tell much of a difference. Nothing as good as the difference between 60fps and 100fps with it off.

  • @c0nd3mnd22
    @c0nd3mnd22 Місяць тому +1

    While technically true that some games are "CPU limited" in terms of single-threaded performance, it's also true that even modern games that use a couple of threads are still limited by single-threaded performance. This is likely not something that we'll see huge leaps from one generation to the next on. As you stated, GPU scaling is simpler because making GPUs wider makes games run faster on them. NVIDIA could release an RTX 5099 that's double everything the RTX 5090 is and it'd be more or less twice as fast (and expensive, and consume 800-1000 watts or whatever, but it'd be possible).
    So I'd put the blame mostly on game developers and current game engines in use.
    Also, if you really wanted the fastest possible gaming CPU, you'd see a very slight increase in performance with a 7950X3D with the non-3D CCD disabled, as the 3D CCD can turbo slightly higher than the 7800X3D's CCD.
    We're also holding a what, $400 CPU up to a $1500+ GPU, right? I just don't think CPU manufacturers super selectively binning their CPUs to get a 10% faster version and then selling that for $1000+ would sit very well with consumers.
    So yeah, even if we got a +15% uplift in gaming performance for CPUs this year, it would just shift the point of where the "bottleneck" is occurring slightly further back, but nowhere near enough to keep up with the upcoming RTX 5090 in these scenarios you've shown, even though that +15% improvement in CPU gaming performance would be very impressive generation-on-generation.

  • @racheeeed
    @racheeeed Місяць тому +5

    I don't know Daniel, you're always gonna be bottle-necked by something ... Your point is indeed well taken with regards to the online discourse about "cpu performance don't matter duh", but the more devs rely on upscaling and FG to provide the fps figures instead of optimizing the underlying engine the bigger the need for cpu perf for everything else that makes a game will be glaringly obvious. Love your videos, keep up the good work. Cheers

  • @zero2006xl
    @zero2006xl Місяць тому

    I'm fed up with people on the internet saying "it works OK for me", or "it runs smooth as butter", or "no stutter here". Well, here we have it: some games in which you throw more than 2000 dollars in components at the problem and still get a subpar experience. It's like higher-end parts are meant to brute-force through badly optimized games instead of delivering truly next-gen graphics. What a waste of money, energy and people's time. Thank you, Daniel, for always exposing data like this.

  • @purandhar-hm6vx
    @purandhar-hm6vx Місяць тому +7

    Actually, I want to make the point that this doesn't always mean a CPU bottleneck, unless you see one or more threads hitting 85-90% utilization. The point I am trying to make is that the game engine or the game itself may be very bad, which makes neither the CPU nor the GPU the culprit. For example, if I write code that waits for 2 ms, the CPU will just wait for 2 ms doing nothing (or something else), and no matter how good the best CPU is 100 years down the line, it still has to wait 2 ms because that's the command it was given. So if I don't see any cores being utilized much and the GPU is also under-utilized, then I start suspecting the optimization of the game, or a game engine that doesn't fit the game's scenes well.
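    A minimal sketch of that 2 ms example (all numbers are made up for illustration): if the frame loop is stuck waiting on something for a fixed 2 ms, a faster CPU shrinks only the real work, so the frame rate flattens out no matter how much CPU you throw at it.

    ```python
    import time

    FORCED_WAIT_S = 0.002  # the hard-coded 2 ms wait from the example above

    def simulate_frame(cpu_speedup: float) -> float:
        """Return the frame time for a toy frame: some real CPU work plus a fixed wait."""
        cpu_work_s = 0.004 / cpu_speedup  # real work shrinks on a faster CPU...
        time.sleep(cpu_work_s)            # (modelled as a sleep to keep the sketch simple)
        time.sleep(FORCED_WAIT_S)         # ...but the scripted wait never does
        return cpu_work_s + FORCED_WAIT_S

    if __name__ == "__main__":
        for speedup in (1.0, 2.0, 100.0):
            frame_s = simulate_frame(speedup)
            print(f"{speedup:6.1f}x faster CPU -> {frame_s * 1000:5.2f} ms per frame "
                  f"({1 / frame_s:6.1f} fps cap)")
    ```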

    • @ToxicOsOk
      @ToxicOsOk Місяць тому

      Bad optimization literally makes the CPU or GPU a bottleneck when it doesn’t need to be. The reason why it’s bottlenecking doesn’t really matter when there’s nothing you can do to make a game perform better.

  • @JosepsGSX
    @JosepsGSX Місяць тому

    Wow. It would be very interesting to test how memory optimization and some CPU tweaks behave in your system, with that top CPU & GPU combo.
    I have been dipping my toes into memory tweaking/overclocking and CPU undervolting/overclocking since, well, forever at this point, but in my current system my GPU is inadequate to scale significantly with changes on the CPU side of things. In your system any improvement will be immediately obvious.

  • @FilthyPeasantGaming
    @FilthyPeasantGaming Місяць тому +16

    People who say you don't need a better CPU nowadays obviously haven't played Tarkov.

    • @2shae475
      @2shae475 Місяць тому

      It's a dogshit coded game - it's not the hardware's fault.

    • @ConnorH2111
      @ConnorH2111 Місяць тому +1

      the 7800x3d is a beast for tarkov though

    • @FilthyPeasantGaming
      @FilthyPeasantGaming Місяць тому

      @@ConnorH2111 It is! It's what I have, but funny enough, I'm sure a better CPU would still help in that game.

  • @krayzieridah
    @krayzieridah 27 днів тому

    Just a note: the main thing that improved performance in Jedi Survivor was the removal of the Denuvo DRM, which just shows us that all games running Denuvo have the potential for greater performance, if only the developers would remove it from their games.

  • @hackintosh3899
    @hackintosh3899 Місяць тому +6

    RT cripples CPUs, as you showed with Jedi Survivor. Other games are also CPU dependent, like MMOs, but that's a dying genre for kids, so they probably have just never experienced it. Heck, I think SWTOR would bottleneck even a PS4-level GPU due to being DX9 and single-core dependent. Add to all this that Intel is a desperate company atm. Never underestimate bots and astroturfing; it's rampant on all social media, it's cheap, and the FTC rarely fines the companies for it.

    • @ArdaSReal
      @ArdaSReal Місяць тому

      I thought RT was completely on the GPU?

    • @ladrok97
      @ladrok97 Місяць тому

      @@ArdaSReal Nah, it has a huge impact on the CPU. Which is why Nvidia made the key feature of the 4000 series "frame generation": it doesn't need much CPU, so Nvidia could sell its 4000 series as "better graphics and more fps".

  • @nelloderisi5299
    @nelloderisi5299 Місяць тому

    Thank you for the video. Unfortunately there are many reasons why a CPU becomes a bottleneck, often poor optimization and incomplete utilization of the available resources, including the individual cores. It would be very useful, for example, to monitor the % usage of each logical core instead of the whole CPU, to show which core saturates and creates the bottleneck for the GPU.

  • @cks2020693
    @cks2020693 Місяць тому +15

    To echo Daniel's point about CPU bottlenecks: I have a 7800X3D + 4070 Ti @ 3440x1440, and playing Spider-Man Remastered (a 2022 game) on max graphics settings with regular ray tracing, I'm getting 100% GPU usage and 70% CPU usage.
    People forget ray tracing costs a lot of CPU, and the higher the resolution, the more work the CPU has to do to handle ray tracing.

    • @christophermullins7163
      @christophermullins7163 Місяць тому +1

      When I got a new 40 series I was shocked at how CPU heavy ray tracing is. Stark difference. What I find odd is there aren't many videos doing CPU comparison WITH RT ON specifically.

    • @cks2020693
      @cks2020693 Місяць тому +1

      @@christophermullins7163 I think that's because an average PC user doesn't have a GPU powerful enough to run RT on every game, so to cater to an average user, they test games without RT

    • @NadeemAhmed-nv2br
      @NadeemAhmed-nv2br Місяць тому

      FG also adds to cpu usage

  • @thomasantipleb8512
    @thomasantipleb8512 Місяць тому +2

    For those saying "it's not the CPU, it's bad optimisation":
    You are completely missing the point of the video.
    There are going to be badly optimised games that are CPU bottlenecked, and the only way to deal with it is better CPUs.
    If you compare this CPU (7800X3D) with lower-performance CPUs, in these badly optimised games (which unfortunately exist and will continue to exist) and at these high resolutions, the better CPU will do better; therefore the point of the video is that better CPUs will run better even at high resolutions. So the people who say "yOu dOnT nEEd a hIGh eNd CpU fOr hIgH rEsolutIOn" are talking nonsense.
    Can't explain it more simply than that.
    Kindergarten basic logic, over.

  • @misanthropus0
    @misanthropus0 Місяць тому +20

    This video only proves what I've always thought:
    I will never enable RT in any game, since the difference is mostly minimal and it's really costly on your PC.
    Raster performance is what you mostly want, and it pains me that devs are baking RT right into games' engines and settings.

    • @paulc5389
      @paulc5389 Місяць тому +6

      RT as a technology is better by a mile. The technology to use it well just isn't there yet, but people are using it anyway.

    • @crestofhonor2349
      @crestofhonor2349 Місяць тому +3

      RT is a far better solution for lighting. We can't stay on rasterization forever, especially since it can also take a lot of time and authoring work, given that it's just an approximation. There are times when rasterization is slower than ray tracing.

    • @Chakotay2222
      @Chakotay2222 Місяць тому +1

      It's not a minimal difference. I don't know why, but I played The Callisto Protocol today, since it was free on Epic some time ago but I never found time to play it until now. Anyway, playing it without ray tracing is a different experience: it looks like an older game, not up to modern standards, until I turn on ray tracing. I don't mind ray tracing in a single-player campaign even if it comes with some stutter hiccups. When I paid for the GPU, I intended to use it to its limit. Also, maybe it's not such a bad thing that the best gaming CPU is falling behind GPU performance. It may force software developers to improve game engines, and we'd get a free fps boost instead of having to keep upgrading hardware for minimal gains.

    • @ArdaSReal
      @ArdaSReal Місяць тому +1

      Some games use it well, but yes, in most titles I don't see much difference in looks, while the difference in fps is very obvious. I'm sure it will be standard in the future though, in AAA games at least.

    • @mr.t3782
      @mr.t3782 Місяць тому

      This is only true if you have a low-quality monitor. I agree with you if you're not using an OLED. I have both, and ray tracing does look way better on a properly tuned OLED.

  • @tapioorankiaalto2457
    @tapioorankiaalto2457 Місяць тому +1

    I play at 1440p/PSVR2 using a 4070 paired with a 7600 & 32 GB DDR5, and I'm GPU limited 99% of the time. I don't really need a faster CPU until a couple of years from now.
    You're not really making the case for CPU bottlenecking when you're using a 4090, which about 0.01% of gamers have.