CPUs Matter for 4K Gaming, More Than You Might Think!

  • Published 30 May 2024
  • See MSI QD-OLED Gaming Monitors: msi.gm/GetQDOLED
    Go QD OLED Global: msi.gm/GoOLEDGlobal
    Go QD OLED Australians: msi.gm/GoOLED
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs: • Nvidia Has a Driver Ov...
    Nvidia Driver Investigation [Part 2] Owners of Old CPUs Beware: • Nvidia Driver Investig...
    Buy relevant products from Amazon, Newegg and others below:
    GeForce RTX 4070 Super - geni.us/wSqSO07
    GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
    GeForce RTX 4080 Super - geni.us/80D6BBA
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 4070 - geni.us/8dn6Bt
    GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
    GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
    GeForce RTX 4060 - geni.us/7QKyyLM
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    Radeon RX 7800 XT - geni.us/Jagv
    Radeon RX 7700 XT - geni.us/vzzndOB
    Radeon RX 7600 XT - geni.us/eW2iWo
    Radeon RX 7600 - geni.us/j2BgwXv
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6750 XT - geni.us/53sUN7
    Radeon RX 6650 XT - geni.us/8Awx3
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6600 - geni.us/cCrY
    Video Index
    00:00 - Welcome to Hardware Unboxed
    00:44 - Ad Spot
    01:29 - CPU Reviews
    04:08 - Test System Specs
    04:50 - Hogwarts Legacy
    11:02 - Starfield
    14:55 - Counter-Strike 2
    17:28 - Final Thoughts
    CPUs Matter for 4K Gaming, More Than You Might Think!
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science & Technology

COMMENTS • 1.2K

  • @1Grainer1
    @1Grainer1 27 днів тому +1566

    Obligatory "where is 5800x3d?" comment

    • @stephenwerner1662
      @stephenwerner1662 27 днів тому +164

      Did you check behind the couch?

    • @InfernoTrees
      @InfernoTrees 27 днів тому +50

      Check your pockets

    • @brucethen
      @brucethen 27 днів тому +9

      Lol you beat me to it

    • @1Grainer1
      @1Grainer1 27 днів тому

      @@stephenwerner1662 found it! in the cookie jar

    • @GewelReal
      @GewelReal 27 днів тому +8

      Still in a store

  • @fracturedlife1393
    @fracturedlife1393 27 днів тому +1343

    Usually wallet limited before CPU limited these days!

    • @djpep94
      @djpep94 27 днів тому +18

      So true 💔

    • @thesilentassassin1167
      @thesilentassassin1167 27 днів тому +12

      Not just these days

    • @theanimerapper6351
      @theanimerapper6351 27 днів тому +57

      CPUs are actually pretty cheap nowadays. It's the GPUs that cause a wallet limit 😂

    • @RiasatSalminSami
      @RiasatSalminSami 27 днів тому +31

      @@thesilentassassin1167 More these days compared to past. Especially since a potato gpu costs 500+ $

    • @empathy_ggs
      @empathy_ggs 27 днів тому +3

      If there is a booklet of truths for hardware related statements, this is the truest lmao

  • @StefandeJong1
    @StefandeJong1 27 днів тому +354

    I have an RTX 3090 and like playing simulation games, using my 42" LG C2. Upgrading from the 5600X to the 5700X3D (I reused the 5600X for my HTPC) gave me 60% higher 1% lows and 55% faster simulation speed in Cities:Skylines 2. It was night and day

    • @jay-5061
      @jay-5061 27 днів тому +38

      That game is also garbage in performance terms

    • @yahyasajid5113
      @yahyasajid5113 27 днів тому +23

      I did a comparison with a 7500F and 7900X3D in modded BeamNG
      Up to 40% performance difference with a 3090 at 1440p was nuts

    • @eswecto6074
      @eswecto6074 27 днів тому +11

      I wish there were more data like this about game performance at 4K60 and 4K120, for both upscaling/FG and with/without RT, across different games. It would save a lot of time without the need to look through different benchmark videos on YouTube.

    • @hilerga1
      @hilerga1 27 днів тому +16

      I’m on a 5800x (non 3d), RTX 3080, and that 42” LG C2. Built the computer for retail prices at launch during the lock down - I was stuck home with very little better to do than obsess over beating scalpers to good parts 😂.
      Picked up the 42” LG C2 later on sale for about $600 and I have to say it is killer for sim gaming and single player RPGs! Best “monitor” I’ve ever owned.

    • @aftermarkgaming
      @aftermarkgaming 27 днів тому

      @@jay-5061 that makes it a great real world use-case example

  • @gurshair
    @gurshair 27 днів тому +54

    "try not to fixate on the cpu used"
    As someone with a R5 3600 I will most definitely pay attention

    • @PhotoshopArt
      @PhotoshopArt 21 день тому +3

      The 3600 is just fine. Don't throw away money. I was on a 2-core FX4100 a while ago. Wait for 2-3 more generations, then you're gonna notice the difference. Thank me later.

  • @moevor
    @moevor 27 днів тому +164

    Steve: "Try not to focus on the actual CPUs used here"
    Me: Yeah, good luck with that Steve!

    • @martineyles
      @martineyles 27 днів тому +4

      Well, it does matter. We don't see the in-between options like the 5600 or 7600, which are more likely upgrades for people like me currently running Skylake CPUs.

    • @deviantbuilds
      @deviantbuilds 27 днів тому +9

      I mean, it matters a lot. That's like having a race between a current Olympic level runner and some old dude that was the fastest in high-school 😆

    • @MaxIronsThird
      @MaxIronsThird 27 днів тому +3

      people missed the joke

    • @MaxIronsThird
      @MaxIronsThird 27 днів тому +11

      @@martineyles just look up the review for the CPU you're interested in. This video is not a CPU review comparing the 3600 and the 7800X3D; he's just trying to make a point, and it seems it's still going over people's heads.

    • @moevor
      @moevor 27 днів тому +4

      But it doesn't. What matters a lot is your target frame rate, or to be more precise frame time (CPU time as captured by PresentMon). The CPU scaling is extra info you need only if your current CPU is not providing what you need. Seems that message did not come across

  • @KimBoKastekniv47
    @KimBoKastekniv47 27 днів тому +44

    Of course the 7800x3D is faster than the 3600 even at 4K, it has more cores!
    (Steve don't kill me I'm joking)

    • @Hardwareunboxed
      @Hardwareunboxed  27 днів тому +50

      Agent has been dispatched...

    • @nathangamble125
      @nathangamble125 8 днів тому +2

      Obviously an FX 9590 is even faster! It's a 4.7GHz 8-core CPU, which is more than the 7800X3D's 4.5GHz!

  • @KillFrenzy96
    @KillFrenzy96 27 днів тому +337

    When a game throws you into a crazy situation, it's usually the CPU that suffers the most. Most benchmarks are not done during the unpredictable chaos that you get into for many games, especially modded games.

    • @roqeyt3566
      @roqeyt3566 27 днів тому +44

      Exactly, and those are the situations where you do want a better CPU, to reduce frame skips, jitter, input lag etc.
      For some folks that's worth $300 on a $1500 PC; others would rather pocket the money and deal with it

    • @Omar-Asim
      @Omar-Asim 27 днів тому +12

      Or certain old games or MMOs

    • @francystyle
      @francystyle 27 днів тому +30

      Yea I’ve got a 4070TiS with a 14600k and at the beginning of a mission in Helldivers 2 I’m gpu limited, but when I’m at the end with a ton of enemies my cpu starts to limit my gpu and my frames drop

    • @InnuendoXP
      @InnuendoXP 27 днів тому +3

      Yeah though it'd be interesting to see what extent multithreading would be beneficial, particularly as a modded game would probably be loading heavily on tasks designed for a single thread so the single-thread performance is the main limiting factor once you're at 6 cores or more. The 5600X3D showed this pretty comprehensively I think.

    • @snozbaries7652
      @snozbaries7652 27 днів тому +3

      @@francystyle really?? I have a 4070ti 5800x3d and I don't get cpu limited. I play @ 4k with dlss quality.

  • @UncannySense
    @UncannySense 27 днів тому +71

    So basically, if I run my game, whatever that game might be, at 4K, change the quality settings and notice next to no frame rate change or uplift, then I need to upgrade my CPU....

    • @exscape
      @exscape 27 днів тому +16

      Yep! That's basically always the case. Some settings will affect the CPU more than others, though.
      If you get similar frame rates at your standard resolution and HALF your standard resolution (or at say 720p), you're almost certainly CPU limited.
      You could also check GPU usage while gaming though. When GPU limited it will almost always be above 95%, typically 98%-99%.
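
      A rough sketch of that GPU-usage check (a toy Python example; the ~95% threshold and the sample values are just illustrative):

      def likely_bottleneck(gpu_usage_samples, gpu_bound_threshold=95.0):
          # Average the GPU utilisation samples taken while gaming.
          avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
          return "GPU limited" if avg >= gpu_bound_threshold else "likely CPU limited"

      samples = [72, 68, 75, 70, 66]  # hypothetical GPU usage (%) during gameplay
      print(likely_bottleneck(samples))  # -> "likely CPU limited"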

    • @kleinerprinz99
      @kleinerprinz99 27 днів тому +14

      @@exscape The problem is the many games that came out in the last couple of years that are very unoptimised and will idle your GPU, so you'll run into a CPU limit. Famous examples are Jedi Survivor (the 2nd game) and Cities Skylines 2, because the devs didn't even do their basic homework. Elden Ring is also a culprit; they didn't optimise anything even 1 year, or 2 years now, after release. No CPU in the world exists that can compensate for bad & lazy programming.

    • @IIHydraII
      @IIHydraII 27 днів тому +2

      Could also be memory, but generally speaking, yes.

    • @White-nl8td
      @White-nl8td 27 днів тому +6

      Check your GPU utilization. If it's less than 95%, you are certainly CPU bottlenecked.

    • @moevor
      @moevor 27 днів тому +5

      Use PresentMon from Intel and look at CPU time vs GPU time; this is the true metric of whether you are CPU bound or GPU bound. CPU and GPU time in ms can be converted to equivalent FPS = 1000 / time. The difference in CPU/GPU time will tell you how imbalanced each component is for that workload. Bear in mind that just because you are CPU bound does not mean you need to upgrade, for two reasons: 1) is the fps you are getting enough? 2) will a faster CPU allow the GPU to render more? Bonus point: 3) is there a faster CPU in this workload? Your upper limit is not infinity.
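
      As a rough sketch of that time-to-FPS conversion (a toy Python example; the frame times are invented, not real PresentMon output):

      def to_fps(frame_time_ms):
          # Convert a per-frame time in milliseconds to an equivalent FPS.
          return 1000.0 / frame_time_ms

      cpu_time_ms = 6.5   # hypothetical CPU frame time from a capture
      gpu_time_ms = 11.8  # hypothetical GPU frame time from the same capture

      cpu_fps = to_fps(cpu_time_ms)  # ~154 FPS the CPU could sustain
      gpu_fps = to_fps(gpu_time_ms)  # ~85 FPS the GPU could sustain

      bound_by = "GPU" if gpu_time_ms > cpu_time_ms else "CPU"
      print(f"CPU limit ~{cpu_fps:.0f} FPS, GPU limit ~{gpu_fps:.0f} FPS, {bound_by}-bound")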

  • @andrewcross5918
    @andrewcross5918 27 днів тому +201

    Of course they matter. Not everyone plays AAA titles that are often GPU bound. Plenty of people are out here conquering Europe in HOI4 and exterminating the Xenos in Stellaris or creating some traffic nightmare in Cities Skylines or trying to build the longest, most efficient production line ever in Satisfactory / Factorio or building a rocket to Mars in Civ 6 or guiding a team to victory in Football Manager, and I could go on. CPU performance is the primary driver in these kinds of titles, and yet they so rarely get tested, which is a real shame.

    • @nootonian4149
      @nootonian4149 27 днів тому +1

      what is a good gpu for cities skylines 1 at 4k?

    • @RobBCactive
      @RobBCactive 27 днів тому +1

      Well, a bar showing fps isn't that significant (for those games); after the X3D CPUs came out I saw various tests, but the results didn't seem all that surprising.
      CPU reviews test various features, not just action games.
      For purchasing decisions, weighting the CPU / platform LT / cache relatively more in the budget seems reasonable.

    • @Flaimbot
      @Flaimbot 27 днів тому +10

      Add WoW to the list. There's never "enough" CPU headroom.

    • @luzhang2982
      @luzhang2982 27 днів тому +6

      Rimworld, CS (both Counter-Strike and Cities: Skylines), VR, etc. Lots of top-10 games and game categories that are not what reviewers use.

    • @genocidehero9687
      @genocidehero9687 27 днів тому

      Wow niche nerd games lmao

  • @lanelesic
    @lanelesic 27 днів тому +47

    As an owner of the R5 3600 I must say it's one of the best CPUs I've bought. A perfect match for GPUs like the RX 6600, 6600 XT and the 6650 XT.

    • @andersjjensen
      @andersjjensen 26 днів тому

      In most games it can still pin those GPUs at 1080p. If you're gaming at 4k with upscaling it will still probably pin a 6800XT.

    • @lanelesic
      @lanelesic 26 днів тому +1

      @@andersjjensen Pin?

    • @andersjjensen
      @andersjjensen 26 днів тому

      @@lanelesic "to max out"

    • @lanelesic
      @lanelesic 26 днів тому +1

      @@andersjjensen It technically doesn't max them out, but those are the most reasonable combos.

    • @RohanSanjith
      @RohanSanjith 25 днів тому +1

      I have a 3700X, primarily used for video editing. Very powerful, but it does run hot.

  • @Exitar15
    @Exitar15 27 днів тому +7

    Raise your hand if you have a 4k monitor....

  • @macmuchmore1
    @macmuchmore1 27 днів тому +21

    I upgraded my Ryzen 5 3600 to 7800x3d, while keeping the same 3070. My FPS almost doubled! Had no idea my cpu was holding back my gpu that much. CPU is DEFINITELY important in a gaming pc.

    • @Argoon1981
      @Argoon1981 27 днів тому +3

      Both are important; pairing a strong CPU with a weak GPU, you will have the inverse.
      So the best is to have a CPU and GPU combo that complement each other. But it's certainly not easy to know what to buy.

    • @JonBslime
      @JonBslime 27 днів тому +4

      Yea, obviously it's important when you're CPU bound, basically at 1080p and sometimes 1440p, but at 4K, which the video is referring to, you're 99% GPU bound, where the CPU is not the main issue here…

    • @felixmdx
      @felixmdx 24 дні тому +1

      whats your resolution!?

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 24 дні тому +1

      Yes but did it double your 4k performance?

    • @vigilant_1934
      @vigilant_1934 23 дні тому

      @@JonBslime Not 99% at 4K. It depends on the game; Steve showed about a quarter of these games being CPU bound at 4K. Did you watch the whole video?

  • @nannnanaa
    @nannnanaa 27 днів тому +9

    One thing is that I would rather be limited by the GPU than by the CPU. When limited by the GPU, the frames are lower but usually more stable than when limited by the CPU, where stutters and hitches occur more, which I find more annoying. So I usually want my builds balanced towards more CPU power, especially with a high refresh rate monitor and playing mostly first-person games.

  • @GewelReal
    @GewelReal 27 днів тому +45

    I can see that Broadwell-based LGA 2011-3 CPU on the thumbnail 👀

  • @ericpisch2732
    @ericpisch2732 27 днів тому +73

    I suspect I might have been one of the people that triggered this video, this video was extremely helpful and very insightful. Thank you very much for it. I'm gonna stop asking him for 4K benchmarking :-)

    • @martineyles
      @martineyles 27 днів тому +4

      I don't think he's proven the point. Ignoring 1% lows didn't help. Neither did ignoring the 5600 and 7600 CPUs.

    • @geebsterswats
      @geebsterswats 27 днів тому +11

      I’ve been asking for this type of video for over a year. I game at 1440p and had bought an RTX 4090, but still had my 5 year old 3700X CPU. Now i know I’ve been leaving performance on the table

    • @rooster1012
      @rooster1012 27 днів тому +17

      @@martineyles You missed the point of the video completely.....

    • @martineyles
      @martineyles 27 днів тому +2

      @@rooster1012 The video showed a lot of useless 1080p charts. From those charts I was unable to say whether a CPU could deliver the goods (60 fps consistently) at 4k. I needed to see the 1% lows at 4k. However, that's useless if it only shows an old CPU and the best available. We need those charts for other CPUs.

    • @lewzealand4717
      @lewzealand4717 27 днів тому +16

      @@martineyles This was an example video to show *why* CPU tests are done this way, not a comprehensive CPU test video. Look for other vids from HUB and others to find more specific data you need.

  • @andersjjensen
    @andersjjensen 26 днів тому +18

    I admire that Steve patiently, over and over again, explains this simple concept. That people can't get into their heads that any system is going to cap out, in any given scenario, at the performance of the part that is the slowest in that particular scenario, is completely bonkers to me. It's only two bloody factors to pair for gaming: how fast can a given CPU go, and how fast can a given GPU go. The slower one is your resulting performance.
    If you want nightmare fuel, get into database performance tuning. Suddenly you're juggling memory speed, memory latency, core count vs core speed, NUMA domain configuration, and umpteen different storage performance metrics... spread over wildly different tasks

  • @paulmaydaynight9925
    @paulmaydaynight9925 27 днів тому +4

    so... 8K 120fps is still unobtainable, 4 years after 8K 120fps TVs became available in Japan

  • @tommihommi1
    @tommihommi1 27 днів тому +46

    even with an RX 480, going from a 3600 to a 5800X3D is a crazy difference in real-world in-game experience.
    The main difference is that you get rid of those spikes where the CPU can't keep up and the game noticeably chugs, even if they don't show up in the 1% lows at all

    • @dagnisnierlins188
      @dagnisnierlins188 27 днів тому

      Get Intelligent Standby List Cleaner, and there is a video on techyescity where you can unlock power settings in Windows and change how often your CPU gets pinged for its data; the default is way too high. Just by this your system will be snappier and more responsive.

    • @martineyles
      @martineyles 27 днів тому

      It would be good to see whether a new CPU or GPU would help most, as my i5 6600 rx480 8GB combo sometimes chugs in Cities Skylines and SWTOR at 4k, but it's hard to tell whether it's CPU or GPU.

    • @dagnisnierlins188
      @dagnisnierlins188 27 днів тому

      @@martineyles try Intelligent Standby List Cleaner and watch the techyescity video on how to unlock options in power plan settings, where you can change how often your CPU gets pinged for data. It will improve stuttering and responsiveness.

    • @SpringHaIo
      @SpringHaIo 27 днів тому +6

      @@martineyles download MSI afterburner and look at the GPU and CPU usage graphs. If GPU isn't at 90-100% usage, you have a CPU bottleneck. I would guess at 4k with a rx480 you're GPU bottlenecked.

    • @martineyles
      @martineyles 27 днів тому +2

      @@SpringHaIo I'm not always playing the latest games, and I get the impression that things like SWTOR and Cities Skylines may have CPU issues, but It's hard to get a definitive picture one way or the other. The GPU usage in AMD's overlay can keep going up and down.

  • @catwrangler420
    @catwrangler420 27 днів тому +1

    Been waiting for a CPU 4k video from you guys, thank you for the hard work!

  • @stylist_bitter6194
    @stylist_bitter6194 27 днів тому +1

    Cheers Steve, this is a great piece of content. I conceptually understood this, but this video did a great job of demonstrating the point and driving it home.

  • @klasssavage6581
    @klasssavage6581 27 днів тому +58

    The Ryzen 5 3600 lived a long and meaningful life. It is finally time to upgrade.

    • @Pikkoroo
      @Pikkoroo 27 днів тому +3

      I did at the beginning of 2023, 3600 to 7800X3D, and I’m still on 1080p @ 240Hz. 200+ 1% lows are a game changer. I can definitely see why people go over 240, but I’m good.

    • @ismaelsoto9507
      @ismaelsoto9507 27 днів тому +7

      Put a R7 5700X3D or 5800X3D and you're good until AM6 :)

    • @deviousnate7238
      @deviousnate7238 27 днів тому +2

      A year ago my R5 3600 started its 4k 60hz HTPC duty. Once my RX 6800 is free to join it there I expect that combo to last quite a while longer. Long live the 3600!

    • @klasssavage6581
      @klasssavage6581 27 днів тому +3

      My monitor is 1440p at 144hz. I'm still gaming on the Ryzen 9 5900X.

    • @nipa5961
      @nipa5961 27 днів тому +4

      The 3600 was an amazing budget gaming CPU for the last (almost) 5 years.

  • @Jojo_Tolentino
    @Jojo_Tolentino 27 днів тому +160

    Can't wait for the comments about bottleneck

  • @gerardfraser
    @gerardfraser 27 днів тому +2

    Thanks for sharing, the testing makes sense

  • @MrHamof
    @MrHamof 27 днів тому +5

    Cryengine games increase the LODs when you're at a higher resolution, under the theory that at higher resolutions it's easier to spot lower LOD models. This is why in some circumstances increasing the resolution can reduce your framerate in those games even if the GPU still isn't at 100%.
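
    As a toy illustration of that idea (a Python sketch, not CryEngine's actual heuristic; the thresholds and numbers are made up), LOD selection is often driven by an object's projected size in pixels, which grows with output resolution:

    def lod_level(object_height_m, distance_m, vertical_res):
        # Approximate on-screen height in pixels for a given output resolution.
        pixels = (object_height_m / distance_m) * vertical_res
        if pixels > 120:
            return 0  # full-detail model
        elif pixels > 50:
            return 1
        return 2      # lowest detail

    # Same object, same distance: higher resolution -> more pixels -> higher LOD,
    # which costs more CPU (draw calls, animation) as well as GPU.
    print(lod_level(2.0, 30.0, 1080))  # -> 1 at 1080p
    print(lod_level(2.0, 30.0, 2160))  # -> 0 at 4K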

  • @MichaelChan0308
    @MichaelChan0308 27 днів тому +89

    Glad I upgraded my 3600X to 5800X3D to pair with 4090 for 4K gaming.

    • @Cole_Zulkowski
      @Cole_Zulkowski 27 днів тому +19

      so you gave nvidia an arm and or a leg with a lung for an overpriced gpu?

    • @45eno
      @45eno 27 днів тому +9

      The 4090 can get bottlenecked by a 5800x3d at 4K. If you do then make sure to run settings that will load the gpu down like Ultra.

    • @Deathscythe91
      @Deathscythe91 27 днів тому +7

      you saved up some money to pay the repair shop? for when the power plug melts?

    • @samson7294
      @samson7294 27 днів тому +36

      Enjoy mate! I got my 4090 about a year ago and I love it.
      Upgraded from a 8700k/2080 to a 7800X3D/4090
      Ignore the pocket watchers in the comments!
      All that matters is if you enjoy it.

    • @philliprokkas195
      @philliprokkas195 27 днів тому

      @@samson7294 2.5x the price for 30% more performance over something like a 7900xt looooool

  • @stevenguyen22
    @stevenguyen22 26 днів тому +3

    This was a great learning experience for me, thank you!

  • @jaybee5771
    @jaybee5771 20 днів тому

    Exceptional video explaining the situation. Thanks, mate!

  • @IIIII47IIIII
    @IIIII47IIIII 27 днів тому +32

    you should rename your channel to HardwareMythbusters

    • @christophermullins7163
      @christophermullins7163 27 днів тому +1

      Actually a great name for a second channel for... Busting hardware myths

    • @martineyles
      @martineyles 27 днів тому +3

      Except he hasn't really busted any myths.

    • @awebuser5914
      @awebuser5914 27 днів тому

      They are obsessively focusing on pointless edge-cases, just like their incessant VRAM bleating...

    • @kainlamond
      @kainlamond 26 днів тому

      Is it really a myth though? It's really just common sense

  • @baxsb1578
    @baxsb1578 27 днів тому +7

    As someone running an R5 3600X with a 4070 Super and waiting for next-gen CPUs, this is as-good-as-it-gets clarification on putting more eggs in the CPU basket. Thank you!

  • @alifahran8033
    @alifahran8033 27 днів тому +7

    Yes, I am guilty of saying "CPU doesn't matter at 4K", but context matters here. I am saying it when it's 7600 vs 7800X3D in 4K gaming with a non-4090 GPU. In my case 7600 + 7900 XTX. In this case the difference truly is almost unnoticeable without an FPS counter. There could be some very CPU limited titles that might show the difference between the CPUs, but for the most part it is fine.
    At least for my case - single player, 4K native, Ultra/Very High, mostly highly GPU demanding games, the CPU doesn't matter as much. I could've probably used 5600 just fine as well. But who am I to pass on a 150 euro deal for 7600 one year ago, when at the time 7800X3D was newly released with a price tag of 450 euro.
    I am personally not willing to pay 3 times more, for less than a 5% increase of performance.
    Now that 7600 costs 200-220 euro and 7800X3D is 340 euro, it's obvious, that anyone who could afford it should buy it. It's really THAT good. But at the time, for the money, for my 4K use case it simply didn't make sense to buy 7800X3D.
    Hopefully Zen 6 is on AM5, so I can upgrade to Zen 6 X3D!

  • @EhNothing
    @EhNothing 25 днів тому +1

    Good data. Thanks for the info.

  • @unitybeing777
    @unitybeing777 26 днів тому

    Loved this benchmark review, very well thought about.

  • @The_Noticer.
    @The_Noticer. 27 днів тому +3

    The Nvidia overhead issue is never acknowledged on forums, it's really weird.

  • @tyre1337
    @tyre1337 27 днів тому +10

    i've upgraded from a 9700k to a 13600k and my framerate has doubled in certain game scenarios
    i'm using a 7800xt at 1440p, and was told by people it wouldn't make a difference
    for reference i'm also using the exact same ram

    • @mmaayyssoonn8858
      @mmaayyssoonn8858 27 днів тому +1

      I had a 5600X clocked to the moon with RAM at super tight timings, and in a good amount of the games I played, the 13600K I switched to either matched it or did worse in performance. I found disabling 3 E-cores actually gave me a substantial lift in performance. I suspect it's to do with freeing up more cache for the other cores.

    • @H94R
      @H94R 27 днів тому +2

      2x the perf and a few quid back selling your old hardware. hard to argue with that! people seem to think anything above 1080p just bypasses the cpu and doesn’t get held back by outdated hardware as long as you have a shiny gpu 🤣 i went from an AMD Athlon x4 to a 4690k back in the day and the difference was crazy, after being told my HD7770 (i think) gpu wouldn’t really benefit from it.

    • @Personalinfo404
      @Personalinfo404 26 днів тому +1

      willing to bet the people that told you that were children

  • @ivanrozic9809
    @ivanrozic9809 27 днів тому

    Very informative, really. Cheers from Croatia!

  • @dannywinfield324
    @dannywinfield324 25 днів тому

    Great video. Very well explained

  • @vulcan4d
    @vulcan4d 27 днів тому +4

    A 5700x3d vs the 7800x3d would have been an interesting comparison with a 7700xt. I bet you they are identical until you hit a 4070ti super or better.

  • @MaxIronsThird
    @MaxIronsThird 27 днів тому +3

    Why can't people still understand this?
    If a CPU reviewer does a 240p run and the 0.1% low is 200fps, you know that at 16K it will have the same 0.1% lows (IF YOUR GPU CAN MATCH IT)
    The PC is always as fast as its slowest component (and which part that is varies depending on workload)
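
    To put that "slowest component wins" idea in code form (a toy Python sketch with made-up per-component limits, not measured data):

    def effective_fps(cpu_limit_fps, gpu_limit_fps):
        # The system delivers whichever limit is lower for that workload.
        return min(cpu_limit_fps, gpu_limit_fps)

    cpu_limit = 200    # e.g. the 0.1% low measured in a low-resolution CPU test
    gpu_limit_4k = 90  # what the GPU can render at 4K in the same scene

    print(effective_fps(cpu_limit, gpu_limit_4k))  # 90 -> GPU-bound at 4K
    print(effective_fps(cpu_limit, 400))           # 200 -> CPU-bound with a faster GPU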

  • @VicharB
    @VicharB 27 днів тому

    Thanks for this, it clarifies certain questions/queries I had. Good one, kudos. In the meantime I am happy with my upgrade to R7 7700 with 32GB 6400 CL30, retaining older 3080Ti and able to play 4K games, when required using frame generation mods, like on Alan Wake 2.

  • @sam-rv8tp
    @sam-rv8tp 25 днів тому

    thanks! fascinating video, especially for people like myself who are considering one of the new 4k OLED displays in the near future as an upgrade from 1080/1440p

  • @ArmchairMagpie
    @ArmchairMagpie 27 днів тому +13

    I think the reason the CPU is often disregarded is that it's often forgotten there are operations that can only be done on the CPU side. The CPU also has to wait for GPU operations to finish, but when it is finally its turn again it can do its work, and the faster the CPU, the faster it gets that work done. This may be a round of logical or organizational tasks like sorting or iterating through collections etc., and there is a real technological limit there. The CPU isn't just there to present you the Save & Quit button; it's actually in charge of the whole game.
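
    A toy model of that point (a Python sketch with invented numbers: per-frame CPU work such as game logic and draw-call submission stays roughly constant across resolutions, while GPU work scales with pixel count):

    def frame_time_ms(cpu_work_ms, gpu_ms_per_mpixel, width, height):
        gpu_work_ms = gpu_ms_per_mpixel * (width * height) / 1e6
        # With CPU and GPU working in parallel, the slower side sets the pace.
        return max(cpu_work_ms, gpu_work_ms)

    cpu_ms = 8.0             # hypothetical CPU frame time (simulation + draw calls)
    gpu_ms_per_mpixel = 1.6  # hypothetical GPU cost per megapixel

    for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
        print(f"{w}x{h}: {1000 / frame_time_ms(cpu_ms, gpu_ms_per_mpixel, w, h):.0f} FPS")
    # 1080p and 1440p come out CPU-limited (~125 FPS); 4K becomes GPU-limited (~75 FPS).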

    • @GoldenSun3DS
      @GoldenSun3DS 25 днів тому

      I like the analogy that the CPU/RAM is the foundation of the computer and determines WHETHER you can play the game (or what FPS you can achieve) and the GPU determines how good the game will look.
      The argument of "CPUs don't matter as much at 4K resolution" isn't wrong, but it's using the wrong variable in the argument. What people are actually meaning is "CPUs don't matter as much if your target is 30FPS or 60FPS".
      I've seen benchmark videos before where the point of the video is to show how much performance you're missing out on with an old/weak CPU (saying that it's a waste to upgrade GPU if you're still on an old/weak CPU), and in their own graphs on screen, the old/weak CPU was still mostly hitting the 60FPS goal at maximum settings. Their words were saying one thing, but their own evidence was contradicting their point.
      A lot of people still are fine with or prefer 60FPS gaming, and oftentimes if someone is buying a 4K display, it's a 60Hz screen. So in a roundabout way, "CPUs don't matter as much at 4K resolution" is still TRUE unless that person specifically bought a high refresh rate 4K display AND they prefer high refresh rates.
      But with upscaling technology, your hardware still won't actually be running at 4K resolution, more like 1080P or 1440P.
      I love emulation, though, so I actually prefer a high end CPU + a medium end GPU. With emulation, even medium range GPUs can do 4K 60FPS, but the CPU/RAM is much more important.

  • @garbuckle3000
    @garbuckle3000 27 днів тому +4

    This makes a lot of sense and just confirms what CPU reviews say; CPU makes a difference for high-refresh gaming; that meaning above 60fps. How you get above that threshold (4k low settings or low resolution) is irrelevant. I used to game on a 6600XT with a R5 3600, and playing at 1080p, I noticed a difference when I upgraded to a 5700X. Now that I have a 6950XT and play mainly at 4k60, there is less need to upgrade my CPU, however I am still debating it for the times I want to increase my FPS (my display does 4k120) and for multitasking while gaming. For this reason, I'm looking at a 7900X3D as an upgrade. I do wonder how much lower prices will go when Zen 5 is announced, though, or if I should just get it now. Thank you, Steve, for explaining this in detail, especially since 4k gaming is becoming more prevalent.

  • @AK-Brian
    @AK-Brian 26 днів тому +2

    Really loved the methodical tone and approach of this video. Having it set up as a bit of an educational showcase should help a lot of people better understand how to plan for their upgrades, using information that might not always be intuitive at first glance. Those faster CPUs may often find themselves limited at higher resolutions, but the flexibility of being able to lower detail or apply upscaling lets them stretch their legs. It's a great option to have at your fingertips.

  • @leyterispap6775
    @leyterispap6775 27 днів тому +2

    Even in situations where I personally happened to upgrade the CPU prior to upgrading the GPU, I found it interesting that in many games, although the AVG framerates didn't change, the 1% lows felt way better and buttery smooth, even when paired with a low-tier GPU.

    • @benjaminoechsli1941
      @benjaminoechsli1941 27 днів тому +3

      That's fairly typical of a CPU upgrade. Definitely welcome!

  • @jeffreybouman2110
    @jeffreybouman2110 27 днів тому +6

    one thing to add to this: i went from a 3600 to a 5800X3D with a 3060 Ti.
    my max FPS stayed the same but my 1% and 0.1% lows got way better
    upgrading my CPU gave me much smoother gameplay ;)

    • @awebuser5914
      @awebuser5914 27 днів тому

      _"upgrading my CPU gave me much smoother gameplay"_ Yep, good 'ol confirmation bias in full-force...

    • @Amaan_OW
      @Amaan_OW 26 днів тому +3

      @@awebuser5914 most tests show that the 3D cache chips do have vastly better lows than prior CPUs. And regardless, 3600 to 5800X3D is a big jump

  • @4kORCHILL
    @4kORCHILL 27 днів тому +3

    Dang, what camera are you using to film this? The video is crisp

  • @dirtyfunk1165
    @dirtyfunk1165 27 днів тому

    Great video! I happen to be using the 3600 at the moment and also happen to be eyeing off the 7800X3D as my next CPU. Thank you!

  • @enkyenky9410
    @enkyenky9410 24 дні тому

    Thx for all the work thxxxx

  • @MrMeanh
    @MrMeanh 27 днів тому +5

    I have a 4090 paired with a 5800X3D gaming at 4k. I target 90-120 fps when I can (using DLSS and lowering settings) and in many newer games I've noticed that my 5800X3D is a bottleneck in some areas/scenarios. Luckily it's still not at the point where 60+ fps is hard to get most of the time, but in a year or two I suspect that a CPU upgrade will be more needed than a GPU upgrade for me to hit my fps target.

    • @randysalsman6992
      @randysalsman6992 27 днів тому

      It's most likely that the games you are playing benefit less from memory and more from CPU grunt, so if you had the non-3D R7 5800X I bet you'd have better performance. Also, for the games that would benefit from the memory gains the X3D would have given you, you could get most of it back by getting Samsung B-die RAM and OCing it (or underclocking it lol) as close as it can get to double the CPU's fclock. I find RAM at 3800MT/s is the best you can get (it's what my 32GB are clocked to), with the CPU's fclock at 1900MHz. After that you tighten up the RAM's timings and subtimings as low (and stable) as you can get them, and don't worry too much about how far you'll need to push the RAM's voltage, because Samsung B-die just seems to love it; mine are set to 1.55v. Any higher though and you'd probably want to start actively cooling those suckers. Oh, and it never hurts to get yourself aftermarket RAM heatsinks and thermal pads, because that will enable you to use more voltage to push them further without the need for active cooling; I did that and I don't have any active cooling besides what my case fans are providing. I got my RAM to 3800MT/s CL14-14-14-14-28 @ 1.55v, plus all the subtimings lower too, but I can't remember them off the top of my head. lol

  • @dkphantomdk
    @dkphantomdk 27 днів тому +3

    you should really put the GPU usage up next to the results, would be nice to see how much the GPU is limited at those settings.

  • @AvroBellow
    @AvroBellow 27 днів тому +1

    To Steve:
    Steve, this was another great video from you and I swear that you must have the work ethic of a Japanese robot because what you achieve through all of your hard work is just astonishing. This is one of the most misunderstood aspects of PC gaming and you did a fantastic job showing that while CPU performance at 4K isn't as relevant as GPU performance, it still matters because it sets the minimum gaming performance that a PC is capable of regardless of what video card is under the bonnet (I think that's how Aussies say it). I tip my hat to you and I'm really considering joining you on Patreon or Floatplane. I think that would be pretty cool. Good on ya mate!
    To everyone else:
    Steve's 100% right. It's not that the CPU is AS important as the GPU, it's that you need a CPU that's fast enough to deliver a framerate that you want. This is because CPUs aren't really affected by graphics settings as the number of draw calls it makes to the GPU will be roughly the same regardless of the resolution.
    A CPU bottleneck is far worse than a GPU bottleneck because there's really no way around it. I mean, sure, if you have Firefox open with 30 active tabs, closing it would probably go a long way to alleviate the CPU bottleneck but when Steve's testing CPUs in gaming, there's NOTHING open to slow them down. A GPU bottleneck is more common but far less critical because you can turn graphics settings down, employ upscaling or lower the resolution to get those framerates up. However, a CPU's max framerate is its max framerate.
    Ten years ago, you could OC the CPU to get better framerates (assuming that your cooling solution was good enough). This is why my attitude towards overclocking was to only do it when the CPU was getting old and no longer giving me the framerate I wanted. I would OC the CPU to give me time to save up for an upgrade and the only CPU that I never had to do that with was my FX-8350 because it gave me playable frame rates for the whole five years I used it.
    However, once I upgraded to an R7-1700, overclocking became more or less irrelevant because modern CPUs auto-clock themselves as high as possible on their own. While it's true that you technically can overclock CPUs like the R7-7700(X) and i7-13700K, the gains that are possible by taking that route are nowhere near the gains that were possible with CPUs from the FX/Sandy Bridge era, an era that ended about seven years ago when things switched to the Ryzen/Skylake era we live in now.
    Overclocking becomes even less relevant when one considers that the best gaming CPUs are AMD X3D CPUs and they can't be manually overclocked AT ALL. I actually like that because X3D CPUs, despite their incredible gaming performance, are forced to have relatively low TDPs and are thus much easier to cool since there is no PBO Max setting for them. This means that if I were to upgrade my R7-5800X3D to an R7-7800X3D, I would still be more than fine using my AMD Wraith Prism cooler that I got for free.
    This is also why I tell new builders to get an AM5 CPU that doesn't have a suffix like X or X3D because they get a usable cooler with it for less than an X model costs without one. I usually recommend the R7-7700 because it comes with a Wraith Prism that not only is far better than the Wraith Stealth, but is easily one of the most beautiful air coolers ever made and it scratches that itch that some young'uns have for RGB aesthetics (and I totally get that because I like it too) without them having to resort to some $100 240mm AIO. An AIO on an R7-7800X3D is a complete waste of money and that $100 would be far better spent on getting faster RAM which, while not relevant for an X3D CPU (because the 3D V-Cache is faster than the fastest RAM anyway), is completely relevant for unlocking the performance of CPUs with suffixes like X, K and F as well as CPUs with no suffix at all.

  • @nathangamble125
    @nathangamble125 8 днів тому +1

    I really love this video. It shows a ton of the nuance involved in bottlenecking which is overlooked in typical benchmarking videos.

  • @AronHallan
    @AronHallan 27 днів тому +6

    It does; turn on ray tracing and go to a very crowded area in Cyberpunk.

    • @thetruth5232
      @thetruth5232 27 днів тому +3

      Yup. My 5800X3D runs out of steam well before my 7900XTX with RT+PT at 1440p60. The frames drop into the 50's while the GPU runs below 80%.

    • @andersjjensen
      @andersjjensen 26 днів тому +1

      @@thetruth5232 Yikes. My 7950X3D does 90-95FPS very solidly, also on an XTX at 1440p. My settings are, however, a custom mix of RT and raster.

  • @englematic
    @englematic 27 днів тому +3

    lmao it's pretty funny that you're using an R5 3600 for the low end. That's the exact CPU I have in the computer that's hooked up to my 4K TV, along with an RX 6800. I realize it's a pretty unoptimized build, but it was more of a spare parts PC than anything for streaming movies and the occasional older platformer or cozy indie game.

    • @christophermullins7163
      @christophermullins7163 27 днів тому +2

      Honestly that is a great balance for 4k. Similar balance to my setup 5600@4.7ghz and 6950xt at 4k.

  • @coolvinay
    @coolvinay 25 днів тому +2

    Seems like Steve has too much spare time to kill. It's not even a 5600.

  • @agent4seven856
    @agent4seven856 27 днів тому +2

    8700K team here. Still happy with what I'm getting out of it with my 3080Ti in terms of performance @1440p/4K and I OC'd it to 4.8GHz on all cores. I also don't mind perfectly stable 30FPS since I also play games on the PS5, but in no way 60 FPS is terrible, moreover, most AAA single player games can't even be played with over 120-144 FPS and in some cases locked to 120 or 144 FPS cuz they're locked to 60. Some games also have bugs and other stuff when running at higher FPS. My point here is that if you're only playing single player games, you'll be fine with 144Hz monitor, be it 1440p or 4K (16:9, 21:9 etc.) and in some cases 7800X3D won't help you get more than 120-144 FPS, even if it can deliver more performance cuz some games are locked to 60/120/144Hz.
    I don't care about competitive gaming, multiplayer games and such, so it doesn't matter to me if the 7800X3D is better. I also won't be playing games at medium or low settings cuz I'm perfectly fine with a locked 30/40/50/60FPS. I mean, the 8700K is 7 years old already and the fact that it can still provide the level of performance I'm comfortable with is one hell of an achievement after almost a decade. I do plan to upgrade the whole platform at some point, but it's hard to get the latest high-end mobos where I live and... at this point I'd better wait for the Ryzen 8000 series and new mobos. We'll see.

  • @band0lero
    @band0lero 27 днів тому +4

    7800X3D is a beast! I regret not giving enough attention to CPU performance in the past. Upgrading from 5800X to 5800X3D to 7800X3D has been great. Can’t stress enough how much better it is in 1% lows and overall performance stability.

    • @Pikkoroo
      @Pikkoroo 27 днів тому +2

      200+ 1% lows is a game changer.

    • @TheGuWie
      @TheGuWie 25 днів тому

      This is interesting information for me (running an X570 mobo, a 5800X CPU, a 4090 GPU and a 240Hz monitor). Thx. But upgrading isn't just money but also time and stress.

  • @christophermullins7163
    @christophermullins7163 27 днів тому +6

    I made this same comment on daniel owens video on the subject. The only reason this video needs to be made or holds value for the community is because some people do not understand the way a pc works with regards to a gpu or cpu limitation. It is indisputable that CPU limited scenarios are less likely at higher resolution. It is ALWAYS beneficial to have a faster CPU in all scenarios.. that is obvious. However, if your favorite game is a very gpu demanding game and you are ok playing 4k 60fps then a cpu upgrade will most likely have ZERO impact on that situation where as it would at 1080p. Not every game/scenario can be treated like this but a LOT can be. This video is helpful only for the ignorant gamers that do not understand pcs on that level. This is no different than seeing Steve get frustrated with people that continue to demand "real world testing" by showing cpu benchmarks at 1440p or 4k.. That is simply not how it works and this video is for the same people asking for these "real world benchmarks". No offense to anyone.

    • @martineyles
      @martineyles 27 днів тому +2

      Except the video shows that the 3600 struggles with 1% lows and can't make 4k60 in several of the games. We only know that because 4k charts are there. The 1080p charts tell us literally nothing about the 1% lows the CPU can achieve at 4k.

    • @tiimhotep
      @tiimhotep 27 днів тому +3

      @@martineyles go touch some grass man, you’ve replied to everyone’s comments saying the same thing. If you think there is further testing needed or they haven’t proven why 4K testing isn’t necessary, you simply didn’t understand the video.

    • @martineyles
      @martineyles 27 днів тому

      @@tiimhotep I think that there is nuance missing and that most other people don't understand that.

    • @awebuser5914
      @awebuser5914 27 днів тому +2

      Maybe it's that Steve is obsessed with being "right", even though it's invariably pointless edge-cases? That sounds more like it to me...

  • @larsjrgensen5975
    @larsjrgensen5975 27 днів тому +1

    Thanks for the bonus info of Low+RT vs Ultra settings

  • @Patrick-tw7nr
    @Patrick-tw7nr 27 днів тому

    Thanks Steve for this video as the timing couldn’t be better. I have been going back and forth on either upgrading my cpu and other parts of my system and upgrading GPU later or upgrading my GPU this year and upgrading CPU and other components later down the road.
    I currently have a Ryzen 3700X and 3070 build with 16gb of DDR 4 3200 RAM with a 800 w power supply. I currently game at 1440 p and play mostly single player games targeting 30-60 and 90 fps with mostly high/ultra settings with RT and some DLSS upscaling to hit those frames.
    I’m looking at possibly getting a 4080 S or 5080 so can play current games like Cyberpunk and upcoming and current PS5 games maxed out at 4K hitting mostly 60FPS with all bells and whistles on my 77 OLED with a controller. I will also play at 1440p occasionally as well.
    Would it make sense to buy a GPU or upgrade my cpu and mobo, ram first?

    • @VeloRakic
      @VeloRakic 26 днів тому +1

      From personal experience, I would recommend that you upgrade your CPU, RAM and mobo first, stretch the budget as far as you can to suit the best price to performance ratio (no need to go overboard for 10-20 FPS more for about 40-60% in price difference), that 3070 would unlock its full potential in any game.
      A good choice of CPU can save you about 7 years of further investment (2-3 GPU generations), whereas GPUs typically last about 3 years at their marketed resolutions 60+ FPS provided you are playing on ultra without that upscaling bullshit. So yeah, it would be wise to get a killer-deal CPU straight away and just wait for a 5080 or just snatch a 4080 when the price is right as it will provide much needed overall system balance. Hell you could even wait for a 6080 if you grabbed a 7700, 13700k, 14700k or 7800X3D right now.

    • @Patrick-tw7nr
      @Patrick-tw7nr 26 днів тому

      @@VeloRakic I appreciate the feedback. There are some banger deals on 7800X3D Mobo and Ram Combos through Microcenter so maybe just spring for that and get some more mileage out of the 3070.

  • @garrettkajmowicz
    @garrettkajmowicz 27 днів тому +4

    I'm still running a Ryzen 3600. I still don't see a large amount of value in upgrading so far. But then again I don't game a lot, am not very FPS sensitive, and wear dirty glasses. I mostly want my code to compile faster.

  • @Flaimbot
    @Flaimbot 27 днів тому +34

    and despite this video, there will still be people not understanding why cpu benchmarks are done at lower res

    • @martineyles
      @martineyles 27 днів тому +8

      Mostly because they are wrong. He ignored the 1% lows not hitting 60fps at 4K60. I don't care about getting a massive framerate, but a nice reliable 60fps without stuttering is important. I saw whether a 3600 can hit it, but that's useless as I won't be buying one. We need to see the 5600 and 7600 included too, as these are the CPUs we would buy if we have a lower budget.

    • @rooster1012
      @rooster1012 27 днів тому +10

      @@martineyles You are missing the point, people make ignorant comments saying that the CPU doesn't matter at 4k gaming and he is showing it does.

  • @kanive1566
    @kanive1566 27 днів тому

    This was a great video!

  • @erictayet
    @erictayet 24 дні тому

    Great video Steve. Not everyone understands the test methodology. I'm surprised you didn't use a Zen 1 CPU. :D

  • @lazyboygamech
    @lazyboygamech 27 днів тому +5

    If you compared the normal 5600, the gap wouldn't be that significant, because Ryzen 3000 was never known for good single-core IPC. It just gave more cores to provide a better offer than Intel. It's the new Ryzen 5000 architecture that changed the deal!

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 24 дні тому

      Thank you, we all know he deliberately picked an older CPU to put against the X3D; if he would've at least tested a 5600X3D the numbers would be much closer

  • @arradog3212
    @arradog3212 27 днів тому +5

    Recently upgraded from a 5950X with 64GB 3733CL14 to a 7800X3D with 32GB 6000CL30.
    I game at 4K with a 4080 Super, feels much smoother now.

    • @turboimport95
      @turboimport95 27 днів тому +3

      I went from a 5900X to a 7800X3D, both using a 4080, and I could tell a big difference; the lag spikes are now gone.

    • @Gielderst
      @Gielderst 26 днів тому

      I upgraded from a Ryzen 1700 + RTX 3070 to a Ryzen 7950X3D + RX 7900 XTX Red Devil Limited + 128GB RAM 6400MHz.
      And so far it's been pretty good.
      But i'll try to save up for Zen 5 / Ryzen 9000.

    • @turboimport95
      @turboimport95 26 днів тому +1

      @@Gielderst you won't need the 9000 series for a long time; keep the same system for at least 5+ years, then upgrade the GPU maybe. I made an R7 1700 with an R9 390X and 16GB RAM last me like 7 years+

    • @Gielderst
      @Gielderst 26 днів тому

      @turboimport95 I know what you mean.
      But now, when i'm in a position where i could potentially upgrade one or two parts at a time
      if i have some saved up cash. Then i wouldn't mind that at all. I'm an enthusiast and really like hardware parts. And it's a hobby of mine to try and obtain the best stuff, if i can, of course. But that's not always the case, cause it's not easy to save up for these things. So i'll see. I hope to have enough saved up by the time Zen 5 and for example a Ryzen 9950X3D is out. Time will tell.

    • @turboimport95
      @turboimport95 26 днів тому +1

      @@Gielderst i would wait and watch the benchmarks first, If the 9950x3d does not improve fps or performance by at least 50% I would hold out until the generation that does. A 10-30% increase etc is not worth it at all.

  • @rgstroud
    @rgstroud 27 днів тому

    Steve, thank you for this review and the statistics I've been looking for that can help me decide when I need to upgrade. I went all out and got the best-of-the-best Zen 3 system about 3-4 years ago, but started with the 3600XT on a Godlike X570 and an ASUS 6900XT LC, which I later upgraded to the 5950X OC with 3800 CL14 DDR4 at about 31K in Cinebench, and then later upgraded the GPU to a 4090. These slides show that the difference between my original 3600XT and the 7800X3D with the 4090, with no upscaling, on ultra ray tracing at 4K (how I PLAY), is a little below 20 FPS. Since I have my own stats on the FPS difference between the 3600 and my 5950X OC, I can now extrapolate the difference using my 5950X in your slides, and I approximate about a 12 to 15 FPS difference. This is not enough to want to spend that kind of money this soon on Zen 4. However, it may well show that the amount gained in Zen 5 or 6 will justify an upgrade then. Thanks again for this, HWUB.

  • @AngryChineseWoman
    @AngryChineseWoman 27 днів тому +11

    Tech Deals likes this

  • @nipa5961
    @nipa5961 27 днів тому +17

    It's so sad to see how many in the comments didn't watch or didn't understand the video.

    • @olo398
      @olo398 27 днів тому +1

      welcome to the internet?

  • @PlayerOne101.
    @PlayerOne101. 27 днів тому

    Wanted to ask then: does the CPU make a big impact with frame gen, either using official frame gen or mods? I get terrible ghosting and judder when using frame gen. I game mainly at 4K on a 3080 FE with an i7 7700K. Before this video I thought 4K was 90% GPU dependent.

  • @AetherProwl
    @AetherProwl 27 днів тому +2

    Why does MSI even bother paying for advertising? They can’t keep these monitors in stock as it is

  • @justinpatterson5291
    @justinpatterson5291 27 днів тому +20

    Hopefully my 5800X3D still sits well on these charts.

    • @christophermullins7163
      @christophermullins7163 27 днів тому +11

      We will never know as.. he didn't nest it here 😭

    • @calisto2735
      @calisto2735 27 днів тому

      @@christophermullins7163 bruh... There are plenty of older videos with the 5800X3D on the channel, you can infer.

    • @DavideDavini
      @DavideDavini 27 днів тому +4

      @@christophermullins7163 I didn’t know CPU’s nested. 🤣💀

    • @christophermullins7163
      @christophermullins7163 27 днів тому +4

      @@DavideDavini :P

    • @DavideDavini
      @DavideDavini 27 днів тому +3

      @@christophermullins7163 it’s a very nice typo mate. Cheers. 🙂

  • @StingyGeek
    @StingyGeek 27 днів тому +6

    Awesome content. Thank you!

    • @gtech325
      @gtech325 27 днів тому +1

      What? He compares two cpus. This content is irrelevant to most of the world. Why he released this, I’ll never know.

    • @Hardwareunboxed
      @Hardwareunboxed  27 днів тому +6

      Not everyone is as blunt as you gtech.

    • @martineyles
      @martineyles 27 днів тому +1

      @@gtech325 The video is released to get views. There's a gap in new hardware to review for a while. Zero views for a long period would probably harm them in the algorithm when the new releases finally appear.

    • @StingyGeek
      @StingyGeek 27 днів тому +2

      @@gtech325 what he's done is used two CPUs, of very different performance levels, to demonstrate what a GPU upgrade means in that circumstance. He demonstrated that you can waste a shit ton of money on an expensive GPU by not keeping your CPU balanced, and that depends on your use case. To that end, he showed how you can test for CPU bottlenecking. At a time of bullshit GPU prices, this is one of the best videos from the hardware unboxed team.

  • @marsMayflower
    @marsMayflower 26 днів тому

    thank you for this... i've been wanting something that looks at 4k gaming and CPU

  • @michael8590
    @michael8590 25 днів тому

    Thanks for making this comparison, it's so much work but really interesting! I've got the 7800X3D from Microcenter, $500 for the board, 32GB of RAM and the chip, so I got an amazing deal, but hundreds of dollars between CPUs does matter for some people! I love how there is no overhead from Nvidia stealing power away from the CPU; I have a 7900 XTX with the chip.

  • @gametime4316
    @gametime4316 27 днів тому +3

    I actually would love a deep dive to see what happens with X3D CPUs when they become CPU limited and not latency limited.
    I believe that X3D CPUs are only good for extreme FPS because they improve latency a lot, but sooner or later a game that pegs them to 90-100% *CPU use* will come (like what happens to 4c/4t CPUs today), and then they will be just as good as the same CPU without the 3D V-Cache.
    The best way to test something like that is to take a 7700X and 7800X3D, drop the CPU frequency to like 1 or 2GHz, and test with heavy CPU games that have good CPU scaling :)

    • @thetruth5232
      @thetruth5232 27 днів тому +1

      The 5800X3D struggles to keep 60fps in Cyberpunk with ray tracing. It jitters just like any other CPU when it drops into the 50s at 90% usage.

    • @Deathscythe91
      @Deathscythe91 27 днів тому +1

      "what happen with X3D CPU's when they become CPU limited "
      your whole story here already answered your own question

    • @gametime4316
      @gametime4316 27 днів тому

      @@thetruth5232 are u sure that its not u'r GPU ? do u see low GPU usage ?
      my 12700 hold over 100FPS with PT if i test it with ultra performance DLSS
      (4070ti 1440P UP DLSS)

    • @thetruth5232
      @thetruth5232 27 днів тому +1

      @@gametime4316 Go to market areas with many NPCs and High crowd densities. CPU usage spikes there, from 50-65% up to 90%. I play 1440p60 with RT+PT and my XTX runs at 80-90%.

    • @gametime4316
      @gametime4316 27 днів тому

      @@thetruth5232 give it a try at 1080P or even 720P with the same FSR, if u'r FPS doesnt go up u are really CPU limited and its quite shocking.

  • @3dge433
    @3dge433 27 днів тому +4

    honestly this comparison would be better for me if the 5600 and 7600 were also included

    • @thegreathadoken6808
      @thegreathadoken6808 27 днів тому +3

      The point of the video isn't to give 3dge433 the benchmarks for the precise hardware they need to know about, but to make a point.

    • @martineyles
      @martineyles 27 днів тому +1

      @@thegreathadoken6808 It doesn't make any point particularly well, and using only these 2 CPUs doesn't help.

    • @wertyuiopasd6281
      @wertyuiopasd6281 27 днів тому +4

      @@martineyles Because you didn't understand the point of the video.
      The throughput limit of a CPU at 1080p is the same at 4K; it just depends on whether the GPU can deliver the same number of frames.
      If the CPU can only deliver 80 frames per second at 1080p low and medium, for example, it won't deliver any more frames than that no matter what.

    • @martineyles
      @martineyles 27 днів тому +1

      @@wertyuiopasd6281 I'd rather see the proof in a 4k chart than have to hope it's the case in every review that doesn't show a 4k chart.

    • @enderfox2667
      @enderfox2667 25 днів тому +1

      @@martineyles But this video has 4K and even 1440p charts, a lot of them

  • @Ultrajamz
    @Ultrajamz 27 днів тому +1

    Curious where the line is, would 7700x vs that 3d have any difference?

  • @RyanProsser0
    @RyanProsser0 24 дні тому

    Steve, I've listened to many hours of your conclusions and Q&A answers over the years.
    And I gotta say, you really upped the IQ of the writing and delivery of the explanations in this video.
    Top job, and thanks for all you do.
    My next CPU upgrade will be from an R5 3600 to, likely, the 5700X3D.
    Cheers

  • @tomtomkowski7653
    @tomtomkowski7653 25 днів тому +3

    C'mon.
    Test something more realistic, like the 7800X3D vs a 5700X or a 12500, because the statement that the CPU doesn't matter at 4K is true, but you have to be realistic.
    What next? 7800X3D vs a Core 2 Duo to prove that CPUs matter even at 8K?

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 24 дні тому

      Exactly, this video proved nothing. Just include 4K benchmarks in your CPU videos, like it's not that hard.

    • @michaelangst6078
      @michaelangst6078 17 днів тому

      I think it's a decent test... The Ryzen 3600 came out in 2019... People like you are acting like it came out in 2016...

  • @seamon9732
    @seamon9732 27 днів тому +6

    Love my 7800X3D.
    The CPU market is very good atm... GPUs, not so much.
    For the same-ish price all I could get was an RTX 3070.
    And I don't upgrade them at the same time, before anyone says "should've put more of the budget into the GPU".

    • @kerotomas1
      @kerotomas1 27 днів тому +4

      You would have been way better off with a 7700 XT or 7800 XT; 8 gigs of VRAM is kinda pathetic these days, especially at the price of a 3070. Last-gen RX 6800s are dirt cheap nowadays too.

    • @redlt194
      @redlt194 26 днів тому

      @@kerotomas1 Unfortunately, AMD video cards may as well be invisible to most. Their loss.

    • @andersjjensen
      @andersjjensen 26 днів тому +2

      An RTX 3070 is also a larger chip (on an older node), plus 8GB of VRAM, half a motherboard's worth of components and a non-trivial cooler. Why people compare CPU prices and GPU prices is beyond me.

  • @sojasan1
    @sojasan1 26 днів тому

    I have an Intel 2600K @ 4.6GHz and a 980 Ti 😅 (still surprisingly working, by my standards). Keep up the good work!

  • @pretentiousarrogance3614
    @pretentiousarrogance3614 27 днів тому +2

    I thought it was going to be about whether the CPU load changes as the resolution increases, but I guess not.
    Also crazy to see how slow Zen 2 is these days.

  • @mttkl
    @mttkl 27 днів тому +5

    Just a couple hours in and, as usual, a third of the comment section is still not getting the point lol.
    Steve: "Don't focus on the components, focus on the FPS"
    Comment section: "But WHY a 3600!!1, you should have tested a 5800X3D! All reviewers are dumb and doing everything wrong!"

    • @martineyles
      @martineyles 27 днів тому

      Focus on the average FPS, but ignore the 1% low, frame stuttering etc. That seemed to be his message.

    • @enderfox2667
      @enderfox2667 25 днів тому

      @@martineyles He summarized his point in the last segment. The message is: "It's the framerate that's important, not the resolution. Looking at heavily CPU-limited results that often show sub-60 FPS isn't useful for gauging CPU performance." Hope this helps ❤

    • @martineyles
      @martineyles 25 днів тому

      @@enderfox2667 Looking at the 4k charts is what tells me whether a CPU can handle 4k60. That's the information I want to know, because my TV has a native resolution of 4k and a maximum framerate of 60fps. I just want 4k charts for the CPUs I'm interested in buying in the games I'm interested in playing.

    • @enderfox2667
      @enderfox2667 25 днів тому

      @@martineyles A CPU can handle 4K at the same fps it would at 1080p; the bottleneck there comes from the GPU. You can check the graphs in this video, and if you listen to Steve, he's trying to explain it to you! (If you don't want to watch the full video, you can skip to the summary at the end.)

  • @terr281
    @terr281 27 днів тому +4

    Another reason to test at "low" resolutions (see the video comments around the 21-minute mark) is that 1080p is still the preferred resolution for many gamers. Yes, 1440p (especially via upscaling) is gaining traction, but it has a LONG way to go. (The fact of the matter is, it costs much less to game at 60, or even 120, fps at 1080p when you factor in overall system costs.)
    My household has yet to upgrade past 27" IPS 1080p 60Hz monitors with good response times. (Our monitors from 2018 haven't died yet.) Based on pricing today, our next monitors would be 27" IPS 2K 100Hz ones with good response times. (There is no need for 4K, in our opinion; you have to get there through software upscaling.)

    • @sjneow
      @sjneow 27 днів тому

      Yup, Steam Hardware Survey data also suggests that most screens out there are still 1080p.

    • @ThunderingRoar
      @ThunderingRoar 27 днів тому +2

      @sjneow And how many of those are shitty 8+ year old laptops? Tim's Amazon referral data, even from a few years ago, showed that the majority of new monitors purchased are 1440p.

    • @terr281
      @terr281 27 днів тому

      @@ThunderingRoar I'd argue it doesn't matter for testing purposes, especially since "decent enough" integrated graphics are now becoming the norm. Yes, these aren't in those 8+ year old laptops, but they are in the new ones.
      The enthusiast (desktop) gaming market first began to be eroded by consoles, then by "good enough" laptops, and now the newer generations (in general) care more about gaming on their phones and tablets. In reality, I'll admit it: I lied about 1080p being the most-played gaming resolution. It is actually 360x800. (I had to look it up.)

  • @spoots1234
    @spoots1234 27 днів тому

    At the same FPS, 4K seems rougher to me since I upgraded to a 4K screen. It's only when I upgraded the CPU that it started to feel as smooth as before. For whatever reason, there was the odd frame drop on a Ryzen 5500 that got ironed out when I upgraded to a 7600. And running games at 1440p was the solution to the smoothness issue before I upgraded the CPU. Strange.

  • @pcrundown
    @pcrundown 27 днів тому

    Excellent video.

  • @danzydan2479
    @danzydan2479 27 днів тому +11

    Great content as always. Buy yourself a slab and enjoy the comments. Cheers mate.

  • @lharsay
    @lharsay 27 днів тому +12

    "CPU doesn't matter at 4K' was more or less true when the RTX2080Ti an later the RTX3090 were the top GPUs but with the 4090 it's very apparent that the increase of GPU performance has outpaced CPUs and I would even go as far to say that none of today's CPUs are good enough to completely utilize a 4090 in every game at 4K.

    • @erikbritz8095
      @erikbritz8095 27 днів тому +4

      Next gen might show this.

    • @tommihommi1
      @tommihommi1 27 днів тому +3

      it doesn't matter if your max framerate is limited by the GPU. A faster CPU will make the game just run better overall, with fewer spikes.

    • @gametime4316
      @gametime4316 27 днів тому +2

      Well, you are right and wrong at the same time.
      It's all about the game: Cyberpunk 2077 with PT runs at like 20FPS on a 4090 at 4K with no upscaling... that won't be limited by an Intel 6700K,
      so if you count on that scenario you can say the 6700K doesn't limit a 4090.
      And if you look at CS2 at 1080p you can say the 7800X3D limits a 4060...

    • @Tech2C
      @Tech2C 27 днів тому +1

      Which begs the question, will we see much of an uplift with the RTX5090 if the CPUs can't keep it fed?

    • @totalermist
      @totalermist 27 днів тому

      @@Tech2C Depends on the game, I guess. Titles like Alan Wake 2 or Cyberpunk 2077 with PT will surely see a big uplift in performance, whereas e-sports titles like CS2, or more CPU-constrained ones like Baldur's Gate 2, probably won't see much of a difference.

  • @glown2533
    @glown2533 24 дні тому

    Would a Ryzen 9 3900XT with 4000MHz RAM in 1:1 sync still hold up with a newer high-end GPU, like a 4080 or 4070 Ti Super, at 1440p?

  • @eswecto6074
    @eswecto6074 27 днів тому

    Finally a good 4K review; a little more 4K upscaling/frame-gen performance data and it would be perfect.
    For 4K60 max settings you can easily go with an R5 5600/R7 5700 + 4070 Ti Super/7900 XT and get the best bang-for-buck combo.
    I'm still in doubt about 16MB vs 32MB of L3 cache for those settings. As a B450 user I'm looking more at the R7 5700; on B550 I'd maybe pick the R5 5600.

  • @user-jq1yb5mv8j
    @user-jq1yb5mv8j 27 днів тому +5

    Hello, what about the 5900X or 5800X3D? That would be interesting.

    • @winterkoalefant
      @winterkoalefant 27 днів тому +1

      Watch one of their recent CPU reviews for that data. Just testing two CPUs in this much depth already took him like 100 hours of work.

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 24 дні тому

      @@winterkoalefant If two CPUs being tested took 100 hours of work then he is terribly inefficient and should look for a different job. Let's say he works 8 hours a day: it would take him 12 and a half days to test only 2 CPUs?!?! That's pretty sad.

    • @vigilant_1934
      @vigilant_1934 23 дні тому

      @@CallMeRabbitzUSVI Steve likely does 12 hours or more some days. I'm sure he has the method down so it doesn't take him as long, but it may depend on the test. Some tests you just have to wait for, no matter how efficient you are.

  • @zodiacfml
    @zodiacfml 26 днів тому +4

    Ironic that as you increase resolution you actually have less need of an expensive CPU. Consider that the AM4 Ryzen 3600 is now dirt cheap while the 7800X3D likely costs five times as much. If I simplify these results, you only need the X3D CPUs for competitive or high-FPS gaming.

    • @pcmark-nl
      @pcmark-nl 25 днів тому +1

      What "you" need, isn't what anyone else needs. The point of the video is that the CPU does matter in a lot of scenarios: when fiddling with resolution and/or quality settings.

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 24 дні тому +1

      Exactly! At 4K high/ultra the gains were marginal, not worth paying five times the price for the high end. That money can be spent on a better GPU.

    • @zodiacfml
      @zodiacfml 24 дні тому

      @@CallMeRabbitzUSVI I've been dreaming of getting an X3D CPU someday because it looks so good in benchmarks/charts, only to realize that I don't play FPS games. Consider also DLSS or frame generation, which increase fps without help from the CPU.

    • @vigilant_1934
      @vigilant_1934 23 дні тому

      @@CallMeRabbitzUSVI Some of that money saved should also go to a better CPU so you can build a balanced system. A cheap low end CPU and a super fast high end GPU don't match. Also the 7800X3D is not expensive or high end anymore and has been under $400 for a while and you could probably find it used for $300. That CPU will last you years longer than a 3600. It's an investment not a one time cost so you need to think long term not just how much you can save now. The lowest end CPU that won't be the bottleneck most of the time when paired with a fast high end GPU is a 5700X3D but I wouldn't go above a 4080 Super as far as GPU. A balanced system with an inexpensive but fast CPU and fast high end GPU would be a Ryzen 7 7700(X) and 4090. With a Ryzen 5 3600 I wouldn't go above a 3080/6800XT while a 3070 or 6700XT would be a better balance.

  • @derekp3961
    @derekp3961 27 днів тому +1

    This is a Superb video to teach people!

  • @ThorDyrden
    @ThorDyrden 27 днів тому

    Thanks - indeed, that corrected my CPU evaluation somewhat.
    I knew it wasn't irrelevant, as my old i7 8700K (also a 6-core) was indeed already limiting my gaming performance with a 3070 at 1440p, and that was one reason for last year's upgrade.
    But your numbers suggest a rule of thumb: the CPU you choose defines the maximum fps, measured at low resolution/details in a given game, and you then shop for the GPU that delivers the fps you target at your intended settings. In other words, set the CPU bottleneck high enough that the GPU selection is what defines your FPS.
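
    Put as a simplified selection rule (a sketch only; every number below is a hypothetical placeholder, not a benchmark result):

        # The CPU sets the FPS ceiling (measured at low resolution/settings);
        # the GPU then decides how close you get to that ceiling at your target settings.
        target_fps = 120
        cpu_fps_caps = {"budget CPU": 95, "midrange CPU": 140, "high-end CPU": 190}  # low-res caps (hypothetical)
        gpu_fps_at_settings = {"midrange GPU": 100, "high-end GPU": 150}             # FPS at your settings (hypothetical)

        cpus_ok = [name for name, cap in cpu_fps_caps.items() if cap >= target_fps]
        gpus_ok = [name for name, fps in gpu_fps_at_settings.items() if fps >= target_fps]
        print("CPUs whose ceiling clears the target:", cpus_ok)
        print("GPUs that reach the target at these settings:", gpus_ok)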

  • @richardhunter9779
    @richardhunter9779 27 днів тому +21

    Why are you considering average framerate to be more important than 1% lows? If anything, it should be the other way around.

    • @totalermist
      @totalermist 27 днів тому +13

      Because 1% lows really tell you nothing on their own. Depending on the game engine (UE5 _cough_), frame drops can be unrelated to the hardware used, e.g. due to shader compilation, loading, cut scenes, etc.
      It's simply not a very useful statistic in isolation. A single shader-compilation stutter during the benchmark run can drop the 1% lows significantly, depending on the duration of the run. Ideally you'd want to factor in the duration and the total frames rendered, but in that case frametime graphs, as used by Other Steve, are much more informative for illustrating what's really going on. A reasonable compromise is looking at the discrepancy between the 1% lows and the average, not just either on their own.
      Long story short: it's complicated.
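
      For reference, here is roughly how the average and a 1% low are derived from frame times; exact conventions vary between reviewers, and the frame times below are invented to show the effect of a single stutter:

          # Average FPS and a "1% low" FPS computed from per-frame times in milliseconds
          # (here: the slowest 1% of frames averaged, then converted back to FPS).
          frame_times_ms = [8.3, 8.5, 9.0, 8.4, 35.0, 8.6, 8.2, 8.9, 8.3, 8.5]  # invented data; 35 ms = one stutter

          avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

          slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
          low_1pct_fps = 1000 / (sum(slowest) / len(slowest))

          print(f"Average: {avg_fps:.1f} FPS, 1% low: {low_1pct_fps:.1f} FPS")
          # One long frame drags the 1% low down far more than it moves the average.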

    • @concinnus
      @concinnus 27 днів тому +3

      Lows are more disruptive to experience, but very inconsistent and hard to replicate. Ideally you'd want a scripted sequence for consistent results, but that means the built-in benchmarks, which never include the stutters from e.g. loading at level boundaries.

    • @kleinerprinz99
      @kleinerprinz99 27 днів тому +4

      I'd say the average is important, and then observe whether you get 1%-low stutter/jitter. Nothing out of context is useful.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 27 днів тому

      No, the average is more important. I'd rather have a 100 fps experience with 60 fps 1% lows than an 80 fps one with 70 fps 1% lows. I know one is more consistent, but idc.

    • @fracturedlife1393
      @fracturedlife1393 27 днів тому +1

      Limit frametime spikes and large frametime deviations, and don't bounce between high and low FPS plz!!! The average never told the whole story.

  • @LyroLife
    @LyroLife 27 днів тому +6

    but a 5600 or 5800 is still okay right?

  • @Blafard666
    @Blafard666 27 днів тому +2

    Thank you for this work, many of us still struggle with this concept...

  • @ToGrimmToWin
    @ToGrimmToWin 27 днів тому

    I know you didn't use it much, but FSR/DLSS starts to make a bigger difference at 4K as well depending on the CPU, because you're technically not rendering at 4K anymore. I've noticed that this isn't referenced or talked about much on forums: people just see 4K and say "oh, the CPU doesn't matter", but most people playing at 4K use those upscaling settings so that the frame rates are better.

  • @lexsanderz
    @lexsanderz 27 днів тому +3

    You should include GPU usage or a GPU Busy standard deviation next to your FPS numbers. The FPS figure is a fairly useless number if the engine can't use more than 80% of the GPU.

    • @andersjjensen
      @andersjjensen 26 днів тому +2

      It's also useless information that the engine can't use more than 80% GPU if you want that game to run at a certain performance level. With and without that information your options are exactly the same: Lower settings if your CPU has the headroom, buy a bigger GPU if your CPU has the headroom, or upgrade both if you're really screwed. Yelling "it's a bad engine" doesn't fix it.
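
      For what it's worth, the standard deviation being asked for is easy to derive if you log GPU utilization yourself; a small sketch with invented sample data (a per-frame "GPU Busy" column from a capture tool could be summarized the same way):

          # Mean and standard deviation of logged GPU utilization samples,
          # the kind of number the comment above wants shown next to the FPS figures.
          import statistics

          gpu_usage_pct = [78, 82, 80, 76, 95, 60, 79, 81, 77, 83]  # invented %-utilization readings

          mean_usage = statistics.mean(gpu_usage_pct)
          stdev_usage = statistics.stdev(gpu_usage_pct)
          print(f"GPU usage: {mean_usage:.1f}% average, +/-{stdev_usage:.1f}% standard deviation")
          # A modest average with large swings usually means the CPU or the engine is failing to keep the GPU fed.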