Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming?

  • Published 21 Nov 2024

COMMENTS • 4.5K

  • @Hardwareunboxed  11 days ago +812

    I'd also like to thank Gamers Nexus Steve who wrote this for me, but we didn't end up adding it to the video. So please watch their review to support them if you haven't already: ua-cam.com/video/s-lFgbzU3LY/v-deo.html
    From Gamers Nexus:
    We currently run tests at 1080p and 1440p for CPU gaming benchmarks, though we mostly rely on 1080p results for comparison. Although we didn't bother for the 9800X3D review, we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective.
    There are a lot of ways to approach reviews. We view bottleneck testing as a separate content piece or follow-up, as it also starts getting into territory of functionally producing a GPU benchmark.
    What matters is a consistent philosophy: Our primary philosophy is to isolate components as much as possible, then as standalone or separate feature pieces, we run 'combined' tests that mix variables in ways we wouldn't for a standardized review. For us, reviews are standardized, meaning all parts (more or less) follow the same test practices. Introducing more judgment calls introduces more room for inconsistency in human decision making, so we try to avoid these wherever possible to keep comparisons fair. Choosing those practices is based upon ensuring we can show the biggest differences in components with reasonably likely workloads.
    A few things to remember with benchmarks that are borderline GPU-bound:
    - You can no longer fully isolate how much of the performance behavior is due to the CPU, which can obscure or completely hide issues. These issues include: poor frametime pacing, inconsistent frametime delivery, in-game simulation time error due to a low-end CPU dropping animation consistency despite good frame pacing, and overall quality of the experience. This is not only because it becomes more difficult to isolate if issues such as micro stutters are caused by the CPU or GPU, but also because the limitation may completely sidestep major issues with a CPU. One example would be Total War: Warhammer 3, which has a known and specific issue with scheduling on high thread count Intel CPUs in particular. This issue can be hidden or minimized by a heavy GPU bind, and so 4K / Ultra testing would potentially mean we miss a major problem that would directly impact user experience.
    - Drawing upon this: We don't test for the experience in only that game, but we use it as a representative of potentially dozens of games that could have that behavior. In the same example, we want that indicator of performance for these reasons: (1) If a user actually does just play in a CPU bind for that game, they need to know that even high-end parts can perform poorly if CPU-bound; (2) if, in the future, a new GPU launches that shifts the bind back to the CPU, which is likely, we need to be aware of that in the original review so that consumers can plan for their build 2-3 years in the future and not feel burned by a purchase; (3) if the game may represent behavior in other games, it is important to surface a behavior to begin the conversation and search for more or deeper problems. It's not possible to test every single game -- although HUB certainly tries -- and so using fully CPU-bound results as an analog to a wider gaming subset means we know what to investigate, whereas a GPU bind may totally hide that (or may surface GPU issues, which are erroneously attributed to the CPU).
    One thing to also remember with modern 1080p testing is that it also represents some situations for DLSS, FSR, or XeSS usage at "4K" (upscaled).
    A great example of all of this is to look at common parts from 4-5 years ago, then see how they have diverged with time. If we had been GPU-bound, we'd have never known what that divergence might be.
    Finally: One of the major challenges with GPU-bound benchmarks in a CPU review is that the more variable ceiling caused by intermittent GPU 'overload' means CPU results will rarely stack up in the hierarchy most people expect. This requires additional explanation to ensure responsible use of the data, as it wouldn't be odd to have a "better" CPU (by hierarchy) below a "worse" CPU if both are externally bound.
    We still think that high resolution testing is useful for separate deep dives or in GPU bottleneck or GPU review content.
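
    To make that extrapolation concrete, here is a minimal sketch (Python, with hypothetical FPS caps) of the fps = min(CPU cap, GPU cap) model the reasoning above relies on; it shows how a GPU bind can make very different CPUs print identical numbers:

    ```python
    # Hypothetical per-component caps: the FPS each part could sustain alone.
    CPU_CAPS = {"9800X3D": 219, "7700X": 158, "5800X": 115}  # from CPU-bound (1080p) data
    GPU_CAPS = {"1080p/low": 300, "4K/native": 95}           # from GPU-bound data

    def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
        """Delivered FPS is limited by whichever component saturates first."""
        return min(cpu_cap, gpu_cap)

    for scenario, gpu_cap in GPU_CAPS.items():
        for cpu, cpu_cap in CPU_CAPS.items():
            print(f"{scenario:10} + {cpu:8}: {effective_fps(cpu_cap, gpu_cap):.0f} fps")

    # In the 4K/native rows every CPU prints 95 fps: the GPU bind hides the
    # differences that the CPU-bound numbers (219 vs 158 vs 115) reveal.
    ```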

    • @krazyfrog  11 days ago +30

      I don't know why this even needs to be explained in 2024

    • @timcooper1841  11 days ago +15

      Think most people "should" understand that 1080p benchmarking allows you to compare CPUs. But people also need a way to understand the best way to spend their money, which is why a quick average-PC benchmark at the end (an RTX 4060 at ~1440p, say) would be useful. Because the reality is that most people, especially if you also need a new mobo/RAM etc., are almost always better off spending the cash on a GPU upgrade vs a CPU.

    • @jouniosmala9921  11 days ago +8

      But I complained about not using 720p. :D

    • @Y0Uanonymous  11 days ago +3

      To the contrary, you should do 1080p testing with lower graphical settings and some upscaling. That's where most people end up before buying a new PC.

    • @_Mystic1978_  11 days ago +6

      I’ve literally outperformed all the results of your 9800X3D in 8 games so far at the same settings and 1080p, using my stock 14900KF and finely tuned DDR4. All the benchmarks are available on my channel

  • @FatetalityXI  11 days ago +1457

    Surely the 5090 is going to show how much more headroom the 9800X3D has over other CPUs, even at 4K.

    • @Hardwareunboxed  11 days ago +546

      Yep it will and I'd imagine you'd keep the 9800X3D for another generation as well.

    • @Shini1171  10 days ago +39

      @@Hardwareunboxed Isn't it the same case with the 7800X3D? Is it worth upgrading from it?

    • @darreno1450  10 days ago +52

      I was lucky to pick up a 9800X3D, but it's just a steppingstone to the 9950X3D.

    • @aapee565  10 days ago +134

      @@Shini1171 I would say there are better uses for money than a 10% increase in CPU performance. Such as wiping your ass with dollar bills etc :D

    • @thewhiteknight9923  10 days ago +21

      @@Shini1171 No.
      The 9000 series might be the last new generation on AM5. 10000 might happen, but the chances seem small imo. The 7800X3D is plenty good for the next few years.

  • @SAFFY7411  11 days ago +486

    Thanks Steve.

  • @HoodHussler  11 days ago +1619

    Where r my 360p and 720p benchmarks at?

    • @YothaIndi  11 days ago +144

      140p or bust

    • @HoodHussler  11 days ago +63

      @YothaIndi absolutely! Some of us still use CRTs, so where are our invitations to the party?

    • @themarketgardener  10 days ago +77

      I’d honestly take a 720p benchmark for CPU testing ngl 😂

    • @kingplunger1  10 days ago +22

      @@YothaIndi not 144p?

    • @YothaIndi  10 days ago +30

      @@kingplunger1
      Nah, that extra 4p will reduce the score dramatically 🤣

  • @kr00tman  9 days ago +13

    To be fair, I totally understand where you are coming from, but as an avid 4K gamer (at least when I'm gaming on my personal rig), understanding what kind of performance uplift I'd get from upgrading from my 7800X3D to the 9800X3D at 4K, even if it's only a few fps, is helpful.

    • @cl4ster17  9 days ago +3

      Doesn't seem like you understood.
      All you need to know is how much faster one CPU is over another. Then extrapolate that knowledge to your system by simply checking if you're GPU-limited or not.
      Resolution is irrelevant to the CPU. Either it's fast enough for your use case or it isn't.
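
      As a worked example with made-up numbers: if the 1080p data says CPU A caps out at 160 fps and CPU B at 120 fps, while your GPU delivers 90 fps at your resolution and settings, then fps = min(CPU cap, GPU cap) gives 90 fps with either CPU; the upgrade only pays off once you lower the GPU load enough to exceed 120 fps.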

    • @kr00tman  9 days ago +4

      @cl4ster17 That's a good way of looking at it, but I don't think a lot of people fully understand that. My testing basically confirms it, but people still like to see whether there's a worthwhile improvement or not.

    • @mehedianik  9 days ago +2

      @@cl4ster17 Your example works when you are comparing a CPU with your own CPU. That way, you have your own data to compare with to decide if it's a worthy upgrade or not. But when you are deciding between multiple CPUs, the resolution also matters. For example, if I want to decide between the 9700X and 9800X3D, a lower resolution will give me an idea of the actual difference in performance between the two CPUs. At higher resolutions, say 1440p, one might hit a GPU bottleneck while the other doesn't. The performance gap will be closer than the lower-resolution result suggests. But how much closer? That's what people want to know.
      Also, when comparing high-end CPUs, high-end GPUs make sense. But when you are comparing mid-range CPUs, people mostly pair them with mid-range GPUs like the 4070 or 7800 XT. Their GPUs become the bottleneck much earlier. If they lack understanding and rely only on the review data, they might think upgrading CPUs will give them a similar performance uplift. They upgrade their CPUs and get the exact same performance they had earlier.
      That's why real-world data is also necessary, in my opinion, to assess different scenarios. This should be a part of the reviews. I understand the amount of work it takes, and I greatly appreciate reviewers' efforts to present us the data. It won't be possible for them to put in the day-one reviews, but eventually, they should include this data as well.
      Not because it's the right method of testing performance, but rather to give non-techy people a better understanding so they don't waste their hard-earned money for no to barely any performance gain.

    • @cl4ster17  9 days ago

      @@mehedianik People don't always run the highest settings, and at that point the supposed real-world data goes out the window: GPU-limited "real-world" tests don't show how much headroom (if any) the CPU has, headroom that could be turned into more FPS by reducing settings.

    • @joaovictorbotelho  1 day ago

      Exactly.
      The biggest part of the audience wants to have an idea of how the hardware would perform in their own situation.
      Beyond that, most users change their systems within a 3-year period (not me, I've got a 12-year-old one. Any change would be a benefit). So it doesn't really make that much sense to prove some performance in a given scenario which isn't even a practical one.
      Certainly the audience appreciates the effort of making this kind of video, but I guess there's no need to bash the voices calling for 1440p, since it's the actual standard for anyone who isn't a pro or a fast-paced-shooter player.

  • @trackgg586  10 days ago +418

    1. This shows what kind of beast the 5800X3D was, and still is.
    2. It proves your point, obviously.
    3. It may be anecdotal, but I moved from a 3600X to a 5800X3D on a 3080, and while my FPS in CP77 was not significantly affected at 1440p UW, the 1% lows jumped by roughly 50%, greatly increasing stability and getting rid of any stutters. That's also a thing to consider, outside of raw FPS performance.

    • @christophermullins7163  10 days ago +34

      This is especially relevant for the 40 series and ray tracing. It increases CPU load and creates MUCH worse percentile lows. Faster CPUs almost always help; some games will see no difference, but that is rare.

    • @renereiche  10 days ago +16

      I don't think your third point is anecdotal; spikes are just far more pronounced when CPU-limited, it's widely known. And beyond that, I think Steve disproved his own point in this video with many of the included 4K DLSS gaming tests, where there was a significant difference. Imagine upgrading from a 3080 to a 4090 for $1600 to get a 50% higher framerate, while not upgrading the CPU for $480 to get an additional 30% performance in quite a few titles at 4K DLSS (~1440p native); that doesn't make sense financially, and with only 1080p tests and being told there is no uplift at 4K (native), people wouldn't know they are leaving so much on the table by not upgrading their 4-year-old non-X3D CPUs.

    • @scottbutler5  10 days ago +11

      Jeff at Craft Computing tested higher resolutions for his 9800x3d review, and found the same thing - average FPS was pretty close to other CPUs at higher resolutions but the 1% lows were significantly higher.

    • @RobBCactive  10 days ago +5

      @@trackgg586 Once again this video showed the value of the 5800X3D, bought near its floor price, in extending the useful life of the extremely solid AM4 Zen 3 platform.

    • @lczp  10 days ago

      Hello sir, I'm thinking about a similar upgrade, from the R5 3600 to the 5700X3D. Outside of CP77, would you say the upgrade was worth it? Did you gain more avg fps in total, disregarding the more stable 1% lows?

  • @SgtRock4445  11 days ago +796

    I watched this video in 480p so that I get the most Steves per second.

    • @YothaIndi  11 days ago +40

      If you watch HU & GN at 144p simultaneously you obtain the highest SPS possible

    • @nielsenrainier7710  10 days ago +20

      watch in 72p and userbenchmark will just pop out of your screen XD

    • @kosmosyche  10 days ago +14

      I find Steve to be very CPU-bottlenecked.

    • @nipa5961  10 days ago +2

      @@YothaIndi That's cheating!

    • @BladeCrew  10 days ago +3

      I am watching Steve in 1440p 60fps on a 1080p 144Hz display.

  • @orangejuche  11 days ago +757

    I'm glad that Steve is getting some rest for his poor feet.

    • @michahojwa8132  11 days ago

      You know where this is going - do tests with OC/UV, medium-high details, FSR/XeSS/DLSS Performance and path tracing - because that is realistic 4K, and the 4090 is a 1080p card.

    • @DragonOfTheMortalKombat  11 days ago +5

      Thank 9800x3D

    • @lightofknowledge713  10 days ago +8

      It's really sad that he took an Arrow (Lake) to the knee 😢

    • @jacobgames3412  10 days ago +2

      @@lightofknowledge713 Very sad 😢

    • @Beardyabc88  10 days ago

      Surprised he wasn't standing up haha

  • @KimBoKastekniv47  9 days ago +46

    I fully agree that CPU testing should be done at 1080p, but I can't help but wonder why 4K balanced was chosen to prove the point. The input resolution is closer to 1080p than it is to 4K, why not quality mode?

    • @Skeames1214  9 days ago +4

      Quality mode would still be closer to 1080p than 4k, and it was the middle option between the two more popular options (Performance and Quality) in the poll. People should be able to work out that scaling will increase if you choose Performance, and decrease if you choose Quality.
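
      For reference, the standard DLSS scale factors make the input resolutions easy to work out: Quality ≈ 0.667x, Balanced ≈ 0.58x and Performance = 0.5x per axis. At 4K (3840 x 2160) that gives roughly 2560 x 1440, 2227 x 1253 and 1920 x 1080 respectively, so 4K Performance puts the GPU under essentially a 1080p load, which is why native 1080p results map onto it.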

    • @esportsfan2118  8 days ago +25

      Because they don't actually want to show real 4K or 1440p results, and only want to stick to 1080p. They pretty much made this video out of spite for the comments and are trying to justify not showing 1440p/4K results in their CPU reviews.

    • @toddsimone7182  8 days ago +9

      Should have done 4k native

    • @enmanuel1950  8 days ago +3

      @@esportsfan2118 There's a very clear example in this video showing that testing 3 CPUs (5800X3D, 5800X and 3950X) at 4K (yes, even 4K Balanced) yields the same result for all three of them, even though we know the 5800X3D is significantly faster than the other two and will achieve considerably more performance, not only at 1080p but also at 4K, if you pair it with a GPU capable of keeping up with it.

    • @esportsfan2118  8 days ago +5

      @@enmanuel1950 Yes, and it would be just like that with the 5800X3D, 7800X3D and 9800X3D if they showed NATIVE 4K results. But they don't want to show people that it would be no use upgrading if you play at 4K native...

  • @RadialSeeker113  11 days ago +221

    @Hardware Unboxed Sim games and city builders seem to benefit the most. Missing games like Anno 1800 and MSFS, which are vital to determine just how far a CPU can push, is a shame. On avg a 9800X3D is about 50-60% faster than a 5800X3D without GPU restrictions, which is completely insane.

    • @rasteek  10 days ago +6

      THIS!

    • @themarketgardener  10 days ago +15

      Tarkov benefits from this too because of poor optimization🫠

    • @Hardwareunboxed  10 days ago +85

      They are just other games that can be used to measure this. We saw plenty of examples in our 9800X3D review where the 9800X3D was 50-60% faster than the 5800X3D.

    • @ValenceFlux  10 days ago +5

      I get about 60-90 fps in Starfield at 4K on a 5900X + 4070 Ti.
      I get a little less in Cities: Skylines 2 and a lot less in the first game.

    • @kognak6640  10 days ago +20

      I would be very interested to see Cities: Skylines 2 tested properly. The CPU scaling is massive with it; 64 cores is the technical ceiling, the game can't use more. This is absolutely unique in the gaming world, nothing comes even close. However, framerate is not directly tied to CPU performance; it's the simulation side slowing down if the CPU can't handle it (when the city grows larger). First you lose the faster speed settings, then everything just moves slower and slower. I downloaded a 500k-population city and tested it on my 5800X3D. Framerates stayed the same as always, but the simulation ran at only 1/8th speed. It's like watching a video at 1/8th speed, simply not playable. Because a 100k city runs very well, I'd say 150k is the max city size for a 5800X3D. Basically you could find out how big a city a particular CPU can handle, at least on the slowest speed setting (1x). No one has done it.
      Btw, if any Cities: Skylines 2 player is reading this and thinking about what CPU to get, just buy one with the most cores you can afford. But because no one has made such tests, it's really difficult to say if the AMD or Intel approach is better. The 9950X is probably the safest bet for the best CPU in the consumer space.

  • @RafitoOoO  11 days ago +439

    He's not standing so this might be good.

    • @Hardwareunboxed  11 days ago +159

      It's neither good nor bad :D

    • @xRaptorScreamx  11 days ago +95

      @@Hardwareunboxed Then you should've leaned on the desk :D

    • @1Grainer1  10 days ago +69

      @@xRaptorScreamx True Neutral Steve, floating in the middle of the screen

    • @GalloPhilips  10 days ago +5

      @@Hardwareunboxed 😂

    • @ReSpAwNkILeR  10 days ago +6

      @@Hardwareunboxed It's the in-between version

  • @dloc2907  10 days ago +5

    Great info and I completely agree. However, I think we are all wondering if it's worth an upgrade. So maybe show tests at 1440p or 4K against a bunch of older and newer CPUs.

  • @mihaighita8553  11 days ago +49

    I think one of the best use cases for the 9800X3D is MS Flight Sim, with all the streaming/decompressing going on. Maybe you could add a comparison for that one too.

    • @andreiavasi7600  10 days ago +1

      My 9800X3D arrived today for that exact game (MSFS) :). It's keeping my 4070 Super at a 40% bottleneck on airliners, with ground and buildings turned way down since those kill the CPU. Plus I can't use frame gen or DLSS because then I get stutters.
      So I can't wait to plug this baby in. Plus MSFS 2024 will optimize for multicore even more, which is great for the next GPU upgrade, so I can avoid splurging on a CPU again.

    • @Hussar-fm8iy  10 days ago +5

      The 9800X3D destroys the 285K in MS Flight Sim, even at 4K.

  • @keithduthie  11 days ago +427

    Thanks Steve. I have two hobbies - bitching _and_ moaning.

    • @worthywizard  11 days ago +89

      You should try whining, it’s often overlooked

    • @__-fi6xg  11 days ago +32

      I feel like crying like a little girl is the hardest one to master

    • @thepadonthepondbythescum  10 days ago +19

      Don't forget being a victim of Corporate Marketing!

    • @marktackman2886  10 days ago +3

      Steve is waiting for these people like a bald eagle perched on a branch, waiting to strike. I would know... I've happily been a victim.

    • @n8spL8  10 days ago +2

      I go for ranting and raving personally

  • @aiziril  10 days ago +25

    Saying that it's not about raw performance, but more about having headroom when and where you need it (which depends on the game played) is a really smart way to explain this.

    • @imo098765  10 days ago +3

      When you have enough CPU it's amazing.
      The moment you hit that CPU wall it's lower fps and horrible 1% lows. It's a stuttery mess.

    • @FurBurger151  10 days ago

      @@aiziril Car salesman talk if you ask me.

  • @klevzor  10 days ago +1

    Been waiting for a 4k test of this processor! Thanks. Considering moving on from my 5900x now

    • @CookieManCookies  10 days ago

      These 4k graphs are a work of art, good job steve! I'll be back for your 5090 reviews, hopefully in 4K with 4K benchmarks!

  • @giantnanomachine  10 days ago +5

    Thank you, this is a video I have been really hoping for from one of my trusted review channels. Seeing how CPUs hold up (or don't) over HW generations is extremely helpful for me, since when I build my systems I usually upgrade the GPU once or twice while sticking with the same mainboard and CPU. Seeing what one given CPU or another can achieve with a 4090 (which I wouldn't buy due to price) at 4k with high quality settings today is a valuable indicator for me since in a couple of years that will be what it can achieve with a 7060Ti or 7070.

  • @atariplayer3686  10 days ago +7

    Thank you Steve for the awesome benchmark & all the hard work you have put into this video 😊👌

  • @moritzs3207  10 days ago +6

    Great video, thanks a lot for the refresher. Maybe in future reviews stress the fact that 1080p performance to an extent shows the CPU limit in terms of FPS, which can be extrapolated just fine to higher resolutions IF you have the graphics card allowing this. I think a lot of people are missing that.

    • @leemarks1153  9 days ago

      We should upvote this to the top because so many people still don't seem to get it

  • @SchmakerSchmoo  10 days ago +27

    One minor anecdote from the "vocal minority" that I think may have been missed is how low resolution benchmarks are used to justify "over buying" the CPU on a mid range build. Someone will be doing a build for ~$800 and you'll see tons of comments about "7800X3D is 50% faster - you must get it!" but these comments ignore the budget of the build and the fact that opting for such a CPU means significantly downgrading the GPU.
    A 7800X3D + 4060 is going to be a lot worse than a 7600X + 4070S in most titles.
    It is misinterpreting the graphs on both sides but only one side seemed to have gotten called out here.

    • @memitim171  10 days ago +3

      If someone is using 1080P benchmarks to justify over buying a CPU they are doing it wrong and that isn't Steve's fault. All they can do is provide the data, all data can be misused or misrepresented and I think most of us would rather they just continued giving us the data than attempting to tackle blatant stupidity, which is always a race to the bottom.

    • @Coolmouse777  9 days ago +2

      Just look at GPU benchmarks and compare 2 numbers, it's not hard to do.

    • @Skeames1214  9 days ago +2

      @@Coolmouse777 The number of people who are behaving like children is a little disturbing. “Why aren’t you a repository for every benchmark I could ever want???? Where is my favorite graph????”

    • @Stars-Mine  9 days ago

      GPUs you just swap out in 3 years.

  • @donnys9259  10 days ago +24

    Hats off to you Steve for pushing out this video pretty fast after user comments from your previous video. It was good to get a refresher. Very useful. Thanks. 🙏

  • @zodwraith5745  10 days ago +46

    This discussion will never end because both sides have a point.
    2:40. This is a good point that I think gets missed a lot. When you focus on competitive gaming that's tuned to be lighter on GPUs then CPU performance at 1080p is going to be FAR more important. But if you _don't_ play just competitive games what people are asking for is knowing _where_ those diminishing CPU returns are BEFORE we buy new hardware. Of course I can test my own system to see where a bottleneck occurs but how am I supposed to magically know that if I _DON'T_ own the hardware yet?
    Anyone that's asking for 1440 and 4k and thinks 1080 is useless is a f'ing idiot so don't lump all of us in with them. What the reasonable ones of us that want more than just 1080p are asking for is _exactly_ what GN does in your pinned message, just throw in ONE "don't overspend on your CPU" bench to let people see _where_ a lot of games slam into a really low GPU bottleneck. Even better if you throw in a 4070 or 6750xt because if you hit that bottleneck with a _4090_ at 1440p? That's a minefield for anyone with a midrange GPU, and you _still_ only used 4090 testing which completely ruins your claim this is "real world" testing when the vast majority of people DON'T own a 4090. The ones that do already have a 9800X3D on order so they aren't watching this anyways.
    We aren't stupid. We KNOW 1080p is the best option for CPU testing and expect it to be prevalent in CPU reviews. The issue is it's ONLY good for CPU vs CPU testing and some of us WANT to know where those GPU bottlenecks kick in. I think Daniel Owen touched on this best. Reviewers are laser focused on the testing of CPUs only, but some of the viewers are looking for real world buying advice when they're watching reviews.
    We're not challenging the scientific method, we're asking for a convenience instead of having to collect the data of GPUs ranked with only $500 CPUs at max settings, CPUs ranked with only $2000 GPUs at minimal settings, then trying to guess where a mid range GPU and mid range CPU will perform with mid range settings. There's almost NO reviewers out there that give you this content, except of course Daniel Owen that will sit for an hour and bounce through all kinds of settings and different CPUs and GPUs. But that's only helpful if he features the game you're interested in.

    • @mikenelson263  10 days ago +9

      I really hope these channels understand this. The testing is important at base, but it does not do a very good job translating to buying decisions. With the amount of product review cycles and the number of channels involved, it would have been helpful to have built a predictive model by now. Rather than dump all these data points on the review screen, give the review presentation ample time to query the model for best advice for real world decision-making scenarios.

    • @timcox-rivers9351  10 days ago +6

      I was nodding along with what you were trying to say until you got to the part about Daniel Owen, who does try multiple settings, still not being enough because it's not the specific game you want him to test. It sounds like no review is going to give everyone what they want, b/c what people really want the reviewer to do is pick their favorite game, their current CPU and their current GPU, and then do a direct comparison showing the exact same game with the exact hardware they have on PCPartPicker. That's ridiculous.

    • @Mathster_live  10 days ago +5

      Both sides have a point? The side asking for irrelevant and redundant testing at higher resolutions where there's a GPU bottleneck?
      Even a 5600X is enough for most users; at higher resolutions, anything above that and you're GPU-bottlenecked. I don't get why it's so difficult to understand: a better CPU will stay relevant and scale well across multiple GPU generations. The 1080p results tell you that if there were no GPU bottleneck, you would get that FPS at ALL resolutions, so they clearly tell you how long your CPU will stay relevant throughout the years if you remember what the FPS cap was without a GPU bottleneck.
      It's surprising how many people like you will find a middle ground between reviewers and people who are clearly misinformed. They don't have a point; they're just wrong and are asking for more tests that waste time.

    • @zodwraith5745  10 days ago +1

      @@timcox-rivers9351 Well obviously that's not what I'm saying, but even if it's _not_ your specific game what other reviewer do you know that goes through as many settings, resolutions, and hardware configurations as him? Even half?
      I literally can't name anyone else that will show you a benchmark with a midrange GPU, midrange CPU, 1440p at 5-6 different settings and upscaling. And that's still a LOT more info of where bottlenecks can pop up than anywhere else.

    • @zodwraith5745  10 days ago +3

      @@Mathster_live Because it shows you _where_ the GPU bottlenecks occur. If you can only see CPUs tested with a $2000 GPU at low settings, and GPUs with a $500 CPU at max settings, WTF does that tell you about a mid range CPU and midrange GPU at midrange settings? You're left guessing between 2 completely different numbers.
      If we get just *_ONE_* "don't overspend" benchmark at 1440p with a GPU that ISN'T the price of a used car, then if you see a bottleneck you know, "ok, you need a stronger GPU for this CPU." If you don't see a bottleneck you know, "OK, this CPU I'm looking at right now is totally worth it!" SO much shit is hidden in the middle.
      Besides this video has heavily skewed results to begin with because he only showed heavily upscaled 4k "balanced" meaning really only 1254p, and he only used a 4090. He did everything to _STILL_ make the GPU not the bottleneck. How is that "real world?" Does everyone you know own a 4090? Cause I only know a couple.
      Yeah this video _was_ a waste of time but only because he purposely didn't show what people were asking for. Not to mention if you looked at his pinned comment Steve from GN *_DOES_* mix in a few 1440p slides to let you know where bottlenecks can kick in fast, and he _doesn't_ upscale the shit out of it either. So if GN does it, why doesn't benchmark Steve do it?

  • @Zemla  11 days ago +49

    I haven't seen the video yet; I have an RTX 4080 and play at 4K.
    I switched to the 9800X3D from a 5700X and it EXTREMELY helped me.
    Almost no stutter where before it was unplayable, higher FPS, minimized fps drops etc.
    Worth it, money well invested!!!!!
    I should have done the upgrade much, much sooner, to a 7800X3D.

    • @JohnxYiu  10 days ago +4

      Good to hear that! I'm using an 11700K and having some stuttering issues playing RDR2 and The Witcher 3, so now I'm planning to upgrade to a 9800X3D in the hope of having this issue fixed.

    • @anthcesana  10 days ago +2

      Exactly why the 1080p-only testing is almost useless for the actual resolutions people play at. I found the same with my 5800X3D upgrade.

    • @AndrewB23  10 days ago +7

      Uhh doubtful 😂

    • @lycanthoss  10 days ago

      Maybe you changed something else?
      I'm running a 12600K + 4080 myself. I recently reinstalled Windows 11 when moving to a new NVMe so I had to take out the GPU. Somehow, and I genuinely don't know how, XMP got disabled so I was getting massive stuttering because I was running at the baseline 2133 MT/s of my DDR4 kit. I didn't understand why I was stuttering so hard until I noticed in task manager that the RAM was running at 2133 MT/s.
      I was going to blame Windows 11 24H2 or the new Nvidia drivers with the security fixes, because I don't think removing the GPU or replacing the first M.2 slot should disable XMP?

    • @Nissehult23  10 days ago +2

      Same. My main game, World of Warcraft, is doing insanely well; it more than doubled my avg FPS.
      Not only that, but in Space Marine 2 (with the high-res DLC) fps went from ~60 to 100-120.

  • @troik  8 days ago +2

    Thank you for this video. Here is why I'd like to see more 4K testing and what I would actually need from it:
    I totally understand why low-resolution testing is done: to isolate the CPU performance and keep the GPU performance mostly out of it, to make CPUs comparable. My counter-argument would be that if it's not actually about the fps but about the difference between the CPUs, then a Cinebench score or looking up the GFLOPS of the CPU would get me 90% there (I admit that X3D makes this more complicated).
    My thinking is this: I'm gaming at 4K/120fps (at least as a target), so I'm GPU-bound most of the time. With my aging 9900K I sometimes reach 50-60% CPU utilization in some games. That suggests I might still have headroom, but 100% would mean perfect utilization of all 16 threads, and no game scales that perfectly across all threads, so it might be that I'm CPU-bound on a frame here and there without realizing it.
    Switching to a new CPU won't double my performance; I might gain maybe 10-20% depending on the hardware choices I make (board, RAM etc).
    So most likely I'd just see my CPU usage go down to 20-30% in-game.
    Now I come to my question: is it possible, due to how modern CPUs work, that at 30% utilization a CPU behaves differently than at 100%?
    Would it be possible, if not everything scales perfectly, that at 30% one CPU might be more efficient than another CPU, and that it might be different compared to at 100%?
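
    On the utilization question: aggregate CPU% is a poor bottleneck indicator to begin with, because it averages across all hardware threads. Here is a minimal sketch (Python, with hypothetical per-thread loads) of how a game pinned on its main thread can be fully CPU-bound while the reported total stays low:

    ```python
    # Hypothetical per-thread loads on an 8C/16T CPU (a 9900K, say), in percent.
    # The main/render thread is saturated; worker threads are lightly loaded.
    thread_loads = [100, 70, 40, 30, 20, 20, 10, 10] + [5] * 8

    aggregate = sum(thread_loads) / len(thread_loads)
    print(f"reported utilization: {aggregate:.0f}%")  # ~21%
    print(f"busiest thread: {max(thread_loads)}%")    # 100% -> CPU-bound regardless

    # Frame rate is gated by the saturated thread, not by the average, so
    # "only 30% CPU usage" can still mean a hard CPU limit on every frame.
    ```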

  • @kentaronagame7529  10 days ago +25

    Ever since the 9800X3D dropped Steve has been smiling a lot in his thumbnails 😂

    • @codymonster7481  10 days ago

      These channels are selling out one by one. It's like they are all falling in line, echoing those pre-launch benchmarks from Nvidia/AMD to make things look better than they are. If you don't show 1440p/4K tests, then the data is totally useless and any claim of "best CPU evah" is totally fraudulent.

    • @nielsenrainier7710  10 days ago +4

      @@codymonster7481 So you're telling me that a 7600 is as fast as a 14900K because they're pushing the same FPS at 4K?

    • @PC_Ringo  10 days ago +6

      ​@@codymonster7481 tell me you didn't watch the video without telling me that you didn't watch the video. Kek.

    • @jompkins  9 days ago

      @@PC_Ringo hahahaha yes

  • @Dayanto  10 days ago +14

    13:45 You accidentally showed the 1440p poll instead of the 4k one.

  • @evanractivand  10 days ago +73

    Got the 9800X3D and no regrets so far. I play on 4K usually with DLSS quality or balanced to boost frames on my 4080, and it has significantly improved things. MMO's and games like Baldur's Gate 3 now don't have any CPU bottlenecks, it's made a pretty large difference to my average framerates and 1% lows in a number of CPU heavy games I play. I came from a 12700K that I thought I wouldn't get much benefit upgrading from, but boy was I wrong.
    At the end of the day you need to look at the games you play and figure out how much money you want to throw at your PC to get a level of performance you're happy with.

    • @worldkat1393  10 days ago +1

      Been wondering if I should go from a 7700X to it, for MMOs in particular.

    • @evanractivand  10 days ago +7

      @@worldkat1393 Yeah I tested it in New World the other day, previously on my 12700K it would drop to 50-60 FPS in towns with high population with a fair bit of stutter. On the 9800X3D the stutter is all but gone and it averages 90-100 FPS. So for CPU bound MMO's, it's a big diff.

    • @5ean5ean22  10 days ago +9

      @worldkat1393 Once you go 3D, you don't go back.

    • @1984Captive  10 days ago +2

      I went from a 7900X to a 9800X3D. I used Process Lasso to prioritize games on one CCD and other programs on the other. I would give the gaming CCD priority, high responsiveness, etc. to try to reduce fps dips or lag spikes. It helped for sure, but they still happened. With this new chip it's barely there, which helps with the immersion of any gaming experience.

    • @laxnatullou  10 days ago

      Hi! What board do you use?

  • @TT-ix5yr  9 days ago +1

    Thank you, it's nice to see benches at 4K, just to see the same settings I want to use, which feels much more real.

  • @wolfgangvogel5407  10 days ago +34

    This is a weird test tbh. Why is upscaling on Balanced mode? It should be on Quality or off. The claim was that in a GPU limit, CPU differences are far less important. You test that, which is cool, but then you turn upscaling to Balanced, which reduces the GPU limit again. Am I missing the point of this test?

    • @Hardwareunboxed  10 days ago +7

      Yeah missing the point for sure. Maybe just skip to and watch this entire section of the video.
      29:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
      29:14 - Watch Dogs Legion [2022]
      30:14 - Shadow of the Tomb Raider [2022]
      30:44 - Horizon Zero Dawn [2022]
      31:04 - 3 Game Average [2022 Titles]
      32:04 - Starfield [2023]
      32:43 - Warhammer 40,000 Space Marine 2 [2024]
      33:04 - Star Wars Jedi Survivor [2024]
      33:34 - 3 Game Average [2024 Titles]

    • @wolfgangvogel5407  10 days ago +14

      @@Hardwareunboxed Still don't see it. I agree with the initial testing at lower res to run into a CPU limit; it really is the best way to do it. Some media outlets even still test at 720p, which would drive some people mad for sure. But I really don't see the point of Balanced upscaling at 4K with an RTX 4090. It's like testing somewhere between 1080p and 1440p; it would give you the same results.

    • @Hardwareunboxed  10 days ago +8

      That is without question the most obvious way I can illustrate why GPU limited CPU testing is stupid and misleading. So if that didn't work I will have to tap out. You're welcome to get your GPU limited CPU testing elsewhere though.

    • @wolfgangvogel5407  10 days ago +19

      @@Hardwareunboxed I am aware of GPU and CPU limits, thanks, and I am not questioning any of the statements either. I'm simply saying that 4K Balanced upscaling performance is a weird choice; in the first part of the video with the 4090, the whole test is pointless and misleading. You are not testing at 4K; it's not even 1440p. The second part of the video proves your point much better. I think there are simply better ways to show why you test at low res than what you did in half of the video.

    • @NGHutchin  9 days ago +2

      @@wolfgangvogel5407 I think he had already been hit by the more aggressive comments by the time he got to yours. It is definitely true that 4K testing without upscaling represents a greater GPU bind than with it. I also think "the better CPU ages better" isn't the question that the audience this video is trying to address is actually asking. I do still appreciate the point, though.

  • @BigPeter93  10 days ago +5

    1080p IS and SHOULD be the standard for CPU testing. HOWEVER, I really enjoy the 4K benchmarks as well. You said it yourself: none of the mainline tech reviewers are showing 4K benchmarks. This forces the part of your audience who want 4K CPU benchmarks to go to less-than-reliable sources. Wouldn't it benefit everyone involved to offer it on a slide during your reviews?

    • @thorbear  10 days ago +2

      You have been given misleading information. It is not true that none of the mainline tech reviewers are showing 4K benchmarks, and that's not what he said, although it is clearly what he wanted to say.
      LTT tested at 4K, and 1440p, and 1080p Ultra, probably the most comprehensive review available.

    • @CallMeRabbitzUSVI  10 days ago +1

      @@thorbear Yep, LTT is the only review of the 9800X3D that made sense.

    • @7xPlayer  9 days ago

      "4K CPU benchmark review": this whole video is about how such a thing is either nonexistent (because of the GPU bottleneck in a GPU-bottlenecked scenario) or useless (the game is not GPU-bottlenecked, so now you're on a wild goose chase: is it the CPU? If so, the 1080p test would show that anyway. Or is it the memory? Or the game itself? Etc.). The variables are mixed together, and now you have mixed the GPU variable in, which is a big variable that hides the other variables, so no meaningful conclusion can be drawn because the details are unclear.

  • @frankguy6843  10 days ago +35

    As a 5800X owner I appreciate the comparison at the end a lot, I got the CPU right before the 5800X3D arrived and couldn't justify the purchase, been waiting for my time to switch off AM4 and the 9800X3D is the answer. Should pair fine with my 3080 until I upgrade and then see even more performance. AMD really doubling down on giving gamers what we want and I appreciate it.

    • @rustyshackleford4117  10 days ago +2

      Should be a big upgrade, I went from the 5800x to 7800x3d last year, and got decent FPS bumps even in many 4k games on ultra settings. Several CPU-heavy titles had massive uplifts, as did emulation of things like Nintendo Switch, up to 30-40% performance increase in a few cases like Tears of the Kingdom in cities with lots of NPCS.

    • @Spectre1157  10 days ago +2

      Hey same here! If I was humming and hawing about it before, after reading this comment I am now decided. Thanks!

    • @cschmall94  10 days ago +1

      I just built a new system, coming from a 5800X, same boat as you; I bought the 5800X before the X3D variant launched. And while the 4080 Super does a lot of the heavy lifting over my 3070 before, the performance boost is incredible. I haven't done many tests, but in the one I did, Arma 3 KotH at 3440x1440, my 5800X/3070 would struggle to hold even a steady 50fps, while the 9800X3D/4080 rarely dipped below 130fps. Granted, it wasn't a full server, which kills fps most times, but still, an insane boost.

    • @keirbourne5323  10 days ago

      Same here, gonna get a 9800x3d to replace my 5600x.

    • @justinthematrix  20 hours ago

      Same man, been waiting to upgrade from my 5800X and 3080 as well.

  • @TheMasterEast  10 days ago +1

    Hey Steve, thank you for doing this. I know why CPUs are tested the way they are, but I highly value your work here on 4K testing. Keep up the good work.

  • @Phil-q7h  10 days ago +15

    Good video, thank you. I am one of those who FULLY understands why testing of CPUs is done at low resolution; however, I still want the 1440p/4K data. It's useful to me as a single-player gamer. It lets me know where we are at, where the market is at. You contradict yourself here, Steve: you pointed out in more than one chart that the CPU CAN make a difference and IS relevant at the higher resolutions, especially when new GPUs are released.

    • @Hardwareunboxed  10 days ago +12

      I did not contradict myself at all. You're somehow missing the point. The 1080p data tells you everything you need to know. You then take that data and apply it to whatever a given GPU can achieve at a given resolution/quality settings. The 1440p/4K data is misleading, as I clearly showed here:
      29:05 - CPU Longevity Test [3950X, 5800X, 5800X3D]
      29:14 - Watch Dogs Legion [2022]
      30:14 - Shadow of the Tomb Raider [2022]
      30:44 - Horizon Zero Dawn [2022]
      31:04 - 3 Game Average [2022 Titles]
      32:04 - Starfield [2023]
      32:43 - Warhammer 40,000 Space Marine 2 [2024]
      33:04 - Star Wars Jedi Survivor [2024]
      33:34 - 3 Game Average [2024 Titles]

    • @Phil-q7h  10 days ago +2

      I know what you mean, I honestly do, but I'm someone who never upgrades and swaps out the PC in full every 2 years. I want to know what difference it's going to make to me today, in today's games. I appreciate the caveats that can bring; as you say, you don't know what I am playing or at what resolution. I rarely buy the high-end CPU and have always gone for the mid-range, lumping the money into the GPU. BUT, if I see a clear difference in CPU performance at the higher resolutions, I want in. And I want you to tell me that, even if it's on a small sample size. I know I'm not gonna win you over and I know why, but still……

    • @VON_WA  10 days ago +6

      @@Phil-q7h You know, you're essentially saying "I FULLY understand the reason you're doing it this way and it is the correct reason, but I still want you to do what I want to see even though it's unreasonable."
      Just do what Steve said: check if you're CPU-bottlenecked, then upgrade. If you're still happy with the performance, then you don't need to upgrade, simple as that. You said you're a single-player gamer anyway, so most of the time it will be your GPU that's the bottleneck if you're playing at 2K/4K.

  • @bool2max  8 days ago +10

    Why test with upscaling @ 4K when we know that the internal resolution is closer to 1080p, for which we already know the results, i.e. that the newer CPU is faster?

    • @iyadkamhiyeh527  7 days ago +1

      Yeah I don't understand the point of choosing 4K Balanced Upscale

    • @kyre4189  6 days ago +2

      @@iyadkamhiyeh527 Because Hardware Unboxed once again failed as a review channel, and this video is just a cop-out to shift the blame onto the viewers. If your review needs a separate video because it confuses the viewers, and the separate video once again confuses the viewers and creates more questions, it's bad.

    • @iyadkamhiyeh527  6 days ago +1

      @@kyre4189 Yeah, very disappointing from Steve at the beginning to speak to viewers as if they're small kids who don't get it.

  • @vasheroo  10 days ago +3

    The Stellaris uplift from the 7800x3d to 9800x3d has got me thinking about the upgrade. You can never have enough performance to fight the true late game crisis 😂

  • @jacobb-c7946  10 days ago +2

    Thank you for making this. I got into an argument with a guy over this on your 9800X3D benchmarks. He said it's 100% worth upgrading to even if you have a 7800X3D, because it's 20% faster and you're stupid not to. But to me that simply isn't the case; it really heavily depends on what you are playing, the resolution you play at, and how many generations you can keep using the CPU, since a more powerful one will be less likely to bottleneck later GPUs. It's weird to me that people think anything other than this CPU is obsolete, when someone with a 7800X3D will probably only need to upgrade their CPU a GPU generation or so earlier than people using the 9800X3D, which for people who bought it when it came out is completely logical.
    And lastly, who really needs an extra 15-30 fps when you are already at 144 fps?

    • @memitim171  10 days ago +2

      My 7800X3D replaced a 4690K😆, I upgrade the CPU when I can't play a game I want to play, and not before.

  • @TeodorMechev  9 days ago +21

    I am pretty much aware of why reviews of CPUs are done at 1080p, and I appreciate your hard work and rigorous testing, but in this video I would have liked to see some comparison between the 7800X3D and 9800X3D in 1440p gaming, and how viable it would be to upgrade or not. Maybe in an upcoming one?!

    • @Your_Paramour  9 days ago +10

      If you watched the video and are asking this question, you haven't understood the video. Testing at lower resolutions tells you what the maximum throughput is of your cpu, as in the vast majority of cases, cpu performance is independent of rendering resolution. Any higher resolution that does not have the same performance means it is now the gpu side that is the performance limiter. You cannot expect a reviewer to give you a complete performance profile of every game as that is totally unreasonable, and not their job.

    • @discrep  9 days ago +1

      He literally explained why higher resolutions provide LESS accurate information -- because even the best GPU on the market will hit its limit of how many fps it can render before the CPU does. This is a CPU review, which compares the difference between CPUs, which cannot be determined if all of the benchmarks are the same because the GPU is the bottleneck. At lower resolution, when the GPU is not close to its limit, the CPUs will determine the max fps and you can more clearly see the true difference between the CPUs.
      Moreover, the actual fps numbers are irrelevant because everyone has different setups. You won't replicate their numbers unless the rest of your hardware and all of your software settings are identical to theirs. What you want to know is the relative performance gain.

  • @FastDistance1  10 days ago +3

    Great info, but just to give you an example of a user who wants 4K data to be included in the CPU benchmarks, even after watching the whole video and understanding all your points: I have a 7600 and a 4090 (for example). I am thinking about upgrading my CPU after the new part is released, and I primarily game at 4K (playing casual games). After this video, I'll go buy a new CPU because the 9800X3D is 17% faster on average than the 7700X. If that margin were under 10%, I wouldn't upgrade, and the 1080p data is not helping with my decision.

    • @mgk878  5 days ago

      If you're still asking this then maybe Steve has not explained it well enough.
      CPU performance is nearly independent of resolution. There's little or no extra work for 4K, because that work is done mainly by the GPU. So a "% uplift at 4K" is just the same number as 1080p, except that no GPU exists right now that can demonstrate that. Including those numbers in a CPU review would only help people with a 4090 and most people don't have one. Meanwhile the "CPU bound" numbers are for everyone. Most recent GPUs can achieve similar numbers just by turning down resolution and quality settings, until you get a CPU bind.
      The question of "will I get 17% more FPS" depends on your GPU and games, so the answer is really in the GPU benchmarks. I'd guess that most people asking this play at 4K/high settings and so are usually GPU bound, so the uplift would be about 0%. If I were you I'd save my money.
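
      To put hypothetical numbers on that: with a GPU delivering 80 fps at 4K/high, uplift = min(80, new CPU cap) / min(80, old CPU cap) - 1 = min(80, 180) / min(80, 154) - 1 = 0%, while a CPU-bound test of the same two chips shows 180 / 154 - 1 ≈ 17%. The 17% only materializes once the GPU side can exceed the older CPU's cap.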

  • @galinhadopai  10 days ago +5

    DLSS QUALITY 🤨 NOT BALANCED

  • @abhijitroutray2752  10 days ago

    Great video. The 'Future Proof' bit was eye opening. Thanks and keep up the good work.

  • @knuttella  9 days ago +9

    isn't 4k balanced just 2227 x 1253?

  • @rgstroud  10 days ago +11

    We understand the best-case processor performance comparison at 1080p, but we also want the graph you gave for 4K, only with Quality rather than Balanced upscaling. That would definitely tell those of us who have the 4090 and use epic settings whether it is worth upgrading from the 7800X3D, as we never use Balanced settings with this configuration; we bought the 4090 to use Quality settings worst case, and can add frame generation if the latency is acceptable at 60 fps native 4K. This will get even more useful with the 5090, when no GPU limits will exist for most games. We would also be greatly interested in a 4K Quality comparison between the 9800X3D/7950X3D and the new 9950X3D, if it has the 3D cache on both CCDs. The end of the video was very useful: I did not upgrade my 5950X OC to the 5800X3D for my 4K gaming rig, as the boost showed no help for many games, and even though the 5800X3D was better in some games, it was not worth the upgrade since I didn't have the 4090 yet. Now with the 4090, and soon the 5090, these 4K Quality-setting comparisons between CPUs ARE VERY important to represent real-world use cases.

    • @anitaremenarova6662  10 days ago

      I can tell you right now: it's not. AM4 owners should upgrade if they're buying a flagship, but the rest will easily be fine with Ryzen 7000 until the next platform drops.

    • @rgstroud  10 days ago

      @@anitaremenarova6662 Sorry, but real-world use cases are what people want to see. You can deny it all you want; that doesn't make it untrue.

  • @santeenl  10 days ago +10

    It's a lot about future-proofing. I've always bought a system for the long haul and then upgraded the GPU. You could buy a 9800X3D and just keep it for like 6-8 years, upgrading your GPU over the years.

    • @robertopadrinopelado  10 days ago

      It's probably possible to reach that kind of lifespan by getting a motherboard with the best connectivity available today, for the GPU as well as for the M.2 slots and the USB 4 ports.

    • @TheGreatBenjie  10 days ago +1

      There is not a CPU review in existence that can accurately say how "future-proof" a cpu will be, and that metric should be largely disregarded.

    • @Flaimbot  10 days ago

      Not just future-proofing: if you lock your fps, you also enjoy a much more stable framerate, due to MUCH higher 1% lows compared to weaker CPUs that would seem to fit the bill just from their avg fps.

    • @santeenl  10 days ago +1

      @@robertopadrinopelado Maybe respond in English, thanks.

    • @santeenl  10 days ago

      @@TheGreatBenjie It can accurately say it's WAY faster than a 9700X, for example. Even if you don't notice a lot of difference today at 1440p, you might in, let's say, 3 years.

  • @5etz3r  10 days ago

    Good video, I appreciate the one-off unique testing just to show examples and prove points about variables etc

  • @jacks3735  9 days ago +7

    Dlss balanced is 2227 x 1253. Barely more than 1080p. What's the point of this video?

    • @anarchonda7273  9 days ago

      What's the point of 4K120Hz+ monitors?

    • @samgragas8467  9 days ago

      @@anarchonda7273 The point is upscaled 4k and medium/high settings to get the perfect balance between frames and quality.

  • @joker927  8 days ago +3

    1% lows. This is what I am interested in as a 4K gamer with a 4090. Will a new CPU help make games smoother? Will this CPU make frame times more consistent? This is valuable info for a halo-product buyer like me.

  • @Aleaf_Inwind  10 days ago +3

    I think your message from Gamer's Nexus says it all: "we typically publish 1-3 1440p charts in games that are still mostly CPU-bound for perspective." It just gives another perspective. Sure, we can cross-reference and do the work ourselves, but we can also CPU-test ourselves, or cross-reference other reviews ourselves. It's nice to see the work done by a channel we trust to do it right, to give an alternative perspective on CPU performance in realistic situations. It doesn't mean we need a 45-game test with every possible config combination; just find a few games, maybe one that sits at the average for GPU-demanding games, one that sits at the average for overall CPU performance, and one that sits at the average for CPU-demanding games, and do a few tests like the ones you did here. You kept saying that the data was pointless, but even though I already understood your testing method and agree with it as the correct method for testing CPUs, I still found this data very fascinating to look at, and am glad you said that you might do another one in two years with the next CPU generation release.
    On a side note, I'm still sitting with a 5600x and want to upgrade, but I'm struggling to decide between a 5700x3D and a total platform upgrade to a 7700x. The 7700x would cost way more while not giving much benefit, but then I could upgrade again further down the line to a 9800x3D or potentially a 10800x3D if it stays on AM5, but there's always a chance that it could end up on AM6, and then it would probably be a better idea to skip AM5 altogether... Decisions...

    • @Coolmouse777
      @Coolmouse777 9 днів тому

      I hope they don't listen to you. You agreed that this data is pointless but still want it? Why?

    • @Aleaf_Inwind
      @Aleaf_Inwind 9 днів тому

      @@Coolmouse777 No, I didn't agree that it was pointless; I completely disagreed. I agreed that their testing methodology is best practice for showing raw CPU power. But it's also nice to see how that power scales in different real-world scenarios. I said it was fascinating and gives another perspective, which Gamers Nexus also seems to agree with, and they are one of the most technically minded channels around.

    • @Coolmouse777
      @Coolmouse777 9 днів тому

      @@Aleaf_Inwind There is no other perspective, because there is no additional data. If you want real-world results, just look at a GPU review too; it's simple to compare two numbers )

    • @Coolmouse777
      @Coolmouse777 9 днів тому

      @@Aleaf_Inwind But it doesn't scale; that's the point of the video. If the CPU gets 120 fps at 1080p, then 1440p and 4K will be the same 120 fps.

    • @Aleaf_Inwind
      @Aleaf_Inwind 9 днів тому

      @@Coolmouse777 There were plenty of cases, like Cyberpunk 2077, where the 9800X3D got 219 fps at 1080p and only 166 fps at 4K Balanced, a reduction of almost 25%, while the 7700X only gets 158, a reduction of only about 5% from the 166 it gets at 1080p. So no, you don't get the same fps; you get less, and it's nice to know how much less, because it's not one fixed percentage. Sure, you can cross-reference GPU reviews to get an idea, but as you can see it's not a straight number: the 4090 still gets more frames with the 9800X3D than it does with the 7700X at 4K Balanced.
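
      A quick way to see why the two CPUs drop by such different percentages is to model the delivered framerate as roughly min(CPU limit, GPU limit). A minimal sketch in Python, using the fps figures quoted above; the ~166 fps GPU ceiling at 4K Balanced is an assumption inferred from the 9800X3D result, not a measured number, and the min() model is a simplification:

      ```python
      # Toy model: delivered fps is roughly min(CPU limit, GPU limit).
      # CPU limits are the 1080p results quoted above; the GPU ceiling
      # at 4K Balanced is assumed from the 9800X3D's 4K result.
      def delivered_fps(cpu_limit, gpu_limit):
          return min(cpu_limit, gpu_limit)

      GPU_CEILING = 166  # assumed 4090 limit at 4K Balanced in this scene

      for cpu, cpu_limit, measured in [("9800X3D", 219, 166), ("7700X", 166, 158)]:
          predicted = delivered_fps(cpu_limit, GPU_CEILING)
          drop = (1 - measured / cpu_limit) * 100
          print(f"{cpu}: predicted {predicted} fps, measured {measured} fps "
                f"({drop:.1f}% below its 1080p result)")
      ```

      The model predicts the 9800X3D result exactly and slightly overestimates the 7700X (166 predicted vs. 158 measured), because in practice CPU and GPU limits overlap rather than switching over cleanly.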

  • @SidorovichJr
    @SidorovichJr 5 днів тому

    Thanks for the 4K benchmarks; these are really practical tests to help us make a decision.

  • @dotxyn
    @dotxyn 10 днів тому +56

    Fantastic 1080p vs 1253p comparison Steve😁

    • @johnhughes9766
      @johnhughes9766 10 днів тому +12

      And look how much the results dropped at less than 1440p lol 😂

    • @coffeehouse119
      @coffeehouse119 10 днів тому +13

      Yeah, I don't see the point of the video. Why change settings and not just resolution? Why test vs the 7700X instead of the 7800X3D... strange video from Steve.

    • @STATICandCO
      @STATICandCO 10 днів тому +10

      I know it's missing the point of the video, but DLSS Quality definitely would have been more 'realistic'. DLSS Balanced doesn't really represent 4K or a realistic use case for 4090 users.

    • @Donbino
      @Donbino 10 днів тому

      @@STATICandCO I actually disagree. Digital Foundry, along with many, many 4090 users including myself, use DLSS Performance; I use it on my 32-inch QD-OLED. I actually can't believe how much performance I'm leaving on the table lol

    • @kerkertrandov459
      @kerkertrandov459 10 днів тому +5

      ​@@Donbino DLSS Performance on a 32-inch. You must be blind if you can't see how bad that looks, considering on a 32-inch it's much more noticeable than on a 27-inch.

  • @adamchambers1393
    @adamchambers1393 10 днів тому +3

    My view is that I'm gaming at 4K native now and am waiting to purchase the LG 45" 5120x2160, and I just want to see if it's worth upgrading from the 5950X when I purchase an RTX 5090 next year, or if I can save the £1k on CPU, mobo, RAM etc. because the average increase remains at only 5%, in which case I can wait for the Zen 6 10800X3D or whatever it will be. I can't see that without some basic 4K numbers. If I looked at the 1080p results without the 4K shown, would I have to assume I'd get the same uplift, with no data to tell me that I wouldn't? (Yes, I'm aware an educated guess means it won't be exactly those numbers, but an uplift % still helps decide, because at some point a GPU will allow that overhead.)

  • @PopTart-j8u
    @PopTart-j8u 10 днів тому +5

    This is exactly the video I was looking for, thank you!

  • @martheunen
    @martheunen 10 днів тому

    Excellent video! Thank you! Possibly the best one on this topic yet! In the last video I was wondering if you'd do one of these explanatory videos about high res vs CPU etc.; my wording in the comments of said previous video was somewhat lackluster. But here now is exactly the type of video I was wondering if you'd make.
    I guess these kinds of videos are not your favorite to make, but personally I enjoy them and hope to learn something new about current-gen hardware, and I also think they are very good for newcomers.
    Unfortunately for you, I guess this won't be the last one of its kind, but again, every now and then (once every 2 years????) a refresher on this topic is useful for the community, I think.

  • @jerghal
    @jerghal 10 днів тому +24

    At 13:40, about the upscaling poll: Steve says 80% use upscaling. But then it must be the wrong poll (this is the 1440p poll instead of the 4K poll), because what I see on screen is 43% using upscaling (8% Balanced and 35% Quality mode). 39% (the biggest group) DOES NOT use upscaling at 1440p. So why test with Balanced if that is the smallest (8%) group, unless it's the wrong poll 😅.

    • @CookieManCookies
      @CookieManCookies 10 днів тому +2

      Such a tiny sample; how many YouTube viewers are watching for these polls? I didn't even know there was one.

    • @jerghal
      @jerghal 10 днів тому +1

      @CookieManCookies I happened to answer that poll 😁. Well both of them.

  • @msolace580
    @msolace580 10 днів тому +23

    Not putting the 7800X3D on the chart is a miss.. but we can assume there is no reason to upgrade to the 9800X3D lol

    • @codymonster7481
      @codymonster7481 10 днів тому

      or an intel

    • @andersjjensen
      @andersjjensen 10 днів тому

      The day-one review said 11% faster on average. The 40-game benchmark marathon said 8% (like AMD's marketing), so unless you happen to play one of the games where the gains were 20% all the time, AND that game is not performing adequately for you, then yeah... going from the 7800X3D to the 9800X3D is pretty much "e-wanking". Which is fine if your kids don't need new shoes or something.

    • @emiel255
      @emiel255 10 днів тому +2

      He doesn’t he doesn’t really need to as he made a dedicated video comparing the 9800X3D with the 7800X3D plus his day 1 review he compared the 9800X3D with many other CPU’s

  • @MrHC1983
    @MrHC1983 9 днів тому +15

    Upscaling is not NATIVE 4K brother....... sorry but this whole video is a wash.

    • @iliasguenou4930
      @iliasguenou4930 9 днів тому +3

      It increases CPU load, which is why he used it. Testing native 4K is a GPU test.

    • @Blaze72sH
      @Blaze72sH 4 дні тому

      Why would you care about a GPU-bound test in a CPU test video?

  • @mr.waffles2555
    @mr.waffles2555 9 днів тому

    This test was amazing. I have a few friends who haven’t quite grasped this concept and sharing this video with them finally bridged the gap. Thank you for all that you do.

  • @mption293
    @mption293 9 днів тому +3

    The most important thing is the big question: "Do I need to upgrade?" Telling me how many FPS it gets at 1080p doesn't tell me that. It's very useful for overall CPU performance when it is time to upgrade, but if I'm running a 12th-gen Intel and the 9800X3D is getting broadly the same results at 4K, or at 1440p if that's what I play at, I don't need to upgrade.

    • @Skeames1214
      @Skeames1214 9 днів тому +1

      "The most important thing is the big question, do I need to upgrade?"
      There is no way that question can be answered by a CPU review. It's completely subjective; only the individual can decide. They don't know what exactly you want out of your system, what games you play, etc. If you: 1. already have a high-end CPU, 2. play demanding games at high resolutions and quality settings, and 3. are happy with your performance, then don't upgrade. If you aren't happy with the performance and you're CPU-limited, 1080p benchmarks give you the best idea of which CPU *can* reach the number you're targeting, and you can adjust quality settings or upgrade your GPU from there.

    • @mption293
      @mption293 9 днів тому +1

      @Skeames1214 Yeah, data sets at high resolutions are totally useless at helping someone make that decision /s
      This answer tells me there is no point in consuming this content until I make that decision. A product review should show whether something will benefit the consumer, not just that it's the fastest!

    • @Skeames1214
      @Skeames1214 8 днів тому

      @@mption293 But whether or not it will benefit the consumer is subjective. You can’t offer a review that answers that question for everyone. You don’t know what they want, what their other hardware is, how often they upgrade, etc. Tiering the CPUs by their raw performance is a useful data point for all consumers, and more importantly the point of a *CPU* review.

    • @mption293
      @mption293 8 днів тому

      @@Skeames1214 Whether the 1080p data is beneficial is subjective too. Reviews are supposed to guide you to what's best for your needs, not just to say "here are all the possibilities, this is the fastest in a specific use case." More data might not help everyone, but this data doesn't help everyone either.

    • @Skeames1214
      @Skeames1214 8 днів тому

      @@mption293 "Whether the 1080p data is beneficial is subjective too." No, it's not. It is objectively more useful in the context of a CPU review. It shows differences in CPUs instead of masking them behind the limitations of other parts. Would you want them to test GPUs at 1080p with a 10600k? Because that's the same logic. "Show me what this can do when it's artificially handicapped by another component"
      "Reviews are supposed to guide you to what's best for your needs." No, they aren't. You're talking in circles. Nobody knows what *your* needs are other than you. You can make that determination. This isn't guesswork, knowing what CPU and GPU are capable of independently will give you everything you need to know about what parts are right for you. Reviews are to evaluate the quality of the product in question. Not to evaluate artificially limited versions of it. That's how literally every single review from every reputable channel works. They tell you what the product is capable of now, which in turn tells you more about how it will age. If you actually only care about how specific games will run at specific resolutions with specific graphics settings and a specific GPU on a specific CPU, look up benchmarks for it.

  • @BenBording
    @BenBording 10 днів тому +8

    I completely get why you test CPUs specifically at 1080p. Makes a ton of sense when you test CPU vs CPU.
    BUT as one of the lucky people gaming at 5160x1440 ultra settings with a 4090 on my old 5900X, this review still hit the bullseye. Almost. These benchmarks at higher resolutions are very helpful, for me personally, to judge whether or not an upgrade is even worth it in my specific use case. So thanks for the video; this was exactly what I was missing as a bonus!

    • @pepijngeorge9959
      @pepijngeorge9959 10 днів тому +3

      The point of the video is that high-res testing doesn't tell you anything about the CPU that 1080p testing doesn't already tell you.

    • @Dempig
      @Dempig 10 днів тому +1

      I wouldn't call 5160x1440 lucky; ultrawide looks really bad lol, just get a big 65"+ OLED or something, it would look so much better. What's the point of an extra-wide display with a teeny tiny vertical view? Most games look very bad on ultrawide, especially first-person.

    • @BenBording
      @BenBording 10 днів тому +3

      @@Dempig lol you obviously never tried it then, but perhaps that's not your fault. 65" is just neckbreaking and looks trash up close. Most games look amazing on super ultrawide. ;)

    • @BenBording
      @BenBording 10 днів тому

      @@pepijngeorge9959 No, but it does tell you something more detailed in very specific use cases, as Steve rightly points out multiple times in the video.

    • @Dempig
      @Dempig 10 днів тому +1

      @@BenBording Nah man ultra wide is terrible for literally every game lol like I said it completely ruins first person games , and having such a tiny vertical viewing range with a large horizontal viewing range makes no sense.

  • @Tostie1987
    @Tostie1987 10 днів тому +10

    Steve, do you still not understand that people also just want to see benchmarks that mostly mimic their own system? That (at least for me) is why I would like to see 1440p benchmarks. This way I have some sort of picture of what I can expect if I buy this CPU.

    • @samgragas8467
      @samgragas8467 10 днів тому +4

      You can expect to get more FPS if you are CPU limited at the framerate you would like to play. That is all you need to know.

  • @Flaimbot
    @Flaimbot 10 днів тому +2

    I think the only thing missing to drive the point home would be adding 720p (in fact, my preferred benchmark res, since even 1080p sometimes hits some bottlenecks in the GPU).
    Regarding 13:23:
    You've partially included what I was about to request by intermingling the 1080p-vs-4K(DLSS) data, but I think it would still be helpful as a standalone video.
    All the DLSS/FSR/XeSS performance requests for GPUs are just as pointless as high-res CPU benchmarks. All they show is the base-resolution data (minus a bit from the overhead of those techniques).
    Showing this correlation between upscaler base resolution and the respective native resolution of that base could help demystify the technology for laypeople.
    And while at it: frame generation, according to my understanding, only leverages the free resources of the GPU in the case of a CPU bottleneck, up to twice the native frame rate, due to interweaving native and generated frames (see the sketch after this comment). This could also make for another nice video, showing how the FG framerate remains consistent across multiple CPUs in the same game (given my understanding is correct).
    38:29 you mean "again" ;)
    Big updoot for that type of content 👍
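
    If the commenter's understanding of frame generation is right, the behavior is easy to model: output is native frames interleaved 1:1 with generated ones, so at most 2x the CPU-limited rate, and never above whatever the GPU can still deliver. A toy sketch under exactly those assumptions; the 140 fps ceiling is made up for illustration, not a measured limit:

    ```python
    # Toy model of frame generation under the commenter's assumptions:
    # native frames are interleaved 1:1 with generated frames, so output
    # is at most 2x the CPU-limited rate, capped by the GPU's own limit.
    # All numbers are illustrative, not measured.
    def fg_output_fps(cpu_limited_fps, gpu_fg_ceiling):
        return min(2 * cpu_limited_fps, gpu_fg_ceiling)

    GPU_FG_CEILING = 140  # hypothetical GPU limit with FG enabled

    for cpu_fps in (40, 60, 80, 120):
        out = fg_output_fps(cpu_fps, GPU_FG_CEILING)
        print(f"CPU-limited {cpu_fps} fps -> ~{out:.0f} fps with FG")
    ```

    Under this model, once 2x the CPU-limited rate exceeds the GPU ceiling, the FG output stops depending on the CPU at all, which is exactly the "consistent across CPUs" behavior the comment predicts.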

  • @Superior85
    @Superior85 10 днів тому +8

    If the most popular option in the poll at 13:41 was native 1440p, why not at least include that result in reviews? It could be useful both for 1440p gamers and for those who use 4K Quality, since 1440p is the 4K Quality DLSS rendering resolution...

    • @Dark-qx8rk
      @Dark-qx8rk 10 днів тому +1

      1440p with upscaling would in effect give about the same fps as 1080p, which would make the 9800X3D a great choice.

  • @BrutalSavageYT
    @BrutalSavageYT 8 днів тому +3

    I actually hate the fact that DLSS has become such a crutch; just finding a benchmark without it at high resolutions is a nightmare.

  • @MaggotCZ
    @MaggotCZ 9 днів тому +4

    There is a difference between using the best quality mode for upscaling and using Balanced or below, because at that point even 1440p native is more taxing on the GPU and looks better. You practically tested 1253p with a 4090, lmao, no shit that's still CPU-bound.

    • @samgragas8467
      @samgragas8467 9 днів тому

      Balanced looks better and is a bit more taxing. In UE 5 it is less taxing btw.

  • @Cuthalu
    @Cuthalu 10 днів тому +1

    Fantastic video, and it's something the community really does seem to need.

  • @DaKrawnik
    @DaKrawnik 10 днів тому +8

    Just do 1080p and 1440p and be done with it.

    • @Coolmouse777
      @Coolmouse777 9 днів тому +4

      no reason to do 1440p at all

    • @LucasHolt
      @LucasHolt 9 днів тому +2

      @@Coolmouse777 it's becoming the default res for new monitors so it will matter

    • @TAG_Underground
      @TAG_Underground 9 днів тому +2

      @@Coolmouse777 More people game on 1440p than 4K.

    • @Coolmouse777
      @Coolmouse777 9 днів тому

      @@LucasHolt So what? Performance at 1440p and 1080p is the same when CPU-bottlenecked.

    • @LucasHolt
      @LucasHolt 9 днів тому +1

      @@Coolmouse777 In the real world, people want to know how the game will run on their computer. As 1440p becomes the default, that is what people will want to see. This happened with previous resolution bumps too. Otherwise, they'd still be benchmarking at 640x480.

  • @marcinsobczak2485
    @marcinsobczak2485 10 днів тому +3

    Steve is going to kill me, but what is the point of testing at 4K with upscaling? 😅 When we know it is not real 4K? If I wanted a high frame rate and blurry graphics I would simply switch to 1080p.

  • @Vincent-v9q
    @Vincent-v9q 10 днів тому +7

    Steve, you are a GENIUS! (no)
    4K + DLSS Balanced = 2160 x 0.58 ≈ 1253p = 2227x1253... It's not even 1440p!
    You are comparing 1920x1080 vs 2227x1253. You really need to compare 4K native!
    We are waiting for a normal test. For example, will there be a difference at 4K between a 7500F/12400F and the 9800X3D? Without upscalers! You can add "DLSS Quality" as a separate line, but it is not necessary.
    Without upscalers, it will be clear where to invest money. The difference in real prices between the 9800X3D and the 7500F is about 400 dollars. You could invest those $400 in a good 4K monitor and not see the difference between the 7500F and 9800X3D even on the 4090.

    • @Hardwareunboxed
      @Hardwareunboxed  10 днів тому

      I'm not sure what you're talking about, balanced DLSS at 4K still has 31% more pixels than native 1440p.

    • @Vincent-v9q
      @Vincent-v9q 10 днів тому

      @@Hardwareunboxed I can't give you a link, just Google "dlss quality resolution". The Quality preset is 66.6% of native, Balanced is 58%, Performance is 50%, and Ultra Performance is 33% of native.
      Setting the Balanced preset at 4K gives you a real render of 3840*0.58 = 2227, i.e. a resolution of 2227x1253.
      The scaling is not based on the total number of pixels! Each side of the image is reduced by that percentage!
      Same in Hogwarts: if you go into the settings, set 4K and pick DLSS Quality, for example, you will see a render resolution of 67% (2562x1441), which is essentially native 2560x1440 if you count the dots. 4K + DLSS Balanced is MUCH LOWER than native 2560x1440.
      I'm ashamed that I have to explain this to a channel with over a million subscribers, who think that DLSS percentages reduce the total number of pixels across the entire image area rather than each side.
      Steve, remember! 4K + DLSS Quality ~= QHD (2560x1440) native. 4K + DLSS Balanced = something in between QHD and FHD. 4K + DLSS Performance = FHD.
      Waiting for a normal test at native 4K!

    • @Vincent-v9q
      @Vincent-v9q 10 днів тому +5

      @@Hardwareunboxed I understand how you calculated it.
      3840*2160 = 8,294,400; x 0.58 = 4,810,752
      2560*1440 = 3,686,400
      4,810,752 / 3,686,400 = 1.305 = ~30.5% more
      DLSS doesn't work THAT way!
      You're wrong... I described how DLSS works above.
    • @Vincent-v9q
      @Vincent-v9q 10 днів тому +5

      @@Hardwareunboxed It works like this: 4K DLSS Balanced =
      (3840*0.58) x (2160*0.58) = 2227 x 1253 = 2,790,431 pixels
      2560*1440 = 3,686,400 pixels
      3,686,400 / 2,790,431 = 1.32, i.e. QHD (2560x1440) has 32% more pixels than 4K DLSS Balanced (2227x1253 real resolution), as tested in this video!
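
      The disagreement above comes down to whether the DLSS scale factor applies per axis or to the total pixel count. A minimal sketch in Python of both readings, using the commonly cited preset factors (the exact factors are assumptions; games report the per-axis render resolution in their settings):

      ```python
      # DLSS scale factors apply per axis, so the pixel count scales by
      # factor**2. Factors below are the commonly cited preset values.
      PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

      NATIVE_1440P = 2560 * 1440  # 3,686,400 pixels

      for name, f in PRESETS.items():
          w, h = round(3840 * f), round(2160 * f)
          px = w * h
          print(f"4K {name}: renders {w}x{h} = {px:,} px "
                f"({px / NATIVE_1440P:.0%} of native 1440p)")

      # The mistaken reading applies 0.58 to the total pixel count:
      # 3840 * 2160 * 0.58 = 4,810,752 px, i.e. *more* than native 1440p.
      ```

      Run it and 4K Balanced comes out to 2227x1253, about 76% of native 1440p's pixel count, which is the commenter's point; applying 0.58 to the area instead is what produces the "31% more pixels than 1440p" figure.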

  • @zuluagaco
    @zuluagaco 10 днів тому +1

    Don't worry about rude comments. Just add real-case scenarios at the end of the benchmarks to see whether people with popular CPUs should upgrade or not. Thanks for making amazing videos. Love Monitors Unboxed, by the way..

  • @UsernameInvalid48
    @UsernameInvalid48 10 днів тому +9

    There are instances where 4K still has a CPU bottleneck. It's very uncommon, but if you look at Dragon's Dogma 2 or Monster Hunter Wilds, those games get CPU-limited at 4K with a high-end graphics card. That's all because of Capcom's shitty engine, which doesn't work well for open worlds at all optimization-wise. I got a 9800X3D just to be able to keep those games as close to 60 fps as possible.

  • @Thor847200
    @Thor847200 9 днів тому +6

    Ok, I find nothing wrong with testing at 1080p, but personally I want to know what I am getting if I buy that CPU for 1440p or 4K gaming. I don't want to guess or spend multiple hours trying to figure out how to do the math properly to estimate 4K benchmark data. If the CPU gives no uplift at 4K, I want to know that. If it does give an uplift, I want to know that too.
    You say that it is "pointless" to show anything above 1080p because I would have to have the exact same setup in the same games as you to decide whether to buy the CPU. But here's the thing: I don't play at 1080p. So you ONLY giving 1080p data means nothing to me because, as you said about 4K, I don't have the exact same setup you are using in this video. In fact, I would bet the majority of people don't have a 4090 to compare against. I understand why you use it in the benchmarks, to limit GPU bottlenecking, but if you are going to say that I would need the exact same setup you use at 4K, then you could apply the same logic to 1080p. By your own logic, I don't use 1080p, any of the other CPUs you showed, or a 4090, so all of that 1080p data is pointless except to show that the new CPU is faster than the older CPUs. And that is why I like to see 4K data: to see how much faster, if at all, the new CPU is against other CPUs at 4K. I would expect the new CPU to be faster at 1080p; that should be fairly obvious unless it is a dud of a product in general.
    I also would have been interested to know whether the 9800X3D gives any uplift with ray tracing, as ray tracing is at least partially CPU-dependent. All that being said, to me it isn't pointless to see the 4K data at all. I do appreciate you making this video even if you don't see a good reason to in your own opinion. But there are a good number of us out there who understand that 1080p is mostly irrelevant to us other than showing how much faster a new CPU is. That doesn't mean we shouldn't get to see the new CPU's data at resolutions higher than 1080p. Most reviewers don't go back and retest most GPUs for a new CPU, especially at the end of a GPU generation, so 4K numbers for a new CPU are typically the latest benchmark updates we will get until a new GPU lineup comes out. So, thank you for making the video. And sorry for the wall of text.

  • @mananabanana
    @mananabanana 10 днів тому +3

    You could have made this admittedly great video without being salty throughout; it would have been a better watching experience for us, and you'd be more relaxed as well. Don't get caught up with the bottom-of-the-barrel commenters, Steve! For our sake and yours.

    • @Hardwareunboxed
      @Hardwareunboxed  10 днів тому +1

      Salty? WTF I made a few jokes while trying to explain the testing.

  • @fmatax
    @fmatax 10 днів тому

    Thank you! That "future proofing" section was awesome. I knew that in theory, but hadn't actually seen an example. It was quite the eye-opener.

  • @tadeuszkubera3060
    @tadeuszkubera3060 10 днів тому +3

    About the CS2 data:
    Surely, even when over 500 FPS is plenty for virtually everyone, the difference between ~250 1% lows and ~340 1% lows must be noticeable on, for example, a 360 Hz monitor.
    Aren't 1% lows more important for a competitive shooter than average FPS?

    • @Weissrolf
      @Weissrolf 10 днів тому

      Can you detect 5 out of 500 frames per second coming less than 1.1 ms later (4 vs. 2.94 ms)? Very likely not.

    • @tadeuszkubera3060
      @tadeuszkubera3060 10 днів тому +1

      @Weissrolf
      I don't think that's how 1% lows work.
      The lows are not evenly spaced across time. A 1% low of 250 vs 340 rather means that someone will have about 3 seconds at that frame rate or lower per 5 minutes of playing time, likely bunched together.
      0.1% lows would tell you how low those drops can go.
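
      Both readings can be checked against how the metric is usually computed. A minimal sketch, assuming the common "average of the slowest 1% of frames" variant (reviewers differ on the exact method), with a synthetic frametime trace where the slow frames arrive bunched together, as described above:

      ```python
      # 1% low as commonly computed: the average framerate over the slowest
      # 1% of frames. Methods vary between outlets (some use the single
      # 99th-percentile frametime instead); this is just one common variant.
      def one_percent_low(frametimes_ms):
          worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99):]
          return 1000 / (sum(worst) / len(worst))

      # Synthetic trace: mostly 2 ms frames (~500 fps) plus a clustered stutter.
      trace = [2.0] * 990 + [4.0] * 10  # ten slow frames bunched together

      avg_fps = 1000 / (sum(trace) / len(trace))
      print(f"average: {avg_fps:.0f} fps, 1% low: {one_percent_low(trace):.0f} fps")
      ```

      This trace averages ~495 fps yet reports a 250 fps 1% low, and whether those ten slow frames are perceptible depends on exactly the clustering question debated above.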

  • @tmright18
    @tmright18 10 днів тому +7

    Honestly, it's about time this was done. In fact, I'd like to see this with actual 4K, or DLSS set to Quality. I also play competitive shooters, but 300 fps vs 350 fps is not going to matter to me. I'm interested in frametime consistency and lowest fps at very high graphics settings. Next time Quality DLSS please :)

    • @RainboomDash
      @RainboomDash 10 днів тому

      You can subtract a little over 10% difference from balanced results and get a very rough guess

  • @jongamez5163
    @jongamez5163 10 днів тому +13

    4K Performance/Balanced/Quality are far from native 4K since they use much lower render resolutions, but yeah. Imo the real selling point of a 9800X3D at 4K would be the 1% and 0.1% lows, which can probably give you a smoother experience due to fewer FPS drops. I have a 5900X and most games still run great on my 4090 at 4K. But when the 5090 releases in a few months, a 9800X3D might be needed...

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 10 днів тому +5

      Yep! Was happy to see him actually doing 4k testing but then very upset that he is using upscaled "4K" Steve is a joke and I will continue to not take him seriously 😂

    • @RADkate
      @RADkate 9 днів тому

      there is zero reason not to use dlss at 4k

  • @Crankshaft_NL
    @Crankshaft_NL 10 днів тому +2

    For me, the underlying question when looking at 1440p testing is what the best upgrade step is (most fps for the money): CPU or GPU first. For sure the 1080p testing is the best, no comment on that. But content like the min-max piece you did with the 7600 + 4070 Ti vs the 7800X3D + 4070 is welcome from time to time, to help consumers extrapolate to their own questions about upgrade paths and where to put their money.

  • @OfficialEthern1ty
    @OfficialEthern1ty 10 днів тому +22

    As a long-time 4K user, I would have liked to see 4K native. Was it not done due to the testing methodology? Length of testing? Would testing with upscaling be closer or equivalent to 1440p testing?

    • @zacharyspencer2285
      @zacharyspencer2285 10 днів тому

      My guess would be that most people use either Quality mode DLSS or Balanced mode for the extra performance. If you have a smaller screen, such as 32 inches, I've noticed even Performance mode works well with DLSS, and it's hard to tell the difference in a lot of games. The bigger the screen, the easier it is to notice DLSS messing with the image and falling short of native, but I think most people use DLSS Quality or Balanced with their graphics cards, or FSR Quality if they have AMD.

    • @DB-cu3xu
      @DB-cu3xu 10 днів тому +4

      I'm sure this was to emphasize CPU performance at 4k rather than focus on a more GPU limited scenario.

    • @vampe777
      @vampe777 10 днів тому

      They have done it with DLSS because most people voted for it.

    • @floriang2801
      @floriang2801 10 днів тому +2

      @@DB-cu3xu Which kind of defeats the whole point of the video…

    • @anitaremenarova6662
      @anitaremenarova6662 10 днів тому +1

      You can't play modern games at 4K native, not unless you want a console experience (30FPS)

  • @JDurnall
    @JDurnall 10 днів тому +3

    C'mon Steve, no 1440p benchmarks? What is this world coming to

  • @edmon-pf9fp
    @edmon-pf9fp 10 днів тому +16

    This is crazy. Why are you not showing 4K native? You're showing 1080p native and 4K Balanced upscaling. Come on, mate. I don't see what this video is showing.

  • @garbuckle3000
    @garbuckle3000 10 днів тому +1

    Your previous videos on the subject definitely helped in making my decision on where to upgrade last year. I do watch cpu reviews to mainly get an idea of what's good value, and what makes sense for my situation. I play at 4k with a 6950XT, so I know a high-end gaming CPU does little. In some games, it helps the 1% lows. And it's helpful for future games. This is why I went with the 7900X3D when I found it for $280. Definitely an upgrade for multitasking work, and the benefit of some cache to help a bit on some games, for an amazing price. I think the big issue is that people are looking at the ridiculous FPS numbers on the chart, and not the % increase. These charts should always be seen as "up to" X performance.
    In the end... It depends.

  • @CookieManCookies
    @CookieManCookies 10 днів тому +4

    If you are spending $800 on a new motherboard and CPU, you're crazy to think these people can't afford a 4K gaming display, or only care about dollars per frame. You don't need to choose the exact same settings people choose in real life; "High" is pretty representative. For me, 4K benchmarking is a valid concern because I'm coming from a 3950 and my big choice is between the 285K and the 9800X3D. I haven't gamed at 1080p in ~20 years, so yes, it's completely pointless benchmarking unless you're one of the 100 people on Counter-Strike fighting for 300+ fps.

    • @tdrm
      @tdrm 10 днів тому +1

      You seem clueless about Counter-Strike popularity. That game has millions of active players, not 100.

  • @Strix-gp9xg
    @Strix-gp9xg 8 днів тому +3

    Why 4K with balanced upscaling? Why not test at native 4K?

  • @ChoppyChof
    @ChoppyChof 10 днів тому +5

    I'm always interested in seeing if I can improve my setup, and, if I can, what the cost vs. return is to do so. So when I see a new CPU, for me at least, I want to see if it makes any difference at all at 4K ultra native over my current setup (5800X3D and 4090). I highly expect "no to marginal" as the answer, but I'm still interested to see the data.

    • @xpodx
      @xpodx 10 днів тому

      Right. I have an i9-9900K and a 4090. At 4K 144Hz and at 5K I'm GPU-bound, but I'm wondering/hoping my i9 will fully keep up with a 5090.

    • @matejnemec3022
      @matejnemec3022 10 днів тому

      @@xpodx It will definitely hold you back in some games, unless you enjoy playing at sub-60 fps. If you play competitive games with competitive settings it will be a massive bottleneck. You should definitely upgrade your CPU for a 5090.

    • @xpodx
      @xpodx 10 днів тому

      @matejnemec3022 I game at 4K 144Hz on ultra and high settings. In more demanding titles I get 70. But in the games I mostly play, such as MW19 and Vanguard CoD, I get 165-225 at 105% render scale while fully using my i9-9900K. I think it can handle the 5090; I can always bump the render scale to 150% in other games, making them look a tiny bit sharper.

    • @anitaremenarova6662
      @anitaremenarova6662 10 днів тому

      No it won't; a CPU upgrade would only improve your FPS when upscaling, because the 4090 is limiting your 5800X3D at 4K native.

    • @xpodx
      @xpodx 10 днів тому

      @anitaremenarova6662 Yeah, I wish he could have tested 4K ultra with no upscaling in this video, ha.

  • @joeschmoe-w8z
    @joeschmoe-w8z 10 днів тому +2

    The tests you did, though, show there is a benefit to 4K testing in games like Assetto Corsa Competizione and Remnant 2. As you said earlier, 1080p testing, although it is a resolution from a long time ago when Nvidia cards excelled at ultra settings, is mainly used to show the differences between the parts and not as practical advice for how the game should be played. 4K benchmarks, on the other hand, could conceivably be how a user who owns a 4090 would actually use their card. That being said, there are a wide variety of hardware configurations; the point is that by putting a load on the processor you come closer to an actual scenario gamers may find themselves in.
    The thing is, Remnant 2 and Assetto Corsa Competizione actually excel on the Intel Core Ultra 285K. I think the reason they excel is that demanding settings use the memory bandwidth of the CPU cache and potentially the RAM, which exerts pressure on the processor's memory subsystem, and pushing out frames to decrease latency between the GPU and the CPU is a function of a CPU's ability to calculate quickly. Although the 285K's P-cores are getting slower, its many cores give it better aggregate memory bandwidth than older processors. Because its P-cores are less performant, there's actually a regression in P-core speed between 14th gen and Core Ultra 2nd gen (or, in reality, thanks to Intel's terrible naming scheme, Core Ultra 1.5 gen). Essentially, this leads to scenarios where its single-core benchmarks tell a different story than its actual gaming performance under load.
    Although it's true the 9800X3D is always the best CPU here, the 7700X isn't always better than the 285K. I suspect that if games were tested that put more of a load across all three processors, the 7700X would easily be the worst of the three, although admittedly that could simply be because newer games become designed around the performance characteristics of the Core Ultra 285K. I wouldn't say any of these processors is a deal-breaker for the performance it offers, although it may be for its cost per frame. Even a 3317U, if you could load up Windows 10 with it overclocked, would do an adequate job with an RTX 4090.

  • @Hjominbonrun
    @Hjominbonrun 11 днів тому +38

    wait, He is not lounging on his couch.
    I don't know what to make of this.

  • @xshadowinxbc
    @xshadowinxbc 10 днів тому +12

    *Tries to prove a point*
    *Uses DLSS balanced upscaling rather than native at 4k, even when it's absolutely not necessary*
    *Doesn't even show the 5800X3D or 7800X3D vs the 9800X3D at those resolutions*
    I don't think anyone is trying to pry the "1080p testing is useful to show CPU performance" argument out of your sweaty palms. The point is that most people are trying to decide whether an upgrade is worth it for what they're actually going to be playing, at the resolutions they will actually be using. This becomes more and more relevant as the CPU starts costing as much as, or more than, many GPUs. That's why we want to see how well it fares in both GPU- and CPU-limited scenarios (and in the middle, which is basically what you did here), rather than just one of the two. It's not like it makes the 9800X3D look much worse; even in GPU-limited scenarios it still provides benefits. This video is at best disingenuous, and I'm going to kind of sit there laughing when people plunge nearly $500 into a 9800X3D and later realize it isn't doing jack shit with their budget GPU... rather than saving a lot of money by, say, grabbing the 7600X3D (while still being on a modern platform) and putting the savings toward a better GPU. Furthermore, "in the future" is a moot point considering "the future" will have its own new CPUs which might offer much better price-to-performance, or hell, you could probably get the 9800X3D on sale (perhaps secondhand) for a fraction of its current price anyway. What is the future, and when is the future that actually matters?

    • @Nick-05
      @Nick-05 9 днів тому +1

      Nailed it

  • @Rafyelzz
    @Rafyelzz 10 днів тому +4

    I don't think most people are criticising the methodology, but rather the fact that it doesn't help them make a decision, because most of those considering this purchase don't use 1080p anymore. That's all. For me, understanding the gains at 1080p is useful for understanding the technical innovation, but I go to the 4K tests to see if it's worth it for me. Dramatically different points of view.
    This video is great because at the end it shows how GPU bottlenecks impact fps with different CPUs.

  • @tyraelhermosa
    @tyraelhermosa 9 днів тому

    Great job. You nailed it. That example at the end with the CPUs running on the 3090 vs the 4090 makes it so clear.

  • @benfoster93
    @benfoster93 11 днів тому +7

    Thanks Steve, I moaned at you for real-world results and you've delivered. I'll devour this video when I've finished work.

  • @asranssrg
    @asranssrg 10 днів тому +20

    The 4K results are worthless as they use Balanced upscaling; who decided this was needed? Sorry you wasted time on this. Couldn't you have provided 4K results without changing the variables? An apples-to-apples comparison. The 4K results already show less of a performance uplift than the bs 1080p results. I'd rather have raw data and make my own interpretations. And no 7800X3D or 5800X3D to compare against; not a good comparison.

    • @nicolastremblay7364
      @nicolastremblay7364 10 днів тому +9

      Balanced DLSS means 1253p (internal resolution), so not a good comparison I think either... and by the way, I fully understand that CPU tests must be done at 1080p to avoid a GPU bottleneck... but for me, I'm more interested in seeing CPUs tested at 4K (no upscaling), because I never use any other resolution!

    • @FurBurger151
      @FurBurger151 10 днів тому +5

      Exactly, this is pointless. They just don't want to show the real gains at native 4K.

    • @yoked391
      @yoked391 10 днів тому +1

      @@FurBurger151 cpu gains are useless at native 4k thats why xD

    • @FurBurger151
      @FurBurger151 10 днів тому +1

      @@yoked391 I know. I game at 4k so this CPU is pointless.

    • @thischannel1071
      @thischannel1071 10 днів тому +2

      @@yoked391 That's what makes benchmarks at 1440p or 4k useful to potential purchasers of this, or other high-end CPUs. Seeing that there's no performance gain with the top-end gaming CPU lets people know to not spend money buying the top-end gaming CPU when it won't do anything for them. That's part of the context of what the 9800X3D means to gamers thinking about buying an upgrade, and that should be conveyed in a review of it that intends to inform consumers.

  • @CrazyKat0827
    @CrazyKat0827 10 днів тому +8

    That is why, when I upgraded at 4K from a 6800 XT to a 4080 Super paired with a 5800X, there was almost no performance uplift. Then the 9700X still did not show much difference, but with the 9800X3D the difference was significant. This is specific to Space Marine 2.
    Thank you!

    • @Top_Weeb
      @Top_Weeb 10 днів тому +1

      There was no performance improvement because the 4080 Super isn't hugely better than the 6800 XT. What were you even thinking?

    • @wojtek-33
      @wojtek-33 10 днів тому

      Your situation is why some people were complaining. People may waste a lot of money for no return and want to get it right the first time.

    • @wojtek-33
      @wojtek-33 10 днів тому

      @Top_Weeb It's 50% faster when not CPU limited.

  • @evan8683
    @evan8683 10 днів тому +1

    Before watching the video, I'd just say this is really interesting to me.
    I only game (and work) at 4K. If, for example, I'm GPU-bound and getting the same frames with a Core Ultra 7 CPU as with a 9800X3D, but get substantially better production performance from the Core Ultra 7, then that would certainly factor into which product I'd buy.
    *After watching the video: I like the information, especially as someone who works on their computer as much as they game. I'm disappointed that only the Balanced preset was used; I personally don't go below Quality with DLSS at 4K. Even your poll, which to be fair was for 1440p users, shows that the majority use the Quality preset.
    This is good stuff though! My old 10900K is noticeably starting to show its age in my production applications, and is very much holding my 4090 back in gaming, even at 4K.

  • @definitelyfunatparties
    @definitelyfunatparties 7 днів тому +3

    6:54, dude.. high-end-GPU 4K gamers, such as myself, are happy with 60 FPS, AS LONG as we are GPU-limited; that's why we spend over 1k on our GPUs. All we want to know is whether there's actually a point to getting a 7800X3D over a 7700X, or if we're wasting money that could go towards our next GPU.
    Edit: 18:52 the argument DOESN'T fall apart if you don't use upscaling. I play Hogwarts at native resolution. Honestly, the people asking for these results are gamers who want the best graphical fidelity; I never understood FPS chasing, and I never will. Some people will buy a 4090 and not turn on RT/PT because it lowers FPS... well, then just get a damn 7900 XT, you'll be looking at the exact same thing.
    Edit: 20:13 ANOTHER failed test. You're testing ACC against AI, which probably 1% of players are racing against... I get much higher FPS than even the 9800X3D shows while playing online, which is how that game is meant to be played.

  • @Premier024
    @Premier024 10 днів тому +7

    Yes, they should be tested at 1080p just to see the max performance they can deliver, but that is not a real-world use case; that's why I want to see the 4K numbers in the reviews as well.

  • @Quezacotlish
    @Quezacotlish 10 днів тому +3

    I am interested in how different CPUs perform at higher resolutions too, so I do like to see this content. Say I'm unhappy with my framerate (I play at 1440p) and want to move up to 4K; I'd like to know whether I can get by just upgrading my GPU, or whether I'd also see a substantial increase from upgrading my CPU. Like, if I buy a 5090, am I going to get 120 fps at 4K on my 5800X3D and 120 fps at 4K on a 9800X3D, so I can forgo spending another $1500 on a platform upgrade, for example?

  • @naserali5211
    @naserali5211 10 днів тому +1

    I appreciate your efforts and transparency in the tests. I sincerely hope you remain one of the biggest and leading channels. I apologize for my earlier comment, but you know that the absence of competition leads to higher prices, and that is not in the interest of the gaming community. Thank you very much.❤

  • @tical2399
    @tical2399 10 днів тому +7

    All the snark is fine and good, but there are people like me who only play at 4K. When I'm looking at a CPU I want to see what it does at 4K, as in exact numbers. Not this
    "take the 1080p results and divide by pi" nonsense reviewers do. You can do your whole speech about 1080p results and then go "oh btw, here are the 1440p and 4K results in another graph." Why is that so hard to do? Show the results and let people make whatever (right or wrong) assumptions they want about future performance.

    • @randomexcalmain4512
      @randomexcalmain4512 10 днів тому

      Why not check the already existing GPU benchmarks?

    • @Lemurion287
      @Lemurion287 10 днів тому +1

      Because Steve only has so much time and so many resources. Benching everything at 1080p gives him the most data for the amount of time and effort invested. Adding additional high resolution benchmarks won't change his conclusions but will increase his workload--and potentially take up the time needed to investigate real outliers that might be important.

    • @thischannel1071
      @thischannel1071 10 днів тому

      @@Lemurion287 Didn't Steve say in this or HU's previous discussion video that they always test at 1440p and 4k as well, and just didn't include the results in their 9800X3D review video? So, HU already spends the time. They just choose to omit the information in their reviews.

    • @avatarion
      @avatarion 10 днів тому +1

      At native 4K you only have to see one semi-modern CPU running a game and you pretty much know the rest.

  • @RobloxianX
    @RobloxianX 10 днів тому +16

    Can't wait to see this review with the R9 9950X3D and RTX 5090! February is going to be AWESOME!!!

    • @Snxgur
      @Snxgur 10 днів тому +6

      9800x3d will still remain the king of gaming

    • @erhanozaydin853
      @erhanozaydin853 10 днів тому +6

      @@Snxgur On the 9950X3D, probably both dies will have 3D cache. It may surprise everyone.

    • @Keiseru
      @Keiseru 10 днів тому

      ​@@erhanozaydin853 As someone who bought the 9800X3D, the FOMO is real for me. Or it might be, if reviews show it to be good.

    • @zehoo2
      @zehoo2 10 днів тому +1

      Maybe that’ll make me move on from my 8700k.

    • @ChrisM541
      @ChrisM541 10 днів тому

      Hi Jensen ;)

  • @Monsux
    @Monsux 10 днів тому +4

    Can you please do path tracing tests on these high-end CPUs? It's by far the most CPU-intensive feature, and it's often disregarded because it's also the most GPU-demanding option. In my testing (4K DLSS Ultra Performance, Cyberpunk optimized high settings + PT), the CPU limitation with the 5800X3D was insane in the city area, even with medium crowd density. The CPU was massively holding things back when using high crowd density. People who play at 4K DLSS Performance + PT might think they are always fully GPU-limited, but the CPU holds things back. I bet this is going to be a massive issue when the new 50xx cards launch, because users will be able to run games at higher framerates while using PT. Even basic ray tracing in open-world titles is super demanding on the CPU.
    I would like to know at least the generational jump from the 5800X3D to the 7800X3D and 9800X3D. I would run these CPU tests myself, but I lack the hardware to run all of them. The limitation might be even harder on players who run games with mods.

    • @Joelione
      @Joelione 10 днів тому +2

      Path tracing is one of those features that is always left out of CPU reviews. Not even my 7800X3D is able to run Cyberpunk smoothly with PT when using a 4090. CPU bottleneck!

    • @l8knight845
      @l8knight845 10 днів тому +1

      Completely agree. RT testing is where the real 4K separation will show up.

    • @thephantomchannel5368
      @thephantomchannel5368 10 днів тому +1

      Once hardware can cope, path tracing will be the gold standard for lighting. The more information the system has access to (pixels), the better the results will be, and the higher the refresh rate, the more realistic it will look. The problem with path tracing at an upscaled resolution is that the lighting math will not be as accurate as if it had been calculated at native 4K. The number of bounces/passes also needs to be increased to make the tech approach photorealism. Currently I think the limit is set to three passes max, which requires software augmentation/interpolation to fill in the gaps as part of denoising.
      Some day I hope the tech will be fast enough to at least run VR at native 4K and 120Hz per eye, and still be able to output full 4K path tracing with at least 12 passes, for much higher fidelity and accuracy in color and shadows without relying on interpolation.

    • @kerkertrandov459
      @kerkertrandov459 10 днів тому +1

      ​@@Joelione Yes, getting 20 fps at 4K native with a 4090 when you turn on path tracing (RT Overdrive) is def a CPU bottleneck, lmao.

    • @Monsux
      @Monsux 10 днів тому +1

      @@l8knight845 The thing is, path tracing bottlenecks my 5800X3D even at the lowest possible rendering resolution in Cyberpunk. The framerate drops to unplayable levels and the 1% lows go super low. I just wish PT testing would become the norm in CPU reviews/benchmarks. New RTX 5090 & 5080 users who play, for example, Cyberpunk with path tracing at 4K DLSS Performance are massively limited by their CPUs. How much? That's what I would like to know.
      Some people say the CPU doesn't matter at 4K; well, they haven't used PT or played open-world games with RT on (Spider-Man, Star Wars Outlaws, etc.)

  • @hassanaslam6008
    @hassanaslam6008 10 днів тому +1

    This is the video I was waiting for.