If you want to learn more about cores/cache performance of Ryzen processors then check this video out (if you missed it) ua-cam.com/video/0mO4op3bL90/v-deo.html
When I play cyberpunk ryzen 95950X I get a avg fps of 120/150 while I have wall paper engine running and discord on while in a call sharing my screen so people can see my gameplay and I have a UA-cam video on in the background my friend with the same system GPU mb so on has a ryzen 5600x and when he plays cyberpunk even at 720p low while he has wallpaper engine on and shared screen he is getting a avg of 60/70 and when he closes everything but the game and runs discord on his phone he instantly goes up to like 90ish FPS
So I know this has been covered before but it's been a while for me, If I shut off multicore processing in a 32thread CPU like the 5950X does it help any? there would still be 16 cores but I have just never switched it off and tested it in a game. I love my 5950X it is perfect power to power(w) ratio lol for most of what I use it for but hey I am always happy to game a little faster.
I do wish AMD will drop 6-core and start with 8-core with the next generation. It was amazing when we went from 4-core Intel stuff for ten years to 6-core with AMD Ryzen but that was 7+ years ago now.
The 5800x3d kinda matches it which is sorta nice, but the people who made the smart choice and went with the 7x00x are going to be the ones laughing when they pop in a cheap zen 5 or zen 6 upgrade.
@@mruczyslaw50 or a 7700x3d just like the 5700x3d with a little lower clock than 7800x3d for like 50 to 70$ less would be nice. in germany the 7700 costs 230€ and the 7800x3d costs 340€, so something in the middle there would be lit
Hehe yes and included my 8600K that I have delided, replaced the IHS and is running with an aggressive OC in a water-cooled system with relatively slow 64 GB memory. That chart is going to be very long, maybe the video should be a vertical video 🤣
@@HardwareunboxedIm not clear on which video this video is in regards to, Could you link it please? Also sorry to say but Im not clear on the big point being made with this video’s testing. I get the point made at the beginning (that when it comes to cpu’s, the overall throughput capability of the cpu matters in so far as it needs to be capable of your desired fps).
Multitasking is gonna hammer your cache and memory pipeline all the same regardless of your core count. I don't understand why people think the number of cores determines everything. Not everything is Cinebench, barely anything is.
I even feel my SATA channels being "hammered" when I run a not so demanding game from one SSD and downloading torrents to another HD and my virtual memory is on separate SSD btw. Maybe I should try to replug my SATA drives in different channels. Maybe qbittrrent is keeping RAM too busy, memory allocation is not big though.
I still hear people all the time saying cores don't matter constantly. Apparently programmers are too stupid to take advantage of multiple cores and we should go back to single core. The truth is both matter and it depends on your use case.
The best multitasking example is capturing gameplay using CPU encoding (which still offers the best quality). If you have a single-PC setup, whether you're recording or streaming, you'll need a beefy CPU if you want to use it to encode video. Even the newest 6-core CPUs are not enough to encode 1080p60 (x264 medium preset) while gaming. You need a modern 8-core chip for that. Higher resolutions would require a lot more cores. It's a niche use case, but it exists.
Have you heard about windows task priority? Put compiling to idle and as long as the compiling doesn't hammer your disk and/or ram the games will run fine.
The 7700k (4c) was faster than the 1600 (6c). The 2600 (6c) was faster than the 1700 (8c). The 3600 (6c) was faster than the 2700x (8c). The 5600 (6c) was faster than the 3700x (8c) The 7600 (6c) is faster than the 5800x (8c). How many times do we have to go through this?
@@R3endevous not only ipc, cache size, access memory latency, how cache is access by cores. ipc is just instructions per clock, if anything else is designed wrong ipc doesn't matter
The be fair to Bethesda, their asset quality is kind of...insane. Even a damn sandwich is high poly count. So if you walk into a residential area with all the assets, its gonna be quite taxing.
You can save a couple bucks if you run it with WD-40 instead, though you gotta be careful when applying it that you don't get runoff into the PCIe slots.
@@moldyshishkabob Indeed. I myself sometimes like to throw the cpu into a tub of good old lard. And make sure all the pins get nice and insulated for the electrons.
Just upgraded from AMD Ryzen 5 3600 to 5600X. Some might consider that barely an upgrade, but I got my 5600X for cheap, and in the end, there will be only a 40 euro difference after I sell my 3600. Was considering 5700X3D or 5800X3D, but there is basically none in the used market (since they kick ass) and I will not pay a premium for new a CPU.
VINDICATION!!! THANK YOU!! I chose the Ryzen 5 7600 as my CPU in my build last year to pair with a 7900XTX. I was given so much grief both in person and online about what a bad CPU/GPU combo that was and that I HAD to get a 7800X3D or else I was otherwise severely crippling my performance. Thing is, I was making my build with 4k gaming in mind, and I already knew the difference at 4k wouldn't be huge enough to warrant paying double the price for a CPU that won't benefit me. Wow not only is the 4k performance close, it's pretty much the exact same between both CPUs!!! It feels so good to know I was right in my decision (not that I ever had any doubts about it) and also to pocket a decent bit of change in the process (which helped with the purchase of that 7900XTX lol). Thank you for revisiting this topic and showing folks the facts!!
I had to explain this while trying to budget out a gaming system to a friend, that a ryzen 7500f was faster than his 9900k, and by a significant margin, but he could not get around that his had 8 cores.
@@foxskyful ok.... But there's hardly any immediate advantage. It makes no sense. You'd really rather have 200-300 less budget for your GPU. To save 10 a year in electricity + 6% CPU performance
Something I'd love to see is a more extensive investigation as to what games benefit greatly by that 3D cache CPU's, things like simulation category games for example and to see how much of a correlation there is by category of games and larger than normal gains. This would likely be hard to do with this in-depth benchmarking because you can't just use your standard test suite and a lot of games simply do not have easy ways to test this with consistent repeats, realistically it would be more rough tests that can show it as a trend rather than exact benchmarking data, so for example the 7700X and 7800X3D compared and seeing if there is a very clear unmistakable difference like in for example Assetto Corsa Competizione. It would be neat to be able to conclude and recommend something like "if you play games of this category you are more likely to benefit from the 3D cache CPU models" or the opposite where certain categories of games are unlikely to benefit much and be reasonably confident that it's helpful advice. Sure in the end every game is different but there could potentially be trends within categories of games just from the nature of the experience.
Basically 3600 single core perf is so low compared to 7600 single core perf. The architecture difference, the selectable channel in DDR5, the increase in L2 cache and how L3 cache is connected to the Core plays a part in the gaming workload. Gaming workload cannot take advantage of all the threads but DX12 and modern middleware like Unreal 4/5 is able to take advantage of more cores. Rendering and other multi-core workload like multithreaded workload works differently and depends on which instructions are being used. SIMD or FP instructions for eg.
"multithreaded workload works differently" yeah and we have been begging HU to put multithreading workloads in their tests, but they say it's too hard. they would rather just spout off guesses.
TBH, even with games, the concept of a single-threaded major application are almost gone. The vast majority of modern games use multiple threads, maybe not all available, but they are certainty *not* confined to a single thread. Really, it's just a legacy concept that stubbornly refuses to die, like so many others...
But Steve has, for years now, done the testing to back up his claims. About two years ago he did a CPU-to-GPU Scaling video where the number of CPUs, GPUs, resolutions and games he tested against each other added up to 1700 benchmark runs for a single video.
@@JustADudeGamer Steve has been doing this for over 20 years, and this isn't his first collection of games benchmarks to back this up. He caught flak for proving that quad-cores were categorically out if you want a good gaming experience, and now people are giving him flak for showing that in the vast majority of modern games 6 cores work perfectly fine with some benefitting a bit from 8 cores. Above that it's just down to clock speed.
and i agree with Teach on what hes saing because ive ott two different pcs that i use in daily life and i notice a massive difference between my 7820X/64gb to my 10940X/256 gigs (yes its a workstation) why you beleave all this when subjectiveness matters more at the end of the day then some nice numbers in a nice looking chart? i would not wanna play MW3 on my 7820X hell no, the 10940X is a LOT better, even considering the rumors that the X299 chip generations are ppretty muc hthe same, therse then isnt anything left then the cores to be talking about right? yes but THAT then made the differrence needed ;) i would also not go back to my 5700X which i gave a friend who had a 2600X becasue the 2600X was terrable to him and to me as well, both get destroyed in everything i do with a PC by the 10940X so yeahh theres that so i dont know why you dudes still watch a youtuber that tests stuff as they did in idk 2014? that cant be right is it???
Yepp, this is the video we needed to clarify the topic. 👍 There's only one that might have been added: CPU utilization chart. What I've seen recently, Ryzen5 3600 is at the end of its breath, especially in online competitive games. Nonetheless, nice job HWU! 😊
Good point on the online competitive games. I just built my daughter an AM4 build using the R5 3600 (it was on my shelf brand new) with a Powercolor 6800XT and seems to do quite well in 1440p with games such as Baldur's Gate 3 and other RPG games. But as you rightly point out intensive FPS games will suffer FPS.
13:02 Something's off here. If we assume about a 15% IPC bump from Zen 2 to 3 and the same from 3 to 4 and factor in the different turbo frequency, we should have a theoretical maximum difference of (1.15^2) * (52/42) = ~64%. No way should it be double unless AVX is doing all the heavy lifting.
This a conversation that has been going on for at least the last 5 to 10 years. Ultimately, it will always boil down to the specific game engine and its capability to utilize additional threads. This reminds of a conversation that happened roughly 4 - 5 years ago when the 9000 series Intel Core processors (Coffee Lake Refresh) were released. At the time, the data was showing that, when clocked to the same frequency, the 6C/6T 9600K, the 8C/8T 9700K, and 8C/16T 9900K all performed within 10 fps of each other. The reasoning at the time is that most game engines were still limited to 1 or 2 threads at most, so adding more cores didn't really help you in gaming performance and clock frequency is what made the difference. By buying the higher SKUs, what you were really purchasing was the ability to clock 100 to 200 MHz higher than the lowest SKU and only if you took the time to overclock them.
yea and most games are still coded single core heavy. getting 4 threads to 6 alone was a leap and was it can still be agued the sweet spot for gaming and even then that's only certain games as most games don't use all physical 8 cores let alone all 16 threads. BF2042 uses 8c and 16 threads since it actually utilizes hyperthreading and WZ as well. i cant thing of any others off top of my head that do
I won the silicon lottery with my 7600X. It often gets up to 6 Ghz for short spurts without drawing a ton or power or creating a lot of heat. Sits around 5.5 all cores most of the time in gaming if needed. Staying cool and drawing a modest amount of power.
@@Safetytrousers Stock. I did try an under volt and an overvolt. With the first few BIOS this was pretty unstable. With the newer BIOS that's been fixed. Gigabyte X670 GAMING X AX. Not sure what the issue was for the first 6-8 months I had it. Had BSOD's and boot failures quite a few times. BIOS F8 fixed it about 10 months ago and it's been very stable since. That said I stopped playing with the voltage. My cooling is very good and my office is usually pretty cool anyway. In fact in the UK so far this year - it's been hard to get warm no matter how much heat my PC and home radiators pump out.
@@Safetytrousers It does seem very BIOS/MOBO dependent. I've built my own PC's since the original Pentium, and it's been a long time since BIOS and MOBO's made such a big difference. I think the last time I had so much variation between BIOS versions was the old AMD Athlon FX57. On an old MSI motherboard. I had to update the BIOS about 5x till I got the performance. At the time that meant making a floppy disk and a lot more messing about than it does these days. I suspect many people stayed with very poor performance and never knew why.
Thanks for the content. I think if the comparison was a 7600 vs 7700 would be better for 6c vs 8c contect as the 7800X3D, while 8c/16t numbers are in many cases and in varying degree influenced by the 3DVC but yes it shouldn't be that far off from say the 7700X 8c/16t counterpart.
I game at 4K, have an overclocked 5800x3D, and in a moment of weakness ordered a crazy overkill 7950x3D, mobo, and memory. I sadly sent it back to NewEgg unopened yesterday and now I’m feeling awfully good about it.
Hi, here in MÉXICO City the temperature is rising, 30° to 32°C , so can you please make a test with higher room temp, to see if it's enought the stock cooling?
I feel like to really answer the question about "how much do cores matter" you should compare chips of the same architecure with different core counts or, maybe even better, take a high core count chip and start disabling various amounts of cores... Oh wait, you already did - 2 years ago for Intel 10th gen and only 2 months ago for Zen3. Almost as if this debate keeps coming back every year. I would post the links for the lazy but those tend to be removed for obvious reasons, so tl;dr frequency/cache/architectural improvements tend to matter a lot more than cores beyond the 6th (and definitely beyond the 8th)
I was really happy with the 3600/RX580 8GB for 1080p medium settings but then went up to the 5600 and RX6600 and can very high/Ultra my games at 1080p and I'm happy with it.
@@RickGrimez9490 Which CPU do you have and what is your total budget? If your CPU sucks and the motherboard does not support modern enough CPUs, you need to upgrade the CPU or at worst, both. Otherwise, you need to limit your spending on your GPU so there won't be any bottlenecks unless you upgrade other parts later, or plan to play with high resolutions and more demanding graphics settings to eliminate CPU bottlenecks. Otherwise, you might lose some performance. It's not the end of the world, but it's something to consider while upgrading your gear.
@@RickGrimez9490 I mod skyrim se for 1080p. With optimize setting + ENB, i can say i only need 3070/4070 with my 3600. Thats goin to have like consistent 60fps and probably some low 52-55 for super crowded place.
@@Purjo92 im currently saving money. What ik is that it will be am5 plattform, r5 7600 i need a 1080p gpu, was thinking rtx 4060 or rx 7600xt, or something for max 400 Euro (im from germany). My goal is to build this pc in september
This was super useful for me! I'm building a gaming pc for the first time in over a decade and couldn't decide between the 7600 (non x) or the 7800x3d. Reason being AM5 is a new platform with hopefully two more generations of CPU to look forward to. Save money and get the 7600? Then have a super easy upgrade path later on with something like a 9800x3d or whatever is the final version for AM5. I can apply the $190 savings with the 7600 towards a MUCH more useful GPU upgrade. Most reviewers don't really bother comparing chips like the 7600 and 7800x3d because as you said they aren't really comparable for use. But for me it is and I really appreciate your time to do this review!
Absolutely, I bought the 7600 on sale for dirt cheap and put the savings into a better GPU. Also knowing I'll be able to upgrade the CPU in a few years with no regrets.
The idea that 6 core 12 thread CPUs are obsolete isn't the full story. For 60hz gaming, they are fine, at any resolution. Resolution never has affected CPU, the idea that is does is something I believed for years, and I know most people do, and it doesn't help that game devs list higher-end CPUs at the 4k spec than the recommended 1080p or 1440p specs. Most gamers are saying 16 threads is what you want because they're not targeting 60fps, or they are multitasking, or using the consoles as a baseline. It is a better idea to go with a 16 thread than a 12 thread if you have the budget for it and have a 120hz monitor or higher and actually get above 60fps often. On a lower end GPU you likely won't benefit.
as a 7600x owner, i appreciate this very much. Most testbenchmarks only use the 7800x3d for gpu testing but now we have clean data that they arent that far apart.
@@MaxIronsThird depends on what you're running. I prefer a 7800X3D for my 4090 for the 1% lows it offers. And I still use DLSS to get the fps to 120 at least. But, I have to share this, apart from depth of field, even DLSS ultra performance mode looks great, it's astounding 🤯 😂
@@MaxIronsThird7800x3d is completely useless if u play triple A titles, especially on ultra settings. Even on 1080p, ull be gpu limited if ure not on the highest of highest end of gpu. It is fast, but its still useless for 99% of people who doesnt own a 7900xt or anything beyond a 4080
only thing i can say about multitasking slowing games is that one time i actually did that... i was 14? had bulldozer cpu, was playing old at that time Prince of Persia Warrior Within, at some point game started being 3x normal speed because fps got unlocked somehow, how did i fix that? i overloaded pc with task and throttled fps to achieve normal gameplay fun times
Vcache tends to make a lot more difference for gaming than cores--just look at how the 5600X3D smokes the 5950X in the 12-game average at 18:11--so I'd prefer "generation over X3D over cores."
tbh i can agree on a lot hes saying cuz i saw it myself ;) even before i ever saw him i went more cores and mroe ram and if i did go back i was like OMG what is this pile of nonsense? ^_^
Awesome video, thanks guys! I really like this because it’s very useful to see exactly very the limits of a cpu is, so you don’t have over spend. 7600 in this case seem to give very similar performance at higher settings at higher resolutions so if this is where you intend to go a GPU upgrade is possible with out compromising on the performance and at the same time if you lower to medium an CPU upgrade might be something to think about. I really like to see more of this with more GPUs, like I have said before this really beats out those crappy “bottleneck calculators”. Great work guys! ❤
Of course it depends, I have an "old" 6-core 5600X and an "outdated" RX6600 and there isn't a single game I want to play that I can't at 60fps. Of course if I wanted to play modern shooters in 4K at 144HZ, different story. Hell, for most games I play my PC is complete overkill.
My uncle had that setup and he barely games like he used to but it's a very power efficient system (slight undervolting helps even more as well) for when he spends time watching TV shows, UA-cam and torrenting.
@@DoktorLorenz I undervolted my gpu greatly, quite a bit more than my gpu is supposed to go and it is totally stable. At full tilt it is supposed to use 100w max but mine is frequently using around 75w when playing Cyberpunk at ultra settings. Cpu uses next to nothing, so yes, extremely efficient. I am very happy with my system.
while i get that you used the 7800x3d because it was the fastest gaming cpu out right now, I think that using a non-Vcache 8 core cpu would have had a more profound statement. showing the tiny increase in frames going from the 7600x to the 7700x would have really driven home the fact that having 8 cores over 6 is less important to having newer architecture or more cache. Someone new to tech seeing these slides may get the idea that the 7800x3d is getting its boost from the 2 extra cores and not truely understand that its the Vcache doing all the heavy lifting on that part.
The funny thing is that the 7900X3D is significantly slower than the 7800X3D in gaming because of CCD cross talk. So you need 8 cores with 3D V-Cache or you need 16 cores with 8 of them having 3D V-Cache.
8:59 The 4k 4090 Hogwarts legacy chart shows that the 7600 will stutter with RT enabled (much lower than 60fps 1% lows) but the 7800x3d won't. If you looked at only the 1080p or 1440p chart, you would not know that the 7600 would struggle with RT enabled because the 1% lows are above 60fps.
@@Hardwareunboxed I didn't say anything about the reason why the game would stutter when playing with raytracing at 4k. I fully accept that this is due to the cache, not the number of cores. I merely noted that the 4k chart tells us information about the gameplay smoothness (48fps 1% lows) with that setting which you cannot see on the 1080p or 1440p chart. The 4k chart is not misleading or redundant. It contains useful information.
The latency penalty on the 3600 'infinity fabric' is horrible. I had a 3600X in my gaming PC back when it released and found huge performance gains by pinning games to the last 3 cores, and as many other programs and background tasks to the first 3. 3 cores for gaming wasn't ideal at all, but faster for the games i was playing than 6 poorly coordinated ones. While this is not an issue with the 5000 & 7000 series with 8 cores or less, there are still some big gains (mostly stutters in the 1-5% lows) to be made by having games pinned to the last 6 cores and everything else on the first 2 (assuming it's just discord, browsers and other basic stuff).
1800x to 3800x was amazing but then going to a 5800x3d removed the ccx latency of the 5000 series and the extra cache (Factorio was mind blowing smoother and could increase factory sizes) I can imagine 7800x3d would be interesting but full system is out of the question for now probably be 9000 series before I replace this system I find that 2000 and 3000 CPUs been missing a lot and comparing 5600 to a 5800x3d when it’s not a typical upgrade when a 2000/3000 cpu is (sometimes even 1000 cpu as well both the 1000 and 2000 cpu was horrible to use the 3000 3800x was good for me for years )
@@leexgx i upgraded from 2700x to 5800x3d some time ago and play stuff like factorio too. this 3d cache and single ccx latency was such a good investment for strategy and simulation games! stellaris endgame is finally playable (barely ^^). if i would play more multicore friendly modern titles, i probably would have gone with the 5600 for best value. it hat the better latency, nearly as much clock speed and multicore performance hasn't been a problem in games even with the 2700x.
fast cpu matter when playing competitive shooter game ppl will do anything to win and get advantage over enemies there are cpu heavy because most ppl will play at low or medium at 1080p or 1440p asus already release high end mornitor currently top2 are rog 2k oled 360hz and rog fhd tn 540hz ppl will need stronger system to push game performance higher to match their monitor but for average and casual gamer, mid range cpu is worth more because it's last longer and cheaper than high end, they can spend more on gpu and upgrade later once it's not good enough, this is cheaper than buying high end cpu
I installed a 5600G into my mediaserver after I damaged the old 3800X during a cooler upgrade (it stuck to the cooler and then dropped into the socket, bending a few pins and whilst it could be repaired, I lack the tools to do so (magnifying and tiny tweezers of some kind) and buying the tools vs a 5600G wasn't a big difference) What I found was the 5600G easily as good as the 3800X for media operations, transcoding, encoding, remuxing and so forth... Whilst running cooler... It also allowed me to remove the GPU entirely, which frees up that PCI-E 16x slot that I plan on filling with an NVME card with a bunch of NVME drives... so I can start to retire some of the HDDs with around 75k hrs on them.
3:20 I believe statements like this is the problem. most games need one strong thread and like 2-16 background ones. (amount depends on a game, plus if anything else runs in background its nice that its not taking cpu time from main important thread as well) you will see better performance as long as: A) main thread have more cpu time for itself, up to what single core can deliver B) all background threads, like ssd access or data transfer to gpu are able to be served, antivirus, windows downloading update RIGHT NOW, somehow without interfering with A) this vary per game, obviously. both matter TO A POINT.
Wasn't there an article from a PC mag a decade ago that asked, "is three cores all you need to game?" Edit: Found the article, "How Many CPU Cores Do You Need?" by Tom's Hardware back in 2009. The conclusion: "As far as games go, we see a huge 60% performance jump from going single-core to dual-core, and a further 25% leap from dual- to triple-core. Quad cores offer no benefits in the sampling of games we tested. While more games might change the landscape a little, we think the triple-core Phenom II X3s are looking good as a low-cost gaming option."
@@user-wq9mw2xz3j Yep, it was back in the days of AMD's Phenom II chips. A number of their quad-core chip production had a bad core, so they disabled it and sold them as Triple-core CPUs labled Phenom II X3. I had one of them myself, until they made the hexacore Phenom II. Ah, I found the article, it was, "How Many CPU Cores Do You Need?" by Tom's Hardware. The conclusion: "As far as games go, we see a huge 60% performance jump from going single-core to dual-core, and a further 25% leap from dual- to triple-core. Quad cores offer no benefits in the sampling of games we tested. While more games might change the landscape a little, we think the triple-core Phenom II X3s are looking good as a low-cost gaming option."
Always been like this due to higher clocks and generational improvements.... IF ALL CPUs where to run at the SAME clocks, it wouldn't be that big of a difference!
@@Nepturion88 IPC improvements typically account for half-ish (sometime more, sometimes less) of a generational improvement. So it's still substantial what you get without accounting for clock speed.
Given the chart at 18:15, just popping in a 5700X3D for 230 bucks if you still have Ryzen 3000 series seems like a a great option for anyone with a 6700xt or 3070 or better
I think this video cleared up a lot of convoluted analysis about perhaps an overly limited data set and explained it all much better. Thanks for following up.
I'm very happy with my Ryzen 7600, but I'll say it *needs* a cooler upgrade. The stock cooler just doesn't cut it. But a 23$ air tower cooler is perfectly enough. I'm glad I paired that CPU with a 6750XT -that I already owned- for 1440p gaming, I feel it's pretty well balanced. And I'm confident my motherboard will be enough for the next low TDP Ryzen 5, zen5 or more probably zen6 in a couple of years. Thanks HUB, your videos are pure gold, the best content covering CPUs, motherboards and GPUs for consumers interested in gaming PCs 👍🏻
_"...my Ryzen 7600, but I'll say it needs a cooler upgrade"_ Uh, NO! It's a 65W CPU! There is absolutely no reason to *ever* "upgrade" the cooler on a baseline 7600; overclocking gains are minuscule, completely pointless and simply generate more heat (like the irrelevant 7600X),
@@awebuser5914 with the stock cooler, the 7600 hits 95º after about 5 seconds of 100% load, without PBO and, of course, no overclocking. I don't care if AMD says that is ok and it's by design, I'm not letting my CPU work at those temperatures all the time, whatever they say... They could have included the higher tier stock cooler and it would be a different matter.
I undervolted my 7600 to -20 and 85.now it's run way cooler.i m also using rx6750 xt in eco profile it's cap fps to 60 but cpu and GPU runs much cooler.
You guys must understand that undervolting the lowest CPU in the current lineup, which is just a 65W TDP, because the stock cooler underperforms, should not be the norm, and is something that regular users wouldn't do.
So say for example you wanted a low end new gpu like the rx 7600 or the rtx 4060. At what point would it make sense to get more cpu than the 7600? Should you even bother with 7800x3d if you are wanting a low end new gpu? I imagine the gpu would be bottlenecked until you went up to a 7700xt gpu.
I remember running into similar FPS changes when I upgraded from my Ryzen 5 2600 to my Ryzen 7 5700G. It has 2 more cores, but that doesn't explain the new issues I was running into. Specifically, I was getting shoddy 1% lows. With the 2600, the 3000 MHz RAM I was using was fine because the CPU was just that limited that the RAM could transfer data between it and my 6700 XT just fine. But with the 5000 series being just that much more powerful, the slow RAM was causing hitching issues. I've since upgraded to some G.Skill 4000 MHz, which runs stable at full speed and a 2000 MHz FCLK. The 1% lows still hurt because my CPU is affected by the USB disconnection issue, forcing me to downgrade the PCIe speed since an SoC overvolt of even 10mV proves to be unstable. But even with the downgraded PCIe speeds, it's still miles better than with the 3000 MHz RAM. I'm still able to get within around 10 FPS of most Horizon: Zero Dawn benchmarks for my 6700 XT, and I'm even using Linux as my main OS. Now that I think about it, that leads me to a potential video idea: Benchmarks using native DirectX vs DXVK/VKD3D. There's actually a lot of games out there that get a speed boost even within Windows when using DXVK. It'd be interesting to have some quantifiable data for it.
Not to be in the "but you missed x CPU!1!" camp, but to really lock it home you could have included the 2700X -- that would have been an 8 core CPU getting beaten by both 6 core CPUs.
Straight up just threw away my 3600 once I built my 7800x3d system, they say the pc doesn’t make the gamer a better gamer well when your 1%lows are 200+ it kind of does + good 5G wifi
The change I've often noticed the most has been adding more GPU lately. 6 cores and the best GPU one can afford might still be the meta with 7800X3D or an 8 core/higher being the more premium options.
Right, your correct but this is a pure CPU test which means needs to be CPU bound and why they use a powerful GPU, it forces the CPU to max out to give you best performance
I know this channel (or almost any big youtube channels) doesn't cover the emulation performance but I think that could have had interesting results in this specific test video. As far as I know, core counts actually matter on emulating more demanding stuff like ps3 games etc but it's quite difficult to find a reliable source like you guys who actualy made a comprehensive video about it. Great video regardless. Hope to see people commenting "do I need 8 cores for gaming" in a couple of days again.
I remember, when i paid attention to these things, that platform latency played a big part on gaming performance. That applied even when game was GPU limited, if i remember correctly. I think it's possible that when GPU is waiting, that waiting time is increased when player does something and frames and environment needs to be recalculated. Every change takes time and response time of system affects, something like "input latency" in displays. At the time AMD introduced CPU's with integrated memory controller (used to be in a chip set) which decreased latency greatly.
19:46 This is wrong. What matters is how many threads the background task is doing work on. If you have 24 cores and a background task that uses, say, eight work threads, that will not significantly impact gaming performance. If you have a 7950X3D and make sure the game runs on the V-cache CCD while background tasks run on the other CCD, you won't be able to tell the task is running, unless either it or the game is very sensitive to memory throughput (constantly reading large quantities of data to and from memory), rather than latency. If you were to limit a background task to four threads and do the same with a 7800X3D, the impact would be greater, because less of the cache will be available for the game. With a TR system, it's like with the 7950X3D, just without the extra cache boost. You can limit tasks to specific CCD's, meaning the L3 cache where the game is running won't get filled with data from those tasks. Given the four memory channels, you'll also have more memory throughput to work with, if either the game or the tasks are reliant on it.
i have a question if a cpu (7600 for example) hits 100fps with a high end graphics card while cpu bottlenecked and a gpu (4070 for example) hits 100fps with a high end cpu (14900k/7800x3d) while gpu bottlenecked does that mean that one will hit about 100fps with a 7600 and 4070 or will the performance suffer more
I wonder if higher core count cpu receives less of an impact on utilization from a higher polling rate mouse? If so than gaming cpu needs to be a higher core count now.
What is a good 5000 series for both streaming and playing games at the same time? My 5600 doesn't like me playing games at max 1080p while streaming with my 4070super..
Most applications out there usually boil down to simple load -> basic calc/ evaluation -> store instructions on the CPU. You'd want modern hardware with modern ISA so that latency sensitive apps like games can run the best. Games usually arent complex and require a lot of number crunching. Ideally you want clock frequency and IPC, atleast with a traditional thread model that most games use. When you program around Fibers this changes somewhat but comes with its own downsides.
In broad terms, more cores will be advantageous either because the game can schedule certain tasks to better spread the load across (depending on the game that is) or mostly because modern versions of Windows (10 or 11) are resource hogs and often will chew up as many resources as they can from machines by multiplying useless processes, having processes suspended so they don't eat away CPU resources but they still reserve a portion of your system's memory, and so forth. In short having more cores helps you game because Windows is so bad that you need to make sure it has enough to "eat" while still allowing you to have resources to play your games :P
I did a "budget" high end build late last year with a 7600X/7900XT. I'm pretty happy with it's performance for the $$$. I couldn't justify spending another 280$(bundle savings) on the 7800X3D when I game at 1440p. Shoot I even try to see if I can run closer to 100fps in games and keep the thermals/power usage down. My wife plays Hogwarts on max setting no RT capped at 75fps. I don't even see temps above 65C like this. Gotta love it
Almost 3 years with my 3600 when i finally managed to upgrade at something mid-range, and i still don't plan to upgrade until this one starts to show it's age. It's paired with RTX 3060 so i'm more than happy at the moment. It was a huge upgrade from my previous i5-2400 and still rocks with whatever i throw at it!
How much is the difference between low/ultra settings due to the CPU? Noticably when heavily GPU limited there's a difference between presets, but not between CPUs. I then assume the difference is mostly GPU bound... But then in 1080p low/medium where the performance is definitely CPU limited, we still see a very similar frame rate difference between Low and Medium presets. Suggesting the CPU is mostly responsible. Is it certain groups of settings that primarily hit the GPU/CPU specifically? In that case I'd love to find out which settings hit my CPU (to lower them) and which hit the GPU (it has headroom) to keep as much eyecandy as possible without losing that 30-50% performance
Different programs still implement multicore differently. Extra cores are best fore heavy creation and management softwares, like editors and various servers. As long as you meet your specific workloads overhead, more won't make much difference, outside costs, but less will hurt.
If you want to learn more about cores/cache performance of Ryzen processors then check this video out (if you missed it) ua-cam.com/video/0mO4op3bL90/v-deo.html
Starfield is such a mess - should not be included its that bad.
When I play cyberpunk ryzen 95950X I get a avg fps of 120/150 while I have wall paper engine running and discord on while in a call sharing my screen so people can see my gameplay and I have a UA-cam video on in the background my friend with the same system GPU mb so on has a ryzen 5600x and when he plays cyberpunk even at 720p low while he has wallpaper engine on and shared screen he is getting a avg of 60/70 and when he closes everything but the game and runs discord on his phone he instantly goes up to like 90ish FPS
yeah its no secret 4 cores is all you need.
Tech Deals wouldn't approve of this though. 2:30
So I know this has been covered before but it's been a while for me, If I shut off multicore processing in a 32thread CPU like the 5950X does it help any? there would still be 16 cores but I have just never switched it off and tested it in a game. I love my 5950X it is perfect power to power(w) ratio lol for most of what I use it for but hey I am always happy to game a little faster.
time to complain about a CPU thats not included
2500k owners in shambles
Where is my 8086k lol
I do wish AMD will drop 6-core and start with 8-core with the next generation. It was amazing when we went from 4-core Intel stuff for ten years to 6-core with AMD Ryzen but that was 7+ years ago now.
So true 😂
I feel those types have unlimited free time. No concept of the labor and love that goes into this kind of content 😂
Only amd CPUs exist.
You should have a video called "It Depends" which covers every technical question possible
I might make that video, it depends.
Oh heck!!!
That video would be a full length feature film!!!
@@GB-zi6qrwell, I think it depends on
@@berengerchristy6256 🤣🤣🤣🤣
@@Hardwareunboxed well played
But Steve, how can you not include the Phenom x6 1055t to represent the popular 6 core cpu?
Dont forget The legendary fx 6300
1055t I am still using....
to old dude. the intel Q6600 was also a killer. Or duo core 6600 at that time when 2 core were still enough.
But if we are gonna do that I want to see Octo Opteron 850
Yes, and also i7 970 :P
The 7600X being as fast as a 5800X3D in gaming should say a lot tbh
5.45 ghz all cores under 85C in all games.. why would i need more
@@laurentiudll ghz doesnt tell anything.. 10 years ago i overclocked my garbage buldozer fx-8350 5.1ghz and i bet 7600x is faster even 2ghz.
@@laurentiudll because 85c is too hot REEEEEEEEEEEEEEEEEEEEE
Says nothing since it clocks 0.5 -0.9ghz higher
The 5800x3d kinda matches it which is sorta nice, but the people who made the smart choice and went with the 7x00x are going to be the ones laughing when they pop in a cheap zen 5 or zen 6 upgrade.
If they made 7600X3D, it would probably be much closer to 7800X3D making 6 cores truly all you need for gaming
They are collecting bad 7800x3d and when there are enough… they will release 7600x3d… they don`t have enough of those bad cpus yet!
@@haukionkannelTruuu
The 3D v-cache die of 7900X3D could be used for a potential 7600X3D.
@@haukionkannel As long as it won't be a stupid microcenter exclusive as 5600X3D was then it will be great
@@mruczyslaw50 or a 7700x3d just like the 5700x3d with a little lower clock than 7800x3d for like 50 to 70$ less would be nice. in germany the 7700 costs 230€ and the 7800x3d costs 340€, so something in the middle there would be lit
Still, why didnt Steve include my mighty 5600x in the test. He better re-do whole test again? right?
On it!
Hehe yes and included my 8600K that I have delided, replaced the IHS and is running with an aggressive OC in a water-cooled system with relatively slow 64 GB memory.
That chart is going to be very long, maybe the video should be a vertical video 🤣
@@HardwareunboxedIm not clear on which video this video is in regards to, Could you link it please? Also sorry to say but Im not clear on the big point being made with this video’s testing. I get the point made at the beginning (that when it comes to cpu’s, the overall throughput capability of the cpu matters in so far as it needs to be capable of your desired fps).
@@Hardwareunboxed Thanks Steve
😂😂😂
Cores and effect.
No cores for concern.
Uhiehuehue
Rebel without 8 cores.
Cores Light.
I coronate you King! of the coreny puns, my brother from another motherboard.
okay, that's enough dad.
20:00 Best B roll of 2024 😂
electrostatic shock: Let me introduce myself!
bro really watched a 20 min video at 5x speed
I cleaned all the electrostatic off first.
@@Hi_Im_o2Probably. Lmao
@@GewelRealSynthetic dusters get static very easily, but organic (ostrich feather) ones really don't.
Multitasking is gonna hammer your cache and memory pipeline all the same regardless of your core count. I don't understand why people think the number of cores determines everything. Not everything is Cinebench, barely anything is.
I even feel my SATA channels being "hammered" when I run a not so demanding game from one SSD and downloading torrents to another HD and my virtual memory is on separate SSD btw. Maybe I should try to replug my SATA drives in different channels. Maybe qbittrrent is keeping RAM too busy, memory allocation is not big though.
I still hear people all the time saying cores don't matter constantly. Apparently programmers are too stupid to take advantage of multiple cores and we should go back to single core. The truth is both matter and it depends on your use case.
still, it would be nice if somebody actually tested multitasking
@@zangetsu_the_best_zanpakuto HUB did. Look it up.
The best multitasking example is capturing gameplay using CPU encoding (which still offers the best quality). If you have a single-PC setup, whether you're recording or streaming, you'll need a beefy CPU if you want to use it to encode video.
Even the newest 6-core CPUs are not enough to encode 1080p60 (x264 medium preset) while gaming. You need a modern 8-core chip for that. Higher resolutions would require a lot more cores. It's a niche use case, but it exists.
So that's how you keep your Windows clean 19:59
19:47 I can attest to that. Gaming performance absolutely tanks when I'm compiling code in the background on a 32 core threadripper.
Have you heard about windows task priority? Put compiling to idle and as long as the compiling doesn't hammer your disk and/or ram the games will run fine.
The 7700k (4c) was faster than the 1600 (6c).
The 2600 (6c) was faster than the 1700 (8c).
The 3600 (6c) was faster than the 2700x (8c).
The 5600 (6c) was faster than the 3700x (8c)
The 7600 (6c) is faster than the 5800x (8c).
How many times do we have to go through this?
And it will be a while before we are seeing a situation like Dragon's Dogma 2 on the i3 12100 where having only 4 cores simply isn't enough.
Tech hardware channels are allergic to understanding technical terms. IPC is all you’d need to mention and explain.
There's a bot that stole your comment
@@R3endevous not only ipc, cache size, access memory latency, how cache is access by cores. ipc is just instructions per clock, if anything else is designed wrong ipc doesn't matter
@@koczaioandaniel4014 Yes so we're in agreement. Explain what IPC is and naturally you'll explain what is contributive to that.
Crazy how Starfield is so CPU bound even at 4K 🤯
The be fair to Bethesda, their asset quality is kind of...insane. Even a damn sandwich is high poly count.
So if you walk into a residential area with all the assets, its gonna be quite taxing.
Ancient gaming engine that they refuse to replace.
@@Ryotsu2112 People keep buying their crap so why would they.
Solution: don't play Starfield. 🤔
High asset quality + full of physics simulated objects.
You can get away with 6 cores if you add more core fluid. I usually run about 3 quarts of 0w-20GB core fluid.
Hold my almighty tweezers there, check your power supply first!!!
What?
So true
You can save a couple bucks if you run it with WD-40 instead, though you gotta be careful when applying it that you don't get runoff into the PCIe slots.
@@moldyshishkabob Indeed. I myself sometimes like to throw the cpu into a tub of good old lard. And make sure all the pins get nice and insulated for the electrons.
Just upgraded from AMD Ryzen 5 3600 to 5600X. Some might consider that barely an upgrade, but I got my 5600X for cheap, and in the end, there will be only a 40 euro difference after I sell my 3600. Was considering 5700X3D or 5800X3D, but there is basically none in the used market (since they kick ass) and I will not pay a premium for new a CPU.
I went from 3700X to 5700X X3D chips were still on the way. Noticed at least 10% jump in FPS with less stutters...
5600x 2024 is kind of buns
Still a decent upgrade if u got it cheap..The 5600 runs much more efficient and better in general. With that u dont need to go for a x3D anymore now.
I want from ryzen 1500x to a ryzen 5600x
Went from 3600 to 5600 for cheap last year too. Does all I need!
VINDICATION!!! THANK YOU!! I chose the Ryzen 5 7600 as my CPU in my build last year to pair with a 7900XTX. I was given so much grief both in person and online about what a bad CPU/GPU combo that was and that I HAD to get a 7800X3D or else I was otherwise severely crippling my performance. Thing is, I was making my build with 4k gaming in mind, and I already knew the difference at 4k wouldn't be huge enough to warrant paying double the price for a CPU that won't benefit me. Wow not only is the 4k performance close, it's pretty much the exact same between both CPUs!!!
It feels so good to know I was right in my decision (not that I ever had any doubts about it) and also to pocket a decent bit of change in the process (which helped with the purchase of that 7900XTX lol). Thank you for revisiting this topic and showing folks the facts!!
after watching these videos, i feel better with my 14400, 6 core cpu. The jump in performance does not justify the price. Watch out for FOMO!
I had to explain this while trying to budget out a gaming system to a friend, that a ryzen 7500f was faster than his 9900k, and by a significant margin, but he could not get around that his had 8 cores.
6% faster. You should have kept him on the 9900k instead of side grading him. And used the money for a better gpu
@@aightm8 The argument was probably that he could upgrade later to a "cheap" 7800x3d in 5 years.
@@zeenyuhrass then upgrade later. There's no advantage to sidegrading to am5 today because you might upgrade later. Really makes no sense.
@@aightm8 Na, tdp is big difference and also platform 7500f is better choice overall
@@foxskyful ok.... But there's hardly any immediate advantage. It makes no sense. You'd really rather have 200-300 less budget for your GPU. To save 10 a year in electricity + 6% CPU performance
Something I'd love to see is a more extensive investigation as to what games benefit greatly by that 3D cache CPU's, things like simulation category games for example and to see how much of a correlation there is by category of games and larger than normal gains.
This would likely be hard to do with this in-depth benchmarking because you can't just use your standard test suite and a lot of games simply do not have easy ways to test this with consistent repeats, realistically it would be more rough tests that can show it as a trend rather than exact benchmarking data, so for example the 7700X and 7800X3D compared and seeing if there is a very clear unmistakable difference like in for example Assetto Corsa Competizione.
It would be neat to be able to conclude and recommend something like "if you play games of this category you are more likely to benefit from the 3D cache CPU models" or the opposite where certain categories of games are unlikely to benefit much and be reasonably confident that it's helpful advice. Sure in the end every game is different but there could potentially be trends within categories of games just from the nature of the experience.
They already did it years ago
Basically 3600 single core perf is so low compared to 7600 single core perf. The architecture difference, the selectable channel in DDR5, the increase in L2 cache and how L3 cache is connected to the Core plays a part in the gaming workload. Gaming workload cannot take advantage of all the threads but DX12 and modern middleware like Unreal 4/5 is able to take advantage of more cores.
Rendering and other multi-core workload like multithreaded workload works differently and depends on which instructions are being used. SIMD or FP instructions for eg.
Skipping the 5600(x) was sheer stupidity in this case. Inexcusable...
but when you see r5 3600 pushing 200fps you stil may not give a fcuk about upgrade ;)
Don't forget shader compilation steps in modern DX12 games also scales with CPU cores.
"multithreaded workload works differently"
yeah and we have been begging HU to put multithreading workloads in their tests, but they say it's too hard.
they would rather just spout off guesses.
TBH, even with games, the concept of a single-threaded major application are almost gone. The vast majority of modern games use multiple threads, maybe not all available, but they are certainty *not* confined to a single thread. Really, it's just a legacy concept that stubbornly refuses to die, like so many others...
"but where is the 5800x3d!?!?!? 🤓"
I should have never said anything :)
@@Hardwareunboxed ha ha lol
In my system!! Bad HUB trolling us 😉😉
5700x3d kicked it to the side.
😂😂😂
This goes straight against what TechDeals has been preaching for some time now.
But Steve has, for years now, done the testing to back up his claims. About two years ago he did a CPU-to-GPU Scaling video where the number of CPUs, GPUs, resolutions and games he tested against each other added up to 1700 benchmark runs for a single video.
@@JustADudeGamer Steve has been doing this for over 20 years, and this isn't his first collection of games benchmarks to back this up. He caught flak for proving that quad-cores were categorically out if you want a good gaming experience, and now people are giving him flak for showing that in the vast majority of modern games 6 cores work perfectly fine with some benefitting a bit from 8 cores. Above that it's just down to clock speed.
and i agree with Teach on what hes saing because ive ott two different pcs that i use in daily life and i notice a massive difference between my 7820X/64gb to my 10940X/256 gigs (yes its a workstation) why you beleave all this when subjectiveness matters more at the end of the day then some nice numbers in a nice looking chart? i would not wanna play MW3 on my 7820X hell no, the 10940X is a LOT better, even considering the rumors that the X299 chip generations are ppretty muc hthe same, therse then isnt anything left then the cores to be talking about right? yes but THAT then made the differrence needed ;)
i would also not go back to my 5700X which i gave a friend who had a 2600X becasue the 2600X was terrable to him and to me as well, both get destroyed in everything i do with a PC by the 10940X so yeahh theres that
so i dont know why you dudes still watch a youtuber that tests stuff as they did in idk 2014? that cant be right is it???
Hats off to you for this. Amazing work. Waiting for my 7500f to arrive this week 👍
Yepp, this is the video we needed to clarify the topic. 👍
There's only one that might have been added: CPU utilization chart. What I've seen recently, Ryzen5 3600 is at the end of its breath, especially in online competitive games.
Nonetheless, nice job HWU! 😊
Good point on the online competitive games. I just built my daughter an AM4 build using the R5 3600 (it was on my shelf brand new) with a Powercolor 6800XT and seems to do quite well in 1440p with games such as Baldur's Gate 3 and other RPG games. But as you rightly point out intensive FPS games will suffer FPS.
13:02 Something's off here. If we assume about a 15% IPC bump from Zen 2 to 3 and the same from 3 to 4 and factor in the different turbo frequency, we should have a theoretical maximum difference of (1.15^2) * (52/42) = ~64%. No way should it be double unless AVX is doing all the heavy lifting.
This a conversation that has been going on for at least the last 5 to 10 years. Ultimately, it will always boil down to the specific game engine and its capability to utilize additional threads.
This reminds of a conversation that happened roughly 4 - 5 years ago when the 9000 series Intel Core processors (Coffee Lake Refresh) were released. At the time, the data was showing that, when clocked to the same frequency, the 6C/6T 9600K, the 8C/8T 9700K, and 8C/16T 9900K all performed within 10 fps of each other. The reasoning at the time is that most game engines were still limited to 1 or 2 threads at most, so adding more cores didn't really help you in gaming performance and clock frequency is what made the difference. By buying the higher SKUs, what you were really purchasing was the ability to clock 100 to 200 MHz higher than the lowest SKU and only if you took the time to overclock them.
yea and most games are still coded single core heavy. getting 4 threads to 6 alone was a leap and was it can still be agued the sweet spot for gaming and even then that's only certain games as most games don't use all physical 8 cores let alone all 16 threads. BF2042 uses 8c and 16 threads since it actually utilizes hyperthreading and WZ as well. i cant thing of any others off top of my head that do
Can I just say real quick that the video quality looks absolutely CRISP. This is probably some of the best 4k footage I have seen of ANY tech UA-camr.
I won the silicon lottery with my 7600X. It often gets up to 6 Ghz for short spurts without drawing a ton or power or creating a lot of heat. Sits around 5.5 all cores most of the time in gaming if needed. Staying cool and drawing a modest amount of power.
At what voltage?
@@Safetytrousers Stock. I did try an under volt and an overvolt. With the first few BIOS this was pretty unstable. With the newer BIOS that's been fixed. Gigabyte X670 GAMING X AX. Not sure what the issue was for the first 6-8 months I had it. Had BSOD's and boot failures quite a few times. BIOS F8 fixed it about 10 months ago and it's been very stable since. That said I stopped playing with the voltage.
My cooling is very good and my office is usually pretty cool anyway. In fact in the UK so far this year - it's been hard to get warm no matter how much heat my PC and home radiators pump out.
@@PaulRoneClarke Stock on my ASUS board is crazy high. I have to offset.
@@Safetytrousers It does seem very BIOS/MOBO dependent. I've built my own PC's since the original Pentium, and it's been a long time since BIOS and MOBO's made such a big difference. I think the last time I had so much variation between BIOS versions was the old AMD Athlon FX57. On an old MSI motherboard. I had to update the BIOS about 5x till I got the performance. At the time that meant making a floppy disk and a lot more messing about than it does these days. I suspect many people stayed with very poor performance and never knew why.
I am confused why use the 7800x3d? I am not sure now if the gain is due to the cache or to the 2 extra cores
He explained it wasn't due to the two extra cores. It's the 3d cache.
@@Safetytrousers but the question is if 6 cores is all you need, why use a 3d cache cpu instead of a normal 8 core one?
@@Hoytehablode Because the lowest count AM5 CPU with a 3d cache has 8 cores, there is not one with 6.
@@Safetytrousers that is not what I am saying. Is why not use the 7700x which is 8 core without 3d cache
The 7500f seems like a really good deal since its just a 7600 without a gpu, give it an oc and its close to the best gaming cpus out there?
Agreed, if you could buy it anywhere
Come to European side of life. We got cookies. Um, I mean processors. But not the 5600X3D
Just bought it last month, what a bargain beast this thing is.
It essentially doesn't exist and is typically more expensive than the retail 7600!
🔥
Thanks for the content. I think if the comparison was a 7600 vs 7700 would be better for 6c vs 8c contect as the 7800X3D, while 8c/16t numbers are in many cases and in varying degree influenced by the 3DVC but yes it shouldn't be that far off from say the 7700X 8c/16t counterpart.
@@MaxIronsThird I'd go with same gen comparison.
I game at 4K, have an overclocked 5800x3D, and in a moment of weakness ordered a crazy overkill 7950x3D, mobo, and memory. I sadly sent it back to NewEgg unopened yesterday and now I’m feeling awfully good about it.
Good choice mate. Unless you have RTX 4090, then you fucked up. :D
You do productivity task on your build?
price reduction incoming with the release of 8000 cpus.
@@Purjo92 indeed.
@@Purjo92what do you mean?
Hi, here in MÉXICO City the temperature is rising, 30° to 32°C , so can you please make a test with higher room temp, to see if it's enought the stock cooling?
Yeah, all the planet Is gettin hotter
I feel like to really answer the question about "how much do cores matter" you should compare chips of the same architecure with different core counts or, maybe even better, take a high core count chip and start disabling various amounts of cores... Oh wait, you already did - 2 years ago for Intel 10th gen and only 2 months ago for Zen3. Almost as if this debate keeps coming back every year.
I would post the links for the lazy but those tend to be removed for obvious reasons, so tl;dr frequency/cache/architectural improvements tend to matter a lot more than cores beyond the 6th (and definitely beyond the 8th)
I was really happy with the 3600/RX580 8GB for 1080p medium settings but then went up to the 5600 and RX6600 and can very high/Ultra my games at 1080p and I'm happy with it.
what games are u playing? i have a 1080p monitor and dont want to overspend on gpu, need something that is just enough to play in 1080p high
@@RickGrimez9490 Which CPU do you have and what is your total budget? If your CPU sucks and the motherboard does not support modern enough CPUs, you need to upgrade the CPU or at worst, both. Otherwise, you need to limit your spending on your GPU so there won't be any bottlenecks unless you upgrade other parts later, or plan to play with high resolutions and more demanding graphics settings to eliminate CPU bottlenecks. Otherwise, you might lose some performance. It's not the end of the world, but it's something to consider while upgrading your gear.
If you wanna max out 1080p then buy rx7600 or rx7600xt.... More than enough for 1080p...
@@RickGrimez9490 I mod skyrim se for 1080p.
With optimize setting + ENB, i can say i only need 3070/4070 with my 3600.
Thats goin to have like consistent 60fps and probably some low 52-55 for super crowded place.
@@Purjo92 im currently saving money. What ik is that it will be am5 plattform, r5 7600 i need a 1080p gpu, was thinking rtx 4060 or rx 7600xt, or something for max 400 Euro (im from germany). My goal is to build this pc in september
This was super useful for me! I'm building a gaming pc for the first time in over a decade and couldn't decide between the 7600 (non x) or the 7800x3d. Reason being AM5 is a new platform with hopefully two more generations of CPU to look forward to. Save money and get the 7600? Then have a super easy upgrade path later on with something like a 9800x3d or whatever is the final version for AM5. I can apply the $190 savings with the 7600 towards a MUCH more useful GPU upgrade.
Most reviewers don't really bother comparing chips like the 7600 and 7800x3d because as you said they aren't really comparable for use. But for me it is and I really appreciate your time to do this review!
Absolutely, I bought the 7600 on sale for dirt cheap and put the savings into a better GPU. Also knowing I'll be able to upgrade the CPU in a few years with no regrets.
I did too. I got the 7600 and opted for 7900 xt on sale that then could fit in my budget.
@@Olofberglund Hell yeah that's exactly what I'm going to do too
The idea that 6 core 12 thread CPUs are obsolete isn't the full story. For 60hz gaming, they are fine, at any resolution. Resolution never has affected CPU, the idea that is does is something I believed for years, and I know most people do, and it doesn't help that game devs list higher-end CPUs at the 4k spec than the recommended 1080p or 1440p specs.
Most gamers are saying 16 threads is what you want because they're not targeting 60fps, or they are multitasking, or using the consoles as a baseline. It is a better idea to go with a 16 thread than a 12 thread if you have the budget for it and have a 120hz monitor or higher and actually get above 60fps often. On a lower end GPU you likely won't benefit.
as a 7600x owner, i appreciate this very much. Most testbenchmarks only use the 7800x3d for gpu testing but now we have clean data that they arent that far apart.
they are
@@MaxIronsThird not really, 15% is nothing. invest 200 into next tier gpu, probably is a better idea with better results, especially at 2k-4k gaming.
@@MaxIronsThird depends on what you're running. I prefer a 7800X3D for my 4090 for the 1% lows it offers. And I still use DLSS to get the fps to 120 at least. But, I have to share this, apart from depth of field, even DLSS ultra performance mode looks great, it's astounding 🤯 😂
@@MaxIronsThird7800x3d is completely useless if u play triple A titles, especially on ultra settings. Even on 1080p, ull be gpu limited if ure not on the highest of highest end of gpu.
It is fast, but its still useless for 99% of people who doesnt own a 7900xt or anything beyond a 4080
@@contactluke80 I play at 120fps though.
only thing i can say about multitasking slowing games is that one time i actually did that... i was 14? had bulldozer cpu, was playing old at that time Prince of Persia Warrior Within, at some point game started being 3x normal speed because fps got unlocked somehow, how did i fix that? i overloaded pc with task and throttled fps to achieve normal gameplay
fun times
TLDR:
Generation over Cores over X3D
BUT the small notes are 2'000 pages thick
Vcache tends to make a lot more difference for gaming than cores--just look at how the 5600X3D smokes the 5950X in the 12-game average at 18:11--so I'd prefer "generation over X3D over cores."
TL:DW 🤓
Why would you compare 3d 8 core chip with non 3d 6 core
“I don’t know where that comes from”
Tech Deals:
*whistling while walking away*
tbh i can agree on a lot hes saying cuz i saw it myself ;) even before i ever saw him i went more cores and mroe ram and if i did go back i was like OMG what is this pile of nonsense? ^_^
Awesome video, thanks guys! I really like this because it’s very useful to see exactly very the limits of a cpu is, so you don’t have over spend. 7600 in this case seem to give very similar performance at higher settings at higher resolutions so if this is where you intend to go a GPU upgrade is possible with out compromising on the performance and at the same time if you lower to medium an CPU upgrade might be something to think about. I really like to see more of this with more GPUs, like I have said before this really beats out those crappy “bottleneck calculators”. Great work guys! ❤
Of course it depends, I have an "old" 6-core 5600X and an "outdated" RX6600 and there isn't a single game I want to play that I can't at 60fps. Of course if I wanted to play modern shooters in 4K at 144HZ, different story. Hell, for most games I play my PC is complete overkill.
My uncle had that setup and he barely games like he used to but it's a very power efficient system (slight undervolting helps even more as well) for when he spends time watching TV shows, UA-cam and torrenting.
@@DoktorLorenz I undervolted my gpu greatly, quite a bit more than my gpu is supposed to go and it is totally stable. At full tilt it is supposed to use 100w max but mine is frequently using around 75w when playing Cyberpunk at ultra settings. Cpu uses next to nothing, so yes, extremely efficient. I am very happy with my system.
Incredible work and content, it answers a topic which is always discussed but often without data.
Great 👍
Can you please make a cpu scaling test for intel Arc?
4:15 By "faster" you mean "has a higher average frame rate", so is average frame rate still more important than 1% lows in your opinion?
i think videos like these give much better understanding of given cpu performance
should have included 7900x, because most people are just going to see "8 cores is faster than 6 cores" because 3D Vcache
while i get that you used the 7800x3d because it was the fastest gaming cpu out right now, I think that using a non-Vcache 8 core cpu would have had a more profound statement. showing the tiny increase in frames going from the 7600x to the 7700x would have really driven home the fact that having 8 cores over 6 is less important to having newer architecture or more cache. Someone new to tech seeing these slides may get the idea that the 7800x3d is getting its boost from the 2 extra cores and not truely understand that its the Vcache doing all the heavy lifting on that part.
13:50 why was the 3600 faster than the 7600 at high settings??
So what you are saying is i need a 12 core cpu or higher with more 3d V cache, gotcha
Yes :)
take notes from Intel, bigger number better, so maximize cores, clocks, cache, wattage, maximize everything!
@@1Grainer1Intel: Maximize the chips till we have to prop them up like it's Weekend at Bernie's.
The funny thing is that the 7900X3D is significantly slower than the 7800X3D in gaming because of CCD cross talk. So you need 8 cores with 3D V-Cache or you need 16 cores with 8 of them having 3D V-Cache.
FYI only --> On 21:23 In Indonesia Markeplace, Ryzen 7600 one of the cheapest official store is priced $207,04 vs $190 mentioned.
I love when you show my pc parts on your graphics. Thank you .
8:59 The 4k 4090 Hogwarts legacy chart shows that the 7600 will stutter with RT enabled (much lower than 60fps 1% lows) but the 7800x3d won't. If you looked at only the 1080p or 1440p chart, you would not know that the 7600 would struggle with RT enabled because the 1% lows are above 60fps.
That's just 3D V-Cache.
@@Hardwareunboxed I didn't say anything about the reason why the game would stutter when playing with raytracing at 4k. I fully accept that this is due to the cache, not the number of cores. I merely noted that the 4k chart tells us information about the gameplay smoothness (48fps 1% lows) with that setting which you cannot see on the 1080p or 1440p chart. The 4k chart is not misleading or redundant. It contains useful information.
The latency penalty on the 3600 'infinity fabric' is horrible. I had a 3600X in my gaming PC back when it released and found huge performance gains by pinning games to the last 3 cores, and as many other programs and background tasks to the first 3.
3 cores for gaming wasn't ideal at all, but faster for the games i was playing than 6 poorly coordinated ones. While this is not an issue with the 5000 & 7000 series with 8 cores or less, there are still some big gains (mostly stutters in the 1-5% lows) to be made by having games pinned to the last 6 cores and everything else on the first 2 (assuming it's just discord, browsers and other basic stuff).
so true, for midrange/entry gaming just go for those with single ccx will do
1800x to 3800x was amazing but then going to a 5800x3d removed the ccx latency of the 5000 series and the extra cache (Factorio was mind blowing smoother and could increase factory sizes) I can imagine 7800x3d would be interesting but full system is out of the question for now probably be 9000 series before I replace this system
I find that 2000 and 3000 CPUs been missing a lot and comparing 5600 to a 5800x3d when it’s not a typical upgrade when a 2000/3000 cpu is (sometimes even 1000 cpu as well both the 1000 and 2000 cpu was horrible to use the 3000 3800x was good for me for years )
@@leexgx i upgraded from 2700x to 5800x3d some time ago and play stuff like factorio too.
this 3d cache and single ccx latency was such a good investment for strategy and simulation games!
stellaris endgame is finally playable (barely ^^).
if i would play more multicore friendly modern titles, i probably would have gone with the 5600 for best value.
it hat the better latency, nearly as much clock speed and multicore performance hasn't been a problem in games even with the 2700x.
fast cpu matter when playing competitive shooter game
ppl will do anything to win and get advantage over enemies
there are cpu heavy because most ppl will play at low or medium at 1080p or 1440p
asus already release high end mornitor
currently top2 are rog 2k oled 360hz and rog fhd tn 540hz
ppl will need stronger system to push game performance higher to match their monitor
but for average and casual gamer, mid range cpu is worth more
because it's last longer and cheaper than high end, they can spend more on gpu
and upgrade later once it's not good enough, this is cheaper than buying high end cpu
I shall keep my 12400F for a couple years then.
I installed a 5600G into my mediaserver after I damaged the old 3800X during a cooler upgrade (it stuck to the cooler and then dropped into the socket, bending a few pins and whilst it could be repaired, I lack the tools to do so (magnifying and tiny tweezers of some kind) and buying the tools vs a 5600G wasn't a big difference)
What I found was the 5600G easily as good as the 3800X for media operations, transcoding, encoding, remuxing and so forth... Whilst running cooler... It also allowed me to remove the GPU entirely, which frees up that PCI-E 16x slot that I plan on filling with an NVME card with a bunch of NVME drives... so I can start to retire some of the HDDs with around 75k hrs on them.
I mean there's "need" and there's need.
You may need more, but do you truly require more?
3:20 I believe statements like this is the problem.
most games need one strong thread and like 2-16 background ones. (amount depends on a game, plus if anything else runs in background its nice that its not taking cpu time from main important thread as well)
you will see better performance as long as:
A) main thread have more cpu time for itself, up to what single core can deliver
B) all background threads, like ssd access or data transfer to gpu are able to be served, antivirus, windows downloading update RIGHT NOW, somehow without interfering with A)
this vary per game, obviously.
both matter TO A POINT.
Me watching this with my 4c 8t 12100f. I would like to see a new look on this CPU in 2024.
That would be great.
better in games than the 6 cores 12 threads Ryzen 3600
@@MaxIronsThird🤯
Why does the 7800X3D drops from 7:00 173/121 to only 8:20 134/107 after upgrading the GPU from 7900 XT to the RTX 4090 with same game & settings!?
Look up "nvidia driver overhead". There's a video here.
Wasn't there an article from a PC mag a decade ago that asked, "is three cores all you need to game?"
Edit: Found the article, "How Many CPU Cores Do You Need?" by Tom's Hardware back in 2009. The conclusion: "As far as games go, we see a huge 60% performance jump from going single-core to dual-core, and a further 25% leap from dual- to triple-core. Quad cores offer no benefits in the sampling of games we tested. While more games might change the landscape a little, we think the triple-core Phenom II X3s are looking good as a low-cost gaming option."
I wouldn't know. But GTA IV (one of the worst PC ports ever made) was already out back then. So even back then, I expect the answer was: NO.
3 cores? not 4?
@@user-wq9mw2xz3j Yep, it was back in the days of AMD's Phenom II chips. A number of their quad-core chip production had a bad core, so they disabled it and sold them as Triple-core CPUs labled Phenom II X3. I had one of them myself, until they made the hexacore Phenom II.
Ah, I found the article, it was, "How Many CPU Cores Do You Need?" by Tom's Hardware. The conclusion: "As far as games go, we see a huge 60% performance jump from going single-core to dual-core, and a further 25% leap from dual- to triple-core. Quad cores offer no benefits in the sampling of games we tested. While more games might change the landscape a little, we think the triple-core Phenom II X3s are looking good as a low-cost gaming option."
Honestly, I appreciate having it added as a middle ground. Thanks for all the hard work!!
I hate when it depends
Always been like this due to higher clocks and generational improvements....
IF ALL CPUs where to run at the SAME clocks, it wouldn't be that big of a difference!
Even more hateful when it depends on you! So unfair! Even when it's unfair to your favor.
@@Nepturion88 IPC improvements typically account for half-ish (sometime more, sometimes less) of a generational improvement. So it's still substantial what you get without accounting for clock speed.
Given the chart at 18:15, just popping in a 5700X3D for 230 bucks if you still have Ryzen 3000 series seems like a a great option for anyone with a 6700xt or 3070 or better
I appreciate the fact you didn't just use the 4090. You threw in a 3070 and lower cards for comparison. Thank you.
Literally helps so much
I think this video cleared up a lot of convoluted analysis about perhaps an overly limited data set and explained it all much better.
Thanks for following up.
I'm very happy with my Ryzen 7600, but I'll say it *needs* a cooler upgrade. The stock cooler just doesn't cut it. But a 23$ air tower cooler is perfectly enough. I'm glad I paired that CPU with a 6750XT -that I already owned- for 1440p gaming, I feel it's pretty well balanced. And I'm confident my motherboard will be enough for the next low TDP Ryzen 5, zen5 or more probably zen6 in a couple of years.
Thanks HUB, your videos are pure gold, the best content covering CPUs, motherboards and GPUs for consumers interested in gaming PCs 👍🏻
_"...my Ryzen 7600, but I'll say it needs a cooler upgrade"_ Uh, NO! It's a 65W CPU! There is absolutely no reason to *ever* "upgrade" the cooler on a baseline 7600; overclocking gains are minuscule, completely pointless and simply generate more heat (like the irrelevant 7600X),
@@awebuser5914 with the stock cooler, the 7600 hits 95º after about 5 seconds of 100% load, without PBO and, of course, no overclocking. I don't care if AMD says that is ok and it's by design, I'm not letting my CPU work at those temperatures all the time, whatever they say... They could have included the higher tier stock cooler and it would be a different matter.
@@mkcristobal just undervolt it by 30mV or something
I undervolted my 7600 to -20 and 85.now it's run way cooler.i m also using rx6750 xt in eco profile it's cap fps to 60 but cpu and GPU runs much cooler.
You guys must understand that undervolting the lowest CPU in the current lineup, which is just a 65W TDP, because the stock cooler underperforms, should not be the norm, and is something that regular users wouldn't do.
So say for example you wanted a low end new gpu like the rx 7600 or the rtx 4060. At what point would it make sense to get more cpu than the 7600? Should you even bother with 7800x3d if you are wanting a low end new gpu? I imagine the gpu would be bottlenecked until you went up to a 7700xt gpu.
Nothing brings in the clicks like an "it depends" video title.
It depends.
I’d like to see a graph 📈 showing resolution and how much it depends.
I remember running into similar FPS changes when I upgraded from my Ryzen 5 2600 to my Ryzen 7 5700G. It has 2 more cores, but that doesn't explain the new issues I was running into. Specifically, I was getting shoddy 1% lows. With the 2600, the 3000 MHz RAM I was using was fine because the CPU was just that limited that the RAM could transfer data between it and my 6700 XT just fine. But with the 5000 series being just that much more powerful, the slow RAM was causing hitching issues.
I've since upgraded to some G.Skill 4000 MHz, which runs stable at full speed and a 2000 MHz FCLK. The 1% lows still hurt because my CPU is affected by the USB disconnection issue, forcing me to downgrade the PCIe speed since an SoC overvolt of even 10mV proves to be unstable. But even with the downgraded PCIe speeds, it's still miles better than with the 3000 MHz RAM. I'm still able to get within around 10 FPS of most Horizon: Zero Dawn benchmarks for my 6700 XT, and I'm even using Linux as my main OS.
Now that I think about it, that leads me to a potential video idea: Benchmarks using native DirectX vs DXVK/VKD3D. There's actually a lot of games out there that get a speed boost even within Windows when using DXVK. It'd be interesting to have some quantifiable data for it.
Not to be in the "but you missed x CPU!1!" camp, but to really lock it home you could have included the 2700X -- that would have been an 8 core CPU getting beaten by both 6 core CPUs.
It's a bit old now to care. Tbh.
Another great video. Keep doing more like this and the one you did last week.
The i5 9400F is a perfect 6 core. Well, it's 6/6. 😂
14nm moment
I still have one with my 3080
Big bottleneck, but at 1440p it's good enough for now
Haha, 9400F and 9600K would be really competitive if they did not cripple them with not giving them hyper threading.
My friend had the same CPU until he upgraded to a Ryzen 5 5600G. Was not much but it did give that extra performance he wanted paired with a RX 5600XT
@@kgt8742 i don't get why he chose 5600g over 5600/5600x if he is going to pair it with a dedicated GPU
i find r7 7700 for 250$. I need it or not? Or r5 7600 for 200$ is better? do only gaming 1080p with 6600. worth it?
still using r5 3600 until now
Straight up just threw away my 3600 once I built my 7800x3d system, they say the pc doesn’t make the gamer a better gamer well when your 1%lows are 200+ it kind of does + good 5G wifi
The change I've often noticed the most has been adding more GPU lately. 6 cores and the best GPU one can afford might still be the meta with 7800X3D or an 8 core/higher being the more premium options.
Test those CPU's with a mid tier card like a 3060 and there won't be much difference. This test is bad
😅
What else, test them with voodoo3dfx ?
Yea nobody would use a Ryzen 5 3600 or 7600 with a RX 7900XT. This is not how people in real life would combine cpu and gpu.
Right, your correct but this is a pure CPU test which means needs to be CPU bound and why they use a powerful GPU, it forces the CPU to max out to give you best performance
@@cowboyk5834 Don't try to explain what cpu benchmarks are to them. They won't get it anyway.
I know this channel (or almost any big youtube channels) doesn't cover the emulation performance but I think that could have had interesting results in this specific test video. As far as I know, core counts actually matter on emulating more demanding stuff like ps3 games etc but it's quite difficult to find a reliable source like you guys who actualy made a comprehensive video about it. Great video regardless. Hope to see people commenting "do I need 8 cores for gaming" in a couple of days again.
Anxiously waiting for Tech Deals' reaction on twitter
I remember, when i paid attention to these things, that platform latency played a big part on gaming performance.
That applied even when game was GPU limited, if i remember correctly. I think it's possible that when GPU is waiting, that waiting time is increased when player does something and frames and environment needs to be recalculated. Every change takes time and response time of system affects, something like "input latency" in displays.
At the time AMD introduced CPU's with integrated memory controller (used to be in a chip set) which decreased latency greatly.
20:00
I think a video about cleaning or tweaking a pc from you will be GREAT and HELPFUL for many of us
Tech Deals in shambles.
19:46 This is wrong. What matters is how many threads the background task is doing work on. If you have 24 cores and a background task that uses, say, eight work threads, that will not significantly impact gaming performance. If you have a 7950X3D and make sure the game runs on the V-cache CCD while background tasks run on the other CCD, you won't be able to tell the task is running, unless either it or the game is very sensitive to memory throughput (constantly reading large quantities of data to and from memory), rather than latency. If you were to limit a background task to four threads and do the same with a 7800X3D, the impact would be greater, because less of the cache will be available for the game.
With a TR system, it's like with the 7950X3D, just without the extra cache boost. You can limit tasks to specific CCD's, meaning the L3 cache where the game is running won't get filled with data from those tasks. Given the four memory channels, you'll also have more memory throughput to work with, if either the game or the tasks are reliant on it.
Steve, at 12:11 you mean CPU limited?
Nice video btw., the cleaning mop is hilarious!
i have a question
if a cpu (7600 for example) hits 100fps with a high end graphics card while cpu bottlenecked
and a gpu (4070 for example) hits 100fps with a high end cpu (14900k/7800x3d) while gpu bottlenecked
does that mean that one will hit about 100fps with a 7600 and 4070 or will the performance suffer more
I wonder if higher core count cpu receives less of an impact on utilization from a higher polling rate mouse? If so than gaming cpu needs to be a higher core count now.
What is a good 5000 series for both streaming and playing games at the same time? My 5600 doesn't like me playing games at max 1080p while streaming with my 4070super..
Most applications out there usually boil down to simple load -> basic calc/ evaluation -> store instructions on the CPU. You'd want modern hardware with modern ISA so that latency sensitive apps like games can run the best. Games usually arent complex and require a lot of number crunching. Ideally you want clock frequency and IPC, atleast with a traditional thread model that most games use. When you program around Fibers this changes somewhat but comes with its own downsides.
15:03 why are results in CS2 so much different between 7900xt and rtx4090?
Why not use a 7800X and eliminate the cache variable?
In broad terms, more cores will be advantageous either because the game can schedule certain tasks to better spread the load across (depending on the game that is) or mostly because modern versions of Windows (10 or 11) are resource hogs and often will chew up as many resources as they can from machines by multiplying useless processes, having processes suspended so they don't eat away CPU resources but they still reserve a portion of your system's memory, and so forth. In short having more cores helps you game because Windows is so bad that you need to make sure it has enough to "eat" while still allowing you to have resources to play your games :P
I did a "budget" high end build late last year with a 7600X/7900XT. I'm pretty happy with it's performance for the $$$. I couldn't justify spending another 280$(bundle savings) on the 7800X3D when I game at 1440p. Shoot I even try to see if I can run closer to 100fps in games and keep the thermals/power usage down. My wife plays Hogwarts on max setting no RT capped at 75fps. I don't even see temps above 65C like this. Gotta love it
Almost 3 years with my 3600 when i finally managed to upgrade at something mid-range, and i still don't plan to upgrade until this one starts to show it's age. It's paired with RTX 3060 so i'm more than happy at the moment. It was a huge upgrade from my previous i5-2400 and still rocks with whatever i throw at it!
What video is the graph with the RX 5600XT at 4:52 from?
How much is the difference between low/ultra settings due to the CPU? Noticably when heavily GPU limited there's a difference between presets, but not between CPUs. I then assume the difference is mostly GPU bound...
But then in 1080p low/medium where the performance is definitely CPU limited, we still see a very similar frame rate difference between Low and Medium presets. Suggesting the CPU is mostly responsible. Is it certain groups of settings that primarily hit the GPU/CPU specifically? In that case I'd love to find out which settings hit my CPU (to lower them) and which hit the GPU (it has headroom) to keep as much eyecandy as possible without losing that 30-50% performance
What does matter in terms of large number crunching??
Different programs still implement multicore differently.
Extra cores are best fore heavy creation and management softwares, like editors and various servers.
As long as you meet your specific workloads overhead, more won't make much difference, outside costs, but less will hurt.