Are 6 Cores Really All You Need for Gaming? It Depends
- Published 31 May 2024
- Thermal Grizzly: www.thermal-grizzly.com/en/kr...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4070 Super - geni.us/wSqSO07
GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
GeForce RTX 4080 Super - geni.us/80D6BBA
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 XT - geni.us/eW2iWo
Radeon RX 7600 - geni.us/j2BgwXv
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6750 XT - geni.us/53sUN7
Radeon RX 6650 XT - geni.us/8Awx3
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6600 - geni.us/cCrY
Video Index
00:00 - Welcome to Hardware Unboxed
01:27 - Ad-Spot
02:06 - Core Count & Gaming Performance
05:14 - Test System Specs
05:27 - Hogwarts Legacy
09:17 - Starfield
12:32 - Counter-Strike 2
15:59 - Ryzen 5 7600 vs 3600 Specs
18:35 - Misconception: Multi-tasking
19:58 - Misconception: Clean Test Systems
20:29 - Wrapping up the 4K discussion
22:16 - Outro
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo - Science & Technology
If you want to learn more about cores/cache performance of Ryzen processors then check this video out (if you missed it) ua-cam.com/video/0mO4op3bL90/v-deo.html
Starfield is such a mess; it shouldn't be included, it's that bad.
When I play Cyberpunk on my Ryzen 9 5950X I get an average of 120-150 fps while running Wallpaper Engine, Discord in a call sharing my screen so people can see my gameplay, and a YouTube video in the background. My friend with the same system (GPU, motherboard and so on) but a Ryzen 5600X gets an average of 60-70 in Cyberpunk even at 720p low with Wallpaper Engine and screen sharing on, and when he closes everything but the game and runs Discord on his phone he instantly goes up to around 90 fps.
Yeah, it's no secret 4 cores is all you need.
Tech Deals wouldn't approve of this though. 2:30
So I know this has been covered before, but it's been a while for me. If I shut off SMT on a 32-thread CPU like the 5950X, does it help any? There would still be 16 cores, but I've just never switched it off and tested it in a game. I love my 5950X, it's the perfect performance-to-power (W) ratio for most of what I use it for, but hey, I'm always happy to game a little faster.
Time to complain about a CPU that's not included.
2500k owners in shambles
Where is my 8086k lol
I do wish AMD would drop 6-core and start with 8-core in the next generation. It was amazing when we went from ten years of 4-core Intel parts to 6-core with AMD Ryzen, but that was 7+ years ago now.
So true 😂
I feel those types have unlimited free time. No concept of the labor and love that goes into this kind of content 😂
Only amd CPUs exist.
But Steve, how could you not include the Phenom II X6 1055T to represent the popular 6-core CPUs?
Don't forget the legendary FX-6300.
Still using my 1055T...
Too old, dude. The Intel Q6600 was also a killer. Or the Core 2 Duo E6600, back when 2 cores were still enough.
But if we're gonna do that, I want to see an 8-socket Opteron 850 setup.
Yes, and also i7 970 :P
You should have a video called "It Depends" which covers every technical question possible
I might make that video, it depends.
Oh heck!!!
That video would be a full length feature film!!!
@@GB-zi6qr Well, I think it depends...
@@berengerchristy6256 🤣🤣🤣🤣
@@Hardwareunboxed well played
The 7600X being as fast as a 5800X3D in gaming should say a lot tbh
5.45 GHz all-core under 85°C in all games... why would I need more?
@@laurentiudll GHz alone doesn't tell you anything. 10 years ago I overclocked my garbage Bulldozer FX-8350 to 5.1 GHz, and I bet the 7600X is faster even at 2 GHz.
@@laurentiudll because 85c is too hot REEEEEEEEEEEEEEEEEEEEE
That says nothing, since it clocks 0.5-0.9 GHz higher.
The 5800x3d kinda matches it which is sorta nice, but the people who made the smart choice and went with the 7x00x are going to be the ones laughing when they pop in a cheap zen 5 or zen 6 upgrade.
20:00 Best B roll of 2024 😂
electrostatic shock: Let me introduce myself!
bro really watched a 20 min video at 5x speed
I cleaned all the electrostatic off first.
@@Hi_Im_o2 Probably. Lmao
@@GewelReal Synthetic dusters get static very easily, but organic (ostrich feather) ones really don't.
Cores and effect.
No cores for concern.
Uhiehuehue
Rebel without 8 cores.
Cores Light.
I coronate you King! of the coreny puns, my brother from another motherboard.
okay, that's enough dad.
Still, why didn't Steve include my mighty 5600X in the test? He'd better re-do the whole test, right?
On it!
Hehe, yes, and include my 8600K that I have delidded, replaced the IHS on, and run with an aggressive OC in a water-cooled system with relatively slow 64 GB of memory.
That chart is going to be very long, maybe the video should be a vertical video 🤣
@@Hardwareunboxed I'm not clear on which video this video is in regards to; could you link it please? Also, sorry to say, but I'm not clear on the big point being made with this video's testing. I get the point made at the beginning (that when it comes to CPUs, the overall throughput capability of the CPU matters insofar as it needs to be capable of your desired fps).
@@Hardwareunboxed Thanks Steve
😂😂😂
If they made a 7600X3D, it would probably be much closer to the 7800X3D, making 6 cores truly all you need for gaming.
They are collecting bad 7800X3Ds, and when there are enough... they will release the 7600X3D. They don't have enough of those bad CPUs yet!
@@haukionkannel Truuu
The 3D v-cache die of 7900X3D could be used for a potential 7600X3D.
@@haukionkannel As long as it isn't a stupid Micro Center exclusive like the 5600X3D was, it will be great.
@@mruczyslaw50 Or a 7700X3D, just like the 5700X3D, with slightly lower clocks than the 7800X3D for $50-70 less, would be nice. In Germany the 7700 costs 230€ and the 7800X3D 340€, so something in the middle would be lit.
So that's how you keep your Windows clean 19:59
Multitasking is gonna hammer your cache and memory pipeline all the same regardless of your core count. I don't understand why people think the number of cores determines everything. Not everything is Cinebench, barely anything is.
I even feel my SATA channels being "hammered" when I run a not-so-demanding game from one SSD while downloading torrents to another HDD, and my virtual memory is on a separate SSD, btw. Maybe I should try replugging my SATA drives into different ports. Maybe qBittorrent is keeping RAM too busy, though its memory allocation isn't big.
I still constantly hear people saying cores don't matter. Apparently programmers are too stupid to take advantage of multiple cores and we should go back to single core. The truth is both matter, and it depends on your use case.
still, it would be nice if somebody actually tested it
@@zangetsu6638 HUB did. Look it up.
The best multitasking example is capturing gameplay using CPU encoding (which still offers the best quality). If you have a single-PC setup, whether you're recording or streaming, you'll need a beefy CPU if you want to use it to encode video.
Even the newest 6-core CPUs are not enough to encode 1080p60 (x264 medium preset) while gaming. You need a modern 8-core chip for that. Higher resolutions would require a lot more cores. It's a niche use case, but it exists.
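For the curious, the x264 "medium" 1080p60 scenario mentioned above maps onto an ffmpeg invocation like the one sketched below. This is an illustrative Python helper, not anything from the video: the file names are hypothetical, and capping `-threads` is one way to leave cores free for the game (by default x264 spawns workers on every core and competes with it).

```python
import os

def x264_stream_cmd(src="gameplay.mkv", out="stream.mp4", threads=None):
    """Build an ffmpeg command for a 1080p60 x264 'medium' software encode.

    Passing threads caps how many worker threads libx264 may use, so a
    game running alongside the encoder keeps some cores to itself.
    """
    cmd = ["ffmpeg", "-i", src,
           "-c:v", "libx264", "-preset", "medium",  # CPU encode, medium preset
           "-r", "60", "-s", "1920x1080"]           # 1080p60 output
    if threads:
        cmd += ["-threads", str(threads)]
    return cmd + [out]

# e.g. reserve roughly half the cores for the encoder, half for the game
half = max(1, (os.cpu_count() or 2) // 2)
cmd = x264_stream_cmd(threads=half)
```

Whether this keeps up in real time is exactly the core-count question the comment raises: on a 6-core part, even a capped medium-preset encode plus a demanding game tends to oversubscribe the CPU.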
Crazy how Starfield is so CPU bound even at 4K 🤯
To be fair to Bethesda, their asset quality is kind of... insane. Even a damn sandwich has a high poly count.
So if you walk into a residential area with all those assets, it's gonna be quite taxing.
Ancient gaming engine that they refuse to replace.
@@Ryotsu2112 People keep buying their crap so why would they.
Solution: don't play Starfield. 🤔
High asset quality + full of physics simulated objects.
You can get away with 6 cores if you add more core fluid. I usually run about 3 quarts of 0w-20GB core fluid.
Hold my almighty tweezers there, check your power supply first!!!
What?
So true
You can save a couple bucks if you run it with WD-40 instead, though you gotta be careful when applying it that you don't get runoff into the PCIe slots.
@@moldyshishkabob Indeed. I myself sometimes like to throw the cpu into a tub of good old lard. And make sure all the pins get nice and insulated for the electrons.
19:47 I can attest to that. Gaming performance absolutely tanks when I'm compiling code in the background on a 32-core Threadripper.
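One common mitigation for the scenario above is launching the build at low scheduler priority so the game's threads win contention for cores. A minimal sketch, assuming Linux/POSIX; the `make -j32` command is a hypothetical stand-in for whatever the build actually runs:

```python
import os
import subprocess
import sys

def run_niced(cmd, niceness=19, **popen_kwargs):
    """Start a background job (e.g. a compile) at low CPU priority.

    preexec_fn runs os.nice() in the child just before it execs, so only
    the build is deprioritized; the game keeps normal priority and the
    build soaks up leftover cycles. Linux/POSIX only.
    """
    return subprocess.Popen(cmd, preexec_fn=lambda: os.nice(niceness),
                            **popen_kwargs)

# Hypothetical usage while gaming: run_niced(["make", "-j32"])
# Demo: a child process that just reports its own niceness.
child = run_niced([sys.executable, "-c", "import os; print(os.nice(0))"],
                  niceness=10, stdout=subprocess.PIPE, text=True)
print(child.communicate()[0].strip())
```

This doesn't fix cache and memory-bandwidth contention (which the video's multitasking segment covers), but it stops the build from stealing timeslices outright.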
"but where is the 5800x3d!?!?!? 🤓"
I should have never said anything :)
@@Hardwareunboxed ha ha lol
In my system!! Bad HUB trolling us 😉😉
5700x3d kicked it to the side.
😂😂😂
This was super useful for me! I'm building a gaming pc for the first time in over a decade and couldn't decide between the 7600 (non x) or the 7800x3d. Reason being AM5 is a new platform with hopefully two more generations of CPU to look forward to. Save money and get the 7600? Then have a super easy upgrade path later on with something like a 9800x3d or whatever is the final version for AM5. I can apply the $190 savings with the 7600 towards a MUCH more useful GPU upgrade.
Most reviewers don't really bother comparing chips like the 7600 and 7800x3d because as you said they aren't really comparable for use. But for me it is and I really appreciate your time to do this review!
Absolutely, I bought the 7600 on sale for dirt cheap and put the savings into a better GPU. Also knowing I'll be able to upgrade the CPU in a few years with no regrets.
I did too. I got the 7600 and opted for 7900 xt on sale that then could fit in my budget.
@@Olofberglund Hell yeah that's exactly what I'm going to do too
Just upgraded from a Ryzen 5 3600 to a 5600X. Some might consider that barely an upgrade, but I got my 5600X cheap, and in the end there will only be a 40 euro difference after I sell my 3600. I was considering a 5700X3D or 5800X3D, but there are basically none on the used market (since they kick ass) and I will not pay a premium for a new CPU.
I went from 3700X to 5700X X3D chips were still on the way. Noticed at least 10% jump in FPS with less stutters...
5600x 2024 is kind of buns
Still a decent upgrade if you got it cheap. The 5600 runs much more efficiently and better in general; with that, you don't need to go for an X3D now.
I went from a Ryzen 1500X to a Ryzen 5600X.
Went from 3600 to 5600 for cheap last year too. Does all I need!
Yepp, this is the video we needed to clarify the topic. 👍
There's only one thing that might have been added: a CPU utilization chart. From what I've seen recently, the Ryzen 5 3600 is on its last legs, especially in online competitive games.
Nonetheless, nice job HWU! 😊
Good point on the online competitive games. I just built my daughter an AM4 system using the R5 3600 (it was on my shelf, brand new) with a PowerColor 6800 XT, and it seems to do quite well at 1440p in games such as Baldur's Gate 3 and other RPGs. But as you rightly point out, intensive FPS games will suffer.
“I don’t know where that comes from”
Tech Deals:
*whistling while walking away*
Incredible work and content, it answers a topic which is always discussed but often without data.
Great 👍
Another great video. Keep doing more like this and the one you did last week.
This is a conversation that has been going on for at least the last 5 to 10 years. Ultimately, it will always boil down to the specific game engine and its capability to utilize additional threads.
This reminds me of a conversation from roughly 4-5 years ago, when the 9000-series Intel Core processors (Coffee Lake Refresh) were released. At the time, the data showed that, when clocked to the same frequency, the 6C/6T 9600K, 8C/8T 9700K, and 8C/16T 9900K all performed within 10 fps of each other. The reasoning at the time was that most game engines were still limited to 1 or 2 threads at most, so adding more cores didn't really help gaming performance, and clock frequency is what made the difference. By buying the higher SKUs, what you were really purchasing was the ability to clock 100 to 200 MHz higher than the lowest SKU, and only if you took the time to overclock them.
Yeah, and most games are still coded single-core heavy. Going from 4 threads to 6 alone was a leap, and it can still be argued that's the sweet spot for gaming. Even then, that's only certain games, as most games don't use all 8 physical cores, let alone all 16 threads. BF2042 uses 8c/16t since it actually utilizes hyperthreading, and Warzone as well; I can't think of any others off the top of my head that do.
I love when you show my pc parts on your graphics. Thank you .
Awesome video, thanks guys! I really like this because it's very useful to see exactly where the limits of a CPU are, so you don't overspend. The 7600 in this case seems to give very similar performance at higher settings and higher resolutions, so if that's where you intend to go, a GPU upgrade is possible without compromising performance; and if you drop to medium, a CPU upgrade might be something to think about. I'd really like to see more of this with more GPUs; like I have said before, this really beats those crappy "bottleneck calculators". Great work guys! ❤
your videos are so entertaining and informative!
Me watching this with my 4c/8t 12100F... I would like to see a fresh look at this CPU in 2024.
That would be great.
It's better in games than the 6-core/12-thread Ryzen 3600.
Basically, the 3600's single-core performance is just that low compared to the 7600's. The architecture difference, DDR5's sub-channels, the increase in L2 cache, and how the L3 cache is connected to the cores all play a part in gaming workloads. Games can't take advantage of all the threads, but DX12 and modern middleware like Unreal 4/5 are able to take advantage of more cores.
Rendering and other multithreaded workloads work differently and depend on which instructions are being used, e.g. SIMD or FP instructions.
Skipping the 5600(x) was sheer stupidity in this case. Inexcusable...
But when you see the R5 3600 pushing 200 fps, you still may not give a fcuk about upgrading ;)
Don't forget shader compilation steps in modern DX12 games also scales with CPU cores.
"multithreaded workload works differently"
Yeah, and we have been begging HU to put multithreaded workloads in their tests, but they say it's too hard.
They would rather just spout off guesses.
TBH, even with games, the concept of a single-threaded major application is almost gone. The vast majority of modern games use multiple threads, maybe not all available, but they are certainly *not* confined to a single thread. Really, it's just a legacy concept that stubbornly refuses to die, like so many others...
17:54 my kind of conclusion. Thank you Steve.
Hats off to you for this. Amazing work. Waiting for my 7500f to arrive this week 👍
I had to explain this to a friend while budgeting out a gaming system: a Ryzen 7500F is faster than his 9900K, and by a significant margin, but he couldn't get past the fact that his had 8 cores.
6% faster. You should have kept him on the 9900K instead of side-grading him, and used the money for a better GPU.
@@aightm8 The argument was probably that he could upgrade later to a "cheap" 7800x3d in 5 years.
@@zeenyuhrass then upgrade later. There's no advantage to sidegrading to am5 today because you might upgrade later. Really makes no sense.
@@aightm8 Na, tdp is big difference and also platform 7500f is better choice overall
@@foxskyful ok.... But there's hardly any immediate advantage. It makes no sense. You'd really rather have 200-300 less budget for your GPU. To save 10 a year in electricity + 6% CPU performance
TLDR:
Generation over Cores over X3D
BUT the fine print is 2,000 pages thick.
V-Cache tends to make a lot more difference for gaming than cores (just look at how the 5600X3D smokes the 5950X in the 12-game average at 18:11), so I'd prefer "generation over X3D over cores."
It's great to see all the data, thanks!!
Steve, at 12:11 you mean CPU limited?
Nice video btw., the cleaning mop is hilarious!
I feel like to really answer the question of "how much do cores matter" you should compare chips of the same architecture with different core counts, or, maybe even better, take a high-core-count chip and start disabling various numbers of cores... Oh wait, you already did: 2 years ago for Intel 10th gen, and only 2 months ago for Zen 3. Almost as if this debate keeps coming back every year.
I would post the links for the lazy but those tend to be removed for obvious reasons, so tl;dr frequency/cache/architectural improvements tend to matter a lot more than cores beyond the 6th (and definitely beyond the 8th)
The 7500F seems like a really good deal, since it's just a 7600 without the iGPU; give it an OC and it's close to the best gaming CPUs out there?
Agreed, if you could buy it anywhere
Come to European side of life. We got cookies. Um, I mean processors. But not the 5600X3D
Just bought it last month, what a bargain beast this thing is.
It essentially doesn't exist and is typically more expensive than the retail 7600!
🔥
Another banger vid, love it! Now would you do a vid for us casuals, doing "realistic" builds (like mid or upper-mid-range CPU+GPU in popular games), so we know exactly what to expect from an upgrade, if you have the time?
Of course it depends. I have an "old" 6-core 5600X and an "outdated" RX 6600, and there isn't a single game I want to play that I can't play at 60 fps. Of course, if I wanted to play modern shooters in 4K at 144 Hz, different story. Hell, for most games I play my PC is complete overkill.
This goes straight against what TechDeals has been preaching for some time now.
But Steve has, for years now, done the testing to back up his claims. About two years ago he did a CPU-to-GPU Scaling video where the number of CPUs, GPUs, resolutions and games he tested against each other added up to 1700 benchmark runs for a single video.
I wouldn't say TechDeals is reliable, but he does talk about real-world experience. I don't trust one benchmark with no real-world experience to tell me that more cores don't matter. It seems reasonable to believe some software can designate cores for certain processes, so I find it extraordinary to believe his conclusion based on such puny evidence.
I think Frame Chasers is a clown, but maybe he's right in his criticisms of some of these YouTube authorities just making unsubstantiated claims. This seems like an example of malpractice, in my opinion.
@@JustADudeGamer Steve has been doing this for over 20 years, and this isn't his first collection of games benchmarks to back this up. He caught flak for proving that quad-cores were categorically out if you want a good gaming experience, and now people are giving him flak for showing that in the vast majority of modern games 6 cores work perfectly fine with some benefitting a bit from 8 cores. Above that it's just down to clock speed.
@@andersjjensen There's nothing magical about 6 or 8 cores. If a game sits at 60% usage on 8 cores, that doesn't mean you'd never benefit from 16 cores when multitasking. A YouTube video hardly uses anything, but I think some games even benefit from 16 cores. He makes these sweeping statements based on very little. He's probably right that for the average game and light usage it doesn't matter, but as a sweeping statement, "high core counts don't help multitasking while gaming" just isn't logical.
i think videos like these give much better understanding of given cpu performance
Great content ! Keep it up !
Not necessarily more cores, but the speed of said cores: if you get better overall performance from better-implemented cores and threads that work more efficiently, that should mean speed increases. Anyway, great video; always love your efforts to educate your audience with EVIDENCE.
Wasn't there an article from a PC mag a decade ago that asked, "is three cores all you need to game?"
Edit: Found the article, "How Many CPU Cores Do You Need?" by Tom's Hardware back in 2009. The conclusion: "As far as games go, we see a huge 60% performance jump from going single-core to dual-core, and a further 25% leap from dual- to triple-core. Quad cores offer no benefits in the sampling of games we tested. While more games might change the landscape a little, we think the triple-core Phenom II X3s are looking good as a low-cost gaming option."
I wouldn't know. But GTA IV (one of the worst PC ports ever made) was already out back then. So even back then, I expect the answer was: NO.
3 cores? not 4?
@@user-wq9mw2xz3j Yep, it was back in the days of AMD's Phenom II chips. A number of their quad-core chip production had a bad core, so they disabled it and sold them as Triple-core CPUs labled Phenom II X3. I had one of them myself, until they made the hexacore Phenom II.
Ah, I found the article: "How Many CPU Cores Do You Need?" by Tom's Hardware, back in 2009 (quoted in the edit above).
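As a side note, the 2009 progression quoted above (+60% from one to two cores, +25% from two to three, roughly nothing after) is close to what Amdahl's law predicts for a game whose work is about 75% parallelizable. This is a back-of-envelope illustration, not a claim from the article:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# Fit p to the article's 60% gain from 1 -> 2 cores:
# 1/((1-p) + p/2) = 1.6  =>  p = 0.75
p = 0.75
s2 = amdahl_speedup(p, 2)  # 1.60x vs one core (+60%)
s3 = amdahl_speedup(p, 3)  # 2.00x, i.e. +25% over two cores
s4 = amdahl_speedup(p, 4)  # ~2.29x, only ~14% over three: diminishing returns
```

The same curve is why the video's conclusion holds today: once a game's parallel fraction is saturated, per-core speed, cache, and architecture matter more than extra cores.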
I was really happy with the 3600/RX580 8GB for 1080p medium settings but then went up to the 5600 and RX6600 and can very high/Ultra my games at 1080p and I'm happy with it.
What games are you playing? I have a 1080p monitor and don't want to overspend on a GPU; I need something that's just enough to play at 1080p high.
@@RickGrimez9490 Which CPU do you have and what is your total budget? If your CPU sucks and the motherboard does not support modern enough CPUs, you need to upgrade the CPU or at worst, both. Otherwise, you need to limit your spending on your GPU so there won't be any bottlenecks unless you upgrade other parts later, or plan to play with high resolutions and more demanding graphics settings to eliminate CPU bottlenecks. Otherwise, you might lose some performance. It's not the end of the world, but it's something to consider while upgrading your gear.
If you wanna max out 1080p then buy rx7600 or rx7600xt.... More than enough for 1080p...
@@RickGrimez9490 I mod Skyrim SE for 1080p.
With optimized settings + ENB, I can say I only need a 3070/4070 with my 3600.
That's going to give a consistent 60 fps, with lows of maybe 52-55 in super crowded places.
@@Purjo92 I'm currently saving money. What I know is that it will be the AM5 platform with an R5 7600. I need a 1080p GPU; I was thinking RTX 4060 or RX 7600 XT, or something for max 400 euros (I'm from Germany). My goal is to build this PC in September.
Hello. Can you help me? I need a PC for professional content creation (mostly videos for YouTube & TikTok). I mostly do video editing (Premiere, 45%, mostly 1440p but maybe 4K for some clients), play games (mostly Fortnite, 40%; no need for 4K in that game, the higher the frames the better, though I may want 4K in other games), and do design work (Illustrator and Photoshop, 15%). I will always be recording while playing and may stream sometimes. I know Intel is better for Premiere, but I want to go the AMD route for various reasons.
Is the 4080 Super with a 7950X3D the best for my use case, or should I wait for the upcoming CPUs?
Thanks a lot, your videos are very professional and relevant. Thanks!
Can you please make a cpu scaling test for intel Arc?
The 7700k (4c) was faster than the 1600 (6c).
The 2600 (6c) was faster than the 1700 (8c).
The 3600 (6c) was faster than the 2700x (8c).
The 5600 (6c) was faster than the 3700x (8c)
The 7600 (6c) is faster than the 5800x (8c).
How many times do we have to go through this?
And it will be a while before we are seeing a situation like Dragon's Dogma 2 on the i3 12100 where having only 4 cores simply isn't enough.
Tech hardware channels are allergic to understanding technical terms. IPC is all you’d need to mention and explain.
There's a bot that stole your comment
@@Navierstokes256 not only ipc, cache size, access memory latency, how cache is access by cores. ipc is just instructions per clock, if anything else is designed wrong ipc doesn't matter
@@koczaioandaniel4014 Yes so we're in agreement. Explain what IPC is and naturally you'll explain what is contributive to that.
It caught my attention: did you change the camera? The lens? Editing sharpness? It looks good, but there's more moiré and Steve looks older.
So have they moved the minimum acceptable framerate higher, or is it still 60 fps? Because not everyone has 20/20 vision to see all the effects/graphics.
I game at 4K, have an overclocked 5800x3D, and in a moment of weakness ordered a crazy overkill 7950x3D, mobo, and memory. I sadly sent it back to NewEgg unopened yesterday and now I’m feeling awfully good about it.
Good choice mate. Unless you have RTX 4090, then you fucked up. :D
You do productivity task on your build?
price reduction incoming with the release of 8000 cpus.
I just bought a Ryzen 9 7950X3D yesterday to start a build around.
@@Purjo92 indeed.
As a 7600X owner, I appreciate this very much. Most test benchmarks only use the 7800X3D for GPU testing, but now we have clean data showing they aren't that far apart.
they are
@@MaxIronsThird Not really; 15% is nothing. Investing the 200 into the next-tier GPU is probably a better idea with better results, especially at 2K-4K gaming.
@@MaxIronsThird depends on what you're running. I prefer a 7800X3D for my 4090 for the 1% lows it offers. And I still use DLSS to get the fps to 120 at least. But, I have to share this, apart from depth of field, even DLSS ultra performance mode looks great, it's astounding 🤯 😂
@@MaxIronsThird The 7800X3D is completely useless if you play triple-A titles, especially at ultra settings. Even at 1080p, you'll be GPU-limited unless you're on the highest of high-end GPUs.
It is fast, but it's still useless for the 99% of people who don't own a 7900 XT or anything beyond a 4080.
@@MaxIronsThird If you're playing at 4K with a 4090, then you've wasted your money buying a 7800X3D over a 7600.
15:03 Why are the CS2 results so different between the 7900 XT and the RTX 4090?
Something I'd love to see is a more extensive investigation into which games benefit greatly from the 3D V-Cache CPUs, things like simulation games for example, and how much correlation there is between game category and larger-than-normal gains.
This would likely be hard to do with this in-depth benchmarking because you can't just use your standard test suite and a lot of games simply do not have easy ways to test this with consistent repeats, realistically it would be more rough tests that can show it as a trend rather than exact benchmarking data, so for example the 7700X and 7800X3D compared and seeing if there is a very clear unmistakable difference like in for example Assetto Corsa Competizione.
It would be neat to be able to conclude and recommend something like "if you play games of this category you are more likely to benefit from the 3D cache CPU models" or the opposite where certain categories of games are unlikely to benefit much and be reasonably confident that it's helpful advice. Sure in the end every game is different but there could potentially be trends within categories of games just from the nature of the experience.
They already did it years ago
I mean there's "need" and there's need.
You may need more, but do you truly require more?
Great stuff, Steve!
Can I just say real quick that the video quality looks absolutely CRISP. This is probably some of the best 4K footage I have seen from ANY tech YouTuber.
While I get that you used the 7800X3D because it's the fastest gaming CPU out right now, I think using a non-V-Cache 8-core CPU would have made a more profound statement. Showing the tiny increase in frames going from the 7600X to the 7700X would have really driven home that having 8 cores over 6 matters less than newer architecture or more cache. Someone new to tech seeing these slides may get the idea that the 7800X3D gets its boost from the 2 extra cores and not truly understand that it's the V-Cache doing the heavy lifting.
I hate when it depends
Always been like this due to higher clocks and generational improvements....
IF ALL CPUs were to run at the SAME clocks, it wouldn't be that big of a difference!
Even more hateful when it depends on you! So unfair! Even when it's unfair to your favor.
@@Nepturion88 IPC improvements typically account for half-ish (sometime more, sometimes less) of a generational improvement. So it's still substantial what you get without accounting for clock speed.
VINDICATION!!! THANK YOU!! I chose the Ryzen 5 7600 as my CPU in my build last year to pair with a 7900XTX. I was given so much grief both in person and online about what a bad CPU/GPU combo that was and that I HAD to get a 7800X3D or else I was otherwise severely crippling my performance. Thing is, I was making my build with 4k gaming in mind, and I already knew the difference at 4k wouldn't be huge enough to warrant paying double the price for a CPU that won't benefit me. Wow not only is the 4k performance close, it's pretty much the exact same between both CPUs!!!
It feels so good to know I was right in my decision (not that I ever had any doubts about it) and also to pocket a decent bit of change in the process (which helped with the purchase of that 7900XTX lol). Thank you for revisiting this topic and showing folks the facts!!
Should have included the 7900X, because most people are just going to see "8 cores is faster than 6 cores" when it's really the 3D V-Cache.
The latency penalty on the 3600 'infinity fabric' is horrible. I had a 3600X in my gaming PC back when it released and found huge performance gains by pinning games to the last 3 cores, and as many other programs and background tasks to the first 3.
3 cores for gaming wasn't ideal at all, but faster for the games i was playing than 6 poorly coordinated ones. While this is not an issue with the 5000 & 7000 series with 8 cores or less, there are still some big gains (mostly stutters in the 1-5% lows) to be made by having games pinned to the last 6 cores and everything else on the first 2 (assuming it's just discord, browsers and other basic stuff).
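The pinning described above can be scripted. A minimal sketch using Linux's `sched_setaffinity` (Windows users would use Task Manager's affinity dialog or `start /affinity` instead); the 2-and-6 split mirrors the commenter's setup and is illustrative, not a general recommendation:

```python
import os

def pin_to_cores(pid, cores):
    """Restrict a process to the given CPU cores (pid=0 means this process).

    Returns the affinity set actually in effect afterwards.
    """
    os.sched_setaffinity(pid, set(cores))
    return os.sched_getaffinity(pid)

# Hypothetical split on an 8-core part: background apps (Discord,
# browser, etc.) on the first two cores, the game on the rest.
avail = sorted(os.sched_getaffinity(0))
background_cores, game_cores = avail[:2], avail[2:]
# pin_to_cores(game_pid, game_cores) would keep the game off the
# cores the background tasks are thrashing.
```

On multi-CCX parts like the 3600 this also keeps a game's threads on one CCX, avoiding the cross-CCX latency the comment describes.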
So true; for midrange/entry gaming, just going with a single-CCX part will do.
Going from a 1800X to a 3800X was amazing, but then moving to a 5800X3D removed the cross-CCX latency (gone in the 5000 series) and added the extra cache (Factorio was mind-blowingly smoother, and I could increase factory sizes). I imagine the 7800X3D would be interesting, but a full new system is out of the question for now; it'll probably be the 9000 series before I replace this one.
I find that the 2000 and 3000 series CPUs have been missing from a lot of these tests. Comparing a 5600 to a 5800X3D isn't a typical upgrade, whereas upgrading from a 2000/3000 CPU is (sometimes even from a 1000 CPU; both the 1000 and 2000 series were horrible to use, but the 3000-series 3800X was good for me for years).
@@leexgx I upgraded from a 2700X to a 5800X3D some time ago and play stuff like Factorio too.
The 3D cache and single-CCX latency were such a good investment for strategy and simulation games!
Stellaris endgame is finally playable (barely ^^).
If I played more multicore-friendly modern titles, I probably would have gone with the 5600 for best value.
It had the better latency and nearly as much clock speed, and multicore performance hadn't been a problem in games even with the 2700X.
I won the silicon lottery with my 7600X. It often gets up to 6 GHz for short spurts without drawing a ton of power or creating a lot of heat. Sits around 5.5 GHz on all cores most of the time in gaming if needed, staying cool and drawing a modest amount of power.
At what voltage?
@@Safetytrousers Stock. I did try an undervolt and an overvolt. With the first few BIOS revisions this was pretty unstable; with the newer BIOS that's been fixed. Gigabyte X670 GAMING X AX. Not sure what the issue was for the first 6-8 months I had it: BSODs and boot failures quite a few times. BIOS F8 fixed it about 10 months ago and it's been very stable since. That said, I stopped playing with the voltage.
My cooling is very good and my office is usually pretty cool anyway. In fact in the UK so far this year - it's been hard to get warm no matter how much heat my PC and home radiators pump out.
@@PaulRoneClarke Stock on my ASUS board is crazy high. I have to offset.
@@Safetytrousers It does seem very BIOS/mobo dependent. I've built my own PCs since the original Pentium, and it's been a long time since BIOSes and motherboards made such a big difference. I think the last time I saw so much variation between BIOS versions was the old AMD Athlon FX-57 on an old MSI motherboard: I had to update the BIOS about 5 times till I got full performance. At the time that meant making a floppy disk and a lot more messing about than it does these days. I suspect many people stayed with very poor performance and never knew why.
Thank you, I had been wondering about the multitasking. Nobody ever talks about it in their videos; it was super helpful to know.
HU just mumbled their opinion.... there was no testing involved besides some barely lightweight multitasking that even an i3 would shine in.
Hello from Canada... I noticed a Hyte Y60 with the LCD corner screen. How well does that work compared to the knockoffs I see popping?
Popping Up that is.
So what you're saying is I need a 12-core CPU or higher with more 3D V-Cache, gotcha
Yes :)
take notes from Intel, bigger number better, so maximize cores, clocks, cache, wattage, maximize everything!
@@1Grainer1 Intel: Maximize the chips till we have to prop them up like it's Weekend at Bernie's.
The funny thing is that the 7900X3D is significantly slower than the 7800X3D in gaming because of CCD cross talk. So you need 8 cores with 3D V-Cache or you need 16 cores with 8 of them having 3D V-Cache.
Nothing brings in the clicks like an "it depends" video title.
It depends.
I’d like to see a graph 📈 showing resolution and how much it depends.
The only thing I can say about multitasking slowing down games is the one time I actually did that on purpose. I was 14, had a Bulldozer CPU, and was playing the already-old Prince of Persia: Warrior Within. At some point the game started running at 3x normal speed because the FPS got unlocked somehow. How did I fix it? I overloaded the PC with tasks to throttle the FPS and get normal gameplay.
fun times
I installed a 5600G in my media server after I damaged the old 3800X during a cooler upgrade (it stuck to the cooler, then dropped into the socket and bent a few pins; it could have been repaired, but I lack the tools (magnification and some kind of tiny tweezers), and buying the tools vs. a 5600G wasn't a big price difference).
What I found was the 5600G was easily as good as the 3800X for media operations (transcoding, encoding, remuxing and so forth) while running cooler. It also let me remove the GPU entirely, which frees up that PCIe 16x slot. I plan on filling it with an NVMe card holding a bunch of NVMe drives, so I can start to retire some of the HDDs with around 75k hours on them.
I appreciate the fact you didn't just use the 4090. You threw in a 3070 and lower cards for comparison. Thank you.
Literally helps so much
Still using my R5 3600 to this day.
I use an i5-8400, which is around the same performance or a little slower. Since I'm gaming at 1080p I have no problems at all.
Straight up just threw away my 3600 once I built my 7800X3D system. They say the PC doesn't make the gamer a better gamer, but when your 1% lows are 200+ it kind of does (plus good 5G Wi-Fi).
What video is the graph with the RX 5600XT at 4:52 from?
I know this channel (and almost every big YouTube channel) doesn't cover emulation performance, but I think that could have produced interesting results in this specific test. As far as I know, core counts actually matter when emulating more demanding stuff like PS3 games, but it's quite difficult to find a reliable source like you guys who's actually made a comprehensive video about it. Great video regardless. Hope to see people commenting "do I need 8 cores for gaming" in a couple of days again.
Anxiously waiting for Tech Deals' reaction on twitter
I'm very happy with my Ryzen 7600, but I'll say it *needs* a cooler upgrade. The stock cooler just doesn't cut it, but a $23 air tower cooler is perfectly enough. I'm glad I paired that CPU with a 6750 XT (that I already owned) for 1440p gaming; I feel it's pretty well balanced. And I'm confident my motherboard will be enough for the next low-TDP Ryzen 5, Zen 5 or more probably Zen 6, in a couple of years.
Thanks HUB, your videos are pure gold, the best content covering CPUs, motherboards and GPUs for consumers interested in gaming PCs 👍🏻
_"...my Ryzen 7600, but I'll say it needs a cooler upgrade"_ Uh, NO! It's a 65W CPU! There is absolutely no reason to *ever* "upgrade" the cooler on a baseline 7600; overclocking gains are minuscule, completely pointless and simply generate more heat (like the irrelevant 7600X),
@@awebuser5914 With the stock cooler, the 7600 hits 95°C after about 5 seconds of 100% load, without PBO and, of course, no overclocking. I don't care if AMD says that's OK and by design; I'm not letting my CPU work at those temperatures all the time, whatever they say. They could have included the higher-tier stock cooler and it would be a different matter.
@@mkcristobal just undervolt it by 30mV or something
I undervolted my 7600 to -20 with an 85°C limit and now it runs way cooler. I'm also using an RX 6750 XT in an eco profile that caps FPS to 60, so both the CPU and GPU run much cooler.
You guys must understand that undervolting the lowest CPU in the current lineup, a 65W TDP part, because the stock cooler underperforms should not be the norm, and it's something regular users wouldn't do.
Nice video once again, thank you.
I think this video cleared up a lot of convoluted analysis about perhaps an overly limited data set and explained it all much better.
Thanks for following up.
Not to be in the "but you missed x CPU!1!" camp, but to really drive it home you could have included the 2700X; that would have been an 8-core CPU getting beaten by both 6-core CPUs.
It's a bit too old now to care about, tbh.
The i5 9400F is a perfect 6 core. Well, it's 6/6. 😂
14nm moment
I still have one with my 3080
Big bottleneck, but at 1440p it's good enough for now
Haha, the 9400F and 9600K would be really competitive if Intel hadn't crippled them by withholding Hyper-Threading.
My friend had the same CPU until he upgraded to a Ryzen 5 5600G. It wasn't much, but it did give that extra performance he wanted, paired with an RX 5600 XT.
@@kgt8742 I don't get why he chose the 5600G over the 5600/5600X if he was going to pair it with a dedicated GPU.
Will more cores/threads help speed up shader compilation? My PC appears to take forever in doing that, and I was wondering whether a CPU upgrade would help.
Faster CPU cores are likely going to help the most there, not more of the same cores.
Love these videos
Test those CPUs with a mid-tier card like a 3060 and there won't be much difference. This test is bad.
😅
What else, test them with a 3dfx Voodoo3?
Yeah, nobody would use a Ryzen 5 3600 or 7600 with an RX 7900 XT. This is not how people would combine a CPU and GPU in real life.
Right, you're correct, but this is a pure CPU test, which means it needs to be CPU bound. That's why they use a powerful GPU: it forces the CPU to max out so you see its best performance.
@@cowboyk5834 Don't try to explain what cpu benchmarks are to them. They won't get it anyway.
Tech Deals in shambles.
A better review of 8 vs 6 cores would be the 7800X3D vs 7900X3D with only the 6 cores with cache tested. This would be a true apples-to-apples comparison.
We've made that video.
@@Hardwareunboxed but we forget everything and will happily watch another iteration ;)
Do you have PBO enabled for the 7600?
Good stuff, thanks. Would you please consider doing end-to-end latency tests for CPUs with LDAT? It would also be interesting to see how much hyperthreading, like on a 7950X, has improved in that regard. In general I think latency, stutters and lows should get more attention, as they considerably impact how responsive games feel.
I remember, from when I paid attention to these things, that platform latency played a big part in gaming performance.
That applied even when the game was GPU limited, if I remember correctly. I think it's possible that when the GPU is waiting, that waiting time increases whenever the player does something and frames and the environment need to be recalculated. Every change takes time, and the system's response time matters, something like the "input latency" of displays.
Back then AMD introduced CPUs with an integrated memory controller (it used to sit in the chipset), which decreased latency greatly.
Honestly, I appreciate having it added as a middle ground. Thanks for all the hard work!!
20:00
I think a video from you about cleaning or tweaking a PC would be GREAT and HELPFUL for many of us.
Thanks for that. In the past you made comparisons with locked frequencies to compare IPC... and I miss this.
Steve, do you think the performance delta on the Intel side is as big in a similar 6 vs 8 P-core comparison as it was on the AMD side?
The Intel gap is a bit larger as the L3 cache capacity changes.
could you please also implement 0.1% lows into the benchmark/charts?
would love to also have them
Every modern game would register single digits.
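For anyone curious how those "lows" figures work: a rough sketch of deriving 1% and 0.1% low FPS from captured frame times. The exact method is an assumption here (reviewers differ; some average the worst n% of frames, this version takes the percentile frame), but it shows why 0.1% lows are so stutter-sensitive:

```python
def percentile_low_fps(frame_times_ms, pct):
    """FPS at the frame marking the worst `pct` percent of frame times."""
    ordered = sorted(frame_times_ms, reverse=True)      # slowest first
    idx = max(int(len(ordered) * pct / 100) - 1, 0)     # boundary frame
    return 1000.0 / ordered[idx]

# 1000 frames: mostly 60 fps, nine mild stutters, one severe hitch
frames = [16.7] * 990 + [33.3] * 9 + [100.0]
print(round(percentile_low_fps(frames, 1), 1))    # 1% low -> 30.0
print(round(percentile_low_fps(frames, 0.1), 1))  # 0.1% low -> 10.0
```

A single 100 ms hitch barely moves the 1% low but drags the 0.1% low down to 10 FPS, which is exactly why single-digit 0.1% lows are so common in modern games.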
I have the R5 3600 workhorse and recently upgraded to a Radeon 7800 XT. I figured I'm probably CPU bottlenecked, but at 4K my GPU is close to 100% utilized! Is the reported GPU usage irrelevant?
nothing beats testing. this content is gold.
Why am I not shocked that Steve uses a feather duster to "clean" his computer. Steve from GN uses the chicken gun from Mythbusters! 😂😂
Why does the 7800X3D drop from 173/121 at 7:00 to only 134/107 at 8:20 after upgrading the GPU from the 7900 XT to the RTX 4090, with the same game & settings!?
Look up "nvidia driver overhead". There's a video here.
Good points!
Super funny. I was thinking last night it would be great to have a cross generation Ryzen benchmark to see how the same core counts perform, then this comes out today! Love it, thank you!
18:32-20:30 oh boy Tech Deals won't like this
At a minimum people should be using a 6 core 12 thread CPU for gaming and work.
How much of the difference between low/ultra settings is due to the CPU? When heavily GPU limited there's a noticeable difference between presets, but not between CPUs, so I'd assume the difference is mostly GPU bound...
But then at 1080p low/medium, where performance is definitely CPU limited, we still see a very similar frame-rate difference between the Low and Medium presets, suggesting the CPU is mostly responsible. Do certain groups of settings primarily hit the GPU or the CPU specifically? If so, I'd love to find out which settings hit my CPU (to lower them) and which hit the GPU (it has headroom), to keep as much eye candy as possible without losing that 30-50% performance.