Are we gonna see a return of Timmy Joe? I really loved watching your tech reviews, you just had a different spice to your videos, especially the newd plebs on the internet series
Jesus Christ it's the man, miss your content bro. You had a way of cobbling together wacky stuff and making it fun, would love to see more C2Q stuff in 2023! Hope you're doin well
@@jimdshea much easier to cool meaning lower temps and fan speed, doesn't heat up your room as much, and in some places can save quite a bit of money. the first thing is a big issue, I have 12900K and with good 360 AIO temps can reach 90 quite easily (like sometimes in overwatch when CPU usage is high for some reason). also it can save you some buck when choosing a cooler (if you buy a new one).
I'm glad I built my first high end gaming rig with AMD Ryzen 7 7800 X3D processor + AMD RX 7900 XTX graphics card. Best decision, and thanks to Steve's benchmarks for enabling me to take an informed decision.
@@Teferilol I chose the AMD graphics card so that I could leverage the Smart Access Memory. Moreover, I don't play any Ray Traced Games, so I had no intention to buy Nvidia 4080. Lastly, it was on a slight discount compared to the 4080.
@@riddlex i chose it, because it was 250 cheaper which is kinda big deal. And i play with rt enabled almost everywhere on 7900xtx in 4k, so not so big difference with 4080 after all, except cyberpunk2077, but i dont know anyone who would overpay so much for 1 game.
@@NavneetvaioOn B650E Aorus master and 980 pro, not long at all. The first boot with DDR5 memory training takes time and when you tune the timings or turn expo on (first boot).
Upgraded from a 5800X. Boot times are good, even a little bit quicker than my ASRock X570 Taichi, (using Asus B650E-E). Bold if you to assume I am American, clearly you are lol, in my country the 7950X3D is way way more expensive, plus I only game do dont need anything more.
Why doesn't Intel come out with some 3DVCACHE CPUs of their own. If now all of a sudden, games are now being optimized for extra cache, then intel doesn't need to miss out
Have the 7800x3d with 4070ti, i thinks it's the most efficient and coolest pc i've ever had. Literally the hottest it get 61° on the gpu when 100% in furmark or when using heavy reshade profiles. Avg temp in games is like 38 cpu and 47/52 gpu (perf mode bios on the ti)
@@waldemarhahn1709 Obviously not water. I see those kinds of temps with a 6900XT on water. It would be a massive waste of money to watercool a 4070 Ti, because the expense of a loop would literally buy you a better GPU and the GPU just isn't that hot.
@@samueleinzaghi8964 i got the 4070 for even more efficency, clocked at 2500mhz with a power draw of 100Watts! 40 series is really amazin power draw wise
i dont see how 4k is unreasonable, its been a thing for long enough it should be easier for low/mid range to do by now but games keep coming out optimized like pure garbage.
@@bradhaines3142 If developers took the effort to optimize their new titles as they SHOULD be doing 1440p monitors would end up being the new 1080p resolution of choice and 4K would eventually be the new 1440p. But instead of making better and more powerful hardware you have Nvidia offering band-aid solutions with their stupid upscaling and fake-frame generation crap. Which makes developers cut corners. *cough-Remnant 2-cough*
same, so glad i got the 7800x3d(especaly with the starfield combo since i was gonna get that game anyways) hope amd stays competetive in the GPU market for next gen, i really hope they gonna have something worth switching my 6900xt out for
7800x3d is mediocre mostly because of weaker multicore. If all you do is game, sure. But if you do any video editing / rendering task or other cpu heavy task i7-13700k smokes 7800x3d both in raw single and multi core.
@@garyb7193 most people still don't need more than 4 or 6 cores with SMT lol. One of my PCs was running a 6700K until a month ago and was only upgraded because Microcenter had insane sale prices on 12th gen i5s.
This CPU is insane. I have a 7800X3D with a -30 CO (undervolt) all cores cooled by a Arctic Liquid Freezer II 360 (rev. 7). The son of a b*tch while running Aida64 stress test reaches 66º, while consuming 60w (measured using Aida64 tool, not 100% accurate but couldn't be that far from the real thing) and being 100% stable. It's f*cking ridiculous. Have it since launch, couldn't be more happy.
User Benchmark says the 13900k (and actually they're really pushing the 13600k) is leagues better than the 7800x3d for everything including gaming. Anybody remember that 2kliksphillip video on why Muser Benshark is a bad place to go for review and comparisons? Pepperidge farm remembers.
I’ve not long upgraded to the 5800X3D from a 1600X on a B350 motherboard, insane platform longevity and crazy upgrade. The difference in 1% lows is wild. If I was to be building a brand new PC from the ground up however, the 7800X3D would be a clear choice. It offers such good gaming performance, efficiency and at a price that’s decent vs the competition.
dont bother with a 7800x3D if you already have a 5800x3D, look for 13900k vs 5800x3D benchmarks (think techpowerup had one) with like 50 games tested and it was only 5% slower overall, let that 5800x3D mature a little or dabble in some curve optimizer settings (if you can manage a -20 or higher offset) a 5800x3D wipes the floor with a 13900k at far less heat/power
@@123TheCloopwell too optimistic i think but it can match 900k at some point coz architecture of zen is really good for gaming coz interlinked with huge cache with only 8 cores while intel has cache distributions like zen1 with 2 chiplet type.
@@123TheCloop oh no I wouldn’t, I just mean if I was in a position to build a brand new system from scratch, that’s the CPU I would choose. Definitely a monster.
As someone who runs a resolution closer to 4K than it is to 1440p, I just want to say thank you for including 4K results when y'all can in cpu reviews like this. Recently got a 4090 & now I'm stepping up my setup from a 5950x to a 7800X3D, partly thanks to all the info gathered from y'all's reviews & others on the 14900k & 7800X3D/7950X3D.
Great video. I really think this benchmarks would of been more interesting if you had some previous generations cpus, matched up with the 4090. Like i7 9700k or maybe 5800x3d ... maybe :)
Thank you for this comparison! And the mountains of data. I have both these processors in separate and very different systems, (i9 in oct 22 because i have spending issues, just finished AMD build this month) 13900k/7200 RAM/Suprim X 4090 in a Corsair 5000T (UWQHD) and a 7800X3D/6000 RAM/4090 FE in a Dan C4-SFX (4K). I9 is on a AIO and R7 is Air Cooled. R7 build also cost me half the price of i9. When you consider the above, I found a roughly max 6% difference between these two systems when doing 3DMark tests at 1440p & 4K, even with the i9 at stock uncapped settings. The efficiency of the AMD, and it's gaming capability is really a game changer (sorry), even more so when you consider the i9 is in a 75L case, and the R7 is in a 14.7L case (that's 80.4% smaller size). If you aren't doing anything CPU intensive daily but are gaming, get the 7800x3d. Save yourself lots of watts. (Sorry again).
Thanks for the video. perfect timing, a friend of mine is going to build a new system and I was talking to him about these 3 cpu's .. microcenter has some good intel combos but that being said i run the 7800x3d
I wish you could run the same test using the 7900XTX @ 1080P / 1440P (4K testing is a waste of time bc its gpu bound). It'd be interesting to see if SAM really works better than resizable bar,and,this card tends to beat the 4090 in 1080p in some scenarios due to lower driver overhead.
Yeah 4k is a waste of time for Cpu testing. But if they dont Test it, Many people get angry at them and say that with these Specs nobody would play at 1080p.
In praise of 4K testing : it may just show that these CPUs offer no difference in performance but this is important knowledge. Using a 4090 there are going to be plenty of people who target 4K. For us the CPU decision is either 1) what is the cheapest CPU that still supports good 4K gaming ? Or 2) if someone isn’t budget constrained the question then is , is there any performance trade off between productivity and gaming with a CPU at “any price” ? Your review really shows the Intel chip is great for the non budget constrained gamer who wants a mixed use system that can game at 4K and do productivity work. I would love to see a video on “what is the cheapest 4K gaming system you can buy”. Maybe also look how well an older system like AM4 performs versus newer low end CPUs. The conclusion is likely to be that like perhaps far older CPUs are still great for 4K gaming. There’s a lot of attention these days on 1 per cent lows - does that even show up when running a 4090 against a R3600 and a 7800X3D? How much? You could have a running spot on what is the cheapest way to build a 4K rig.
I love the power consumption charts. Would also be nice if you could maybe add idle to it to as my computer sits idle for most of the time during work or web browsing.
What I'm really missing from this comparison is test of simulation heavy games, like total war turn times, factorio megabases or paradox titles like stellaris or CK3. I know there aren't really any canned benchmarks you can use for that type of game, but there are ways to test it.
7800x3d is a Monster. I went from 5800x3d to a water cooled 13900k and just built my son a 7800x3d water cooled (all with 4090's) and im blown away by the 7800x3d. I never would have went with intel if i knew how efficient it was going to be and still be a monster in games. (Coming from a 5800x3d i should have guessed, though, haha)
Will you ever do a benchmark with a 4090 (and 4k + all the bells and whistles of course) and a whole bunch of cpu's, to see how low can one go until the cpu actually starts to slowing the gpu down? In the above case we can see that at 4K the cpu doesn't matter but I'm curious what's the cheapest cpu one could use and still get the maximum out of the gpu.
On the verge of finally….FINALLY upgrading my pc with a new build and am torn between 7800x3d and 13700 or 900. Money isn’t an issue, and I generally just do gaming/sim racing. I’m really looking for general system stability and reliability…all that said, would you suggest sticking with intel or going for the amd route. Thanks and cheers!
if you are sim racing and just gaming, then just go with the 7800X3D, easier to cool, sips a lot less power, and will perform considerably better in Sim, as well as Unity games.@@christophersmith8028
Thanks for 4k results. Always good to have at least one solid evidence once in a while (considering no one besides you is willing to do this painful job) PS: Genuine ACC exclusive players will appreciate very much 👍
It's funny because the mean and median don't do a great job of reflecting the difference, but then you look at certain games and it's a pretty big difference. Most significant is the amount of power used, though. You're usually using 100W+ more in a 13900k for very similar performance.
Just on the 4K results - I appreciate it might not seem worthwhile, but it's still useful for someone building a new PC specifically for 4K - giving them re-assurance that they can use whichever platform is more cost effective without leaving any performance on the table. As new GPUs are released, that's not necessarily a given.
No, you can use the 1080p result to know what you want, the CPU that wins in the 1080p result will also wins in a possible scenario where 4K is not GPU bound. 4K results are STUPID and a loss of time.
@@GhOsThPk The 1080p result tells you about *future* performance at 4K, not *current* performance at 4K. If nobody is tests performance at 4K with current GPU hardware, then you don't *know* the CPUs will perform the same with current hardware, all you have is an assumption.
thank you for the video :) going in blind, i don't know, what cpu is gonna come ahead, but i'm very happy with my 3d. In the end, i chose amd, for the same reason i chose my 1700x ~6(ish? ) years ago. Platform longevity. With that said, being fully on water, with a monoblock, it means even more for me. May the best cpu win.
Great review as always but I think that 4K results are still important. Yes 4K results don't show which CPU is better than the other but that's the whole point in my opinion. Many people don't realize that if you buy an RTX 4090 you will probably play at 4K which means that you don't really care about which CPU is faster since you will probably be GPU limited. Now maybe with future generation GPUs it will make sense to get a 7800X3D for gaming at 4K cause maybe at that point you will be CPU limited. Now if you buy an RTX 4090 and that you play at 1080p or 1440p (I mean it's your money, you do whatever you want with it), then yes those benchmarks make sense. At the end of the day, my point is that with a realistic usage (so an RTX 4090 for 4K gaming), you will probably care more about other stuff than CPU bottleneck like power consumption or upgrade path.
Great work Steve. As a 4K user it is always interesting to see the differences at this resolution of the different hardware. I think a 7950X3D vs 13900K comparison would be great, comparing productivity performance as well as gaming.
Heres a hint. Cpus do not care about resolution much. If it can hit 144fps at 720p, it can get close to it at 4k assuming the gpu isn't a bottleneck. If you want to test for yourself. Run some tests on your own games at sub 1080p. See how lowering rhe resolution doesn't change the fps if you don't have a gpu bottleneck.
awesome vid - good data. As you bring the ACC benchmark that is happy for the v cahce, i would LOVE to see it compared to an iRacing replay. as its really not optimized- and raw performance should be exposed clearly. but really would love to see it compared to the ACC. :)
With the newest Bios update for AMD allowing for faster memory, I would be interested in a comparison of 6000MT versus the new stable top men speeds for both X3D and non-3D
Do you realize that in your Hetzner sponsor segment, you show a server with an Intel 13900 at 00:57.. Back when this video was filmed, this wasn't the case but now these CPU's are known to be causing issues with servers :)
I suppose the video was already in the making during the new update on AMD CPUs thus using the "slow" 6000 Mhz memory. I was curious to see how much better it now works with 7000+ memory.
The 6400mhz+ memory does not run at 1:1 UCLK:FCLK Infinity Fabric due to memory controller limitation - for gaming performance to be matched to cl30 6000, one would need approximately a cl36 8600 running at 2:1, which is an insane ask and only double digits of all DDR5 RAM sticks out there could do so stably.
It doesn't. Some CPU's might be able to achieve stable 6400. But otherwise if you go for higher memory speeds, you get 1:2 ratio. Below 7800 MT/s this means less or same performance at best. And it's hard to make 7600, much less 7800 and 8000 stable.
I got the 7800X3D and undervolted it, not only did that give me better performance by about 3-7% depending on the game. It also has never gone above 55C even in cyberpunk. This thing absolutely sips power. Compare that to my roommate's 13900k, and his cpu is always sitting at 90C in games and draws a ton of power. The one down side of having gotten the 7800X3D over my old CPU is that my room doesn't heat up as much during cold times anymore since this cpu literally just doesn't run hot when you undervolt it.
"Always sitting at 90C in games" is nonsense, unless the guy has stock intel cooler from the 90s. Go watch actual gaming footage, e.g. by TestingGames. i7/i9 runs like 5-10C hotter at the most compared to the x3D.
@@AR-ey1ur not if your room gets hot and eventually increases your ambient temperature higher. And no he doesn't have a stock cooler. You literally can see that the Intel cpu uses 94 more watts on average and that's stock. Once you undervolt the 7800x3d that is closer to 100 watts. Idk if you know what 100 watts is, but that is a lot of heat output. Stop fan boying.
@@z_wulf 100w watts is 1 old light bulb. not really that much power, the issue with a cpu is the density of the power, if you touched the cpu at 100w youd burn yourself instantly. also ambient doesnt make that big of a temp difference, its kind of linear. you warm up the room 3 degrees, your temps go up around 3 degrees. thats why gamer nexus rates coolers with 'delta over ambient' meaning meaning the temp difference between ambient and the temp its running at.
I upgraded to a 7800x3d with ddr5 6000 from an I9 9900k with ddr4 3600 and the difference felt like a gpu upgrade. All while using a lot less power. Love that little cpu. Edit: this is with the same gpu, a stock 3080ti.
I think this isn't mentioned enough, benchmark videos always show native resolution tests. I saw massive increase in fps with my 3080 when going from AM4 to AM5 because I play with DLSS at 1440p.
Could you tell me what components you used to build the computer? It's my first time and I would like to ensure that I can play Cyberpunk 2077 smoothly.
Heh, uncanny. You upgraded from the exact same CPU+Mem that I currently have to the CPU+Mem that I'm currently considering. With the impressive future prospects (8800x3d?) of the AM5 socket, I'm sold on this now.
Did you turn off the MSI MPG Z790 Carbon WiFI? 2:25 Would that cause the extra total board power consumption to be that high? Apples to apples comparison matter of course. The Gigabyte X670E Aorus does not feature built-in WiFi.
same here. I upgraded to the 13700k early last year and have been thinking on what next, but there is really only a few more frames diff in some of these and several are the same. I think we have got to a time where fast is just fast and spending more money really not going to get you much.
100% agree. My goal is to remain always just slightly gpu bound and kept my CPU as long as possible. Playing at 4k is comfortable now and allows just that.
6000 MHz RAM is no longer the sweet spot for the 7800X3D as of the latest BIOS update, it can now support RAM with speeds up to 8000 MHz, but the memory controller now runs at 6400 MHz so that's the new 1:1 sweet spot. I overclocked my 6000 MHz CL30-36-36-76 corsair vengeance RAM to 6400 MHz CL30-38-38-80 with the infinity fabric overclocked to 2133 MHz to again match that 1:1:1 ratio and it runs like a dream.
i love my 7800x3d , bought it on sale for 400 euro this cpu is just perfect for itx builds efficient , powerful and easy to cool i have fractal design ridge build xD
Yup, walking around a crowded town in MMOs and my FPS remains 60 at ALL times, never drops below. Now just gotta wait for GPU pricing to reach sane price levels :P
@@mikfhanI gave up waiting after the COVID/crypto mining boom; I don’t think they’ll ever drop to reasonable levels again so after twenty years of Nvidia I’ve switched to AMD.
i9 needs to have 10+ P cores. Being called i9 but having same cores as i7 is dumb. On the contrary the i7 looks the best for value per dollar. $150 cheaper for 11 less fps.
Always appreciate all of the hard work you guys put into your videos. I honestly think you guys are the best in the biz, and you’re my go to slice for reviews and benchmarks. Keep up the great work!
I have been buying only Intel for the past 20+ years but my next purchase (before end of year) is going to be AMD. Seems like the better choice. Update: I just bought a 7800X3D. Loving it!
The only thing helping Intel at this point is the marketing I think saying something like: “u r cool if u use Intel” will be enough to win allot of dumb sheeple on their side I mean look at Nvidia VS Radeon for example
Both of them generate crazy amount of fps. I'll stick to my 5800x3d for good few years. With 1440p 144fps monitor even with 3070 im matching the refresh rate at least most of d the time
to be fair I'd still buy fast ram on AM5 cause 1) it's not that much more expensive, especially if you're spending this kind of money on cpu 2) you're likely to be able to take advantage of it with future AM5 cpus 3) new Agesa shows that current motherboards can run them faster, it's just down to the IMC which should get better with Ryzen 5 hopefully
If you're gonna spend extra on memory, you probably gain more for gaming by buying a 6000 kit that can hit very low latencies (manually tune it for more gains), rather than faster clocking memory.
@@blue-lu3iz I'm not talking about "expensive memory", I'm talking about spending 10 or 20 dollars more on ram that can do up to 8000 mhz as opposed to one that can barely do 6000, it's not that hard to understand, it's not about future ram being faster, it's about current memory already being very fast (and cheap) but the IMC can't keep up
Regarding power, I still have fond memories of when I installed my 4090 and benchmarked the new witcher 3 bells and whistles. I tripped a circuit breaker 🤣
I think an optimized 13900k with undervolt can be more efficient than stock, it would be interesting to do benchmarks like this. But overall i9 are not for gamer when you know it has 24 core.
The problem is that you can't benchmark an undervolt because not all chips can achieve the same undervolt. Same reason they don't undervolt the x3d, which gains a lot of thermal headroom at -30mv all core. Regardless the intel chip uses a lot more power and you need more expensive ram.
Yep, Intel has lots of headroom for whatever reason, you can under-volt the heck out of the 13 Gen without losing performance. Though on AMD side, even a slight under-volt will cause a huge performance loss.
@@asdf_asdf948 Alright I got to admit my data is pretty outdated, I had a Ryzen 3600 in my other system and this is what I've discovered. Did AMD change the algorithm for PBO?
Thanks for nice and informative video 😃 PS: also maybe someone knows or have benches where 7800x3D is compared with i7/i9 on idle / web-browsing scenarios for total system power consumption. I think there are a lot of gamers who games 3-4h a day and the rest 8-10h hours is just idling or web-browsing.
Once you can hit 4k +60fps all that matters is power consumption and saving money. Someone really should do a build series trying to hit that sweet spot of 60fps 4k high/ultra with the lowest power consumption that gives the best bang for buck.
@@Superiorer 144hz at 4k? ;) if you can do 60fps @4k you can probably easily do 120-144 at 1140p/1080. Wont be long till 1080 is obsolete on desktop monitors and 4k is new standard.
Thank you for the 4K comparison, I only game in 4K and these tests will help me a lot making upgrades in future if needed. Continue 4K in your tests please ❤
It's not even close after you start running -25PBO, I went from 4.5Ghz all cores to 4.9Ghz this increased performance incredibly in games. It was already faster than the 13900K now the 13900K doesn't beat it in any game. Best part is it's air cooled with my 2011 NH-D14.
But reviewers should avoid any overclocking, because those who give OC results soon are fed golden samples by marketing, giving unrealistic results not reproducible by consumers. The curve optimiser undervolt too can need tweaking to be stable running ST, the fastest cores may tolerate little undervolting even when undervolting works super stably in all core workloads.
@@erichall090909 Nope, it should be restricted to a sane power limit. There's been far too many tricks by mobo BIOS and CPU vendors to score narrow benchmark wins by throwing a ridiculous amounts of watts at the problem. The Intel i9 is wasting on average ⅓ of the entire PS5/XBOX power.
Then it just becomes an arms race of who can OC better. It doesn't help that Intel has gone nuts with letting board partners run wild with power limits.
Appreciate why you are using a 4090 to test the CPU performance, but would be good to also do maybe one test where you step down the GPU grades and show the point where the CPU becomes irrelevant. I suspect for most gamers on mid-low range GPUs both of these processors would be massively overkill. It would be a lot of work I guess but a matrix of recommended minimum CPU GPU would be kind of cool.
The cpu is already irrelevant in all of these tests. If anyone actually bought a 4090 they will be in 4k and you could use a 5 year intel cpu and get same frame rate. Just like if you can only afford a 3060 or 3070 every game will be gpu limited so the cpu don’t matter.
@@Jimmys_TheBestCop Yes this is my gut feeling too. I suspect that 99% of the people buying the 7800X3D will end up GPU limited whatever they are playing. I guess that was my point for this video, it could have done with one of those 'Who should buy either of these products for gaming?' sections, to which the answer would be 'only morons' running at 1080p with a 4090 on non competitive games (which are rarely CPU limited anyway and players turn the settings down for lower latency anyway). The 7800X3D feels like a bit of a non-product, I don't get it's segmentation, at least the 13900K has other use cases.
It an "enthusiast" CPU. The only problem is no enthusiast can afford current gen GPUs. There are almost no current gen gpus on Steam. Whoever is buying current gen gpus are either not gaming on steam, which seems nearly impossible, or are using it for production and or AI. Watch Tech Yes City video where he does the first ddr4 8 core chip i7 5960x vs the last ddr4 8 core chip the Ryzen 5800x3d. Using a 4090 at 4k they were exactly the same. Even at 1080p the nearly 10 year old chip finished within 20% fps of the 5800x3d in many games. You could probably purchase any of the last 3 gen CPUs and you will still be limited by whatever GPU you can either find or afford.
@@Jimmys_TheBestCop The 4090 on steam is 0.56% that is about 1 out of every 200 gamers, and is a similar stat to the 3090s. Its a small percent, but that is still a lot of cards in real terms. I do know some people in 3D design who all bought one, so that is another use case for them I guess. I still think it is besides the point whether you are gaming, doing 3D Rendering, doing AI or other CUDA tasks, anyone with a 4090 would be making a mistake buying the 7800X3D. It's either overkill for gaming, or undercooked for productivity. I really can't find a compelling case for it.
@codemonkeyalpha9057 It's cheaper, more power efficient, and has Vcache that can dramatically increase performance on certain games. Plus it's not bad for productivity either, just not the best. It has more than enough cores to power through tasks at it's price bracket. I honestly don't know how you can say it's a non product when your logic also makes the 13900k a non product as it's not needed in 4k with a 4090. Unless your a rapid fan of only intel, that is.
I would be very interested to see how the 13900k would perform in Watts/FPS if you undervolted it, and what the performance difference would be if you ran it under a powerlimit similar to the TDP of the 7800X3D. I did watch a video, where the 13900k managed to achieve somewhat similar gaming(!) performance under a powerlimit to what you could expect without it.
Intel architecture are way way below AMD for now. Its doesnt matter how good 13900k is, AMD can just try little bit and make a better things. Man the intel lose to architecture it just a losing battle, even if they got helped by Microsoft Windows P E cores system.
We want to see the 4k benchmarks in order to know the minimum level of CPU you need with a 4090, which mostly only makes sense to own if you play at 4k. It's expensive and saving a few bucks here and there is desirable 😅. Thanks Steve.
I don't see your point, if the CPU get X fps at 1080 doesn't get the same amount at 4k if the GPU can provide enough frames? So checking GPU performance at 4k and CPU perfomance at 1080p should be enough to see where is the GPU bottleneck and buy a CPU closer to that performance
@danavidal8774 I don't understand what you are saying. Maybe double-check your grammar? I'm interested in what's the cheapest, most power efficient CPU I can use and still get the max performance with a 4090 at 4k. For example, it seems like using a 13700k for me is fine because using a higher-end CPU is pointless.
@@jonathanmatthews5245 I would say the 13700k is high end too and overkill A CPU able to get 120fps or more at 1080p is able to get 120fps at 4k too so you only need a GPU that is able to get 120 fps at 4K, increasing the resolution does not increase the workload for the CPU
Thanks for taking the extra time to test in 4k. Personally, I find it very useful, since for me testing CPU performance in isolation is pretty pointless - if I only saw CPU tests in 1080p low, I'd expect to get similar performance improvements when switching CPU while playing in 4k, which is not the case.
On the power consumption slides, three of the titles that had the closest difference were games where 7800X3D provided 15% more performance! So the real performance per watt was far worse than the 92watts on average.
What about for gaming+ streaming to Twitch/UA-cam on the same PC? I'm trying to figure out if intels little E cores actually have any value for what i do
I'm one of those ones that appreciate the 4K data. If I had a 4090 I'd be gaming primarily at that resolution. I'd love to see a video testing CPUs with a 4090 to see where CPU bottlenecks start to show in various titles.
Love the 7800x3d, especially after i tuned it down by 1% and went from 85 under full load to 60. Didnt notice a performance difference at all. And this is using a z73 360.
After dealing with the heat from a 12900k and 3090 I am really wanting more a more power efficient PC. My AC can't keep up and it's like I'm sitting next to a heater.
It would be nice if you guys included some single player Japanese games like Yakuza Like a Dragon, Persona 5, Final Fantasy VII Remake and so. Also considering CPUs are important on PS3, 360 and Switch emulation, testing games on those emus would be helpful.
SOOOOO close to a Mil! Glad to see you guys are still offering the best testing and tech opinions out there!
Thanks Timmy, nice to hear from you mate 👍
Are we gonna see a return of Timmy Joe? I really loved watching your tech reviews, you just had a different spice to your videos, especially the newd plebs on the internet series
They are the best channel and put the most work - so well deserved 👍
Subbed to you BTW - love your channel too 👍
a mil is 1000... en.wikipedia.org/wiki/Per_mille mille/mil was always 1k
Jesus Christ it's the man, miss your content bro. You had a way of cobbling together wacky stuff and making it fun, would love to see more C2Q stuff in 2023! Hope you're doin well
7800x3d absolute efficiency monster.
@@jimdshea less heat, can get away with weaker cooler, better electricity bill
@@aerosw1ftnope
@@VenatFFXIVyup
@@jimdshea much easier to cool meaning lower temps and fan speed, doesn't heat up your room as much, and in some places can save quite a bit of money. the first thing is a big issue, I have 12900K and with good 360 AIO temps can reach 90 quite easily (like sometimes in overwatch when CPU usage is high for some reason). also it can save you some buck when choosing a cooler (if you buy a new one).
@@DrLoomis666yeah but you buy a 500$ dollar cpu for gaming, not youtube
I am fine with 5800X3D and waiting for 8800X3D :). But 7800X3D has impressive performance and power usage for that price.
same boat here, my 5800x3D+RTX3080Ti combo will hopefully last me two generations.
@@dawienel1142 I have 7900XT with it, totally fine for every game today :)
I am waiting when you're going to get 8800X3D for me to get your 5800X3D
yeah id wait it out with that kind of cpu.
5800x3d here 😁
Upgraded from 3500 😉
Best deal ever 😊
I'm glad I built my first high end gaming rig with AMD Ryzen 7 7800 X3D processor + AMD RX 7900 XTX graphics card. Best decision, and thanks to Steve's benchmarks for enabling me to take an informed decision.
nice rig
I got same build, couldnt chose between 7900xtx and 4080, but ultimatly did find 7900xtx for 800$.
@@Teferilol I chose the AMD graphics card so that I could leverage the Smart Access Memory. Moreover, I don't play any Ray Traced Games, so I had no intention to buy Nvidia 4080. Lastly, it was on a slight discount compared to the 4080.
@@riddlex i chose it, because it was 250 cheaper which is kinda big deal. And i play with rt enabled almost everywhere on 7900xtx in 4k, so not so big difference with 4080 after all, except cyberpunk2077, but i dont know anyone who would overpay so much for 1 game.
Yeah I got the 7800X3d but went with the 4080... Amazing performance that CPU
Thanks for including 4k numbers. Really interested in the lows there.
@@Tpecep yep, noticed that.
They still don't get why we want to see that.
Just upgraded to a 7800X3D and B650E motherboard, so far it's been great.
what did you have before?
How are rhe boot times ?
@@NavneetvaioOn B650E Aorus master and 980 pro, not long at all. The first boot with DDR5 memory training takes time and when you tune the timings or turn expo on (first boot).
@@blue-lu3izIf you don’t need more cores and don’t want to play around with the CCD with 3D V-cache and the one without.
Upgraded from a 5800X. Boot times are good, even a little bit quicker than my ASRock X570 Taichi, (using Asus B650E-E).
Bold if you to assume I am American, clearly you are lol, in my country the 7950X3D is way way more expensive, plus I only game do dont need anything more.
The watts per frame is impressive!
More powerful and better efficiency
for which cpu?
@@walllec Watch the video?
@@walllec 7800x3d
Why doesn't Intel come out with some 3DVCACHE CPUs of their own.
If now all of a sudden, games are now being optimized for extra cache, then intel doesn't need to miss out
Thank you HU for constantly pushing with the best benchmark content out there!
@spiderpig1736 The temperature doesn't matter and the total power draw favored the 13900K because it is slower
@spiderpig1736 this benchmark is a meme to shut the idiots up who keep asking for cpu comparisons between 1080/1440/4k
So glad that I grabbed a 7800x3d. It's been amazing for me and it's just in a class of its own in terms of power efficiency.
Have the 7800x3d with 4070ti, i thinks it's the most efficient and coolest pc i've ever had.
Literally the hottest it get 61° on the gpu when 100% in furmark or when using heavy reshade profiles.
Avg temp in games is like 38 cpu and 47/52 gpu (perf mode bios on the ti)
@@samueleinzaghi8964 nice =) may i ask, what cooling solution you use ? water or Air ?
@@waldemarhahn1709 air. Using an nh-u12a with a bunch of salvaged 120mm fans also from Noctua. Case is the Fractal North (glass).
@@waldemarhahn1709 Obviously not water. I see those kinds of temps with a 6900XT on water. It would be a massive waste of money to watercool a 4070 Ti, because the expense of a loop would literally buy you a better GPU and the GPU just isn't that hot.
@@samueleinzaghi8964 i got the 4070 for even more efficency, clocked at 2500mhz with a power draw of 100Watts! 40 series is really amazin power draw wise
Thank you for including 4K data for us unreasonable people, I genuinely mean that.
Gotta see those flat bargraphs. Very interesting.
Ahh yes this 4k GPU wall is in fact a wall … hmmmm
Exactly, finally we can see them.
i dont see how 4k is unreasonable, its been a thing for long enough it should be easier for low/mid range to do by now but games keep coming out optimized like pure garbage.
@@bradhaines3142 If developers took the effort to optimize their new titles as they SHOULD be doing 1440p monitors would end up being the new 1080p resolution of choice and 4K would eventually be the new 1440p. But instead of making better and more powerful hardware you have Nvidia offering band-aid solutions with their stupid upscaling and fake-frame generation crap. Which makes developers cut corners. *cough-Remnant 2-cough*
Definitely won't regret my choice of the 7800X3D for quite some time.
same, so glad i got the 7800x3d(especaly with the starfield combo since i was gonna get that game anyways) hope amd stays competetive in the GPU market for next gen, i really hope they gonna have something worth switching my 6900xt out for
For me it will be exactly as long as it takes for the 8800x3d to come out lol
7800x3d is mediocre mostly because of weaker multicore. If all you do is game, sure. But if you do any video editing / rendering task or other cpu heavy task i7-13700k smokes 7800x3d both in raw single and multi core.
@@deivytrajan but if you do any of that then why wouldn't you get a i9 or 7950
Great cpu👌
Go the 7800x3D!
With starfield promo 😁
It's great to see competition like this. In the end it benefits all gamers.
Yeah, when Intel had total dominance, we saw higher prices and the industry stuck on 4 and 6 cores for nearly a decade.
@@garyb7193 most people still don't need more than 4 or 6 cores with SMT lol. One of my PCs was running a 6700K until a month ago and was only upgraded because Microcenter had insane sale prices on 12th gen i5s.
It's crazy how efficient the 7800X3D is. Thanks Steve!
This CPU is insane. I have a 7800X3D with a -30 CO (undervolt) all cores cooled by a Arctic Liquid Freezer II 360 (rev. 7).
The son of a b*tch while running Aida64 stress test reaches 66º, while consuming 60w (measured using Aida64 tool, not 100% accurate but couldn't be that far from the real thing) and being 100% stable. It's f*cking ridiculous.
Have it since launch, couldn't be more happy.
Would be nice to see an idle/browsing power consumption comparison along the game power consumption comparison
Testing on our boards (with the latest BIOS) the 7800X3D system idles at 86 watts and the 13900K system at 89 watts.
A comparative with the 7700x would also show interesting info.
User Benchmark says the 13900k (and actually they're really pushing the 13600k) is leagues better than the 7800x3d for everything including gaming. Anybody remember that 2kliksphillip video on why Muser Benshark is a bad place to go for review and comparisons? Pepperidge farm remembers.
Wow the watt readings with 4090 and 7800x3d is the sa.e as my i7 7700k with 3070. Crazy af
I’ve not long upgraded to the 5800X3D from a 1600X on a B350 motherboard, insane platform longevity and crazy upgrade. The difference in 1% lows is wild.
If I was to be building a brand new PC from the ground up however, the 7800X3D would be a clear choice. It offers such good gaming performance, efficiency and at a price that’s decent vs the competition.
dont bother with a 7800x3D if you already have a 5800x3D, look for 13900k vs 5800x3D benchmarks (think techpowerup had one) with like 50 games tested and it was only 5% slower overall, let that 5800x3D mature a little or dabble in some curve optimizer settings (if you can manage a -20 or higher offset) a 5800x3D wipes the floor with a 13900k at far less heat/power
@@123TheCloopnot true
@@123TheCloopwell too optimistic i think but it can match 900k at some point coz architecture of zen is really good for gaming coz interlinked with huge cache with only 8 cores while intel has cache distributions like zen1 with 2 chiplet type.
@@123TheCloop oh no I wouldn’t, I just mean if I was in a position to build a brand new system from scratch, that’s the CPU I would choose.
Definitely a monster.
@@123TheCloop he said "If I was to be building a brand new PC from the ground up"
As someone who runs a resolution closer to 4K than it is to 1440p, I just want to say thank you for including 4K results when y'all can in cpu reviews like this. Recently got a 4090 & now I'm stepping up my setup from a 5950x to a 7800X3D, partly thanks to all the info gathered from y'all's reviews & others on the 14900k & 7800X3D/7950X3D.
Faster (even if not by not), cheaper, and uses less power. Seems like a winner to me.
Great video. I really think this benchmarks would of been more interesting if you had some previous generations cpus, matched up with the 4090. Like i7 9700k or maybe 5800x3d ... maybe :)
Thank you for this comparison! And the mountains of data.
I have both these processors in separate and very different systems, (i9 in oct 22 because i have spending issues, just finished AMD build this month) 13900k/7200 RAM/Suprim X 4090 in a Corsair 5000T (UWQHD) and a 7800X3D/6000 RAM/4090 FE in a Dan C4-SFX (4K). I9 is on a AIO and R7 is Air Cooled. R7 build also cost me half the price of i9.
When you consider the above, I found a roughly max 6% difference between these two systems when doing 3DMark tests at 1440p & 4K, even with the i9 at stock uncapped settings.
The efficiency of the AMD, and it's gaming capability is really a game changer (sorry), even more so when you consider the i9 is in a 75L case, and the R7 is in a 14.7L case (that's 80.4% smaller size).
If you aren't doing anything CPU intensive daily but are gaming, get the 7800x3d. Save yourself lots of watts. (Sorry again).
Thanks for the video. perfect timing, a friend of mine is going to build a new system and I was talking to him about these 3 cpu's .. microcenter has some good intel combos but that being said i run the 7800x3d
And u get starfield with it 😁
I wish you could run the same test using the 7900XTX @ 1080P / 1440P (4K testing is a waste of time bc its gpu bound). It'd be interesting to see if SAM really works better than resizable bar,and,this card tends to beat the 4090 in 1080p in some scenarios due to lower driver overhead.
Yeah 4k is a waste of time for Cpu testing. But if they dont Test it, Many people get angry at them and say that with these Specs nobody would play at 1080p.
Agreed
@@zalankhan5743 yes, i plan to get a 4k tv, i would be furious without 4k testing,
Also remember that AMD has an update that now gets 7200 memory to work on Ryzen.
SAM is rebar, just rebranded. They've done videos on that in the past IIRC. Every test was within a percent or so.
man the 7800X3D is fucking nuts
@@thenutsonyourchin Yeah and this testing didn't even include Microsoft Flight Simulator, that cpu is nuts in that game, look it up.
Could be interesting to see how they fare on the XTX as well.
Just bought myself a 7800X3D with 4070Ti, what a beast!
In praise of 4K testing : it may just show that these CPUs offer no difference in performance but this is important knowledge. Using a 4090 there are going to be plenty of people who target 4K. For us the CPU decision is either 1) what is the cheapest CPU that still supports good 4K gaming ? Or 2) if someone isn’t budget constrained the question then is , is there any performance trade off between productivity and gaming with a CPU at “any price” ?
Your review really shows the Intel chip is great for the non budget constrained gamer who wants a mixed use system that can game at 4K and do productivity work.
I would love to see a video on “what is the cheapest 4K gaming system you can buy”. Maybe also look how well an older system like AM4 performs versus newer low end CPUs.
The conclusion is likely to be that like perhaps far older CPUs are still great for 4K gaming. There’s a lot of attention these days on 1 per cent lows - does that even show up when running a 4090 against a R3600 and a 7800X3D? How much?
You could have a running spot on what is the cheapest way to build a 4K rig.
I love the power consumption charts. Would also be nice if you could maybe add idle to it to as my computer sits idle for most of the time during work or web browsing.
You really need some MMOs and factory games like satisfactory or factorio in these bench marks
What I'm really missing from this comparison is test of simulation heavy games, like total war turn times, factorio megabases or paradox titles like stellaris or CK3. I know there aren't really any canned benchmarks you can use for that type of game, but there are ways to test it.
Thanks for including the 4K information in your testing! Subscribed!
7800x3d is a Monster. I went from 5800x3d to a water cooled 13900k and just built my son a 7800x3d water cooled (all with 4090's) and im blown away by the 7800x3d. I never would have went with intel if i knew how efficient it was going to be and still be a monster in games.
(Coming from a 5800x3d i should have guessed, though, haha)
Will you ever do a benchmark with a 4090 (and 4k + all the bells and whistles of course) and a whole bunch of cpu's, to see how low can one go until the cpu actually starts to slowing the gpu down? In the above case we can see that at 4K the cpu doesn't matter but I'm curious what's the cheapest cpu one could use and still get the maximum out of the gpu.
I have the privilege of owning both, 7800X3D 1% lows are noticeably better consistently, but I love the productivity benefits of the 13900k
On the verge of finally….FINALLY upgrading my pc with a new build and am torn between 7800x3d and 13700 or 900. Money isn’t an issue, and I generally just do gaming/sim racing. I’m really looking for general system stability and reliability…all that said, would you suggest sticking with intel or going for the amd route. Thanks and cheers!
if you are sim racing and just gaming, then just go with the 7800X3D, easier to cool, sips a lot less power, and will perform considerably better in Sim, as well as Unity games.@@christophersmith8028
@@christophersmith8028 Did you watch the video at 9:10? If you prefer lower avg frame rates, and bigger dips in framerates, stick with intel.
@@jonboy602 Yeah if you got a 4090 and playing on lowest settings
Thanks for 4k results.
Always good to have at least one solid evidence once in a while (considering no one besides you is willing to do this painful job)
PS: Genuine ACC exclusive players will appreciate very much 👍
It's funny because the mean and median don't do a great job of reflecting the difference, but then you look at certain games and it's a pretty big difference. Most significant is the amount of power used, though. You're usually using 100W+ more in a 13900k for very similar performance.
Just on the 4K results - I appreciate it might not seem worthwhile, but it's still useful for someone building a new PC specifically for 4K - giving them re-assurance that they can use whichever platform is more cost effective without leaving any performance on the table. As new GPUs are released, that's not necessarily a given.
No, you can use the 1080p result to know what you want, the CPU that wins in the 1080p result will also wins in a possible scenario where 4K is not GPU bound.
4K results are STUPID and a loss of time.
@@GhOsThPk The 1080p result tells you about *future* performance at 4K, not *current* performance at 4K. If nobody is tests performance at 4K with current GPU hardware, then you don't *know* the CPUs will perform the same with current hardware, all you have is an assumption.
7700x would be a nice edition to include!!
You guys have built a really great channel......I love your B-roll and your outro music.....of course, the reviews are excellent too 😊
7800X3D + UV @ -15 per core, 110W PPT and 85C temp throttle limit, makes it even more efficient, while keeping the same performance.
7800X3D’s default PPT is around 95W.
@@PowellCat745
no that's the actual power Limit. The PPT is 160W
You kind of have to. I'm struggling a bit on air (NH-U12S) to keep the 5800X3D below 80C. Definitely not possible without an UV.
@@The_Noticer.
I am using a BeQuiet Dark Rock Pro 4. It's pretty good, even with non UVed CPU
Running at -30 per core is more like it when you have a premium board with good Load Line Calibration settings.
Thank you. These are the best graphs for CPU gaming benchmarks on youtube!
thank you for the video :)
going in blind, i don't know, what cpu is gonna come ahead, but i'm very happy with my 3d. In the end, i chose amd, for the same reason i chose my 1700x ~6(ish? ) years ago. Platform longevity.
With that said, being fully on water, with a monoblock, it means even more for me. May the best cpu win.
Great review as always but I think that 4K results are still important.
Yes 4K results don't show which CPU is better than the other but that's the whole point in my opinion.
Many people don't realize that if you buy an RTX 4090 you will probably play at 4K which means that you don't really care about which CPU is faster since you will probably be GPU limited.
Now maybe with future generation GPUs it will make sense to get a 7800X3D for gaming at 4K cause maybe at that point you will be CPU limited.
Now if you buy an RTX 4090 and that you play at 1080p or 1440p (I mean it's your money, you do whatever you want with it), then yes those benchmarks make sense.
At the end of the day, my point is that with a realistic usage (so an RTX 4090 for 4K gaming), you will probably care more about other stuff than CPU bottleneck like power consumption or upgrade path.
Great work Steve. As a 4K user it is always interesting to see the differences at this resolution of the different hardware.
I think a 7950X3D vs 13900K comparison would be great, comparing productivity performance as well as gaming.
Finally a reputable and complete comparision for 4k, thanks guys
Heres a hint. Cpus do not care about resolution much. If it can hit 144fps at 720p, it can get close to it at 4k assuming the gpu isn't a bottleneck.
If you want to test for yourself. Run some tests on your own games at sub 1080p. See how lowering rhe resolution doesn't change the fps if you don't have a gpu bottleneck.
awesome vid - good data. As you bring the ACC benchmark that is happy for the v cahce, i would LOVE to see it compared to an iRacing replay. as its really not optimized- and raw performance should be exposed clearly. but really would love to see it compared to the ACC. :)
With the newest Bios update for AMD allowing for faster memory, I would be interested in a comparison of 6000MT versus the new stable top men speeds for both X3D and non-3D
Do you realize that in your Hetzner sponsor segment, you show a server with an Intel 13900 at 00:57..
Back when this video was filmed, this wasn't the case but now these CPU's are known to be causing issues with servers :)
I suppose the video was already in the making during the new update on AMD CPUs thus using the "slow" 6000 Mhz memory. I was curious to see how much better it now works with 7000+ memory.
The 6400mhz+ memory does not run at 1:1 UCLK:FCLK Infinity Fabric due to memory controller limitation - for gaming performance to be matched to cl30 6000, one would need approximately a cl36 8600 running at 2:1, which is an insane ask and only double digits of all DDR5 RAM sticks out there could do so stably.
It doesn't. Some CPU's might be able to achieve stable 6400. But otherwise if you go for higher memory speeds, you get 1:2 ratio. Below 7800 MT/s this means less or same performance at best. And it's hard to make 7600, much less 7800 and 8000 stable.
Interesting results, the power efficiency really surprised me! Great job guys!
I got the 7800X3D and undervolted it, not only did that give me better performance by about 3-7% depending on the game. It also has never gone above 55C even in cyberpunk. This thing absolutely sips power. Compare that to my roommate's 13900k, and his cpu is always sitting at 90C in games and draws a ton of power. The one down side of having gotten the 7800X3D over my old CPU is that my room doesn't heat up as much during cold times anymore since this cpu literally just doesn't run hot when you undervolt it.
"Always sitting at 90C in games" is nonsense, unless the guy has stock intel cooler from the 90s.
Go watch actual gaming footage, e.g. by TestingGames. i7/i9 runs like 5-10C hotter at the most compared to the x3D.
@@AR-ey1ur not if your room gets hot and eventually increases your ambient temperature higher. And no he doesn't have a stock cooler. You literally can see that the Intel cpu uses 94 more watts on average and that's stock. Once you undervolt the 7800x3d that is closer to 100 watts. Idk if you know what 100 watts is, but that is a lot of heat output. Stop fan boying.
@@z_wulf 100w watts is 1 old light bulb. not really that much power, the issue with a cpu is the density of the power, if you touched the cpu at 100w youd burn yourself instantly. also ambient doesnt make that big of a temp difference, its kind of linear. you warm up the room 3 degrees, your temps go up around 3 degrees. thats why gamer nexus rates coolers with 'delta over ambient' meaning meaning the temp difference between ambient and the temp its running at.
GPU Limited using a 4090 💀
These CPUs are in a class of their own
I upgraded to a 7800x3d with ddr5 6000 from an I9 9900k with ddr4 3600 and the difference felt like a gpu upgrade. All while using a lot less power. Love that little cpu.
Edit: this is with the same gpu, a stock 3080ti.
I think this isn't mentioned enough, benchmark videos always show native resolution tests. I saw massive increase in fps with my 3080 when going from AM4 to AM5 because I play with DLSS at 1440p.
Also, my resolution is 1440p with a frame rate cap of 163hz due to 165hz monitor.
Could you tell me what components you used to build the computer? It's my first time and I would like to ensure that I can play Cyberpunk 2077 smoothly.
Heh, uncanny. You upgraded from the exact same CPU+Mem that I currently have to the CPU+Mem that I'm currently considering. With the impressive future prospects (8800x3d?) of the AM5 socket, I'm sold on this now.
Did you turn off the MSI MPG Z790 Carbon WiFI? 2:25 Would that cause the extra total board power consumption to be that high? Apples to apples comparison matter of course. The Gigabyte X670E Aorus does not feature built-in WiFi.
Glad I went with the 13700k , way cheaper than the two flagship on sale and giving the same 4K performance
same here. I upgraded to the 13700k early last year and have been thinking on what next, but there is really only a few more frames diff in some of these and several are the same. I think we have got to a time where fast is just fast and spending more money really not going to get you much.
100% agree. My goal is to remain always just slightly gpu bound and kept my CPU as long as possible.
Playing at 4k is comfortable now and allows just that.
6000 MHz RAM is no longer the sweet spot for the 7800X3D as of the latest BIOS update, it can now support RAM with speeds up to 8000 MHz, but the memory controller now runs at 6400 MHz so that's the new 1:1 sweet spot. I overclocked my 6000 MHz CL30-36-36-76 corsair vengeance RAM to 6400 MHz CL30-38-38-80 with the infinity fabric overclocked to 2133 MHz to again match that 1:1:1 ratio and it runs like a dream.
i love my 7800x3d , bought it on sale for 400 euro
this cpu is just perfect for itx builds
efficient , powerful and easy to cool
i have fractal design ridge build xD
i got the R7 7800X3D went from i7 8700k world of difference and my gaming experience is beyond fun !
GPU?
@@rezaramadea7574 wtf do you care lmao
@@DavidFregoli It's okay, just making casual conversation :)
Yup, walking around a crowded town in MMOs and my FPS remains 60 at ALL times, never drops below. Now just gotta wait for GPU pricing to reach sane price levels :P
@@mikfhanI gave up waiting after the COVID/crypto mining boom; I don’t think they’ll ever drop to reasonable levels again so after twenty years of Nvidia I’ve switched to AMD.
i9 needs to have 10+ P cores. Being called i9 but having same cores as i7 is dumb. On the contrary the i7 looks the best for value per dollar. $150 cheaper for 11 less fps.
It would be great if you include 0.1% lows as well, as a lot of people say 3D chips have "The Dip".
Gamers Nexus includes 0.1% lows and they scale exactly as you'd expect on both AMD and Intel.
Well done, AMD! Almost a year (September) with my 7700X and i couldn't be happier!
Always appreciate all of the hard work you guys put into your videos. I honestly think you guys are the best in the biz, and you’re my go to slice for reviews and benchmarks. Keep up the great work!
I have been buying only Intel for the past 20+ years but my next purchase (before end of year) is going to be AMD. Seems like the better choice.
Update: I just bought a 7800X3D. Loving it!
The only thing helping Intel at this point is the marketing I think saying something like: “u r cool if u use Intel” will be enough to win allot of dumb sheeple on their side I mean look at Nvidia VS Radeon for example
@@HoodHussler Always go for what gives best performance and price vs logo n the box sometimes its blue sometimes its red and sometimes its green
Both of them generate crazy amount of fps. I'll stick to my 5800x3d for good few years. With 1440p 144fps monitor even with 3070 im matching the refresh rate at least most of d the time
to be fair I'd still buy fast ram on AM5 cause 1) it's not that much more expensive, especially if you're spending this kind of money on cpu 2) you're likely to be able to take advantage of it with future AM5 cpus 3) new Agesa shows that current motherboards can run them faster, it's just down to the IMC which should get better with Ryzen 5 hopefully
Just tune the memory, it's mostly all the same stuff when buying high-quality memory.
@@Hardwareunboxed any tips to start tuning the memory?
@@junlau780 look up buildzoid optimized timings
If you're gonna spend extra on memory, you probably gain more for gaming by buying a 6000 kit that can hit very low latencies (manually tune it for more gains), rather than faster clocking memory.
@@blue-lu3iz I'm not talking about "expensive memory", I'm talking about spending 10 or 20 dollars more on ram that can do up to 8000 mhz as opposed to one that can barely do 6000, it's not that hard to understand, it's not about future ram being faster, it's about current memory already being very fast (and cheap) but the IMC can't keep up
Regarding power, I still have fond memories of when I installed my 4090 and benchmarked the new witcher 3 bells and whistles. I tripped a circuit breaker 🤣
I think an optimized 13900k with undervolt can be more efficient than stock, it would be interesting to do benchmarks like this.
But overall i9 are not for gamer when you know it has 24 core.
The problem is that you can't benchmark an undervolt because not all chips can achieve the same undervolt. Same reason they don't undervolt the x3d, which gains a lot of thermal headroom at -30mv all core. Regardless the intel chip uses a lot more power and you need more expensive ram.
Yep, Intel has lots of headroom for whatever reason, you can under-volt the heck out of the 13 Gen without losing performance. Though on AMD side, even a slight under-volt will cause a huge performance loss.
@@pixels_per_inch that's just not true lol.
@@asdf_asdf948 It's true and the overclock potential is bad without using the pbo.
@@asdf_asdf948 Alright I got to admit my data is pretty outdated, I had a Ryzen 3600 in my other system and this is what I've discovered. Did AMD change the algorithm for PBO?
Love these videos, thank you. Any chance you could add 1% low variances to those charts where you compare the whole benchmark suite as rows?
Thanks for nice and informative video 😃
PS: also maybe someone knows or have benches where 7800x3D is compared with i7/i9 on idle / web-browsing scenarios for total system power consumption. I think there are a lot of gamers who games 3-4h a day and the rest 8-10h hours is just idling or web-browsing.
Testing on our boards (with the latest BIOS) the 7800X3D system idles at 86 watts and the 13900K system at 89 watts.
@@Hardwareunboxed Thank you for sharing this info.
got myself a 7800x3D with 64 GB of Hynix A die ram and it is just incredible for games. Next upgrade probably 9800x3D or 10800x3D on same motherboard
Once you can hit 4k +60fps all that matters is power consumption and saving money. Someone really should do a build series trying to hit that sweet spot of 60fps 4k high/ultra with the lowest power consumption that gives the best bang for buck.
For AMD just use Radeon Chill and set 60fps. You’ll get your 60fps at the lowest power consumption.
Or maybe you want to play any game and want 120 to 160 fps?
@@Superiorer then just go for whatever the highest end you can afford and forget about power? ;)
@@AdamsWorlds No, 144 hz gaming is amazing. 60 hz is low.
@@Superiorer 144hz at 4k? ;) if you can do 60fps @4k you can probably easily do 120-144 at 1140p/1080. Wont be long till 1080 is obsolete on desktop monitors and 4k is new standard.
Thank you for the 4K comparison, I only game in 4K and these tests will help me a lot making upgrades in future if needed. Continue 4K in your tests please ❤
Will you try 8000 Mhz RAM with the 7800X3D ? And compare it to 13900k with the same RAM speed
I bet it will be slower.
Uncore freq ruins the gain from high freq RAM, if they can maintain Uncore frequency, them the higher RAM freq. = higher performance
@@blue-lu3iz The new 1:1 got pushed up too though, to 6400 now. Originally it was 5200, then 6000. So constant improvements.
x3d isnt getting much from ram speeds - u can just go on cheap 6000 cl32
@@marcinkarpiuk7797 or 5600 CL28
I am so happy with my 7800X3D and 7900XTX. This cpu is so nice.
It's not even close after you start running -25PBO, I went from 4.5Ghz all cores to 4.9Ghz this increased performance incredibly in games. It was already faster than the 13900K now the 13900K doesn't beat it in any game. Best part is it's air cooled with my 2011 NH-D14.
That's a beast of a CPU
But reviewers should avoid any overclocking, because those who give OC results soon are fed golden samples by marketing, giving unrealistic results not reproducible by consumers.
The curve optimiser undervolt too can need tweaking to be stable running ST, the fastest cores may tolerate little undervolting even when undervolting works super stably in all core workloads.
But then you have to Oc the 13900
@@erichall090909 Nope, it should be restricted to a sane power limit.
There's been far too many tricks by mobo BIOS and CPU vendors to score narrow benchmark wins by throwing a ridiculous amounts of watts at the problem.
The Intel i9 is wasting on average ⅓ of the entire PS5/XBOX power.
Then it just becomes an arms race of who can OC better. It doesn't help that Intel has gone nuts with letting board partners run wild with power limits.
Thanks for this. You helped me make my choice. Going with the 7800x3D
Appreciate why you are using a 4090 to test the CPU performance, but would be good to also do maybe one test where you step down the GPU grades and show the point where the CPU becomes irrelevant. I suspect for most gamers on mid-low range GPUs both of these processors would be massively overkill. It would be a lot of work I guess but a matrix of recommended minimum CPU GPU would be kind of cool.
The cpu is already irrelevant in all of these tests. If anyone actually bought a 4090 they will be in 4k and you could use a 5 year intel cpu and get same frame rate. Just like if you can only afford a 3060 or 3070 every game will be gpu limited so the cpu don’t matter.
@@Jimmys_TheBestCop Yes this is my gut feeling too. I suspect that 99% of the people buying the 7800X3D will end up GPU limited whatever they are playing. I guess that was my point for this video, it could have done with one of those 'Who should buy either of these products for gaming?' sections, to which the answer would be 'only morons' running at 1080p with a 4090 on non competitive games (which are rarely CPU limited anyway and players turn the settings down for lower latency anyway). The 7800X3D feels like a bit of a non-product, I don't get it's segmentation, at least the 13900K has other use cases.
It an "enthusiast" CPU. The only problem is no enthusiast can afford current gen GPUs. There are almost no current gen gpus on Steam. Whoever is buying current gen gpus are either not gaming on steam, which seems nearly impossible, or are using it for production and or AI.
Watch Tech Yes City video where he does the first ddr4 8 core chip i7 5960x vs the last ddr4 8 core chip the Ryzen 5800x3d. Using a 4090 at 4k they were exactly the same. Even at 1080p the nearly 10 year old chip finished within 20% fps of the 5800x3d in many games.
You could probably purchase any of the last 3 gen CPUs and you will still be limited by whatever GPU you can either find or afford.
@@Jimmys_TheBestCop The 4090 on steam is 0.56% that is about 1 out of every 200 gamers, and is a similar stat to the 3090s. Its a small percent, but that is still a lot of cards in real terms. I do know some people in 3D design who all bought one, so that is another use case for them I guess. I still think it is besides the point whether you are gaming, doing 3D Rendering, doing AI or other CUDA tasks, anyone with a 4090 would be making a mistake buying the 7800X3D. It's either overkill for gaming, or undercooked for productivity. I really can't find a compelling case for it.
@codemonkeyalpha9057 It's cheaper, more power efficient, and has Vcache that can dramatically increase performance on certain games. Plus it's not bad for productivity either, just not the best. It has more than enough cores to power through tasks at it's price bracket.
I honestly don't know how you can say it's a non product when your logic also makes the 13900k a non product as it's not needed in 4k with a 4090. Unless your a rapid fan of only intel, that is.
Were these benchmarks done on Windows 10 or Windows 11?
I would be very interested to see how the 13900k would perform in Watts/FPS if you undervolted it, and what the performance difference would be if you ran it under a powerlimit similar to the TDP of the 7800X3D. I did watch a video, where the 13900k managed to achieve somewhat similar gaming(!) performance under a powerlimit to what you could expect without it.
You can undervolt the X3D too.
Intel architecture are way way below AMD for now. Its doesnt matter how good 13900k is, AMD can just try little bit and make a better things.
Man the intel lose to architecture it just a losing battle, even if they got helped by Microsoft Windows P E cores system.
@@michaelbuto305nah. Just OC both and RAM and see where the chips land then 😁
We want to see the 4k benchmarks in order to know the minimum level of CPU you need with a 4090, which mostly only makes sense to own if you play at 4k. It's expensive and saving a few bucks here and there is desirable 😅. Thanks Steve.
I don't see your point, if the CPU get X fps at 1080 doesn't get the same amount at 4k if the GPU can provide enough frames?
So checking GPU performance at 4k and CPU perfomance at 1080p should be enough to see where is the GPU bottleneck and buy a CPU closer to that performance
@danavidal8774 I don't understand what you are saying. Maybe double-check your grammar? I'm interested in what's the cheapest, most power efficient CPU I can use and still get the max performance with a 4090 at 4k. For example, it seems like using a 13700k for me is fine because using a higher-end CPU is pointless.
@@jonathanmatthews5245 I would say the 13700k is high end too and overkill
A CPU able to get 120fps or more at 1080p is able to get 120fps at 4k too so you only need a GPU that is able to get 120 fps at 4K, increasing the resolution does not increase the workload for the CPU
I knew that the Ryzen would be more power efficient, but the gap was quite suprising D: Thanks for the great test!
Would be nice to see MSFS in these benchmaks as it is highly CPU limited and it actually matter what cpu you choose.
People who are not jihadis' still play that?
@@TheCooperman666 LMAO! 😂
Thanks for taking the extra time to test in 4k. Personally, I find it very useful, since for me testing CPU performance in isolation is pretty pointless - if I only saw CPU tests in 1080p low, I'd expect to get similar performance improvements when switching CPU while playing in 4k, which is not the case.
On the power consumption slides, three of the titles that had the closest difference were games where 7800X3D provided 15% more performance! So the real performance per watt was far worse than the 92watts on average.
What about for gaming+ streaming to Twitch/UA-cam on the same PC? I'm trying to figure out if intels little E cores actually have any value for what i do
I'm one of those ones that appreciate the 4K data. If I had a 4090 I'd be gaming primarily at that resolution. I'd love to see a video testing CPUs with a 4090 to see where CPU bottlenecks start to show in various titles.
The CPU does not bottleneck at 4K, is the GPU
They don't... maybe in some esport titles with hundreds of fps, but why bother testing that.
@@danavidal8774 If you use a low end i3 cpu at 4k it will be the bottleneck in many games.
So Happy I got the 7950X3D its been great to me!!! POWER!
Slam dunk for AMD, wow
Looking forward to your coverage of 14th gen.
100-200W less for a faster CPU ?
Between 13900K and 7800X3D - AMD is by far better choice...👍
Not mentioning temps ? 😕
Love the 7800x3d, especially after i tuned it down by 1% and went from 85 under full load to 60. Didnt notice a performance difference at all.
And this is using a z73 360.
100-190w... damn those intel power draw... thats a whole PSU tier
also needs better cooling and a better motherboard with stronger vrm
After dealing with the heat from a 12900k and 3090 I am really wanting more a more power efficient PC. My AC can't keep up and it's like I'm sitting next to a heater.
It would be nice if you guys included some single player Japanese games like Yakuza Like a Dragon, Persona 5, Final Fantasy VII Remake and so.
Also considering CPUs are important on PS3, 360 and Switch emulation, testing games on those emus would be helpful.
Persona 5 isn't really a good benchmark, that game can run on a toaster.
@@arcticowl1091 its just some games i plan on playing once i get a pc lol
@Kyush4 All of those would run fine unless you picked up a new RX 6400 as a gpu.😂
First time moved to AMD with 7800x3d, x670e auros master and don’t regret it 👍🏻
can't beat the efficiency of the AMD part, let alone the price difference when you are comparing them for gaming only.
Thank you Steve! Appreciate you and your Awesome Team!💯