What do you think about the additional frametime charts? More? Too many? Go for 2 lines or 3 lines? Grab our BRAND NEW Silicone Project & Solder Mat! store.gamersnexus.net/products/gn-project-soldering-mat Check out our review of the Amazon Basics CPU cooler! It's real! ua-cam.com/video/iRe8zVCjCNw/v-deo.html
Frame time is important but hard to interpret on those graphs since you are only showing the raw data. Maybe a histogram would be better? (FrameX - FrameXneg1) / [0.5(FrameX + FrameXneg1)
Frametime charts are good. We've been used to them as gamers for several years now. On a different topic, could you please explain the OS contamination in a bit more detail.? As a selfish request, do you know if a 5800X3D upgrade from a 3900X on W10 would have the same problems?
@@teemuvesala9575 Idk where that came from. Irrelevant data would be the number of contact points between the mobo and the CPU. I guess it would be fun to know but is otherwise irrelevant to the performance of said CPU. Both for AMD and Intel CPUs it would be noise, for most people at least. Data is data, and it won't be affected by the feelings of some hardcore fanbois. Data doesn't care and that's whats cool about them. Also this is GN we are talking about, they will use every technique and new data in all of their upcoming reviews so it'll benefit everyone.
Yeah, for those who also need to have multithreaded workloads (even while gaming, like when my wife turns on subtitles on a film, and Plex starts transcoding like crazy in the middle of "muh gaming", you need the extra cores.)
The 7800X3D benefits some older games that are simulation heavy, e.g. Factorio, Dwarf Fortress, Rimworld (especially when heavily modded), Tarkov, MMOs such as WOW, Star Citizen. Especially when the multiplayer game has zones with extremely loaded locations, the extra V-cache seems to shine there. Really glad the wait is technically over for reviews of the 7800X3D! Thank you for the great review!
i hope GN will do an indepth benchmark with games like Tarkov, Squad and those notoriously "upgrade resistant" badly optimized multiplayer shooter engines... Hell Let Lose, Arma 3 etc etc... they just need to figure out a method to get representative and reproducable results. Those games all ought to be scaling only with single core performance but now with x3d its a different story, its very interesting They can cooperate with server admins to keep the conditions consistent from test to test... walk a few predefined routes on a 100 player sever in squad or something...
@@bobalazsgaming havent played in ages but it used to be notoriously single core dependent, would be interessting to see how it reacts to the x3d! I dont care whether a cpu gives 400 or 500 fps in some old singleplayer game that happens to have a built in benchmark... I want to see how these cpus handle real world multiplayer conditions in games that are known to be CPU bottlenecked and badly optimized. Those scenarios also happen to be the most challenging to benchmark if you want to do it right... @Gamers Nexus should accept the challenge
Keep going with the frametime graphs. They are far more telling about an overall gaming experience versus framerate. Having both in the review reveals the whole picture.
@@GamersNexus I am hoping to see an easter egg/ funny meme on one of the monitors at some point in future. Maybe Steve talking in one of them with the sound coming off from the tv rather than the Steve that stands in the current video talking. :' )
that massive increase in power efficiency with the AMD parts cannot be understated. I see the 7950X3D as a fantastic option for someone like me who games and does production. Not only is it much more power efficient than the 7950X but its also faster in gaming and barely loses much in production. I think that's why people were requesting a 3D version of the higher 5000 series.
Personally I like the inclusion of frametime graphs, whenever I'm comparing hardware and OC data I use FT graphs to see improvements or degradation. I hope you keep them in future videos. Also you can go up to 4 data sets, but you should have them draw over each other from left to right each time you show a new one so it's easy to read.
Love the frame time graphs! It’s important to see for a lot of us that want to experience smooth gameplay. I have not seen any reviewer implement this as of yet. Keep up the great work!
Yeah, nobody else on the internet is doing that. I hope they keep it up, because this is the kind of content that makes Gamers' Nexus truly irreplaceable and in a league of their own alongside Digital Foundry and Hardware Unboxed.
@@selohcin digitalfoundry always include frametimes as well. But yeah, they're kinda approaching it from the other side, like: "How well does the game perform on which device?" whereas GN approaches it "How well does the device perform on which game?" You CAN check out both for either case. But I usually go to DF if I want to know what platform to buy a game for, and to GN if I want to know what platform to actually put together.
I love the detailed frametime charts and the explainers when there's a massive excursion. I generally look at the 1% (or 10%) lows over the averages or peak. A more steady frame rate is better than a faster framerate that stutters.
Agreed. I never understood why people get so focused on only average FPS whereas the lows are the most important if you want good overall gaming experiences.
@@KingOfComedyXD High 1% lows are better. You really want the 1% lows to be as close to the average frame rate. You don't want the lows to be low because that means that the framerate drops to that fps that percentage of time.
I went with the 5800X3D because on CPU's I planned on skipping this Gen. Haven't regretted it yet. What surprised me the most was how competitive the 13600k was, especially for the price.
@BillyPistocco my 13600kf uses around 80watts average when gaming. All the charts we see of power consumption are max core work loads which AMD do much more efficiently. But for day to day use and gaming 13th gen isn't too bad.
The efficiency on these chips is mindblowing. I thought my 5600X would be the last sub 100W CPU I will ever own. And all this while being one of the fastest Gaming CPUs and no ridiculous increase in pricing like in the GPU space.
Great review! What I took away was - 7950x3d is competitive with Intel for workstation AND gaming PCs - at half of the Intel's power consumption. If you don't care about workstation and only care about gaming, 7800x3d is an OBVIOUS best-pick. This is a good result for AMD, and probably what they had in mind as well. Power consumption and fan noise is important to folks who keep a PC in their bedrooms - or anywhere that quiet is a premium.
Fan noise on a high-end Intel pulling 250W+ is a vacuum cleaner as the fans can barely keep up and the video card as well due to the heat soak of the chassis, since it's all running close to throttling limits.. once it all spins up, it stays there with all of the fans struggling to keep up.
@@griffin1366 The 13700K is more efficient with the e-cores turned on, otherwise, forget 3 watts for iddle power load and get the 40ish watts on Ryzen.
I was an LTT fan, but MAAAAN, this channel is so much better. The reviews are soooo good and easy to understand, it is not even the same tier. So happy I discover it! Cheers!
To be fair, the channels do largely different things. GN gives knowledge with a side of entertainment, LTT gives entertainment with a side of knowledge.
@@qwormuli77 Maybe just me but I enjoy GN's entertainment that much more since it's based on almost satire-like accuracy whereas LTT is cheap slap-stick in comparison (I know this sentence makes me look like an absolute douche but I really can't watch LTT anymore)
@@johnwayne-ou5yy Honestly, I mostly just agree. My point was about the ratio of humour to brain inflation and not the success at either, but it's true that I probably laugh more at GN regardless.
I loved the frametime graph, even if it's only a 1v1 comparison or limited to just 3-4 I feel like it is very informative to see just how consistent they can get. Also it'll be interesting to see how the CSGO graphs change when CS2 is pushed out
From what I'm seeing CS2 is quite a lot more optimized the CSGO despite still being in a beta. Frametime variance is much better despite the small fps decrease. Its also more GPU bound. And yes CS2 LOVES the 3d cache.
@@tanphan1848 How do you know it loves the 3d cache, any benchmarks? I´m planning on getting the 7800x3d but wasn´t so sure yet if CS2 will benefit from the 3d cache
I have the 7800x3d and 7600x. In csgo my 7600x is slightly faster. In csgo2 my 7800x3d is faster. I think about 30-35% higher fps but I will have to retest. However aside from FPS alone I think the 7800x3d feels smoother and more fluid at the same frame rates. Could be placebo
I agree - at the end when he went over why to buy - he says the only thing to consider is one for gaming and the other for general - however that overlooks a 52% reduction in watt hours - that he pointed out at the beginning of his review. I realize that power may not be everything - but if you are looking at overall system budget the total power draw starts to be an issue. Especially when you consider a high end gaming machine can rival a toaster or coffee pot - appliances you want to limit on a standard house circuit - computers used to be low enough for that to not be an issue and I suspect many people don't even think about it - but as these power draws creep up you run into issues especially when you consider that a 15 amp standard house circuit is supposed to be kept at 80% max load (for continuous loads) - computers are on all the time - so lower power draw is a huge thing.
I like that there's no easy answer between AMD and Intel in this particular segment. Feels like no matter which one you choose, you're getting a great product. I'd probably lean more towards the 7800X3D as I don't use any of that productivity software, and I've been impressed with Zen efficiency in my laptops for years. Another excellent review GN.
450$ for an 8 core CPU in 2023 is retarded. Straight up. Ask yourself these questions: do you have a 4090? do you play in 1080p? do you play at medium settings? If the answer to all those questions is yes then do you care about 350 vs 400fps? Will the premium actually pay off in YOUR use case? Will there be any noticeable difference with YOUR graphics card in the games that YOU play at YOUR resolution and settings? Do you instead play the latest games where more cores do matter? (Just look at the frametime graph on cyberpunk, the frametime is actually worse on the lower core chip). Do you do any multitasking at all? For the same price the 7900x is the better choice for 99% of people compared to the 7800x3d. You don't need to use productivity applications to make use of more than 8 cores. Just boot up cyberpunk or any of the new CODs. They'll use it.
@@andytunnah7650 kind of curious if any undervolting is possible on the 13600k myself. I imagine i'd be the best all rounder for under $300 if you can.
@@JoelHernandez-tz3vk Yes it's super easy. I can get my 13600k to run the same C23 score (24K) at 1.262v (vs stock auto 1.320v) and reduce temps from 87c to 80c. The 13600k is also $249.99 now which makes it a great all around chip for gaming and productivity. Easily one of the best Intel CPUs in a long time.
I always love Steve's hard hitting takes; no pulled punches here. Also, I really appreciate the frametime data, but would you consider also uploading the charts overlayed on top of the game playing the background? Just find it easier to parse and scroll through for specific titles and lets you fit more products together, ie. see DF's
@@FAT9L Sorry, I was more pointing out that the way DF does it makes it easier to visually understand how framtimes impact gameplay and easier to narrow down pain points to specific parts of a game. I use to have specific testing locations for microstutters in Dark Souls 3 or Elden Ring for instance.
@@aj0413_ ah okay, I understand. Built-in benchmarks typically run for longer than he talks through each segment, so I'm not sure if that's doable. But I would also like to see those, now that you mention it. Maybe it could make for some kind of reference video on the second channel?
@@FAT9L yep, that’s what I was thinking. Secondary, dedicated videos for frametimes like DF does at times. I think how he handled it here is okay, but it leaves you wondering “where in-game was this tested?” For instance
The reason I prefer Gamers Nexus over any other tech reviewer is the sheer amount of data and how in depth you guys go on reviews. More data is always better for us consumers in making an informed purchase
If you're going to show fps results for *some* games in 1440p, it'd be nice to see them for all games. Was really interested in the 1440p results for cyberpunk. Especially because even at 1440p, I know the cpu still has meaningful differences model to model.
@@Maxime-ho9iv 1080p has still 60% in Steam's Survey....and in high(er) resolutions its more like we get bottlenecked by the GPU, not the CPU. So 1080p is more than fine, not meaningless.
I'm enjoying all these reviews. As a pure gamer who married a 5800X3D last year, I appreciate the consistent AM5 'Stick with your wife, you don't need a mistress'. Long live AM4.
Might be a little late but I absolutely love the frametime chart. Really helps visualize how stuttering and framedrops might manifest in a real scenario. Seeing a frametime ms value also helps infer potential input latency ramifications that may not be apparent from raw FPS
Frametime was super interesting, I’d love to see that presented for few titles and against a couple more key CPUs for better reference, say; 5700x, 5800x3D & plain 7700x + maybe couple older ones, 3600 and 9900k, would be incredible for reference of any improvement. Great content.
Off topic. I am not sure if it is the lighting or the colour grading, but this video has a much MUCHHH better 'Colour' than all your previous videos. I love it!
AM4 might be EOL but right now the 5800X3D still seems to be a viable option for gaming, especially at it's current below 300 bucks price. It also offers quite good frametime performance, just not the highest FPS. BTW: The CPU shown at the end of the video while takling about the 13600k is a 13900k.^^
I'm going to be upgrading to the 5800X3D from 5600X - I feel it's not worth it currently to do a whole PC upgrade when the 5800X3D is still kicking ass
@MICKUS TOPOLINO IMPERATOR CAESAR AUGUSTUS MAUS This. I went from a Ryzen 2600 to a 5800X3D. The difference is night and day. Add to the fact you don't have to spend money on new RAM and mobo, it is simply a no brainer.
I’m definitely getting a 5800X3D this year. Either when the price falls below $300 during the year or at the next Black Friday/Cyber Monday deals at year’s end.
The new set looks really professional. I compared with the old one and I think either lighting or color grading has improved as well, the skin tones are much better reproduced.
The efficiency is actually insane on this CPU. It draws 6 more watts at full load than my CPU (ryzen 5 3600xt) yet has like double the performance in games
This is going to be a game-changer for laptops as well, with big performance gains and no throttling. Remember that this also has fairly decent integrated graphics. And just within what a typical (gaming) laptop can handle in terms of heat and power draw. A clear win-win for AMD.
Thank you for including frame time! As a VR enthusiast, it is a struggle to find frame times on these newly released CPUs. In VR, frame times can be the difference between an enjoyable experience or one that is unpleasant and can cause a bad time. An overall smooth frame time and lower frame times make a huge difference in VR and it's not talked about nearly enough. Even on high resolution HMDs, where even a 4090 is the bottleneck, CPU frame times can have huge impact on not only the games performance, but the overall enjoyment of the VR experience.
I like how the background has been getting more and more lively :) love the work you guys put in, props! Edit: I regret looking at the comments, the war has already begun.
The 5600X is still available in our previous datasets, so you can get a pretty close approximation! We are still adding stuff to the freshest re-runs/tests
As someone who does data analysis as part of my day job in Quality, the frame time charts are helpful as you have a better picture of what the CPU is doing versus reporting an average value (especially without a +/- standard deviation value). Thanks for adding them.
I like the frametime comparisons, but don't expect them on reviews of large number of items, they kinda highlight how the software and hardware work with x item.
Would you consider adding specific cache-sensitive games for the X3D reviews? For example, Cities Skylines is completely CPU-bound for most of a city's life, but some reviews of the 5800X3D showed huge improvements. For people like me who play this game, or other cache-sensitive games like Factorio, it would be a useful datapoint to know if the cache improves performance enough to consider upgrading from a non-X3D or Intel CPU.
It only makes sense for factorio if you live and breathe it, because game is decently optimized as is and will run great on average hardware. You will only see the usefulness of x3d cpu if you push the limits and have hundreds of hours in this game. If you are that person then x3d is amazing.
I guess the general opinion for Cities Skylines is still intel-bound for gamers who had 81tiles and heavy mod. Some gamers' comparisons on CSL with essential mod/asset indicated that 58x3d/795x3d had perfect fps on light load, but not consistent fps, or not even close fps(both avg, 1%low and vision exp) as 127k/139ks correspondingly, especially in medium and heavy detailed scenario. Unfortunately, most of prof reviews for Cities Skylines are no mod or very limited mod, which results x3d being an "unbeatable" winner.
Great review, just 1 thing you could have put more emphasis on in the conclusion is how much more power efficient the 7800x3d is compared to the 13700k. (especially in this state of the worlds and electricity prices)
Steve, Thank you for including The Frame Time! please keep the frame time in the cpu Review's even if not all understand them i DO and i know you know its super important and can make a buyers choice. =)
I just picked up a 7800x3d for $350 after instant coupon and bundling. 32gb vengeance, x670E msi tomahawk,78003xd, and starfield approximately $750 out the door.
I really like the frame time charts. I'd love to see them more against the most appropriate competitor (or maybe an outlier to show how significant a difference), and then the wide-range comparisons with the bar charts. Great combo to get the full picture!
Really liked the frametime review. Too bad AMD nerfed the 7800X3D, could've easily matched the 13700K in those other games if they let it rip! Keep up the good work.
Re: 4:46. The boost frequency on the V-Cache CCD of the 7950X3D is 5.2GHz, it isn't dropped that much for the 7800X3D (200MHz), it just looks like a lot because the dual CCD parts have that higher boost CCD that won't cook the VCache at that high of clock.
@@mr.dingleberry4882 Not if the V Cache CCDs on the higher end parts are getting similar clocks. Having V Cache drops clocks because the extra cache can’t handle the additional heat generation that comes from driving the higher clocks that the standard CCDs can run at.
Thank you so much for adding the efficiency chart at 8:55 this is the main reason I choose a particular CPU and this information is often hard to come by.
7800X3D: You care mainly about gaming, low energy usage, and are fine with the $449 price point. 13700k: You care about gaming and production, don't mind higher energy use, and want to save a bit at ~$420. 13600k: You care about gaming and production, don't mind slightly higher energy use, and like the $300 price point.
7950x: You want the best for both gaming and production, don't mind the higher price point 13900k: same as 7950x and you don't mind the higher energy usage
I know Steve mentioned it briefly but the i5-13600k is flying under the radar here. As someone who was hoping for a 5800x3d jump, the 7800x3d is double the price of a 13600k, with really marginal gains.
The 13600K really is the jack of all trades, master of none. I feel like it's slept on at the $300 price range. Great all-rounder that doesn't require exotic cooling or tuning.
It's a tough choice I agree. the 7800x3d is more power efficient. You will make that money back in your electric bill haha. as time goes on we will see which CPU ages better.
@@wijn1008 id hope its more power efficient since its 8c 16t vs 12c 20t. In gaming the 13600K doesnt actually use that much power. The 13600K is hands down the better buy, even for gaming E cores completely handle pc background tasks and let the 6 p cores focus on just gaming, and contrary to popular belief, threadscheduling has been optimized to actually use e cores in gaming too now. That danny dude did a video also proving you get better performance leaving ecores on and not turning them off. For all intents and purposes the 13600K is closer to a Ryzen 9 as an i5 than a Ryzen 5 or Ryzen 7
@@wijn1008 Yep, don't look at the all core benchmark power draw, imagining that's what it pulls in real use, unless you enjoy doing blender renders on the CPU. Most of the time both brands have pretty reasonable power draw when gaming.
Glad to see frametime being included in your tests, I don't know if I am just particularly sensitive to stutters but I'd rather 90 fps with sub 8ms deviations than 150fps with 30ms any day but sadly they are usually not related eg. lowering my fps often didn't reduce my stuttering. So I guess GN will be my first port of call when its time to upgrade because I would loose my mind if I upgraded from my 8700K/1080Ti and still had stuttering issues even if it did double my average fps.
I just upgraded from 7700k and a 1080ti, to this chip and a 6800xt, so I'll report back. Yeah the CPU is overkill for that card, but I'll probably upgrade GPU next gen and wanted to get on AM5 somewhat early-ish since I seem to always buy mobos a month or two before they announce a new gen.
@@dhLotan I almost forgot I had the stuff. I've been paranoid not wanting to ruin the 700 dollars in parts, so gave them a wide berth to get the BIOS stuff figured out and totally forgot I had that board/chip up in the closet. Gonna do it this week now that you reminded me. 😅
This talk of frametiming against FPS is something I’ve felt insane about, have you ever used a Gysnc monitor with an Nvidia card to notice the difference?
Great work as always. The more depth in the videos, the better that they are as the vast majority of viewers are enthusiasts, therefore the frame time charts should be included in all future reviews.
Great to see strong competition. I was worried I'd have major buyers remorse after just building a new rig with a 13700k + 4090 and though this shows the 7800X3D would've been a strong alternative I'm happy with my choice since I still have some value for productivity software performance. Each gen I keep switching back and forth from intel to AMD (whoever's the best fit at the time) and I have to admit I always feel like I run into less rare odd issues running Intel (and Nvidia on the GPU front) vs AMD. Including one off games that have very odd dependency on individual core boost speeds etc. Not throwing shade on AMD, again I keep swapping back and forth each gen between them and Intel and Nvidia, but it always feels like drivers and edge case issues are just a bit fewer when I'm in Intel + Nvidia land... maybe just me though.
I have the exact same issue : chosing between the 7800X3D and the 13700K is really hard. When 3D cache is used, the 7800X is a beast, but not so much when the 3D cache is useless. And I play almost only heavy sim games so I think that in a few years, the lack of cores compared to the 13700K will be a problem for me. On the other hand, the 270 Watts of the 13700K are a bit much and should be a nightmare to cool properly with an air cooler. But I guess it's not always used as a warmer with a Prime95 load. Oh, and also the nightmare with AMD drivers and DDR5 compatibility on recent AM5 motherboards. I might stay with Intel... What are your temps & power consumption, when gaming ? (BTW, what is your cooling setup, that might help me too). Thanks in advance =)
Thank you for this response. I'm finally looking into upgrading to a completely new build after rocking a hella budget rig for the past 4 years(optiplex, 1060 6gb, PSU...the general optiplex build. lol). Sim racing and occasional fps titles are really all its for, so technically just gaming, but the minimal frame rate differences don't mean much to me in the end, and the real thing I'm looking for is reliability and longevity. I feel like I've heard multiple people with the same thought on just having a smoother time with intel when it comes down to drivers and compatibility. ill likely be thrilled with either one, but think a 13700k over the 7800x3d might just make the most sense. Thoughts? and cheers!
@@christophersmith8028 I have always been an intel user... starting with the q6600>3930k>6800k> 9900k>11700k>11900k>12900k...I went from the 12900k to amd's 7800x3d and the difference has been huge in fps uplift for the games I play(specifically dota 2(120fps average)/warzone(50fps)/baldurs gate 3(60fps)) with a 4080. I haven't had any driver issues thus far. Besides the fps uplift in these games, when I shift from gaming to more productivity I can drop in a 7950x3d or 8950x3d when it comes out eventually. Unfortunately with intel the best you can do is 14900k which is coming out soon, and is said to not have a huge uplift over 13900k. Meanwhile rumors are that next gen for amd are going to see around 20% improvement... All that said, it does seem as though these days you have to do research for the specific games that you care about. For me it was dota 2 and baldurs gate, so the 7800x3d had huge gains.
On Frame Time charts - Have you considered presenting this data in something like a box-and-whiskers chart format? That would capture much of the interesting information from the frame time charts in a much more compact format, allowing you to compare several processors at once without it getting too messy to be legible.
Same. I have a Ryzen 9 3900x and was thinking about the 5800x3d. But the cost of the 5800x3d compared to the 7800x3d is the same. So I have bit the bullet and upgraded to an msi b650 plus mobo and corsair ddr5 6400mhz ram just today after watching this review. I've had this build for a few years now so £580 for the above is worth is. Just wish It was worth financially upgrading my rtx 3080 but that will have to wait another year or two ; )
No other channel (that I know of) is doing half as much to deep dive into performance of hardware like you guys are. The effort that goes into these videos is kind of astounding.
Steve thanks for another amazing benchmark video as always. My gaming is generally limited to flight and racing sim games, which have the habit of being single-core CPU heavy, and often not amazingly optimized. I'd really like it if you'd put these tests forth for X-Plane 12 and rFactor, for example, and see how each CPU performs. Cheers buddy!
I just bought a 7800X3D to replace my current 7700X. I know that the difference in gaming performance is likely to be somewhat minor. The 7700X is a damn good chip. It’s great for games and it’s especially good for consumer/prosumer level productivity tasks, which I don’t really do on this machine very often. Anyway, it seems like one could expect anywhere from a 5-15% uplift in frame rates depending on the games and that isn’t a major improvement but I primarily bought the X3D because of its efficiency and low power consumption. The added gaming performance will basically just be a nice bonus. After switching from my AM4 system with a 3700X and a 2070 Super to my current system with a 7700X and a 6950XT, the amount of heat being dumped into the room by this PC is substantially greater than I’ve ever experienced before. The Zen 4 chips seem to generally run pretty warm and I live in Texas. After a few hours in some games, the room is uncomfortably hot, despite the rest of my house staying at 70 degrees. I’m using a 360mm AIO and still sometimes see the 7700X sitting in the mid to high 80’s. My 6950XT runs quite cool so I’m hoping that once I pop the X3D into this machine, it should draw significantly less power, create less heat and improve game performance to boot. It’s kind of a lot of money to spend just to get a more efficient chip with a fairly marginal overall performance increase but I think it’ll be worth it and now I plan to go quite some time before doing another CPU upgrade.
Im planning the same upgrade and I had the same issue with 7700x so high temps but I fixed it setting it to 65w eco mode and temps went down a lot between 55-65 while gaming, so how are temps on 7800x3d? thanks.
@@armandoarroyo I run my 7800X3D on eco mode and even in demanding games cranked to max settings, it’s usually hovering around 62 degrees. I’ll get intermittent spikes up to 70-73 from time to time but it always settles back down quickly. I’m running a 360 AIO exhausting through the top so it’s also exhausting the heat from my 6950XT and still the temps for both are totally under control. I’ve been playing cyberpunk again for the last couple of weeks and both stay at a steady 61-63 degrees. With that game, I typically use a frame rate cap at about 75fps to make sure I get the graphic quality I want without running too hot. If I don’t cap the frame rate, both the gpu and cpu will definitely start getting into the high 70’s, maybe even low 80’s. But that’s a pretty demanding game at max settings. I play exclusively single player games and nearly all of them can run max settings or very close to max with no fps cap and temp issues whatsoever. I definitely recommend the 7800X3D. It’s probably not the best bang for buck purchase you can make if you already have the 7700X but it’s been rock solid for me.
@@kylestewart4444 yeah I already have 7700x and I care about temps, how you set the eco mode on 7800x3d? Just adjusting the CO to a negative value ? I have heard that is kind of different when indervolting 7800x3d. Still wondering if it is worth the upgrade tbh.
That frametime chart actually made me hold on my decision to purchase the CPU. The whole reason of wanting a 3D cache for me is a smoother gameplay, if there are frametime spikes, that's a big issue.
They're sub-10ms spikes. You won't even notice them, and the more major spikes that ARE noticeable are likely going to happen on any CPU because they're not the CPU's fault. Basically it's a graph that doesn't matter other than to confirm that it's a normal CPU that compares evenly to other normal CPUs. The spikes are a non-issue, in any practical way.
You have seen 3 charts for 3 games of which 2 were completely normal and just one showed more variance. This anomaly was for Cyberpunk 1.6. The 2.0 update has improved the x3D performance by a ton to where the 5800x3D beats the 13700k in 1% lows by 10 fps and average fps just by 2 fps (according to HW Unboxed benchmark), meaning Cyberpunk was the problem.
@@jonathaningram8157 nice. I'm thinking about it as well because I simply want to support this power efficient technology even though I probably don't really need it.
I love the chart recap starting at 25:51. I really hope you keep doing that. With all the data presented in a video like this, it's easy to get lost in the weeds.
I wouldn't really say AMD intentionally "kneecapped" the 7800X3D's clock speed to make the 7950X3D "look good". The max boost speed listed (5.7 GHz) is for the overall CPU. For the 7950X3D, that's specifically the faster normal CCD, with the V-cache CCD with a boost closer to 4.925-5.1 GHz, essentially the same 5 GHz average. If the core parking and thread assignments are done appropriately for the more complex 7950X3D, that would explain why the 7950X3D and 7800X3D essentially neck-to-neck. Granted, that's a big "if" which, does show in some titles. Would it have been more transparent if AMD listed separate boosts for the different CCDs? Yes. But mentioning that would give an easier perception on what's going on. Because the way this is presented, it seems like the 7800X3D is magically making up "that" clock speed difference.
I'm generally really interested in the frametime charts because you can see stuff like micro-stutter clearly there. Nothing annoys me more than a high framerate that doesn't feel good because it has constant issues with the frametimes.
I'm a little surprised that the advantage over the basic non-3D parts isn't higher. It seemed like 5800x3D was magic. I guess DDR5 bandwidth really helped to close the gap in latency between 3D and non-3D.
I bought 13700k for a new computer a few days ago, and I was worried whether I should purchase a new 7800X3D and refund the 13700k, or just stay with it. In conclusion, I am staying with the 13700k since my main will be Adobe programs and gaming for a sub. Yes, the amount of electricity that 7800X3D use is impressive, which made me hard to choose between the two CPUs even though 7800X3D shows a less productive result when compared to 13700k. However, 13700k is faster for rendering, which means it will use a similar amount of electricity in a short time. It might be less productive than 7800X3D, but I don't really care that much about it. Plus, I got 13700k for a reasonable price, so I am just staying with 13700k. (I actually bought 13700kf + Was in sale) Still, if someone needs a new PC just for gaming, 7800X3D will be an excellent choice than 13700k.
I wasn't really interested in the 7800X3D as a part, so I was dubious about watching the video, but uh... can I just say the ad content for the new silicone solder mat got my attention? I just started branching out my model building side of the hobby from just assembling plastic models to also adding electronics to them and hot damn, talk about timely product rollout.
What do you think about the additional frametime charts? More? Too many? Go for 2 lines or 3 lines?
Grab our BRAND NEW Silicone Project & Solder Mat! store.gamersnexus.net/products/gn-project-soldering-mat
Check out our review of the Amazon Basics CPU cooler! It's real! ua-cam.com/video/iRe8zVCjCNw/v-deo.html
As a 1440p gamer - thank you for those charts!
Frame time is important but hard to interpret on those graphs since you are only showing the raw data.
Maybe a histogram would be better? (FrameX - FrameXneg1) / [0.5(FrameX + FrameXneg1)
I haven't experienced a time like this, where its pointless to upgrade, in decades.
Frametime charts are good. We've been used to them as gamers for several years now.
On a different topic, could you please explain the OS contamination in a bit more detail.? As a selfish request, do you know if a 5800X3D upgrade from a 3900X on W10 would have the same problems?
Keep them charts coming.. Great info!
I like the subtlety of the box grabbing part, where all the boxes that fall down are for Ryzen 9 CPU's, which the 7800X3D has dethroned.
I wish I could say that was intentional...
@@GamersNexus You can say that. Retcon it.
@@chrisfanning5842 he could also call ufd tech to brettcon it
@@chrisfanning5842 OK - then it was intentional!
Interesting how we pull meaning out of chaos sometimes.
You must be an English major.
Can't wait for UserBenchmark to trash on this CPU for no reason.
They'll post some kind of indecipherable rant about marketing and UA-cam reviewers, or something.
I love UserBenchmark. They write the best comedy "reviews" of AMD CPUs :)
@@alaeriia01 It's like April 1st all year round
“Advanced Marketing Devices” -Untermensch Mark
I wish we had an AI trained on UserBenchmark's rants, it'd be funny to see it make up ridiculous reasons to trash random products.
I’m glad you included the frame time charts in this one. It’s almost always helpful to have more data available
@@Mrfiufaufou some data is noise, irrelevant
@@SaudadeSunday Yeah when its not in favor of AMD fanboy arguments, the data is always noise... This is the AMD fanboy logic.
@@teemuvesala9575 Idk where that came from. Irrelevant data would be the number of contact points between the mobo and the CPU. I guess it would be fun to know but is otherwise irrelevant to the performance of said CPU. Both for AMD and Intel CPUs it would be noise, for most people at least. Data is data, and it won't be affected by the feelings of some hardcore fanbois. Data doesn't care and that's whats cool about them.
Also this is GN we are talking about, they will use every technique and new data in all of their upcoming reviews so it'll benefit everyone.
This data is significant in the VR space when you need to be under 11ms frametimes
@@teemuvesala9575 where the feck did that come from? if anyone's a fanboy of something, it's you
"...allowing the 7950X3D a lead of 1%! It's massive! Definitely worth hundreds of dollars!"
Never change, Steve 😂
in gaming.
Yeah, for those who also need to have multithreaded workloads (even while gaming, like when my wife turns on subtitles on a film, and Plex starts transcoding like crazy in the middle of "muh gaming", you need the extra cores.)
But all the people who totally will use content creation and video rendering 24/7 and can't use a tiny 8 core chip it'll take like 6 minutes longer!
@@filipealves6602no what you need is a separate server so you are not a pleb :)
@@12me91I do have to say my 5950x is very nice for AV1 encoding we it can take 24 hours as reasonable settings to encode an entire 4k movie…
The 7800X3D benefits some older games that are simulation heavy, e.g. Factorio, Dwarf Fortress, Rimworld (especially when heavily modded), Tarkov, MMOs such as WOW, Star Citizen. Especially when the multiplayer game has zones with extremely loaded locations, the extra V-cache seems to shine there. Really glad the wait is technically over for reviews of the 7800X3D! Thank you for the great review!
i hope GN will do an indepth benchmark with games like Tarkov, Squad and those notoriously "upgrade resistant" badly optimized multiplayer shooter engines... Hell Let Lose, Arma 3 etc etc... they just need to figure out a method to get representative and reproducable results. Those games all ought to be scaling only with single core performance but now with x3d its a different story, its very interesting
They can cooperate with server admins to keep the conditions consistent from test to test... walk a few predefined routes on a 100 player sever in squad or something...
possibly Arma 3?
@@bobalazsgaming havent played in ages but it used to be notoriously single core dependent, would be interessting to see how it reacts to the x3d!
I dont care whether a cpu gives 400 or 500 fps in some old singleplayer game that happens to have a built in benchmark...
I want to see how these cpus handle real world multiplayer conditions in games that are known to be CPU bottlenecked and badly optimized. Those scenarios also happen to be the most challenging to benchmark if you want to do it right...
@Gamers Nexus should accept the challenge
Dwarf Fortress... Sold!
What kind of frame rates you been getting in Star Citizen? GPU?
Thank you so much for showing benchmarks at 1440p!
Keep going with the frametime graphs. They are far more telling about an overall gaming experience versus framerate. Having both in the review reveals the whole picture.
agree specially if u are getting over 60fps in my experience frametimes are so much more important
Steve and team, the new set is looking so good. Y'all are killing it. Been waiting on this review, great work as always.
Thank you! Loving this set. We're still figuring out when to shoot in each location, but it's so nice having multiple spots for a different feel.
@@GamersNexus that's what she said... (love you guys)
@@GamersNexus I am hoping to see an easter egg/ funny meme on one of the monitors at some point in future. Maybe Steve talking in one of them with the sound coming off from the tv rather than the Steve that stands in the current video talking. :' )
@@GamersNexus Need a nice high back chair next to a fireplace for "Fireside chats with Steve"
@@GamersNexus Is there a bathroom you can shoot in for when you review a bad product? Location seems fitting.
that massive increase in power efficiency with the AMD parts cannot be understated. I see the 7950X3D as a fantastic option for someone like me who games and does production. Not only is it much more power efficient than the 7950X but its also faster in gaming and barely loses much in production. I think that's why people were requesting a 3D version of the higher 5000 series.
I wanted the 7950X3D as my first option but I have heard a lot of bad things hows it going for you?
It doesn't fact in idle power consumtion which is 5 times higher than Intel... it's pretty much similar for basic web browsing and light work.
@@Headgrumbleoh wow 3 dollars a year!
@@PhyrexJmore like 300 if you're from Europe lol
Cries in europe indeed.
Idle draw was fixed a bit back though no?
Personally I like the inclusion of frametime graphs, whenever I'm comparing hardware and OC data I use FT graphs to see improvements or degradation. I hope you keep them in future videos. Also you can go up to 4 data sets, but you should have them draw over each other from left to right each time you show a new one so it's easy to read.
Love the frame time graphs! It’s important to see for a lot of us that want to experience smooth gameplay. I have not seen any reviewer implement this as of yet. Keep up the great work!
Yeah, nobody else on the internet is doing that. I hope they keep it up, because this is the kind of content that makes Gamers' Nexus truly irreplaceable and in a league of their own alongside Digital Foundry and Hardware Unboxed.
@@selohcin digitalfoundry always include frametimes as well. But yeah, they're kinda approaching it from the other side, like: "How well does the game perform on which device?" whereas GN approaches it "How well does the device perform on which game?" You CAN check out both for either case. But I usually go to DF if I want to know what platform to buy a game for, and to GN if I want to know what platform to actually put together.
I love the detailed frametime charts and the explainers when there's a massive excursion. I generally look at the 1% (or 10%) lows over the averages or peak. A more steady frame rate is better than a faster framerate that stutters.
Agreed. I never understood why people get so focused on only average FPS whereas the lows are the most important if you want good overall gaming experiences.
@@kotztotz3530when reading a graph, are high 1% lows better than low 1% lows?
@@KingOfComedyXD High 1% lows are better. You really want the 1% lows to be as close to the average frame rate. You don't want the lows to be low because that means that the framerate drops to that fps that percentage of time.
I went with the 5800X3D because on CPU's I planned on skipping this Gen. Haven't regretted it yet. What surprised me the most was how competitive the 13600k was, especially for the price.
you will be good for 5+years imo if you are willing to turn down the settings to get more fps.
@BillyPistocco my 13600kf uses around 80watts average when gaming. All the charts we see of power consumption are max core work loads which AMD do much more efficiently. But for day to day use and gaming 13th gen isn't too bad.
same here man, glad i chose the 5800x3d
I'm keeping my 5800X3D to on my x470 board. My 3rd CPU upgrade since 2018, I might upgrade in another 2yrs, not sure yet.
My 3080 and 5800x3d, still rock fine. Skipping this gen, happy with what I got still
0:11 i love how he pulls the 7800x3d from beneath all the ryzen 9 parts as a metaphor
I love the power consumption data. I would not mind seeing more scenarios included.
The efficiency on these chips is mindblowing. I thought my 5600X would be the last sub 100W CPU I will ever own. And all this while being one of the fastest Gaming CPUs and no ridiculous increase in pricing like in the GPU space.
i7 is similar efficiency and has it beat in almost every category...
Uh, it's a $500 CPU that requires a $250+ mobo and $150 in 32GB DDR5 to work properly. Maybe 0.01% of gamers will buy this.
@@mintymus similar efficiency? 7:56 did you even watch the video? lmao
@@mintymus not even close to the same efficiency lmao
@@sophieedel6324 wtf do you need a $250 board for? a $150 one does everything you need, and you dont need 32gb of ram
Great review! What I took away was - 7950x3d is competitive with Intel for workstation AND gaming PCs - at half of the Intel's power consumption. If you don't care about workstation and only care about gaming, 7800x3d is an OBVIOUS best-pick.
This is a good result for AMD, and probably what they had in mind as well.
Power consumption and fan noise is important to folks who keep a PC in their bedrooms - or anywhere that quiet is a premium.
Fan noise on a high-end Intel pulling 250W+ is a vacuum cleaner as the fans can barely keep up and the video card as well due to the heat soak of the chassis, since it's all running close to throttling limits.. once it all spins up, it stays there with all of the fans struggling to keep up.
Is the 7950X3D still worth it? Have heard bad things
He gave the win to the intel cpu but for someone who wants to be more futureproofed the am5 cpus will be better
Glad I've waited for the 7800x3d, the delayed release was so suspicious it stunk!
Since my PCs main use is gaming this is a great CPU
I'd love to see it against the 13700k with e-Cores disabled and undervolted.
@@griffin1366 The 13700K is more efficient with the e-cores turned on, otherwise, forget 3 watts for iddle power load and get the 40ish watts on Ryzen.
@@saricubra2867 I'm sorry, but what Ryzen has 40w power draw on Idle?
@@viniciuslima5021 The 5800x 3D certainly reports 40w draw when just watching UA-cam videos, I can't speak to what the Intel does though.
Yes now you can buy from the stinky guys that tried to trick you
I was an LTT fan, but MAAAAN, this channel is so much better. The reviews are soooo good and easy to understand, it is not even the same tier. So happy I discover it! Cheers!
Yeah I had to unsubscribe, tech nerds like the nitty gritty stuff we don’t need a big show put on to enjoy a video
@@guccipucci69420amen
To be fair, the channels do largely different things. GN gives knowledge with a side of entertainment, LTT gives entertainment with a side of knowledge.
@@qwormuli77 Maybe just me but I enjoy GN's entertainment that much more since it's based on almost satire-like accuracy whereas LTT is cheap slap-stick in comparison (I know this sentence makes me look like an absolute douche but I really can't watch LTT anymore)
@@johnwayne-ou5yy Honestly, I mostly just agree. My point was about the ratio of humour to brain inflation and not the success at either, but it's true that I probably laugh more at GN regardless.
I loved the frametime graph, even if it's only a 1v1 comparison or limited to just 3-4 I feel like it is very informative to see just how consistent they can get. Also it'll be interesting to see how the CSGO graphs change when CS2 is pushed out
@Donuts Tech reviewers are gonna have a hell of a time rerunning all of the CS benchmarks to update their CPU and GPU graphs
@Donuts If they wanna do that they should have a bhop enabled playlist.
From what I'm seeing CS2 is quite a lot more optimized the CSGO despite still being in a beta. Frametime variance is much better despite the small fps decrease. Its also more GPU bound.
And yes CS2 LOVES the 3d cache.
@@tanphan1848 How do you know it loves the 3d cache, any benchmarks? I´m planning on getting the 7800x3d but wasn´t so sure yet if CS2 will benefit from the 3d cache
I have the 7800x3d and 7600x. In csgo my 7600x is slightly faster. In csgo2 my 7800x3d is faster. I think about 30-35% higher fps but I will have to retest. However aside from FPS alone I think the 7800x3d feels smoother and more fluid at the same frame rates. Could be placebo
Definitely include frametime graphs if you can! They provide a much clearer picture of stability compared to averages. Thank you for the review!
That level of performance in 85W has me sold. Such an efficient and powerful little chip
It would be PERFECT for gaming laptops with its relatively low power use, but of course AMD manage to miss the killshot
Yet inferior in every other way.
i'll pay $350 max, not $449.
I agree - at the end when he went over why to buy - he says the only thing to consider is one for gaming and the other for general - however that overlooks a 52% reduction in watt hours - that he pointed out at the beginning of his review. I realize that power may not be everything - but if you are looking at overall system budget the total power draw starts to be an issue. Especially when you consider a high end gaming machine can rival a toaster or coffee pot - appliances you want to limit on a standard house circuit - computers used to be low enough for that to not be an issue and I suspect many people don't even think about it - but as these power draws creep up you run into issues especially when you consider that a 15 amp standard house circuit is supposed to be kept at 80% max load (for continuous loads) - computers are on all the time - so lower power draw is a huge thing.
@@greebj nah, it isn't as easy as "just put whole thing into laptop".
Wait for their laptop CPUs and we will see
I like that there's no easy answer between AMD and Intel in this particular segment. Feels like no matter which one you choose, you're getting a great product. I'd probably lean more towards the 7800X3D as I don't use any of that productivity software, and I've been impressed with Zen efficiency in my laptops for years. Another excellent review GN.
450$ for an 8 core CPU in 2023 is retarded. Straight up. Ask yourself these questions: do you have a 4090? do you play in 1080p? do you play at medium settings?
If the answer to all those questions is yes then do you care about 350 vs 400fps?
Will the premium actually pay off in YOUR use case? Will there be any noticeable difference with YOUR graphics card in the games that YOU play at YOUR resolution and settings?
Do you instead play the latest games where more cores do matter? (Just look at the frametime graph on cyberpunk, the frametime is actually worse on the lower core chip).
Do you do any multitasking at all? For the same price the 7900x is the better choice for 99% of people compared to the 7800x3d. You don't need to use productivity applications to make use of more than 8 cores. Just boot up cyberpunk or any of the new CODs. They'll use it.
For me it's the power usage of intel that puts me off.
Yep. I do use some productivity software, which leans me a bit toward the 7950x3d given the efficiency.
@@andytunnah7650 kind of curious if any undervolting is possible on the 13600k myself. I imagine i'd be the best all rounder for under $300 if you can.
@@JoelHernandez-tz3vk Yes it's super easy. I can get my 13600k to run the same C23 score (24K) at 1.262v (vs stock auto 1.320v) and reduce temps from 87c to 80c. The 13600k is also $249.99 now which makes it a great all around chip for gaming and productivity. Easily one of the best Intel CPUs in a long time.
All cool with the extra explainer time. Always good to bring new people and learners aboard the tech train. Thanks, Steve!
Hardly anyone does framtime charts, and to me that's just weird.
They add so much valuable insight that I definitely hope you continue doing them!
I always love Steve's hard hitting takes; no pulled punches here. Also, I really appreciate the frametime data, but would you consider also uploading the charts overlayed on top of the game playing the background? Just find it easier to parse and scroll through for specific titles and lets you fit more products together, ie. see DF's
All the chapters are labeled with the type of test + which game they're testing, makes scrubbing through extremely easy!
@@FAT9L Sorry, I was more pointing out that the way DF does it makes it easier to visually understand how framtimes impact gameplay and easier to narrow down pain points to specific parts of a game.
I use to have specific testing locations for microstutters in Dark Souls 3 or Elden Ring for instance.
@@aj0413_ ah okay, I understand.
Built-in benchmarks typically run for longer than he talks through each segment, so I'm not sure if that's doable. But I would also like to see those, now that you mention it. Maybe it could make for some kind of reference video on the second channel?
@@FAT9L yep, that’s what I was thinking. Secondary, dedicated videos for frametimes like DF does at times. I think how he handled it here is okay, but it leaves you wondering “where in-game was this tested?” For instance
The reason I prefer Gamers Nexus over any other tech reviewer is the sheer amount of data and how in depth you guys go on reviews. More data is always better for us consumers in making an informed purchase
The set, the lighting, and of course GN's peerless review process...I think this is your best video yet!
Glad people like the lighting and set!
@@GamersNexus So glad you changed the monitor wall to static images that don't distract from the presenter.
If you're going to show fps results for *some* games in 1440p, it'd be nice to see them for all games. Was really interested in the 1440p results for cyberpunk. Especially because even at 1440p, I know the cpu still has meaningful differences model to model.
Even more, 1440p should really be the default now.
1080p is absolutely meaningless honestly.
@@Maxime-ho9iv 1080p has still 60% in Steam's Survey....and in high(er) resolutions its more like we get bottlenecked by the GPU, not the CPU. So 1080p is more than fine, not meaningless.
Cpu heavy games as well like Simulators, Dayz, Escape from Tarkov
Buying this cpu for 300 bucks (on a microcenter sale), I am so glad I waited.
IT’S TIME!! I’ve been waiting for reviews of this CPU to see how big of an upgrade it would be to go up to AM5 for gaming.
Nice! Glad we can deliver the review!
I really like the watt-hour comparison. Surprised at such a large difference between relatively recent CPUs.
I'm enjoying all these reviews. As a pure gamer who married a 5800X3D last year, I appreciate the consistent AM5 'Stick with your wife, you don't need a mistress'. Long live AM4.
A year ago this was the best cpu money could buy. A year later... Nothing has changed.
Might be a little late but I absolutely love the frametime chart. Really helps visualize how stuttering and framedrops might manifest in a real scenario.
Seeing a frametime ms value also helps infer potential input latency ramifications that may not be apparent from raw FPS
I applaud the use of frametime charts. So much more info on how the real user experience will be like 👌
love seeing the frame time graphs! always interesting to see imo, would like to see them more often
Frametime was super interesting, I’d love to see that presented for few titles and against a couple more key CPUs for better reference, say; 5700x, 5800x3D & plain 7700x + maybe couple older ones, 3600 and 9900k, would be incredible for reference of any improvement. Great content.
1:25 Steve I honestly have been dreaming of a mat like this. Thank you Gamers Nexus!!! ❤
As an avid FFxiv player, I want to thank the GN's team for including the benchmarks for the game.
If you're reading this: have a very nice day :3
The efficiency is impressive. This CPU is perfect for SFF builds.
not rly. unless you mean SFF builds for gaming
@@probablykeen yes it's what I meant
I found the 'watt hour' graph with all of the other CPUs to be extremely useful. Excellent content as usual. Thank you.
Off topic. I am not sure if it is the lighting or the colour grading, but this video has a much MUCHHH better 'Colour' than all your previous videos. I love it!
AMD's advancements seems promising. I'm still on the 3600 myself but I'll consider the 8800x3D when that is a thing in like 1 or 2 years.
Nah cause it took them what 4 months into the new year to release it , it probably won’t be released 4 months into 2025
Why not upgrade to the 5800X3D. Isn't that the point of AM4, so you won't have to upgrade your motherboard
@@pixels_per_inch you can literally get am5 mobo and 64 ram for 300
@@Jay-jayy-n9u Really?
@@pixels_per_inch yes, not the expensive ones , the entry was cheaper for me than am 4
It doesn't get much finer than watching these reviews after a long day. Bless you all at GN, one of the best channels on UA-cam.
AM4 might be EOL but right now the 5800X3D still seems to be a viable option for gaming, especially at it's current below 300 bucks price. It also offers quite good frametime performance, just not the highest FPS.
BTW: The CPU shown at the end of the video while takling about the 13600k is a 13900k.^^
I'm going to be upgrading to the 5800X3D from 5600X - I feel it's not worth it currently to do a whole PC upgrade when the 5800X3D is still kicking ass
If you already have an AM4 mobo with an older gen Ryzen and you don't want to waste too much money, then the 5800X3D is the perfect option
@MICKUS TOPOLINO IMPERATOR CAESAR AUGUSTUS MAUS This. I went from a Ryzen 2600 to a 5800X3D. The difference is night and day. Add to the fact you don't have to spend money on new RAM and mobo, it is simply a no brainer.
I’m definitely getting a 5800X3D this year. Either when the price falls below $300 during the year or at the next Black Friday/Cyber Monday deals at year’s end.
@@CLfreak246 right! It’s such a great processor.
The new set looks really professional. I compared with the old one and I think either lighting or color grading has improved as well, the skin tones are much better reproduced.
The new look is amazing , i was gladly surprised by it
The efficiency is actually insane on this CPU. It draws 6 more watts at full load than my CPU (ryzen 5 3600xt) yet has like double the performance in games
This is going to be a game-changer for laptops as well, with big performance gains and no throttling. Remember that this also has fairly decent integrated graphics. And just within what a typical (gaming) laptop can handle in terms of heat and power draw. A clear win-win for AMD.
I do a lot of code compiling and the main game I mostly play is ffxiv so having stuff like this in your benchmarks really does help a lot!
Thank you!
Terrible game, the average IQ of the people playing that game is pretty low. I'd know because I used to play it, but then I learned better
ur mother named u andy bro lol@@AndyU96
Love the plants on the new set
Thank you for including frame time! As a VR enthusiast, it is a struggle to find frame times on these newly released CPUs. In VR, frame times can be the difference between an enjoyable experience or one that is unpleasant and can cause a bad time. An overall smooth frame time and lower frame times make a huge difference in VR and it's not talked about nearly enough. Even on high resolution HMDs, where even a 4090 is the bottleneck, CPU frame times can have huge impact on not only the games performance, but the overall enjoyment of the VR experience.
John Carmack gave a great engineering talk about this a few years ago.
So much this.
They worse then Intel ones, who would have thought...
Thank you GN for not just hitching to the hype train and for providing a detailed and objective review.
exactly why i'm getting a 13600k. i'm not just gaming
I like how the background has been getting more and more lively :)
love the work you guys put in, props!
Edit: I regret looking at the comments, the war has already begun.
Your solder and project mat is the first product that you've made that I'm interested in having. Well done. Looks great.
Excellent work as always! I would love you guys to include de R5 5600x on your charts, I think it's a nice middle-ground comparison to have. Thanks!
The 5600X is still available in our previous datasets, so you can get a pretty close approximation! We are still adding stuff to the freshest re-runs/tests
Yeah AM4 comparisons are pretty important to such a "wait and see" product.
As someone who does data analysis as part of my day job in Quality, the frame time charts are helpful as you have a better picture of what the CPU is doing versus reporting an average value (especially without a +/- standard deviation value). Thanks for adding them.
I like the frametime comparisons, but don't expect them on reviews of large number of items, they kinda highlight how the software and hardware work with x item.
I feel like this 7800X3D is going to be the value sweet spot ! Awesome Review !!
thank you GN for clearifying that the dropped boxes were empty; that genuinely pained me.
Would you consider adding specific cache-sensitive games for the X3D reviews? For example, Cities Skylines is completely CPU-bound for most of a city's life, but some reviews of the 5800X3D showed huge improvements. For people like me who play this game, or other cache-sensitive games like Factorio, it would be a useful datapoint to know if the cache improves performance enough to consider upgrading from a non-X3D or Intel CPU.
It only makes sense for factorio if you live and breathe it, because game is decently optimized as is and will run great on average hardware. You will only see the usefulness of x3d cpu if you push the limits and have hundreds of hours in this game. If you are that person then x3d is amazing.
Tarkov
HUB or Linus has a factorio test for 7800X3D and it blows the chart off, followed by 5800X3D
Bethesda games really benefit from cache as well.
I guess the general opinion for Cities Skylines is still intel-bound for gamers who had 81tiles and heavy mod. Some gamers' comparisons on CSL with essential mod/asset indicated that 58x3d/795x3d had perfect fps on light load, but not consistent fps, or not even close fps(both avg, 1%low and vision exp) as 127k/139ks correspondingly, especially in medium and heavy detailed scenario.
Unfortunately, most of prof reviews for Cities Skylines are no mod or very limited mod, which results x3d being an "unbeatable" winner.
Please keep doing frametimes! For people who care about high resolution games or VR, they’re incredibly important.
Great review, just 1 thing you could have put more emphasis on in the conclusion is how much more power efficient the 7800x3d is compared to the 13700k. (especially in this state of the worlds and electricity prices)
Steve, Thank you for including The Frame Time! please keep the frame time in the cpu Review's even if not all understand them i DO and i know you know its super important and can make a buyers choice. =)
I just picked up a 7800x3d for $350 after instant coupon and bundling. 32gb vengeance, x670E msi tomahawk,78003xd, and starfield approximately $750 out the door.
How’s your experience been with all the games?
Holy moly this is an in-depth review
Thanks! Can't wait to finish adding Stellaris and other new games, too!
Steve coming out swinging with that opening statement lol.
I really like the frame time charts. I'd love to see them more against the most appropriate competitor (or maybe an outlier to show how significant a difference), and then the wide-range comparisons with the bar charts. Great combo to get the full picture!
I'm definitely very intrigued and would like more frame time graphs. Definitely a valuable video!
Thank you GN for all the effort that goes into collecting all this data. That's why you guys are the best.
👍
Really liked the frametime review. Too bad AMD nerfed the 7800X3D, could've easily matched the 13700K in those other games if they let it rip! Keep up the good work.
Re: 4:46. The boost frequency on the V-Cache CCD of the 7950X3D is 5.2GHz, it isn't dropped that much for the 7800X3D (200MHz), it just looks like a lot because the dual CCD parts have that higher boost CCD that won't cook the VCache at that high of clock.
So you dont think that the limited frequency ceiling for the 7800X3D is a scummy attempt at product segmentation?
@@mr.dingleberry4882 Not if the V Cache CCDs on the higher end parts are getting similar clocks.
Having V Cache drops clocks because the extra cache can’t handle the additional heat generation that comes from driving the higher clocks that the standard CCDs can run at.
I managed to snag a 7800X3D in the three minutes before it sold out everywhere... pretty excited to upgrade from a 3570K!
is that treating you well so far?
Thank you so much for adding the efficiency chart at 8:55 this is the main reason I choose a particular CPU and this information is often hard to come by.
7800X3D: You care mainly about gaming, low energy usage, and are fine with the $449 price point.
13700k: You care about gaming and production, don't mind higher energy use, and want to save a bit at ~$420.
13600k: You care about gaming and production, don't mind slightly higher energy use, and like the $300 price point.
7950x: You want the best for both gaming and production, don't mind the higher price point
13900k: same as 7950x and you don't mind the higher energy usage
7700x?
I've grown to like Steve smashing stuff.
I know Steve mentioned it briefly but the i5-13600k is flying under the radar here. As someone who was hoping for a 5800x3d jump, the 7800x3d is double the price of a 13600k, with really marginal gains.
The 13600K really is the jack of all trades, master of none. I feel like it's slept on at the $300 price range. Great all-rounder that doesn't require exotic cooling or tuning.
@@steph_on_yt I managed to buy my 13600k for $250 at BestBuy and I love it.
It's a tough choice I agree. the 7800x3d is more power efficient. You will make that money back in your electric bill haha. as time goes on we will see which CPU ages better.
@@wijn1008 id hope its more power efficient since its 8c 16t vs 12c 20t. In gaming the 13600K doesnt actually use that much power. The 13600K is hands down the better buy, even for gaming E cores completely handle pc background tasks and let the 6 p cores focus on just gaming, and contrary to popular belief, threadscheduling has been optimized to actually use e cores in gaming too now. That danny dude did a video also proving you get better performance leaving ecores on and not turning them off. For all intents and purposes the 13600K is closer to a Ryzen 9 as an i5 than a Ryzen 5 or Ryzen 7
@@wijn1008 Yep, don't look at the all core benchmark power draw, imagining that's what it pulls in real use, unless you enjoy doing blender renders on the CPU. Most of the time both brands have pretty reasonable power draw when gaming.
Glad to see frametime being included in your tests, I don't know if I am just particularly sensitive to stutters but I'd rather 90 fps with sub 8ms deviations than 150fps with 30ms any day but sadly they are usually not related eg. lowering my fps often didn't reduce my stuttering. So I guess GN will be my first port of call when its time to upgrade because I would loose my mind if I upgraded from my 8700K/1080Ti and still had stuttering issues even if it did double my average fps.
I just upgraded from 7700k and a 1080ti, to this chip and a 6800xt, so I'll report back.
Yeah the CPU is overkill for that card, but I'll probably upgrade GPU next gen and wanted to get on AM5 somewhat early-ish since I seem to always buy mobos a month or two before they announce a new gen.
@@ayewhaddupdoe How has your experience been so far?
@@dhLotan I almost forgot I had the stuff. I've been paranoid not wanting to ruin the 700 dollars in parts, so gave them a wide berth to get the BIOS stuff figured out and totally forgot I had that board/chip up in the closet. Gonna do it this week now that you reminded me. 😅
@@ayewhaddupdoeNow you’ve got me curious. I’ve got a 1080 and 6700K and am looking to upgrade as well. Have you tried your new parts yet?
This talk of frametiming against FPS is something I’ve felt insane about, have you ever used a Gysnc monitor with an Nvidia card to notice the difference?
Great work as always. The more depth in the videos, the better that they are as the vast majority of viewers are enthusiasts, therefore the frame time charts should be included in all future reviews.
Great to see strong competition. I was worried I'd have major buyers remorse after just building a new rig with a 13700k + 4090 and though this shows the 7800X3D would've been a strong alternative I'm happy with my choice since I still have some value for productivity software performance. Each gen I keep switching back and forth from intel to AMD (whoever's the best fit at the time) and I have to admit I always feel like I run into less rare odd issues running Intel (and Nvidia on the GPU front) vs AMD. Including one off games that have very odd dependency on individual core boost speeds etc. Not throwing shade on AMD, again I keep swapping back and forth each gen between them and Intel and Nvidia, but it always feels like drivers and edge case issues are just a bit fewer when I'm in Intel + Nvidia land... maybe just me though.
I have the exact same issue : chosing between the 7800X3D and the 13700K is really hard. When 3D cache is used, the 7800X is a beast, but not so much when the 3D cache is useless. And I play almost only heavy sim games so I think that in a few years, the lack of cores compared to the 13700K will be a problem for me. On the other hand, the 270 Watts of the 13700K are a bit much and should be a nightmare to cool properly with an air cooler. But I guess it's not always used as a warmer with a Prime95 load. Oh, and also the nightmare with AMD drivers and DDR5 compatibility on recent AM5 motherboards. I might stay with Intel...
What are your temps & power consumption, when gaming ? (BTW, what is your cooling setup, that might help me too).
Thanks in advance =)
Thank you for this response. I'm finally looking into upgrading to a completely new build after rocking a hella budget rig for the past 4 years(optiplex, 1060 6gb, PSU...the general optiplex build. lol). Sim racing and occasional fps titles are really all its for, so technically just gaming, but the minimal frame rate differences don't mean much to me in the end, and the real thing I'm looking for is reliability and longevity. I feel like I've heard multiple people with the same thought on just having a smoother time with intel when it comes down to drivers and compatibility. ill likely be thrilled with either one, but think a 13700k over the 7800x3d might just make the most sense. Thoughts? and cheers!
@@christophersmith8028 I have always been an intel user... starting with the q6600>3930k>6800k> 9900k>11700k>11900k>12900k...I went from the 12900k to amd's 7800x3d and the difference has been huge in fps uplift for the games I play(specifically dota 2(120fps average)/warzone(50fps)/baldurs gate 3(60fps)) with a 4080. I haven't had any driver issues thus far. Besides the fps uplift in these games, when I shift from gaming to more productivity I can drop in a 7950x3d or 8950x3d when it comes out eventually. Unfortunately with intel the best you can do is 14900k which is coming out soon, and is said to not have a huge uplift over 13900k. Meanwhile rumors are that next gen for amd are going to see around 20% improvement...
All that said, it does seem as though these days you have to do research for the specific games that you care about. For me it was dota 2 and baldurs gate, so the 7800x3d had huge gains.
@@Togairu Baldur's Gate 3 at 60 fps with a 4080 and 7800x3d???
@vane909090 that was the fps gain not the fps average sorry if that wasn't clear
Enjoyed the frametime charts. Would like to see more of them.
On Frame Time charts - Have you considered presenting this data in something like a box-and-whiskers chart format? That would capture much of the interesting information from the frame time charts in a much more compact format, allowing you to compare several processors at once without it getting too messy to be legible.
One year later, watching this vid while deciding between upgrading to 5800x3d or to AM5 with 7800x3d. Thanks for the thorough review!
Same. I have a Ryzen 9 3900x and was thinking about the 5800x3d.
But the cost of the 5800x3d compared to the 7800x3d is the same.
So I have bit the bullet and upgraded to an msi b650 plus mobo and corsair ddr5 6400mhz ram just today after watching this review.
I've had this build for a few years now so £580 for the above is worth is.
Just wish It was worth financially upgrading my rtx 3080 but that will have to wait another year or two ; )
Love that you have the frame time charts. In competitive titles like CS:GO it is very important that they are consistent. Thank you.
No other channel (that I know of) is doing half as much to deep dive into performance of hardware like you guys are. The effort that goes into these videos is kind of astounding.
Steve thanks for another amazing benchmark video as always. My gaming is generally limited to flight and racing sim games, which have the habit of being single-core CPU heavy, and often not amazingly optimized. I'd really like it if you'd put these tests forth for X-Plane 12 and rFactor, for example, and see how each CPU performs. Cheers buddy!
I just bought a 7800X3D to replace my current 7700X.
I know that the difference in gaming performance is likely to be somewhat minor. The 7700X is a damn good chip. It’s great for games and it’s especially good for consumer/prosumer level productivity tasks, which I don’t really do on this machine very often.
Anyway, it seems like one could expect anywhere from a 5-15% uplift in frame rates depending on the games and that isn’t a major improvement but I primarily bought the X3D because of its efficiency and low power consumption. The added gaming performance will basically just be a nice bonus.
After switching from my AM4 system with a 3700X and a 2070 Super to my current system with a 7700X and a 6950XT, the amount of heat being dumped into the room by this PC is substantially greater than I’ve ever experienced before. The Zen 4 chips seem to generally run pretty warm and I live in Texas. After a few hours in some games, the room is uncomfortably hot, despite the rest of my house staying at 70 degrees. I’m using a 360mm AIO and still sometimes see the 7700X sitting in the mid to high 80’s. My 6950XT runs quite cool so I’m hoping that once I pop the X3D into this machine, it should draw significantly less power, create less heat and improve game performance to boot.
It’s kind of a lot of money to spend just to get a more efficient chip with a fairly marginal overall performance increase but I think it’ll be worth it and now I plan to go quite some time before doing another CPU upgrade.
I doubt you’d see a 20? watt increase in heat from your CPU. Especially when a GPU can output 5x more heat than your CPU
Im planning the same upgrade and I had the same issue with 7700x so high temps but I fixed it setting it to 65w eco mode and temps went down a lot between 55-65 while gaming, so how are temps on 7800x3d? thanks.
@@armandoarroyo I run my 7800X3D on eco mode and even in demanding games cranked to max settings, it’s usually hovering around 62 degrees. I’ll get intermittent spikes up to 70-73 from time to time but it always settles back down quickly. I’m running a 360 AIO exhausting through the top so it’s also exhausting the heat from my 6950XT and still the temps for both are totally under control.
I’ve been playing cyberpunk again for the last couple of weeks and both stay at a steady 61-63 degrees. With that game, I typically use a frame rate cap at about 75fps to make sure I get the graphic quality I want without running too hot. If I don’t cap the frame rate, both the gpu and cpu will definitely start getting into the high 70’s, maybe even low 80’s. But that’s a pretty demanding game at max settings. I play exclusively single player games and nearly all of them can run max settings or very close to max with no fps cap and temp issues whatsoever.
I definitely recommend the 7800X3D. It’s probably not the best bang for buck purchase you can make if you already have the 7700X but it’s been rock solid for me.
@@kylestewart4444 yeah I already have 7700x and I care about temps, how you set the eco mode on 7800x3d? Just adjusting the CO to a negative value ? I have heard that is kind of different when indervolting 7800x3d. Still wondering if it is worth the upgrade tbh.
That frametime chart actually made me hold on my decision to purchase the CPU. The whole reason of wanting a 3D cache for me is a smoother gameplay, if there are frametime spikes, that's a big issue.
They're sub-10ms spikes. You won't even notice them, and the more major spikes that ARE noticeable are likely going to happen on any CPU because they're not the CPU's fault.
Basically it's a graph that doesn't matter other than to confirm that it's a normal CPU that compares evenly to other normal CPUs. The spikes are a non-issue, in any practical way.
You have seen 3 charts for 3 games of which 2 were completely normal and just one showed more variance. This anomaly was for Cyberpunk 1.6. The 2.0 update has improved the x3D performance by a ton to where the 5800x3D beats the 13700k in 1% lows by 10 fps and average fps just by 2 fps (according to HW Unboxed benchmark), meaning Cyberpunk was the problem.
@@HopemanGG I bought it this summer :)
@@jonathaningram8157 nice. I'm thinking about it as well because I simply want to support this power efficient technology even though I probably don't really need it.
I love the chart recap starting at 25:51. I really hope you keep doing that. With all the data presented in a video like this, it's easy to get lost in the weeds.
Really appreciate that you guys use "optimized" settings in cyberpunk rather than ultra.
I wouldn't really say AMD intentionally "kneecapped" the 7800X3D's clock speed to make the 7950X3D "look good". The max boost speed listed (5.7 GHz) is for the overall CPU. For the 7950X3D, that's specifically the faster normal CCD, with the V-cache CCD with a boost closer to 4.925-5.1 GHz, essentially the same 5 GHz average. If the core parking and thread assignments are done appropriately for the more complex 7950X3D, that would explain why the 7950X3D and 7800X3D essentially neck-to-neck. Granted, that's a big "if" which, does show in some titles. Would it have been more transparent if AMD listed separate boosts for the different CCDs? Yes. But mentioning that would give an easier perception on what's going on. Because the way this is presented, it seems like the 7800X3D is magically making up "that" clock speed difference.
Have been waiting for this review. Just learning blender and just now building first pc, but glad I went with 13700k
I'm generally really interested in the frametime charts because you can see stuff like micro-stutter clearly there. Nothing annoys me more than a high framerate that doesn't feel good because it has constant issues with the frametimes.
you guys have such a good looking set and a damn good visually pleasing output with these videos
Love the frametime graphs! Way more meaningful than the averages!
I'm a little surprised that the advantage over the basic non-3D parts isn't higher. It seemed like 5800x3D was magic. I guess DDR5 bandwidth really helped to close the gap in latency between 3D and non-3D.
Still rocking with my 3900X, but if I change I'll clearly look into that one !
3900x gang 🤘😎🤘
@@maximilianrockefeller8854 🤘😎🤘
I bought 13700k for a new computer a few days ago, and I was worried whether I should purchase a new 7800X3D and refund the 13700k, or just stay with it. In conclusion, I am staying with the 13700k since my main will be Adobe programs and gaming for a sub.
Yes, the amount of electricity that 7800X3D use is impressive, which made me hard to choose between the two CPUs even though 7800X3D shows a less productive result when compared to 13700k. However, 13700k is faster for rendering, which means it will use a similar amount of electricity in a short time. It might be less productive than 7800X3D, but I don't really care that much about it. Plus, I got 13700k for a reasonable price, so I am just staying with 13700k. (I actually bought 13700kf + Was in sale)
Still, if someone needs a new PC just for gaming, 7800X3D will be an excellent choice than 13700k.
I wasn't really interested in the 7800X3D as a part, so I was dubious about watching the video, but uh... can I just say the ad content for the new silicone solder mat got my attention? I just started branching out my model building side of the hobby from just assembling plastic models to also adding electronics to them and hot damn, talk about timely product rollout.
i got the 7800x3d for $265 on black friday. now that was worth the wait.