Thanks for the patience on this one! We finished it ~2 weeks ago, but we didn't complete the editing since we were working so much on that 12VHPWR story with the 4090. But now it can finally go out! Watch our revisit of the AMD Ryzen 2000 series! ua-cam.com/video/VcnfsEjckqY/v-deo.html Find our Intel i5-13600K CPU review here: ua-cam.com/video/todoXi1Y-PI/v-deo.html And our Intel i9-13900K review here: ua-cam.com/video/yWw6q6fRnnI/v-deo.html
Why isn't there a CPU benchmark for CPU-heavy strategy games like Crusader Kings III or Stellaris? For example, how long it takes a game to finish running on max speed. These games get really sluggish toward the end.
Why don't you include 4X games like Civ6 in the game tests which can actually stress test the CPU, instead of running 1080p tests with a 3090Ti? You're testing at 1080p because at higher resolutions you're GPU bound, I get that. But these 1080p tests are rather pointless, as not many people play 600 fps 1080p games. Phil's Hardware included Civ6 tests and there you can clearly see which CPU is better in a real world relevant scenario.
@@luxemier first, it trades blows with the 7700X, and the 7900X and 7950X are workstation CPUs that suffer a bit of latency due to the dual CCD. Second, wait for the 3D SKUs lol. And that's only gaming; in productivity the 7950X matches or beats the 13900K in all workloads.
Your videos are the reason I was able to make an informed decision about buying what is best for the buck in my budget range. You guys with your top-notch reporting managed to save me hundreds of euros in the long run. Thank you for the work that you do.
Thank you for this review! I haven't upgraded my PC in over 10 years. I was still running an i7-4790K and a 980 Ti, and could still play games. But I splurged and got this CPU and a 3080 Ti. So glad I did. Night and day difference, and this review made me happy. Thank you!
I'd possibly still be running 4790k and GTX970 to this day if it wasn't for PSU dying, CPU temp problems and GPU software problems. Performance-wise they still hold up "acceptably" in 99% of games for 1080p.
@@GamersNexus truth. When they dropped the 5600G to $110 a few days after, I had a few second thoughts... but I've learned my pattern of behavior actually prefers fewer upgrades over longer periods of time, so I don't have buyer's remorse anymore. If we get 2x the performance at the same price within the next two years, though... then maybe. But I'm just glad we got out of the Ivy Bridge to Kaby Lake slump.
@@jkiu2692 I'm also still on a 4790K; that thing won't die and still works okay. It's by far the longest-lasting CPU I've ever had, and I'm still not convinced I need a new one yet.
Man, I was considering going for the 13600K instead of the 13700K. But then I thought, "I'll watch the GN review." And after watching this, I've decided: $350 US for the KF (it was on sale) against $300 for the 13600K? It was a no-brainer upgrade from the 4770K I've been running all these years. Thanks so much! I always love your reviews and your processes, and I love that I was finally able to act on an upgrade after years of watching.
Don't go KF unless you know exactly what you need. The Intel iGPU acts like a booster, helping speed up Premiere streaming/encoding and all that other stuff. If you do anything related to Adobe apps, then the K is a no-brainer.
@@zihechen3111 Thanks! I mostly do gaming and very light streaming (older games most of the time lol), so I went KF either way. I am rocking a 1080 Ti still, and I'll hold off one more gen before upgrading that. But yeah, I do no workload-related activities on this PC.
Upgraded from an 8700K to a 13700KF. I have it power-limited to 190W via PL1/PL2, and it achieves basically the same performance in games, and just shy of stock in benchmarks. Best of all, it avoids the annoying thermal spikes when a load starts!
@@ganomaly Ah, I see how my comment is confusing. I meant that the 13700KF performs the same in my games when I limit PL1/PL2 to 190W compared to stock, not compared to the 8700K. As for whether it's worth upgrading, that depends on your current CPU and what you're trying to achieve.
I still have the 8700K at 5GHz. Wondering whether I should upgrade or wait a tad bit more. Either AMD or Intel, not sure haha. Thank you for the reply. It'll be for gaming, btw, and I just picked up a 3090 a couple of months ago. Any advice would be appreciated 👍🏼
Happy that I went for the 13600K, saw an 80% improvement in iRacing fps over my overclocked 9600K. Probably even more now that I discovered the MSI "CPU lite load mode" that was set way too high from the factory. CPU temp dropped almost 20 degrees by changing a single parameter in the BIOS.
I had an i5-8500 since 2018. A few days ago I bought an i7-13700K, and I am super happy with it. Currently using a Noctua NH-D15 with it. It gets a little toasty under high load, but the performance increase is great. As always, great review.
I would appreciate it if emulation benchmarks were included at some point, even though emulation is less common than gaming. Those workloads are more CPU-intensive than GPU-intensive and have been making strides in optimisation and compatibility as of late.
I was literally in the market for a 13th-gen i7 or i9, and this was the video I needed! I imagine many others are making this transition as well during the holidays. Great content as always! Will continue to give you my watch time and likes!!
When I built my current computer I put in a 6700K, and everyone was telling me I wasted my money. I explained I wanted to keep using it for about 6 years, but it didn't matter; I had apparently wasted my money. It took only 2 or 3 years for the 6600K everyone told me I should have bought to start struggling in games. Not making that mistake again this upgrade: skipping the 13600K again.
True, Intel's CPU power is unique in that it works much harder for maximum FPS at 1080p, much more so than at 1440p and 4K, right? I think the GPU is the biggest part of any bottleneck, if one exists.
I'm in the middle of a complete rebuild from my 8700k (sold my tower before a large move) and I decided on the 13700k shortly after launch. This definitely helps me feel good about my decision.
I'm so torn now because of the AMD price drop. Going from a 9700K as well. I have a $100 gift card for Amazon too, so I can get the 7700X for $250 or the 7900X for $374. My third option is a Micro Center 13700K for $380.
As of late August 2023, Micro Center has this bad boy on sale in a bundle. The CPU came out to 260 bucks, I got 32GB of DDR5-6000 CL36 for like 65 bucks, and a Z790-P WiFi (didn't want the WiFi capabilities, but whatever) for 170 bucks. Couldn't pass it up. My 3070 is very happy, and so is the credit card company lmao.
I got mine on launch day via Best Buy for the same price. I had them price-match Micro Center, then used a 10% off coupon, plus I had a free $200 gift card from a recent phone purchase and some rewards points. All in all, I ended up spending about $150 after tax for mine!
I, too, have thoroughly consumed said video and observed Steve loudly proclaim something at some point in this video. Dry sarcasm may have been involved.
Actually an informative and interesting video now that I have actually scrolled through it. Serious consideration for a 13600K for me, though; it doesn't seem like the 13700 is worth it for the $. I do video editing and light gaming, and my FX-8350/HD 7870 is feeling a bit tired. Seems like Intel is on top this time around.
Built a system for a pal who bought a 12100F (unfortunately), and honestly, even at the extreme low end it absolutely blew me away how quick it was, considering I'm still running an overclocked FX-8350.
Because Bulldozer IPC was so bad that 1st-gen Intel Core is faster per clock, and anything above feels high-end compared to that piece of trash. To be more accurate, it's not the IPC itself; it's the terrible floating-point speed. Alder Lake and Raptor Lake's P-core floating point is ridiculously good: emulation beasts, and they tear through Digital Audio Workstations (which are all floating-point based).
@@saricubra2867 For certain tasks it was even worse than core 2 duos, lol. I distinctly remember an overclocked E8600 beating overclocked 8150 in Skyrim, lmao.
The CPU market is finally exciting again! You really love to see it! Can't wait for DDR5 to come down more so I can take advantage of these latest improvements, though. It's still a little out of my price range. Thanks once again, GN team. We appreciate the work you all do!
@@infernaldaedra Nope, you have never needed a top CPU for gaming. I run a 4K OLED TV as a monitor with a 5820K from 2014, and I am not CPU-bound at all with my 6900XT.
@@Mekojonessharktooth You're putting out 250 watts of power, and essentially all of that ends up as heat, so yes, it will expel more BTU than a 95-watt 9900K. I have a 9900K and just upgraded to a 13700K. I was able to undervolt -0.090 Vcore to get the temps to stabilize around 80C. My 9900K was undervolted too and would hit around 72C under max load, so there will be roughly a 12% increase in temps. That doesn't directly correlate to how much warmer your room gets; there are so many variables. TL;DR: yes, your room will get warmer, maybe around 10% faster or warmer than before, depending on room size, ventilation, and cooling.
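For anyone trying to put numbers on the room-heating point above: essentially all CPU package power is dissipated as heat, and watts convert to BTU/hr at a fixed factor (1 W sustained ≈ 3.412 BTU/h). A rough sketch, using the 250 W and 95 W figures from this thread as example loads:

```python
# Convert sustained CPU package power (watts) to heat output in BTU per hour.
# Essentially all electrical power drawn by a CPU ends up as heat in the room.
BTU_PER_WATT_HOUR = 3.412  # 1 watt sustained ~= 3.412 BTU/h

def heat_btu_per_hour(watts):
    return watts * BTU_PER_WATT_HOUR

old_cpu = heat_btu_per_hour(95)    # e.g. a 95 W part under load
new_cpu = heat_btu_per_hour(250)   # e.g. a 250 W part under load

print(f"95 W  -> {old_cpu:.0f} BTU/h")
print(f"250 W -> {new_cpu:.0f} BTU/h")
print(f"Extra heat into the room: {new_cpu - old_cpu:.0f} BTU/h")
```

How much that actually warms the room still depends on room size, ventilation, and duty cycle, as the comment says; this only quantifies the heat source itself.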
DDR5 prices are only a bit lower than DDR4 right now, if anyone's getting a new Mobo, it could be the time to go for ddr5 (But yes absolutely, if you already have expensive high quality DDR4 kit, keeping it is the better option at the moment)
I was able to get a 13700k this weekend at $370 (thanks MicroCenter), looks like it should be pretty good value at that price! It's one hell of an upgrade from a 2700X. Thanks GN!
I paid $350 on launch day thanks to a price match at BB plus a 10% off coupon, then had a $200 free gift card from a recent phone purchase, so I paid like $150 for mine!
I'd love to see a section about undervolting in these CPU reviews. Can you undervolt it? What chipset do you need? How much performance do you lose at which power target? With these modern CPUs and their enormous power consumption and cooling requirements, that gets more and more interesting.
I guess the reason undervolting doesn't show up in reviews is that it's almost entirely dependent on the silicon lottery. One CPU might be great for undervolting, and another sample of the same SKU loses stability as soon as you touch the voltage.
Seriously beginning to feel like picking up the 5800X3D was a great decision. I only use my PC to game, and it's still very solid compared to most of the recent CPU releases. Couldn't be happier.
I have yet to see a recommendation against the 5800X3D if all you do is gaming. Well, it depends on what resolution as well, it matters less at 4K but might still matter quite a bit, especially at the lows, if the particular game you like can avail of the large cache. Of course it depends on what you paid for it as well.
@@Vantrakter 1440p paired with a 3080ti. Got it for something like $375? It was on sale like 6 months back. Sold my old 3700x to a friend in need of a CPU for $100, so ended up even cheaper in the end. For most of the games I play I didn't notice much of a difference, but in Escape From Tarkov and a few other titles the difference was fairly noticeable.
I want to thank the GN team in its entirety; you guys are my personal favorite YT channel, and one of the extreme few that I watch constantly. Some things I really appreciate: your integrity when it comes to dealing with companies and other business relationships, and the clearly high level of pride you rightfully take in your own work. I notice that, unlike many other YouTubers, you don't jump-cut your speech every 3 seconds, and instead just speak passionately about the subject at hand. It also seems like you, Steve, are not reading word for word off a script, and instead reference well-written notes. This makes the delivery of the information, and the video itself, feel authentic, warm, and real. Also, the editors and graphics artists have done wonderfully. I love the falling timer bars on the sides of the screen when displaying a graph: a small yet very welcome and helpful detail that few tend to think of. Very well done. GN, you are the model that others should strive to be, and you have my respect. Thank you for the phenomenal videos and entertaining hardware and engineering documentation.
Hello! For the production section of the CPU evaluations, have you considered adding a music production software, whether that be ableton live, fruityloops, etc.? Y'all are the gold standard for reviews in my book and knowing how the music to gaming overlap exists would be valuable to someone like myself. Thanks!
How would you guys feel about changing your CPU Blender benchmark around a little? Instead of showing a graph of render times for all CPUs, would there be a way to benchmark FPS performance in the viewport, like just scrolling through your intro animation? No one is (or should be) rendering with their CPU anyway, but CPUs seem to make a bigger difference in viewport performance, where you actually do the work before handing it to the GPU for the render. Or I might be wrong. Love your work as always! Cheers
It doesn't matter... the test is there to show performance compared to other CPUs, and they have the data for older CPUs to compare against. It wouldn't make sense to change to a new test when they already have the data for comparing against other CPUs.
@@samgoff5289 Fair point. It would be like adding another game to the chart: annoying, yes, but over time it could work. I just feel like the Blender charts are a little misrepresentative. Example: TR 3970X vs. 13700K. On a normal chart the Threadripper CPU would win, since it has more cores and can render faster. But no one renders with CPU power, and I bet the 13700K would be an overall better experience in viewport performance. But I might be wrong. Maybe it's time to dig out the old parts at home and test them.
You guys really enjoy buying into dead platforms, don't you? How did it feel watching everyone upgrade on AM4 for cheap? Ready for history to repeat itself on AM5?
@@lagarttemido I mean considering we got the others on release 4 years ago and only upgraded now, none of us are even going to notice, cause as you may have noticed, we're not the type to upgrade every single time a new thing comes out.
@@lagarttemido I've been looking through all these comments and no one is opting for the 7700X or 7900X, which I'm thinking of getting, because in 2024 or 2025 I can buy the chip that's beating 14th- or 15th-gen Intel with no motherboard move. I don't know why you blame them, though. That "dead" platform is insanely fast, and stable. Intel is also achieving way higher memory bandwidth with DDR5 than AM5 is, too. A lot of Intel users upgrade every 5-10 years, and a new platform is always required at that kind of frequency. I do think it's a perk to be able to upgrade in 2-4 years to whatever the latest CPU for your platform is. People I know in the gaming community don't do a CPU swap unless they're going to the latest chips on the latest platform. Hence why people don't care that it's "dead" socket-wise.
Love the i7 chips. Looking to upgrade my motherboard and chip from the i7-7700 to the i7-13700K (currently priced at $375.99). This upgrade is also going to be my first step into AIOs, and I'm glad you tested the performance with the Arctic Freezer AIO I intended to buy for this build. Great review, thank you!
I think it would be very useful to add system (or CPU) power consumption during idling and/or gaming, because I think not a lot of people need their CPU to execute heavy tasks continuously...
@@robertmyers6488 I was speaking in general, not specifically to this CPU. Though I'd argue that even people who use high-end CPUs for rendering may have significant idling times. And that gamers do buy these CPUs
@@orangecapy9494 Gamers absolutely buy these chips. It's not that much more than a 13600K, and it does offer meaningful gaming uplifts (at 4K/120 every frame you can get counts), but without costing 50%+ more like that 13900K (now THAT'S a chip only a production person should worry about). I got my 13700K for $350, and it's well worth it for gaming!
@@__aceofspades You're burning an extra hundred watts to get the supposed benefit of this CPU; idle power isn't the issue. It requires more cooling and a larger power supply. Face it: it burns a lot of power, and that power has costs. The 7900X is a no-brainer for the same thread count.
I'm feeling just fine with my 13700K, which I upgraded to from the legendary X58 platform with a Xeon X5680 at 4.4GHz. I'll just say the gains were fairly substantial! As far as power usage goes, it seems Intel played it quite safe to make sure all CPUs remain stable, and cranked the Vcore much higher than it probably needs to be. Mine would near 1.4V at times, and with a -100mV undervolt I dropped the package temp from 87C to 78C in Cinebench. Not only did temps drop nearly 10C, power saw around a 50W reduction under load, and to top it off it scores the same if not higher, 30k-plus in MT. It seems undervolting is the new norm for both Intel and AMD, and if you don't do it you're missing out on a lot of free advantages.
The 13700k just makes way more sense for people who want "high end" but don't actually need production work tasks... And it's like 50% more money to get 13900k over 13700k if you shop around.
My heart says I want the 13900k, but realistically I know the value is just horrible compared to the 13700k since I usually just game. Decisions decisions
@@Chadimusprime87 I've upgraded from a 12700KF to a 13900K for future-proofing reasons. Plus, I have an expensive mobo and I don't want to replace it for a while.
I had the same chip from 2017 to 2020, but the "hitting a wall" thing is just not something I'm comfortable with, plus I hate to see my parts lose all their value and basically end up as giveaways or just sitting in a drawer. I sold my 7700K for $300 and got a 10900K at its launch, then sold that chip back in May this year for $390 and finally got a 13700K for $350. I've managed to keep upgrading, and yet I bet I've spent only about what you spent in the same time, or maybe even less.
Pretty pleased with my 13700k purchase. After weighing the extra platform cost to jump on a new AMD socket AND pay for DDR5, I couldn't make myself make the leap. Ended up buying a Z690 board and reusing my DDR4 memory. I made the mistake of buying an i5 6600k years ago. This was right at the end of the "4 cores is all you'll ever need for gaming" era. I'm ok spending a little bit extra to future proof my purchase.
I'm in the same 6600 boat: EOL, which is why, now that the 7900X is on sale, I feel inclined towards it at the same price, given how Intel messed me up. I could possibly even make do with a cheaper cooler.
When I bought my 6700K back in 2016, Awesomesauce ("now Bitwit") and Paul were saying an i5 is all you need for gaming, and in 2017 AMD out of nowhere reminded everyone that wasn't the case lol. I knew it in BF1 that 4 cores wasn't enough; that poor 6700K was sweating hard out of the box, and I even overclocked it to 4.7GHz at 1.4V for years lol. Once I went to a 10700K back in 2020, it was a night and day difference in every single game I play. The 1% lows upgrade was the biggest thing for me. 8 cores is the sweet spot, and I'll never go under 8 cores ever again. That's why I'm worried about the upcoming Intel 14th gen: I've been reading around that the top-of-the-line i9 CPU will only be a 6-core. Hope that isn't true...
The 5800X3D, 7900X, and 13700K are the only 3 CPUs I'd think about buying right now. I kinda lean towards the 7900X, as it and the 13700K are stronger in productivity (mixed use), but the 7900X draws a lot less power and actually increases in performance with a further modest down-tune in voltage (it avoids thermal throttling, though that's somewhat silicon dependent). I also want at least 8 big cores purely for the primary application, with some additional compute outside those for background stuff (which the 5800X3D lacks). That matters in productivity _now_ but could eventually matter in gaming too. More PCIe lanes on AMD, too.
I like the idea of the X3D technology combined with RDNA3, where the GPU gets faster cache access from the CPU. I'm curious how much of a benefit you really get, though.
I was planning to build around an i9-13900K for my first build, but stepped away from the i9 because of the reported high temps. I wasn't sure about that for a first assembly, so I went with the i7-13700K. Z690-E STRIX, 32GB DDR5-6200, 3090 Ti.
@@rounaks6519 well worth the upgrade, the 13600k is 2x as fast as a 5ghz 2500k in single core performance (3x as fast as a stock 2500k) and 5x faster in multicore workloads. They also overclock pretty well if you get a good cooler.
I don't play a lot of high FPS games, but strategy and simulation. Is there any chance of seeing a Civ 6 turn time benchmark when the 7000 series X3D comes out? Would be nice to compare 5800X3D 5950X 7700X 7950X against a hypothetical 7700X3D.
I replaced my 10900K with this chip and couldn't be happier. It's a beast in gaming, and for what little "productivity" stuff I do, it's been tearing right through that as well! Well worth it, ESPECIALLY if you can get it for Micro Center pricing of under $400.
@@blkshp25 That is the problem: always waiting for what's around the corner. It won't be cheap either, certainly not $50 more. It will probably be priced to undercut the 13900K, say $590 to $600, depending on how greedy AMD wants to be.
Great video, and it echoes my own personal gaming and "creator"/professional experience. I got a 13900K, as so much of what I do for work is single-thread/core-clock dependent, but for gaming plus part-time studio work a 13700K is a heck of a performer, and with the difference you could get (a couple of thoughts) faster DDR5 (if your workflow even cares about DDR5 vs. DDR4), spend more on a GPU, get a larger and/or quicker NVMe, heck, put it toward a custom loop...
Even happier with my purchase after watching this video. My 13700k will arrive Friday and I will finally be putting my 3600 to rest. Can't wait to see the difference in performance.
Purchased this CPU in November with a bundle from Micro Center, and I really couldn't be happier. Running it at 5.3, and currently using an RX 7800 XT, which Micro Center had at an amazing price. Had a tower that was about 10 years old, using an i5-3550 with a 1060Ti. Such a huge upgrade. Thanks for all the great videos, keep them coming!!
Purchased mine on launch weekend as a replacement for 6700k, happy with it so far. Only needed to undervolt for it to not throttle with a Corsair H105 under full load.
I got the i7-13700K for 450 euros, quite a big deal; 20 days ago it was 540 here. The CPU market is great, unlike the GPU market in Europe. Also, Intel supports 7200+ DDR5 now, which is finally a decent upgrade compared to DDR4-3600 at close to double the CAS latency. It's in the mail; thanks for validating my purchase. Now I can sleep well xD
I'm curious about this. I haven't done a deep dive on DDR5, but I've noticed the CAS is much higher. Is there a performance gain to be had in gaming when you double the speed but also double the CAS?
@@kernoleary1394 Well, from my basic knowledge, when you double your speed and double your CAS you get the same latency but at a higher data rate. So in basic theory DDR5 should perform equal or better in all cases (better when a high data rate is required, e.g. open worlds). But there are dozens of timings in a RAM kit, so take this with a grain of salt (and I don't know how they all affect performance).
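The "double the speed, double the CAS, same latency" point above can be checked with the standard first-word-latency formula: latency (ns) = CAS cycles / memory clock, and since DDR's memory clock is half the data rate, that simplifies to 2000 × CAS / data rate (in MT/s). A quick sketch; the specific kit numbers here are just common examples, not from the video:

```python
# First-word latency of a DDR memory kit in nanoseconds.
# DDR transfers twice per clock, so clock (MHz) = data_rate (MT/s) / 2,
# giving: latency_ns = cas / (data_rate / 2) * 1000 = 2000 * cas / data_rate.
def first_word_latency_ns(cas, data_rate_mts):
    return 2000 * cas / data_rate_mts

ddr4 = first_word_latency_ns(18, 3600)   # DDR4-3600 CL18
ddr5 = first_word_latency_ns(36, 7200)   # DDR5-7200 CL36: 2x rate, 2x CAS

print(f"DDR4-3600 CL18: {ddr4:.1f} ns")
print(f"DDR5-7200 CL36: {ddr5:.1f} ns (same latency, twice the bandwidth)")
```

As the comment notes, CAS is only one of many timings, so this is a first-order comparison, not the whole story.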
I just upgraded to the 13700K from a 9900X and it is SUCH an improvement. I actually had the i9 in my cart, but by the time I compared a few motherboards it had sold out. Honestly so happy that made me consider the i7 because I really needed that "cool your jets" moment 🤣 The difference between the i9 and i7 seems so negligible for my uses
I just put together a new PC and placed the order yesterday, and did the exact same thing as you. I was going with the i9-13900K, but after some research I learned that the i9 wasn't really practical for my application and the i7-13700 is more than enough. It'll be a HUGE upgrade over my current PC with its i5-6400.
Thanks for all your in depth reviews. I built my first PC a year and a half ago and to this day it's been going strong and stable the entire time. Started with a 2080 super and 10600k OC@5Ghz and I've since upgraded to 3080ti (after price drop, lol) and 10900k OC@4.8Ghz cooled by a noctua NH D15 in a Lian Li lancool II mesh. Almost all parts I got from you guys' recommendations and I wouldn't feel comfortable buying a PC part if I wasn't able to get the input and extensive reviews you guys provide.
The 13700K's extra power consumption versus its 8% to 10% performance increase over the 13600K starts to make the long-term energy cost a considerable factor.
The i7 also has more cores, so of course it will draw more power. The i7 and i9 are for people that do more than gaming at 1080p: i7 = gaming at max settings, ray tracing, 1440p-4K, plus streaming; i9 = everything the i7 can do, plus 3D and render work and more. But as others have pointed out, Intel is doing an old trick from AMD's pre-Ryzen days: jack up the power to gain more performance.
If you care about power consumption, you can just limit the CPU power available. At the cost of reduced performance... maybe. Seriously though, in gaming it rarely goes above 150W, usually more like 110W. And like I said, you can limit it for full-load scenarios: decrease the power to half, say, and only lose something like 15-25% of the performance. You'd be surprised how efficient even Intel chips can get when they're not pushed to the maximum. Though on average, the Zen 4 chips are still doing better (being on TSMC's 5nm helps a lot).
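To make the "half the power, 15-25% less performance" trade-off concrete, here is a quick perf-per-watt calculation. The 250 W stock figure and the 20% loss are just assumed illustrative numbers from the range quoted in this thread, not measured values:

```python
# Performance-per-watt comparison between stock and power-limited operation.
def perf_per_watt(relative_perf, watts):
    return relative_perf / watts

stock_eff = perf_per_watt(1.00, 250)     # stock: 100% performance at ~250 W
limited_eff = perf_per_watt(0.80, 125)   # limited: ~80% performance at half power

gain = limited_eff / stock_eff
print(f"Efficiency gain at half power: {gain:.2f}x perf/W")
```

So even losing 20% of performance, halving the power limit yields roughly 1.6x the efficiency, which is why power-limited Intel chips look far better in perf/W than stock numbers suggest.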
@@_pant0m You don't even need to offset anything. I have an ASRock mobo with an i5-13600K, everything at stock; I just changed CPU Vcore Compensation to Level 1 and CPU Cooler Type to Air Cooler, and now I'm getting 70-75C in Cinebench R23, cooled with a Deepcool AK620. This CPU rocks!
OK, but if you are pairing this with an RTX 4080 or 4090 and running games at 4K, does the 13700K have an edge over the 13600K, or is it simply GPU-bound? I.e., will running games at 4K on an RTX 40-series card hit CPU bottlenecks with a 13600K?
Hello Gamers Nexus team! Thanks for all the reviews of all the new stuff! Do you know if my i9-9900K would bottleneck 4K gaming if I pair it with a 4080/4090 or 7900 XTX?
Just installed my new 13600k with a z690 (all hail Q-Flash Plus) following your review of it. These new benchmarks just further cement my decision. I really wanted to go AMD to upgrade my ageing 8700k, but the value in the 13600k is just crazy. Especially paired with a discounted "old" MB and the RAM I already had. I hope AMD can do something for the 3Ds to redeem themselves on this generation a bit, the cost of entry is just too high and not justifiable right now. Thanks Steve !
In the same boat. Intel's offer was just too good. AMD is just not off to a good start this gen, but as you said, the X3D will properly fix that. I just can't wait that long for a maybe. Hope AMD can change my mind when I need to upgrade in 6 years.
Ryzen 7000 already received heavy discounts here in Germany. The 7700X was 460€ at the time I bought it and the price dropped to 370€ last week (390€ as of today).
@@gasbrenner2264 MSI B650 Tomahawk WiFi (270€), 32GB G.Skill Trident Z5 Neo 6000 CL30 (265€), and a WD SN850X 1TB SSD (~110€). Also grabbed an Arctic Liquid Freezer II 420 for 80€ off the Arctic eBay store, but a 360 or even a 280 is more than sufficient. Just check whether your cooler is affected by Arctic's recall and has the corresponding "QC" mark if you're buying from a 3rd-party seller or 2nd-hand.
Thanks for a great review and comparison. I am interested in buying either the 13600k or the 13700k, it will probably be used a lot for emulation. (rpcs3, yuzu, cemu etc). Not sure if the 13700k is worth the extra money or if would get (close to) similar results with the 13600k. I have read that the intel cpus are generally better than amd for emulators like rpcs3, but not sure how accurate that is. Anyway, thanks again :)
Thank you for the quality videos. I’m leaving a comment because I heard that helps channels out. I always buy mid grade equipment since I only play simple games but I like staying informed.
Very good coverage/explanation, as always; kudos. Personally, I just recently purchased a 5800X3D as the last upgrade for my AM4 board, as it's the most "optimal" for my specific workloads (which depend more on RAM speed and suffer a lot from cache misses, rather than raw CPU speed) thanks to the bigger cache. Also, a complete rebuild for the next gen is not financially viable for me, considering the board, PSU, etc. upgrades that would be required.

A test for those kinds of workloads could be interesting for viewers, even if it's not the typical AAA gaming load (Minecraft is still a very popular game that falls into that category from what I gather, and Factorio is a decently popular game where the community has already established a benchmarking process to compare hardware). Just to show that it's not necessarily only raw speed that counts for everything.

Considering the current energy prices (at least over here in Europe), it might also be interesting for "consumers" (read: viewers) to add a test comparing power usage at a non-maxed-out "reference load" and at "max load" for a given setup. It might also be interesting to compare this across generations to see how it develops over time, given that the current trend seems to be "just draw moar powa to make it go fasta" (obviously simplified). I'm going to assume any numbers generated by such a test would need to be taken with a grain of salt, though, since a more powerful CPU might generate more load for other components and increase their power draw as a second-order effect. Still interesting, though.
I love this data. Really showing some real competition between AMD and Intel depending on what you are doing. Which is really great to see. Intel finally has to work harder and price accordingly.
As a suggestion, I would love to see some data on idle power usage for CPUs and systems in general. I saw a nice video by @theTechNotice saying Intel's idle power was like 50W lower than AMD's. At 10 hours a day, 5.5 days a week, and £0.34 per unit, that's around £50 a year of difference, just for the CPU. And that's now; if you're looking at a 5-10 year lifetime, who knows where prices are going. That could be a point for Intel in an otherwise close race. In productivity, most of the time is spent not building or compiling but editing and thinking.
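The ~£50/year figure above checks out with straightforward arithmetic, using the commenter's own assumptions (50 W idle delta, 10 h/day, 5.5 days/week, £0.34/kWh):

```python
# Annual electricity cost of a sustained power difference between two systems.
def annual_cost_gbp(delta_watts, hours_per_day, days_per_week, price_per_kwh):
    # kWh per year = kW * hours/day * days/week * 52 weeks
    kwh_per_year = delta_watts / 1000 * hours_per_day * days_per_week * 52
    return kwh_per_year * price_per_kwh

cost = annual_cost_gbp(50, 10, 5.5, 0.34)
print(f"~£{cost:.2f} per year")  # 143 kWh/year at £0.34/kWh
```

That works out to 143 kWh, or about £48.62 per year, so the "around £50" estimate in the comment is right.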
Great vid, guys. Before the recent price drop, I was wondering if the $100+ premium for the 7900X over the 13700K was worth it, and this sheds a lot more light on my inevitable decision.
What a top-notch review. Most helpful as well, I have a 9th gen i5 system that I want to gift to my little brother and I've been exploring what my options for a new system are. Thanks for making this
I got it, and I think it's a winner: cheaper than the competitor, great gaming, and great single-core and very good multi-core productivity performance. It also runs the DDR4 I could port over from my previous build.
I just picked up a 5800x3d off of Amazon for $345. It would seem AMD has yet to kill it off. I don't blame them, though. Because now I have a gaming CPU on my AM4 platform that will be competitive for the next three years.
Why would they kill it off? It's a 7 month old chip with users like you in mind who don't want to spend a ton on mobo and RAM upgrades. And it's like 5% slower than the 13900K at 4K with 4000-series cards.
@@pauljeffs7 You misunderstand. 5% slower is nothing. I'd buy the 5800X3D over the 13900K in a heartbeat. It's weird how you thinking that I touted 5% better framerates as some kind of accomplishment actually makes sense in the current landscape. But even in the comments some people really will only look at the fps metric and conclude that it makes sense to get the 13900K just because it's "the best".
Another wonderfully informative video! I recently upgraded from a 9700K to a 13600K a few weeks ago, and this review further cements that I made the correct choice and that it's the best value/performance CPU for gamers at the moment!
I just upgraded to a 13700K from a 9700K, since I wanted at least 8 performance cores to match the 8 cores I had before and really feel like I got a full upgrade. I went with Intel purely on cost: being able to keep my Samsung B-die DDR4 with a cheap Z690 DDR4 motherboard saved quite a few pennies compared to most DDR5 and AMD boards. It's kind of funny how the roles have reversed and Intel has become the "budget friendly" option.

The temperature was a bit of a shock initially. I use an Arctic Freezer II 360 AIO with Arctic MX-6 paste and a Thermalright contact frame, so I expected to be well set, but even with this setup some cores would reach 90°C+ fairly quickly under the AVX2 load in Cinebench R23. It also boosted up to 1.462 V under full load, which was another shock coming from a 9700K I ran at 1.34 V, where about 1.36 V was the maximum feasible on air cooling with acceptable temps. That said, XTU's CPU stress test shows a much more reasonable 72-80°C max, and gaming sits around 50-60°C.

Since the temps were still "acceptable", instead of undervolting I pushed clocks as far as possible at stock voltage (or whatever the motherboard was feeding it). I only had to change LLC on my ASRock board from Level 5 to Level 4 (ASRock's levels are reversed: Level 1 gives the least droop, Level 5 the most) and got 5.7/5.6 GHz stable across the cores, up from the stock 5.4/5.3 GHz, at pretty much the same temps as before. XTU reported roughly 213 W at LLC 5 and 230-241 W at LLC 4 during the stress test, for whatever that's worth, but the change stopped the CPU from failing the test. LLC 5 did run a bit cooler than LLC 4, too.
The memory controller appears much more stable than my previous 9700K's, so I reached a slightly higher RAM frequency with my Samsung B-die DDR4-3200 CL14 4x8GB sticks at the same voltage. In AIDA64 I went from roughly 47,000 MB/s read, 46,500 MB/s write, 48,500 MB/s copy, and 58 ns latency with the default XMP profile to about 53,000 / 51,000 / 55,000 MB/s and 49.5-49.8 ns latency just by tweaking the timings and frequency. I'm only running DDR4-3400 CL14-14-14-28 at 1.41 V, so I'm not really pushing these sticks I've had for a very long time, but the performance feels very good for such modest settings, so I might keep it like this, as it will be a while until my next upgrade. My 3DMark CPU score went up from 18,000-something to roughly 20,500 with the OC settings.
I would give both of my kidneys to have GN use their resources to benchmark ARMA 3 on these CPUs. It's so incredibly CPU-dependent and scales well with cache, and I'd be so curious to see how it scales and performs on different hardware. Unfortunately, I don't think the game is quite popular enough to ever be used in a benchmark.
I have the 12600K and it's great; I really like it. I have it paired with an MSI Ventus 3X 3080 Ti, and I play at 1440p on a Samsung G50A 27". Great CPU, and I get some pretty crazy FPS at 1440p.
I think your Premiere Pro tests are inaccurate. I guess you turned off the iGPU, so Intel Quick Sync didn't work in your tests? The 13900K should be 20-30% faster than the 7950X.
They're testing CPU-bound cases anyway, so it really doesn't make a difference, and they'd have to retest every other CPU with the 4090 to have consistent data; that's crazy to ask for.
I know it would be a lot of work, but I would also like to see this, plus another test when the 7900 XTX comes out, to see whether certain CPUs work especially well with certain GPUs.
Interesting that the i9 often has lower lows than the i7. I'd argue that this is more important than the averages when averages are so high. You won't notice 160 vs 180 average but you might notice 60 vs 80 minimums.
Thanks for the patience on this one! We finished it ~2 weeks ago, but we didn't complete the editing since we were working so much on that 12VHPWR story with the 4090. But now it can finally go out!
Watch our revisit of the AMD Ryzen 2000 series! ua-cam.com/video/VcnfsEjckqY/v-deo.html
Find our Intel i5-13600K CPU review here: ua-cam.com/video/todoXi1Y-PI/v-deo.html
And our Intel i9-13900K review here: ua-cam.com/video/yWw6q6fRnnI/v-deo.html
Hey Steve, any plans to do a 4080 teardown? Or would there be no point because it’s similar to the 4090?
There has been a ~20% AMD CPU price reduction in between.
The 7950X is only USD 555!
Why isn't there a CPU benchmark for CPU-heavy strategy games like Crusader Kings III or Stellaris? Like how long it takes a game to finish playing on max speed; the games get really sluggish at the end.
Why don't you include 4X games like Civ6 in the game tests which can actually stress test the CPU, instead of running 1080p tests with a 3090Ti? You're testing at 1080p because at higher resolutions you're GPU bound, I get that. But these 1080p tests are rather pointless, as not many people play 600 fps 1080p games. Phil's Hardware included Civ6 tests and there you can clearly see which CPU is better in a real world relevant scenario.
Looks like AMD lowered their prices for Ryzen 7000 across the board. I’m so glad we have such a competitive CPU market now.
You're still coping, huh?
The 13900K beats AMD in almost every game. AMD failed this gen; we're back to the pre-Ryzen days, I guess.
@@luxemier lol, you live in a world where only gamers exist?
@@luxemier ?? It's still competitive lmao
@@luxemier First, it trades blows with the 7700X, and the 7900X and 7950X are workstation CPUs that suffer a bit of latency due to the dual CCD. Second, wait for the 3D SKUs lol. And that's only gaming; in productivity the 7950X wins or matches the 13900K in all workloads.
Sadly it's only a temporary Black Friday thing; it's already been confirmed, so we're most likely back to MSRP after this week or next.
Your videos are the reason I was able to make an informed decision about buying what is best for the buck in my budget range. You guys with your top-notch reporting managed to save me hundreds of euros in the long run. Thank you for the work that you do.
So awesome to hear that! Thank you for letting us know, and glad to hear you're happy with the outcome!
They really are the absolute best. True professionals, without losing the common touch.
You still need to Google the prices in your local country. In mine, the R7 7700X = 391 euros, the 5800X3D = 323 euros, and the i7-13700K = 499.99 euros.
I hope you've sent them at least a 2-euro tip, when they've saved you several hundred.
@@GamersNexus Yes, Gamers Nexus and Hardware Unboxed are the best.
Thank you for this review! I hadn't upgraded my PC in over 10 years; I was still running an i7-4790K and a 980 Ti and could still play games. But I splurged and got this CPU and a 3080 Ti, and I'm so glad I did. Night and day difference, and this review made me happy. Thank you!
I also moved from the 4790K to the 13700. The 4790K was a great chip, but there is simply no comparison.
I went from 4790k to Ryzen 5 3600 to the 13700k.
It's an absolute beast
I had a 4790K too, moved to this CPU, and am now looking to get AMD's 7900 XTX.
What a massive jump, like going from a PS1 to a PS3. Nicely done!!
I'd possibly still be running 4790k and GTX970 to this day if it wasn't for PSU dying, CPU temp problems and GPU software problems. Performance-wise they still hold up "acceptably" in 99% of games for 1080p.
Crazy, I was literally just checking your channel to see if you guys covered this chip yet. Thank you GN!
MAYBE WE WERE CHECKING YOUR SEARCH HISTORY
@@GamersNexus that entry on my search history 3rd down, it was for research purposes, I swear!
This makes me feel good about my purchase of a 13700k for $400 flat (thanks MicroCenter). I upgraded from an 8700k so the improvements are just crazy.
Wow! 8700K was good for years, though!
I'm looking at doing the same upgrade; did you go with a Z690 or a Z790?
I got mine at Best Buy for $379.
I'm still running an 8700k as well. It still runs great, but I recently upgraded to a 3080 and I'm wondering if I should upgrade my CPU now.
Same, but from the 8086K, and to the 5800X3D. The options are great, and you can't go wrong with either vendor atm.
Upgrading from a 4790K to this CPU was a massive boost in performance!
I did the same. I was considering waiting another year, but in my case I'm OK with saving money by sticking with DDR4.
Bruh, that's like going from a Fiat Panda to a Bugatti Veyron. Congrats.
One of the best things to remember with that kind of upgrade too -- almost anything will be a huge increase, so it's hard to go wrong!
@@GamersNexus Truth. When they dropped the 5600G to $110 a few days after, I had a few second thoughts... but I've learned my pattern of behavior actually prefers fewer upgrades over longer periods of time, so I don't have buyer's remorse anymore. If we get 2x the performance at the same price within the next two years, though... then maybe. But I'm just glad we got out of the Ivy Bridge to Kaby Lake slump.
Yooo nice, I'm upgrading from a 4770T to a 9900K I got for a great deal
Upgrading from 4790K to 13700K. First time in all those years I've felt it is worth it. Hoping this one will last just as long!
Crazy that I’m in the exact same boat almost 8 years on my 4790k what motherboard are you going with
@@jkiu2692 I'm also still on a 4790K; that thing won't die and still works OK. It's by far the longest-lasting CPU I've ever had, and I'm still not convinced I need a new one yet.
Who are you and why are you copying my cpu choices
Crazy that I’ve used a 4790k for over 7 years and just decided to upgrade to a 13700KF and so did you
Well, it's 9 generations of difference so of course it's worth it. 🤔
Upgraded to the 13700k from the 4790k. The improvements are just unreal. I loved the 4790k though...
holy paytience
@@Fahif808 Patience or being poor, that is the question...
I did the same, from the end of DDR3 to the start of DDR5 !
@@novideohereatall you can be poor and still be impatient. So either way 🤷♂
same! from i5 3570k to this
Man, I was considering going for the 13600K instead of the 13700K, but then I thought, "I'll watch the GN review." After watching this, I decided that at $350 US for the KF (it was on sale) against $300 for the 13600K, it was a no-brainer upgrade from the 4470K I've been running all these years.
Thanks so much! I always love your reviews and your processes, and I love the fact that I was finally able to act on an upgrade, after years of watching.
Don't go KF unless you know exactly what you need; the Intel iGPU acts like a booster, helping speed up Premiere, streaming, and everything else. If you do anything related to Adobe, the K is a no-brainer.
@@zihechen3111 Thanks! I mostly do gaming and very light streaming (older games most of the time, lol), so I went KF either way. I'm still rocking a 1080 Ti and I'll hold off one more gen to upgrade that. But yeah, I do no workload-related activities on this PC.
@@zihechen3111 So the KF is more for Adobe and not for gaming? I don't understand the difference between the K and the KF.
@@rainretribute9852 The K is more for Adobe. The KF does not have an iGPU, the K does, and the iGPU is a must-have if you want to do any Adobe work.
Upgraded from an 8700K to a 13700KF. I have it power-limited to 190 W via PL1/PL2, and it achieves basically the same performance in games, and falls just shy of stock in benchmarks. Best of all, it avoids the annoying thermal spikes when a load starts!
So are you saying it's not worth upgrading?
@@ganomaly Ah I see how my comment is confusing. I meant that the 13700KF performs the same in my games when I limit the PL1/2 to 190W compared to stock. Not compared to the 8700K.
In terms of whether it's worth upgrading, it depends on your current CPU and what you're trying to achieve.
I still have the 8700K at 5 GHz and I'm wondering whether I should upgrade or wait a tad longer. Either AMD or Intel, not sure, haha. Thank you for the reply. It'll be for gaming, btw, and I just picked up a 3090 a couple of months ago. Any advice would be appreciated 👍🏼
Happy that I went for the 13600K; I saw an 80% improvement in iRacing FPS over my overclocked 9600K. Probably even more now that I've discovered MSI's "CPU Lite Load" mode, which was set way too high from the factory. CPU temps dropped almost 20 degrees by changing a single parameter in the BIOS.
What does that mode do and how did you fix it?
@@mattdayman9632 It changes how much voltage is added "for safety" under load/turbo conditions. Too much voltage gives an unnecessary temp increase.
@@peiiider How do you do this?
I'd had an i5-8500 since 2018. A few days ago I bought an i7-13700K and I'm super happy with it. I'm currently using a Noctua NH-D15 with it; it gets a little toasty under high load, but the performance increase is great. As always, great review.
I'm planning on getting either a 13900K or a 13700K, but no way am I running either without a 360 mm water cooler lmao.
Although it's less common in gaming, I would appreciate it if emulation benchmarks were included at some point. They're more CPU-intensive than GPU-intensive, and emulators have been making strides in optimisation and compatibility as of late.
Just grab an AVX-512-capable i7-12700K batch with DDR5 and ignore Zen 4 and Raptor Lake. The IPC is the same as 13th gen, and only about a 6% downgrade from Zen 4.
Would love to see more about coolers, temps and how to tame this beast of CPU. Definitely a great value for 3D work, but power draw worries a bit.
great value for 3d work ?? it's a "C"pu not a "G"... 🤣
I was literally in the market for a 13th-gen i7 or i9, and this was the video I needed! I imagine many others are making this transition as well during the holidays. Great content as always! I will continue to give you my watch time and likes!!
So which one are you getting?
When I built my current computer I put in a 6700K, and everyone told me I'd wasted my money. I explained I wanted to keep using it for about 6 years, but it didn't matter; I'd apparently wasted my money. It took only 2 or 3 years for the 6600K everyone told me I should have bought to start struggling in games. I'm not making that mistake this upgrade, so I'm skipping the 13600K again.
@@Krytern Sorry, I don't understand 100%. Are you getting the 13600K instead of the 13700K?
@@rainretribute9852 I got the 13700k
The generational increases from Alder Lake are really impressive, especially compared to how Intel CPUs used to be for years. Love competition!
True, Intel CPU power is unique in that it works much harder to push maximum FPS at 1080p, much more than at 1440p and 4K? I think the GPU is the biggest part of any bottleneck, if one exists.
I'm in the middle of a complete rebuild from my 8700k (sold my tower before a large move) and I decided on the 13700k shortly after launch. This definitely helps me feel good about my decision.
Same here except 9700k
Same as well from a 7700k
I'm so torn now because of the AMD price drop. Going from a 9700K as well. I have a $100 Amazon gift card too, so I can get the 7700X for $250 or the 7900X for $374. My third option is the Micro Center 13700K for $380.
@@foster4241 Just remember AMD's new stuff is DDR5-only, if that matters to you price-wise.
As of late August 2023, Micro Center has this bad boy on sale in a bundle. The CPU came out to 260 bucks, I got 32 GB of DDR5-6000 CL36 for like 65 bucks, and a Z790-P WiFi (didn't want the WiFi capabilities, but whatever) for 170 bucks. Couldn't pass it up. My 3070 is very happy, and so is the credit card company lmao.
Been waiting for this! Been eyeing this to upgrade from my 8700K
I got it for $350 at Micro Center, when it was on sale, last week.
Great price!
I got mine on launch day via Best Buy for the same price. I had them price-match Micro Center, then used a 10% off coupon, plus I had a free $200 gift card from a recent phone purchase and some rewards points. All in all, I ended up spending about $150 after tax for mine!
I definitely already watched the video and am writing this comment as a reflection of what I have witnessed in said video
😆
I, too, have thoroughly consumed said video and observed Steve loudly proclaim something at some point in this video. Dry sarcasm may have been involved.
Actually an informative and interesting video, now that I have actually scrolled through it. Serious consideration for a 13600K for me, though; it doesn't seem like the 13700K is worth it for the money. I do video editing and light gaming, and my FX-8350/HD 7870 is feeling a bit tired. Seems like Intel is on top this time around.
Built a system for a pal who bought a 12100F (unfortunately), and honestly, even at the extreme low end it absolutely blew me away how quick it was, considering I'm still running an overclocked FX-8350.
Because Bulldozer IPC was so bad that first-gen Intel Core is faster per clock, and anything above it feels high end compared to that piece of trash. To be more accurate, it's not the IPC itself; it's the terrible floating-point speed.
Alder Lake and Raptor Lake's P-core floating point is ridiculously good. They're emulation beasts and tear through digital audio workstations (which are all floating-point based).
@@saricubra2867 He’s running piledriver, not bulldozer. That’s the 8150
@@ARH0101 Same sh*t.
@@saricubra2867 No, they're not. They're both FX chips of the same microarchitecture; that's it.
@@saricubra2867 For certain tasks it was even worse than Core 2 Duos, lol. I distinctly remember an overclocked E8600 beating an overclocked 8150 in Skyrim, lmao.
The CPU market is finally exciting again! You really love to see it! I can't wait for DDR5 to come down more in price so I can take advantage of these latest improvements, though; it's still a little out of my price range. Thanks once again, GN team. We appreciate the work you all do!
Is it? An entry-level AMD platform is like €1000 with DDR5, a 7700X, and a mobo.
There's no games that even seem to be benefiting tbh :( the problem right now is the game development itself
@@infernaldaedra Nope, you have never needed a good cpu for gaming.
I run a 4K OLED TV as a monitor with a 5820K from 2014, and I'm not CPU-bound at all with my 6900 XT.
@@DuBstep115 Cap
@@DuBstep115 What is the make of the monitor, and how is it with text/coding?
From a 9900K to a 13700K was noticeable, and I got to keep my DDR4!
Also 5.5 GHz on the P-cores without any effort, making it basically a 13900K for gaming.
I was looking at this upgrade... was the additional heat output noticeable as well? My room's already getting too hot with a 30-series card and an overclocked 9900K.
@@Mekojonessharktooth You're putting out 250 watts of energy, and a lot of that will be heat, so yes, it will expel more BTU than a 95 W 9900K. I have a 9900K and just upgraded to a 13700K. I was able to undervolt vcore by -0.090 V to get temps to stabilize around 80°C; my 9900K was undervolted too and would hit around 72°C under max load. So there's about a 12% increase in temps. That doesn't directly correlate to how much warmer your room gets; there are so many variables.
TL;DR: yes, your room will get warmer, roughly 10% faster or warmer than before, depending on room size, ventilation, and cooling.
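For anyone sizing room cooling, the conversion behind "250 watts of heat" is simple. A rough sketch, assuming the 250 W and 95 W ballpark load figures from the comment above (not measurements), and that essentially all CPU package power ends up as heat in the room:

```python
# Convert a sustained power draw to heat output in BTU/h.
# Assumption: 250 W (13700K-class load) and 95 W (9900K-class load)
# are the ballpark figures from the comment above.
def watts_to_btu_per_hour(watts):
    return watts * 3.412  # 1 W ≈ 3.412 BTU/h

print(round(watts_to_btu_per_hour(250)))  # → 853 BTU/h
print(round(watts_to_btu_per_hour(95)))   # → 324 BTU/h
```

That's an extra ~500 BTU/h under sustained load, which is why the room-heating difference is noticeable even when CPU temperatures look similar.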
@@Mekojonessharktooth Just go and buy AMD. Much more efficient.
DDR5 prices are only a bit lower than DDR4 right now, if anyone's getting a new Mobo, it could be the time to go for ddr5
(But yes absolutely, if you already have expensive high quality DDR4 kit, keeping it is the better option at the moment)
@@lagarttemido just stfu about amd
I’m finally retiring my 4770k for the 13700k. Can’t wait to put my new rig together.
Same here LOL ... rocking with 4770k and RTX 3090 ... Maybe I go with 13700k too =)
I wanna go from 5800X to 13700K. Intel for the win.
I went for the 12700K coming from my i7-4700MQ (similar to the i7-3770). Raptor Lake IPC is the same as Alder Lake's.
This is among the clearest and simplest-to-understand comparative visuals I've seen on this platform. Thank you for your work.
I was able to get a 13700k this weekend at $370 (thanks MicroCenter), looks like it should be pretty good value at that price! It's one hell of an upgrade from a 2700X. Thanks GN!
I paid $350 on launch day thanks to a price match at Best Buy plus a 10% off coupon, then had a free $200 gift card from a recent phone purchase, so I paid like $150 for mine!
good for u
@@TJ.85 Nicely done!
How is the power draw compared to the 1200k? Do you know?
@@guilladogui8259 Ginormous.
I'd love to see a section about undervolting in these CPU reviews. Can you undervolt it? What chipset do you need? How much performance do you lose at each power target?
With these modern CPUs and their enormous power consumption and cooling requirements, that gets more and more interesting.
I guess the reason undervolting doesn't show up in reviews is that it's almost entirely dependent on the silicon lottery. One CPU might be great for undervolting, and another sample of the same SKU loses stability as soon as you touch the voltage.
Seriously beginning to feel like picking up the 5800X3D was a great decision. I only use my PC to game, and it's still very solid compared to most recent CPU releases. Couldn't be happier.
I have yet to see a recommendation against the 5800X3D if all you do is gaming. Well, it depends on resolution as well; it matters less at 4K, but it might still matter quite a bit, especially in the lows, if the particular game you like can avail of the large cache. Of course, it depends on what you paid for it as well.
@@Vantrakter 1440p paired with a 3080ti. Got it for something like $375? It was on sale like 6 months back. Sold my old 3700x to a friend in need of a CPU for $100, so ended up even cheaper in the end. For most of the games I play I didn't notice much of a difference, but in Escape From Tarkov and a few other titles the difference was fairly noticeable.
13600k/13700k is better? why did you buy last gen?
@@jordanplays-transitandgame1690 Because I started building it almost a year ago...
@@tonederf6419 oh then lol you got a good build
I want to thank the GN team in it's entirety, you guys are my personal favorite YT channel - and one of the extreme few that I watch constantly. Some things I really appreciate: your integrity when it comes to dealing with companies, and other business relationships, and the clearly high levels of pride you rightfully take in your own work.
I notice that, unlike other YouTubers, you don't jump-cut your speech every 3 seconds, and instead just speak passionately about the subject at hand. It also seems like you, Steve, are not reading word for word off a script, and instead reference well-written notes. This makes the delivery of the information and the video itself feel authentic, warm, and real. Also, the editors and graphics artists have done wonderfully. I love the falling timer bars on the sides of the screen when displaying a graph: a small yet very welcome and helpful detail that few tend to think of. Very well done, GN. You are the model that others should strive to be, and you have my respect.
Thank you for the phenomenal videos and entertaining hardware and engineering documentation.
Going from an i7 3700k to the 13700k is crazy. I loved that 3700k, but this new one is fantastic. Got it for only $329 new from Microcenter too!
I have an i7 3700K too. Good old CPU.
Dude, I swear I'm just like you. I went from an i7 3700K to the 13700K too, and also from a GTX 980 HOF to an RTX 4070 Ti ROG OC. OMG.
@@016duda Haha nice, I also had a GTX 980 and went to a 3090 Ti.
@@OGZxK1LL3R we hit our lowest point dude 😂❤️
Hello! For the production section of the CPU evaluations, have you considered adding music production software, whether that be Ableton Live, FL Studio (FruityLoops), etc.?
Y'all are the gold standard for reviews in my book, and knowing how the music-to-gaming overlap shakes out would be valuable to someone like myself.
Thanks!
How would you guys feel about changing your CPU Blender benchmark around a little?
Instead of showing a graph of render times across all CPUs, would there be a way to benchmark viewport FPS, like just scrolling through your intro animation? No one is (or should be) rendering with their CPU anyway, yet CPUs seem to make a bigger difference in viewport performance, where you actually do the work before handing the scene to the GPU to render. Or I might be wrong.
Love your work as always! Cheers
It doesn't matter. The test is meant to show performance compared to other CPUs, and they have data for older CPUs to compare against; it wouldn't make sense to switch to a new test when they already have all that data for comparing against other CPUs.
@@samgoff5289 Fair point. It would be like adding another game to the chart. Annoying yes but over time it could work.
I just feel like the Blender charts are a little misrepresentative. Example: TR 3970X vs. 13700K. On a normal chart the Threadripper would win, since it has more cores and can render faster, but no one renders on CPU power, and I bet the 13700K would be an overall better experience in the viewport. But I might be wrong! Maybe it's time to dig out the old parts at home and test them.
No joke, I was just searching Reddit a couple hours ago trying to find out why there wasn’t a GN 13700k video. You guys rock!
Im planning on building a PC soon and your reviews have been invaluable, thank you
Great review and benchmarks. I recently got one, and I must say I'm very satisfied with it, even though I could've gone with a 13600K.
Good stuff! Went from a 9700k to a 13700k, definitely have noticed a performance improvement
Yeah! A 2-3 generation interval was the best upgrade path.
I'm on 9700k and my 13600k is arriving soon. Can't wait.
You guys really enjoy buying into dead platforms, don't you? How did it feel watching everyone upgrade on AM4 for cheap? Ready for history to repeat itself with AM5?
@@lagarttemido I mean, considering we got the others at release 4 years ago and only upgraded now, none of us are even going to notice, because, as you may have noticed, we're not the type to upgrade every single time a new thing comes out.
@@lagarttemido I've been looking through all these comments and no one is opting for the 7700X or 7900X, which I'm thinking of getting, because in 2024 or 2025 I can drop in whatever chip is beating 14th or 15th-gen Intel with no motherboard swap. I don't know why you blame them, though. That "dead" platform is insanely fast, and stable, and Intel currently has way higher memory bandwidth than AM5 is achieving with DDR5. A lot of Intel users upgrade every 5-10 years, and a new platform is always required at that frequency. I do think it's a perk to be able to upgrade in 2-4 years to whatever the latest CPU is for your platform, but the people I know in the gaming community don't do a CPU swap unless they're going to the latest chips on the latest platform. Hence why people don't care that the socket is "dead".
Love the i7 chips. Looking to upgrade my motherboard and chip from the i7-7700 to the i7-13700K (currently priced at $375.99). This upgrade is also going to be my first step into AIOs, and I'm glad you tested performance with the Arctic Freezer AIO I intended to buy for this build. Great review, thank you!
I think it would be very useful to add system (or CPU) power consumption during idle and/or gaming, because not a lot of people need their CPU to execute heavy tasks continuously.
You shouldn't buy this CPU just for gaming, so that's irrelevant; buy it if you render. Then again, you get the better of both with the 7900X.
@@robertmyers6488 I was speaking in general, not specifically to this CPU.
Though I'd argue that even people who use high-end CPUs for rendering may have significant idle time. And gamers do buy these CPUs.
Intel has lower idle power consumption because AMD's I/O die drinks power when idle.
@@orangecapy9494 Gamers absolutely buy these chips. It's not that much more than a 13600K and does offer meaningful gaming uplifts, and at 4K/120 every frame you can get counts, but it does so without costing 50%+ more like the 13900K (now THAT's a chip only a production person should worry about).
I got my 13700k for $350 and it's well worth it for gaming!
@@__aceofspades You're burning a hundred watts more while getting the supposed benefit of this CPU; it's not idle draw that demands the cooling and the larger power supply. Face it, it burns a lot of power, and that power has costs. The 7900X is a no-brainer for the same thread count.
Thanks as always for your reviews GN! Hope we can get some more RAM and HDD & NVMe benchmarks?
I'm feeling just fine with my 13700K, which I upgraded to from the legendary X58 platform with a Xeon X5680 at 4.4 GHz. I'll just say the gains were fairly substantial! As far as power goes, it seems Intel played it quite safe to make sure all CPUs remain stable, and cranked vcore much higher than it probably needs to be. Mine would near 1.4 V at times, and with a -100 mV undervolt I dropped the package temp from 87°C to 78°C in Cinebench. Not only did temps drop nearly 10°C, power saw around a 50 W reduction under load, and to top it off it scores the same if not higher, 30k-plus in MT. It seems undervolting is the new norm for both Intel and AMD, and if you don't do it you're missing out on a lot of free advantages.
I also upgraded from an intel 4790k to the 13700. The increased performance was very noticeable.
undervolting has been the thing to do for the last 15 years ...
Micro Center has the 13700K on sale for $379 as part of their early Black Friday deal! Picking mine up tomorrow 🤪
1:37 Thank you for listing the CPUs and prices. Normies like me don't have all the names and prices memorized; the table is really helpful.
Can't wait to see it compared to the 8080
I have a i9-13900k and I still can't stop making googly eyes at the i7-13700k...
The 13700K just makes way more sense for people who want "high end" but don't actually need production work tasks. And it's like 50% more money for the 13900K over the 13700K if you shop around.
This is a very awesome review. Hopefully you'll feature 13th gen benchmarks on ddr4 motherboards and memory soon
I was waiting on this review to decide between AMD and Intel this generation. Great review!
I upgraded from an 8600K to a 12700K via the Micro Center deal. It's blowing my mind how powerful these new chips are.
Absolutely! I went from a 9700K to a 12700KF. I haven't used an app or game that uses the whole CPU yet.
9700k to 13700k now… haven’t gotten to try it yet but 👀
My heart says I want the 13900k, but realistically I know the value is just horrible compared to the 13700k since I usually just game. Decisions decisions
Same here, love my 12700kf
@@Chadimusprime87 I've upgraded from a 12700KF to a 13900K for future-proofing reasons; plus I have an expensive mobo and I don't want to replace it for a while.
I decided to go with this from an i7-7700K. It definitely served me well, but it was definitely hitting a wall in anything CPU-heavy at this point.
I had the same chip from 2017 to 2020, but that "hitting a wall" thing is just not something I'm comfortable with. Plus, I hate seeing my parts lose all their value and basically end up as giveaways, or just sitting in a drawer. I sold my 7700K for $300 and got a 10900K at its launch, then sold that chip back in May this year for $390 and finally got a 13700K for $350.
I've managed to keep upgrading, and yet I bet I've spent only about what you spent in the same time, or maybe even less.
@@TJ.85 If you enjoy upgrading that frequently you really should have moved to AM5.
Im going from a 6700K and it still serves me well, apart from VR where it really struggles
@@lagarttemido Hehe true, and aim for the 7800X3D next year ;)
@@laggmonstret so many amd fan boys, kinda cringe
Commentary on the 5800x3d is on point, seeing the performance compared to 13700k and 7700x in gaming sold me.
Please keep doing GTA. 😢
Pretty pleased with my 13700k purchase. After weighing the extra platform cost to jump on a new AMD socket AND pay for DDR5, I couldn't make myself make the leap.
Ended up buying a Z690 board and reusing my DDR4 memory.
I made the mistake of buying an i5 6600k years ago. This was right at the end of the "4 cores is all you'll ever need for gaming" era. I'm ok spending a little bit extra to future proof my purchase.
I am in the same boat as you: I have a 6600K now and really need a decent CPU for gaming :)
@@silotes94 grab a z690 board and make the jump! It'll be a great upgrade.
Same 6600 boat, now EOL, which is why I feel inclined towards the 7900X now that it's on sale at the same price, given how Intel messed me up. I could possibly even make do with a cheaper cooler.
When I bought my 6700K back in 2016, awesomesauce (now Bitwit) and Paul were saying an i5 is all you need for gaming, and in 2017 AMD came out of nowhere and reminded everyone that wasn't the case lol. I knew in BF1 that 4 cores wasn't enough; it had my 6700K sweating hard out of the box. I even overclocked that poor thing to 4.7 GHz at 1.4 V for years lol. Once I went to a 10700K back in 2020 it was a night and day difference in every single game I play. The improvement in 1% lows was the biggest thing for me. 8 cores is the sweet spot and I'll never go under 8 cores ever again. That's why I'm worried about the upcoming Intel 14th gen. I've been reading around that the top-of-the-line i9 CPU will only be a 6-core. Hope that isn't true...
@@SVT_LIGHTNING Well, I can tell you that my new CPU is awesome, so hopefully it will have a bit more longevity before it starts to show its age!
The 5800X3D, 7900X, and 13700K are the only 3 CPUs I'd think about buying right now.
I kinda lean towards the 7900X, as it and the 13700 are stronger on productivity (mixed use), but the 7900x draws a lot less power, and actually increases in performance with a further modest down-tune in voltage (avoids thermal throttling, somewhat silicon dependent though). I also want at least 8 big cores purely for the primary application, with some additional compute outside those for background stuff (which the 5800X3D lacks). That matters in productivity _now_ but could eventually matter in gaming too. More pcie lanes on AMD too.
I like the idea of the X3D technology combined with RDNA3, where the GPU gets faster access to cache from the CPU. I'm curious how much of a benefit you really get, though.
Based on what you're upgrading from, going to Ryzen 7000 you will need to change a lot more than just the CPU.
@@SALMANNNNNNN Yeah. Not really talking about an upgrade, but an all new build.
I'd kinda like to keep the old beast functional for other purposes.
I was planning to build around an i9-13900K for my first build, but stepped away from the i9 because of reported high temps. I wasn't sure about dealing with that on a first assembly, so I went with the i7-13700K.
Strix Z690-E
32 GB DDR5-6200
RTX 3090 Ti
I recently upgraded from an 8600K to a 13700K and couldn't be more pleased with this processor. It exceeded my expectations by far.
Which cooler are you using, liquid or air? I'm using a Deepcool LE500 for my i7-13700K.
I've got a 13600k coming to upgrade from a non k 8700. I'm real excited lol
I just upgraded from an AMD FX8350 to a 13600k.
@@MustafaKhan-fs3np Noctua Nh-d15 chromax black, dual 140mm fans
DDR4 or 5?
Upgraded my Sandy Bridge 2500K to a 13600K. It was a lovely jump in performance!
Considering this jump from my 2500K now
@@rounaks6519 well worth the upgrade, the 13600k is 2x as fast as a 5ghz 2500k in single core performance (3x as fast as a stock 2500k) and 5x faster in multicore workloads. They also overclock pretty well if you get a good cooler.
@@NomadJRG what cooler did you end up getting
@@rounaks6519 Cooler Master Hyper 212 Evo V2
I don't play a lot of high FPS games, but strategy and simulation. Is there any chance of seeing a Civ 6 turn time benchmark when the 7000 series X3D comes out? Would be nice to compare 5800X3D 5950X 7700X 7950X against a hypothetical 7700X3D.
Been waiting for this. Picked up a 4090 recently and looking for a cpu upgrade as well over my “old” i9-10850k
Please don't call it old, that's what I have with my 3080. Yeah I know it is old technically but mine replaced an aging but defiant 2600K !!
Wait for AMD's new X3D lineup in a few months
I replaced my 10900k with this chip and couldn't be happier it's a beast in gaming and for what little "productivity" stuff I do it's been tearing right through it as well!
Well worth it ESPECIALLY if you can get it for micro center pricing of under $400
@@blkshp25 That is the problem: always waiting for what's around the corner. It won't be cheap either, certainly not just $50 more; it will probably be priced to undercut the 13900K, say $590 to $600 depending on how greedy AMD wants to be.
@@blkshp25 I was gonna wait for amds x3d, but after looking at these reviews, it might just equalize with the new Intel cpus
Great video, and it echoes my own personal gaming and "creator"/professional experience. I got a 13900K, as so much of what I do for work is single-thread/core-clock dependent, but for gaming plus part-time studio work the 13700K is a heck of a performer, and with the difference you could get (a couple of thoughts) faster DDR5 (if your workflow even cares about DDR5 vs DDR4), spend more on a GPU, get a larger and/or quicker NVMe drive, or heck, put the difference toward a custom loop...
Such a well-made, comprehensive review of the different CPUs. Thanks so much!
So basically the i7-13700K will work well for music production. Good to know.
Even happier with my purchase after watching this video. My 13700k will arrive Friday and I will finally be putting my 3600 to rest. Can't wait to see the difference in performance.
Coming from a 3000-series chip, you should see a HUGE improvement. The 13700K will last you for a while.
@@UncleSev The 3600 is AMD; Intel's 3000-series was Ivy Bridge, like the i7-3770.
@@warrax111 ah, my bad 😅
Purchased this CPU in November with a bundle from Micro Center, and I really couldn't be happier. Running it at 5.3 GHz, and currently using an RX 7800 XT, which Micro Center had at an amazing price. My old tower was about 10 years old, using an i5-3550 with a 1060 Ti. Such a huge upgrade. Thanks for all the great videos, keep them coming!!
Purchased mine on launch weekend as a replacement for a 6700K; happy with it so far. Only needed to undervolt for it not to throttle with a Corsair H105 under full load.
I got the i7-13700K for 450 euros, quite a big deal; 20 days ago it was 540 here. The CPU market is great, unlike the GPU market in Europe. Also, Intel supports 7200+ DDR5 now, which is finally a pretty good upgrade compared to DDR4-3600 at close to double the CAS latency. It's in the mail, thanks for validating my purchase. Now I can sleep well xD
I'm curious about this. I haven't done a deep dive on DDR5, but I've noticed the CAS is much higher. Is there a performance gain to be had in gaming when doubling the speed but also doubling the CAS?
@@kernoleary1394 Well, from my basic knowledge, when you double your speed and double your CAS you get the same latency but at a higher data rate. So in basic theory DDR5 should perform equal or better in all cases (especially where a high data rate is required, e.g. open worlds).
But there are like 50 timings in RAM, so take this with a grain of salt (and I don't know how they affect performance).
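The "same latency, higher data rate" point above can be sanity-checked with quick back-of-the-envelope math. A small sketch (first-word latency only; as the commenter notes, the dozens of secondary timings are ignored here, and the example speeds are just illustrative):

```python
def first_word_latency_ns(data_rate_mts: int, cas: int) -> float:
    """Approximate first-word latency: CAS cycles divided by the
    memory clock. DDR transfers twice per clock, so the clock in MHz
    is half the data rate in MT/s."""
    clock_mhz = data_rate_mts / 2
    return cas / clock_mhz * 1000  # convert microseconds-ish to ns

# Doubling both the data rate and the CAS keeps latency constant
# while doubling bandwidth:
ddr4 = first_word_latency_ns(3600, 18)  # DDR4-3600 CL18, ~10 ns
ddr5 = first_word_latency_ns(7200, 36)  # DDR5-7200 CL36, ~10 ns
print(ddr4, ddr5)
```

So on paper the higher CAS numbers of DDR5 are not a regression; the absolute wait time stays roughly the same while throughput doubles.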
@@kernoleary1394 Oh my god, it's Kevin O'Leary.
@@awof indeed, undercover
I just upgraded to the 13700K from a 9900X and it is SUCH an improvement. I actually had the i9 in my cart, but by the time I compared a few motherboards it had sold out. Honestly so happy that made me consider the i7 because I really needed that "cool your jets" moment 🤣 The difference between the i9 and i7 seems so negligible for my uses
Was going to jump back to Intel, but ended up with a 5800X3D instead. That's a real money saver that ;)
I just put together an order for a new PC yesterday and did the exact same thing as you. I was going to go with the i9-13900K, but after some research I learned the i9-13900 wasn't really practical for my application and the i7-13700 is more than enough. It'll be a HUGE upgrade over my current PC that uses an i5-6400K.
Thanks for all your in depth reviews. I built my first PC a year and a half ago and to this day it's been going strong and stable the entire time.
Started with a 2080 super and 10600k OC@5Ghz and I've since upgraded to 3080ti (after price drop, lol) and 10900k OC@4.8Ghz cooled by a noctua NH D15 in a Lian Li lancool II mesh. Almost all parts I got from you guys' recommendations and I wouldn't feel comfortable buying a PC part if I wasn't able to get the input and extensive reviews you guys provide.
This'll be stepping up to the plate for my retiring 6700k.
The 13700K's extra power consumption versus its 8% to 10% performance increase over the 13600K starts to make the long-term energy cost a considerable factor.
An offset undervolt of -0.100 to -0.130 V helps.
The i7 also powers more cores, so of course it will draw more power. The i7 and i9 are for people who do more than gaming at 1080p.
i7 = gaming at max settings, ray tracing, 1440p-4K, plus streaming.
i9 = everything the i7 can do, plus 3D and render work and more.
But as others have pointed out, Intel is doing an old trick from AMD's pre-Ryzen days: jack up the power to gain more performance.
Absolutely. Also, not contributing to the planet's heat death seems like a smarter option.
If you care about power consumption, you can just limit the CPU power available, at the cost of reduced performance... maybe. Seriously though, in gaming it rarely goes above 150 W, usually more around 110 W. And like I said, you can limit it for full-load scenarios: decrease the power to half, say, and only lose something like 15-25% of performance. You'd be surprised how efficient even Intel chips can get when they are not pushed to the maximum. On average, though, the Zen 4 chips are still doing better (being on TSMC's 5nm helps a lot).
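The half-power tradeoff described above implies a large efficiency gain. A rough illustration of the arithmetic (the 15-25% performance loss and the wattages are the commenter's ballpark figures, not measured data):

```python
def perf_per_watt(performance: float, watts: float) -> float:
    """Efficiency metric: relative performance per watt consumed."""
    return performance / watts

# Stock: 100% performance at roughly 250 W under full load (assumed).
stock = perf_per_watt(100.0, 250.0)
# Power-limited to half: ~80% performance remains (commenter's estimate).
limited = perf_per_watt(80.0, 125.0)

# Efficiency improves by about 60% despite the ~20% performance loss.
print(round(limited / stock, 2))  # -> 1.6
```

This is why power-limited "eco" operation looks so good in perf-per-watt charts: the last 20% of performance costs roughly half the total power budget.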
@@_pant0m You don't even need to offset anything. I have an ASRock mobo with an i5-13600K, everything at stock; I just changed CPU Vcore Compensation to Level 1 and the CPU cooler type to Air Cooler, and now I'm getting 70-75C in Cinebench R23, cooled with a Deepcool AK620. This CPU rocks!
OK, but if you are pairing this with an RTX4080 or 4090 and running games at 4K, does the 13700K have an edge over the 13600K, or is the issue simply GPU? i.e will running games at 4K on an RTX 40 series card have CPU bottlenecks with a 13600K?
You will be gpu bottlenecked at 4k.
Currently running a 3700x and planning to buy a 5800x3D (its in stock here). Is that a good upgrade or not?
Hello Gamers Nexus team! Thanks for all the reviews of all the new stuff! Do you know if my i9-9900K would bottleneck in 4K gaming if I pair it with a 4080/4090 or 7900 XTX?
Just installed my new 13600k with a z690 (all hail Q-Flash Plus) following your review of it. These new benchmarks just further cement my decision. I really wanted to go AMD to upgrade my ageing 8700k, but the value in the 13600k is just crazy. Especially paired with a discounted "old" MB and the RAM I already had.
I hope AMD can do something for the 3Ds to redeem themselves on this generation a bit, the cost of entry is just too high and not justifiable right now.
Thanks Steve !
In the same boat. Intels offer was just too good. AMD is just not off to a good start this gen. But as you said, X3D will properly fix that. I just can't wait that long for a maybe. Hope AMD can change my mind when I need to upgrade in 6 years.
Ryzen 7000 already received heavy discounts here in Germany. The 7700X was 460€ at the time I bought it and the price dropped to 370€ last week (390€ as of today).
Which other components did you buy? Board and Ram and SSDs?
@@gasbrenner2264 MSI B650 Tomahawk Wifi (270€), 32 Gb G.Skill Trident Z5 Neo 6000cl30 (265€) and a WD SN850X 1 TB SSD (~110€). Also grabbed an Arctic Liquid Freezer II 420 for 80€ off of the Arctic eBay store, but a 360 or even 280 are more than sufficient. Just check if your cooler is affected by Arctics recall and has the corresponding "QC" mark if you're buying from a 3rd party seller or 2nd hand.
Thank you for not making a gaming exclusive video. This is just what I needed
Been looking for this review! 👍
"Sometimes software has specific quirks."
-The kindest thing Gamers Nexus has ever said about Adobe Premiere
Thanks for a great review and comparison. I am interested in buying either the 13600K or the 13700K; it will probably be used a lot for emulation (RPCS3, Yuzu, Cemu, etc.). Not sure if the 13700K is worth the extra money or if I would get (close to) similar results with the 13600K. I have read that Intel CPUs are generally better than AMD for emulators like RPCS3, but I'm not sure how accurate that is. Anyway, thanks again :)
13600k will be more than enough
@@Jetdrag Thanks man. I will save around $120 (in my country) by choosing the 13600K over the 13700K 😀
Thank you for the quality videos. I’m leaving a comment because I heard that helps channels out. I always buy mid grade equipment since I only play simple games but I like staying informed.
Very good coverage/explanation, as always, kudos.
Personally I just recently purchased a 5800X3D as last upgrade for my AM4 board, as it's the most "optimal" for my specific workloads (that's more dependent on RAM speeds and suffers a lot from cache misses, rather than raw CPU speed) with the bigger cache, as well as upgrading to the next-gen a complete rebuild is not financially viable for me, concerning board, PSU etc. upgrades that would be required.
A test for those kinds of workloads could be interesting for viewers, even if it's not the typical AAA gaming load (with Minecraft still being a very popular game that falls into that category from what I gather, and Factorio being a decently popular game where the community already established a benchmarking process to compare hardware). Just to show that it's not necessarily only raw speed that counts for everything.
Considering the current energy prices (at least over here in Europe), it might be interesting for "consumers" (read: viewers) to add a test comparing power usage for a non-maxed out "reference load" and at "max load" given a specific setup. Might also be interesting to compare this across generations to see how it develops over time, given that the current trend seems to be to "just draw moar powa to make it go fasta" (obviously simplified). I'm going to assume any of those numbers generated in such a test would need to be taken with a grain of salt though, since I'll assume a more powerful CPU might generate more load for other components that would increase their power draw as second order effects. Still interesting though.
I love this data. Really showing some real competition between AMD and Intel depending on what you are doing. Which is really great to see. Intel finally has to work harder and price accordingly.
Damn, denying the urge to upgrade my 6700k to something shiny and new is getting REALLY hard these days...
Great video. I still wonder why all popular YouTubers recommend a dead platform, or at least never express their concerns!?
As a suggestion, I would love to see some data on idle power usage for CPUs and systems in general. I saw a nice video by @theTechNotice saying Intel idle power was like 50 W lower than AMD's. At 10 hours a day, 5.5 days a week, £0.34 per unit, that's around £50 a year of difference, just for the CPU. That's now; if you're looking at a 5-10 year lifetime, who knows where prices are going. That could be a point for Intel in an otherwise close race. In productivity, most time is not spent building or compiling but editing and thinking.
I wanted a fast gaming machine but couldn't convince myself to spend the top tier 13900k prices so opted for 13700k. It felt about right.
will be doing the same soon
Great vid, guys. Before the recent price drop I was wondering if the $100+ more for the 7900X over the 13700K was worth it, and this sheds a lot more light on my inevitable decision.
Even if they were at the same price, I would still choose 13700K over 7900X
What a top-notch review. Most helpful as well, I have a 9th gen i5 system that I want to gift to my little brother and I've been exploring what my options for a new system are. Thanks for making this
I got it; I think it's a winner. Cheaper than the competitor, with great gaming and single-core performance and very good multi-core productivity performance. It also runs DDR4 I could port over from my previous build.
I just picked up a 5800x3d off of Amazon for $345. It would seem AMD has yet to kill it off. I don't blame them, though. Because now I have a gaming CPU on my AM4 platform that will be competitive for the next three years.
Why would they kill it off? It's a 7 month old chip with users like you in mind who don't want to spend a ton on mobo and RAM upgrades. And it's like 5% slower than the 13900K at 4K with 4000-series cards.
@@Outmind01 Because they want people to switch to the next generation.
@@Outmind01 5% slower.... 45% cheaper.... And 60% less energy used... Don't be such an Intel fanboy lol
@@pauljeffs7 Intel fanboy? He's praising the 5800X?!
@@pauljeffs7 You misunderstand. 5% slower is nothing. I'd buy the 5800X3D over the 13900K in a heartbeat.
It's weird how you thinking that I touted 5% better framerates as some kind of accomplishment actually makes sense in the current landscape. But even in the comments some people really will only look at the fps metric and conclude that it makes sense to get the 13900K just because it's "the best".
Another wonderfully informative video! I recently upgraded from a 9700K to a 13600K a few weeks ago, and this review further cements the fact that I made the correct choice and that it is the best value/performance CPU for gamers at the moment!
9700K owner here at 5.1 GHz. What games do you play, and how much of a difference did you notice in FPS/load times/...?
I just upgraded to a 13700K coming from a 9700K, as I wanted at least 8 performance cores (I already had 8 cores before) to really feel like I got a full upgrade. I went with Intel mainly due to cost: being able to stick with my current Samsung B-die DDR4 RAM, and the Z690 DDR4 motherboards being overall pretty cheap compared to most DDR5 and AMD motherboards, saved quite a few pennies. It's kinda funny how the roles have reversed and Intel has become the "budget friendly" option.
The temperature was a bit of a shock initially. I use an Arctic Liquid Freezer II 360 AIO + Arctic MX-6 thermal paste with a Thermalright CPU contact frame, so I was expecting to be quite well set, but even with this setup some cores would reach 90C+ fairly quickly in Cinebench R23 under an AVX2 load. I also saw it boost up to 1.462 V during full load, which came as a bit of a shock coming from a 9700K I ran at 1.34 V (1.36 V was about the maximum possible on air cooling on that CPU with acceptable temps). Oh well; running XTU's CPU stress test, for example, shows a much more reasonable 72-80C max temp, and while gaming it's more in the 50-60C range. Instead of undervolting, since the temps were still "acceptable", I went for boosting clocks as much as possible at stock voltage (or whatever the motherboard was feeding it anyway). I only had to change the LLC setting on my ASRock motherboard from Level 5 to Level 4 (they are reversed on ASRock, in that Level 1 gives the least vdroop and Level 5 the most) and was able to get 5.7/5.6 GHz stable across the cores, up from the standard 5.4/5.3 GHz, with temps pretty much the same as before. After changing the LLC setting, XTU reported ~213 W at LLC 5 and 230-241 W or so at LLC 4 running the CPU stress test, if that is of any accuracy, but it stopped the CPU from failing the test. The LLC 5 setting did run a bit cooler than LLC 4, too.
The memory controller also appears much more stable than my previous 9700K's, so I reached a slightly higher RAM frequency with my Samsung B-die DDR4-3200 CL14 4x8GB sticks at the same voltage. I went from roughly 47,000 MB/s read, 46,500 MB/s write, 48,500 MB/s copy and 58 ns latency with the default XMP profile in the AIDA64 benchmark to about 53,000 / 51,000 / 55,000 MB/s and 49.5-49.8 ns latency just by tweaking the RAM timings and frequency. I run at only DDR4-3400 CL14-14-14-28 at 1.41 V, so I'm not really pushing these RAM sticks I've had for a very long time, but I feel I'm getting very good performance for the modest settings, so I might keep it like this, as it will be a while until I upgrade the system next time. My 3DMark CPU score went up from about 18,xxx to 20,500-ish with the OC settings.
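For context on how big those tuning gains actually are, the before/after numbers quoted above work out to low-double-digit percentage improvements. A quick sketch using only the figures the commenter reported:

```python
def pct_change(before: float, after: float) -> float:
    """Percentage change from 'before' to 'after'; negative means a
    decrease (which, for latency, is an improvement)."""
    return (after - before) / before * 100

# Commenter's reported AIDA64 numbers, XMP defaults vs. manual tuning:
read_gain = pct_change(47000, 53000)   # read bandwidth, MB/s
latency_gain = pct_change(58, 49.65)   # latency in ns (negative = better)

print(round(read_gain, 1))     # -> 12.8
print(round(latency_gain, 1))  # -> -14.4
```

So manual timing/frequency tuning on already-good B-die netted roughly 13% more read bandwidth and 14% lower latency, which lines up with the ~13% 3DMark CPU score bump they also saw.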
I would give both of my kidneys to have GN use their resources to benchmark ARMA 3 on these CPUs. It's so incredibly CPU-dependent and scales well with CPU cache; I'd be so curious to see how it scales/performs on different hardware. Unfortunately I don't think the game is quite popular enough to ever be used in a benchmark.
The 12600K is quite impressive considering the price
I have the 12600K. It's great and I really like it. I have it paired with an MSI Ventus 3X 3080 Ti and I play at 1440p on a Samsung G50A 27". Great CPU, and I get some pretty crazy FPS at 1440p.
Very true
The 13700KF is currently $379.99 from an eBay dealer, way cheaper than most stores.
Thanks for comparing all CPUs!! Respects from Argentina.
I think your Premiere Pro tests are inaccurate. I guess you turned off the iGPU, so Intel Quick Sync didn't work in your tests? The 13900K should be 20-30% faster than the 7950X.
I would have loved to see the benchmarks tested on the rtx 4090 instead of the 3090ti.
Maybe later when the cable problem is resolved.
They're testing CPU-bound cases anyway, so it really doesn't make a difference, and they'd have to retest every other CPU with the 4090 to have consistent data, which is crazy to ask for.
I know it would be a lot of work, but I would also like to see this, plus another test when the 7900 XTX comes out, to see whether a certain CPU works especially well with a certain GPU.
@@TheEmperorHyperion it is resolved
probably caught fire
Interesting that the i9 often has lower lows than the i7. I'd argue this is more important than the averages when the averages are so high: you won't notice 160 vs 180 average, but you might notice 60 vs 80 minimums.
Margin of error. In many tests the i5-13600K had better lows than either the i7-13700K or the i9-13900K.
Only you have such amazing analysis skills, Steve, and there are very, very few like you in the YT tech market right now.
Nice, I was hoping this one would get reviewed