But I feel very long because I tried to built 12600k pc built 2 month before but I see the leak 13600k specs and benchmarks so I wait 2 Months like 2 years
Sorry for the mislabeled charts around 1:00. We actually fixed these before publishing but the old versions made it into the video. We are examining our processes so that we don't have these issues in the future. We hope to release a new version of this video with the fixes implemented right away.
Factorio is an interesting benchmark since it's a game made with extremely modern techniques in term of memory management and cache use, and also the sheer amount of things the game is keeping up
@@Multithreadi7 Yeah same! But super big factories with tons of belts and trains are a true CPU benchmark. Also the game is making really good use of CPU cache
Factorio is entirely single-threaded, save for I believe multiplayer networking. The real benefits of a Factorio benchmark are how heavily reliant it is on both CPU Cache and RAM speeds.
i'm upgrading from a 3770K to a 13600k. That is a 10 generation gap. 3770k has been used almost every day for 10 years. it has to have like well over 25 000 hours of use. and out of sheer respect for the 3770k i'm getting another intel cpu.
that's mad. respect for holding out that long... I've been using a 4690K for 7 ish years now, although I've just got a newer laptop to use as my main machine now!
@@Gentoo701 man why and how did you wait so long? that poor Q6600 must have been seriously sweating to do anything since about 2017... that's about a 15x increase in multi core performance, wild
Yes we're always on the look out for which new chips can run our factories the fastest (which also tend to be the best for compiling and developing the game).
I’d love to see more strategy games in your CPU benchmarks. A huge population in Cities: Skylines or Anno or something similar. A lot of action titles just don’t stress the CPU enough, as you mentioned.
I'm glad total war made it on there. But if they want to really stress the cpu they need to benchmark campaign end turn times rather than just the battle benchmark.
competition makes the world go round. the threat of loosing marketshare forces companies to innovate. He who does not work does not eat. it's as simple as that.
Now this is glorious. AMD and Intel going neck and neck leapfrogging each other with every release. I'm fully expecting AMD to have 3D vcache chips sitting in the wings to take the crown back. We're rapidly approaching the moment where it doesn't matter which brand of CPU you buy as they're both solid choices and that is the dream for us consumers.
EVGA is just a dead company walking. They didn't leave GPU because they want to focus on PSU, they left GPU because they're a poorly run company that is probably going to be fighting for its life within the next couple of years.
@@evilbred974 While I agree EVGA will face some difficult decisions in the future, saying they're poorly run does not give them enough credit. They are the absolute best OEM, bar none, in terms of customer service and their absence is really felt in the RTX 4000 generation. From the GN video it sounds like when EVGA's time is up it'll be more of a deliberate sunsetting of the company rather than a failure, and I can't speak for everyone, but I can respect that.
@@evilbred974 The company with the top performing GPU skus of every generation. (Seriously, go check ANY historical chart of TimeSpy Extreme, and you'll see 100% EVGA cards). The top company trusted by enthusiasts for being the only one to have consistent customer support. That company is a poorly run one? K, cool. As someone who's been a customer of EVGA of many years they absolutely fucking demolished any other board partner out there. Anyone who's actually spent any time in the industry mourns their loss. EVGA makes very good margins on their PSUs and will likely be successful in that sector as they already have been.
@@evilbred974 uh no. They're the best OEM and they're the only one that isn't run like trash because they stand by their products. It's just working Nvidia means they can't keep doing that without going into red repeatedly because Nvidia are assholes. If EVGA goes under it'll be a sad day because no other OEM is even CLOSE to them when it comes to support and quality.
@@janreichenbach265 Please do not encourage extreme obesity. I wouldnt comment on his weight, but applauding people for being extreme overweight is not OK either.
The consumer CPU scene is so different to what it was 10 years ago. Back in the Intel Ivy Bridge days where the 3770K was the fastest consumer chip, you'd be wowed by an extra 50cb in Cinebench R15. Now even a mid-range CPU has a huge amount of performance.
I put a 4670k and R9 280x in my first proper build late 2013. They lasted 6 years before I upgraded to a 2700x and 5700. Now I can't imagine having any CPU or GPU for anywhere near that long. Already upgraded my GPU again and now considering AM5 or 5800x3d during the inevitable Christmas sale.
@@TheAdamAdy You can definitely get away with far older hardware today that back in the day. Tbh even I don't need to upgrade that badly, I'm a GPU engineer so I'd be the first to but my only upgrade this year was to the Alienware QD-OLED ultrawide and it was absolutely mind blowing better than any speed upgrade for my system. My 6700xt pumps surprisingly great frames at 3440x1440, fuck I even played Quake 2 RTX at 75 fps with acceptable dynamic resolution scaling. 75 fps feels like 100 on that monitor with its near zero pixel response time. I had a mild CPU bottleneck at 1440p earlier but it's mostly gone at ultrawide now lol.
My i7 3770 non K ivy bridge CPU from January 2013 is still working and now I'm going to get i5 13600K very soon I7 3rd Gen never failed me even after running it without a proper heat sink fan but only heat sink sitting over it for almost 3 years and this year in the summer my CPU was reaching the Temps between 70 - 95 for almost straight months and got frozen many many times due to reaching threshold 100C temp but it won't die. Unfortunately I'd to buy Aio coolermaster 360ML for cooling and my ivy bridge is back under 30 Degrees at idle. Nasty mother fucker ivy bridge. 👌 👍 👍 👍 I'm sure of it were an amd CPU, it would've been a dead fried toasty piece of metallic shit by now.
Yeah idk other channels aren't making this many mistakes on their review, i think i spotted like 3 realised mistakes and 9 unrecognised mistakes that I picked pretty easy, most to do with the graphs. .. I don't understand when they have a lab TEAM to do this testing and other channels sometimes are either a two man or 3 man show don't make this many mistakes.
@@kinpact6100 there are some mistakes I agree, but other channels are testings against 5+ other systems, and I think most of the issues are down to them trying to iron out the big with how they are setting up automation of the testing. It does show and looks a little bad, but the fact that they can be easily cought and pointed out means it's fixable and a minor inconvenience to us as viewers at worst.
@@AlexanderMichelson AMD is still the little guy. Intel still generates multiple times the profits and multiple times the cashflow and multiple times the revenue of AMD. The amount of chips they sell dwarfs AMD and they have multiple times the workforce of AMD. You are just looking at market cap. Stock price is just what investor are willing to trade the stock for. Its the same reason Tesla is valued 2x Toyota despite selling 1/10th the vehicle. Tesla is still the underdog, investor are just speculating eventually will be no1. Right now Intel is building out fabs, instead of using those earning to pay larger dividends to investors, so that is also impacting their share price. AMD also has more room to grow given it has much lower marketshare while intel is already big and overall dominates across the bigger picture (enterprise, laptop, etc). Investor are also concerned that Intel's hundred billion poured into fabs is somewhat risky, since they are competing against TSMC, the dominant player in the pureplay Fab industry.
Nice review! The one thing that keeps bothering me is saying the 13900K costs $589 when that is ONLY the case at Micro Center. This is the price that Intel sells them for in bulk orders of 1000. Most other retailers are selling these chips with at least a 10% price premium
@@randommcranderson5155 Do you not understand that most people don't live near a Microcenter? Yeah let's tell everyone the Microcenter price and when they go to best buy they'll get sticker shock. It's imperative to specify the proper pricing especially when Microcenters are very niche.
@@randommcranderson5155 If “readily” is defined as “In person only, in 25 US cities, limit one CPU per 30 days else higher price applies”. I’d call that one retailer’s promo price, not the typical price.
@@randommcranderson5155 microcenter was giving out free ddr5 with the 7950x. So does that now count as part of the msrp? Microcenter is a special case and does not represent actual msrp.
@@randommcranderson5155 all the amd fanboys trying to defend their overpriced 7950x. Like why would you compare the prices of the places with the higher prices?
@@Z3t487 13600k is $320 and was getting super close to the 7900x that costs hundreds more. The 12600k is going to be like $200 now too while being very comparable or even faster than the ryzen 5 7600x...
Anyone else noticing the audio desync happening around the 4:15-4:16 mark? It continues throughout the entire video and makes sense with the graphs, but whenever Anthony is shown, it isn't synced to his lips.....
Yea, when doing my research earlier this year for my first gaming pc I quickly realized most videos are about all the good stuff. I ended up getting a pretty solid prebuilt from Lenovo and it satisfies most of my needs coming from ps3 era gaming and only owning a switch after that. i5 + 1660 super + 16gb ram is all ya need for a 1080p/60fps gaming experiences for a large amount of games
It's going to be interesting to see what happens in Q1 of next year. AMD is rumored to be planning 3d stacked varients of zen 4, which may or may not alter the graph again. Isn't competiton great?
@@mcbeaver9858 ??? your point doesnt make any sense, the competition ensures that 1 company cant just put on insane prices, because they have to compete with the other unlike *cough* nvidea *cough*
The transition from calling a workload "CPU-bound" to the "a CPU can be bound for your couch" product integration might be the best segue ever on this channel
Before, $1500 could get you a high end PC. Nowadays, $1500 only gets you a high end graphics card. I used to be able to afford the latest and greatest. Now I am upgrading 1-2 generations behind the latest and greatest, the gains aren't significant and the games I play are the same. Great work and input on the video!
that being said technically today you are getting way more bang for your buck per transistor. not only that pc's today are far more powerful and capable than ones in the past.
Exactly, I just bought a 3090 because my coworker was selling it for $600. I was so excited, but as soon as it booted up, I just wanted to go back to playing hollowknight lol. I also played some metro exodus to see the performance gains, but its all just a bit of a rat race. Im still just gonna play left 4 dead 2 tonight
ATLAST SOMEONE POINT THIS OUT. People forgot its RSP not MSRP. Additionally it will be on that price point if retailers buy them at bulk. I don't know why a lot of media outlets seems to forgot to point that one out.
Not sure if you've noticed or if someone else has already mentioned this but at around 12:17 the Prime95 Small FFT's graph comparing the 12600k and 13600k temperature and package power has both of CPU's package power labeled as 13600k
@@perfectman3077 or another 300 pound for a new b650 motherboards and another 200 pound for ddr5 16gb ram when you could get 32gb or even 64gb for same amount
You may have thought of this. To visualize energy use better, you could add watts as a divisor. In short performance /watts. The inverse may work as well. I think a FPS by watt chart will challenge the idea of just chasing CPU performance.
I mean if you can afford any of these top end cpus an extra 30 dollars on your PSU really shouldn't be a factor, but I understand the cooling argument if that's more of where you were trying to go.
What a great time to be alive in the CPU space! Intel and AMD are actually competing against each other! I only hope this kind of competition between them can keep going!
@@carlsagan2371 Exactly. I don't care if the i5 has 10-15 more fps if it consumes like 60-80 watts more... And the AM5 platform is futureproof where Intel on the other hand is known of dumping their sockets every 2nd year or so.
I'd like to see a video about theoretical builds of the 5800X3D vs. other CPUs. At what point does it stop gaining? Because if you need a 4090 to bottleneck it, then it would still be the price to performance choice.
@@Steve55555 Hardware unboxed reviews showed the 5800x3d passing the DDR5 12900k with a 4090, when they previously had the 12900k ddr5 beating the 5800x3d with a 3090ti.
It's so amazing to see how Anthony has evolved. I remember back when he'd make cameo appearances in LTT videos, and when on camera, was obviously uncomfortable and nervous. A bit awkward even. Today? He looks & sounds like a natural.
My i5-12400 getting a score of 12400 in Cinebench R23 multi-core is already insane to me, so the 13600K getting double that is absolutely mind-blowing. It's very tempting to upgrade to the 13600K next summer when, hopefully, prices will have come down a bit by then.
@@drago939393 My 12400 is plenty fast for most of my computing needs, but when it comes to rendering videos with lots of effects, transitions, commentary, background music, etc, having the extra power of the 13600K would hugely speed up the rendering process.
@@doabrad1850 That's fair, but... It's still not THAT much of a difference. If it's worthwhile for you, go for it. I just think such users are very rare.
@@charleshorseman55 LTT now is probably in the pocket of Apple and Intel, their recent editorial slant more or less confirms it. Compared to a lot of the other smaller and (truly) independent reviewers they really go out of their ways to cheerlead in the hype generation for dominant players. It's too bad since LTT had its humble beginnings too and was for a long time quite pro-consumer in their slant. It's probably going to be a trend now though given their viewer numbers on youtube and the power to shape narratives for big product launches.
@@zxbc1 cheerlead for dominant players? Are they supposed to... ignore one platform performing better than the other? They hype AMD all the time when THEY are better. Linus has been very outspoken about how he is cheering for AMD. Hell is Desk PC has an AMD cpu in it. how can you actually think he is an intel shill
People pointing out all the typos in the video and complaining comparing it to their last review with mistakes instead of using their heads. These mistakes are coming from product launch reviews people. They get the product only a couple weeks at most before launch day and have to do all the testing, filming, and editing in time to release a video on launch day. Same thing with the 4090 video mistakes were made. Product launch reviews are rough especially while uploading like 10 videos a week on multiple channels. Cut them some slack Jesus.
I am so happy you have included Factorio in your Testing. I love this game and always found the normal games most people test didn't represent it's performance well.
@@salakasto I mean Intel hasn't changed really. They're only supporting two generations and then it's a dead socket. Laughable compared to AMD's compatibility. I wouldn't want to invest into a space heater based on antique technology and a dead platform
there are lot of info error in the beginning like 0:55 Ryzen 7000 process is 7nm in video instead of 5nm which is shown in 5000 process and 1:10 all 12th gen parts named as i5 instead of their resptive names. I think something is not right there as many of last videos also contains many errors and are significant which are not acceptable. I hope you will look into this problem.
Great review, I have not built a computer since 2013. Honestly thank you. But I would like to know your thoughts on Intel i7 13700K. from what I can understand, its right up there next to the i9 13900K.
@Nico_fr67 I actually went with an Asus 7900 maximus hero board and a Intel i9 13900K. Now looking at which all In one cooler. Most likely the EK Elite 6 fan model. Any thoughts?.?.?
I would like to see a cost to performance to watt ratio benchmark. When I bought my Ryzen 7 5800X and RTX 3070Ti, I based it all on the cost to performance ratio. It's very helpful since the jump for a better hardware is not linear for the price even though the performance jump is linear
@@bedburgde7677 He posts his video on a second Channel in english :) But yeah from what I've read, AMD is still performance to wattage king. Very important in times of energy crisis. Core reason why I actually bought an 7900X instead of a Raptor Lake.
Great video; awesome to see a bit more competition ramping up between team blue and red again. PS. Hate to be pedantic, but for the L3 Cache of the 5800X3D (@1:00), you've got 25MB listed instead of 96MB... Edit: Also, the fabrication processes for the AMD cpu's are backwards, between generations.
Possible error at 1:01. Doesn’t the 5800x3D have 96MB of L3 cache rather than 25MB? I think it’s important to correct because this is why it’s has been so good at gaming and the slide makes it appear that the 13900K has more cache
There are so many mistakes in the charts that it's actually concerning. This is the second LTT video in the last few weeks that I have seen both major and minor mistakes in the charts and in what was actually said.
@@eli.k9190 Gamers Nexus is also known to work ludicrous amounts of hours around the time of reviews, LTT work your typical 9-5. Not excusing the mistakes but we also do not know how many employees LTT dedicates to these reviews.
Yeah I like my 13900k too, in older (Diablo 3, Dragon age inquisition) games it uses 5-6% (vsync at 60Hz), it's very quiet even with the 4070ti, the whole pc uses 160W, exactly the same as my old pc (i5 4570, amd hd7850) but much much higher (potential) framerates and efficiency. With rendering it draws 300W from the cpu but gpu is cool of course. There's hardly any application or game that would put both gpu and cpu at 100% usage so it's not that hard to cool (Noctua NH-D15). It's nice to have the power available. When the pc idles it draws 50W. Also the same as my old pc. I used quickCPU to setup my power plan and it fixed the cpu parked issue for me. Cpu can now use all cores for rendering and go back to parked cores drawing just under 5W when idling. Can't wait for new rpg games coming out this year!
@1982 Original Yawn. The cycle is just repeated over and over again. Intel then AMD then rinse wash and repeat. Gaming at this point and Multimedia needs are easily met by other company. Now it's more on the bang for the buck which hilariously is on Intel's side. Let's keep this party going to get more performance at the cheapest price possible.
Something that could be interesting for your lab is something like "lifetime operating cost" of CPU/GPU. It's probably hard to determine what a typical workload, and power usage is. And prices of components and electricity change over time. But when comparing price points, even if something is way cheaper right now, if it uses more power, it might prove more costly in the long run. As good ass the competition has been in pushing for more performance in the past few years, it also incentivised the push for more power in order to obtain that performance, and that seems like a step in the wrong direction.
Exactly what i'm thinking. At heavy load we see the intels pulling and extra ~100w compared to the zen 4. The intel will produce more heat and increase your air conditioner bill because that heat the CPU is making needs to be canceled out. I don't have exact figures on all factors but my guess is that the Intel is a poor choice if you're thinking about total cost of ownership. I wouldn't buy one.
Nobody cares about power usage when you're talking about dollars per year lmfao. Sure Intel CPUs might draw more power but that's under 100% load. How often are you putting your CPU under heavy load as an average person who plays games?
Agreed; I came away from this thinking "it's the value choice so long as you don't care about your power and cooling bills". Of course, that may well be an accurate description of the majority of LTT's usual audience... :)
Great review as always but I particularly love the final comments to not worry too hard about your older hardware. Just because these new chips are smashing it does not mean you need an upgrade. Most people are not running a 4090 or a monitor that can go 300+Hz and most of us are not rendering 4k video all day.
It would be really interesting to see the game ai performance(Aka time per turn) for strategy games such as Civilization VI, Galatic civilization 3 and Distant world 2 (End game, 2000 stars) both between the generations, and between Intel and Amd since that is a point where a faster cpu may save you quite some waiting time.
Still considering the insane performance boost the 5800X3D had over the 5800X, there's no chance I'll be buying anything before we're getting the 7800X3D. Intel's 13600K is good, but for gaming, it doesn't really do well enough compared to the 7600X. It's time to wait for next year again.
If 7600x beat in avg FPS 5800X3D at its lower price and no 3d cache AMD fans would be raving how impressive that is. 13600k literally does it - "doesn't really do well enough". Mhm, Okay.
The 13900k is not $110 less. The price we saw is the tray price at 1000 units. Its going to be closer to $650 i reckon at retail. Here in Australia there's $70 AUD between them, which is less than a 10% difference. Not to mention the 13900k is sucking back more juice than a Threadripper...
this. i want to see a comparison for the power draw when gaming between the 2 chips and then how much per hour the cost to run each is in kw/h. a 200w difference for a 5hr use per day works out to be around $AU140 per year, that means to run the intel will cost you more than you saved for the cpu
in sweden, retail is 9890sek /876usd for 7950x and 8490sek /751usd (8190sek/725usd) for 13900k(f). So if you go for a 13900kf its 151usd less, or regular 13900k its 125usd cheapar. For 151usd I can actually buy a motherboard for the 13900kf. and yes use my ddr4 ram. if you game, it does not use that much power. even less then many other cpus if you undervolt it.
This clearly demonstrates significant differences in cpu potential for games at 1080p; but it also shows smaller differences at 1440p - which is of more interest to those concerned with < fps/cpu cost > at those resolutions. Even more so at 2160p.
My whole house draws about 400watts an hour on average. With these modern CPUs and GPUs this simply triples to about 1400 watts.. damn, that's some big power draw.
400 an hour damn. Just my heater in my room is 1500watts and we have 6 rooms with these heaters and 1 for the living room, hallways and kitchen at 4500watts... 😂
Curious as to why, when all other reviews I saw weren't so enthusiastic about the launch citing lower than expected efficiency, you barely mentioned it... I get that different reviewers might have different priorities but efficiency is kinda important, especially when considering the energy crisis we're currently facing.
I would argue that anyone planning to buy an i9 isn't worried about energy costs. If you're buying the i5 and using it for evening gaming, it's not going to add all that much $$$ to your energy bill. The combination of these chips + a new graphics card may end up forcing some folks to buy a new PSU however. I ended up bumping up to a 1000W PSU for my new system.
LTT now is probably in the pocket of Apple and Intel, their recent editorial slant more or less confirms it. Compared to a lot of the other smaller and (truly) independent reviewers they really go out of their ways to cheerlead in the hype generation for dominant players. It's too bad since LTT had its humble beginnings too and was for a long time quite pro-consumer in their slant. It's probably going to be a trend now though given their viewer numbers on youtube and the power to shape narratives for big product launches.
That point about the 5600X at the end really spoke to me. I upgraded from an 1800X to a 5600 about a month and a half ago and I couldn't be happier. These expensive (especially on AMD) and absurdly hot CPUs are great, but I still don't think I'm going to upgrade for a while, especially since my 1070 is probably the bottleneck now and these newer platforms should get cheaper over time. I only have to pray that the power consumption is going down because my 650W PSU is the only internal part I see surviving my next upgrade.
I run a5600x stock 16gb 3200hz, and a msi 1080ti at 2560x1440 165hz, with a 550w seasonic PSU i am supper happy with this setup... really good energy efective... Run all games no prob,like warzone 100fps + etc ... i want to upgrade psu and 3080ti next year! Cpu the same(5600x) for 1 or 2 years more... 👌 good luck to all
@@adreanmarantz2103 You should upgrade your monitor, it seems. I've got a 9700k and a Strix 3070 and I'm running Warzone at like 110-130FPS on a 2560×1440 165Hz monitor. Even MW2 is like 80-110 on the campaign. I'm running full resolution render (again 1440p) and everything on high in the MW2 campaign.
@@KamikazeSOF2 Good call, and thanks for the stats and response. I've got a couple of Asus 24" 144hz displays that are only 3 and 4 years old, so I've been reluctant to toss them aside for 1440, and I wasn't sure if my cpu could push a higher rez and I'm trying hard to not have to replace my strix mobo(270h), but you've got me thinking about display shopping again.
@@dukezinnia1667 inverting a gazillon node matrix is useful for mechanical anylisis, doing a gazillon sum/product on gazillons nodes is nice too but i guess that one is done on gpus today.
+1 on this. My go to source has been CFDOnline till now, but would appreciate having a more in depth view from LTT. With folks from their team doing FEA for merch, it should be easily doable.
This is great work, though for graphs you should order the cpus by max fps, it makes it much easier to understand relative performance and simply reads much nicer if the video consumer is not pausing to read each of them.
Just realized how stupidly powerful the 13900K is when I looked at how my old, barely bottlenecking 6700K MT scores in cinebench R23. 6700... 4300 points 13900... 40000 points. I still don't feel overly pressured to upgrade but really starting to consider it.
You'll deff want to upgrade in the near future. 8700k is pretty much as low as you want to go but its poorly designed. No rush, but in the next year or so its time for an upgrade.
With all these benchmark videos I usually default to LTT, Jayz, and GN to see the different opinions. I am very glad you addressed the 13600k as that is the one I ( and most likely a lot of ppl) are looking to actually buy. Also thanks for testing with the ddr4 ram because I was curious if going ddr5 was necessary. For my needs seems like 13600k and ddr4 are going to be best bang for the buck.
Why not 5800X3D, a very cheap AM4 Mobo and DD4. I feel it still will perform better and cost even less. Its not like 13600k is any more future proof - you are buying a "end of socket" mobo anyway. Thats one thing that AMD has going for it with 7XXX series as well - you can expect to be able to upgrade it.
agree with you, Intel still producing low end consumer CPUs even to this day. i'm currently using Pentium G6405 it's a 'lil monster not gonna lie. Sadly AMD not producing Athlon again or introducing low end CPUs :(
@@adamstrix8319 yeah the reviewers always kinda live in what's normal gear to them, when in reality most people just play 1080 and have absolutely no use with the high end gpus and cpu's reviewed here. It's sad that AMD stopped the low end indeed, I remember my phenom II unlocked from 2 to 3 cores more then any other chip.
@Omni. Because DDR5 5600 is at least £200 for 32GB. That is the price of TWO used Ryzen 5 3600s! That's TWO CPUs each of which is more than sufficient for the task. Talking about wasting money *smh*
@@giyu_pls Yeah, I built a SFF PC maybe a year ago with the 12600K. Honestly might check if the 13700K is worth looking at. But regardless, it's gotta be nice knowing you just have to swap out the CPU and keep the rest of your parts list identical.
@@JavoCover i agree, would love to see a lategame savefile of Anno1800 and Civ6 aswell as, i feel like, those are spectacular candidates aswell when it comes to "CPU gaming". But it seems like that its not to clear to reviewers yet that they might should adjust the kind of games. On the other hand its first of all hard to benchmark 200 games and second they might wanna keep the same games to make it more comparable with older gen CPUs, even though this might not always be very accurate because the old games might be to outdated to make use of for example the 3D-V Cache. Hard to say, but maybe the automated benchmarking from the LTT Lab will help a lot, lets hope!
An even stronger than expected showing from Intel. That's great news. Hopefully this will force price adjustments from AMD and X3D variants to release sooner rather than later. Competition is good. Although, since I play at 4K, my 3700X is still good enough for me. When it does become time for upgrade, it should make a huge difference.
The graphs are visually very confusing to me. I can't distinguish the many different processors fast enough to follow what Anthony is saying. It would help a lot to have the manufacturer logos next to the names (you already did that in previous videos) and additionally some visual indicator to separate the different CPU generations. Visuals are much quicker to grasp than text in this situation.
@@JackAdams0 His body fat is fine for a grown man. He just has a thick neck that's all. Adult men are not supposed to be low in body fat, that's actually unhealthy. Anything between 12%-20% is a healthy level. I'd say Anthony is around 22%-25% which is not to far beyond healthy levels. Its only his neck that is "meaty", which is something that is still better than being a pencil neck.
Ok serious question that I don’t see often addressed; how does the desktop performance charts translate to mobile/laptop performance. This video makes the 13600 look pretty good, but that doesn’t necessarily mean it would fare as well in a laptop.
A test that i think is very relevant these days is an "average" power consumption test. Right now, "after covid", alot of people are still working from home. I would love to see a test on power consumption on a system that's very low when working from home, let's say via a citrix connection and that can also be a top notch gaming system when my workday ends. Energy prices in Europe is insane right now, it's to a point where i dont use my second monitor just to save power. Thinking maybe i should buy a laptop for work stuff instead.
Something people also forget is that when it's toasty out (which is a lot of the year in say, the US), every watt of heat you spit out of your system (so cpu plus PSU inefficiency) ALSO has to be removed by an air conditioning unit which is not efficient (i.e. you will use more than 1 watt of energy to remove that one watt of heat output). So potentially every watt of power draw could be another two watts of air conditioning. A small space heater is 500 watts, for reference.
I'd be really interested in a deeper dive into the 5800x3d with the new and older graphics cards I think there's a lot of interest in this chip with Black Friday and Cyber Monday coming up
I am wondering when Game Developers will optimize for the 5800x3D cache. being able to store instructions in that fast cache would benefit games I would think. of course it's up to the developers to optimize, I wonder what other processes can be optimized to use the large cache in this chip. definitely agree it's of interest.
@@808animescoob9 I'm also hoping so. Tho, most likely. Major game devs will optimize their towards current gen consoles first. PC version optimization will come way later, when they're trying to sell DLC, extra content with surprise mechanism, or something alike. Except for Bethesda, it's actually you who make the effort on game optimization, bug fixes, better QOL, etc.
@@henrikbornemann7599 thats not true, if your refering to der8auers Video, the 7950x is faster in everything at 90W except 3Dmark and there its less than 1% In given times with the high energy prices, the 13900k pulls far to much power for me.
These charts are very confusing to read to me, with an obtuse order (I guess it's based on price?) and sometimes you suddenly do order by performance. I'm constantly searching for the 13600K which I'm interested in, and it's hard to see the exact breakdown of performance since it all just looks jumbled up. Also would be helpful if you for instance used different colours for Intel and AMD, and possibly the new ones as well. Highlighting what you're talking about is good, but I'd like to also be able to pause and be able to read the graph myself and I find it harder to do than in other videos and articles.
Talking about having a good cooler for 13th gen honestly warrants a video showing which coolers (aio) are best. If you can do a peltier video i think a "13th gen best aio/watercool option" video would be even better and more useful for the audience :) I think even a 360 vs 240 could be part of the video too. 100% worth time invested.
Well I wasn't interested in the 7000 Series because of the requirement of DDR5 RAM, but if you look at these power consumption numbers, seems like intel isn't worth a buy aswell. Gonna stick with my R7 2700, that is just sipping 65W and still performs just fine. Gonna say though, this review was a little much centered around performance and while mentioning it, pretty much ignored the cost of running these chips in terms of power consumption. Yes getting 5% more performance at the very high end is really cool, especially if you pay less for it, but 350W peak? That is an unacceptable price to pay for so little gain. Running something like this in europe where energy prices just exploded because of the war is simply not an option if you have a functioning brain. And while the gains of the 13600K are more interesting especially in productivity, it also pulls up to 220W? An i5?!? Thats on par with a RTX3070 or an RX6800 graphics card, what the hell? Sorry but when DDR5 gets cheap enough to be worth the thought of buying it, with this trend, I'm gonna stick to AMD. If Intel cannot get their efficiency up and their power consumption down there is simply no way of me even considering them, even if they achieve 30% more performance, because I will pay for it in the long run
The other thing is the performance difference in games only occurs at 1080P, where it doesn't matter because your framerate will be super high anyway. Then at 4k there is no performance difference. I feel like the power consumption is far more important. As you'll have to spend more on cooling if you go with Intel, more on your PSU, and more on your energy costs. All of which could lead to you buying an older tier or cheaper CPU, and a better GPU, which is what really matters in gaming.
>>this review was a little much centered around performance and while mentioning it, pretty much ignored the cost of running these chips in terms of power consumption. I was arguing with an Intel fanboy earlier about the problem with efficency and his reply back to me was you can always tune down the 13900K to save power. And that idiot simply didn't realize that 7950X also have an eco mode and with this generation the two CPUs are trading blows on performance while Intel is losing on efficiency.
Exactly, they mentioned power draw in a CPU benchmark, but what about in games themselves? They have the possibility of using that much power but games use the CPU far less so power draw will be less.
I use a R5 3600, as many other gamers do. I'd love if you could do a comparison between it and the newest chips, taking into account the motherboard and RAM upgrades.
@@singular9 Can hardly be generalized, depends a lot on the game. I'm seeing some CPU bottlenecks with my R7 3700X and an RX 6700 XT @ 1440p, but that is with me actually benchmarking and paying attention. Besides the OG Crysis I'm always above 60 fps and that's practically all that matters to me, mostly averaging at 100+ fps.
R5 3600 is about 19 to 20 % slower than a R5 5600 so with a little math you can gat the results. But I think there will be a ryzen comparsion from someone in the near future.
Im sitting in the same boat, my R5 3600 is still good enough to drive my RTX3090. While there might be a slight CPU bottleneck in flight sims, I dont play in 1080p, most of the time Im in VR, so GPU bound. And while this video (clearly not sponsored by intel) is suggesting that the 13600 is the price/performance king, if you already own AM4 system, the clear and undisputed price/performance king is 3800X3D. And that is my next upgrade, Im probably skipping this gen of CPUs.
To help mitigate potential stability/degradation widely reported on flagship Raptor lake SKUs, I have changed a lot of my BIOS settings based on what I have watched and read. For my 13900k and 14900k I have done the following: MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.5 GHz and E-core boost to 4.3GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC value. I have set voltage offset at a modest -0.015v and set the Core limit to 300 Amps. I have disabled the C6&C7 C states and EIST. Lastly I have locked AVX at 0 offset. I have tested on P95, CB R23 and CB R15. All great and in a mid 20 degree room, no workload exceeds 80c on package or cores. Very happy and benchmarks are very close to where they were before taming these beasts.
While performance is impressive, the power usage is not accepttable. Pretty surprised there isn't more about power consumption during games. i9-13900K doesn't seem to be great value in games, considering how well i5-13600K and properly how well i7-13700K is going to perform. Who are going to enjoy i9-13900K? People who don't care about money and don't mind paying lots more for 5% more? Hard to see i9-13900K being a great chip for productivy, when most people are going to have problems cooling it and allso lets not forget, it's not fun siitting in a room withn a machine spewing out that much heat.
AMD still isn't very far behind the 13900K with their current stack. Will be interesting to see these charts when they release the 3D V-Cache versions of the 7000 series CPU's.
Intel really had to ramp up the power consumption to compete with AMD here. They barely got a 4% average better performance than the 7950X while consuming 50% more power. Honestly the AMD platform is probably a better value for the longer term, both because of power consumption and because of the AM5 chipset still having another 4-5 years of support vs Intel's current chipset which is now end-of-life with 13th gen.
Overall, yeah. Can't really argue with that at the moment. Still though, none of AMD's CPUs are making people upgrade their PSU to account power draw spikes of 350W. They both have their advantages and disadvantages. If AMD can give better performance than Intel that's more efficient but costs a bit more money, I would probably still go AMD. If you're purely worries about getting the best bang for your buck though, then yeah you should probably go with Intel.
@@marty5300 Because of platform cost though mostly. But their platform is worth more imo if you can use it for 3-4 years again. Still has to drop a little though, starting price is just high
Why do you always upgrade your CPUs all the time? Are you so impatient to wait for Zen 4 Gen 2? Do you think that AMD isn't already researching on how to make their 9xxx better?
You know what's nice about the 5800X3D? The fact that it's the third CPU that I'm using on the SAME MOTHERBOARD. In honour of the fact that I don't have to buy another 16GB of RAM (especially not DDR5 RAM) or an AM5 motherboard, I bought another 16GB of DDR4. With my RX 6800 XT, I'll be good for quite a few years to come.
Felt like the power chart is getting downplayed with the 3 variables chart. And the missing efficiency charts for the cpus. I9 13th gen throttle so much it is unrealistic for normal users to utilize its performance. I5 13600k is great from what have shown here and tempted me to get one myself, but i have to stop and look at the graph of power and temps to see how it behaves and how much cooling is needed. A warping 60 watt increase from the previous gen while skipping the charts in like 5 seconds felt like not wanting people to know its main drawbacks. Same goes for i9, a nhd-15 and we can see from the chart that only shows on the screen for like 5 seconds the i9 throttle and had to run at much lower clock speed. What are the results if it runs completely smooth without throttling? Would the result be the same or higher? If i am getting a nhd-15 is that the performance im getting? Where my cpu stays at 100-95 °C? If I didnt watch other reviewers video, recommending an i9 system for a friend and he was asking why his pc is so loud and hot? Having no clues that a freaking 360mm AIO cant tame the i9 beast? And thats some bad news for my friend because he is going to spend more replacing cpu cooling and better case for airflow for having no clue it takes so much more watt and cooling for maximum power to crunch through the workloads. Or felt scammed when the system underdeliver the performance for having a not so recommended system cooling. As much as I want to see amd stays on top, poor ass like me are still looking at budget intel chips for future builds. Shows what is and isnt important to the viewers linus. U brought me into this tech world, i dont want to see you trying to create the competition between the two companies by praising the supposedly weak side atm. It is not fair for viewers that watch only ur videos to decide on the pc parts that they are choosing.
@@LaCroix05 he did mentioned it, sadly it is like a carry over speech, gave me the vibe that they are rushing and not wanting you to notice the problem here
@@mattmah7164 Who the f buy i9 and still use 700watt psu? Who the f buy i9 and care about electrical bill? This is why I don't like GN cultist. They didn't think by themselves after watching the video. Watch the video again, at the end they are literally recommending everyone to still use ryzen 5xxx because we already got more fps than monitor refresh rate can ahndle anyway.
@La Croix is not just about the power consumption, it is for the amount of heat from the cpu. Yes u are right on the electric bill and the psu choice. But the problem is on the cooling. I5 seems ok on a 200++ wattage to u? Even they say nhd15 is ok, but when i wanted to build one myself, what does it says online? 125W! So what do u think im gonna do? Buy a 125W capable cpu cooler right? And what's gonna happen? Cpu throttle and wonder why my cpu is weaker than shown above? A lot of viewers that watch ltt are considered new or watch them bcs they are entertaining when compared to other similar tech channels. They should emphasize things everyone should know when they are getting it. About that i9 chart, take a look at the clock speed. Maybe u didn't realise it throttled? They used the terms and words to make it not so much of a big deal. But it throttled so bad on the nhd15. If someone wanted i9 for a workshop workload crunching. Same thing would happen, it throttled and run slower and need to spend more on a cooler which can be saved, idk maybe watching this if they DONT RECOMMEND a nhd15? And recommend smtg like a quality 360mm AIO. (360 AIO might not even enuf to cool it btw)
I like both sides. My entire life own intel platforms but finally made the jump to Ryzen 5950x on release day. Interested to see what thermals and power draw the Intel chip does against the 7000 series. Because electricity is at a premium right now and I'd rather save 300 watts if I sacrifice 20 fps.
i hope they're reading the market. however cool it is to have the best performing chip, most people will opt for the best-buy, and one of the most important characteristics of the best-buy model is low power consumption. i don't mean moderate, i mean low. as in 65w tdp low. i currently have an i5-6400 which i use for work and some light gaming (yes, intel hd graphics level gaming). so, if i were to upgrade, i'd be looking at a 65w tdp chip with decent integrated graphics.
I was thinking of going AMD for my last build but I've never went AMD because the selection for Intel is better, I was very surprised at how good the new i5s are.
Yeah Hardware Unboxed, Hardware Canucks, Gamers Nexus, and Optimum Tech got lower Cinebench scores. LTT’s benchmarks since the 4090 release have been rather off…
You test at 1080p high in an older game ALSO because even with the fastest graphics card there is, the RTX 4090 as of now, at 4K you're going to be GPU bound (at 1440p in newer games). So the CPU doesn't really matter much for gaming, any of the ones in this test will do for your next gaming system should always be the advice, if you're interested in high fps. We still always like to see those CPU fps benchmark numbers though, it's just fun! :) Thanks for the great review, really enjoyed watching it.
CSGO in benchmarks is kind of meh, as people that bench it max everything out. Most people play not even Full HD on lowest settings, and even blackbars, stretched and so on.
whats wild is the 5600x is currently sub 200$ if you dont mind not having the absolute best, its an awesome cpu, paird with a 1660ti, most games i play have over 100fps
5600X is still nice, and with the 5800X3D available it still has a bit of an upgrade path for performance as well if needed. Personally though if I were building today, a 12400 would be my choice since I could get that for also under $200 but get 13th Gen Intel down the road too.
I love competition! Also, love the end comments. These graphs do really make you feel super behind in the PC world, but reminding that these numbers are extremes and your PC is still great was a nice to hear.
A PC is a tool, we tend to forget that. Who cares how fast it is? As long as it does what you want it to do, that's fine. Hell, just 3 years ago I 'upgraded' to an Intel 4th gen CPU and got my first taste of an SSD. It's still capable of more than I'll ever throw at it and I'm sitting somewhere between casual and power user.
Still happily playing retro and indie games on my i5-6500 / GTX 1060 from 2016... but it's fun seeing how much things have changed since then! (Although Windows 10 end of support is eventually going to force me to upgrade...)
You are in good company. My brother STILL uses his Powercolor Red Devil RX 480 8GB GPU and a Ryzen 7 2700x. He refuses to upgrade and says that his PC runs his games just as he likes and has no plans to upgrade anytime soon.
Felt like 12th gen came out a week ago and 11th gen a few months ago. Time flies.
Dude i feel like i just put my 10700k in my pc like a year ago
But I feel very long because I tried to built 12600k pc built 2 month before but I see the leak 13600k specs and benchmarks so I wait 2 Months like 2 years
@@JayMac you did 😢
meanwhile im still rockin a 7700k LOL
Really does
Sorry for the mislabeled charts around 1:00. We actually fixed these before publishing but the old versions made it into the video. We are examining our processes so that we don't have these issues in the future. We hope to release a new version of this video with the fixes implemented right away.
I'm sending your sponsors harassing messages because you let a morbidly obese guy segment them
As a video editor, I feel your pain.
Wow a fix within an hour, impressive. 👍
damn its just a small error no need to do all that extra work lol
Yeah no wonder the graph felt off. Also, second graph mistake in a row? Maybe the second video being super graph heavy skews the chances a bit :v
Factorio is an interesting benchmark since it's a game made with extremely modern techniques in term of memory management and cache use, and also the sheer amount of things the game is keeping up
I never thought I would ever see Factorio in a CPU Benchmark on any big Tech Channel
@@Multithreadi7 Yeah same! But super big factories with tons of belts and trains are a true CPU benchmark. Also the game is making really good use of CPU cache
Really happy to see factorio used int he benchmark as that's the kind of game I play a lot of.
They should also try benchmarking satisfactory it's a 3d factory game similar to factorio but a lot better by coffee stain studios
Factorio is entirely single-threaded, save for I believe multiplayer networking.
The real benefits of a Factorio benchmark are how heavily reliant it is on both CPU Cache and RAM speeds.
i'm upgrading from a 3770K to a 13600k. That is a 10 generation gap.
3770k has been used almost every day for 10 years. it has to have like well over 25 000 hours of use. and out of sheer respect for the 3770k i'm getting another intel cpu.
that's mad. respect for holding out that long... I've been using a 4690K for 7 ish years now, although I've just got a newer laptop to use as my main machine now!
That's a legendary chip, man. I'd keep it on display on my desk if i were u.
So I still have a 3770K too. It runs 24/7 all the time.
But the upgrade will be green. :-)
Bro i litterally upgrading from Intel core 2 Q6600 to i7 13700k. It's 14-15 generation gap. Holy $h1t.
@@Gentoo701 man why and how did you wait so long? that poor Q6600 must have been seriously sweating to do anything since about 2017... that's about a 15x increase in multi core performance, wild
Yes we're always on the look out for which new chips can run our factories the fastest (which also tend to be the best for compiling and developing the game).
LOL u guys
I don't get the chart. How do you even benchmark a game that runs at 60 ups?
Crazy
Factorio is one of the best games ever!
Hahahah... you deserve a medal @factorio xD
Can’t wait to get a 4090 and a 13900k so I can replace my space heater with something more useful
4090 and 13900k is the new sun.
I mean, who needs to buy a space heater that draws 1500 watts, when you can get a pc that draws 1000 watts AND provides entertainment
Cold winter+13900+4090+a blanket (or a poncho. Really. Try it) = we're going to need a benchmark for comfyness.
@@corbynite external 2x480mm custom loop rad out the window?
And play at 1080p lol 🤣
I’d love to see more strategy games in your CPU benchmarks. A huge population in Cities: Skylines or Anno or something similar. A lot of action titles just don’t stress the CPU enough, as you mentioned.
Yes! Cities: Skylines (vanilla and modded) as well as Anno and Planet Coaster/Zoo would be amazing
Also Stellaris lategame really stresses the CPU.
At the very least, Factorio (assuming a mega base) should be putting similar load to an RTS/Macro-SIM. But yes, a few more variants would be nice.
I'm glad total war made it on there. But if they want to really stress the cpu they need to benchmark campaign end turn times rather than just the battle benchmark.
Crusader Kings with everybody getting married population explosion
We should all be thankful to AMD for this Intel chip. Without AMD it would not of been out for at least 5 more years.
of been ? of what ?
ya if amd just didn't make 7000 series earlier intel would kill themselves
competition makes the world go round. the threat of loosing marketshare forces companies to innovate. He who does not work does not eat. it's as simple as that.
@@Mike-jv8bv Well said
All hail AMD 🎉 (sarcasm)
Now this is glorious. AMD and Intel going neck and neck leapfrogging each other with every release. I'm fully expecting AMD to have 3D vcache chips sitting in the wings to take the crown back. We're rapidly approaching the moment where it doesn't matter which brand of CPU you buy as they're both solid choices and that is the dream for us consumers.
I'd argue Intel still has better out of box driver/software support. Much less problems than AMD each gen.
lmao spot on
yep, intel has better drivers and functions. if the same price = intel.
@@piranias for now, Microsoft has to kick it into high gear, and that is out of the hands of AMD.
3d doesn’t help 4k gaming
Hey guys, there's a minor error at 1:26 at the top of the charts, naming all 12th gen CPUs Core i5. Otherwise great video, love the great work you do.
5800X3D also is 96mb L3 cache, not 25 lol
This is the third mistake they've made in their videos with charts recently. Quality control seems to be going downhill recently.
@@egocd were all human here.
@@BiggestMichael LTT is like 50 humans or how many people they employ now
@@egocd Mistakes happen. Don't they make like, 5 videos a week at LMG?
So what I'm getting from this is that EVGA chose a *great time* to focus on their PSU line of business.
EVGA is just a dead company walking. They didn't leave GPU because they want to focus on PSU, they left GPU because they're a poorly run company that is probably going to be fighting for its life within the next couple of years.
@@evilbred974 I dunno man I have extremely great customer service with them, would be nice to see them not quit the GPU industry and team up with AMD
@@evilbred974 While I agree EVGA will face some difficult decisions in the future, saying they're poorly run does not give them enough credit. They are the absolute best OEM, bar none, in terms of customer service and their absence is really felt in the RTX 4000 generation. From the GN video it sounds like when EVGA's time is up it'll be more of a deliberate sunsetting of the company rather than a failure, and I can't speak for everyone, but I can respect that.
@@evilbred974
The company with the top performing GPU skus of every generation. (Seriously, go check ANY historical chart of TimeSpy Extreme, and you'll see 100% EVGA cards). The top company trusted by enthusiasts for being the only one to have consistent customer support. That company is a poorly run one?
K, cool. As someone who's been a customer of EVGA of many years they absolutely fucking demolished any other board partner out there. Anyone who's actually spent any time in the industry mourns their loss. EVGA makes very good margins on their PSUs and will likely be successful in that sector as they already have been.
@@evilbred974 uh no. They're the best OEM and they're the only one that isn't run like trash because they stand by their products. It's just working Nvidia means they can't keep doing that without going into red repeatedly because Nvidia are assholes. If EVGA goes under it'll be a sad day because no other OEM is even CLOSE to them when it comes to support and quality.
This guy is simply the best. The narration, the knowledge, the execution, the humor, oooof
yes though im concerned for his health
He should skip at least one cheeseburger a day for his health. I'm fat too so I can say that. :)
.. AND he's gorgeous !
@@janreichenbach265 Please do not encourage extreme obesity. I wouldnt comment on his weight, but applauding people for being extreme overweight is not OK either.
@@Hvanudetfornogethis weight is impressive
Same as his knowledge
If only the gpu market had this level of competition.
Who knows with RDNa3. AMD could pull out a heck of a lead over Nvidia with it.
Oh that would be awesome. I think we have hope too, with Intel. Hopefully Arc gets another generation.
i5 13600k >= i7 12700k
@@codywatrous3636 Yes hopefully, its going to take maybe another 3-5 years but that's if nvidia or amd doesn't have anything up there sleeves.
@@Drewsterman777 You're out of the loop, buddy.
The consumer CPU scene is so different to what it was 10 years ago. Back in the Intel Ivy Bridge days where the 3770K was the fastest consumer chip, you'd be wowed by an extra 50cb in Cinebench R15. Now even a mid-range CPU has a huge amount of performance.
I put a 4670k and R9 280x in my first proper build late 2013. They lasted 6 years before I upgraded to a 2700x and 5700. Now I can't imagine having any CPU or GPU for anywhere near that long. Already upgraded my GPU again and now considering AM5 or 5800x3d during the inevitable Christmas sale.
@@cosmic_gate476 Which is insane. I just gave my old Intel 3930k and R9 290 to my sis and she can still play all games at decent fps @1080p
Yeah Way back when Intel fleecing consumers with dual cores lol.
@@TheAdamAdy You can definitely get away with far older hardware today that back in the day. Tbh even I don't need to upgrade that badly, I'm a GPU engineer so I'd be the first to but my only upgrade this year was to the Alienware QD-OLED ultrawide and it was absolutely mind blowing better than any speed upgrade for my system. My 6700xt pumps surprisingly great frames at 3440x1440, fuck I even played Quake 2 RTX at 75 fps with acceptable dynamic resolution scaling. 75 fps feels like 100 on that monitor with its near zero pixel response time. I had a mild CPU bottleneck at 1440p earlier but it's mostly gone at ultrawide now lol.
My i7 3770 non K ivy bridge CPU from January 2013 is still working and now I'm going to get i5 13600K very soon
I7 3rd Gen never failed me even after running it without a proper heat sink fan but only heat sink sitting over it for almost 3 years and this year in the summer my CPU was reaching the Temps between 70 - 95 for almost straight months and got frozen many many times due to reaching threshold 100C temp but it won't die. Unfortunately I'd to buy Aio coolermaster 360ML for cooling and my ivy bridge is back under 30 Degrees at idle.
Nasty mother fucker ivy bridge. 👌 👍 👍 👍
I'm sure of it were an amd CPU, it would've been a dead fried toasty piece of metallic shit by now.
Holy cow this is a lot to look over and consider. Super big thanks to the lab folks for getting all this out.
basically 13600k > 7600x by a lot, and 13900k > 7900x and 7950x by not too much but it's still a lot cheaper
@@dogmeat-ph2pd oh I understood it all, it's just impressive how much data ltt labs is putting out
Yeah idk other channels aren't making this many mistakes on their review, i think i spotted like 3 realised mistakes and 9 unrecognised mistakes that I picked pretty easy, most to do with the graphs. ..
I don't understand when they have a lab TEAM to do this testing and other channels sometimes are either a two man or 3 man show don't make this many mistakes.
@@kinpact6100 there are some mistakes I agree, but other channels are testings against 5+ other systems, and I think most of the issues are down to them trying to iron out the big with how they are setting up automation of the testing. It does show and looks a little bad, but the fact that they can be easily cought and pointed out means it's fixable and a minor inconvenience to us as viewers at worst.
Labs didn't do shit on this video. It's actually pretty poor compared to the GN and HUB analysis of this launch.
This makes me feel so much better about getting the 13700. If Anthony says it's good, I'm here for it. I trust him with my life.
What about supporting the little guy? AMD I mean.
@Måns Westerling I guess that's true, which is a bit weird considering that AMD has much smaller cpu and gpu market share.
AMD pulled a big guy move with little guy stats. But, I preciate them to get Intel to rethink their costs.
You can't trust him around an unlocked fridge.
@@AlexanderMichelson AMD is still the little guy. Intel still generates multiple times the profits and multiple times the cashflow and multiple times the revenue of AMD. The amount of chips they sell dwarfs AMD and they have multiple times the workforce of AMD. You are just looking at market cap. Stock price is just what investor are willing to trade the stock for. Its the same reason Tesla is valued 2x Toyota despite selling 1/10th the vehicle. Tesla is still the underdog, investor are just speculating eventually will be no1. Right now Intel is building out fabs, instead of using those earning to pay larger dividends to investors, so that is also impacting their share price. AMD also has more room to grow given it has much lower marketshare while intel is already big and overall dominates across the bigger picture (enterprise, laptop, etc). Investor are also concerned that Intel's hundred billion poured into fabs is somewhat risky, since they are competing against TSMC, the dominant player in the pureplay Fab industry.
Nice review! The one thing that keeps bothering me is saying the 13900K costs $589 when that is ONLY the case at Micro Center. This is the price that Intel sells them for in bulk orders of 1000. Most other retailers are selling these chips with at least a 10% price premium
Yes, I'm surprised both LinusTechTips and JayzTwoCents missed this. The 13900k is 660 USD on Newegg atm.
@@randommcranderson5155 Do you not understand that most people don't live near a Microcenter? Yeah let's tell everyone the Microcenter price and when they go to best buy they'll get sticker shock. It's imperative to specify the proper pricing especially when Microcenters are very niche.
@@randommcranderson5155 If “readily” is defined as “In person only, in 25 US cities, limit one CPU per 30 days else higher price applies”.
I’d call that one retailer’s promo price, not the typical price.
@@randommcranderson5155 microcenter was giving out free ddr5 with the 7950x. So does that now count as part of the msrp? Microcenter is a special case and does not represent actual msrp.
@@randommcranderson5155 all the amd fanboys trying to defend their overpriced 7950x. Like why would you compare the prices of the places with the higher prices?
I'm actually really happy to see Intel and AMD compete so much, and glad to see Intel being the budget king for once
Deep pockets budget king to be exact.
@@Z3t487 wait for the i3 13100
Same honestly
@@Z3t487 13600k is $320 and was getting super close to the 7900x that costs hundreds more. The 12600k is going to be like $200 now too while being very comparable or even faster than the ryzen 5 7600x...
@@hyperturbotechnomike can only seem to rely on Intel to be in the 3 game coz I haven't seen a ryzen 3 for at least 2 generations
As a developer it would be insanely cool to see code compilation benchmarks
saw another channel doing both unity edit: unreal, not unity, and other code compilation with the 13900k and 13600k at top.
They run chrome compile in the tests. Amd wins due to more better cores.
These benchmarks are useless because the RAM was not invariant.
Watch Alex Ziskind's channel,, that's what he focuses on
@@flagger1 that guy is a total apple funboy
Anyone else noticing the audio desync happening around the 4:15-4:16 mark? It continues throughout the entire video and makes sense with the graphs, but whenever Anthony is shown, it isn't synced to his lips.....
i love videos like this, it reminds me that literally anything is perfectly acceptable and would be a massive upgrade over my desktop pc
Now just think of the possibilities next October 😊
Do you know when the 13th gen laptop cpu will be out?
@@maxjames00077 I would guess Q1 next year maybe? That was about the time frame for the gen 12 I think.
Yea, when doing my research earlier this year for my first gaming pc I quickly realized most videos are about all the good stuff. I ended up getting a pretty solid prebuilt from Lenovo and it satisfies most of my needs coming from ps3 era gaming and only owning a switch after that. i5 + 1660 super + 16gb ram is all ya need for a 1080p/60fps gaming experiences for a large amount of games
@@sentientcardboarddumpster7900 Not my i5-3570K though. Too slow for modern games, even at FHD! My 1070 is still fine.
It's going to be interesting to see what happens in Q1 of next year. AMD is rumored to be planning 3d stacked varients of zen 4, which may or may not alter the graph again. Isn't competiton great?
not that great for my wallet though :(
@@mcbeaver9858 You don't have to buy it, just watch for the drama!
@@mcbeaver9858 you will open your wallet wide and you’ll be happy about it!
No according to nvidia
@@mcbeaver9858 ??? your point doesnt make any sense, the competition ensures that 1 company cant just put on insane prices, because they have to compete with the other unlike *cough* nvidea *cough*
The transition from calling a workload "CPU-bound" to the "a CPU can be bound for your couch" product integration might be the best segue ever on this channel
I was today years old when I learned segue is spelt like that 😂
@@Thecryingonion I always feel like it should be segway
Before, $1500 could get you a high end PC. Nowadays, $1500 only gets you a high end graphics card.
I used to be able to afford the latest and greatest.
Now I am upgrading 1-2 generations behind the latest and greatest, the gains aren't significant and the games I play are the same.
Great work and input on the video!
Maybe everything improves except yourself. Get a better job crylord
that being said technically today you are getting way more bang for your buck per transistor. not only that pc's today are far more powerful and capable than ones in the past.
Exactly, I just bought a 3090 because my coworker was selling it for $600. I was so excited, but as soon as it booted up, I just wanted to go back to playing hollowknight lol. I also played some metro exodus to see the performance gains, but its all just a bit of a rat race. Im still just gonna play left 4 dead 2 tonight
@@cadeng8168 Same. Upgrading from a 4770 soon. Only performance I care about is vr for Blade And Sorcery ect.
But by today's standards even budget builds gets you much more quality, performance and more advanced than the high end pc's 10-15 years ago.
589$ is not the MSRP. It's actually 659$ because Intel always writes the 1K unit pricing on their slides.
ATLAST SOMEONE POINT THIS OUT. People forgot its RSP not MSRP. Additionally it will be on that price point if retailers buy them at bulk. I don't know why a lot of media outlets seems to forgot to point that one out.
I just bought a 13900K for $559 at Microcenter
@@Senarak Just rekt em
@@Senarak doubtful since even microcenter cant buy it at that price.
@@VoldoronGaming Want to see my receipt?
Not sure if you've noticed or if someone else has already mentioned this but at around 12:17 the Prime95 Small FFT's graph comparing the 12600k and 13600k temperature and package power has both of CPU's package power labeled as 13600k
Similar mistakes and issues at a minute into the video the charts form there on have names and stuff all mixed up.
The only thing I learned from all this is that the 13600K lineup is going to be an absolute game changer when it comes to value
Ok
Ok
Value goes out the window when you’ll need to spend another 100$ for a new power supply.
@@perfectman3077 if you thinking of buying top chip, you either own a high power PSU or dont care that much about 100 bucks.
@@perfectman3077 or another 300 pound for a new b650 motherboards and another 200 pound for ddr5 16gb ram when you could get 32gb or even 64gb for same amount
You may have thought of this. To visualize energy use better, you could add watts as a divisor. In short performance /watts. The inverse may work as well. I think a FPS by watt chart will challenge the idea of just chasing CPU performance.
I mean if you can afford any of these top end cpus an extra 30 dollars on your PSU really shouldn't be a factor, but I understand the cooling argument if that's more of where you were trying to go.
What a great time to be alive in the CPU space! Intel and AMD are actually competing against each other! I only hope this kind of competition between them can keep going!
I know, we don't want another AMD/Intel supremacy where either is the standard for years before one catches up again.
Well my electricity bill from Intel sure as hell won't keep me going, that's for sure.
@@carlsagan2371 Exactly. I don't care if the i5 has 10-15 more fps if it consumes like 60-80 watts more... And the AM5 platform is futureproof where Intel on the other hand is known of dumping their sockets every 2nd year or so.
Is it true Intel would still be selling dual core if Ryzen didn't exist?
@@bajszosjozsef4850 no one knows what will AMD do 3 years for now, and they are learning fast
I'd like to see a video about theoretical builds of the 5800X3D vs. other CPUs. At what point does it stop gaining? Because if you need a 4090 to bottleneck it, then it would still be the price to performance choice.
Yeah kinda comical how last gen CPU with a funny cache thing needs a fucking 4090 to be sit down lmaoo
I think its still the best thing you can put in AM4 socket. So if you dont want to upgrade whole system yet. Its good enough.
Let's just wait for Ryzen 7000 with 3d cache, then we shall see who keeps the gaming crown
i think you misunderstood, the 3090ti already maxed out the x3d, the 4090 just showed it cant go any higher if given a better graphics card
@@Steve55555 Hardware unboxed reviews showed the 5800x3d passing the DDR5 12900k with a 4090, when they previously had the 12900k ddr5 beating the 5800x3d with a 3090ti.
I was afraid to comment on this video.
I was afraid to reply on this comment.
I was afraid to reply to this comment.
I was afraid to reply to your comment.
I was afraid to like this comment
I was afraid to like this comment
It's so amazing to see how Anthony has evolved. I remember back when he'd make cameo appearances in LTT videos, and when on camera, was obviously uncomfortable and nervous. A bit awkward even. Today? He looks & sounds like a natural.
My i5-12400 getting a score of 12400 in Cinebench R23 multi-core is already insane to me, so the 13600K getting double that is absolutely mind-blowing. It's very tempting to upgrade to the 13600K next summer when, hopefully, prices will have come down a bit by then.
Dang it didn't go up by 1200 😭😭😭
price will be higher next year.. so buy 13600k immediately
Why?
@@drago939393 My 12400 is plenty fast for most of my computing needs, but when it comes to rendering videos with lots of effects, transitions, commentary, background music, etc, having the extra power of the 13600K would hugely speed up the rendering process.
@@doabrad1850 That's fair, but... It's still not THAT much of a difference. If it's worthwhile for you, go for it. I just think such users are very rare.
I would have liked to see the 7700x on more graphs due to recent reports of disabling a single die on the 7950x increasing gaming performance
Yes, it was sadly skipped over (for some reason)
@@charleshorseman55 LTT now is probably in the pocket of Apple and Intel, their recent editorial slant more or less confirms it. Compared to a lot of the other smaller and (truly) independent reviewers they really go out of their ways to cheerlead in the hype generation for dominant players. It's too bad since LTT had its humble beginnings too and was for a long time quite pro-consumer in their slant. It's probably going to be a trend now though given their viewer numbers on youtube and the power to shape narratives for big product launches.
That's not really something a standard user would do though...
@@zxbc1 cheerlead for dominant players? Are they supposed to... ignore one platform performing better than the other? They hype AMD all the time when THEY are better. Linus has been very outspoken about how he is cheering for AMD. Hell is Desk PC has an AMD cpu in it. how can you actually think he is an intel shill
@@zxbc1 that's BS dude
People pointing out all the typos in the video and complaining comparing it to their last review with mistakes instead of using their heads. These mistakes are coming from product launch reviews people. They get the product only a couple weeks at most before launch day and have to do all the testing, filming, and editing in time to release a video on launch day. Same thing with the 4090 video mistakes were made. Product launch reviews are rough especially while uploading like 10 videos a week on multiple channels. Cut them some slack Jesus.
Little did they know why they were so fast lol.
Oh boy this is the start of intel's fall
lol this aged well ;O
I am so happy you have included Factorio in your Testing. I love this game and always found the normal games most people test didn't represent it's performance well.
I will link you to something new, a life-changing financial business investment
its*
I had to go back and double check that I heard Factorio. One of the best games I've played in years.
Agreed, especially as it's one of the game genres I'll play more often, and play far enough that CPU does become a bottleneck.
The most important thing could be ram speed.
And it’s unclear what save they use because a mega base wouldn’t reach 60 ups
Finally. True competition since 2000. I cannot wait to see what the next 5 years bring.
I never thought I'd see Intel be price efficient again and support backwards compatibility past one gen. Thanks AMD
@@salakasto I mean Intel hasn't changed really. They're only supporting two generations and then it's a dead socket. Laughable compared to AMD's compatibility. I wouldn't want to invest into a space heater based on antique technology and a dead platform
there are lot of info error in the beginning like 0:55 Ryzen 7000 process is 7nm in video instead of 5nm which is shown in 5000 process and 1:10 all 12th gen parts named as i5 instead of their resptive names. I think something is not right there as many of last videos also contains many errors and are significant which are not acceptable. I hope you will look into this problem.
Great review, I have not built a computer since 2013. Honestly thank you. But I would like to know your thoughts on Intel i7 13700K. from what I can understand, its right up there next to the i9 13900K.
Yeah with a power consumption at 274w at stock better go with a 13600k 😇
@Nico_fr67 I actually went with an Asus 7900 maximus hero board and a Intel i9 13900K. Now looking at which all In one cooler. Most likely the EK Elite 6 fan model. Any thoughts?.?.?
@@nico_fr6743 just setup a small thermonuclear plant in your backyard to support the pc
@@nico_fr6743 how much is the 13600k doing in power consuption?
@@nico_fr6743 how much is the 13600k doing in power consuption?
I would like to see a cost to performance to watt ratio benchmark. When I bought my Ryzen 7 5800X and RTX 3070Ti, I based it all on the cost to performance ratio. It's very helpful since the jump for a better hardware is not linear for the price even though the performance jump is linear
Just watch the review Hardware Unboxed made. Much more indepth, and frankly a much better review.
@@andreasw.hvammen3946 yeah this video also was filled with mistakes
Would've loved to see price to performance and power to performance comparisons to AMD aswell, given how important sustainability is nowadays.
I ve seen a video from Der Bauer about it, but it is in German
And power consumption because in the Netherlands a kilowatt-hour costs a whopping 0,40€ here that is 0,39$.
Do you actually care about sustainability.
@@bedburgde7677 He posts his video on a second Channel in english :) But yeah from what I've read, AMD is still performance to wattage king. Very important in times of energy crisis. Core reason why I actually bought an 7900X instead of a Raptor Lake.
@@bedburgde7677 he uploads every video he does on a different english channel btw
Great video; awesome to see a bit more competition ramping up between team blue and red again.
PS. Hate to be pedantic, but for the L3 Cache of the 5800X3D (@1:00), you've got 25MB listed instead of 96MB...
Edit: Also, the fabrication processes for the AMD cpu's are backwards, between generations.
There are so many errors in the video. All Intel names are messed up. i5 12900k etc. - god.
Anthony's presentation tone is so nice that I always get scared by the sponsor change...
Possible error at 1:01. Doesn’t the 5800x3D have 96MB of L3 cache rather than 25MB? I think it’s important to correct because this is why it’s has been so good at gaming and the slide makes it appear that the 13900K has more cache
There are so many mistakes in the charts that it's actually concerning. This is the second LTT video in the last few weeks that I have seen both major and minor mistakes in the charts and in what was actually said.
Welcome to techtober lol they're probably benchmarking around the clock
@@ventilate4267 well gamer's nexus with a quarter of employees has no issues in "techtober".
@@eli.k9190 Gamers Nexus is also known to work ludicrous amounts of hours around the time of reviews, LTT work your typical 9-5. Not excusing the mistakes but we also do not know how many employees LTT dedicates to these reviews.
What mistakes? You can't mention mistakes and the point out none of them LMFAO
@@dhkatz_ well, some of the charts are mislabeled for one
I love my 13900k, makes video editing so much faster especially on 4k hd videos. Although most games it will be at 10-20% utilization at all times
wait what? Which games? What settings? max?
@@kartoffelbrei8090 im playing horizon right now avg 30% utilization not overclocked. 5500mhz. My other games below 20%
What codecs are you editing? Editing speed heavily relies on the video’s codec
Looking forward to upgrading from my AMD 5 3600. Should be a great difference with After Effects.
Yeah I like my 13900k too, in older (Diablo 3, Dragon age inquisition) games it uses 5-6% (vsync at 60Hz), it's very quiet even with the 4070ti, the whole pc uses 160W, exactly the same as my old pc (i5 4570, amd hd7850) but much much higher (potential) framerates and efficiency. With rendering it draws 300W from the cpu but gpu is cool of course. There's hardly any application or game that would put both gpu and cpu at 100% usage so it's not that hard to cool (Noctua NH-D15).
It's nice to have the power available. When the pc idles it draws 50W. Also the same as my old pc. I used quickCPU to setup my power plan and it fixed the cpu parked issue for me. Cpu can now use all cores for rendering and go back to parked cores drawing just under 5W when idling.
Can't wait for new rpg games coming out this year!
Its just a back and forth battle as it always has been. Next up 7000 series with 3dcache. Absolutely loving this competitive environment again.
Remember 5% improvements each gen... 3-7000 series, lol
wHOOoOoOoOaAaAa 3d cache jerking it
"as it always has been" What log did you roll out from under. Intel stagnated the market for a decade.
@@cybersecuritydeclassified4793 the market yes, but the performance that they bring is what he mentioned i believe
@1982 Original Yawn. The cycle is just repeated over and over again. Intel then AMD then rinse wash and repeat. Gaming at this point and Multimedia needs are easily met by other company. Now it's more on the bang for the buck which hilariously is on Intel's side. Let's keep this party going to get more performance at the cheapest price possible.
Something that could be interesting for your lab is something like "lifetime operating cost" of CPU/GPU.
It's probably hard to determine what a typical workload, and power usage is. And prices of components and electricity change over time.
But when comparing price points, even if something is way cheaper right now, if it uses more power, it might prove more costly in the long run.
As good ass the competition has been in pushing for more performance in the past few years, it also incentivised the push for more power in order to obtain that performance, and that seems like a step in the wrong direction.
Exactly what i'm thinking.
At heavy load we see the intels pulling and extra ~100w compared to the zen 4. The intel will produce more heat and increase your air conditioner bill because that heat the CPU is making needs to be canceled out.
I don't have exact figures on all factors but my guess is that the Intel is a poor choice if you're thinking about total cost of ownership. I wouldn't buy one.
Nobody cares about power usage when you're talking about dollars per year lmfao. Sure Intel CPUs might draw more power but that's under 100% load. How often are you putting your CPU under heavy load as an average person who plays games?
"...ass...", giggety
Agreed; I came away from this thinking "it's the value choice so long as you don't care about your power and cooling bills". Of course, that may well be an accurate description of the majority of LTT's usual audience... :)
@@dhkatz_ Everybody in the know cares.. since thermal efficiency in the only real indicator of technological achievement.
Great review as always but I particularly love the final comments to not worry too hard about your older hardware.
Just because these new chips are smashing it does not mean you need an upgrade. Most people are not running a 4090 or a monitor that can go 300+Hz and most of us are not rendering 4k video all day.
I am in my dreams lol
It would be really interesting to see the game ai performance(Aka time per turn) for strategy games such as Civilization VI, Galatic civilization 3 and Distant world 2 (End game, 2000 stars) both between the generations, and between Intel and Amd since that is a point where a faster cpu may save you quite some waiting time.
Still considering the insane performance boost the 5800X3D had over the 5800X, there's no chance I'll be buying anything before we're getting the 7800X3D. Intel's 13600K is good, but for gaming, it doesn't really do well enough compared to the 7600X. It's time to wait for next year again.
If 7600x beat in avg FPS 5800X3D at its lower price and no 3d cache AMD fans would be raving how impressive that is. 13600k literally does it - "doesn't really do well enough". Mhm, Okay.
Lmao yeah keep coping hard bro
@@kazioo2 it didn't, 3d v cache is around the corner
Why not wait? Why buy into a EoL platform instead of am5?
What do you mean? The 13600K is extremely competitive with the 7600X. It beat it in several benchmarks.
@@aravindpallippara1577 If you already have DDR4 then the cost to buy into that "EoL" platform is significantly less.
The 13900k is not $110 less. The price we saw is the tray price at 1000 units. Its going to be closer to $650 i reckon at retail. Here in Australia there's $70 AUD between them, which is less than a 10% difference.
Not to mention the 13900k is sucking back more juice than a Threadripper...
I will link you to something new, a life-changing financial business investment
this. i want to see a comparison for the power draw when gaming between the 2 chips and then how much per hour the cost to run each is in kw/h. a 200w difference for a 5hr use per day works out to be around $AU140 per year, that means to run the intel will cost you more than you saved for the cpu
$660 for 13900K, $630 for KF. So $40 difference.
@@monstarghsclips6635 the 13900k is at 188w and the 7950x is at 140w on cyberpunk
in sweden, retail is 9890sek /876usd for 7950x and 8490sek /751usd (8190sek/725usd) for 13900k(f). So if you go for a 13900kf its 151usd less, or regular 13900k its 125usd cheapar. For 151usd I can actually buy a motherboard for the 13900kf. and yes use my ddr4 ram. if you game, it does not use that much power. even less then many other cpus if you undervolt it.
Charts mistakes at LMG are becoming a tradition
This clearly demonstrates significant differences in cpu potential for games at 1080p; but it also shows smaller differences at 1440p - which is of more interest to those concerned with < fps/cpu cost > at those resolutions. Even more so at 2160p.
My whole house draws about 400watts an hour on average. With these modern CPUs and GPUs this simply triples to about 1400 watts.. damn, that's some big power draw.
I'm so thankful that the floor I live on pays for distributed power collectively, Although i think I'm the only one haha
400 an hour damn. Just my heater in my room is 1500watts and we have 6 rooms with these heaters and 1 for the living room, hallways and kitchen at 4500watts... 😂
400 watts? Are you sure?
My 800sq foot apartment used 1200kWhr last feburary.
@@maxjames00077 not everyone can afford a mansion, I don’t know if you knew that
Curious as to why, when all other reviews I saw weren't so enthusiastic about the launch citing lower than expected efficiency, you barely mentioned it... I get that different reviewers might have different priorities but efficiency is kinda important, especially when considering the energy crisis we're currently facing.
I would argue that anyone planning to buy an i9 isn't worried about energy costs. If you're buying the i5 and using it for evening gaming, it's not going to add all that much $$$ to your energy bill.
The combination of these chips + a new graphics card may end up forcing some folks to buy a new PSU however. I ended up bumping up to a 1000W PSU for my new system.
On point sir. Smelly smelly linus
LTT now is probably in the pocket of Apple and Intel, their recent editorial slant more or less confirms it. Compared to a lot of the other smaller and (truly) independent reviewers they really go out of their ways to cheerlead in the hype generation for dominant players. It's too bad since LTT had its humble beginnings too and was for a long time quite pro-consumer in their slant. It's probably going to be a trend now though given their viewer numbers on youtube and the power to shape narratives for big product launches.
@@zxbc1 lol how many times you will copy pasta this?
You accuse Linus is Intel shill now?
Then I want to know how much GN paid you every copy paste.
@@LaCroix05 Don't fanboi anyone or any brand
I love that you've included factorio in your testing suite!
The sponsored ads were tasteful and short. There are tons of people sharing information like this but it's the execution that brings me here.
That point about the 5600X at the end really spoke to me. I upgraded from an 1800X to a 5600 about a month and a half ago and I couldn't be happier. These expensive (especially on AMD) and absurdly hot CPUs are great, but I still don't think I'm going to upgrade for a while, especially since my 1070 is probably the bottleneck now and these newer platforms should get cheaper over time. I only have to pray that the power consumption is going down because my 650W PSU is the only internal part I see surviving my next upgrade.
I run a5600x stock 16gb 3200hz, and a msi 1080ti at 2560x1440 165hz, with a 550w seasonic PSU i am supper happy with this setup... really good energy efective... Run all games no prob,like warzone 100fps + etc ... i want to upgrade psu and 3080ti next year! Cpu the same(5600x) for 1 or 2 years more... 👌 good luck to all
@@informitas0117 I'm lopsided in the other direction (i think) a 3070ftw on a i7-7700k @1080.
@@adreanmarantz2103 should be 1440p 144hz with that ideally
@@adreanmarantz2103 You should upgrade your monitor, it seems.
I've got a 9700k and a Strix 3070 and I'm running Warzone at like 110-130FPS on a 2560×1440 165Hz monitor. Even MW2 is like 80-110 on the campaign.
I'm running full resolution render (again 1440p) and everything on high in the MW2 campaign.
@@KamikazeSOF2 Good call, and thanks for the stats and response. I've got a couple of Asus 24" 144hz displays that are only 3 and 4 years old, so I've been reluctant to toss them aside for 1440, and I wasn't sure if my cpu could push a higher rez and I'm trying hard to not have to replace my strix mobo(270h), but you've got me thinking about display shopping again.
Can you include finite element analysis simulations in your next tests, for people who use their cpus mainly to crunch numbers?
You mean like calculating a zillion digits of pi? I know that is a historical benchmark. But what is a modern example?
@@dukezinnia1667 inverting a gazillon node matrix is useful for mechanical anylisis, doing a gazillon sum/product on gazillons nodes is nice too but i guess that one is done on gpus today.
Finite element analysis would be good to add to their tests.
+1 on this. My go to source has been CFDOnline till now, but would appreciate having a more in depth view from LTT. With folks from their team doing FEA for merch, it should be easily doable.
Would it even be relevant though? GPUs are far better are performing calculations than CPUs. That's what's running AI processes anyways.
can you sort graphs by fps? its kinda confusing when results are all over the place
Exactly
The graphs are so confusing, please just arrange when from highest to lowest.
This is great work, though for graphs you should order the cpus by max fps, it makes it much easier to understand relative performance and simply reads much nicer if the video consumer is not pausing to read each of them.
I really just can't wait for the new Ryzen series to see how they will compete.
This is going to be a good end of the year for consumers.
*Hopefully*
ryzen is already out what do you mean?
@@lencas112 X3D
@@unbekanntername2023 x3d could be competitive for gaming but for productivity its gonna be way worse
@@propersod2390 and how many of us watching LTT use CPUs for serious productivity?
Don't think they'll be out before CES 2023 or something along those lines
Just realized how stupidly powerful the 13900K is when I looked at how my old, barely bottlenecking 6700K MT scores in cinebench R23.
6700... 4300 points
13900... 40000 points.
I still don't feel overly pressured to upgrade but really starting to consider it.
You'll deff want to upgrade in the near future. 8700k is pretty much as low as you want to go but its poorly designed. No rush, but in the next year or so its time for an upgrade.
character in your profile pictures name?
@@lilpizzy3892🐟
"interesting" choice to leave out the 7950X from the power draw and efficiency comparison ;)
With all these benchmark videos I usually default to LTT, Jayz, and GN to see the different opinions. I am very glad you addressed the 13600k as that is the one I ( and most likely a lot of ppl) are looking to actually buy. Also thanks for testing with the ddr4 ram because I was curious if going ddr5 was necessary. For my needs seems like 13600k and ddr4 are going to be best bang for the buck.
Yea with the 13600k you get 12900k performance for $330. On top of that you don’t need ddr5 and new mobos which are really expensive rn.
i believe they didn't showcase the i5's in previous gen videos simply because amd's r5s were the better value option.
@Jeremiah Bakker that would be interesting to see to say the least lol
Why not 5800X3D, a very cheap AM4 Mobo and DD4. I feel it still will perform better and cost even less. Its not like 13600k is any more future proof - you are buying a "end of socket" mobo anyway. Thats one thing that AMD has going for it with 7XXX series as well - you can expect to be able to upgrade it.
@@shha22 you can’t say get a 5800 x3d then say 13th gen intel is a dead end socket when so is the 5800x3D bc AM5 exists now
You guys should add the I3 12100 as well in the charts, it's really nice
intel still didnt announce the 13100 and 13400 yet
@@kubotite9168 yeah I know, but the previous gen i3 beats the r5 5600 in quite some benchmark so it's nice to compare to
agree with you, Intel still producing low end consumer CPUs even to this day. i'm currently using Pentium G6405 it's a 'lil monster not gonna lie. Sadly AMD not producing Athlon again or introducing low end CPUs :(
@@adamstrix8319 yeah the reviewers always kinda live in what's normal gear to them, when in reality most people just play 1080 and have absolutely no use with the high end gpus and cpu's reviewed here. It's sad that AMD stopped the low end indeed, I remember my phenom II unlocked from 2 to 3 cores more then any other chip.
@@malloott Wow its beats the R5? I'll have to look into the i3 then
13600k seems like a hell of a deal. Beast of a chip
Yeah, that i5 seems like the actual story here
Yea amazing value, is the DDR5 5600 great value too? Out of touch rich people 😂 I bet you want Rishi Sunak for PM too don't you? 😂🤣😂
@Omni. Because DDR5 5600 is at least £200 for 32GB. That is the price of TWO used Ryzen 5 3600s! That's TWO CPUs each of which is more than sufficient for the task. Talking about wasting money *smh*
@@engineeringvision9507 you could use a ddr4 mobo and call it a day
@@rarinth Sure, you could use a Ryzen 3600 and a DDR4 motherboard. That's value.
lookin good Anthony. audio guy, the outro ad vocals were considerably louder than the rest of the video, please fix.
Honestly last year's i5-12600K was a beast, and seeing this new i5-13600K is even better... man that's insane.
Nah bruh if im building a pc i was gonna buy the i5 12600k but know hehe i can buy the i5 13600k
@@giyu_pls Yeah, I built a SFF PC maybe a year ago with the 12600K. Honestly might check if the 13700K is worth looking at. But regardless, it's gotta be nice knowing you just have to swap out the CPU and keep the rest of your parts list identical.
@@derkaiser9881 thats why people like AMD so much. AM4 socket for years and years
Better at heating your room, yes.
@@SirPoppy that doesnt exist
cant wait to see how performance changes for AMD with the X3D models!!
@@infernaldaedra Intel can't do that, because their next gen will be using "glued" chiplets too.
@@infernaldaedra You should be rejoicing of this competition, yet you sound like you already picked your team...
It will beat out a 13600K by 3% but cost 50% more.
In the right games (usually massive shooters) X3D is way faster than anything. But those type of games are not very popular while reviewing.
@@JavoCover i agree, would love to see a lategame savefile of Anno1800 and Civ6 aswell as, i feel like, those are spectacular candidates aswell when it comes to "CPU gaming". But it seems like that its not to clear to reviewers yet that they might should adjust the kind of games. On the other hand its first of all hard to benchmark 200 games and second they might wanna keep the same games to make it more comparable with older gen CPUs, even though this might not always be very accurate because the old games might be to outdated to make use of for example the 3D-V Cache. Hard to say, but maybe the automated benchmarking from the LTT Lab will help a lot, lets hope!
An even stronger than expected showing from Intel. That's great news. Hopefully this will force price adjustments from AMD and X3D variants to release sooner rather than later. Competition is good. Although, since I play at 4K, my 3700X is still good enough for me. When it does become time for upgrade, it should make a huge difference.
Yeah definitely. I didn't expect intel to do that well now i have to rethink everything lol but this is good, the competition is getting fierce !
This is what competition gets us. If not for AMD stepping up, Intel would have never had a reason to create something like this at this price point.
Truth. No way in hell intel selling the i5 13600k for 300 without competition and it probably wouldn't be as good either.
7700k is the fastest we have days back, facepalm
Why would they? It's all about profit in the end.
The graphs are visually very confusing to me. I can't distinguish the many different processors fast enough to follow what Anthony is saying. It would help a lot to have the manufacturer logos next to the names (you already did that in previous videos) and additionally some visual indicator to separate the different CPU generations. Visuals are much quicker to grasp than text in this situation.
This
I really love Anthony’s reviews. His voice, personality, character are great. Please, more Anthony!
the fat dude? yeah his reviews are good but not a good aesthetic seeing a guy who can have a heart attack any time now
@@JackAdams0 His body fat is fine for a grown man. He just has a thick neck that's all. Adult men are not supposed to be low in body fat, that's actually unhealthy. Anything between 12%-20% is a healthy level. I'd say Anthony is around 22%-25% which is not to far beyond healthy levels. Its only his neck that is "meaty", which is something that is still better than being a pencil neck.
@@angrysocialjusticewarrior he is obese no matter how you try to put it
@@JackAdams0 how about you worry about your own body and stop being so judgmental? Thank you.
@@joshualewisjones how about you lose some weight fatso?
The closing statement was SOO freaking good. Hope it wasn’t underrated by viewers or something. Amazing video and amazing host we love Anthony :)
Ok serious question that I don’t see often addressed; how does the desktop performance charts translate to mobile/laptop performance. This video makes the 13600 look pretty good, but that doesn’t necessarily mean it would fare as well in a laptop.
A test that i think is very relevant these days is an "average" power consumption test. Right now, "after covid", alot of people are still working from home. I would love to see a test on power consumption on a system that's very low when working from home, let's say via a citrix connection and that can also be a top notch gaming system when my workday ends. Energy prices in Europe is insane right now, it's to a point where i dont use my second monitor just to save power. Thinking maybe i should buy a laptop for work stuff instead.
Something people also forget is that when it's toasty out (which is a lot of the year in say, the US), every watt of heat you spit out of your system (so cpu plus PSU inefficiency) ALSO has to be removed by an air conditioning unit which is not efficient (i.e. you will use more than 1 watt of energy to remove that one watt of heat output). So potentially every watt of power draw could be another two watts of air conditioning. A small space heater is 500 watts, for reference.
Same situation here...power efficiency please. I don't care which is the fastest CPU at 250W! Which one uses the least joules for a given workload!
@@ProfessionalProfessorPat Air conditioners use < 1 W energy for each 1 W of heat removed if they're installed and working correctly.
@@ProfessionalProfessorPat It's not a lot of the year in the US where I live, it's generally cool or cold for most of the year here in PA.
I'd be really interested in a deeper dive into the 5800x3d with the new and older graphics cards
I think there's a lot of interest in this chip with Black Friday and Cyber Monday coming up
I am wondering when Game Developers will optimize for the 5800x3D cache. being able to store instructions in that fast cache would benefit games I would think. of course it's up to the developers to optimize, I wonder what other processes can be optimized to use the large cache in this chip. definitely agree it's of interest.
@@808animescoob9
I'm also hoping so.
Tho, most likely. Major game devs will optimize their towards current gen consoles first. PC version optimization will come way later, when they're trying to sell DLC, extra content with surprise mechanism, or something alike.
Except for Bethesda, it's actually you who make the effort on game optimization, bug fixes, better QOL, etc.
13600K can be easily overclocked to be faster so thats the better deal
@@jordanplays-transitandgame1690 Not sure you really wanna still overclock these chips with the already high power budget and temperatures..
It's also worth noting that the 13900K pulls 50% more power at full throttle than the 7950x 😳
Yeah 323 watts for 4%
But if you run it at 90 watt its still better
And is on a dead end platform, making the slight price difference for AMD worth while, 3D vcache baby!
@@henrikbornemann7599 thats not true, if your refering to der8auers Video, the 7950x is faster in everything at 90W except 3Dmark and there its less than 1%
In given times with the high energy prices, the 13900k pulls far to much power for me.
@@moebius2k103 and what about price buddy?😅
These charts are very confusing to read to me, with an obtuse order (I guess it's based on price?) and sometimes you suddenly do order by performance. I'm constantly searching for the 13600K which I'm interested in, and it's hard to see the exact breakdown of performance since it all just looks jumbled up. Also would be helpful if you for instance used different colours for Intel and AMD, and possibly the new ones as well. Highlighting what you're talking about is good, but I'd like to also be able to pause and be able to read the graph myself and I find it harder to do than in other videos and articles.
You are able to pause.
This is always the problem with LTT. i never understand their charts .i feel they will never sort their charts properly.
Talking about having a good cooler for 13th gen honestly warrants a video showing which coolers (aio) are best.
If you can do a peltier video i think a "13th gen best aio/watercool option" video would be even better and more useful for the audience :)
I think even a 360 vs 240 could be part of the video too.
100% worth time invested.
Well I wasn't interested in the 7000 Series because of the requirement of DDR5 RAM, but if you look at these power consumption numbers, seems like intel isn't worth a buy aswell. Gonna stick with my R7 2700, that is just sipping 65W and still performs just fine. Gonna say though, this review was a little much centered around performance and while mentioning it, pretty much ignored the cost of running these chips in terms of power consumption. Yes getting 5% more performance at the very high end is really cool, especially if you pay less for it, but 350W peak? That is an unacceptable price to pay for so little gain. Running something like this in europe where energy prices just exploded because of the war is simply not an option if you have a functioning brain. And while the gains of the 13600K are more interesting especially in productivity, it also pulls up to 220W? An i5?!? Thats on par with a RTX3070 or an RX6800 graphics card, what the hell? Sorry but when DDR5 gets cheap enough to be worth the thought of buying it, with this trend, I'm gonna stick to AMD. If Intel cannot get their efficiency up and their power consumption down there is simply no way of me even considering them, even if they achieve 30% more performance, because I will pay for it in the long run
The other thing is the performance difference in games only occurs at 1080P, where it doesn't matter because your framerate will be super high anyway. Then at 4k there is no performance difference. I feel like the power consumption is far more important. As you'll have to spend more on cooling if you go with Intel, more on your PSU, and more on your energy costs. All of which could lead to you buying an older tier or cheaper CPU, and a better GPU, which is what really matters in gaming.
>>this review was a little much centered around performance and while mentioning it, pretty much ignored the cost of running these chips in terms of power consumption.
I was arguing with an Intel fanboy earlier about the problem with efficency and his reply back to me was you can always tune down the 13900K to save power.
And that idiot simply didn't realize that 7950X also have an eco mode and with this generation the two CPUs are trading blows on performance while Intel is losing on efficiency.
Exactly, they mentioned power draw in a CPU benchmark, but what about in games themselves? They have the possibility of using that much power but games use the CPU far less so power draw will be less.
Maybe its just because my kWh is super cheap but I never understood why some PC gamers care about power draw.
Yeah I really liked that german reviewer who puts power consumption in the spotlight with performance. Really more helpful.
I use a R5 3600, as many other gamers do. I'd love if you could do a comparison between it and the newest chips, taking into account the motherboard and RAM upgrades.
While I agree that such a comparison would be interesting, it is also true, that gaming in general tends to be more GPU-bound.
@@singular9 Can hardly be generalized, depends a lot on the game. I'm seeing some CPU bottlenecks with my R7 3700X and an RX 6700 XT @ 1440p, but that is with me actually benchmarking and paying attention. Besides the OG Crysis I'm always above 60 fps and that's practically all that matters to me, mostly averaging at 100+ fps.
R5 3600 is about 19 to 20 % slower than a R5 5600 so with a little math you can gat the results. But I think there will be a ryzen comparsion from someone in the near future.
Im sitting in the same boat, my R5 3600 is still good enough to drive my RTX3090. While there might be a slight CPU bottleneck in flight sims, I dont play in 1080p, most of the time Im in VR, so GPU bound. And while this video (clearly not sponsored by intel) is suggesting that the 13600 is the price/performance king, if you already own AM4 system, the clear and undisputed price/performance king is 3800X3D. And that is my next upgrade, Im probably skipping this gen of CPUs.
Unless you use a high refresh rate monitor, don't bother to upgrade. The R5 3600 is still capable to push enough fps.
To help mitigate potential stability/degradation widely reported on flagship Raptor lake SKUs, I have changed a lot of my BIOS settings based on what I have watched and read. For my 13900k and 14900k I have done the following: MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.5 GHz and E-core boost to 4.3GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC value. I have set voltage offset at a modest -0.015v and set the Core limit to 300 Amps. I have disabled the C6&C7 C states and EIST. Lastly I have locked AVX at 0 offset. I have tested on P95, CB R23 and CB R15. All great and in a mid 20 degree room, no workload exceeds 80c on package or cores. Very happy and benchmarks are very close to where they were before taming these beasts.
While performance is impressive, the power usage is not accepttable. Pretty surprised there isn't more about power consumption during games. i9-13900K doesn't seem to be great value in games, considering how well i5-13600K and properly how well i7-13700K is going to perform. Who are going to enjoy i9-13900K? People who don't care about money and don't mind paying lots more for 5% more? Hard to see i9-13900K being a great chip for productivy, when most people are going to have problems cooling it and allso lets not forget, it's not fun siitting in a room withn a machine spewing out that much heat.
sounds like you're ready to move to an iPad
58-66 watts whit my 13600k while gaming, whats the problem ?
AMD still isn't very far behind the 13900K with their current stack. Will be interesting to see these charts when they release the 3D V-Cache versions of the 7000 series CPU's.
they are far far behind in the price/performance ratio though.
Intel really had to ramp up the power consumption to compete with AMD here. They barely got a 4% average better performance than the 7950X while consuming 50% more power. Honestly the AMD platform is probably a better value for the longer term, both because of power consumption and because of the AM5 chipset still having another 4-5 years of support vs Intel's current chipset which is now end-of-life with 13th gen.
Overall, yeah. Can't really argue with that at the moment. Still though, none of AMD's CPUs are making people upgrade their PSU to account power draw spikes of 350W. They both have their advantages and disadvantages. If AMD can give better performance than Intel that's more efficient but costs a bit more money, I would probably still go AMD. If you're purely worries about getting the best bang for your buck though, then yeah you should probably go with Intel.
@@marty5300 Because of platform cost though mostly. But their platform is worth more imo if you can use it for 3-4 years again. Still has to drop a little though, starting price is just high
Before these new cpus I was really leaning team red for the budget but that 13600k sounds like the spot for me.
Why do you always upgrade your CPUs all the time? Are you so impatient to wait for Zen 4 Gen 2? Do you think that AMD isn't already researching on how to make their 9xxx better?
@@acmenipponair I'm still running a 10400 base...
@Steve Sherman AMD will be the abuser if Intel goes under. Healthy competition is the best for consumers.
You know what's nice about the 5800X3D? The fact that it's the third CPU that I'm using on the SAME MOTHERBOARD. In honour of the fact that I don't have to buy another 16GB of RAM (especially not DDR5 RAM) or an AM5 motherboard, I bought another 16GB of DDR4. With my RX 6800 XT, I'll be good for quite a few years to come.
Felt like the power chart is getting downplayed with the 3 variables chart. And the missing efficiency charts for the cpus.
I9 13th gen throttle so much it is unrealistic for normal users to utilize its performance. I5 13600k is great from what have shown here and tempted me to get one myself, but i have to stop and look at the graph of power and temps to see how it behaves and how much cooling is needed. A warping 60 watt increase from the previous gen while skipping the charts in like 5 seconds felt like not wanting people to know its main drawbacks. Same goes for i9, a nhd-15 and we can see from the chart that only shows on the screen for like 5 seconds the i9 throttle and had to run at much lower clock speed. What are the results if it runs completely smooth without throttling? Would the result be the same or higher? If i am getting a nhd-15 is that the performance im getting? Where my cpu stays at 100-95 °C? If I didnt watch other reviewers video, recommending an i9 system for a friend and he was asking why his pc is so loud and hot? Having no clues that a freaking 360mm AIO cant tame the i9 beast? And thats some bad news for my friend because he is going to spend more replacing cpu cooling and better case for airflow for having no clue it takes so much more watt and cooling for maximum power to crunch through the workloads. Or felt scammed when the system underdeliver the performance for having a not so recommended system cooling.
As much as I want to see amd stays on top, poor ass like me are still looking at budget intel chips for future builds. Shows what is and isnt important to the viewers linus.
U brought me into this tech world, i dont want to see you trying to create the competition between the two companies by praising the supposedly weak side atm. It is not fair for viewers that watch only ur videos to decide on the pc parts that they are choosing.
@Brendon Lee O'Connell they literally mention 350 watt.
GN cultist cannot watch the full video before commenting?
@@LaCroix05 he did mentioned it, sadly it is like a carry over speech, gave me the vibe that they are rushing and not wanting you to notice the problem here
@@mattmah7164 Who the f buy i9 and still use 700watt psu?
Who the f buy i9 and care about electrical bill?
This is why I don't like GN cultist. They didn't think by themselves after watching the video.
Watch the video again, at the end they are literally recommending everyone to still use ryzen 5xxx because we already got more fps than monitor refresh rate can ahndle anyway.
@La Croix is not just about the power consumption, it is for the amount of heat from the cpu.
Yes u are right on the electric bill and the psu choice. But the problem is on the cooling. I5 seems ok on a 200++ wattage to u? Even they say nhd15 is ok, but when i wanted to build one myself, what does it says online? 125W! So what do u think im gonna do? Buy a 125W capable cpu cooler right? And what's gonna happen? Cpu throttle and wonder why my cpu is weaker than shown above?
A lot of viewers that watch ltt are considered new or watch them bcs they are entertaining when compared to other similar tech channels. They should emphasize things everyone should know when they are getting it.
About that i9 chart, take a look at the clock speed. Maybe u didn't realise it throttled? They used the terms and words to make it not so much of a big deal. But it throttled so bad on the nhd15. If someone wanted i9 for a workshop workload crunching. Same thing would happen, it throttled and run slower and need to spend more on a cooler which can be saved, idk maybe watching this if they DONT RECOMMEND a nhd15? And recommend smtg like a quality 360mm AIO. (360 AIO might not even enuf to cool it btw)
I like both sides. My entire life own intel platforms but finally made the jump to Ryzen 5950x on release day. Interested to see what thermals and power draw the Intel chip does against the 7000 series. Because electricity is at a premium right now and I'd rather save 300 watts if I sacrifice 20 fps.
I think that CPU is going to stay good for a while, especially if you're gaming in 1440p or greater resolution.
i hope they're reading the market. however cool it is to have the best performing chip, most people will opt for the best-buy, and one of the most important characteristics of the best-buy model is low power consumption. i don't mean moderate, i mean low. as in 65w tdp low. i currently have an i5-6400 which i use for work and some light gaming (yes, intel hd graphics level gaming). so, if i were to upgrade, i'd be looking at a 65w tdp chip with decent integrated graphics.
I was thinking of going AMD for my last build but I've never went AMD because the selection for Intel is better, I was very surprised at how good the new i5s are.
I found it interesting that LTT gets a higher all-cores Cinebench score with a standard air cooler than some other reviewers are getting with AIOs.
Something is way off on this review, felt more like an Intel advert than a review.
Even the data doesn't seem right compared to other reviewers.
They have to pay for all those Intel Makeovers..hahahaha
Yeah Hardware Unboxed, Hardware Canucks, Gamers Nexus, and Optimum Tech got lower Cinebench scores. LTT’s benchmarks since the 4090 release have been rather off…
they are also Noctua partners 😅
They probably take the first (cold) run. Hardware Unboxed takes a run after 10mins I think. Still LTTs review feels inaccurate...
You test at 1080p high in an older game ALSO because even with the fastest graphics card there is, the RTX 4090 as of now, at 4K you're going to be GPU bound (at 1440p in newer games). So the CPU doesn't really matter much for gaming, any of the ones in this test will do for your next gaming system should always be the advice, if you're interested in high fps. We still always like to see those CPU fps benchmark numbers though, it's just fun! :)
Thanks for the great review, really enjoyed watching it.
CSGO in benchmarks is kind of meh, as people that bench it max everything out. Most people play not even Full HD on lowest settings, and even blackbars, stretched and so on.
whats wild is the 5600x is currently sub 200$ if you dont mind not having the absolute best, its an awesome cpu, paird with a 1660ti, most games i play have over 100fps
5600X is still nice, and with the 5800X3D available it still has a bit of an upgrade path for performance as well if needed. Personally though if I were building today, a 12400 would be my choice since I could get that for also under $200 but get 13th Gen Intel down the road too.
5600 for 150$ is even better
Today I saw 5600 at 130 euro with 24% tax included...
you can get a 2060, it's the best cost efficient
I love competition! Also, love the end comments. These graphs do really make you feel super behind in the PC world, but reminding that these numbers are extremes and your PC is still great was a nice to hear.
A PC is a tool, we tend to forget that. Who cares how fast it is? As long as it does what you want it to do, that's fine. Hell, just 3 years ago I 'upgraded' to an Intel 4th gen CPU and got my first taste of an SSD. It's still capable of more than I'll ever throw at it and I'm sitting somewhere between casual and power user.
On the specs screen it shows the 5800X3D as having only 25mb of L3, it should be 96. Also the 13600k is an i5 not i7
ltt at least this month is just getting worse and worse, what a shame.
@@Pantfula yeah seriously.. who proof reads these charts, these are crucial mistakes that misinform the viewer
there are many move typos in this video, kinda... surprised so many of them slipped through. Quite sad ngl
Would be nice to normalize performance against price
Still happily playing retro and indie games on my i5-6500 / GTX 1060 from 2016... but it's fun seeing how much things have changed since then! (Although Windows 10 end of support is eventually going to force me to upgrade...)
You are in good company. My brother STILL uses his Powercolor Red Devil RX 480 8GB GPU and a Ryzen 7 2700x. He refuses to upgrade and says that his PC runs his games just as he likes and has no plans to upgrade anytime soon.