Grab a GN Tear-Down Toolkit on back-order now to guarantee you get one in the next run! store.gamersnexus.net/products/gamersnexus-tear-down-toolkit Intel pushed Windows 11 for Alder Lake, so we tested Win10 vs Win11 here: ua-cam.com/video/XBFTSej-yIs/v-deo.html We also reviewed the 12600K here: ua-cam.com/video/OkHMh8sUSuM/v-deo.html We kicked off our Alder Lake reviews with the i9-12900K here: ua-cam.com/video/fhI9tLOg-6I/v-deo.html Learn about the CPU architecture here: ua-cam.com/video/htCvo9XJZDc/v-deo.html
E-cores are causing problems in DRM-protected games and programs, and they also force AVX-512 to be disabled. Making the cores different is a bad idea; it reminds me of the PS3's Cell CPU, where the heterogeneous cores gave developers so many problems. Intel repeated the same mistake. We need to send Intel a message that they should stick to unified, identical cores in the next CPU generation and just make those cores more power efficient. I suggest everyone skip 12th gen CPUs, stick with 11th gen, and let Intel know that E-cores suck and we don't need them again in 13th gen.
@Windows Alternative An extra 5-10 frames once you're already over 170 FPS doesn't seem worthwhile when you're using 20% more power. That's more heat, more fan noise, and a shorter lifespan for your gaming rig. For me, performance captures all of those key metrics, not just FPS, which is why at this stage I would still stick with AMD.
@Windows Alternative You just outed yourself as an Intel fanboy! The sales figures speak a different language, at least in my country. The majority is usually right. And no, pure computational performance or fps is not the only thing to consider, even as a gamer! What about the upgrade path? Has Intel mentioned anything yet? That Intel is the current king is YOUR opinion. You should quickly realize that in life there are people with different opinions than yours; otherwise you won't get far. 🤦♂ And no, I'm not an AMD fanboy. In fact, I'm considering an upgrade from a 4790K to a 12600K right now. But the upgradability on AMD's side is just so seductive... especially at the beginning of a new super cycle like AM5 might be! 🤔
@@randocrypto1678 I can see these E-cores clocking higher in the next few generations, and when that happens we are going to see massive gains... hopefully
The 12700KF is definitely on my radar for gaming, but given I'm likely not going to build a PC until maybe mid-2022, I'm still likely to hold off to see what AMD does. But I love times like this with new tech coming out; it's all exciting and has me wanting to build a PC again, especially when my current one is a 6-year-old 6600K rig.
I'm pretty sure Ryzen's next big-little architecture would be more efficient, assuming they build something like that. But there's a lot to consider when buying; we still have RAM compatibility issues on Ryzen, and who knows what further problems DDR5 will bring. I've never had a comparable mobo issue with any Intel system I've owned.
When you do iGPU testing, I'd love to see DDR4 vs DDR5 testing. iGPUs are one of the few places where the massive bandwidth of DDR5 makes a lot of sense right now, but I'm not sure that the anemic 32 EU iGPU included in the K series Alder Lake parts really has enough horsepower for the advantage to show up.
@@gunterreich2535 A majority of desktop computers rely on integrated graphics. The more powerful they can be made, the better for consumers. Also, there's that pesky GPU shortage, so if someone has to make do without a GPU, finding the fastest iGPU is in their best interest.
@@gunterreich2535 Both. Intel spends tons of transistors on the video engine and iGPU for encode/decode, machine learning, VNNI, DL Boost, etc. All of these affect hardware-accelerated workloads (e.g. Adobe apps, Gigapixel AI), and Intel likes to put out about 15 pages of PowerPoint slides claiming the high ground in those benchmarks compared to Apple's M1 and AMD. Adding accelerators is the future, instead of using x86/ARM cores to do the work.
Yes, if AMD manages to continue on the same track as previous Ryzen generations, we can expect a significant uplift in performance. It will also be very interesting to see what AMD has with the new Ryzen 5000 parts with the added cache memory: will they be able to push past Intel's new 12th generation, and can they sell them at a competitive price?
@@obsprisma That is true, but with this release I feel it's a bit justified to wait a few months. Only the top motherboard chipset has been released so far. You can use DDR4, but now that DDR5 is a thing I would like to go with it, and those kits are very scarce right now. You can use Windows 10 and it works well in most cases, but this platform is aimed at Windows 11, and that is buggy as hell for now. I've also seen a number of reports from people who can't get any image from their GPUs on this new platform; I suspect it is a UEFI (BIOS) bug. All things considered, I feel I will wait something like 4-6 months before I decide whether to upgrade. But that's just me; I'm not saying there's a wrong way whichever route someone goes, it's just what feels right for me right now.
@@obsprisma Once you start examining the specific performance and features of a given new CPU generation from any chip maker, the adage of "it doesn't matter, there is very little reason to wait, because something better is always around the corner" kind of goes out the window for many people, in my estimation. Zen 4 could be a perfect example. There could be a sizeable number of people who want the features and performance Zen 4 MAY bring that can't be found elsewhere, and thus it's worth the wait for some. Power efficiency better than Alder Lake, better than Raptor Lake, and better than Zen 3 V-Cache, especially in the non-flagship CPUs of that lineup: Zen 4 mid-range CPUs will be sought after by gamers, and by gamers who like having more power-efficient CPUs than the other comparable options of the day. Performance roughly on par with Raptor Lake in terms of gaming, or at least better than Zen 3 by over 15%. That performance level can be seen as a good place to pause for a decade, because of the high frame rates we are already getting with these CPUs (and even more so by the Zen 4 era), and because of the next feature: the PCIe 4.0 or 5.0 bus. If Zen 4 has PCIe 5.0, then that will be seen as something that protects your investment for a long, long time. A Zen 4 CPU with a GPU upgrade every 4 years is a great way to stay on top of gaming performance. In 2028, a fully saturated PCIe 5.0 slot with a power-efficient and fast GPU will be quite the complement to a Zen 4 CPU, which at that point will have been serving the user well for 6 years or so (and was potentially, as of this writing anyway, the most power-efficient mid-range gaming CPU available in 2024, with performance indistinguishable from all other gaming CPUs available in 2022, possibly even the flagship CPUs of that year).
These are exciting times. I'm already on a Ryzen 5900X but stoked about Intel actually putting up a fight. Don't want AMD to become another Intel of the past. More power to the consumer! I feel the 12700K is a great gaming CPU for the pricing.
I haven't been looking at AMD for a very long time; ever since they won the fight, their pricing has been ridiculous, far worse than when Intel was the one monopolizing.
Putting up a fight? You twit. It's a vastly superior platform... AMD was ahead for ONE generation in the last THREE DECADES, and the AMD shills just keep up the bull-twang.
iGP reviews would be very helpful, especially comparing Intel's iGPs, AMD's APUs, and where they sit relative to entry-level GPUs. GPUs are so hard to buy and so expensive right now; I'd like to know if it's reasonable to just buy an APU / CPU with an iGP for now, and get a dGPU later when prices come down.
If you still have any dGPU from the past 8 years, odds are it'll be better than the iGPU. That said, iGPUs are great for redundancy & troubleshooting: if anything happens to your GPU, you can still drive the display.
@@InnuendoXP First-time builders likely don't have a GPU just lying around to use. Some GPUs are also EOL with no more driver support, and with a new version of Windows out, older drivers may have compatibility issues. In the current market, iGPUs should be considered more.
@@Biwa_Hayahide Nobody concerned with compatibility issues should be upgrading to a new version of Windows within the first year of its life. Sure, first-time builders should absolutely stick with the iGPU & play some CS:GO at 720p or something until the market returns to sanity. But for anyone else, if you intend to game on an iGPU, just set your expectations accordingly. The Steam Deck will have a state-of-the-art APU not yet available to desktop users & it's lucky to trade blows with the base Xbox One.
Considering the current GPU pricing debacle, the 5600G and 5700G are the stopgap APU solutions; Intel iGPUs don't hold a candle to them. The 1030 is the sad dGPU alternative, but a viable one when compared to the AMD APU options.
if the price difference is just like 20-30 bucks, one big advantage of having an iGPU, performance-aside, is for redundancy & helping with troubleshooting. I'm still pushing an ancient 4690K & an old GTX 970, having the iGPU helped me with the process of troubleshooting my hardware issues that came down to a faulty DP cable, and in the meantime I still had a functional PC for work even with the GPU apparently out of action. Also good enough to play some indies in the meantime.
I had almost the exact same problem AFTER upgrading away from my old 4690k, and no iGPU makes things a lot more tedious. Especially when the CMOS is underneath the graphics card!
@@AlisterCountel And a lot more nerve-wracking in the current market. If GPU prices were actually sane right now & I had no iGPU, honestly that might've pushed me into upgrading sooner than I'm ready for. It's also a shame AMD's Zen 3 lineup compromises their 'G' products' CPU performance so harshly.
Yeah this is why I bought myself a cheap $50 GPU along with my 3600X, just for troubleshooting. Back then I thought $50 was a lot for that, boy how times have changed.
I always buy the iGPU chips, and for the small difference I reckon it's worth it. Nice to build your system, get the BIOS updated, and test it out properly before you start adding a GPU into the mix.
The fact this CPU is now $189.99 on Newegg is just insane. I picked up the F-you variant just now and am gonna sell off my 12400F. I have so many old GPUs lying around I don't need an iGPU, unless it's from AMD :)
You guys are probably already working on this, but I'll mention it anyway. Adding light-load and idle power consumption to the test suite for Alder Lake (Windows 10 & 11) would really show how efficient those E-cores are. It would also be useful for the real world, since not everyone runs their system at full tilt all the time; efficiency can lead to lower operating costs, etc. Although they might consume more power at full tilt, if they reduce power draw during light tasks, overall power consumption might be better for Intel.
@@Kaygoooo I've seen them do gaming power tests once or twice. The difference in power draw during gaming between different CPUs is really marginal compared to how much juice your GPU is drawing anyway. Most CPUs, i5, i9, Ryzen 5s, Ryzen 9s, were all in the same ballpark, although I'm unable to find the video in which they showed it at this moment.
It doesn't make a meaningful difference. Anandtech has done per-core power draw of CPUs; links at the bottom. Loading 3 cores of the 5800X should give you a power draw of around 71W. Assuming 10 hours per day, every day, and 10 cents per kWh, you're looking at a yearly cost of around $26. Loading all 8 efficiency cores of the 12900K draws around 48W; using the same assumptions, you're looking at a yearly cost of around $18. Idling the 5800X uses around 12W, which works out to around $4 yearly. Idling the 12900K uses around 5W, which works out to around $2 yearly. Would this matter if you were deploying umpteen thousands of computers? Absolutely. Does it matter for the average home user? Absolutely not. Even in the worst-case scenario, with the CPU hammered for those 10 hours per day, that's 131W for the 5800X and 259W for the 12900K, which is still only $48 and $95 respectively. The biggest difference is only about $4/mo. www.anandtech.com/show/16214/amd-zen-3-ryzen-deep-dive-review-5950x-5900x-5800x-and-5700x-tested/8 www.anandtech.com/show/17047/the-intel-12th-gen-core-i912900k-review-hybrid-performance-brings-hybrid-complexity/4
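The back-of-envelope math in this comment is easy to reproduce. A minimal sketch, using the wattage figures quoted above (the function name and default assumptions of 10 hours/day at $0.10/kWh are mine, taken from the comment):

```python
def yearly_cost_usd(watts, hours_per_day=10, usd_per_kwh=0.10):
    """Yearly electricity cost of a component drawing `watts`
    for `hours_per_day`, every day, at a flat per-kWh rate."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# Figures from the comment, rounded to the nearest dollar:
print(round(yearly_cost_usd(71)))   # 5800X, 3 cores loaded  -> 26
print(round(yearly_cost_usd(48)))   # 12900K, 8 E-cores      -> 18
print(round(yearly_cost_usd(259)))  # 12900K, full load      -> 95
```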
@@BobBobson This is a good result for these hybrid CPUs. Looking at the data, they might be 30-50% more efficient doing the same light task or idling. Maybe I shouldn't have led with operating costs, but lower power consumption has other benefits as well: being better for the environment and, if they can scale this down well, giving better battery life for mobile devices. A lot of the world's power consumption today comes from computing devices, and if we can reduce it with hybrid architectures, that's a win.
It affects response times; it really isn't worth it in my opinion. Keep this tech for industrial embedded solutions. Gamers care about response speeds and overclocking.
I was supposed to be in the process of building a 5900X video editing rig, but now I am leaning towards the 12700K because my local Micro Center has them for $399. Thanks for the helpful review as always!
I just bought a 12700K for $190, which is great value, and I think even today this chip will perform amazingly for my productivity and some gaming use. Also, my almost 10-year-old i7-3770, which served me very well, is now very outdated and slows down my 3D work, so this will be a great upgrade. Not to mention going from 16 GB of DDR3 to 32 GB of DDR4... Next month I will put all these parts together and hope it lasts 5 to 10 years as well. Thanks GN for your amazing work.
The 12700K might be the best CPU launch in a long time, not only for Intel; in general I think it is a milestone in technology. I think it is being underestimated just because recent history has taught us that the "middle" CPUs are not worth it. This one is the exception.
Not sure why. It's fighting for parity with the Ryzen 5k series while costing significantly more as a package (higher mobo/RAM prices). It's novel, but not nearly a great CPU launch. This isn't a Ryzen upset by a long shot.
Intel fanboys are so disconnected from reality. No shelves are sold out of the 12900K, yet when the Ryzen 5000 series came out it was sold out everywhere; hell, it's still kind of sold out in most places even a year later. Walk into Best Buy: hundreds of 12th gen Intel chips for sale. Walk into Micro Center: hundreds more. How is this revolutionary? They've probably only sold a few 12th gen CPUs, while AMD has sold over 8.3 billion 5000 series CPUs.
Interested in the non-K SKUs too. Sure, the 12900K is a beast of a CPU, but how good/efficient will that same CPU be when it's locked and chained to a 65W TDP? And how good will it be when you remove said power limits?
@natma relnam It's more about seeing how far you can push performance per watt with non-K parts. While it's a bit raw yet, and very incomplete, I'm actually getting interesting results from an 11900F without power limits vs a stock 11700K, at around 180W, and I still think I can improve the results further. It's very synthetic at the moment, though, and I might be doing something wrong, but at the time I even got it for less than the 11700K.
Here in the German speaking EU region (Austria, Germany) you can get the 12900K for a cheap 700 Euros (812 USD). Motherboards that can actually properly drive it go around 300-500 Euros (350- 580 USD) and DDR5 is out of stock everywhere but would go for around 340-400 Euros (400-465 USD) for 32 GB CL38 5200. So worst case you pay around 1600 Euros (1855 USD) for CPU+MB+RAM which is fucking ridiculous. If you add possible extra cost for a new PSU and a better CPU Cooler you land at around 2000 Euros (2320 USD). I don't call that competing on price.
12700K goes around 500 Euros and 12600K for around 330 Euros. Also not super competitive but the overall price goes way down at least because you can use cheaper motherboards without issue, save money on DDR5 and do not need a new PSU or cooling solution.
Yeah, it's actually kind of crazy. Well, at least in my case it was an astonishing upgrade from an old PC (plus I really needed a new one), and I managed to get the minimum for an i5-12600K & DDR5 with just 1000 Euros. It's still a lot, but I think it might be worth it, as who knows what happens with prices and DDR5 stock.
In comparison, I recently upgraded my system from an 8700K to a 5950X (I use my desktop for work; for gaming I would've gone with the 5900X). I just upgraded the CPU and motherboard (ASUS Crosshair VIII Hero), as my PSU, RAM, and trusty NH-D15 were absolutely up for the job. Total cost: 1220 Euros.
Buying an 11700K was like buying a 7700K: just in time for the 1080 Ti and 3080 Ti, but you miss out on so much extra CPU power. 8700K + 1080 Ti and 12700K + 3080 10GB builds are king.
Could you test 7-Zip with dictionary sizes other than 32MB? That's just the right size for it to fit in AMD's L3 cache and not fit in Intel's. 7-Zip's benchmark supports a wide range of dictionary sizes, from 2MB up to 1GB, and it may be interesting to see what happens with different sizes over that range (e.g. normal compression uses 16MB while ultra uses 64MB).
I think it might be better for them to publish why they settled on a 32MB dictionary size, and how that choice affects final compression performance relative to the other options.
Man, the Ryzen 5 5600X definitely sucks now. It's overshadowed by two 8-core CPUs, the Core i7-10700K and Ryzen 7 5800X. Why did AMD price the 5600X higher than a $300 MSRP? Why? It's the 11900K of Zen 3, a dumb product.
The 12700 is only $20 more right now at Micro Center. I'm in the store looking at it, thinking about how I have a 10850K that I can sell for almost the same amount. I don't know why I'm debating this so much.
F-SKU-se me, sir, but here in our everyday Romania, while the average cost for Americans will be ~$600, EMAG will slap down a "mega offer" of around 900-1000 Euros. Customs duties kill all our momentum.
So, Intel is finally taking AMD seriously enough to be at parity with them. Let's hope both companies duke it out hard next gen so we win more. I don't even care who "wins"; I just want an upgrade for less than a kidney.
Wrong, prices will only rise. Intel kept the same prices all throughout 14nm+++++, so when they finally had something decent they could launch at those prices and then raise them next gen.
@@denverbasshead Intel kept 14nm because they led the market from 2012 to 2016, so they could charge whatever price they wanted. Now AMD can compete with Intel, so a price increase is unlikely unless it's a supply-and-demand scenario.
@@denverbasshead Prices will rise all the time... but what can we get for that raised price? Maybe we'll get a 2x faster CPU? Then we win... because for 5 years Intel was selling the same CPU at raised prices...
Reddit breeds hiveminds anyways. Soon people in that subreddit will be praising AMD for being what Intel used to be criticized for (pricing crazy high because they think they're the premium brand when they're not).
It would be very helpful to publish CPU temperatures in addition to power draw. Heat distribution across the die can affect how well the CPU can be cooled so power draw is not an accurate index of how easy or hard it would be to cool a CPU.
Wondering where a 12700KF with the E-cores turned off would end up sitting on the overall performance ladder, while hopefully avoiding the hybrid scheduling problems.
iGPUs are fairly important for people who want to build a professional rig on a budget, without discrete GPUs. Not all professional workloads are like machine learning or blockchain mining, requiring GPUs/GPGPUs. I am one of those people, as are a few of my teammates at work; my personal rig does not have a graphics card, yet it had 32 gigs of RAM and 3 SSDs around 4 years ago. If I were to build a rig now, I'd be looking at these new Intel CPUs or an AMD 5900X (and would add a sub-$50 GPU). Even the 8-core 5700G will probably cover my current requirements. If I can couple that with Gen 4 SSDs and 64 GB of RAM, I have a machine that will run a whole load of virtual machines with high-utilisation software. I hope AMD will be releasing APUs with higher core counts than 8.
iGPUs are a godsend for pro audio builds. No DAW needs a good GPU, and it drives the cost down significantly. The 10700(K) was a phenomenal pro audio CPU. It was truly a shame the Ryzen APUs felt a bit gimped.
@@MadClowdz Well, considering it only came out a few months ago and the 10700K came out in 2020, I think the fact that it didn't exist when he needed to build his rig is what made the AMD APUs feel "gimped".
I found the bit @16:20 about the AMD 5800X having less stutter very interesting. I also recall that when the Ryzen 1800X first came out, reviewers actually noticed this even though it didn't show up in the tests. As I recall, that was the reason 0.1% lows were included in tests by several reviewers. Seems like that bit is still relevant.
The fact that your MT workload frequency isn't constant suggests power limit throttling (could also be thermal but I'm assuming your cooling is adequate); so maybe the motherboard settings are different between the 12900K and 12700K. Intel PL1/PL2/PBP/MTP strikes again.
I sorta can't wait for AMD's response to these CPUs. Prices for Zen 3 are already significantly reduced (I saw 5800X going for $299 this week at Microcenter). As an AMD fan, I love that Intel finally made a CPU generation that beats AMD. It's healthy for the market. Better products, faster development, and cheaper prices. It's only good for the consumer.
What do you mean, finally? Literally the only series that was faster than Intel was the Ryzen 5000 series, which is the latest series... and the difference between 10th/11th gen and Ryzen 5000 is smaller than the jump from Ryzen 5000 to Intel 12th gen.
Just ordered a 5800X for $250 US, and of course the 5800X3D comes out for $100 more with up to 50 more fps in 1% lows in a ton of games compared to the regular X, lol.
It's been a very interesting release, but I'm waiting to see what Raptor Lake and Zen4 will have to offer on more-mature DDR5 platforms! Though I must say, a 5800x for $300 is actually quite a great deal.
I will say, even after seeing what Alder Lake can do, that I have zero buyers remorse for my 5800X, even at the $395 I paid for it back in May. One of the main reasons I bought it was to see if it could get RPCS3 playing Gran Turismo 5/6 at a locked 60 FPS. Amazingly, it does it, although PBO/Curve Optimizer and 4x8GB of 3733MHz CL14 RAM was a key factor in that. Everything else I've thrown at it on top of that, it just chews through it. In my 20+ years of building, every time I've bought a CPU there was always some application that made me think "I wish it was a little bit faster." The 5800X has been the first CPU where I'm like "damn, this thing just keeps delivering on all fronts and continues failing to disappoint." $300 for a 5800X is a killer deal, use the money you save on not buying overpriced DDR5 for a better GPU and the 5800X will be PLENTY for years. Yes, Alder Lake is a monster, but believe me, so is Zen 3.
@@dafaqu694 One set of 3200MHz binned Samsung B-Die, and one set of 3600MHz binned Samsung B-die, both G.Skill Ripjaws V kits. Bought the 3200MHz kit 2.5 years ago for $105, and the 3600MHz kit 1.5 years ago on sale for $118. You don't really need B-die though, you can get nearly the same results (maybe -2%) with Hynix DJR modules. The G.Skill Ripjaws V 3600MHz GVKC kit is Hynix DJR, and those will run 3800MHz CL16, and you can get 4x8GB of that for around $185 which is really good bang for the buck.
It's an amazing deal if you need a system today, I agree, though it does also mean you'll be limiting yourself to EOL DDR4 and PCIe Gen 4 (which, honestly, is not going to be much of a handicap for a long time); there's not much of a path for future upgrades if you go that route.
@@catlikehana Yes but I was specifically talking about pros and cons of going for the 5800X pricedrop. Also, we can't say for sure what the difference in performance will be until it's been released and tested.
Great video! Two things I'm interested in: 1) Power consumption while gaming (you only showed fps) 2) Impact of E-cores on/off for gaming in Win10 and Win11 (mostly Win10 if I'm honest)
I picked up an i5-12600K with a Z690 Aorus Elite AX mobo and 32GB of DDR4-3600. Upgrading from an i7-5960X with an ASRock Fatal1ty mobo and 32GB of DDR4-2133. I'm pretty sure I'm gonna be happy. The old system will become a workstation/gaming station for my wife. Had to buy an RTX 3070 Ti since my old PC needs to keep its RTX 2070 Super to still work; got the Ti because I actually picked it up off Amazon cheaper than the regular 3070.
For strictly 4K video editing in Premiere Pro, should I go for the 12700K or cut costs and go for the 12600K? Is spending the extra worth it? Please let me know, thanks🙏🏼
You might want to look at the 12700 non-K variant. It costs very close to the 12600K and is just a better performer, with 2 more cores and a lot more cache. OC doesn't really matter for Alder Lake.
I mean, I myself (and I'm sure a lot of other people as well) would say Ryzen was the turning point for the CPU market when Intel was still charging $1000 for an 8-core CPU, which isn't even half a decade ago. Now it has finally gone back to a competitive market after years of incompetence by Intel following AMD's resurgence.
@@djlytic4603 I can't buy an Apple chip and build a custom PC and run my OS of choice on it, so... Although they may be interesting from a technical perspective, they are not *exciting*.
I'm upgrading from a 4770K to a 12700K. I'm planning on using my semi-new Cooler Master ML240R AIO with the 12700K via an LGA 1700 bracket and was concerned about the temps. So I'd very much appreciate it if you guys did some temperature tests of Alder Lake with a cooler made for previous-gen sockets.
@@plop31 ... First of all, I can't afford a new AIO (a 360mm on top of that) since I have to buy a mobo as well as RAM kits. Also, my current AIO isn't shitty at all, and I have no plans on OCing. I'm also avoiding the F series because then it'd be very hard to troubleshoot a faulty GPU, and/or I may need to use the integrated graphics while I buy or wait for a new GPU. Thanks for your suggestions anyway :)
@@twiggsherman3641 Well, homie, as I mentioned, I can't afford another cooler right now, and the Arctic Liquid Freezer II 360 isn't available in our country either; no distributors sell it here.
Alder Lake pricing looked promising but then I looked at the prices of motherboards at my retailer and I noped the hell out of there. Meanwhile there are years old dirt cheap B450 motherboards that can run a 5950X at full power without even overheating the VRM.
Yeah, if you can afford a 5950X, you pair it with a trash PCIe 3.0 mobo with a PCIe 2.0 chipset, so not only is your GPU bottlenecked but also your PCIe SSDs. Oh no, you must be an AMD fanboi; sorry, didn't know the condition was pathological.
I agree the new Z690 boards are still too expensive, but those Z690 boards are far, far higher quality than the lower-tier B450 boards; it's a dumb comparison.
Most B450 boards on the market can't really handle a 3800X without melting the VRM... don't even mention the 5950X, lol. It's a case of "possible, but that doesn't mean you should." Playing basketball in heels is the analogy I would use.
It's a great CPU. At the time of release it was a bit overpriced given the performance difference from the 5600X and 5900X, but it was available at the time. I just got the 5600X last month and love it. I'll probably upgrade to DDR5 in 1-3 years; everything right now is too expensive.
I've just finished building a brand new platform (coming from an i7-870). I chose the Ryzen 7 5800X since the whole build (CPU + B550M MAG Mortar + 32GB 3200MHz RAM) cost almost 1/3 of the price of a brand new Intel kit (3000 BRL for AMD vs 8600 BRL for Intel). With the dollar conversion being so high right now, buying brand new hardware here in Brazil is even more expensive than elsewhere in the world. Part of me sees these results and wonders if it was the correct path for a mix of productivity and gaming, but I believe it was the more reasonable choice.
How are you wondering if it was the right call? Brother, it was one third the price, and Intel's lead here is really, really small, nothing like the 20-25% we had before with 8th gen Intel vs 2nd gen Ryzen 5. Here it's more like 5-10% at best, sometimes a tie, and sometimes even a loss... and you saved a lot of money. I really don't know what you mean by "wonder if it was the correct path seeing these results"; maybe you didn't watch the whole video? Also, you can still upgrade later to more cores, and 3D V-Cache is coming as well.
@@igorrafael7429 Not to mention the efficiency on the Intel side is just nowhere close. You're literally talking tens of USD a month in electricity alone if you run your computer hard, say 4-6 hours a day, and electricity where you live is a bit pricey.
@@igorrafael7429 Yeah, I totally agree with every line you wrote. It's just that, technically, I'm buying a PC without the latest-gen PCIe or DDR. Imagine: I bought my previous PC around 2010, and I intend to use this one as much as possible, for both gaming and productivity, for a long time. Buying "yesterday's" tech kind of makes me think "What if I...". But I totally agree with you; it was way cheaper with this build.
@@igorrafael7429 A lot of people aren't using PBO when testing, and I've heard that when you set the PBO limit to 250 watts, Intel's lead is nonexistent outside of single-core.
*What they don't talk about* is just how AMAZING this chip is at undervolting and OCing combined. My 12700K literally runs a -70mV offset and +1 on each multiplier, along with a 120-second boost duration instead of 55 seconds. I went from a stock 87C to 79C, with an overclock and lower temps!!!! My stock R20 multi score was 8709; now it is 8951. Free performance, with an overclock and an undervolt. Insane!! These chips are more efficient than they seem; they have a VERY generous voltage out of the box. You can almost 100% undervolt any 12700K or i9 by 70-90mV. For the temps alone it's worth it. Use the Intel OC tool.
Ironically, I just bought a 5800X today. For me, it was a drop-in replacement for my 2700X, no new board. Also got it for $330 at Micro Center, can't beat that!
So you dropped a 5800X into a board with no Wi-Fi 6, no USB Type-C, no PCIe 4.0... and less performance than you'd get out of a B550 or X570. Definitely can't beat that!
@@SweatyFeetGirl maybe, but to be honest, I know myself, I would have ended up spending more on the 3D chip 😂. I thought about waiting but decided I didn’t want to wait months to find out. I bought the 2700X at a similar MicroCenter sales weeks before Ryzen 3000 came out. I wasn’t wrong then, that CPU took over 6 months to drop down to that price again.
I was having drops in Killzone on RPCS3 at 10K resolution with my 12400; my fps would drop to the 40s and back to 60. With my 12700, I hardly get any drops at the same resolution. In TimeSplitters on Dolphin at 8K I got drops with the 12400; I no longer get those with the 12700. L4D2 would drop me from 120fps to the 60s with multiple graphics mods on an RTX 3080 + 12400; I no longer get those big drops with the 12700. This chip is a monster. I didn't expect it to stabilize my FPS so much, but I was hoping it would.
13:11 I don't think it's as simple as saying that the 5600x/5800x "enjoy leads" here. The 12700k does better than the 5600x, due to equal average fps but measurably better 1% lows (arguably more important for competitive esports titles). When compared to the 5800x, it does slightly trail behind in average fps but again gets the win in 1% lows, which makes them about equal in terms of a competitive experience. The 5900x does simply do better, though.
When testing power efficiency (performance per watt), also account for the duration of productive runs, e.g. CPU X needs 100 watt-hours for task Y. And please also test money efficiency (performance per dollar), at MSRP / best price at a certain time / average price, and please not just the CPU but the total platform (motherboard, DDR4/5, PSU)!
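The watt-hours idea above can be sketched in a few lines. All the numbers here are hypothetical, purely for illustration: a CPU that draws more watts can still use less total energy if it finishes the task faster.

```python
# Energy per task = average power draw x runtime.
# Hypothetical numbers for illustration only.

def task_energy_wh(avg_watts: float, runtime_hours: float) -> float:
    """Energy consumed for one run of the task, in watt-hours."""
    return avg_watts * runtime_hours

# CPU X: 100 W average, finishes the job in 1.0 h -> 100 Wh
# CPU Y: 150 W average, finishes the job in 0.5 h ->  75 Wh
x = task_energy_wh(100, 1.0)
y = task_energy_wh(150, 0.5)
print(f"CPU X: {x:.0f} Wh, CPU Y: {y:.0f} Wh")
```

So ranking CPUs by peak wattage alone can get the efficiency ordering backwards, which is exactly why the run duration matters.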
Someone ought to create a benchmark that simply cycles through all of the CPU instructions one at a time, looped, in order, and returns the results for each instruction.
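A crude sketch of the timing-harness part of such a benchmark. A real instruction-level benchmark would have to emit each x86 instruction directly in assembly, which Python can't do, so this only illustrates the loop/measure/report structure with a few stand-in operations:

```python
import time

def time_op(op, iters=1_000_000):
    """Return the average nanoseconds per call of op()."""
    t0 = time.perf_counter()
    for _ in range(iters):
        op()
    t1 = time.perf_counter()
    return (t1 - t0) * 1e9 / iters

# Stand-in "instructions"; a real suite would cycle through the actual ISA.
ops = {
    "int add": lambda a=1, b=2: a + b,
    "int mul": lambda a=3, b=7: a * b,
    "float div": lambda a=1.0, b=3.0: a / b,
}
for name, fn in ops.items():
    print(f"{name:9s} ~{time_op(fn):6.1f} ns/op (interpreter overhead included)")
```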
Just bought the i7-12700F for 330€ to finally retire my current i7-4770. Funnily enough, I kinda skipped the whole Ryzen hype era and the weaker Intel CPUs and went straight to this one.
Just picked up an i7-12700K for $299, and Micro Center had a deal going so I got a Z690 Plus WiFi motherboard for $50. Not a bad buy at $350. :DDDDDD Probably gonna pop a 6800 in it.
As someone who considers an iGPU a necessity for any CPU, the fact that I sometimes see Intel F CPUs listed at the same or higher prices than the non-F versions is mind-boggling.
@@pixels_per_inch Even if you never actually end up using it, having the option to fall back to integrated graphics will always be better than your computer becoming basically unusable because something happened to your graphics card.
The 12700K is price-equivalent to the 5900X, not the 5800X, when you consider the platform costs. That's _not_ including RAM. And that's not basing it on the Microcenter pricing of the 5800X.
True, I think the 11600 and 11700 are the price-to-performance winners, especially for gaming. Ryzen is twice the price where I am for 5-10% more, and Alder Lake with DDR5 is even more.
@@roknroller6052 I'd say 10400 wins on value right now, for as long as it lasts, anyway. When the 12400 comes out, it might just be unambiguously the best choice, but we'll have to wait and see.
I didn't think Alder Lake would be that impressive. I'm so glad to have been proven wrong. The Golden Cove cores are monsters and if Intel had gone for a homogeneous architecture like AMD, I think they'd have blown the 5950x out of the water across the product stack. You'd need a 1200W power supply, but still damn impressive (and I'm saying that as a 5950x owner). I'm super psyched to see how AMD responds.
Now that we are seeing some of the final leaks of Zen 4 and RDNA 3, I'd say Intel scared the hell out of them! And now they will reward us with some amazing products!
@@mrsandroks Agreed, it’s not needed, but it’s useful if you want to have a quiet system. I have a 1200W Corsair PSU and even when at full load with a 5950X and the 3080Ti, the PSU fan doesn’t spin up. That’s great since I tried to build a quiet computer such that no loud fan noise is audible from my normal seating position when the PC is under the desk
Me having i7 12700k, DDR5 5200MHz, ASUS PRIME Z690-A and a NH-U12A cooler installed the day after release here trying to get some confirmation bias. Upgraded from i7-4790k DDR3 1900MHz, ASUS Prime Z97-A. Now my RTX 2080 is bottlenecking me since i play on 3840x1600
It's good to see Intel fighting back, and it looks like they are the gaming king again. AMD may not be in trouble right now, but after a few generations they will run short of fabrication-size shrinks; Intel is still managing the same performance on older process nodes. There is a limit to how far we can shrink the die, and soon it will be tougher to develop new technology.
For the iGPU, I would like to see it used for graphically light applications like web browsing, MS Office (Excel and Word), video, and so on, with the dGPU handling the more graphically intense tasks. I would think the whole PC would use less power that way, and it would give a decent reason to put an iGPU on a CPU beyond occasional troubleshooting to see if the GPU is dead.
@cat -.- If you have a CPU without an iGPU built in, you don't even get a picture out of it. If you want any picture out of a computer with no iGPU and no GPU, you'd have to connect to it using remote access, or a web interface if you make it a server. Even then, the initial setup requires some sort of GPU. After setup you can remove the GPU; that's what I did with my NAS.
Most ASUS motherboards let you use the iGPU along with your discrete GPU for extra monitor support. I get a lot of value out of this function since I run 6 monitors. Definitely worth the extra 40 bucks for me.
@@Dennzer1 Lol, I'm thinking about adding 2 more actually =P You can run 4 monitors off of 1 DisplayPort with MST (Multi-Stream Transport). I could just add another video card, but the iGPU only uses an average of 4 (yes, four) watts!
You seemed to be looking for the term "Diminishing returns" when describing the price/performance falloff for the 12900K. I think that term explains your point pretty well.
An iGPU is also useful when you don't have a GPU at all. Also useful when you have multiple monitors: plug the 2nd into your iGPU. It helps if you have mismatched refresh rates. If the primary display is 144Hz and the 2nd is 60Hz, both on the same GPU, and you watch a video (say Twitch) on the 2nd 60Hz display, the 144Hz display starts behaving more like a 60Hz one (while still reporting 144, mind you), which has been an issue for a long time.
Meanwhile, the 12900K only used more power in benchmark tests. In everyday usage (gaming/productivity), in the cases you missed, it drew 10-25% less, measured with a watt meter at the power socket... But yeah, in benchmarks it uses almost double the power.
@@WarshipSub Gaming I could believe as it only uses one or a few cores. Productivity is where it uses a boatload of power. If it didn't, it simply meant all the cores weren't being utilized or it wasn't being pushed.
@@darreno1450 yea, it's literally only going to matter if you're rendering 24/7 AND care about power draw. I suppose there are some people or use cases where that matters. But it's also still 10nm which is going to require a bit more power than smaller nodes.
I keep missing the review part where the E-cores are tested running mundane tasks and how much energy they save doing these simple tasks. I thought that was the party piece of these new CPUs.
Why include the 5950X with an all-core 4.7GHz overclock topping the power consumption chart, and then not include it in the actual productivity benchmarks? Weird decision.
It seems to be an equation with 3+ variables. For home users it can be a simple choice, but if you're commercial, 2000 Alder Lakes may wreck your energy budget. Smart CFO/CIO management teams must weigh AMD's efficiency against Intel's speed.
The new Alder Lake CPUs are making things exciting again; great job with these review videos, Steve! I'm also hoping that when Intel releases their new Xe HPG video cards this next year, it will help bring some normalcy back to the GPU market! And I LOVE the new studio/office setup, it looks great; can't wait to see your new studio tour videos when you release them, Steve! :)
Why do they never benchmark games that are extremely CPU-bound, like Planet Coaster, Age of Empires, or Cities: Skylines? That would be a better illustration of their power.
@@numberM4 Idk about benchmarks, but if I'm looking at buying a CPU like the 12600K for gaming, it makes no sense, as they pretty much all run the same in GPU-intensive games. Games that are CPU-intensive, like Planet Coaster, are the only real reason I'd even think of upgrading from my last-generation CPU.
Grab a GN Tear-Down Toolkit on back-order now to guarantee you get one in the next run! store.gamersnexus.net/products/gamersnexus-tear-down-toolkit
Intel pushed Windows 11 for Alder Lake, so we tested Win10 vs Win11 here: ua-cam.com/video/XBFTSej-yIs/v-deo.html
We also reviewed the 12600K here: ua-cam.com/video/OkHMh8sUSuM/v-deo.html
We kicked off our Alder Lake reviews with the i9-12900K here: ua-cam.com/video/fhI9tLOg-6I/v-deo.html
Learn about the CPU architecture here: ua-cam.com/video/htCvo9XJZDc/v-deo.html
What about the coasters dude?? 😜
E-cores are causing problems in DRM-protected games and programs, and they also force AVX-512 to be disabled. Making the cores different is a bad idea; it reminds me of the PS3's Cell CPU, where heterogeneous cores gave developers many problems. Intel repeated the same mistake. We need to send Intel a message to stick to unified, identical cores in the next CPU generation and just make those cores more power efficient. I suggest everyone skip 12th-gen CPUs and stick with 11th gen, to let Intel know that E-cores suck and we don't want them again in 13th gen.
@Windows Alternative An extra 5-10 frames once you are already over 170 FPS doesn't seem worthwhile when you are using 20% more power. That's more heat, more fan noise, and a shorter lifespan for your gaming rig. For me, performance captures all of those key metrics, not just FPS, which is why at this stage I would still stick with AMD.
Your lighting is not as good as usual.
@Windows Alternative You just outed yourself as an Intel fanboy! The sales figures speak a different language, at least in my country, and the majority is usually right. And no, pure computational performance or FPS is not the only thing to consider, even as a gamer! What about the upgrade path? Has Intel mentioned anything yet? That Intel is the current king is YOUR opinion. You should quickly realize that in life there are people with different opinions than yours; otherwise you won't get far. 🤦♂
And no, I'm not an AMD fanboy. In fact, I'm considering an upgrade from a 4790K to a 12600K right now. But the upgradability on AMD's side is just so seductive... especially at the beginning of a new super cycle like AM5 might be! 🤔
13:03 “Out ranking the 10900K from last generation.” 11th gen really is that unremarkable.
The only remarkable thing from 11th gen is the 11400 and its F SKU.
The 11900k is a waste of sand after all.
11th gen is remarkable, just not in the good way...
@@srquack27 exactly, the 11400 was the best value CPU this year. Next year the 12400 is going to be the same.
It was a waste of sand
"F-SKU" from Intel. Well F-SKU too!
fk!
😂
F-SKU right back 😂😂
😂😂😂
Can't believe IRL I lol'd at this. Hats off to you sir brilliant.
13600k owner's contingency plan
Those E-cores, in quantity and speed alone, are like 3 times faster than my 2014 laptop.
That's my case too, and they are less power-hungry too.
Seems... according to someone at Intel, they're equivalent to something like an i5 6600.
@@TheGameBench The fact that I used a 4690k not too long ago and it’s slower than peoples ‘background task’ CPU now. Wild.
@@randocrypto1678 I can see these E-cores clocking higher in the next few generations, and when that happens we are going to see massive gains... hopefully.
@@TheGameBench They said a Skylake core but didn't specify more. It's probably closer to an 8th- or 9th-gen Skylake i5 core. They're more efficient.
Kinda nice seeing head-to-head competition again. Hopefully there will never be another Bulldozer situation for the rest of history.
Competition is good for everyone. Except for forum moderators that have to deal with the fanboy wars.
or 14++++++++++++++++++++++++++++++++++
@@robihr ^ forum moderators worst nightmare.
@@twiggsherman3641 lmfao true
@@robihr Skylake4ever
The 12700KF is definitely on my radar for gaming, but given I'm likely not going to build a PC until maybe mid-2022, I'm still likely to hold off to see what AMD does. But I love times like this with new tech coming out; it's all exciting and has me wanting to build a PC again, especially when my current one is a 6-year-old 6600K rig.
Same, I'm waiting to see what AMD has got, and a little longer until the next GPUs come out. May God save our wallets.
6700k/980 ti here. I'll be excited when $700 gets you the TI card that's bested only by the Titan again.
By the time you're done waiting for AMD's next gen, Raptor Lake will be out 2 to 3 months later.
@@stormchasingk9 You're gonna take that $700 dollars, buy a 5050ti and like it, or else.
- Nvidia Exec swimming in money
I'm pretty sure Ryzen's next big.LITTLE architecture will be more efficient, assuming they have something like that. But there's a lot to consider when buying; we still have RAM compatibility issues on Ryzen, and who knows what further problems DDR5 will bring. I've never had a common mobo issue with any Intel system I've owned.
When you do iGPU testing, I'd love to see DDR4 vs DDR5 testing. iGPUs are one of the few places where the massive bandwidth of DDR5 makes a lot of sense right now, but I'm not sure that the anemic 32 EU iGPU included in the K series Alder Lake parts really has enough horsepower for the advantage to show up.
What I find amazing is that Intel opted to stick the 96 EU iGPUs in _laptops._
@@sonicboy678 who is more likely to need an igpu? a desktop with unlimited power and cooling or a laptop that needs to be x mm thick?
@@gunterreich2535 A majority of desktop computers rely on integrated graphics. The more powerful they can be made, the better for consumers. Also, there's that pesky GPU shortage, so if someone has to make do without a GPU, finding the fastest iGPU is in their best interest.
@@sonicboy678 because laptops are famous for having dedicated GPUs? Makes sense to give more attention to the weakest part
@@gunterreich2535 Both. Intel spends tons of transistors on the video engine and iGPU for encode/decode, machine learning, VNNI, DL Boost, etc. All of these affect hardware-accelerated workloads (e.g. Adobe, Gigapixel AI stuff), and Intel likes to put up about 15 PowerPoint pages saying that in these benchmarks they have the high ground compared to the Apple M1 and AMD. Adding accelerators is the future, instead of using x86/ARM cores to do the work.
SUMMARY
18:22
"Far more competitive than anything Intel's launched recently."
Thanks Steve
Back to you Steve
They really did well with the 12700; it seems to be the best all-around CPU from Intel this round. Can't wait to see what Ryzen 6000 will be like.
Yes, if AMD manages to continue on the same track as previous Ryzen generations, we can expect a significant uplift in performance.
It will also be very interesting to see what AMD has with the new Ryzen 5000 chips with the added cache memory: will they be able to push past Intel's new 12th generation, and can they sell them at a competitive price?
If you always wait what future releases will bring, you never appreciate what todays tech gives.
@@obsprisma Yes exactly, bring on the high power and heat and better A/C.
@@obsprisma That is true, but with this release I feel it's a bit justified to wait a few months. Only the top motherboard chipset has been released so far; you can use DDR4, but now that DDR5 is a thing I would like to go with it, and those kits are very scarce right now. You can use Windows 10 and it works well in most cases, but this platform is aimed at Win 11, and that is buggy as hell for now.
I've seen a number of reports from people who can't get any image from their GPUs on this new platform; I suspect it is a UEFI (BIOS) bug.
All this considered I feel I will wait something like 4-6 months before I make a decision to upgrade or not.
But that's just me. I'm not saying there's any wrong way, whichever route someone goes; this is just what feels right for me right now.
@@obsprisma Once you start examining the specific performance and features of a given new CPU generation from any chip maker, the adage of "it doesn't matter, there is very little reason to wait, because something better is always around the corner" kind of goes out the window for many people, in my estimation.
Zen 4 could be a perfect example. There could be a sizeable amount of people who want the features and performance Zen 4 MAY bring, that is not able to be found elsewhere, and thus, worth the wait for some people.
Power efficiency better than Alder Lake, better than Raptor Lake, and better than Zen 3 V-Cache, especially in the non-flagship CPUs of that lineup. Zen 4 mid-range CPUs will be sought after by gamers, and by gamers who like having more power-efficient CPUs than the other comparable options of the day.
Performance roughly on par with Raptor Lake, in terms of gaming performance. Or at least, better than Zen 3 by over 15%. This performance level can be seen as a good place to pause for a decade, because of the high frame rates we are already getting with these CPUs and even more so by Zen 4 era, and because of the next feature....
Performance, in terms of the PCIe 4.0 or 5.0 bus. If Zen 4 has PCIe 5.0, that will be seen as something that protects your investment for a long, long time. A Zen 4 CPU with a GPU upgrade every 4 years is a great way to stay on top of gaming performance. In 2028, a fully saturated PCIe 5.0 slot with a power-efficient and fast GPU will be quite the complement to a Zen 4 CPU, which at that point will have been serving the user well for 6 years or so (and was, potentially, as of this writing anyway, the most power-efficient mid-range gaming CPU available in 2024, with performance indistinguishable from all other gaming CPUs available in 2022, possibly even the flagship CPUs of that year).
These are exciting times. Already on a Ryzen 5900X, but stoked about Intel actually putting up a fight. Don't want AMD to become another Intel of the past. More power to the consumer! I feel the 12700K is a great gaming CPU for the price.
I haven't been looking at AMD for a very long time; ever since they won the fight, their pricing has been ridiculous, far worse than when Intel was the one monopolizing.
It's a good price but the mobo prices 😶
@@nismo927 DDR4 board prices look normal; just forget about DDR5, nobody needs that.
Putting up a fight... you twit... it's a vastly superior platform. AMD was ahead for ONE generation in the last THREE DECADES, and the AMD shills just keep up the bull-twang.
iGP reviews would be very helpful, especially comparing Intel's iGPs, AMD's APUs, and where they sit relative to entry-level GPUs. GPUs are so hard to buy and so expensive right now; I'd like to know if it's reasonable to just buy an APU / CPU with an iGP for now and get a dGPU later when prices come down.
if you still have any dGPU from the past 8 years, odds are it'll be better than the iGPU
That said, iGPUs are great for redundancy & troubleshooting. If anything happens to your GPU, you can still drive the display
@@InnuendoXP First-time builders likely don't have a GPU just lying around to use. Some GPUs are also EOL with no more driver support, and with a new version of Windows out, older drivers may have compatibility issues. In the current market, iGPUs should be considered more.
@@Biwa_Hayahide Nobody concerned with compatibility issues should be upgrading to a new version of Windows within the first year of its life.
Sure 1st time builders should absolutely stick with the iGPU & play some CS:GO at 720p or something until the market returns to sanity. But for anyone else, if you intend to game on an iGPU, just set your expectations accordingly. The Steamdeck will have a state of the art APU not yet available to desktop users & it's lucky to trade blows with the base Xbox One.
Considering the current GPU pricing debacle, the 5600G and 5700G are the stopgap APU solutions. Intel iGPUs don't hold a candle to them. The GT 1030 is the sad GPU alternative, but viable when compared to the AMD APU options.
I'm really surprised no one has done this yet. Would be great if GN or HUB looked into it.
If the price difference is just 20-30 bucks, one big advantage of having an iGPU, performance aside, is redundancy and help with troubleshooting.
I'm still pushing an ancient 4690K and an old GTX 970. Having the iGPU helped me with troubleshooting hardware issues that came down to a faulty DP cable, and in the meantime I still had a functional PC for work even with the GPU apparently out of action. It was also good enough to play some indies in the meantime.
I had almost the exact same problem AFTER upgrading away from my old 4690k, and no iGPU makes things a lot more tedious. Especially when the CMOS is underneath the graphics card!
@@AlisterCountel and a lot more nerve-wracking in the current market. If GPU prices were actually sane right now & I had no iGPU honestly that might've pushed me into upgrading sooner than I'm ready for.
It's also a shame AMD's Zen 3 lineup compromises their 'G' products' CPU performance so harshly.
same here brother...
I have a 4690K and a GTX 1080.
Yeah this is why I bought myself a cheap $50 GPU along with my 3600X, just for troubleshooting. Back then I thought $50 was a lot for that, boy how times have changed.
I always buy the iGPU chips, and for the small difference I reckon it's worth it. Nice to build your system, get the BIOS updated, and test it out properly before you start adding a GPU into the mix.
The fact this CPU is now $189.99 on Newegg is just insane. I picked up an F-you variant just now and am gonna sell off my 12400F. I have so many old GPUs lying around I don't need an iGPU, unless it's from AMD :)
This i7 is ridiculously fast and power efficient, it will last as long as a 2600K.
Where'd you find it for $190?
You guys are probably already working on this but I'll mention it anyway.
Adding light-load and idle power consumption to the test suite for Alder Lake (Windows 10 & 11) would really show how efficient those E-cores are. It would be useful for the real world too, since not everyone runs their system at full tilt all the time; efficiency can lead to lower operating costs, etc. Although they might consume more power at full tilt, if they reduce the power draw during light tasks, overall power consumption might be better for Intel.
It would also be great to see power draw in different scenarios, like gaming or Adobe apps.
@@Kaygoooo I've seen them do gaming power tests once or twice. The difference between power draw during gaming between different CPUs is really marginal compared to how much juice your GPU would be drawing anyway. Most CPUs, i5, i9, Ryzen 5s, Ryzen 9s were all around the same ballpark, although I am unable to find the video in which they showed it at this moment.
It doesn't make a meaningful difference. Anandtech has done per-core power draw of CPUs, links at the bottom.
Loading 3 cores of the 5800X should give you a power draw of around 71W. Assuming 10 hours per day, every day, and 10 cents per kWh, you're looking at a yearly cost of around $26.
Loading all 8 efficiency cores of the 12900K draws around 48W. Using the same assumptions you're looking at a yearly cost of around $18.
Idling the 5800X uses around 12W. That works out to around $4 yearly.
Idling the 12900K uses around 5W. That works out to around $2 yearly.
Would this matter if you were deploying umpteen thousands of computers? Absolutely. Does it matter for the average home user? Absolutely not. Even worst case scenario, with the CPU hammered for those 10 hours per day, that's 131W for the 5800X and 259W for the 12900K, that's still only $48 and $95 respectively. The biggest difference is only $4/mo.
www.anandtech.com/show/16214/amd-zen-3-ryzen-deep-dive-review-5950x-5900x-5800x-and-5700x-tested/8
www.anandtech.com/show/17047/the-intel-12th-gen-core-i912900k-review-hybrid-performance-brings-hybrid-complexity/4
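The yearly-cost arithmetic above can be reproduced in a few lines, using the same assumptions (10 hours/day, 10 cents per kWh, wattage figures as quoted from the Anandtech measurements linked):

```python
# Convert a sustained power draw into a yearly electricity cost.
def yearly_cost(watts, hours_per_day=10, usd_per_kwh=0.10):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost(71)))   # 5800X, 3 cores loaded   -> ~$26/yr
print(round(yearly_cost(48)))   # 12900K, 8 E-cores       -> ~$18/yr
print(round(yearly_cost(131)))  # 5800X fully hammered    -> ~$48/yr
print(round(yearly_cost(259)))  # 12900K fully hammered   -> ~$95/yr
```

Which matches the conclusion: even in the worst case, the gap works out to only a few dollars a month for a home user.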
@@BobBobson This is a good result for these hybrid CPUs
Looking at the data they might be 30-50% more efficient doing the same light task or idling.
Maybe I shouldn't have led with operating costs, but lower power consumption has other benefits as well: being better for the environment, and if they can scale this down well, better battery life for mobile devices. A lot of the world's power consumption today comes from computing devices, and if we can reduce it with hybrid architectures, that's a win.
It affects the response times; it really isn't worth it in my opinion. Keep this tech for industrial embedded solutions.
Gamers care about response speeds and overclocking
I was supposed to be in the process of building a 5900X video editing rig, but now I am leaning towards the 12700K because my local Micro Center has them for $399. Thanks for the helpful review as always!
Wait for ryzen 6000, it may surpass intel's efforts.
I just bought a 12700K for $190, which is great value, and I think even today this chip will perform amazingly for my productivity and some gaming use.
Also, my almost 10-year-old i7-3770, which served me very well, is now very outdated and slows down my 3D work, so this will be a great upgrade.
Not to mention going from 16 GB of DDR3 to 32 GB of DDR4...
Next month I will put all these parts together and hope it will last 5 to 10 years as well.
Thanks GN for your amazing work
The 12700K might be the best CPU launch in a long time, not only for Intel; in general I think it is a milestone in technology.
I think it is being underestimated just because recent history has taught us that the 'middle' CPUs are not worth it.
This one is the exception.
Not sure why. It's fighting for parity with the Ryzen 5k series while costing significantly more as a package (higher mobo/RAM prices). It's novel, but not nearly a great CPU launch. This isn't a Ryzen upset by a long shot.
Intel fanboys are so disconnected from reality. No shelves are sold out of the 12900K, yet when the Ryzen 5000 series came out it was sold out everywhere; hell, it's still kinda sold out in most places even a year later.
Walk into Best Buy: hundreds of 12th-gen Intel chips for sale. Walk into Micro Center: hundreds of Intel chips for sale. How is this revolutionary? They probably only sold a few 12th-gen CPUs, and AMD has sold over 8.3 billion 5000-series CPUs.
@@CHICKENmcNUGGIESMydude ironic
@@CHICKENmcNUGGIESMydude lmao 8.3 billion CPUs you really are delusional
@@CHICKENmcNUGGIESMydude so they sold more than the population of the planet? GTFOH
The wait for the B660 boards from Intel still continues...
Interested in the non-K SKUs too. Sure, the 12900K is a beast of a CPU, but... how good/efficient will that same CPU be when it's locked and chained to a 65W TDP?
And how good will it be when you remove said power limits?
@natma relnam It's more about seeing how far you can push performance per watt with non-K parts. While it's a bit raw yet, and very incomplete, I'm actually getting interesting results from an 11900F without power limits vs a stock 11700K, at around 180W, and I still think I can improve the results somewhat further.
It is very synthetic at the moment though, and I might be doing something wrong, but at the time I even got it for less than the 11700K.
Here in the German speaking EU region (Austria, Germany) you can get the 12900K for a cheap 700 Euros (812 USD). Motherboards that can actually properly drive it go around 300-500 Euros (350- 580 USD) and DDR5 is out of stock everywhere but would go for around 340-400 Euros (400-465 USD) for 32 GB CL38 5200. So worst case you pay around 1600 Euros (1855 USD) for CPU+MB+RAM which is fucking ridiculous. If you add possible extra cost for a new PSU and a better CPU Cooler you land at around 2000 Euros (2320 USD). I don't call that competing on price.
Ouch, $2300 US would build me an entire Ryzen-based PC, including a decent graphics card.
The 12700K goes for around 500 Euros and the 12600K for around 330 Euros. Also not super competitive, but at least the overall price goes way down, because you can use cheaper motherboards without issue, save money by skipping DDR5, and you don't need a new PSU or cooling solution.
Yeah, it's actually kinda crazy. Well, at least in my case it was an astonishing upgrade from an old PC (plus I really needed a new one), and I managed to get the minimum for an i5-12600K & DDR5 at just 1000 Euros.
It's still a lot, but I think it might be worth it, as who knows what happens with prices and DDR5 stock.
In comparison I recently upgraded my system from a 8700k to a 5950x (I use my desktop for work, for gaming I would've gone with the 5900x). I just upgraded CPU and motherboard (ASUS Crosshair VIII Hero) as my PSU, RAM and trusty NH-D15 were absolutely up for the job. Total cost 1220 Euros.
Today I got a deal on Amazon for the i7-12700KF at $199. I'll pair it with an RTX 4070.
So how is it?
Love how Steve never gets bored of the "F SKU from Intel" joke. 😂
Buying an 11700K was like buying a 7700K: just in time for the 1080 Ti and 3080 Ti, but you miss out on so much extra CPU power. 8700K + 1080 Ti and 12700K + 3080 10GB builds are king.
Could you test 7-Zip with dictionary sizes other than 32MB? That's just the right size to fit in AMD's L3 cache and not fit in Intel's. There's a wide range from 2MB up to 1GB dictionary size that 7-Zip supports in its test, and it may be interesting to see what happens with different sizes over that range (e.g. normal compression uses 16MB while ultra uses 64MB).
I think it might be better to publish why they settled on 32MB for dictionary size. How does that affect final compression performance relative to other options?
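The cache-fit argument can be sanity-checked against the published L3 capacities (32 MB for the 5800X, 25 MB for the 12700K):

```python
# Which of 7-Zip's selectable dictionary sizes fit entirely in each CPU's L3?
# L3 capacities from public spec sheets: 5800X = 32 MB, 12700K = 25 MB.
l3_mb = {"Ryzen 7 5800X": 32, "Core i7-12700K": 25}
dict_sizes_mb = [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024]

for cpu, cache in l3_mb.items():
    largest_fit = max(d for d in dict_sizes_mb if d <= cache)
    print(f"{cpu}: largest dictionary fitting in L3 = {largest_fit} MB")
```

So the default 32 MB test dictionary fits the 5800X's L3 exactly but spills out of the 12700K's, which is exactly the asymmetry the comment is pointing at.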
As someone who paid $450 at launch for his 5800X (and doesn't regret it), the thought of it being sold at $330 is an incredible deal.
I just bought it at 300 Euros in the Netherlands, lol. Waited for that deal for a long time.
Man, the Ryzen 5 5600X definitely sucks.
Now it's overshadowed by two 8-core CPUs, the Core i7-10700K and Ryzen 7 5800X.
Why did AMD price the 5600X higher than $300 MSRP?
Why?
It's the 11900K of Zen 3, a dumb product.
The 12700 is only $20 more right now at Micro Center. I'm in the store looking at it, thinking about how I have a 10850K that I can sell for almost the same amount. I don't know why I'm debating this much.
I paid 350 and got a free motherboard 😂
Just over a year later, I just bought my 5800X for $249 US, new, lol.
F SKU almost made me wake up everyone else with laughter
It hits you when you least expect it lol
If I bought T-Shirts, I'd consider buying a F-SKU t-shirt
F-SKU-se me, sir, but here in everyday Romania, if the average cost for Americans will be ~$600, EMAG will slap down a "mega deal" at around 900-1000 Euros. Customs taxes kill all our momentum.
That was great bwhahahaha
Beautiful story, bro.
So, Intel is finally taking AMD seriously enough to be at parity with them. Let's hope both companies duke it out hard next gen so we win more. I don't even care who 'wins', I just want an upgrade for less than my kidney.
Wrong, prices will only rise. Intel kept the same prices all throughout 14nm+++++, so when they finally had something decent they could launch at those prices and then raise them next gen.
More like caught out by them; for the last 5 years Intel's just been stumbling.
@@denverbasshead Intel kept 14nm because they led the market from 2012 to 2016, so they could charge whatever price they wanted. Now AMD can compete with Intel, so a price increase is unlikely, unless it's a demand-and-supply scenario.
Comments like this one really display how clueless people are.
@@denverbasshead Prices will rise all the time... but what will we get for that raised price? Maybe a 2x faster CPU? Then we win, because for 5 years Intel sold the same CPU at a raised price...
Gotta love that you won’t really get a bad product whichever platform you go with today.
Well according to r/AMD there's only one choice, and if you don't pick it, you're a fanboy.
It's not just raw performance now, there's a lot to look at when buying, both of them provide exclusive advantages.
Reddit breeds hiveminds anyways. Soon people in that subreddit will be praising AMD for being what Intel used to be criticized for (pricing crazy high because they think they're the premium brand when they're not).
@@twiggsherman3641 Don't try to say only one side does this; all 3 have fanboys who do this.
It's more than a platform war, you need to go in deep. AMD or Intel, you need to buy the right ones.
The 12700K looks better than the 12900K in terms of price to performance and also power consumption
12900k isnt worth it
@@isucktrustme8672 Intel's best is really in the i5.
just bought one from Newegg sale for $215.00
Best i7 of all time at this point.
I just love watching these charts at 4k60fps. Oh, and Steve's facial expressions.
Diminishing returns is the phrase you’re looking for I think. Great work, nobody does it better. Thanks Steve and all at GN!
It would be very helpful to publish CPU temperatures in addition to power draw. Heat distribution across the die can affect how well the CPU can be cooled so power draw is not an accurate index of how easy or hard it would be to cool a CPU.
Wondering where a 12700KF with E-cores turned off would end up sitting on the overall performance ladder, while hopefully avoiding the hybrid scheduling problems
Micro Center has the 12700K for $399, just picked one up for my latest build. Great value.
$320 would be a great value
@@BBWahoo not exactly 320, but I got it for 350 at microcenter a month ago-ish
@@Gary_Sherman God DAMN!!!
@@Gary_Sherman I drove 4 hours round trip to my closest MC to get a 12700k for $300 with a new customer discount. Couldn't pass it up!
On Amazon right now for $320, about to grab one.
I’m really enjoying my 12700KF with 6700XT. Upgraded from 4770s and R9 290. Night and day difference gaming.
I got from i7-4790k and 1060 6gb to i7-12700kf and 3070. Massive jump.
iGPUs are fairly important for people who want to build a professional rig on a budget, without discrete GPUs. Not all professional workloads are like machine learning or blockchain mining, requiring GPUs/GPGPUs.
I am one of them, and so are a few of my teammates at work; my personal rig does not have a graphics card, yet it already had 32 gigs of RAM and 3 SSDs around 4 years ago.
If I was to build a rig now, I'd be looking at these new Intel CPUs or AMD 5900x (will add a sub-$50 gpu). Even the 8-core 5700G will probably cover my current requirements. If I can couple that with gen-4 SSDs and 64 GB ram, I have a machine that will run a whole load of virtual machines with high utilisation software running. I hope AMD will be releasing APUs with higher core count than 8.
iGPUs are a godsend for pro audio builds. No DAW needs a good GPU, and it drives the cost down significantly. The 10700(K) was a phenomenal pro audio CPU. It was truly a shame the Ryzen APUs felt a bit gimped.
@@jakacresnar5855 What about the 5700G seems gimped to you?
@@MadClowdz well considering it only came out a few months ago and the 10700k came out in 2020, I think it not existing when he needed to build his rig is what made the AMD APUs feel "gimped"
Don't bet on it. Hell, at this point, AMD is _still_ using monolithic dies for APUs, which is part of the reason why they have half the L3 cache.
Doubt we'll see AMD with more than 8 cores any time soon. Why don't you just buy this....
I found the bit @16:20 about the AMD 5800X having less stutter very interesting. I also recall that when the Ryzen 1800X first came out, reviewers actually noticed this, even though it didn't show in the tests. As I recall, that was the reason 0.1% lows were included in tests by several reviewers. Seems like that bit is still relevant.
Brain dysfunction caused by AMD soyboy thinking
The fact that your MT workload frequency isn't constant suggests power limit throttling (could also be thermal but I'm assuming your cooling is adequate); so maybe the motherboard settings are different between the 12900K and 12700K. Intel PL1/PL2/PBP/MTP strikes again.
The only difference between the K and KF is that one has integrated graphics and the other doesn't? They're basically the same otherwise?
The iGPU difference is NOT basically the same. If you buy the K or non F version, that means you can use the computer without a graphics card.
I sorta can't wait for AMD's response to these CPUs. Prices for Zen 3 are already significantly reduced (I saw 5800X going for $299 this week at Microcenter). As an AMD fan, I love that Intel finally made a CPU generation that beats AMD. It's healthy for the market. Better products, faster development, and cheaper prices. It's only good for the consumer.
What do you mean finally? Literally the only series that was faster than Intel was the Ryzen 5000 series which is the latest series... and the difference between 10/11th gen and ryzen 5000 is smaller than the jump from ryzen 5000 to intel 12th gen.
Capitalism sure is amazing!
Just ordered a 5800X for $250 US, and of course the 5800X3D comes out for $100 more but with up to 50 more FPS in 1% lows in a ton of games compared to the regular X lol
Super looking forward to the efficiency results!
Yes, me too. It’s in the title, but alas nothing more than some power consumption figures.
great video as usual!
So glad I bought the 5800X; that power usage vs performance is amazing.
Have you seen the benchmarks for the new 5800x3d? Up to a 50 fps increase in 1% lows compared to the 5800x for 100$ more
been waiting for this video to finally make my decision - thanks gn team!
It's been a very interesting release, but I'm waiting to see what Raptor Lake and Zen4 will have to offer on more-mature DDR5 platforms! Though I must say, a 5800x for $300 is actually quite a great deal.
I will say, even after seeing what Alder Lake can do, that I have zero buyers remorse for my 5800X, even at the $395 I paid for it back in May. One of the main reasons I bought it was to see if it could get RPCS3 playing Gran Turismo 5/6 at a locked 60 FPS. Amazingly, it does it, although PBO/Curve Optimizer and 4x8GB of 3733MHz CL14 RAM was a key factor in that. Everything else I've thrown at it on top of that, it just chews through it. In my 20+ years of building, every time I've bought a CPU there was always some application that made me think "I wish it was a little bit faster." The 5800X has been the first CPU where I'm like "damn, this thing just keeps delivering on all fronts and continues failing to disappoint." $300 for a 5800X is a killer deal, use the money you save on not buying overpriced DDR5 for a better GPU and the 5800X will be PLENTY for years. Yes, Alder Lake is a monster, but believe me, so is Zen 3.
@@K31TH3R what Ram do you use?
@@dafaqu694 One set of 3200MHz binned Samsung B-Die, and one set of 3600MHz binned Samsung B-die, both G.Skill Ripjaws V kits. Bought the 3200MHz kit 2.5 years ago for $105, and the 3600MHz kit 1.5 years ago on sale for $118. You don't really need B-die though, you can get nearly the same results (maybe -2%) with Hynix DJR modules. The G.Skill Ripjaws V 3600MHz GVKC kit is Hynix DJR, and those will run 3800MHz CL16, and you can get 4x8GB of that for around $185 which is really good bang for the buck.
It's an amazing deal if you need a system today I agree, though it does also mean you will be limiting yourself to EOL DDR4 and PCIe Gen 4 (which honestly, that one is not going to be much of a handicap for a long time), not much path for future upgradability if you go that route.
@@catlikehana Yes but I was specifically talking about pros and cons of going for the 5800X pricedrop. Also, we can't say for sure what the difference in performance will be until it's been released and tested.
The best speed to watch Steve's videos is 2.5x via developer console. 2.75x starts to get into gibberish territory.
Great video! Two things I'm interested in:
1) Power consumption while gaming (you only showed fps)
2) Impact of e-cores on/off for gaming in Win10 and Win11 (mostly Win10 if I'm honest)
I picked up an i5-12600K with a Z690 Aorus Elite AX mobo and 32GB DDR4 3600MHz. Upgrading from an i7-5960X with an ASRock Fatal1ty mobo and 32GB DDR4 2133MHz. I'm pretty sure I'm gonna be happy. The old system will become a workstation/gaming station for my wife. Had to buy an RTX 3070 Ti since my old PC needs its RTX 2070 Super to still work; got the Ti because I actually picked it up off Amazon cheaper than the regular 3070.
Thanks. Great review as always you’re now the gold standard for reviews of this type and I recommend your vids to everyone.
For strictly 4k video editing on premiere pro should I go for 12700k or cut cost and go for 12600k. Is spending extra worth it? Please let me know, thanks🙏🏼
You might want to look at the 12700 non-K variant. It costs very close to the 12600K and it's just a better performer, with 2 more cores and a lot more cache. OC doesn't really matter for Alder Lake.
I hadn't been this excited about the CPU market in over a decade, and I never thought it would be Intel bringing so much of the excitement back!
You spelled apple* wrong 🤣
@@djlytic4603 If I wanted to cut off two fingers all while swearing loyalty to my Cupertino overlords... then I'd care about Apple chips
I mean, I myself (and I'm sure a lot of other people as well) would say Ryzen was the turning point for the CPU market when Intel was still charging $1000 for an 8-core CPU, which isn't even half a decade ago. Now it has finally gone back to a competitive market after years of incompetence by Intel following AMD's resurgence.
@@djlytic4603 I can't buy an Apple chip and build a custom PC and run my OS of choice on it, so... Although they may be interesting from a technical perspective, they are not *exciting*.
decade? were you asleep during the ryzen launch?
I'm upgrading from a 4770K to a 12700K. I'm planning on using my semi-new Cooler Master ML240R AIO with the 12700K via an LGA 1700 bracket and was concerned about the temps. So I'd very much appreciate it if you guys did some temperature tests of Alder Lake with a cooler made for previous-gen sockets.
@@plop31 ... First of all, I can't afford a new AIO (a 360mm on top of that) since I have to buy a mobo as well as RAM kits. Also, my current AIO isn't shitty at all. And I have no plans on OCing. Also, I'm avoiding the F series because then it'll be very hard to troubleshoot any faulty GPUs, and/or I may need to use the integrated graphics while I buy or wait for a new GPU.
Thanks for your suggestions anyways :)
@@MagikMehedi ... the Arctic Liquid Freezer II 360 is the same price as most 240's with all the fancy RGB, homie.
@@twiggsherman3641 Well, homie!! as I mentioned, I can't afford another Cooler right now and also Arctic Liquid Freezer II 360 isn't available in our country, no one (Distributers) sells it here.
@@MagikMehedi well homie, move to a better country. Problem solved. :D
Alder Lake pricing looked promising but then I looked at the prices of motherboards at my retailer and I noped the hell out of there. Meanwhile there are years old dirt cheap B450 motherboards that can run a 5950X at full power without even overheating the VRM.
Yeah, if you can afford a 5950X, you pair it with a trash PCIe 3.0 mobo with a PCIe 2.0 chipset, so not only is your GPU bottlenecked but also your PCIe SSDs. Oh no, you must be an AMD fanboi, sorry, didn't know the condition was pathological.
Yeah nobody seems to consider that after looking at the benchmarks even though everyone does mention how expensive everything is.
You can't run a 5950X at its peak performance on a B450 board. Stop sniffing glue.
I agree the new Z690 boards are still too expensive, but the new Z690 boards are far, far higher quality than those lower-tier B450 boards; dumb comparison.
Most B450 boards on the market can't really handle a 3800X without melting the VRM... don't even mention a 5950X lol. It's a case of "it is possible, but that doesn't mean you should." Playing basketball in heels is the analogy I would use.
Got a Ryzen 5800X earlier this year and could not be happier. It kicks the crap out of the AMD 8350 I had before it.
It's a great CPU. At the time of release it was a bit overpriced given where its performance sits between the 5600X and 5900X, but it was available at the time. I just got the 5600X last month and love it. I'll probably upgrade to DDR5 in 1-3 years; everything right now is too expensive.
@@bsx132 for sure, but as long as i can run the games i play im good lol
I've just finished building a brand new platform upgrade (coming from an i7 870). I chose the Ryzen 7 5800X since the whole build (CPU + B550M MAG Mortar + 32GB 3200MHz RAM) costs almost 1/3 of the price of a brand new Intel CPU kit (3000 BRL for AMD vs 8600 BRL for Intel). With the dollar conversion being so high right now, buying brand new hardware here in Brazil is even more expensive than elsewhere in the world. Part of me sees these results and wonders if it was the correct path for a mix of productivity and gaming, but I believe it was the more reasonable choice.
How are you wondering if it was the right call? Brother, it was one third the price, plus Intel's lead here is really, really small, nothing like the 20-25% gap we had before with 8th gen Intel vs 2nd gen Ryzen. Here it's more like 5-10% at best, sometimes a tie, and sometimes even a loss... and you saved a lot of money. I really don't know what you mean by "wondering if it was the correct path seeing these results"; maybe you didn't watch the whole video? Also, you can still upgrade later to more cores, and 3D V-Cache is coming as well.
@@igorrafael7429 Not to mention the efficiency on the Intel side is just nowhere close. You're literally talking tens of USD a month in electricity alone if you run your computer hard, say 4-6 hours a day, and electricity where you live is a bit pricey.
@@igorrafael7429 Yeah, I totally agree with every line that you've written. It's just that, technically, I'm buying a PC without the latest gen PCIe or DDR. Imagine: I bought my previous PC around 2010, and I intend to use this one as much as possible, for both gaming and productivity, for a long time. Buying "yesterday's" tech kind of makes me think "What if I...". But I totally agree with you, and it was way cheaper to build this way.
@@igorrafael7429 A lot of people aren't using PBO when testing, and I've heard that when you set the PBO limit to 250 watts, Intel's lead is nonexistent besides single-core.
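The electricity cost argument in this thread is easy to sanity-check yourself. Here's a rough sketch; the wattage delta, daily hours, and kWh rate below are made-up assumptions, so plug in your own numbers:

```python
def monthly_cost_usd(extra_watts, hours_per_day, usd_per_kwh, days=30):
    """Extra monthly electricity cost from a higher-draw CPU."""
    kwh = extra_watts / 1000 * hours_per_day * days  # watt-hours -> kWh
    return kwh * usd_per_kwh

# Example: 120 W extra under load, 5 h/day, $0.30/kWh (assumed numbers)
print(round(monthly_cost_usd(120, 5, 0.30), 2))  # -> 5.4
```

At these assumed numbers the delta is a few dollars a month; heavier use or pricier electricity scales it up linearly.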
2600K upgrade for me last year; picked up a 10850K for $500 AUD vs $580 for the 5600X. Still using my old Noctua D14, it's still going a decade later!
I feel good about my choice going with this one! Good to see power consumption isn’t as ludicrous as the flagship.
*What they don't talk about*
is just how AMAZING this chip is at undervolting and OCing when combined.
My 12700K literally has a -70mV offset and +1 on each multiplier,
along with a 120-second boost duration instead of 55 seconds.
I went from a stock 87C to 79C, with an overclock and lower temps!!!!
Stock R20 score was 8709 multi. Now it is 8951.
Free performance, with an overclock and undervolt. Insane!!
These chips are more efficient than they seem; they have a VERY generous voltage out of the box. You can almost 100% undervolt any 12700K or i9 by 70-90mV. Just for the temps alone it is worth it. Use the Intel OC tool.
Ironically, I just bought a 5800X today. For me, it was a drop-in replacement for my 2700X, no new board. Also got it for $330 at Micro Center, can't beat that!
Should've waited for Zen 3 3D, the chips would probably drop even further
So you dropped a 5800X into a board with no WiFi 6, no USB Type-C, no PCIe 4.0... and less performance than you'd get out of a B550 or X570. Definitely can't beat that!
@@SweatyFeetGirl maybe, but to be honest, I know myself, I would have ended up spending more on the 3D chip 😂. I thought about waiting but decided I didn’t want to wait months to find out. I bought the 2700X at a similar MicroCenter sales weeks before Ryzen 3000 came out. I wasn’t wrong then, that CPU took over 6 months to drop down to that price again.
@Невада большевик AMD literally announced Zen 3D is coming early next year...
@@sphbecker I think it would be even better to spend a bit more on the CPU, since it would be the final upgrade and you'd get a long life out of it.
I feel like 12700k with power limits extended would be a lot closer in these benchmarks.
"F SKU" isn't as bad as Macs running "Max" CPUs, but it is a funny poor naming strategy.
It's the poor-man's "K"
I was having drops in Killzone on RPCS3 at 10K resolution with my 12400; my FPS would drop to the 40s and back to 60. With my 12700, I hardly get any drops at the same resolution. In TimeSplitters on Dolphin at 8K I got drops with the 12400; I no longer get those with the 12700. L4D2 would drop to the 60s from 120 FPS with multiple graphics mods on an RTX 3080/12400; I no longer get those big drops with the 12700. This chip is a monster. I didn't expect it to stabilize my FPS so much, but I was hoping it would.
I like how Intel skipped 11th gen and went from 10th to 12th 😍
Glad to see this, just customized and bought a pre-built with this CPU. Great video!
Right behind you bud
13:11 I don't think it's as simple as saying that the 5600x/5800x "enjoy leads" here. The 12700k does better than the 5600x, due to equal average fps but measurably better 1% lows (arguably more important for competitive esports titles). When compared to the 5800x, it does slightly trail behind in average fps but again gets the win in 1% lows, which makes them about equal in terms of a competitive experience. The 5900x does simply do better, though.
12700k has more cores tho
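For anyone curious how 1% lows like the ones discussed above are typically derived from raw frame time logs, here is a minimal sketch of the common approach (not necessarily GN's exact methodology, and the frame times are made-up data):

```python
def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS) from frame times in ms.
    The 1% low is the FPS equivalent of the slowest 1% of frames."""
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(slowest) // 100)                # slowest 1% (at least one frame)
    low_ms = sum(slowest[:n]) / n
    return 1000 / avg_ms, 1000 / low_ms

# 99 smooth 10 ms frames plus one 50 ms stutter frame
avg, low = fps_metrics([10.0] * 99 + [50.0])
print(round(avg, 1), round(low, 1))  # -> 96.2 20.0
```

The single 50 ms stutter barely moves the average but tanks the 1% low, which is exactly why reviewers report it alongside average FPS.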
Just ordered a custom prebuilt PC with the i7-12700K and RTX 3080; I'm stoked to see how it runs.
When testing power efficiency (performance per watt), please also take the duration of productive runs into account, as in "CPU X needs 100 watt-hours for task Y." And please also test money efficiency (performance per dollar) at MSRP/best price at a certain time/average price, and please not just the CPU, but the total platform (mainboard, DDR4/5, PSU)!
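The watt-hours-per-task framing above is straightforward to compute if a review publishes average power and run duration. A hedged sketch with entirely made-up numbers for two hypothetical CPUs:

```python
def task_energy_wh(avg_watts, seconds):
    """Energy a CPU consumes to finish one task, in watt-hours."""
    return avg_watts * seconds / 3600

# Hypothetical render job: CPU A is faster but hungrier than CPU B
a = task_energy_wh(avg_watts=220, seconds=600)   # 10 min at 220 W
b = task_energy_wh(avg_watts=130, seconds=780)   # 13 min at 130 W
print(round(a, 1), round(b, 1))  # -> 36.7 28.2
```

Note how the slower, lower-power chip finishes the task on less total energy here, which raw wattage charts alone don't show.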
Someone ought to create a benchmark that simply cycles through all of the CPU instructions one at a time, looped, in order, and returns the results for each instruction.
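A true per-instruction benchmark needs hand-written assembly and cycle counters, but the idea can be roughed out at the operation level with Python's `timeit` (a toy sketch; absolute numbers depend entirely on your machine and include interpreter overhead):

```python
import timeit

# Toy operation-level microbenchmark: loop a few primitive operations
# and report nanoseconds per iteration for each one.
OPS = {
    "int add":   "x + y",
    "int mul":   "x * y",
    "float div": "fx / fy",
}

SETUP = "x, y, fx, fy = 7, 3, 7.0, 3.0"
N = 1_000_000

for name, expr in OPS.items():
    secs = timeit.timeit(expr, setup=SETUP, number=N)
    print(f"{name}: {secs / N * 1e9:.1f} ns/op")
```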
The performance uplift on Intel CPUs with this generation is quite nice!
Hopefully that means that AMD will drop their prices.
Probably not, considering an AMD cpu+motherboard combo is still cheaper than an Intel cpu+motherboard combo
Good review steve👍👍👍
Just bought the i7-12700F for 330€ to finally retire my current i7-4770, funnily kinda skipped the whole Ryzen hype time and the weaker intel CPUs to this one
Same here. I just ordered a 12th gen while my current is a Xeon 1650 v2.
Same but for the i5-4670k. 😆
Every single Intel generation after Haswell was underwhelming until Tiger Lake (11th gen mobile) came out.
Just picked up an i7-12700K for $299, but Micro Center had a deal going on so I got a Z690 Plus WiFi mobo for $50; not a bad buy for $350. :DDDDDD Probably gonna pop a 6800 in it.
As someone that considers having an IGP a necessity for any CPU, the fact that I sometimes see Intel F CPUs listed at the same or higher prices than the non-F versions is mind-boggling.
How is it necessary?
@@pixels_per_inch Even if you never actually end up using it having the option to fallback to integrated graphics will always be better than your computer basically becoming unusable because something happened to your graphics card.
@@Nintony58 Well, the chances of a graphics card failing are quite low. I just wish the iGPU was more useful; whatever happened to explicit multi-adapter?
The 12700K is price-equivalent to the 5900X, not the 5800X, when you consider the platform costs. That's _not_ including RAM.
And that's not basing it on the Microcenter pricing of the 5800X.
True, I think the 11600 and 11700 are the price-to-performance winners, especially for gaming. Ryzen is twice the price where I am for 5-10%, and Alder Lake with DDR5 is even more.
@@roknroller6052 I'd say 10400 wins on value right now, for as long as it lasts, anyway.
When the 12400 comes out, it might just be unambiguously the best choice, but we'll have to wait and see.
Good video my bro, keep up the good work
Love the channel. Been subbed a long time. The content is great as always, but as a fellow video professional, please get some softer lighting.
Subbed since 140k here and 49k on Hardware Unboxed. Lol, look how many subs they have now :)
Some nice RGB, maybe, pulse and breathe!? 😆
Please do a DDR4 vs DDR5 comparison on the new CPUs. Keep up the great work.
I didn't think Alder Lake would be that impressive. I'm so glad to have been proven wrong. The Golden Cove cores are monsters and if Intel had gone for a homogeneous architecture like AMD, I think they'd have blown the 5950x out of the water across the product stack. You'd need a 1200W power supply, but still damn impressive (and I'm saying that as a 5950x owner). I'm super psyched to see how AMD responds.
ryzen 7000 with rdna 2 yeah
Nobody needs a 1200W power supply, except those with 2 GPUs installed
Now that we are seeing some of the final leaks of Zen 4 and RDNA 3, I'd say Intel scared the hell out of them! And now they will reward us with some amazing products!
@@mrsandroks Agreed, it’s not needed, but it’s useful if you want to have a quiet system. I have a 1200W Corsair PSU and even when at full load with a 5950X and the 3080Ti, the PSU fan doesn’t spin up. That’s great since I tried to build a quiet computer such that no loud fan noise is audible from my normal seating position when the PC is under the desk
@@little_fluffy_clouds Niiiice!
Me, having an i7-12700K, DDR5 5200MHz, ASUS PRIME Z690-A and an NH-U12A cooler installed the day after release, here trying to get some confirmation bias.
Upgraded from an i7-4790K, DDR3 1900MHz, ASUS Prime Z97-A. Now my RTX 2080 is bottlenecking me since I play at 3840x1600.
It's good to see Intel fighting back, and it looks like they are the gaming king again.
AMD may not be in trouble right now, but after a few generations they will run short on fabrication size; Intel is still managing the same performance with older fabrication sizes.
There is a limit to how far we can shrink the die, and soon it will be tougher to develop new technology.
Would 12700k or 13600k be the ideal upgrade from 7700k?
Use case: gaming, office work, transcription work, etc.
Both are fine. The 12700K is 8P+4E vs the 13600K that is 6P+8E.
I was thinking about the 13700k
I really want them to release an HEDT SKU with a few P-cores (4 to 8) and a whack ton of E-cores. Would be nice for virtualization.
you mean 8+64?
Thanks. Your Videos on the 11700k and the 12700k were very instructive for me.
For the iGPU, I would like to see it used in graphically light applications like web browsing, MS Office (Excel and Word), video, and so on, then use the GPU for more graphically intensive tasks. I would think the whole PC would use less power that way, and it would give a decent reason to put an iGPU on a CPU beyond troubleshooting to see if the GPU is dead.
So a CPU without an IGP is incapable of even outputting a desktop? What about the command line? It can't output anything?
@cat -.- If you have a CPU that doesn't have an iGPU built in, you don't get a picture out of it at all. If you want some sort of picture out of a computer that has neither an iGPU nor a GPU, you would have to connect to that computer using remote access, or a web interface if you make it a server. Even then, setting up the PC requires some sort of GPU; after the setup, you can remove the GPU. That is what I did with my NAS.
Another great review, Steve! Keep up the good work
Most ASUS motherboards let you use the iGPU along with your discrete GPU for extra monitor support. I get a lot of value out of this function since I run 6 monitors. Definitely worth the extra 40 bucks for me.
How do you run a second monitor from the iGPU?
@@nozyspy4967 It's a feature of ASUS motherboards, not sure about other brands. Been using exclusively ASUS for 5+ years.
Yeah that 6 monitors crowd has really been underserved.
@@Dennzer1 Lol, I'm thinking about adding 2 more actually =P
You can run 4 monitors off of 1 DisplayPort with MST (Multi-Stream Transport). I could just add another video card, but the iGPU only uses an average of 4 (yes, four) watts!
How much RAM would you recommend for a multi-monitor setup?
You seemed to be looking for the term "Diminishing returns" when describing the price/performance falloff for the 12900K. I think that term explains your point pretty well.
I'm really liking the cpu war these latest generations. It's reminding me of the old fight between amd and intel. Hopefully it lasts a while.
An iGPU is also useful when you don't have a GPU at all, and when you have multiple monitors: plug the 2nd one into your iGPU. It helps if you have mismatched refresh rates. If the primary display is 144Hz and the 2nd is 60Hz, both on the same GPU, and you view a video (say Twitch) on the 2nd 60Hz display, the 144Hz display starts behaving more like a 60Hz display (while still running at 144, mind you), which has been an issue for a long time.
This is still true a year later. That iGPU trick should be talked about more in reviews.
Intel : "I'm back! "
12900k using 2x the power of 5950x: "my back!"
AMD Fanboys: "ToO mUcH PoWEr"
Me dumping liquid metal on shunt resistors to increase power draw: "First time?"
Meanwhile, the 12900K used more power only in benchmark tests. In everyday usage (gaming/productivity), in case you missed it, it drew 10-25% fewer watts, measured with a watt meter at the power socket... But yeah, in benchmarks it uses almost double the power.
@@WarshipSub Gaming I could believe as it only uses one or a few cores. Productivity is where it uses a boatload of power. If it didn't, it simply meant all the cores weren't being utilized or it wasn't being pushed.
@@darreno1450 yea, it's literally only going to matter if you're rendering 24/7 AND care about power draw. I suppose there are some people or use cases where that matters.
But it's also still 10nm which is going to require a bit more power than smaller nodes.
@@WarshipSub ik, i was just doing it for the memes
I keep missing the review part where the E-cores are tested running mundane tasks, and how much energy they save in doing these simple tasks. I thought that was the party piece of these new CPUs.
in Win-11
Just curious why there is a can of WD-40 on the backdrop? I can think of no reason whatsoever to have that within 40 Km of a circuit board.
We just don't talk about what Steve does when the camera is turned off. 😄
Good ole can of WD-40, great for bringing your computer case surface back to new again! 😀🎆
JUST ordered my PC with the i7-10700KF/RTX 3070! YAYUH!
Why include the 5950X with an all-core 4.7GHz overclock topping the power consumption chart, and then not include it in the actual productivity benchmarks? Weird decision.
Why indeed....
i7-12700K is only $299 at microcenter now.
It seems to be a 3+ variable equation. For homebodies it can be a simple choice, but if you are commercial, 2000 Alder Lakes may farq your energy budget. Smart CFO/CIO management teams must choose AMD's efficiency over Intel's speediness.
@GN: great review, as always!!! So the real question for me now... "is it time to retire my 8700K @ 4.8GHz?"
The new Alder Lake CPUs are making things exciting again; great job with these review videos, Steve! I'm also hoping that this next year, when Intel releases their new Xe HPG video cards, it will help bring some normalcy back to the GPU market! And I LOVE the new studio/office setup, it looks great; can't wait to see your new studio tour videos when you release them, Steve! :)
Yayyyy. Moar gamers Nexus!
Why do they never benchmark games that are extremely CPU-bound, like Planet Coaster, Age of Empires, or Cities: Skylines? That would be a better illustration of their power.
Do those games have a built-in benchmark tool though? I don't think Cities Skylines does.
@@numberM4 Idk about benchmarks, but if I'm looking at buying a CPU like the 12600K for gaming, it makes no sense, as they pretty much all run the same in GPU-intensive games. So CPU-intensive games like Planet Coaster are the only real reason I'd even think of upgrading from my last-generation CPU.
@@numberM4 I don't think Planet Coaster has a benchmark, but most people that test Planet Coaster just simulate the same ride.
Bought the 12700KF and a Z690 mobo for my new build. Always happy to see Intel competitive.