It's also a launch pad for future generations: a brand-new architecture and a strong pivot away from the high-power, low-efficiency i9s and i7s of the past. This is good for the future, even if it doesn't look that great today.
@@rmiah Yeah, just got a Ryzen 7900X, and yeah, it may get 10 fps less compared with Intel, but it also uses less electricity. I'm not changing that CPU for a while: no overheating, clocking 5.4 GHz at 1.25 V vcore and maxing out at 70 degrees Celsius. 😜
@@peterpanini96 AMD has been way ahead in the efficiency game. Intel is finally cluing in that people don't want to pay an extra $100 a year on their power bill for a few more FPS than the competitor.
@rmiah But why choose Intel? The problem isn't what they're trying; it's that AMD is better at every step: more performance, cheaper CPUs, and the motherboards are for sure way cheaper than the new Intel boards. The Intel brand is also absolutely not what it was after the last incident, which showed terrible support at times, especially since they knew about the problem for at least a year.
@@ehrlichgesagt863 Why did people choose AMD when Ryzen 1 dropped and it was marginally worse, if not equal, to Intel's offerings (ignoring pricing)? Competition is good in this space no matter what your preconceived notions about each brand are. And as the later tests in the video showed, Intel's new architecture still holds its own for productivity and AI tasks.
Nowadays, AMD is the best choice for almost anyone because the latest AMD CPUs use the AM5 socket, which will still get several more years of support. Back in the 2010s professionals would recommend Intel because of reliability, but that's absolutely not an issue anymore. AMD has become the better of the two giants, with, of course, significantly lower prices.
I kind of wonder what the best way forward is for Intel. When 12th gen came out, it was a huge performance improvement but most of the discussion was around how hot they get and how the i9 was almost impossible to max out before throttling. Now they focus on improving efficiency by crazy margins, and all we want to talk about is how there was a slight performance hit.
You're forgetting about 2 generations of CPUs that were only mildly better but ran even hotter. Like, I think the way to think of this is that they're overcorrecting on an issue they created themselves.
@@bagofsunshine3916 Ryzen 9000 pretty notoriously barely improved either, though. This generation kind of just seems like neither manufacturer is going after the enthusiast segment.
I like this minimal-production, out-in-public style of video. It reminds me of the first Linus video I watched over a decade ago (an overview of the MSI P55-GD65).
Yeah, because their reviews popped up before LTT's video, so they added that. But Gamers Nexus's review popped up around the same time as their upload. Not that I expected them to mention them because of their beef, but there's that.
Even at super expensive energy prices, the price difference to even the inefficient Intel 14th gen CPUs will take years to recuperate and that is before you take into account motherboard cost. And the AMD CPUs draw way less power and don't need new motherboards.
@@MajinOthinus I'm aware; I'm using a 5800X3D myself and bought an RTX 4070 specifically for its low power draw undervolted. I just think it's generally good if technology uses less power.
Don't forget, everyone: you need to factor in the chips' prices and not just read the FPS comparison charts. The charts are nice and all, but if (for example) the chip fourth from the top costs half as much as the faster one third from the top, that makes a big difference.
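To make that concrete, here's a tiny sketch of a value comparison. The chip names, FPS figures, and prices below are made-up placeholders, not numbers from the video:

```python
# Hypothetical chips, purely to illustrate FPS-per-dollar as a value metric
chips = {
    "Chip third from top": {"avg_fps": 226, "price_usd": 589},
    "Chip fourth from top": {"avg_fps": 197, "price_usd": 299},
}

for name, d in chips.items():
    value = d["avg_fps"] / d["price_usd"]  # frames per second per dollar
    print(f"{name}: {value:.3f} FPS per dollar")
```

With these placeholder numbers, the slightly slower chip wins on value by a wide margin, which a raw FPS chart hides completely.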
I’m actually glad to see a clear return to Tick-Tock-type roadmaps. Not every release needs to be exciting; refinement releases are valuable, too. This product is a big pivot for Intel, so it was never going to be an absolute win. This is basically their Zen 1 moment.
Hot take: I don't necessarily think it's a bad thing if we're reaching a performance plateau for CPUs in general. It might be nice to shift the goals from just raw compute performance to efficiency and optimizations. "I couldn't make it faster, so I made it cheaper" should be a valid business strategy.
13:04 Have you noticed any disparity between the software reported CPU package power, and actually measuring power consumption at the hardware level? I know it's not possible to isolate just the CPU's power, but you can isolate power consumption to the motherboard by using hardware interposers like how GN is doing it now (which includes CPU, RAM etc) and measure that. It means you should have far more trustworthy numbers compared to how Intel and AMD self-report power consumption at the software level. They also use different methods internally for reporting power, which can lead to apparent power consumption differences between vendors.
Can I be an optimist and ask: maybe they're both (AMD and Intel) making major power-budget cuts in the current release so they can skyrocket clock speeds next gen? I'm probably using the wrong terminology, but I think most will get what I mean.
I think I agree. This is probably setting up Intel's 1.8 nm-class next-gen processors to make a big splash, and also their GPUs that are supposed to come out soon.
I’m pretty sure both have said that somewhere. This is the foundation for the next generation and with all the headroom they can really push performance.
This is probably not the direction the world wants to see Intel go in, but from Intel's point of view, they're probably catering to the smaller audience that values efficiency in both performance and power draw. Don't get me wrong, I'm all for getting a little greener, and did we all forget how stupid it was to run a 13900K or 14900K? Now we don't have to rack our brains over the most optimal or bare-minimum cooler to slap on the newer chips. But hey, no one is forcing you to buy the new Core 200 series. I wouldn't even hold it against ex-team-blue bros, because this generation is definitely an L.
I would love it if you guys could do an Overclocking for Dummies series. I’ve tried so many times to do the research and figure out how to do it, but I get scared when I open all these forums and have no idea what they’re talking about. Regardless, keep up the great work.
Honestly, I’m pretty happy with this generation. I probably won’t be buying it, but it’s the first big architectural change in quite a while, and that’s a good thing. And it’s not exactly a complete dud: seeing the 285K score very close to the 9950X in productivity is pretty good, all things considered. Gaming performance is pretty disappointing, but improvement might come with future optimizations, like with Ryzen 9000. The present is sort of gloomy all around in the CPU market, but I think there’s a bright future for Intel.
I wish you would add a compile time benchmark to your productivity suite. I understand Chromium or Linux takes an extremely long time to compile for each configuration, but I can compile ReactOS in about ten minutes on an Intel i5-9400 under Windows. It might be a big enough project to test without eating a ton of review time.
@@sharzo7728 They did compare the power consumption to the 7800X3D and the 9950X, at least for productivity, and really didn't hide the fact that these chips pull less across the board. I don't know what you're looking for.
@@1Percentt1 For gaming it would be worth buying a 7800X3D, but in productivity the U9 285K beats it across the board and is neck and neck with the 9950X. And the 9950X can draw up to 194-200 watts, while the U9 285K uses a fraction of that wattage. But I can't say anything about the new 9000X3D CPUs, which haven't released yet. Conclusion: the Intel U9 285K is a fast, lower-power CPU with AI capabilities. Hopefully Intel can build on this and create a CPU that's more performant for games rather than just production.
@jeppe1774 Did you watch the full video and read my comment? My comment was about gaming. But even then, the U9 was pulling 220 W against the 9950X's 200 W. It's not using a fraction of the power; it's using more. And the 9950X is also faster at gaming.
The naming scheme still works the same way. It just has the word "Ultra" probably to emphasize the new NPU. This is the 2nd generation of Ultra processors, hence the 2xx. The K and F suffixes still mean the same thing, the tiers are the same, and the xx part works the same way still. I find it hilarious how people see a different word and their brains turn off.
Tbh, for gaming these can handle pretty much any GPU out right now, and I don't want my PC to burn down for "better" performance I don't need (for CPU-intensive tasks, I really don't know).
When my 13700K started acting up with the now well-known Intel-gate of 2024, I switched to AMD, the 7800X3D to be precise. My gaming performance significantly increased; only my synthetic benchmarks decreased. Real-world performance, you could say, "doubled" (I had to downclock the 13700K so much for stability that it should no longer be considered a 13700K). Another added bonus? The POWER BILL. The wattage my PC draws is significantly less. The 13700K would happily draw 200+ W in any game or activity other than being a desktop YouTube sim (and even then it would easily go to 100-150 W). The 7800X3D? 78-82 watts. Yes, that's at full power, with better gaming performance across the board. Being a grown adult who has bills to pay, while energy prices are doubling yearly, yes, this is significant.
@@theroyalcam Not OP, but I can tell you that crashes for me are way more likely to be caused by my GPU driver or the game itself than by the system having a complete meltdown due to the CPU. However, trying to be as unbiased as possible, the YMMV part comes more from the motherboard manufacturer and how mature their BIOSes are. I'm using a Gigabyte Aorus Master, and I've had to roll back before due to BIOS bugs.
@@theroyalcam You probably won't experience any crashing or stuttering issues caused by the CPU after switching; if anything occurs, it's most likely the RAM, motherboard, or other components.
@theroyalcam Nope, none at all. Same GPU. Had to switch motherboard, of course, and RAM due to compatibility. Same SSDs, SATA and NVMe. No stability issues, no crashes, no blue screens. On-demand performance for so little power.
@@avenage When diagnosing the horrendous stability I checked EVERYTHING: RAM, GPU, drivers, SSDs, software. To no avail. Clean installs, multiple times. The only thing that helped was downclocking the 13700K to what you could call 13500 performance levels...
Intel Ultra 9 285K or the upcoming AMD flagship CPU for MUSIC PRODUCTION (DAW)? I'm talking big projects, professional level. DAW support says the program mainly needs single-thread performance, and on paper the Ultra 9 285K is in first place right now with a score of 5132.
@@KYLE-zo4bm Per year, it's hundreds of dollars at average use. A 4090 alone can easily suck away $600+ a year with just an hour or two of gaming a day... So no, not just some pathetic $5. It's thousands of dollars over a few years. In 5 years you could have bought a pretty nice older luxury car with what you paid for electricity...
I don't mind lower power draw or 'better power efficiency' at all... Being from Europe where electricity is a significant cost I'm basically waiting for a more efficient 7800X3D/RTX4080Super equivalent from the new generations so I can finally upgrade from my 3700X/5700XT. Performance will be a huge boost anyway, I just wanna be able to afford to turn on my computer 😅
Might be a strange take but I love the direction Intel is going now. Finally a chip I want to upgrade to from the 9900K, I would get much less consumption and to be honest never had much of a CPU bottleneck anyways in the last few years.
But if you want power efficiency, why not go for AMD? You're going to have to buy a new motherboard either way, so why not make the jump to AMD and ensure the next two generations can run on that same motherboard?
What about the cost of potentially replacing your CPU every 2 years? Does everyone have the memory of a goldfish??? Intel is not getting away with the shenanigans from the last 3 years. Intel really made it easy for me to stop being a fanboy with their one simple trick.
How much is that going to save you, though? If their power consumption only differs by around 30 W, the 13th and 14th gen are still better value for the price if you only use them for three or so years.
I am really happy they are focusing on power efficiency; it's exactly what I was waiting for. I don't care that it's not much faster than last gen. It's drawing way less power, which is very important for future innovation.
Loved the review and the breathing bubbles. I would suggest that, besides the bubble, the Intel processors should have blue bars instead of all of them being red (my eyes keep missing the rows when comparing 😅).
When someone you considered a friend puts out a hit piece on you right after your sister dies, it's completely understandable to not even acknowledge their existence anymore.
I feel this way too, and I was looking for someone else in the comments who said so. I don't think I can tell the difference between 226 fps and 197 fps, for example, so if that comes with a 40% reduction in power, why on earth is that not a good thing? I imagine I am not tech-savvy enough to know about the fringe cases where that kind of thing really matters.
Filming the intro to this in an airport is wilddd. Also, the naming is ridiculous; at least the previous iX-ABCDY scheme kinda made sense if you put the effort in.
I started saying a few years ago that we've probably reached peak technology (or close to it). There are a number of reasons, but *real* innovation is coming to an end; we've been seeing tweaking of OLD technology for decades. The last really new thing was 3D printing, which I first recall reading about around 1990 or so. What new thing is coming? Be happy for any improvements, and energy-efficiency improvements really do need to be the focus.
This generation is not super interesting for Intel, but with the lower power draw, like they used to deliver every odd-numbered CPU gen, I'm super interested in how they'll push performance next generation.
I am honestly fine with the focus on using less energy this generation. While I love some brute performance, I also like my air cooling. If it's priced like how it performs, it seems like a good chip for people like me.
I think people are too focused on FPS and performance, and it's not sustainable in the long run to keep wanting more FPS over everything else in a CPU. A more efficient CPU will also be easier to cool, I imagine, so we won't need such expensive coolers...
People need to realize that we are up against a wall with power consumption and ARM is making big strides. If X86 doesn't get its power consumption down we will start to head backwards.
The biggest reason I can think to buy this gen though is the PCIE lanes. The platform supports up to 44 PCIE lanes which is a lot more than you can get from prior generations or the competition. This thing actually has some expandability in it that you used to have to go to threadripper or xeon/epyc to get.
I have an i7 6800k, going to Arrow Lake would be like going from a Vespa scooter to an Audi RS5. I do more than gaming, actually a lot less gaming, and more AI related tasks. This is also the start of a new platform and has an upgrade path for years to come. Will I buy this right now, no, do not have the money right now, but maybe next year or so.
Sigh, it's so frustrating that all reviewers bash on the efficiency. As a gamer I don't care about the FPS increase because, in reality, you aren't getting any. If you're getting a high-end CPU like the 285K, you're going to have a 7900 XTX or RTX 4080/4090, and despite reviewers' constant focus on 1080p low performance, no one with those GPUs is going to run that. You're going to run 1440p high/ultra minimum, if not 4K, maybe with RT on, and at that point the FPS is effectively going to be the same. So efficiency is really the only metric that matters.
This is really cool for gaming laptops, though! When you buy a gaming laptop, you can never push the hardware to its limit due to thermal throttling, so hopefully future-gen laptops will be efficient enough not to thermal throttle... I feel it's kinda scummy how they advertise performance on gaming laptops with high-end GPUs and CPUs when the cooling is way too weak to ever sustain those peaks for any length of time.
I've been an AMD fanboy for years, but IMO this latest move is really good for long-term viability. Dropping hyperthreading is going to be a performance hit but will really improve security and reduce the need for patches against Spectre/Meltdown-type behavior. The AMD processors might even wind up slower than these Arrow Lake CPUs after future patches.
FWIW, power efficiency actually does matter to me, a lot. In Northern California they've jacked electricity prices way up: I pay $0.43/kWh during the day and $0.57/kWh from 5-8PM. So even a 40 W reduction on a machine I leave on 24/7 (I do, because I run containers and other home-lab workloads on it) would save me about $150 a year. So if it costs me $200 to upgrade after selling what I have, I save money. This is my plan for servers particularly; it costs me several hundred dollars per year to run my current E5-2680 v4 TrueNAS server.
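The arithmetic here checks out. A quick sanity check using the commenter's 40 W figure and the $0.43/kWh daytime rate (ignoring the higher 5-8PM peak rate for simplicity):

```python
# Annual cost of a constant 40 W running 24/7 at $0.43/kWh
watts_saved = 40
kwh_per_year = watts_saved * 24 * 365 / 1000  # 350.4 kWh
savings = kwh_per_year * 0.43                 # daytime rate only
print(f"${savings:.2f} saved per year")       # roughly $150, as claimed
```

Counting the hours billed at the $0.57 peak rate would push the figure slightly higher, so $150/year is a conservative estimate.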
[More context in my other comments] I agree with your final recommendation. Still, I hoped you would discuss the marketing claims more, because they cover all the awesome technical stuff. There is a lot of new stuff here: new node, architecture, core types, EMIB, and more. From a purely technical perspective, these chips are really cool, and addressing that in depth would be good context for your results. If you've already made a video on that, my bad; please reference it. If you plan on making one, please let us know so we can keep an eye out for it. If you don't, I hope you'll consider doing one.
Other people don't care about power efficiency? That's like one of the first things I look at because then I know how well designed it is and how close to the edge it is running.
Efficiency doesn’t mean any of that. If you design something to run at 90% and it runs at 90% then it’s well designed and where it’s supposed to be operating.
@@benstanfill363 well to use the tired car analogy, would you buy the new Honda Civic that somehow gets 8 miles per gallon and produces 160 horsepower? Or would you buy that new Toyota Corolla which gets 60 miles per gallon and produces 160 horsepower?
Hate to be a contrarian (actually, I don't), but I actually spent a not insignificant amount of time on my most recent CPU trying to pick the best balance between performance and efficiency.
Not really powerful CPUs but a large reduction in power consumption? These are CPUs for laptops then! Gamers generally don't really care about power consumption for desktops.
POV: Linus talks to you about how disappointing Intel’s new processors are while your flight is delayed
(me: sips on a coffee to-go, quietly nodding)
@@aJazzyFeel HAHHAHAHAHAHAH I'm doing exactly that lmao. Broo that ambience is 10/10.
My flight got cancelled this morning, he's going to be talking about the Pentium 4 by the time I'm free
Lmaoo hey MysteryMii
@@isaacdantzler YOU FOUND ME!
I do love this clear naming - "Core Ultra 200S" - WAYYY better than the annoying understandable clear naming they used to have
Yes. But also everyone who got used to the old one is pissed now XD
Edit: I was being sarcastic
why can't we just have gen 1,2,3,4, ... again?
@@Mafa0001 yeah, the best-looking name was the i7-8700K!
Won't improve performance, unfortunately
@@Mafa0001 he is being sarcastic, the older naming scheme was clear and easy to understand.
"Hey Linus, embargos coming up"
"Ah crap, I'll just film it in the airport"
I was just thinking “is he at a micro center? Why does it look like an airport” but no it actually is an airport lmao
does that take away from his professional delivery? I don't see a problem here.
@@SatongiFilms No, and nobody said it did? It's just a little funny.
Looks like a green screen to me.
Beats flying to Taiwan and waiting for rain.
Riley's commitment to the bit is top notch. Never let him go.
Yes!
I love riler murdered.
@@AffectionateLocomotive That's quite the double-typo.
This top chip feels too complex for Intel to pull off rn.
@@nottimothy5994 yes.
Yeah nah, I’m still calling it the i9-15900K
If you really want to piss Intel off- combine the old and new naming schemes together- i9-15285k. Much easier to pronounce & the "285k" end makes it seem like a low-end, wacky i3 variant!
@@benwhite6786 The most annoying part is they could have just called them Core 9 290K, Core 7 270K and Core 5 250K. But instead they decided to take 5 off for no apparent reason.
@@benwhite6786 fr. Ur cooking 🔥🔥
@@theemperorofmankind3739 This would be the way.
@@theemperorofmankind3739 Maybe they'll be releasing updated versions later on with those numbers planned? Similar to Super GPUs?
Actually very curious what Linus was using to record sound and video here in an airport; quality on both ends looks and sounds great.
Wild guess: some Sony full-frame camera with a wireless mic inside his shirt. Reasons: those colors look really Sony-like, and audio nowadays can get crazy good with lavaliers; some really nice ones can deliver 32-bit audio even wirelessly, which could explain the good noise reduction (maybe even a phone recording the general noise to feed the noise reduction). For the picture, I don't really see any strong light pointing at him; maybe a phone with the flashlight on.
The usual lavalier mic plus the camera they use at the studio, since he's traveling to cover the event.
Green screen
I love the sudden push for efficiency. The focus has been on power for ages, we can do with one intermittent gen where the efficiency boost carries to future gens.
No choice with what happened to the last two generations
If they didn't push efficiency, those CPUs would now draw more power than a mid-range GPU. I don't think they had the choice to push performance.
Intel can't push more power or they'll get the 13th and 14th gen issues
Intel ran out of headroom for clock speed, they forced themselves into needing to change and chose efficiency as their next focus.
I prefer to see these new efficient CPUs than a CPU you need to run on a 35-amp breaker 😂
Great job on the audio whoever mixed the episode. I can’t imagine how bad the raw feed was with all that background airport noise.
My instinct is that this CPU generation reflects that gamers simply don't buy in volumes that matter in a modern PC market, where decent CPUs tend to stay good enough for a minimum of 5 years. Both the volume and the margins probably lie in laptops and bulk commercial desktops, where performance has long been over the required threshold but efficiency would make a significant, marketable difference.
Especially with how high energy costs are in most of the target market, this might be a huge win for Intel.
I'd agree, if only Intel could actually do that, but it doesn't seem like they know what the word "efficient" means.
@@LoFiAxolotl Even if efficiency is what wins this generation, AMD takes the crown. I'm not sure there is a "huge win for intel" in the foreseeable future.
That's not what's happening, though. This is a stop-gap before their new ASML machines and chip fab are up and running, and it should be another 12-18 months before that even spins up. They have a whole new architecture planned and coming, but they can only optimize their current designs right now.
AMD is going to be the clear winner for at least another 1-2 years as TSMC already has their ASML machines in their fab, spinning up new AMD silicon.
It does feel like they saw power efficiency as one of the biggest factors for Apple Silicon's success and they're trying to bridge that gap. I'm not too down on this since focusing on power efficiency now means less heat which can lead to a higher ceiling on future chip designs.
TLDR 5800x3D is still the GOAT
Right. Imagine making a CPU that neither your own successors nor the competition can beat.
still the GOAT in gaming*
@@Sithhy Nah, it's still the GOAT across the board; productivity performance still isn't better enough to justify upgrading.
@@HerbaMachina It depends, if you don't care about gaming and only care about productivity, then it's better to look at both Intel and AMD higher offerings.
And unlikely to change any time soon, unless the new X3D chip is bonkers value
This generation feels like it’s for OEMs and not enthusiasts. I’m sure Dell, HP, etc. will love that both AMD and Intel CPUs are easy to cool while still getting reasonable performance.
With that said, I don’t think I’d ever choose any of the Intel 2xxx series over the AMD options.
Whoever thought of the newscaster bit to correct the information needs a raise. Like, a fat one. Genius. Do it again.
I read the title as 2005 at first and thought to myself: neat, some old-school stuff review. Boy oh boy, was I surprised by the actual video.
We could stop processor development for the next 5 years, and it still wouldn't be enough time for game developers to catch up on optimizing, relearn how to make games efficiently, and take advantage of where we are now.
Nah, they’ll just pump out more games every year with barely any optimization.
@@wyterabitt2149 Indeed, it seems physical shrinking is at its limit. Most IPC is now coming from better and smarter architecture choices.
It is more cost-effective to just pump out more games than to stop and fix the previous one.
@@fujinshu the easy, mundane task of optimizations. game devs are so lazy
It seems like most AAA devs these days just hope that upscalers can pick up their optimization slack. I think that side of the industry is long due for a big shakeup.
Honestly... I'm not CPU-bound in any game or really any application I use anymore (DaVinci, Photoshop, and LrC), so I don't really care about more performance at this point. But saving a good amount on energy costs over the year I am interested in, so these CPUs actually do look interesting to me, even if I lose a few FPS.
The problem is that AMD CPUs are still more efficient
AMD is still more efficient at better performance and stability.
If that's your priority, then AMD is the way to go, and it's going to be cheaper to upgrade in the long run thanks to the longer lifespan of the AM5 socket
You don't save on energy though. At NA energy prices you'd need 10 years of continuous heavy usage to recuperate the price difference to 14th gen CPUs alone (nevermind motherboard cost), even at European price levels, it would still take 5 years. And that is vs Intel 14th gen.
Realistically you won't save anything in energy bills and will moreso lose money because it's more expensive.
AMD just blows it out of the water regarding power draw and efficiency as well, so not even a relative win.
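The payback-period argument in these replies can be sanity-checked with some back-of-the-envelope arithmetic. A minimal sketch, where every number (price premium, watts saved, electricity rates) is an illustrative assumption and not a figure from the thread:

```python
# Rough payback-period estimate for buying a more power-efficient CPU.
# All inputs below are illustrative assumptions, not measured figures.

def payback_years(price_premium_usd, watts_saved, usd_per_kwh, hours_per_day=24):
    """Years of use before energy savings cover the extra purchase cost."""
    kwh_saved_per_year = (watts_saved / 1000) * hours_per_day * 365
    savings_per_year = kwh_saved_per_year * usd_per_kwh
    return price_premium_usd / savings_per_year

# Example: a $440 premium for CPU + board, 50 W lower draw, cheap NA power,
# running flat-out 24/7 -- roughly a decade to break even.
print(round(payback_years(440, 50, 0.10), 1))  # ~10.0 years

# The same hardware at a high European rate breaks even about 3x faster.
print(round(payback_years(440, 50, 0.30), 1))  # ~3.3 years
```

The takeaway matches the thread: unless electricity is expensive and the machine runs near-continuously, efficiency alone rarely pays back a platform-price premium within a typical upgrade cycle.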
0:38 I thought he said Error Lake
Would be accurate
@@WayStedYou "APO will be default" wasn't enabled on most motherboards…
core ultra 200s sounds like if a german car made a phone
more like a samsung phone name 😂lol
Sounds more like if Chrysler made a CPU
@@sib1212 apple fanboy spotted, opinion rejected
@@sib1212 you mean like the Apple Watch ULTRA? They're even stealing names lol
Definitely has a modern Mercedes naming scheme to it
Riley's newscaster segment was hilarious.
I would be kicked out of any studio where they're recording Riley. He is such a great standup comedian in the shape of a techtuber.
We spent years demanding better efficiency and making fun of these companies for making chips so inefficient and hot. They finally prioritize efficiency and we go back to making fun of them for not boosting performance...
Because you need to do both and AMD also managed it, even with a larger node.
@@MajinOthinus AMD only had a performance gain of about 5% only a little more than Intel.
tbh all those upper mid-range CPUs from the last 3 years are nearly within margin of error when gaming at 1440p and beyond. It's all about the GPU anyway. So yes, I'm happy there is now a higher focus on efficiency. These 5 or 10% perf bumps at an extra 100 watts of power usage were getting ridiculous.
@@morgan40654 just glossing over the fact that AMD have set the benchmark now for what a 60W chip can do whilst Intel is sucking back 120W minimum for LESS performance. That's why people are pointing it out. Apple have solved this. AMD have solved this. Both of them top the charts whilst pulling tiny power draw numbers. Intel are embarrassing themselves. Stop being a fanboy.
Honestly I think it's good they're doing this; this isn't the first time we've had next-gen CPUs that were just a refresh
In addition to being a launch pad for future generations with brand new architecture with a strong pivot away from the high power low efficiency of i9s and i7s past. This is good for the future, even if it doesn’t look that great today.
@@rmiah yeah, just got a Ryzen 7900X and yeah, it may get 10 fps less compared with Intel but it's also drawing less electricity... but I'm not changing that CPU for a while. No overheating, clocking 5.4 GHz at 1.25 vcore and maxing at 70 degrees Celsius. 😜
@@peterpanini96 AMD has been way ahead in the efficiency game.. Intel is finally cluing in that people don't want to pay an extra $100 a year on their power bill for a few more FPS than the competitor.
@rmiah but why choose Intel? The problem isn't what they are trying but that AMD is better at every step. More performance and cheaper CPUs, the motherboards are for sure way cheaper than the new Intel boards, and the Intel brand is absolutely not what it was after the last incident, which showed terrible support at times, especially since they knew about the problem for at least a year.
@@ehrlichgesagt863 why did people choose AMD when Ryzen 1 dropped and it was marginally worse if not equal to Intel’s offerings (ignoring pricing)? Competition is good in the space no matter what your preconceived notions about each brand is. And as later tests in the video showed, Intel’s new architecture still holds its own for productivity and AI tasks.
Nowadays, AMD is the best choice for anyone because the latest AMD CPUs use the AM5 socket, which will still get some years of support in the upcoming years. Back in the 2010s professionals would recommend Intel because of reliability, but this is absolutely not an issue anymore. AMD has become the better one out of the two giants, with of course significantly lower prices.
I kind of wonder what the best way forward is for Intel. When 12th gen came out, it was a huge performance improvement but most of the discussion was around how hot they get and how the i9 was almost impossible to max out before throttling. Now they focus on improving efficiency by crazy margins, and all we want to talk about is how there was a slight performance hit.
You're forgetting about 2 generations of CPUs that were only mildly better but ran even hotter.
Like, I think the way to think of this is that they're overcorrecting on an issue they created themselves.
it needs to be talked about, considering amd keeps on improving performance without the issues intel had with their power limits
they should've improved BOTH power draw and performance..... kinda like their competition.
@@bagofsunshine3916 Yeah, exactly like Zen 5%.
@@bagofsunshine3916 Ryzen 9000 pretty notoriously barely improved either, though. This generation kind of just seems like neither manufacturer is going after the enthusiast segment.
I like this minimal-production, out-in-public video; reminds me of the first video I watched of Linus over a decade ago (overview of the MSI P55-GD65)
Feels nicer and more grounded to me. Like a mate telling me the news.
Mysterious absence of Gamers Nexus in other recommended channels.
lol
yeah, because their reviews popped up before LTT's video so they added that, but Gamers Nexus's review popped up around the same time as their upload.
Not that I expected them to mention them cuz of their beef, but there's that.
Say what you want but I appreciate hardware taking power consumption into account. Power isn't cheap everywhere in the world.
Even at super expensive energy prices, the price difference to even the inefficient Intel 14th gen CPUs will take years to recuperate and that is before you take into account motherboard cost.
And the AMD CPUs draw way less power and don't need new motherboards.
@@MajinOthinus I'm aware I'm using a 5800x3D myself and bought a RTX 4070 specifically for its low power draw undervolted. I just think that it's generally good if technology uses less power.
You guys missed the chance for Linus to show off the commuter backpack since Linus is at the airport
Lol not Linus recording at what looks like Pearson Airport... Anyone else feel like the ending was more like "No help... I'm being kidnapped"
Don't forget everyone, you need to price the chips up and not just use FPS comparison charts. The charts are nice and all, but if (for example) it shows a chip that is fourth down from the top that is half the price of the one that is third down (i.e. faster), that makes a big difference.
Facts
Yeah in many charts the 285k was only slightly lower than the top chip but if it’s significantly lower cost, it’s still probably a better value.
I’m actually glad to see a clear return to Tick-Tock type roadmaps. Not every release needs to be exciting. Refinement releases are valuable, too. This product is a big pivot for Intel; it was never going to be an absolute win. This is basically their Zen 1 moment
Too bad this tick-tock always comes with a new socket.
Love the Airport vibe/video! Thanks for the information, too. It's great that I'm set on last generation, this is just... messy.
I love the vibe too.
Hot take: I don't necessarily think it's a bad thing if we're reaching a performance plateau for CPUs in general. It might be nice to shift the goals from just raw compute performance to efficiency and optimizations. "I couldn't make it faster, so I made it cheaper" should be a valid business strategy.
Nah, I don't care about efficiency, I want performance.
13:04 Have you noticed any disparity between the software reported CPU package power, and actually measuring power consumption at the hardware level? I know it's not possible to isolate just the CPU's power, but you can isolate power consumption to the motherboard by using hardware interposers like how GN is doing it now (which includes CPU, RAM etc) and measure that. It means you should have far more trustworthy numbers compared to how Intel and AMD self-report power consumption at the software level. They also use different methods internally for reporting power, which can lead to apparent power consumption differences between vendors.
I don't think GN has been mentioned even once since last year and I think they're simply avoiding it, so sadly I assume this won't be answered.
LTT, JTC and GN releasing videos at literally the same millisecond. Oh the embargos...
Yeah, I am subscribed to a lot of tech channels in multiple languages... My timeline just got flooded with CPUs :D
Can I be an optimist and ask the following question, maybe they are both (AMD and Intel) doing major power budget cuts in the current release to sky rocket clock speeds on the next gen? I am probably using the wrong terminology but I think most will get what I mean.
I think I agree. This is probably for Intel’s 1,8nm chip next gen processor to make a big splash and also for their GPUs that are supposed to come out soon
I’m pretty sure both have said that somewhere. This is the foundation for the next generation and with all the headroom they can really push performance.
This is probably the direction the world doesn't want to see Intel go in, but from Intel's point of view, probably catering to the smaller audience of efficiency in both performance and power draw. Don't get me wrong, I'm all for getting a little greener, but did we all forget how stupid it was to run the 13 and 14900K? Now we don't have to worry about racking our brains on the most optimal or the bare minimum cooler to slap on the newer chips.
But hey, no one is forcing you to buy the new core 200 series. I wouldn't even hold it against ex-team blue bros, because this is definitely a day to lose
Someone missed the memo that kirkland brand stuff is typically the best value:cost ratio.
I would love it if you guys could do an Overclocking for Dummies series. I've tried so many times to do research and figure out how to do it, but I get scared when I open all these forums and have no idea what they're talking about. Regardless, keep up the great work
6:38 *Vsauce theme plays*
Honestly I’m pretty happy with this generation. I probably won’t be buying it, but it’s the first big architectural change in quite a while, and that’s a good thing.
And it’s not exactly a complete dud, seeing the 285k score very close to the 9950x in productivity is pretty good all things considered. Gaming performance is pretty disappointing but time might come with future optimizations like with Ryzen 9000
The present is sort of gloomy all around in the CPU market but I think there’s a bright future for Intel
I wish you would add a compile time benchmark to your productivity suite. I understand Chromium or Linux takes an extremely long time to compile for each configuration, but I can compile ReactOS in about ten minutes on an Intel i5-9400 under Windows. It might be a big enough project to test without eating a ton of review time.
Not them skipping Gamers Nexus when mentioning to watch other reviewers for perspective 😂😂😂😂
So it's slower but more power efficient in gaming? Isn't the 7800x3d faster and also efficient?
@@KevinEF yep, my 7800x3d consumes 60-70 watts
Notice how they never compared power draw with AMD? Yeah, they tried their level best to save this disaster of a launch.
@@sharzo7728 They did compare the power consumption to the 7800X3D and the 9950X, at least for productivity, and really didn't hide the fact that these chips pull less across the board. I don't know what you are looking for.
@@1Percentt1 In gaming it would be worth buying a 7800X3D, but in productivity the U9 285K beats it across the board and is neck and neck against the 9950X. But the 9950X can draw up to 194-200 watts while the U9 285K uses a fraction of that wattage. I can't say anything about the new 9000X3D CPUs, which haven't released yet.
conclusion: intel u9 285k is a fast but lower power CPU with AI capabilities.
Hopefully Intel can use this and create an CPU that's more performant for games rather than production.
@jeppe1774 did you watch the full video and read my comment?
My comment was talking about gaming.
But even then, the u9 was pulling 220w against the 9950x's 200w. It's not using a fraction of the power, it's using more power.
And the 9950x is also faster at gaming.
The naming scheme still works the same way. It just has the word "Ultra" probably to emphasize the new NPU. This is the 2nd generation of Ultra processors, hence the 2xx. The K and F suffixes still mean the same thing, the tiers are the same, and the xx part works the same way still. I find it hilarious how people see a different word and their brains turn off.
I also presume the non-Ultra versions will be rebadged Meteor Lake or even Raptor Lake CPUs.
Yeah they said the Ultra was to push the NPU and the idea of the “AI PC”
tbh for gaming these can handle pretty much any GPU so far, and i don't think i want my pc to burn down for "better" performance which i don't need (for cpu-intensive tasks i really don't know)
How the fuck did he get audio this good at an airport
Didn't expect I'd see x86 CPUs taking design philosophies ARM mobile chips embraced years ago.
You mean multiple tiles? That has been done for years.
I was sure Linus was going to say "arrow lake" is more like "arrow puddle"
But "arrow through the heart" was a pretty good line too.
when my 13700K started acting up with the now well-known Intel-gate of 2024, i switched to AMD. the 7800X3D to be precise.
my gaming performance significantly increased. only my synthetic benchmarks decreased. but real world performance you could say "doubled" (as i had to downclock the 13700K so much for stability it should no longer be considered a 13700K)
another added bonus? the POWER BILL.
the wattage my PC draws is significantly less. the 13700K would happily draw 200+ watts in ANY game/activity other than being a desktop youtube sim (and even then it would easily go to 100-150 watts)
the 7800X3D? 78-82 watts. yes. that's full power, with better gaming performance across the board.
being a grown adult, who has bills to pay, while energy prices are doubling yearly, yes this is significant.
you experienced any crashing or stuttering after swapping to amd?
@@theroyalcam not OP but I can tell you that crashes for me are way more likely to be caused by my gpu driver or the game itself rather than the system having a complete meltdown due to the CPU. However trying to be as unbiased as possible, the YMMV part comes more from the motherboard manufacturer and how mature their BIOS's are, I'm using a gigabyte aorus master and I've had to rollback before due to bios bugs.
@@theroyalcam You probably won't experience any crashing or stuttering issues after switching because of the cpu, if it occurs it is most likely caused by the ram, motherboard or other components.
@theroyalcam nope, none at all. Same GPU. Had to switch motherboard of course. And RAM due to compatibility. Same SSDs, SATA and NVMe. No stability issues, no crashes, no blue screens. On-demand performance for so little power.
@@avenage when diagnosing the horrendous stability I checked EVERYTHING. Ram, GPU, drivers, SSDs, software. To no avail. Clean installs multiple times. The only thing that helped was to downclock the 13700k to what you could say was 13500 performance levels...
OMG I loved Riley doing the news segment thing. If you need to do any addendums like this in future please bring him back!
What were they thinking with this naming scheme?
Intel Ultra 9 285K or the upcoming AMD flagship CPU for MUSIC PRODUCTION (DAW)? I'm talking big projects, professional level. DAW support says the program mainly needs single-thread performance, and on paper the Intel Ultra 9 285K is in 1st place right now with a score of 5132
I'mma be honest right here, I live in Germany and lower energy consumption is a life saver.
So buy AMD
do you really save that much? i mean you aren't using your computer 24/7 what are you gonna save like $5
Brother, what are you paying per kWh?
Anything over 30 ct + €15 is a rip-off
@@KYLE-zo4bm Per year it's hundreds of dollars at average use. A 4090 alone can easily suck away $600+ a year with just an hour or two of gaming a day... So no, not just some pathetic $5. It's thousands of dollars in a few years. 5 years and you could have bought a pretty nice older luxury car with what you paid for that electricity...
@@KYLE-zo4bm Yes, per month
I don't mind lower power draw or 'better power efficiency' at all... Being from Europe where electricity is a significant cost I'm basically waiting for a more efficient 7800X3D/RTX4080Super equivalent from the new generations so I can finally upgrade from my 3700X/5700XT. Performance will be a huge boost anyway, I just wanna be able to afford to turn on my computer 😅
The decrease in power consumption is very important for servers
i wonder if the newscaster jokes in this video have anything to do with the Anchorman reference on the WAN show.
That kid's grandma not happy to see this from the other side
Might be a strange take but I love the direction Intel is going now. Finally a chip I want to upgrade to from the 9900K, I would get much less consumption and to be honest never had much of a CPU bottleneck anyways in the last few years.
But if you want power efficiency why not go for AMD?
You're going to have to buy a new Motherboard either way so why not make the jump to AMD and ensure the next 2 generations can be ran on that same Motherboard.
What about the cost of potentially replacing your CPU every 2 years? Does everyone have the memory of a goldfish???
Intel is not getting away with the shenanigans from the last 3 years. Intel really made it easy for me to stop being a fanboy with their one simple trick.
Sighh.... buddy these mofo software devs gonna procrastinate and say u need new processors.😢😢😂😂
Basically they're gonna unoptimize that to kingdom come.
How much is that going to save you though? If their power consumption only differs by like 30W, then for the price the 13th and 14th gen are still better value if you only use them for 3 or so years
I was also interested in high refresh rate 4k performance. You should think about implementing those to your CPU testing too.
I should be pleased that my 5800X3d is still relevant after all this time, but I quite enjoy an upgrade now and then but it's just not worth it.
I am really happy they are focusing on power efficiency; it's exactly what I was waiting for. I don't care that it's not that much faster than last gen, it's drawing way less power, which is very important for future innovation.
Arrow Lake is like an arrow in the knee
Seems like they're all in on trying to create their own M1 style chip, with similar drawbacks. Maybe HEDT will become relevant for gaming again?
Hopefully. I want more PCIe lanes, too. I ended up going with a Threadripper 7960X for that reason, but it would be nice to have cheaper options.
Man, the 5800X3D is such an incredible CPU for gaming.
Doubt intel will be able to beat that price/performance in the coming years, if ever.
Loved the review and the breathing bubbles. I would suggest that besides the bubble, the Intel processors should have blue bars instead of all of them being red. (my eyes keep missing the rows when comparing 😅)
I laughed when they didn't show Gamers Nexus on the recommended channels
Is the beef from a year and half ago still going on?
Yeah it's completely understandable.
When someone you considered a friend puts out a hit piece on you right after your sister dies, it's completely understandable to not even acknowledge their existence anymore.
I thought the title said 2005 and was getting pumped for a retro Core 2 Duo review.
I'm actually so glad they went this direction. CPUs are wayyy overpowered these days, and it's about time Intel & AMD actually focus on efficiency.
Efficiency is important, but we're going to be in a situation where cpus can't keep up with gpus soon for gaming, we're pretty much already there.
Software developers: oh nice, I will make more Electron based software and unoptimized games in UE5 with full RT enabled.
@@V1tol pov: the self called AAA or AAAA games
I feel this way too, and I was looking for someone else in the comments who said so. I don't think I can tell the difference between 226 fps and 197 fps, for example, so if that comes with a 40% reduction in power, why on earth is that not a good thing? I imagine I am not tech-savvy enough to know about the fringe cases where that kind of thing really matters.
I know, right? I like not having a hot room, so power efficiency is actually a huge plus
I wasn't expecting Canadian Ron Burgundy at the halfway point! That was great!
whatever mic they're using is working overtime to keep airport noise down.
I was thinking the same thing. That background noise is shockingly quiet.
Dropping Hyper-Threading was the biggest factor. Would be interesting if you could somehow disable it on 14th gen and compare them.
End of an era 🫡🫡🫡
Dang. Did he use the teleprompter at the airport, too?? 😮
In places where energy is expensive (not Canada), arrow lake has a selling point.
(not canada except alberta)
Turning on a heater is like 2-3kW, how expensive does energy have to be to fckin care about 50 watts?
It doesn't. If you want energy efficient chips Zen5 is better
@@TKIvanov when you are a business or institution running 20, 000 laptops
@@Testing-t4c who tf is discussing institutions?! We are talking the average consumer with 1-2 machines...
Filming the intro to this in an airport is wilddd. Also the naming is ridiculous; at least the previous iX-ABCDY kinda made sense if you put the effort in
Ikr? Like 14 YEARS. F*KING 14 YEARS. EVEN THE iX-1234ABCD IS BETTER THAN THIS :(
I'm so glad I switched from team blue (i3 4150) to team red (Ryzen 5 4500, weird CPU but the 3600 went outta stock back then)
TL;DR: Get a 7800X3D or 5800X3D.
I started saying a few years ago that we've probably reached peak technology (or close to it). There are a number of reasons, but *real* innovation is coming to an end. We've been seeing tweaking of OLD technology for decades. The last really new thing was 3d printing, which I first recall reading about around 1990 or so. What new thing is coming?
Again, there are many reasons for this. Be happy for any improvements, and energy efficiency improvements really does need to be the focus.
this generation is not super interesting for Intel performance-wise, but the lower power draw is like what they used to do every odd-numbered CPU gen
i'm super interested in how they'll push performance next generation
I am honestly fine with the focus on using less energy this generation.
While I love some brute performance. I also like my air cooling.
If it's priced like how it performs, it seems like a good chip for people like me.
I think people are too focused on FPS and performance, and its not sustainable in the long run to keep wanting more FPS rather than other things in a CPU.
A more efficient CPU will also result in better cooling I imagine, so we wont need such expensive coolers...
People need to realize that we are up against a wall with power consumption and ARM is making big strides. If X86 doesn't get its power consumption down we will start to head backwards.
What the fck is that naming scheme????
The biggest reason I can think of to buy this gen though is the PCIe lanes. The platform supports up to 44 PCIe lanes, which is a lot more than you can get from prior generations or the competition. This thing actually has some expandability that you used to have to go to Threadripper or Xeon/EPYC to get.
AM5 literally has 44 PCIe lanes, and of those, 24 (as opposed to 20) are PCIe 5.0.
The competition is objectively better.
Is the video and audio from an iPhone 16?
I have an i7 6800k, going to Arrow Lake would be like going from a Vespa scooter to an Audi RS5. I do more than gaming, actually a lot less gaming, and more AI related tasks. This is also the start of a new platform and has an upgrade path for years to come. Will I buy this right now, no, do not have the money right now, but maybe next year or so.
0:51 I'll take those chances
sigh, so frustrating that all reviewers bash the efficiency. As a gamer I don't care about the FPS increase because the reality is, you aren't getting any. If you're getting a high-end CPU like the 285K, you're going to have a 7900 XTX or RTX 4080/4090, and despite reviewers' constant focus on 1080p low performance, no one with those GPUs is going to do that; you're going to be running 1440p high/ultra minimum, if not 4K, maybe with RT on, and at that point the FPS is effectively going to be the same, so efficiency is really the only metric that matters.
AMD for the win.
This is really cool for gaming laptops though! Because when you buy gaming laptops you can never push your hardware to the limit due to thermal throttling. So hopefully future-gen laptops will be efficient enough not to thermal throttle... I feel it's kinda scummy how they advertise performance on gaming laptops with a high-end GPU and CPU when their cooling is way too weak for them to ever hit those peaks for any length of time.
I'm surprise, i watched the vid 9sec after posting
he is surprise guys
Hi surprise, I'm Dad
I've been an AMD fanboy for years, but IMO this latest move is really good for long-term viability. Dropping Hyper-Threading is going to be a performance hit but will really improve security and prevent the need for patches against Spectre/Meltdown-type behavior. The AMD processors might even wind up slower than these Arrow Lake CPUs after future patches.
23 seconds. Damn
FWIW, power efficiency actually does matter to me, a lot. In Northern California they've jacked our electricity prices wayyy up. I pay $0.43/kWh during the day and $0.57/kWh from 5-8PM. So even a 40W reduction on a machine I leave on 24/7 (I do, because I run containers and other home lab workloads on it) would save me $150.7/year. So if it costs me $200 to upgrade, selling what I have, I save money.
This is my plan for servers particularly. It costs me several hundred dollars per year to run my current E5-2680 v4 TrueNAS server.
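The arithmetic in that Northern California comment checks out. A quick sketch using the rates and wattage quoted in the comment (the 3-hour peak / 21-hour off-peak split is my assumption about how the time-of-use rates apply):

```python
# Annual savings from a 40 W reduction on an always-on home-lab machine,
# using the time-of-use rates quoted in the comment above.
WATTS_SAVED = 40
KW_SAVED = WATTS_SAVED / 1000

# $/kWh: peak rate applies 5-8 PM (3 h/day), off-peak rate the other 21 h.
PEAK_RATE, OFFPEAK_RATE = 0.57, 0.43

daily_kwh_peak = KW_SAVED * 3       # kWh saved during peak hours each day
daily_kwh_offpeak = KW_SAVED * 21   # kWh saved off-peak each day

annual_savings = 365 * (daily_kwh_peak * PEAK_RATE
                        + daily_kwh_offpeak * OFFPEAK_RATE)
print(round(annual_savings, 2))  # ~156.8, close to the ~$150/yr quoted

# At the flat daytime rate alone it is 40 W * 24 h * 365 * $0.43/kWh:
print(round(KW_SAVED * 24 * 365 * 0.43, 2))  # 150.67, the figure in the comment
```

So at those rates a 40 W always-on reduction really does recoup a ~$200 upgrade in well under two years, which is exactly why this only holds for expensive power plus 24/7 workloads, not typical desktop gaming use.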
Too early here, feels illegal
we do not care
[More context in my other comments] I agree with your final recommendation. Still, I hoped you would discuss the marketing claims more, because they cover all the awesome technical stuff. There is a lot of new stuff here: new node, architecture, core types, EMIB, and more. From a purely technical perspective, these chips are really cool, and addressing that in depth is a good context for your results. If you've already made a video on that, my bad, please reference it. If you plan on making one, please let us know so we can keep an eye out for it. If you don't, I hope you will consider doing one.
The chips being "cool from a technical perspective", means very little when they're objectively worse at what they're supposed to be doing.
Other people don't care about power efficiency? That's like one of the first things I look at because then I know how well designed it is and how close to the edge it is running.
Efficiency doesn’t mean any of that. If you design something to run at 90% and it runs at 90% then it’s well designed and where it’s supposed to be operating.
@@benstanfill363 well to use the tired car analogy, would you buy the new Honda Civic that somehow gets 8 miles per gallon and produces 160 horsepower? Or would you buy that new Toyota Corolla which gets 60 miles per gallon and produces 160 horsepower?
Hate to be a contrarian (actually, I don't), but I actually spent a not insignificant amount of time on my most recent CPU trying to pick the best balance between performance and efficiency.
Not really powerful CPUs but a large reduction in power consumption? These are CPUs for laptops then! Gamers generally don't really care about power consumption for desktops.
Tell that to the people in Europe then. They seem to care more about power consumption.
@@Gfirex Europe is 1 country
@@Gfirex i dont, give me all the performance. i can turn off all house heaters and heat it with my cpu instead
Actually we do
Oh, they should care. Because their direct competitor manages to be faster with much less power consumption.
And to think 3 years ago, we had Intel 12th gen and Ryzen 5000, some of the best, most competitive CPUs at the time
Oil up Linus