Thanks for the "Thanks, Steve," Steve. Back to you, Steve. Signed, Steve. Edit: Wow, even the video lengths are the same. Steves really do have cross-continental telepathy.
The only way is to radically reduce prices. Or Intel will be sold for Qualcomm and the entire management board and accountants will be fired. I don't see any other options. The situation is catastrophic.
@@ithelp-cm5bi They use newer and more expensive manufacturing process from TSMC than AMD does. Intel cannot win a price war without selling at a loss.
@@user-pi4qo3zc2e What does that matter to me as a potential customer? Intel has to deliver a good product so I'll be willing to buy it. If they're paying TSMC a lot, then they should cut their margin. I'm not a charity that's going to whine about corporate costs. The current price-to-performance ratio is absolutely unacceptable. Period.
Well, they did get much better efficiency... And giving that they're trying to sell their foundry space as world class; that's probably not a good thing for them.
Takeaways: 1) Steve was standing at the beginning of the video, so I knew this was going to be bad. 2) The 5800X3D is still showing up to be the BEST PC purchase I've ever made. Period. Side note: The fact that the 7800X3D is the best gaming CPU that uses only 58 watts compared to Intels 14900K's 214 is just obscene. That's damn near a quarter of the power while getting better performance......OOF!!
@@DVDHDBLURAY I meant the one in development, PS6 and Xbox whatever. I don't care about consoles themselves, but they define the kinds of default optimizations that PC games also get stuck with. If we were to have 5 years of games being specifically optimized for this crap, it would've been pretty sad. Like, imagine it works on this specific number of P cores and a ridiculous number of E cores, but on hyperthreading it plummets, etc.
Yeah. To be fair: in gaming the 7800X3D also annihilates every other AMD CPU. That’s quite disturbing. Also important to point out that the 7800X3D is weak outside of gaming. So it depends what you use your PC for.
@@PaintrainX Well amd is about to Change Weak outside of gaming tag on the 9800x3d chip. As its reported to perform better than 9700x in Productivity workloads and in Geekbench Multicore as well.
Remember that 10nm Intel had was highly mature, which partly helped to push the clocks on it (until it was too much). This is a new node, this is a branch new micro-architecture, new packaging, tiles, etc. Basically it's more like Intel is paying technical debt here. If your main interest is gaming in general (not a specific set of games) there is nothing for you here much. Keep looking at any AMD X3D variant and you are good for multiple years. Simply put you choose your CPU (and everything else) based on workloads. Not to mention money spent on GPU side will provide you more too.
zen 5 absolutely crushes zen 4 in non gaming tasks though, in linux it's 20% faster on average. While windows latest update made things better, it's still quite a ways behind linux for performance and there are still issues.
@@elu5ive How sweet! That is the meaning of bromance! 👉When one rises, the red team helps the blue team rise. On the other hand when some of them fails dramatically the other fails too so they don't stay behind! How sweeeeet!!! 😍😍😍 Ps. No, it is NOT sweet, that is just the phenomenon of oligopoly and the lack of competition!! 😡
@@PineyJustice Zen5 beats Zen4 in some specific server tasks, it doesn't beat it by more than 3-8% in rendering, compiling, zipping, etc. Only datacenter tasks. By up to 50% btw, not 20%.
@@rattlehead999 But it DOES beat it, not by "enough". And what counts as "enough" is "Whatever makes AMD not enough". Meanwhile, Intel fails (worse) but not a peep....
yet again I wonder where tf is the 8 core X3D in laptops. The only X3D chip that made it into a couple models is the dual CCD 16 core with its idle consumption issue causing horrific battery life.
The 16 core 7950x3d is even more amazing, caps out around 150w max load for productivity at stock. Can be got down to around 130w with a modest undervolt while staying within 10-15% of intels top chips that use 250-300w+ for the same workloads while also destroying them in games. The 9000 series x3d's will only extend that lead. Intel is in huge trouble.
I have a 7800x3d, and yes the power consumption is very low.. But at least mine, with this low comsumption, obtains high temperature peaks. For example lets say 49w max usage, and 76° max temp reached. I mean, what temp will reach if it consumes lets says 70w with more % usage?
@@greebj Exactly! I hope the Zen5X3D 8core version makes it into the laptop segment. With gaming laptops running over 300W and very hot temperatures. We need low power X3D CPUs more than ever.
Being able to actually use a powerful desktop without heating up the room it is in was a very foreign concept coming from monolithic intel chips for decades
You don't buy a 600 dollar CPU for its gaming perf. 285K beats 9950X in many applications and destroys a 9600X. If you solely play games, just get a 7800X3D or soon, 9800X3D which will be much better allround.
@@LegendaryPhenom them phasing out the 7800X3D supposedly means that this will just take it's place. Could be a good spot to get one 7800X3D for quite cheap next black friday though.
With the new architecture seeing improvements especially on AVX512 and pipelines to accommodate it, AMD really does focus on chasing the datacenter market. It's better to wait for 2026 or 2027 where the next zen architecture would be used for the next console generation, seeing actual improvements for gaming.
Windows 11 is the majority of the problem here. AMD and Intel are both hamstrung by Windows 11. We have to look at Phoronix's Linux testing to see how they should perform...
LOL!! That is a good one! I can't believe these launches either! What the hell are they doing? And here I thought the Arrow Lakes were going to be a real threat to AMD..... wow
@@charizard6969 You think? With Arm pushing into the market and Intel still being close in performance AMD cant really slack off atm. Like their x3d chips have a decent lead but thats just extra cache and nothing to do with the actual cpu cores. Anyone could implement that within a generation.
@@TheDude50447 Anyone could implement that within a generation? yeah no, apple in their first M series processors had issues with, like everything, they didn't put 8gb of ram in almost all laptops just because its enough, their architecture couldn't handle more, there were also issues with PCIE connectivity and so on, it's not an easy feat to engineer these systems, that's why ARM won't really get popular outside of systems on a chip, not soon at least
Then again, since it is entirely likely that you will get only this chipset for one CPU series, ANY gamer should skip it anyway, making it moot, and only data customers that require a new architecture should apply, and that would be mostly servers, no gamers. This is released to market "for gamers" ONLY to sell new CPUs (and hence boards) for Dell et al, who could just sell more 12thGen instead but want "new hotness" to make people pay them, not actual work selling a 12th or 13th gen "old Intel" to people.
@@NadeemAhmed-nv2br Linux benchmarks for 285 are already in, and it still flops there as well, just to a slightly lesser degree. Still nowhere near Zen 5 gains. So no evidence of Zen 5 type of improvement on Windows in the future.
so far I've heard that multi-core on the 9800x3d is on a par with the other 9000-series multi-core performance, which sounds like it could be a great multipurpose cpu.
@@mikem9536 At least the Core 2 series had them able to _talk_ to each other. The Pentium D had to have both chips talk to each other, through the front side bus. So the northbridge/memory controller had to perform double/triple duty to make multitasking viable.
it's a crime against intel fanboys , loosing in heavy rendering application , losing to even ur old gen of cpus in gaming and still loosing in power consumption by more than 50% vs the 3d chips , oh the misery .....
Can't wait for Intel fanboys to tell me that aktschually, if you run these with DDR5 12000MT it magically gives you another 20% gaming perf, which you clearly can run stable with no issue, trust me bro my sample is totally average and does indeed exist.
This is only because the new gens are incredibly bad in performance uplift, but good for the consumer, might as well keep looking for bargains on older gen stuff.
Windows is the biggest culprit here again if we go with the same what was offered the reason for initial results of Zen 5. Then everyone said MS favors Intel. Well how is it now? The truth is that things will improve a lot after a while, exactly like with Zen 5. Will it make a difference? Well did it do it with Zen 5? No it di dnot, it just made the Zen 5 suck a bit less. The same will happen with Core Ultra. Windows will get an optimization update or a few in practice and after that Core Ultra will be meh instead of sucking.
@@Rasenmaehermann89 well they failed, it’s slower not on par overall and the efficiency gains are only meaningful in full all core load other than that it’s pretty negligible
@@Rasenmaehermann89 No one makes a "goal" to make a slower CPU than your own, and much slower than competition. It's the spin your marketing guys decide to settle on when they realize they can't possibly massage this turd into any sort of desirable form. "We totally intended a downgrade! Also instead of looking at this real CPU, please imagine some non existent CPUs in a year or 2 or 5 that are actually great!"
@@NJ-wb1cz I think they couldn't keep getting bigger with the monolithic dies, switching to the stacked design that AMD moved to years ago. I'm guessing they made it work because they know this is the way forward, and couldn't go another year without releasing stuff even though it feels like 14th gen just came out. I think they'd have been better off selling these as 14th gen but only in certain skus that slot them where they'd go performance wise but calling them U chips since you can get similar performance, but at lower power. In short, they have to know it's not great, but it was 'close enough' for a first effort in a completely new fab architecture. They HAD to do somethign to counter 9000 series Ryzens, too.
"The memory prices of these high-end memory parts are pretty much the same" I wouldn't say that DDR5-6000 on Ryzen and the DDR5-8200 on your Arrow Lake test costs about the same, though...
This would be number two bulldozer moment for Intel. Pentium 4 was Intel's bulldozer moment. Clock for clock the original Pentium 4 was slower than the Pentium 3. One could say bulldozer was AMDs Pentium 4 moment since the Phenom II X6 beat it in many tasks.
No AMD just had an expensive Zen 4 launch and Zen 5 is meh. If they can do something with X3D then it will be so so for AMD this gen as long as they come back with a better uplift in Zen 6. If they make Zen 5 really cheap in 6 months it will only be bad for those who wanted more performance. Those who are upgrading 5+ year old systems will still get a great new CPU.
Imagine building an entire empire on hyperthreading, and then pulling the rug from underneath everything. At least for Alder and Raptor Lake, all developers had to do was prioritize the P-Cores, but even that took YEARS.
The 5800x3D is only 2.5 years old. Not sure why people are surprised. The 2600k from 2010 is STILL viable as a CPU today if you are happy with 60 fps. Now that's impressive. (5800x3D owner btw)
what a mess, from Intel as a premium company to a something that seems completely improvised, everything seems a patch of a patch, first the "efficiency cores" and removing AVX512, then removing SMT ... while still using lots of power and not able to compite on gaming performance ...
@@lharsay But by "Broken" I suspect it was the Ring Bus couldn't get the faster speed needed to keep those cores fed, and trying to force it was why 14th Gen and the voltage fried (permanently) the raptor cove chips, and to avoid that, they had to dial back the bus speed, to keep the voltage in check. And therefore it was either starve those threads OR not have SMT.
@@lharsay From what MLID says Intel sources told him, the plan was to phase out HT in preparation for rentable units that would've come with the now cancelled Beast Lake next gen. Getting HT working properly apparently seemed like wasted effort in that light. (They might've originally even planned rentable units for Arrow Lake, not sure I remember that correctly.) With the BL cancellation, not sure if we'll ever see rentable units at all, or if that didn't actually pan out like expected. What a mess.
@@TotalXPvideos Zen5's 3D CPUs are different beasts. This is because now both of the CCDs gonna have vertically stacked caches. In comparison, 7000 series 3D CPUs with dual CCD configuration only had 1 stack of such cache. :)
7200Mhz and 8200Mhz memory. Damn, I'd rather buy 6000Mhz cl30 for 85$ and pair it with Ryzen 7600 or 7700 and get similar gaming as this core ultra wowza chip that costs how much, 600 or more dollars?
I run the same G-Skill Trident Z5 Neo kit in DDR 5 6000Mhz Cl30 that HUB uses, it was a 32GB kit bought in 2023 prices, still about the same at $112.99 on Newegg.
Issue is Intel kneecapped the ring bus clock because it was probably the reason why 13th and 14th gen chips were frying themselves to death. This just means that memory speed just doesn't do much for gaming and daily tasks anymore. I'm just happy that Ryzen is optimized for cheap stuff like 6000-6400 low latency ram.
The 6000MHz cl36 DDR5 will probably be the memory sweetspot until DDR6 ram arrives, it's basically the 3600MHz DDR4 all over again in the sense that you know it will be good til it's phased out
It's like Intel only cares about topping cinebench performance charts. Kind of an odd hill to die on, considering it doesn't translate to performance anywhere else. What a stupid thing to do.
Esp. hilarious if you remember how Intel did their whole ridiculous "real world benchmarks!" marketing schtick when zen2's vastly higher core count crushed them in MT benchmarks with cinebench in particular.
Yeah, I just wonder, why that amazing single core performance doesn't reflect in games? I suspect they are simply optimising for the benchmarks, and not real world performance...
games do not fit in the cache of the cpu core they are most likely memory starved but... beeing memory starved with 7200MT and 8200MT .. both basically overclocking the MC is a tough one to swallow...
Great video. What stands out is that in the CPU intensive gaming benchmarks where Arrow Lake performs way under expectations, both power draw and cpu temperature follow the downward trend, yet cpu usage does not. It all points towards Windows/Intel/Game Engines not scheduling cores in the most effective way, similarly to what we saw on Ryzen/Windows not too long ago. I expect patches to be incoming, or that's what I hope at least, for the sake of healthy competition.
For gaming it is superb but in every other way it is a very outdated CPU. The core count will be crucial in the future. Even for gaming after a few years. It sounds quite unbelievable but 8 cores is just not enough any more or very soon at least.
@@jarnovilen5259 pretty massive speculation based on nothing, but go on, keep sayin currently best CPU (for gaming) is gonna be outdated in a few years LOL
Imagine if Microsoft actually spent some resources making windows optimzed and run well instead of countless AI/Copilot/Recall features that no one asked for
@@Shane-PhillipsAnd it took them how long? And now compare that to the resources they ve dumped into the Snapdragon Arm windows project which is pretty much a flop
Intel when 5800X3D releases: "Those chips are a trouble. We should make note of it moving forward." [a few years and release of 7800X3D later] "We've closed the gap between us and X3D. Wait, that's the wrong X3D!"
@@lucazani2730 They were. It was called Adamantine and was supposed to work for both CPUs and GPUs. But insiders have leaked that it didn't pan out, or didn't pan out quickly enough, so it got axed. For reference AMD had the connecting pads for 3D V-Cache already on Zen 2, but the bonding yields were so atrocious that it never became a product. TSMC developed the basic bonding method, but AMD did a lot of the heavy lifting as a willing guinea pig. Intel probably realized that it was nowhere near as easy to do as the theory implied, and now AMD is reaping the rewards of perseverance.
Question - how do you get around the activation limits when benchmarking? IE, cant change the hardware more than 3 times before you have to wait 24 hours to resume?
@@lharsay They are indeed, and those Epycs are really good, which is why it is a shame TSMC wafers are being used for these Arrow to the knee Lake chips and not more Zen 5 ones.
If I understand correctly...the Intel benchmarks were done on Windows 23H2...and the AMD ones on 24H2...because the Intel processors on Windows 24H2 were even slower!!
@@defnotatrollActually, ARL CPU tile used the Intel 20A node until Intel cancelled it & went with TSMC 3nm. Maybe Intel thought same design in a different process would be fine?
@@defnotatroll Intel has produced a shit ton on TSMC for the last two decades. Their chipsets and network (both wired and wireless) have been on TSMC for ages. So it's not like Intel doesn't know what to do with the TSMC PDK.
It may be literally true simply due to the production process. What they should've done, is price the models according to their performance, not some made up positioning
@@dam8976 That makes me wonder if Zen5 will see more optimization and improvers over time with future windows updates. Seems like Windows is just bloated and unoptimized for Zen5.
What is this cope. It's a terrible generational improvement for an almost 2 years wait. There WERE even regressions, and they were not infrequent. The "power reduction" wasn't even really there compared to non-x Zen 4 CPUs, and efficiency only improved moderately in productivity (and barely improved in gaming if at all) compared to non-x Zen 4. The price doesn't make sense compared to Zen 4, X or not. With the right price reductions, sure, it's not inherently bad. They'd be good CPUs given the right prices. But at these prices, they are worse than AMD's own offerings.
@@w7bUxhwRYUo8Lv AMD AMD AMD... it's not about AMD, it's how bad Intel is. Different perspective. If two companies try to reduce the TDP but one regresses while the other manages to still improve, then obviously we don't pick on the more successful one.
@@aos32 There isn't. That's the saddest part, it's not a bug. They revamped the entire CPU around chiplets, and this is simply how they perform. We should've expected wild swings, problem is there are no positive positive swings. It's just worse
I love my 7950X3D. What an amazing blend of good productivity performance (Handbrake, etc) and gaming. The 9950X3D would only be superior if it has cache on all the CCDs, eliminating the one concern with messing up the core parking on 7950X3D (no issues so far on that BTW).
The likes of handbrake are likely to benefit massively from AVX in the near future, and on Linux encoding can be up to 20% faster even without it already. I'm pretty sure the people stuck on Windows will realize in a year or so that Zen 5 was actually an excellent cpu once the optimizations in apps start rolling in
@@arthurion You don't even have to do that on windows 11. You can just control alt delete and tell windows that the program is a game if it fails to auto detect and it will handle the core allocation correctly.
Not that I really care but Intel seems to be getting a pass from being roasted compared to AMD. Not really a roasting kinda of guy but they both seem deserving and shady in their pre-released info.
Hardware unboxed made like 9 videos blasting zen 5 and now arrow lake is even more buggy, lower performance and shady about their efficency gains and Hub acts like it's just a meh, but still might be OK after bug fixes Kinda transparently in Intel's favor, their coverage
So it seems to perform really well with Starfield's ancient engine that was mostly designed in the low core count eras but it completely falls apart in multi core heavy games like Cyberpunk or Homeworld.
The last time I was this early Meteor & Arrow Lake both sounded good on MLiD's channel. Sapphire Rapids fix seemed to consist of reducing tile count to 2 or 3, basically glueing together monolithic sized dies. Presumably the problem with Sapphire Rapids, Meteor and Arrow Lake is down to the special brew low latency glue Intel cooked up.
Nice work, seems these Cpus can deliver more, but it's a whole mess since we're in Microsoft and Intel flappy hands, hope they'll fix everything before the end of November. Seems that you'll have another month of massive reviews considering Ryzen near new debut.
So disappointed with intel had a 14900k had to rma it and now selling because I switched to amd and love it so much more and seeing ultra cpu fail as well just doesn’t even surprise me after the dealing with intel’s instability for almost a year…. And he is standing which is another sign 😬
Welcome to Team Red. The rules are simple: Never ever buy anything AMD releases until 3 months has passed. After 3 months the product is stable and priced sanely. AMD wants to be "Apple hip" so badly, but their core audience are the pragmatics, the penny pinchers, and the politically inclined. For this reason AMD does one stupid marketing thing after another, and then has to revert course with a price correction. It's important that we keep teaching them this lesson over and over again.... It's in our own interest.
If you cant buy 13th/14th gen because they blow up, and you can't buy amd 7000 because its out of stock, then this makes the recent prices drops to ryzen 9000 look AMAZING
Yes, with the price drop & updated BIOS, AMD 9000 CPUs are a steal compared to ARL. If you’re staying with Intel, find a 12900k & revel with the last decent CPU that Intel ever had!
@@tringuyen7519 Intel 13&14th i5 and i7 are just fine with UEFI/microcode updates. No problems and a 5 years warranty. And decent performance and very good prices. Only the efficiency is poor. But a 14600kf for 200 USD is the best CPU for the price and with a HUGE margin.
That's what happens when you abandon Ring Bus and monolithic chips and they went the garbage chiplet route. They destroyed the CPU market evolution with Meteor Lake which is a crappy architecture.
Except for Starfield's stupid ahh....they have to be crippling AMD on purpose in that game, LOL. I have a 7800X3D and a 4090 so I have skin in the game as well.
So. 50W reduction in power in applications. 70W reduction in power in gaming. 5-10% gaming regression. Still nowhere near close to AMD's efficiency. This has been a fun launch from the perspective of an AMD fan. Intel has shown they are the same company that intentionally stagnated on quad cores 10 years longer than they needed to. They just aren't that good at innovating.
I'm still blown away by how well the Ryzen 7 5800X3D is near the middle of the pack in gaming performance with especially high 1% lows. In every game, even when it was dead-last, it still had a 1% low higher than 70FPS. At this rate, it will still be relevant for gaming in 2030!
Why would any reviewer comment benchmarks like "the CPU wins this benchmark, but it is bad overall"? I would accept if such conclusions are made in form "it wins N benchmarks, it loses M benchmarks".
Thanks for the "Thanks, Steve," Steve. Back to you, Steve. Signed, Steve.
Edit: Wow, even the video lengths are the same. Steves really do have cross-continental telepathy.
Steveception
Someone get Steve a chair, he's still standing
Steeeeeeve
Someone time stamp it
Into the Steveverse
Arrow to the Knee Lake, and he's standing for this one
The only way is to radically reduce prices. Or Intel will be sold for Qualcomm and the entire management board and accountants will be fired. I don't see any other options. The situation is catastrophic.
@@ithelp-cm5bi They use newer and more expensive manufacturing process from TSMC than AMD does. Intel cannot win a price war without selling at a loss.
Whateverlake
@@user-pi4qo3zc2e What does that matter to me as a potential customer? Intel has to deliver a good product so I'll be willing to buy it. If they're paying TSMC a lot, then they should cut their margin. I'm not a charity that's going to whine about corporate costs. The current price-to-performance ratio is absolutely unacceptable. Period.
oh he IS STANDING
Intel - Watch what happens when WE use TSMC!
oh, nm.
😂
💀
Well, they did get much better efficiency... And giving that they're trying to sell their foundry space as world class; that's probably not a good thing for them.
@@rcavicchijr They got slightly better efficiency. Still miles off of AMD's efficiency.
NM nevermind? or nanometer? lol
Takeaways:
1) Steve was standing at the beginning of the video, so I knew this was going to be bad.
2) The 5800X3D is still showing up to be the BEST PC purchase I've ever made. Period.
Side note: The fact that the 7800X3D is the best gaming CPU that uses only 58 watts compared to Intels 14900K's 214 is just obscene. That's damn near a quarter of the power while getting better performance......OOF!!
It's a good thing Intel didn't outbid AMD for consoles this cycle. We could've been stuck with a dud generation
Not to mention that it'll probably degrade and have all kinds of issues
@@NJ-wb1cz but is a dud non the less, because almost no games worth mentioning came.
@@DVDHDBLURAY I meant the one in development, PS6 and Xbox whatever. I don't care about consoles themselves, but they define the kinds of default optimizations that PC games also get stuck with.
If we were to have 5 years of games being specifically optimized for this crap, it would've been pretty sad. Like, imagine it works on this specific number of P cores and a ridiculous number of E cores, but on hyperthreading it plummets, etc.
@@NJ-wb1cz that would be terrible.
There 7800X3D using the same amount of power as an old timey lightbulb to deliver that much performance is insane!
Yeah. To be fair: in gaming the 7800X3D also annihilates every other AMD CPU. That’s quite disturbing. Also important to point out that the 7800X3D is weak outside of gaming. So it depends what you use your PC for.
@@PaintrainX Well amd is about to Change Weak outside of gaming tag on the 9800x3d chip. As its reported to perform better than 9700x in Productivity workloads and in Geekbench Multicore as well.
@@ebenezer8058 Intel is taking an arrow to the knee and AMD is holding the bow
from 10 nm to 3nm = -5% performance 😂😂😂😂
wait until they release the 1nm cpu. we'll be seeing single digit fps
@@Amfibios double digit negative gains
Intel can't even take advantage of a newer node...are they doomed
To be fair this is TSMC's node, so it's not entirely their fault. Their weird decision to drop hyperthreading also plays a part of this.
Remember that 10nm Intel had was highly mature, which partly helped to push the clocks on it (until it was too much). This is a new node, this is a branch new micro-architecture, new packaging, tiles, etc. Basically it's more like Intel is paying technical debt here. If your main interest is gaming in general (not a specific set of games) there is nothing for you here much. Keep looking at any AMD X3D variant and you are good for multiple years. Simply put you choose your CPU (and everything else) based on workloads. Not to mention money spent on GPU side will provide you more too.
AMD: trying to reinvent the wheel and ending up with Zen 5%
Intel: "hold my beer and watch my negative gains"
zen 5 absolutely crushes zen 4 in non gaming tasks though, in linux it's 20% faster on average. While windows latest update made things better, it's still quite a ways behind linux for performance and there are still issues.
@@elu5ive How sweet! That is the meaning of bromance! 👉When one rises, the red team helps the blue team rise. On the other hand when some of them fails dramatically the other fails too so they don't stay behind! How sweeeeet!!! 😍😍😍
Ps. No, it is NOT sweet, that is just the phenomenon of oligopoly and the lack of competition!! 😡
@@PineyJustice Zen5 beats Zen4 in some specific server tasks, it doesn't beat it by more than 3-8% in rendering, compiling, zipping, etc. Only datacenter tasks. By up to 50% btw, not 20%.
@@rattlehead999 But it DOES beat it, not by "enough". And what counts as "enough" is "Whatever makes AMD not enough".
Meanwhile, Intel fails (worse) but not a peep....
@@PineyJusticeidc about that I need gaming perf
Damn, it`s slower then Ryzen 5800x3d. The legend still lives!
*insert gigachad meme*
@@tyr8338 yeah, the world revolves around gaming performance. Surely there are no more important tasks a computer does
And it cost triple!
@@Aeis77tasks yes, important no. 😂
God forbid someone uses a computer for something productive! @@Aeis77
The gaming power consumption of X3D CPUs still amazes me. The 7800X3D topping most of the charts while using just 55-80 watts is stunning.
yet again I wonder where tf is the 8 core X3D in laptops. The only X3D chip that made it into a couple models is the dual CCD 16 core with its idle consumption issue causing horrific battery life.
The 16 core 7950x3d is even more amazing, caps out around 150w max load for productivity at stock. Can be got down to around 130w with a modest undervolt while staying within 10-15% of intels top chips that use 250-300w+ for the same workloads while also destroying them in games. The 9000 series x3d's will only extend that lead. Intel is in huge trouble.
I have a 7800x3d, and yes the power consumption is very low.. But at least mine, with this low comsumption, obtains high temperature peaks.
For example lets say 49w max usage, and 76° max temp reached.
I mean, what temp will reach if it consumes lets says 70w with more % usage?
@@greebj Exactly! I hope the Zen5X3D 8core version makes it into the laptop segment. With gaming laptops running over 300W and very hot temperatures. We need low power X3D CPUs more than ever.
Being able to actually use a powerful desktop without heating up the room it is in was a very foreign concept coming from monolithic intel chips for decades
Intel trying to sell me a 9600X's worth of CPU in gaming performance for $600 was not on my bingo card, but here we are.
maybe amd will release 9600x3d. worth waiting imo
I got ryzen 7900x it does everything just perfect... not even thinking about intel outdated cpus ... 😂😂😂
You don't buy a 600 dollar CPU for its gaming perf. 285K beats 9950X in many applications and destroys a 9600X. If you solely play games, just get a 7800X3D or soon, 9800X3D which will be much better allround.
@@Dr.WhetFarts Many? 5% of all apps that we have? C'mon, just by 1% or 2% in single core.
With what, double the power used, too? Lololol
Zen 5% seems very competent now. Intel closed the node gap & still couldn't reach AMD.
9800X3D going whip the floor with the new intel gen, even by being better by 5% haha.
Zen 5 was designed for servers if you watch any epyc benchmark
@@LegendaryPhenom them phasing out the 7800X3D supposedly means that this will just take it's place. Could be a good spot to get one 7800X3D for quite cheap next black friday though.
With the new architecture seeing improvements especially on AVX512 and pipelines to accommodate it, AMD really does focus on chasing the datacenter market. It's better to wait for 2026 or 2027 where the next zen architecture would be used for the next console generation, seeing actual improvements for gaming.
zen 5 isnt even a bad product it was just underwhelming in gaming for the price, but its productivity is great. hopefully prices keep coming down.
AMD: Nobody can make a messier CPU launch than us.
Intel: Hold my beer...
Windows 11 is the majority of the problem here. AMD and Intel are both hamstrung by Windows 11. We have to look at Phoronix's Linux testing to see how they should perform...
LOL!! That is a good one! I can't believe these launches either! What the hell are they doing? And here I thought the Arrow Lakes were going to be a real threat to AMD..... wow
@@AshtonCoolman Just gave Phoronix a quick glance; the Ultra 9 285K is a bit slower than the Ryzen 9900x on the overall average score.
@@dreadcthulhu6842 Good lord.
Hold my chips 😂
I was joking when I said the 5800X3D would tie Arrow Lake in gaming.
Well, here we go...
And the 7800X3D is 20% ahead at half the power draw...
Yeah.. No competition unfortunately, amd is going ro slack off now bc intel suucks
@@charizard6969 You think? With Arm pushing into the market and Intel still being close in performance AMD cant really slack off atm. Like their x3d chips have a decent lead but thats just extra cache and nothing to do with the actual cpu cores. Anyone could implement that within a generation.
That CPU is never going to die.
AMD may as well start making a 5900x3d in one chiplet at this point. 🤣
@@TheDude50447 Anyone could implement that within a generation? yeah no, apple in their first M series processors had issues with, like everything, they didn't put 8gb of ram in almost all laptops just because its enough, their architecture couldn't handle more, there were also issues with PCIE connectivity and so on,
it's not an easy feat to engineer these systems, that's why ARM won't really get popular outside of systems on a chip, not soon at least
@@obic123 I'd love to have a single 16-core CCD 5950X for AM4 and 3D-vcache on it.
I remember back in the old days when people would say you buy Intel for stability 😂
still true though, my 7800x3d system feels laggier and slower than my previous 12400 system
@@randy89555looks like ur alone in this one dude, u cant build a pc
@recherth.robert built 5-6 pcs before, all worked fine. You just know nothing about amd I guess
My god, the 7800X3D is just insane...
Holy Shit, what an absolute flop for gaming. And the 7800x3d is absolutely crushing fucking everything...😮
I mean all zen cpus got like a 9% FREE UPLIFT AFTER THE WINDOWS UPDATE
Then again, since it is entirely likely that you will get only this chipset for one CPU series, ANY gamer should skip it anyway, making it moot, and only data customers that require a new architecture should apply, and that would be mostly servers, no gamers.
This is released to market "for gamers" ONLY to sell new CPUs (and hence boards) for Dell et al, who could just sell more 12thGen instead but want "new hotness" to make people pay them, not actual work selling a 12th or 13th gen "old Intel" to people.
It's a flop for basically everything. Except maybe you want that AI chip for some reason?... Which is inccomparably worse than Nvidia videocards.
@@NadeemAhmed-nv2br Linux benchmarks for 285 are already in, and it still flops there as well, just to a slightly lesser degree. Still nowhere near Zen 5 gains.
So no evidence of Zen 5 type of improvement on Windows in the future.
@@NJ-wb1cz Which distro, Arch or Debian based?
and it's also depends on the Desktop Environment
9000X3D might be a bloodbath for Intel If the leaks of better gaming without sacrificing productivity are true
at this point AMD could even charge $600-800 for the 9800X3D due to the lack of competition
so far I've heard that multi-core on the 9800x3d is on a par with the other 9000-series multi-core performance, which sounds like it could be a great multipurpose cpu.
@@lharsayAMD will stay with $450 msrp for the 9800X3D. But given Intel’s poor value, don’t be surprised if sellers charge you $550.
@@lharsay idk about USD but I suspect around £600-650 for the 9950X3D and about £400-450 for the 9800X3D
If the ~10% better gaming and ~20% better application /MT is true for the 9800X3D then Arrow lake is truly a failure.
Intel has now decided to glue their CPU's together now, how ironic...
Used to glue them with Core Quad series but hypocrisy is a b1tch.
@@PT-rg2vo They glued the Pentium D together.
Seems when it comes to silicon, glue is more effective than a more monolithic wafer.
@@mikem9536 At least the Core 2 series had them able to _talk_ to each other. The Pentium D had to have both chips talk to each other, through the front side bus. So the northbridge/memory controller had to perform double/triple duty to make multitasking viable.
@@ZeroHourProductions407 I thought all the Core 2 quads used the FSB for die to die coms.
Intel definitely won this generation. They top all the disappointment charts in averages and 1% lows.
This is the best comment on this video I've seen so far!
it's a crime against intel fanboys , loosing in heavy rendering application , losing to even ur old gen of cpus in gaming and still loosing in power consumption by more than 50% vs the 3d chips , oh the misery .....
Dont worry they will find an excuse! drivers maybe?
But but but, it winning at the most important task.
The "CINEBENCH" 😌
Can't wait for Intel fanboys to tell me that aktschually, if you run these with DDR5 12000MT it magically gives you another 20% gaming perf, which you clearly can run stable with no issue, trust me bro my sample is totally average and does indeed exist.
They will pay microsoft to sabotage amd cpus
@@evilpotatoman9436 In reality, drivers probably will help eventually.
That being said, this isn't a good look.
The 5800X3D still being competitive after 3 generations, nice.
As it should be. Best CPU's should stay competetive for at least 5 years in my opinion, i will probably hold on to it until AM5 end of life.
This is only because the new gens are incredibly bad in performance uplift, but good for the consumer, might as well keep looking for bargains on older gen stuff.
7800x3d is far better
*2 generations later and it's only a 2 year old CPU.
Hail to the King baby.
Intel warned us it was going to be bad, but damn….some of these results are downright shameful in gaming.
thats one thing, and true, but this is different demand by others, i dont have anything to say abt it, to clarify
Windows is the biggest culprit here again if we go with the same what was offered the reason for initial results of Zen 5. Then everyone said MS favors Intel. Well how is it now? The truth is that things will improve a lot after a while, exactly like with Zen 5. Will it make a difference? Well did it do it with Zen 5? No it di dnot, it just made the Zen 5 suck a bit less. The same will happen with Core Ultra. Windows will get an optimization update or a few in practice and after that Core Ultra will be meh instead of sucking.
Seems like intel needs ita own x3d cache their the new amd now lol circa zen 1-2 era
Slower than 14th gen and still not even close to Ryzen's efficiency. Clown hats on.
from what I had red, this was their desgn goal. performance parity with 14th gen + large efficiency gains.
*read @@Rasenmaehermann89
@@Rasenmaehermann89 well they failed, it’s slower not on par overall and the efficiency gains are only meaningful in full all core load other than that it’s pretty negligible
@@Rasenmaehermann89 No one makes a "goal" to make a slower CPU than your own, and much slower than competition. It's the spin your marketing guys decide to settle on when they realize they can't possibly massage this turd into any sort of desirable form.
"We totally intended a downgrade! Also instead of looking at this real CPU, please imagine some non existent CPUs in a year or 2 or 5 that are actually great!"
@@NJ-wb1cz I think they couldn't keep getting bigger with the monolithic dies, switching to the stacked design that AMD moved to years ago. I'm guessing they made it work because they know this is the way forward, and couldn't go another year without releasing stuff even though it feels like 14th gen just came out. I think they'd have been better off selling these as 14th gen but only in certain skus that slot them where they'd go performance wise but calling them U chips since you can get similar performance, but at lower power.
In short, they have to know it's not great, but it was 'close enough' for a first effort in a completely new fab architecture. They HAD to do somethign to counter 9000 series Ryzens, too.
Cant wait for the benchmarks for the 9800x3d.
It's a good thing it's done already and AMD can't retroactively cheap out on it because they are only competing with themselves
@@NJ-wb1czthis for sure, but I fear they will cheap out on the 11800x3d (or 10800x3d or whatever it will be called)
@@lucazani2730 oh, they a b s o l u t e l y will if Intel doesn't stop pooping the bed.
"The memory prices of these high-end memory parts are pretty much the same"
I wouldn't say that DDR5-6000 on Ryzen and the DDR5-8200 on your Arrow Lake test costs about the same, though...
Alsow anything above 6400MT on Arrow lake is an overclock and running the cpu wildly out of spec...
oh no Steve is standing
😂
Another Windows update to help intel this time?
Is this cannon? 🤣
What would it mean if he started floating
Your review has sparked my interest in upgrading from the 13700k to the 7950x3d. Maybe I should wait for the release of the 9800x3d.
After such a failure by Intel, I think stores are setting prices for 9800x3d inadequate, if you want the best, pay!
Steve is a trooper, still standing even though he took an arrow to his Knee Lake
I used to stand like him before I took a knee to my Ball Lake
Wow...
Intel finally got their ''bulldozer'' moment.
This would be number two bulldozer moment for Intel. Pentium 4 was Intel's bulldozer moment. Clock for clock the original Pentium 4 was slower than the Pentium 3. One could say bulldozer was AMDs Pentium 4 moment since the Phenom II X6 beat it in many tasks.
Intel has been sitting in Bulldozer territory for years now. And AMD is skirting the sidelines. Gaming hardware is really depressing these days.
No AMD just had an expensive Zen 4 launch and Zen 5 is meh. If they can do something with X3D then it will be so so for AMD this gen as long as they come back with a better uplift in Zen 6. If they make Zen 5 really cheap in 6 months it will only be bad for those who wanted more performance. Those who are upgrading 5+ year old systems will still get a great new CPU.
ahahaha
@@OmegaPhattyAcid Zen 5 meh? Bro even 11th gen Intel was better XD
Imagine building an entire empire on hyperthreading, and then pulling the rug from underneath everything.
At least for Alder and Raptor Lake, all developers had to do was prioritize the P-Cores, but even that took YEARS.
Intel trying hard to compete with AMD for the most disappointing cpu line-up of 2024
Again zen 5 is decent
@@chriswright8074 zen 5%*
They won
@@hungnguyenBánhMìBùKhu2612 only for gaming. 9900X is faster than 7950X for me. That's Zen 30%+.
Nah, intel won the disappointment crown by a huge margin.
Geez, the 5800x3D IS the 1080-Ti of CPUs.
EXACTLY
@@mraltoid19 7500F also slaps hard
1080 Ti sucked after just 4 years.
The 5800x3D is only 2.5 years old. Not sure why people are surprised. The 2600k from 2010 is STILL viable as a CPU today if you are happy with 60 fps. Now that's impressive. (5800x3D owner btw)
@@DriveCancelDC 2600K is from 2011
lol they freakin titled it arrow in the knee lake.😂😂😂 You guys are awesome.
Buying a 7800X3D for $350 in January is making me feel like a genius 😂
9000 iQ play 😂
Very much depends what you do with your PC.
Did intel recruit Boeing engineers ?
what a mess, from Intel as a premium company to a something that seems completely improvised, everything seems a patch of a patch, first the "efficiency cores" and removing AVX512, then removing SMT ... while still using lots of power and not able to compite on gaming performance ...
Well, they left AVX-VNNI at least (one of the gazillion AVX512 extensions), although that NPU kind of nullifies its use case.
I heard they removed HT because the development team couldn't get it working it was so broken.
@@lharsay But by "Broken" I suspect it was the Ring Bus couldn't get the faster speed needed to keep those cores fed, and trying to force it was why 14th Gen and the voltage fried (permanently) the raptor cove chips, and to avoid that, they had to dial back the bus speed, to keep the voltage in check. And therefore it was either starve those threads OR not have SMT.
@@markhackett2302 apparently they wanted a refresh with 8+32 cores which was canceled twice
@@lharsay From what MLID says Intel sources told him, the plan was to phase out HT in preparation for rentable units that would've come with the now cancelled Beast Lake next gen. Getting HT working properly apparently seemed like wasted effort in that light. (They might've originally even planned rentable units for Arrow Lake, not sure I remember that correctly.) With the BL cancellation, not sure if we'll ever see rentable units at all, or if that didn't actually pan out like expected. What a mess.
7800x3d Supremacy showcase
@@rust2156by its successor, so it's more of a pass of the torch than dethroning.
@@rust2156 I mean, will it? Or will it "dethrone" by 5%?
@@TotalXPvideosDepends on the game. Leaks show that 9800X3D offers 5 to 13% increase in performance over 7800X3D.
@@ebenezer8058 Wrong. It will pass its inheritance, but not to 9800X3D, but to 9950X3D.
@@TotalXPvideos Zen5's 3D CPUs are different beasts. This is because now both of the CCDs gonna have vertically stacked caches. In comparison, 7000 series 3D CPUs with dual CCD configuration only had 1 stack of such cache. :)
Kudos to Steve for spending the effort to find and test cpu demanding parts of games. This is the data we come to HUB for.
As an consumer: This CPU sucks
As an engineer: This thing is nuts
THe 7800x3d will be the new 1080ti GOAT in terms of CPU this decade
Waste of sand basically
☝👏👏👏👏
👌👌👌🤣🤣🤣
Not if you're big about setting world records in Cinebench, I guess.
Kids in africa could've eaten that sand
There are more grains of sand on planet Earth than stars in the sky! And you're feeling sorry, really?
This gen is shit, wait for 9800x3d no reason to buy Intel.
amd X3D for gamers is just wow
This is payback for giving us 4 core for 20 plus years.
Don't forget 14nm+++++++
Or 1080p@60Hz/60 fps for the same amount of time.
Still love my 13600K in which I got for $169.99. Great for gaming and work.
Yeah, but Intel sucks. It has to suck as everyone is telling so. Just believe them, you are miserable and your CPU sucks, you just do not know it yet.
@@jarnovilen5259 Ignorant comment. go to bed little boy.
7200Mhz and 8200Mhz memory. Damn, I'd rather buy 6000Mhz cl30 for 85$ and pair it with Ryzen 7600 or 7700 and get similar gaming as this core ultra wowza chip that costs how much, 600 or more dollars?
I run the same G-Skill Trident Z5 Neo kit in DDR 5 6000Mhz Cl30 that HUB uses, it was a 32GB kit bought in 2023 prices, still about the same at $112.99 on Newegg.
Issue is Intel kneecapped the ring bus clock because it was probably the reason why 13th and 14th gen chips were frying themselves to death. This just means that memory speed just doesn't do much for gaming and daily tasks anymore. I'm just happy that Ryzen is optimized for cheap stuff like 6000-6400 low latency ram.
The 6000MHz cl36 DDR5 will probably be the memory sweetspot until DDR6 ram arrives, it's basically the 3600MHz DDR4 all over again in the sense that you know it will be good til it's phased out
Those 6000 kits scale to 6400 EASILY using buildzoid's timings as well. I plugged in his numbers for timings and haven't had any problems whatsoever.
That's why he didn't include memory cost into his price to performance chart, otherwise it would be even worse for the Intel system.
It's like Intel only cares about topping cinebench performance charts. Kind of an odd hill to die on, considering it doesn't translate to performance anywhere else. What a stupid thing to do.
Esp. hilarious if you remember how Intel did their whole ridiculous "real world benchmarks!" marketing schtick when zen2's vastly higher core count crushed them in MT benchmarks with cinebench in particular.
Yeah, I just wonder, why that amazing single core performance doesn't reflect in games? I suspect they are simply optimising for the benchmarks, and not real world performance...
games do not fit in the cache of the cpu core they are most likely memory starved but... beeing memory starved with 7200MT and 8200MT .. both basically overclocking the MC is a tough one to swallow...
My Ryzen 7 7700x cost $210 new on Amazon with Warhammer free, already had a working AM5 system.
Great video. What stands out is that in the CPU intensive gaming benchmarks where Arrow Lake performs way under expectations, both power draw and cpu temperature follow the downward trend, yet cpu usage does not. It all points towards Windows/Intel/Game Engines not scheduling cores in the most effective way, similarly to what we saw on Ryzen/Windows not too long ago. I expect patches to be incoming, or that's what I hope at least, for the sake of healthy competition.
So happy I got a 7800X3D last February. What a monster.
yay zen5% and intel core ultra 9 -2.85%!!!
with a large gain in efficiency are you stupid? probably not an engineer lmao
AMD: we got at least a little bit of moar performance!
Intel: You guys got more performance???
@@solidsnake5051zen 5 was built for servers
More like Intel Core Ultra -5%
it's a lot more then -2.85%... closer to -10%-15%
obligatory thanks for including the 5800x3D in the review comment!
I upgraded a month ago from i7 9700f to 7800x3d and feel proud of the decision form my "mainly" gaming pc 😊
Lol, and why do you need such a powerful processor for the old woman 1660 SUPER?
@@arthurion i got a 4070 🤔
@@samiyanes1598 GPU thats GPU limited even at 1440p lmao
For gaming it is superb but in every other way it is a very outdated CPU. The core count will be crucial in the future. Even for gaming after a few years. It sounds quite unbelievable but 8 cores is just not enough any more or very soon at least.
@@jarnovilen5259 pretty massive speculation based on nothing, but go on, keep sayin currently best CPU (for gaming) is gonna be outdated in a few years LOL
13 years after release we still use the meme. Skyrim was truly ahead of its time
I'd name this generation "Error Lake".
i knew it was gonna be "arrow in the knee lake"
Imagine if Microsoft actually spent some resources making windows optimzed and run well instead of countless AI/Copilot/Recall features that no one asked for
I've imagined, nothing changed, now what?
They did, a most recent Zen processors got quite a big performance uplift on 24H2. It also helped Intel in a couple of places.
Say it louder for the people in the back!
@@Shane-PhillipsAnd it took them how long? And now compare that to the resources they ve dumped into the Snapdragon Arm windows project which is pretty much a flop
Do enjoy a good goalpost shift. Computing on ARM isn't going away, why wouldn't they try and get in on that market?
Intel when 5800X3D releases:
"Those chips are a trouble. We should make note of it moving forward."
[a few years and release of 7800X3D later]
"We've closed the gap between us and X3D. Wait, that's the wrong X3D!"
The i9 12900K was on the market for 6 months when the 5800X3D was released and the i9 was winning at the time on average.
@@lharsaytrue but the windows update boosts performance by like 10%
@@lharsayit was a new technology and the architecture has since then aged really well
I don't understand how on hell intel is not working on v-cache focused chips. Just, why?
@@lucazani2730 They were. It was called Adamantine and was supposed to work for both CPUs and GPUs. But insiders have leaked that it didn't pan out, or didn't pan out quickly enough, so it got axed. For reference AMD had the connecting pads for 3D V-Cache already on Zen 2, but the bonding yields were so atrocious that it never became a product. TSMC developed the basic bonding method, but AMD did a lot of the heavy lifting as a willing guinea pig. Intel probably realized that it was nowhere near as easy to do as the theory implied, and now AMD is reaping the rewards of perseverance.
Question - how do you get around the activation limits when benchmarking? IE, cant change the hardware more than 3 times before you have to wait 24 hours to resume?
There are easy ways to get around that for those who know. But they will not tell everyone, neither will I.
Intel: Let's remove hyperthreading and see what happens.
Waste of good TSMC N3 wafers; imagine if those same wafers were being used to make die-shrunk Zen 5 chips.
AMD is using 3nm for the 16 core Zen 5C chiplets on their new 192 core Epyc chips.
And there are also Apple and MediaTek on 3nm both beating Intel and AMD in GeekBench single core performance.
@@lharsay They are indeed, and those Epycs are really good, which is why it is a shame TSMC wafers are being used for these Arrow to the knee Lake chips and not more Zen 5 ones.
@@lharsay You can't just compare completely different architectures like that. You have to use the same real apps that you will be using on both.
‘Too much creativity may kill the practicality’ - quote by my senior engineer. Seems like intel just did it. 😢
Now give Ryzen tuned low latency DDR5 6400 and intel gets demolished even worse.
If I understand correctly...the Intel benchmarks were done on Windows 23H2...and the AMD ones on 24H2...because the Intel processors on Windows 24H2 were even slower!!
yes, by 10% per Steve. Yikes.
Ultra Processors, ULTRA PERFORMANC.... Wait
16x the detail.
@@Herr_Affe "it just works" 😂
Ultra unfinished as was Zen 5.
@@jarnovilen5259finally some epic competition inspired by gaming industry.
Ultra 9 -2.85% (3nm) is slower than 7700X (5nm) in gaming, wtf?
Intel is lost at this point. 😂
intel are so used to their crap node they didnt know what to do when they actually got ahold of a competent fabricator
There’s a size matters joke in there somewhere lmao
@@defnotatrollActually, ARL CPU tile used the Intel 20A node until Intel cancelled it & went with TSMC 3nm. Maybe Intel thought same design in a different process would be fine?
@@defnotatroll Intel has produced a shit ton on TSMC for the last two decades. Their chipsets and network (both wired and wireless) have been on TSMC for ages. So it's not like Intel doesn't know what to do with the TSMC PDK.
Oh man, this CPU generation is something else
Wait for nvidia...
Intel lives in their own world where they need to release new CPUs each year no matter what
It may be literally true simply due to the production process. What they should've done, is price the models according to their performance, not some made up positioning
Well if we look at the market share, still it seems like Intel marketing strategy is really working. Can you some how prove me wrong?
@@jarnovilen5259it stopped working when first 13gen started dying...
@@jarnovilen5259Not much to do with marketing. People are just tech illiterate and resilient to change even when it objectively would be beneficial.
Zen 5 was never bad. AMD managed to reduce power consumption while still delivering slight improvements. They didn't regress like Intel.
In Linux zen5 is aprox 20% faster than zen4
@@dam8976 That makes me wonder if Zen5 will see more optimization and improvers over time with future windows updates. Seems like Windows is just bloated and unoptimized for Zen5.
What is this cope. It's a terrible generational improvement for an almost 2 years wait. There WERE even regressions, and they were not infrequent. The "power reduction" wasn't even really there compared to non-x Zen 4 CPUs, and efficiency only improved moderately in productivity (and barely improved in gaming if at all) compared to non-x Zen 4. The price doesn't make sense compared to Zen 4, X or not. With the right price reductions, sure, it's not inherently bad. They'd be good CPUs given the right prices. But at these prices, they are worse than AMD's own offerings.
@@w7bUxhwRYUo8Lv Zen 5 is disappointing no doubt, but ArrowLake made it look so good.
@@w7bUxhwRYUo8Lv AMD AMD AMD... it's not about AMD, it's how bad Intel is. Different perspective. If two companies try to reduce the TDP but one regresses while the other manages to still improve, then obviously we don't pick on the more successful one.
Oh god your titles just get me every time I couldn’t help but watch your review first. Arrow in the knee lake that’s some seriously funny shit!
So arrow lake -2.85% was too optimistic?
Minus 20 percent cyberpunk 2077.
Amazin'
Intel wasn't so honest afterall.
-20% in some cases is catastrophically bad. It really has to be some issue causing this.
well but more power efficient, still worse than Amd 3DVcache Cpus
@@aos32 There isn't. That's the saddest part, it's not a bug. They revamped the entire CPU around chiplets, and this is simply how they perform.
We should've expected wild swings, problem is there are no positive positive swings. It's just worse
OMG... HE IS STANDING!
is*
@@ivanbrasla THANKS S2
I love my 7950X3D. What an amazing blend of good productivity performance (Handbrake, etc) and gaming. The 9950X3D would only be superior if it has cache on all the CCDs, eliminating the one concern with messing up the core parking on 7950X3D (no issues so far on that BTW).
Yes, and you can also disable half the cores to make it the same as the 7800X3D.
9900X3D and 9950X3D will have 3DV cache on both CCDs, im waiting what AMD brings up with these two
The likes of handbrake are likely to benefit massively from AVX in the near future, and on Linux encoding can be up to 20% faster even without it already.
I'm pretty sure the people stuck on Windows will realize in a year or so that Zen 5 was actually an excellent cpu once the optimizations in apps start rolling in
@@windfire5380 I'm considering buying a lightly used 7800x3d or 7950x3d for $300-$350 as people look to upgrade. Undeniable value.
@@arthurion You don't even have to do that on windows 11. You can just control alt delete and tell windows that the program is a game if it fails to auto detect and it will handle the core allocation correctly.
Windows 11 should dissapear. Trash OS.
Damn... Steve with that "thousand yard stare"... he's seen things...
Not that I really care but Intel seems to be getting a pass from being roasted compared to AMD. Not really a roasting kinda of guy but they both seem deserving and shady in their pre-released info.
Hardware unboxed made like 9 videos blasting zen 5 and now arrow lake is even more buggy, lower performance and shady about their efficency gains and Hub acts like it's just a meh, but still might be OK after bug fixes
Kinda transparently in Intel's favor, their coverage
Congrats to Balin on his performance in the Manscaper ad spot - he looks as though he's a pro at it.
So it seems to perform really well with Starfield's ancient engine that was mostly designed in the low core count eras but it completely falls apart in multi core heavy games like Cyberpunk or Homeworld.
12900k/s was amazing
13900k/14900k/s were really good, but with voltage issues
15900k (not using their ridiculous rebranding) is DOA
The last time I was this early Meteor & Arrow Lake both sounded good on MLiD's channel.
Sapphire Rapids fix seemed to consist of reducing tile count to 2 or 3, basically glueing together monolithic sized dies.
Presumably the problem with Sapphire Rapids, Meteor and Arrow Lake is down to the special brew low latency glue Intel cooked up.
Yeah except, they are competing with the monitors naming now !
CODE RED!! Steve is standing, I repeat Steve is standing!!!
That Cyberpunk/Homeworld data looks like Intel is trying to help Longsoon beat them in the midrange faster.
Yeah having one of the P cores randomly drop to 800MHz from 5.7GHz certainly doesn't help when all of them are in use.
@@lharsay If there wasn't anything running on it there's no reason it shouldn't clock down for caster cooling.
The move from monolithic was always gonna be a mess the 1st gen
Anything that isn't monothic is objectively trash for gaming and IPC.
My 12700K still has higher IPC than Zen 4 chips.
Nice work, seems these Cpus can deliver more, but it's a whole mess since we're in Microsoft and Intel flappy hands, hope they'll fix everything before the end of November.
Seems that you'll have another month of massive reviews considering Ryzen near new debut.
So disappointed with intel had a 14900k had to rma it and now selling because I switched to amd and love it so much more and seeing ultra cpu fail as well just doesn’t even surprise me after the dealing with intel’s instability for almost a year…. And he is standing which is another sign 😬
Welcome to Team Red. The rules are simple: Never ever buy anything AMD releases until 3 months has passed. After 3 months the product is stable and priced sanely. AMD wants to be "Apple hip" so badly, but their core audience are the pragmatics, the penny pinchers, and the politically inclined. For this reason AMD does one stupid marketing thing after another, and then has to revert course with a price correction. It's important that we keep teaching them this lesson over and over again.... It's in our own interest.
Glue manufacturers will make a lot of profit now that Intel finally switched its manufacturing procedures
If you cant buy 13th/14th gen because they blow up, and you can't buy amd 7000 because its out of stock, then this makes the recent prices drops to ryzen 9000 look AMAZING
Yes, with the price drop & updated BIOS, AMD 9000 CPUs are a steal compared to ARL. If you’re staying with Intel, find a 12900k & revel with the last decent CPU that Intel ever had!
@@tringuyen7519 My 12700kf is doing fine, and you can pick them up cheap. (mine even has avx-512)
@@tringuyen7519sad, but facts
Here in Australia the 7800x3d is available and at a discounted price , about 10% cheaper.
I just bought one .
@@tringuyen7519 Intel 13&14th i5 and i7 are just fine with UEFI/microcode updates. No problems and a 5 years warranty. And decent performance and very good prices. Only the efficiency is poor. But a 14600kf for 200 USD is the best CPU for the price and with a HUGE margin.
I'm shocked HUB was allowed to record a dead body on youtube.
This just doesn't seem like an Intel launch, especially with so many reviewers having Windows/driver issues. Usually Intel has that ironed out.
That's what happens when you abandon Ring Bus and monolithic chips and they went the garbage chiplet route. They destroyed the CPU market evolution with Meteor Lake which is a crappy architecture.
somehow instead of evolving we keep going backwards
Me, a 7800X3D owner after watching this video: 🌚☕
Same 😂
absolutely. i feel like i've made a 1080 TI type purchase. "omg you're overpaying for a 350€ processor in a 1400€ build" nah bro that thing is STAYING
Except for Starfield's stupid ahh....they have to be crippling AMD on purpose in that game, LOL. I have a 7800X3D and a 4090 so I have skin in the game as well.
The 7800X3D is 20% ahead. Even if the 9800X3D is only 5% faster than the 7800X3D it'll be a bloodbath.
You an me both man. I'm glad I snagged mine back when they were going for $360.
Steve is standing! Also, we owe Zen 4 and Zen 5 an apology.
So. 50W reduction in power in applications. 70W reduction in power in gaming. 5-10% gaming regression. Still nowhere near close to AMD's efficiency.
This has been a fun launch from the perspective of an AMD fan.
Intel has shown they are the same company that intentionally stagnated on quad cores 10 years longer than they needed to.
They just aren't that good at innovating.
I'm still blown away by how well the Ryzen 7 5800X3D is near the middle of the pack in gaming performance with especially high 1% lows. In every game, even when it was dead-last, it still had a 1% low higher than 70FPS. At this rate, it will still be relevant for gaming in 2030!
"Arrow in the knee lake" Lmao, just when I thought CPU market couldn't get any worse.
GN and HU are the most trusted tech channels on youtube, thanks steve
Steve is standing, RIP INTEL.
Wow this is not even a dud like the Zen 5.
But total grabage on an incredible level.
Why would any reviewer comment benchmarks like "the CPU wins this benchmark, but it is bad overall"?
I would accept if such conclusions are made in form "it wins N benchmarks, it loses M benchmarks".