The 5800x3d/5700x3d are the 1080Ti of CPUs.
Right, I agree completely! 2.5 years old.
I agree, Zen3 X3D is more the 1080 Ti.
The 7800x3D is more the 4090, without the baggage.
@@mraltoid19 at least the 7800x3d is a reasonable price tho
This is not Arrow Lake but Narrow Lake. Maybe Sorrow Lake.
When will the era of stupid names be over?
Arrow in the Knee Lake 😅
@@Kyanzes They are changing the way they are doing things, and this is the first iteration.
Arrow To The Knee Lake
Never thought I'd actually prefer a 3rd iteration of Raptor Lake...
Imma pair a 7800x3d with a 1080ti.
I will name the machine "Heard of GOATs?"
It's joever, I'm going team red. Lisa plz don't milk me for $550 on the 9800X3D
AMD thought they had the worst CPU launch with Bulldozer, but Intel said, "Hold my beer." The 285K is a regression in gaming lol.
The Core i9-14900KS overclocked to 9.1 GHz, but the 285K can only get to 7.5 GHz under LN2. Overclocking is dead on Core Ultra, and the only impressive thing is the E-cores!!
Yeah, the E-cores are really good now. I think the new N-series line of mini PCs could be good for low-power home servers
@@SiliconSteak Would be really good from what I have seen.
@@JDD_Tech_MODS Im running the 14900KS on water cooling and I love this chip. I also never downloaded any of the new microcodes because they will gimp the performance. But you gotta water cool this thing otherwise it will down clock as it hits high temps.
@@papasmurf5598 The 14900KS is my daily driver now with a full custom loop. This chip is a lottery winner. I'm running 8000 MT/s CL38 on a 4-DIMM Z790 AORUS MASTER board. The IMC on it is incredible.
Did you set an all-core frequency on yours? That negates the need for those microcode updates, and I don't blame you one bit for not updating. Running it at stock settings allows Intel's faulty VID requests to hammer the chip into a death spiral.
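[Editor's note: one way to sanity-check that an all-core lock actually holds under load is to watch the per-core clocks. Below is a minimal, Linux-only sketch that polls cpufreq sysfs; it only observes frequencies and says nothing about VID/voltage behavior, and the 5-sample loop is arbitrary.]

```python
# Minimal sketch: poll each core's current clock from cpufreq sysfs
# (standard Linux paths) to see whether an all-core lock is holding.
import glob
import time

def core_clocks_mhz():
    """Return the current clock of every core, in MHz."""
    paths = glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")
    clocks = []
    for path in sorted(paths):
        with open(path) as f:
            clocks.append(int(f.read()) / 1000.0)  # sysfs reports kHz
    return clocks

# Run this while a load is active; a true all-core lock shows
# min and max staying pinned together at the locked frequency.
for _ in range(5):
    clks = core_clocks_mhz()
    print(f"min {min(clks):.0f} MHz   max {max(clks):.0f} MHz")
    time.sleep(1.0)
```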
My 12900K is still going strong, no need to upgrade it. Nothing really makes full use of it. And I'm still on DDR4; I could move to DDR5 and get another 10%.
I always wonder what kind of games on what kind of monitors people are playing. I'm here with my 4-year-old 14nm OC'd 10600KF and all modern games are super smooth 60+ FPS experiences, VR and everything. With a fast GPU, ofc.
60 FPS isn't bragging, let alone a desired target.
@@philanthropist5005 Depends heavily on the game for me. Fast-paced games, shooters, racing games etc. do profit a lot from 90-120 Hz, and VR of course, which my CPU is totally capable of delivering so far. But in something like the Silent Hill 2 remake, I'd rather have ray tracing and ultra settings at 60 FPS instead of turning stuff off for 90 FPS (all GPU-dependent anyway...).
I just don't see a reason to upgrade my CPU, but I guess there is the e-sports shooter 144 Hz crowd?
@@clouds5 E-sports shooters are more like 240 FPS+. Even without the refresh rate to back it, higher framerates do a lot for the feel of the game.
Idk if you're asking what games I play, but I have a 4K, 144 Hz, mini-LED monitor. It's the INNOCN MV27.
Rn I'm mostly playing Dragon Ball: Sparking! Zero, but I also play Halo Infinite, the new Warhammer game, all the Resident Evil games, Minecraft, and a couple of old gems
The new Core Ultra CPUs are a pure fail. Some of the Intel higher-ups need to be fired for making this garbage. Give us more P-cores and no E-cores.
Stupid to cancel Royal Core.
Yeah true, just give us 14th gen on TSMC 3nm 😩.
If the 9800X3D brings a double-digit percentage increase I'll be very surprised; even a gain as small as going from 100 to 110 FPS (10%, the bare minimum for double digits) would surprise me.
I was having drops in Killzone on RPCS3 at 10K resolution with my 12400; my FPS would drop into the 40s and back to 60. With my 12700, I hardly get any drops at the same resolution. In TimeSplitters on Dolphin at 8K I got drops with the 12400; I no longer get those with the 12700. L4D2 would drop from 120 FPS into the 60s with multiple graphics mods on an RTX 3080/12400; I no longer get those big drops with the 12700. This CPU is a monster. I didn't expect it to stabilize my FPS so much, but I was hoping it would. Mortal Shell still drops, but not as far down or as often.
Lower power consumption is most welcome. I live in the tropics, so let's not overheat things.
It will get better. It's the future.
When is Battlemage coming, damn bro
Hopefully this year man
Chiplet design has both upsides and downsides. The upside currently is that the majority of the silicon in AMD CPUs is on a process node that costs half as much, while the cores still get the benefits of the faster silicon. But the biggest benefit is not spending hundreds of millions, or a billion, to push many different designs through the latest process node.
Also, instead of making large monolithic dies, you make loads of tiny chiplets. These chiplets can be arranged in multiple configurations to make more effective use of wafers when dies have defects.
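[Editor's note: the yield point is easy to put numbers on. Here is a minimal sketch using the classic Poisson defect-yield model; the defect density and die areas are made-up illustrative figures, not real foundry data.]

```python
# Poisson defect model: P(zero defects on a die) = exp(-D * A),
# where D is defects per cm^2 and A is die area in cm^2.
import math

DEFECT_DENSITY = 0.1  # defects per cm^2 (hypothetical)

def die_yield(area_cm2):
    """Fraction of dies that come out defect-free."""
    return math.exp(-DEFECT_DENSITY * area_cm2)

big_die = 6.0   # one large monolithic die, cm^2 (hypothetical)
chiplet = 0.75  # one small chiplet, cm^2 (hypothetical)

print(f"monolithic die yield: {die_yield(big_die):.1%}")  # ~54.9%
print(f"per-chiplet yield:    {die_yield(chiplet):.1%}")  # ~92.8%
# Defective chiplets are discarded individually instead of killing
# a whole big die, which is the wafer-efficiency argument above.
```

With these made-up numbers the big die loses almost half the wafer to defects while each small chiplet yields over 90%, which is the economic argument in miniature.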
@@zam1007 The downside is the communication between the pieces of silicon, in both latency and power. Packaging cost also increases. Past a certain point it would also start increasing the number of designs they have to push through the fab. And there's a small amount of extra silicon needed for the communication interface between chiplets.
AMD is splitting its designs between expensive and cheap silicon, and giving the high end two pieces of expensive silicon.
This is a cost-effective way of using chiplets: it avoids the need for another design for the high end by splitting the design along the cost-of-silicon line. And this is what allows AMD to offer the prices we see.
But the communication latency of NOT sharing L3 cache across chiplets is high; on the other hand, the problem of L3 latency growing as you add more interfaces to it is also real.
Bro, you could just sit there saying nothing and I would still watch 😍
Bro 🗿
Missing the point of Intel CPUs. You give up 5 FPS in exchange for being 20+% faster than AMD in everything but games.
Oh I know. This is why I am on Intel myself. Productivity per dollar is higher at every tier on Intel (and gaming perf is actually more consistent, except against the X3D chips). The ARL-S chips just ain't moving the needle though.
Appreciate the comment
The worst part is that the compute tile of Arrow Lake has moved away from the center of the package, so your cooler can't dissipate heat as efficiently as on Raptor/Alder Lake CPUs.
Except Arrow Lake runs cooler than Raptor Lake.
@@rattlehead999 Yea, this guy is lost in the sauce. If it was drawing 270W+ like Raptor Lake, it would be a problem.
@@faranocks The 245K consumes up to 140W but runs cooler than a 9700X with the same cooler.
@@rattlehead999 The 9700X/Zen 4/Zen 5 has issues with the offset die and thick IHS due to AM5 retaining AM4 cooler compatibility. If you go down to a cooler with only ~70W of cooling capacity, the 9700X will run cooler.
@@faranocks They don't run cooler with Wraith coolers compared to an Arctic 36, a dual tower, or a water cooler.
Just yesterday I ordered a 7800X3D; I hope I'm fine.
Looks like you’ll be eating good for a while 👍
Isn't the 7800X3D a chiplet design, and it's the gaming king, right?
Yeah, it has enough cache that it negates the chiplet downsides (that is literally why it's so good)
@@SiliconSteak Wait till the 9800X3D drops. It's supposed to be the sh*t. I'm running a 14900KS on water cooling and I can't complain. Looks like I'm stuck with it for a good while after what I just saw with this Intel launch. The near-term future looks grim for team blue. The only other thing I might pull the trigger on is the new 5090 GPUs coming out. Thx for the response
ULTRA has the same energy as FX... just sayin'.
The Zen 5 launch wasn't as bad as this Intel launch; Arrow Lake is much worse than Zen 5, lol
You’re right, and it being a whole new platform is even worse. It’s real bad
Got an i9-13900K with a 7900 XTX; for my use case it's like the best combo lol
What is your use case?
@@SiliconSteak I use the 7900 XTX because it has better Linux drivers than Nvidia, and the i9-13900K for its Quick Sync hardware encoding
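[Editor's note: for anyone curious what "Quick Sync for hardware encoding" looks like in practice, here is a minimal sketch that shells out to ffmpeg's QSV encoder. It assumes an ffmpeg build with Quick Sync (libvpl/libmfx) support; the file names and quality value are hypothetical.]

```python
# Minimal sketch: transcode a file on the Intel iGPU via Quick Sync,
# using ffmpeg's h264_qsv encoder.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",        # hardware-accelerated decode, if supported
    "-i", "input.mkv",        # hypothetical source file
    "-c:v", "h264_qsv",       # Quick Sync H.264 encoder
    "-global_quality", "24",  # ICQ quality mode (lower = better quality)
    "-c:a", "copy",           # pass the audio through untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```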
Oh nice
I feel like the 1080 Ti would have been that GPU, but because the follow-up was RTX, it ruined its potential.
Yea, this is something you don't hear ppl talk about as much. Imagine how much better it would be if the 1080 Ti had DLSS too
I'm still using my Ryzen 1700X.
I used to be a reviewer like you, then I took an Arrow Lake to the knee.
Reason reviews are showing the 7800X3D faster now is that reviewers are using the Intel baseline spec BIOS setting when testing the 14900K, and updates to AGESA and Windows are helping a lot. The 13900K/14900K are still excellent for gaming/productivity. No reason to upgrade to Core Ultra right now.
Only the chips that didn't burn, and who knows how many did. I'll bet it's a lot more than Intel wants to admit.
@@tracesmith3572 They will never admit it. Out of 5 13900Ks, I had to RMA 1 because it ran fully stock and was allowed to run unchecked. The other 4 are running fine because they have their cores locked, with none of that TVB BS. Intel still has not refunded me for that RMA'd 13900K and has not responded to me for 3 weeks now. The RMA was started way back in July. Intel's RMA process is shit, as is this Core Ultra launch. Karma is coming around to bite Intel where it hurts the most.
LIE!!! It's because they have updated Windows to remove the AMD crippleware code lmao.
@@JoeL-xk6bo What lie?? I literally said Windows updates. lol
Got a 7800X3D for $200
That’s a W deal