Tests with faster memory will come a little later, for now the only one I have is this
Games :
Silent Hill 2 - 0:09
CYBERPUNK 2077 - 1:01
Starfield - 2:08
Forza Horizon 5 - 3:05
Star Wars Outlaws - 4:05
CS2 - 4:58
Microsoft Flight Simulator - 5:57
Hogwarts Legacy - 6:58
Ghost of Tsushima - 7:55
The Witcher 3 - 8:52
Red Dead Redemption 2 - 10:00
System:
Windows 11
Core Ultra 9 285K - bit.ly/4fisOO6
MSI MAG Z890 TOMAHAWK - bit.ly/3NFmC7a
Core i9 14900K - bit.ly/3rTFhVy
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
RAM 32GB DDR5 6000MHz CL30 - bit.ly/4e3MqEG
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
I can't wait for the 1440p and 4K testing between these CPUs. 👍
R7 9800X3D, ok, thank you 😂
Why 1080p? Go 4K.
How does this trash channel have 512k subs?
Intel started making crap
I finally understand the meaning of the word “Ultra” in the name of this CPU - Ultra disappointment XD.
ultra crap
they truly burned the word there; it should have been saved for a future refresh that actually plays games better than Ryzen (or at least their own last gen)
@@КАВО2 can't argue with that
from Error Lake Architecture 😂
Again, these useless tests with DDR5 6000... I'm so tired of this...
AMD: we have no performance uplift
Intel: how about performance dragdown??
AM five percent vs Intel less five percent
AM5% vs Ultra disappointment. 🤣🤧
@@Jakiyyyyy AMD 5% vs Intel -2.85%
Intel doesn't understand that we want hot processors so we can spend more on enthusiast watercoolers.
@@JynxedKoma more like 28.5
Wow, that's basically degradation! -fps, -hyperthreading, + price, + new socket. Amazing work Intel
I wouldn't even care if it lost hyperthreading if the performance was up. Seems to only be up in some productivity apps
The 14900k has literal degradation
Bro, the fps are almost the same, but look at the CPU wattage... it is noticeably lower...
@@aashritmalik6931 Those power numbers are not telling the real story, because the CPU now pulls power in another way. To see what the real difference is, one should measure how much power is being pulled from the wall socket.
Degradation? It actually beats the 14900K in workloads.
The real comparison for this CPU is the 9950X.
Nobody said that Intel makes gaming CPUs. If you buy an Intel just for gaming, it's your fault 😂
Going from 10 nm to TSMC's 3nm and having a regression in gaming performance!! Wow Intel you have done the impossible!!!
Reducing the lithography size and getting better performance in games are unrelated. However, reducing the lithography size to consume less power: that's where there's a connection.
@@zarkha_ If you give the 14900K less power to align its gaming performance with the Core Ultra 285K, there is barely any improvement in power efficiency.
@@nossy232323 Well, actually there is, since you have to reduce the power of the i9-14900K to match the consumption of the Core Ultra 285K, so there is a difference in power efficiency, right? lol
they have their own chip factory
@@zarkha_ No.
It's like comparing the fuel consumption of two cars driving at different speeds. A Suzuki and BMW.
The Suzuki gets better fuel consumption but only because it's driving slower.
If the BMW matches the speed of Suzuki and they both get the same fuel consumption, we do not say the Suzuki is more efficient because the BMW had to slow down to the same speed. We say they're the same efficiency at that speed.
It's illogical to claim the BMW is less efficient if they're achieving the same efficiency at the same speed.
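The whole back-and-forth above boils down to comparing fps per watt at the same operating point. Here is a minimal sketch of that arithmetic, with purely illustrative numbers (none of these are measured values from the video):

```python
# Iso-performance efficiency comparison, as argued in the thread above.
# All fps and watt figures are made up for illustration.

def fps_per_watt(fps: float, watts: float) -> float:
    """Gaming efficiency: frames per second per watt of CPU power."""
    return fps / watts

# Stock settings: the older chip is slightly faster but draws far more.
stock_14900k = fps_per_watt(fps=150, watts=125)  # 1.20 fps/W
stock_285k   = fps_per_watt(fps=144, watts=90)   # 1.60 fps/W

# Power-limit the 14900K until it matches the 285K's frame rate.
# If it then also draws ~90 W, both chips are equally efficient at
# that operating point -- which is exactly the BMW/Suzuki argument.
capped_14900k = fps_per_watt(fps=144, watts=90)  # 1.60 fps/W

print(f"stock 14900K:  {stock_14900k:.2f} fps/W")
print(f"stock 285K:    {stock_285k:.2f} fps/W")
print(f"capped 14900K: {capped_14900k:.2f} fps/W")
```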
error lake , im chilling with my 12900k
The 12900K is laughing while watching these benchmarks on your PC XD
I think my 12900k is breaking. Damn PC keeps crashing even on a new Windows install. Was gonna see if the new i9 was worth but apparently not. Maybe 14900k will work.
@@Sonavyon Check your RAM with Mem64 to see if it has any problems.
It happened with my 12100F because the RAM timings were bad :/
@@i7-1260P I did memtest and the ram modules were fine. Either the chipset or ram controller on the CPU was starting to degrade. Spent $1000 on a Z790 and 14900k because I got fed up with it.
Everything is working now and I haven't had any more freezes or blue screens.
It's a shame, I just built that PC in November 2021...
12900K, 12700, 12600K, 12400, 12100: only these CPUs are saving Intel's reputation in the PC market in terms of price to performance.
So, it's more like Core Ultra -9% 285K than Core Ultra 9 -2.85%K?
at this point just undervolt the i9 14900k
It does have crazy undervolting headroom with the ridiculous amount of e-cores.
🤣
You're not gonna use 24 cores. Trust me.
@@Thatwholesomeguy 3DMark tests say otherwise; the i9-13980HX is the same silicon and is by far the best (or was?) performance-per-watt chip there, like 3 times the 7800X3D.
@@saricubra2867 Sir, the 13900HX is a laptop CPU. Also, 3DMark, as the name says, is not games; in a game you're never gonna use 8+ cores.
1 step forward, 2 steps back. This gen feels like what the 13th gen should've been. It's basically just a Raptor Lake bugfix.
I must say I'm very impressed with 14th gen; that 14400 seems like a good pick if a cheaper board is available.
Better performance ❌️
Better power efficiency ✅️
Look at the power consumption difference.
99% of the comments are blind idiots, except you and me.
Man, I was reading all the hate, but that difference is noticeable. A lot less power means processor efficiency is better, and that's all I want when I buy technology. This could mean really good OC headroom; that's a 3nm process, only iPhones had that before. 45W less with the same performance is crazy, and could mean a lot for mobile devices. Thanks for the comment, I didn't notice it at first.
Correct me if I'm wrong, but apparently this latest gen of Intel CPUs can use UP TO 50 watts via the 24-pin connector.
I'd be surprised if HwInfo isn't just pulling the CPU power from the EPS cable.
Sneaky if you ask me.
@@rafatejera329 Adding on to Franki's comment, this isn't factoring in the whole power draw.
Once you factor in the power usage through that cable as well, power usage is comparable to AMD's 9xxx chips and a significant drop from Intel's 14th gen.
Yes, it draws less than last gen, but it's looking like nothing more than a power and temperature catch-up. As others are noting/hoping, it's a good 'reset', and hopefully this is a Zen 1 moment.
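For readers following the measurement point here: if part of the CPU's draw comes through the 24-pin ATX connector, an overlay that only reports the EPS rail will understate total CPU power. A toy sketch of that accounting follows; the sensor names and wattages are hypothetical, not real HWiNFO fields:

```python
# Why EPS-only readings can mislead: total CPU power is the sum of every
# rail feeding the CPU, not just the 8-pin EPS. Values are hypothetical.

readings_w = {
    "eps_12v": 85.0,         # what overlays usually label "CPU power"
    "atx24_cpu_share": 40.0, # CPU draw routed through the 24-pin (if any)
}

overlay_reading = readings_w["eps_12v"]
total_cpu_power = sum(readings_w.values())

print(f"overlay shows:   {overlay_reading:.0f} W")
print(f"actual CPU draw: {total_cpu_power:.0f} W")
# The only unambiguous comparison is wall power from an external meter.
```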
It almost forgives the performance difference, which is still practically unnoticeable.
They went from 10nm to 3nm and somehow still managed to fuck up 💀
it's 7nm, an Intel chip, Intel 5 tech
@@iikatinggangsengii2471 CPU-Z misreports it; the compute tile is TSMC 3nm, the I/O tile is TSMC 6nm, and the GPU tile is TSMC 6nm.
@@pcrepairshop6799 Your comparison is stupid. X86 can match ARM easily and also performs better. Apple silicon chips don't have the magic of ARM but the magic of Nuvia engineers.
@@iikatinggangsengii2471 No, they used various nodes from TSMC.
Gotta install all of those metrics and backdoors into their CPUs...
So hyperthreading is officially removed from Intel from this generation?
Yes🎉
At least for now, yes.
and still no AVX-512
Too many cores; most games can use max 16 threads. So hyperthreading is useless, but it has some stability problems (only Intel).
@@BUDA20 Sorry to tell you, but AVX-512 on Intel is called AVX-VNNI if you read it from CPU-Z.
I was waiting for this video. Thank you. 👍
we got Zen 5%
now we got Core ultra -9%
worst cpu generation in years
9800X3D our only hope now
11900k was a disappointment
@@asrafulemon2004 I hated 11th gen when it was released: a shitty version of 10th gen with an oversimplified logo.
We still have AMD 7000 and Intel 12th gen. For now there's no point in the other processors.
Now we wait for the new X3D and see the difference :D
AMD's 3D cache stack isn't a gimmick and does improve fps significantly.
@@iikatinggangsengii2471 I know, I'm just wondering how much Intel will screw up again after AMD introduces new processors
X3D is a 9700X with just some cache on top of the 9700X's cache, and they call it 3D. Just like NVMe storage: for more storage you need more NAND, so... put more on top of the existing NAND chip and call it 3D NAND.
X3D only has more cache; you will only see the performance with a 4090 at 1080p low settings, and is that how you play games? It is a scam, believe me, I know tech very well; don't trust tech tubers.
The 7800X3D was a 7700X and the 9800X3D is a 9700X. They have what's called Level 1, Level 2, and Level 3 cache. Level 3 is the last cache, and after a Level 3 miss the CPU asks the RAM; that is why RAM speed matters.
Like I say, the 9800X3D is a 9700X. Why? Because the 9700X has 32MB of Level 3 cache, and AMD puts the same amount of cache on top of that, so it's 64MB, and now we've got the 9800X3D. So when does this cache matter? RTX 4090 + 1080p + lowest settings to quickly push more frames... just a fucking scam.
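Leaving the "scam" verdict aside, the cache mechanics described above are real and easy to demo: random reads from a buffer that fits in cache are noticeably faster than from one that spills to RAM. A rough sketch (sizes are illustrative, and the gap is smaller in Python than in native code because interpreter overhead dominates):

```python
# Demo of why a larger last-level cache helps: random accesses to a
# cache-resident buffer vs. one far larger than any L3.
import array
import random
import time

def random_read_time(n_elements: int, n_reads: int = 2_000_000) -> float:
    buf = array.array("q", range(n_elements))  # 8 bytes per element
    idx = [random.randrange(n_elements) for _ in range(n_reads)]
    t0 = time.perf_counter()
    acc = 0
    for i in idx:
        acc += buf[i]  # random loads; big buffers miss cache and hit RAM
    return time.perf_counter() - t0

small = random_read_time(128 * 1024)        # ~1 MB: fits in L2/L3
large = random_read_time(32 * 1024 * 1024)  # ~256 MB: mostly RAM-bound
print(f"cache-resident: {small:.2f}s  vs  RAM-bound: {large:.2f}s")
```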
What's the point?
They could simply reduce the core frequency of the i9 14900K by 300MHz,
which would significantly reduce the voltage and, in turn, the power consumption.
true
@@fx8052 Intel Scam Point
the funniest thing is they went to tsmc 3nm for this too..
From the consumer perspective you are right, but for Intel this is a new architecture, with a different design on a 3rd-party node. Take it as their "Ryzen 1st gen"... although Ryzen was more exciting.
tdp, man, tdp
It will be years until intel can even reach the 7800x3d in gaming
And the 9800x3d is just around the corner (Nov 7th)
@@sirgwaine2695 I wouldn't be too hyped about the 9800X3D; Intel is showing they can't compete, so AMD doesn't need to make a better CPU.
@@Qelyn You're right, they don't need to, but AMD didn't know how bad the 285K was going to be when they made the 9800X3D. AMD always assumes that Intel is going to do what they claim, so they worked hard on improving the frequency of the X3D chips. They have already announced it and set a release date, so they can't back out now!
@@ravere1314 actually, I'll bet they did know.... which is why they pushed out Zen 5 so early.
@@sirgwaine2695 Given Intel's bad performance, they may not release it. Milking money on the 7800X3D for gaming and the 9000X for productivity. I am sure they're gonna skip the Nov 7 date 😅
Energy efficiency is not bad, but gaming performance will be disappointing
Actually, if these Core Ultra chips had hyperthreading it would be much better, but then it would be the same as 14th gen. Intel said Core Ultra would be more power efficient than the previous gen; no performance boost is OK, but lol, you have to replace the socket too, so I'm literally disappointed.
@@igm1571 yeah and price of the mobo alone is already ridiculous. Not gonna bother.
Nobody buys these top end CPUs to save a few pennies on energy efficiency. Be real.
energy efficiency is still bad. 7800x3d uses half the energy of the ultra9 cpu in games and has 20% more fps on average in hardware unboxed, gamers nexus tests...
and that's on 4 nanometers... Intel uses TSMC 3nm!
As an Intel fanboy 😭 I was waiting for this 🤡😭🙏
Get a grip
but hey at least they are now really making their cpus energy efficient
@@alejoyugar If you clock the 14900K a bit lower and give it less power until you get the gaming performance of the 285 Core Ultra, I doubt there would be much difference in power consumption.
@@alejoyugar No where close to AMD though.
@@MrBoombast64 ryzen 9000 is the same thing, barely any more performance
It's satisfying to think my Ryzen 7 5700X will last at least 2 years longer than I thought
I haven't found a single decent CPU as an upgrade after my i7-12700K, everything out there is mediocre.
@@saricubra2867 13600KF user, same
I'm fine with my i9-12900KS. The same performance in games while reducing consumption by 50-60W, three generations later; it seems like a joke.
and still AMD RYZEN 7-7800X3D is the Undisputed Gaming Heavyweight Champion
Only in gaming, not heavy multi-threaded workloads, bro
@@syedawishah Ooow, so that's what this video was all about... heavy multi-threaded workloads... now it all makes sense.
I thought it was to show gaming performance... silly me.
@@syedawishah in this case R9 9950x is champion
@@syedawishah Not true; since the Windows patch, non-X3D chips can match the 14700K/14900K, and they are the same or faster in productivity. Zero reason to buy Intel.
@@thedawn-rt9rx intel is better in gaming too
I thought it was impossible for performance to get worse between generations; sadly, I was wrong.
I stuck with an AMD CPU for 10 years and was so pissed about it, so I changed to Intel 1 year ago, and they release this sht. I am cursed.
I'm going from a 13900KS to the 285K, which means I'm moving to basically identical performance.
At least the new one will survive
Bruh, why? You can keep this CPU for 7 years and hand it down to your kid and then to your grandchild 😂❤
Everyone's talking about power efficiency; let's see this abomination vs the 7800X3D.
And yeah I am talking about gaming performance solely
Remember, kids: it's not a bad CPU, it's a badly priced one. At $300 it will sell like hot cakes.
Intel is never gonna sell their flagship for 300, keep being delusional
@Definedd I don't know about that. I'm not delusional, but I'm sure nobody will buy this crap for $600 unless they use userbenchmark as their source.
The results are not that bad compared to other videos. It will be more interesting to see the performance of the 285 in brand-new games, when developers can take advantage of the new architecture, maybe.
Well, are you sorry to pay 600 bucks for a processor and another 300 for a motherboard to help the poor guys from Intel?
Those Z890 boards cost way more than 300 bucks, my friend 😂
Very true, but on the plus side you'll be able to get max perf out of your Ultra proc.
@@phatminhphan4121 Yeah, right now I've got a Z790 MAXIMUS HERO which I got for 600 EUR, and the Z890 version of it costs 1k EUR lol
@@iikatinggangsengii2471 JUST UNDERVOLT 14900 BRUH
excellent generation for data centers and enterprise customers, and very solid foundation for future gaming, I see about 35-40% power efficiency and about 7-9 degree cooler than previous gen CPUs...
Ok 😂 those data centers prefer AMD though.
9800x3d ftw
Im glad i did not wait for this launch and went with the 14900K
Good luck with RNG...
I’m hoping I should be okay as I only built my PC in June and soon after they released micro code updates. I’ve installed the latest update and thankfully haven’t had any issues with stability.
I really don't know what Intel was thinking with this one. They should have done a 14xx series refresh and just added an 'X' or something to the name, so customers know it is the new and improved 14 series without the stability issues. That would make a lot more sense than releasing a far slower new range with a pointless new naming scheme.
Don't forget the new socket.
@@rafa2657 A new socket does nothing for consumers though, it just means a new CPU requires an entire platform upgrade, unlike AM4 and AM5. So it is just another negative for customers
@@MaTtRoSiTy Yeah, but that's what I'm saying. Who is going to upgrade their 12th/13th/14th gen for this 15th gen, considering they'd need to change their whole platform? They should have done what you said and still kept the old socket.
@@MaTtRoSiTy I'm going to build a PC, and probably going with AMD; let's see the prices here in Brazil for this new 15th gen and its CPUs.
Having a 14600K, I don't see any reason for a further upgrade in the next 3 years.
Me too, but we're living with Russian-roulette chips.
@@BlackWGame Just adjust the voltage in the BIOS with the CPU Lite Load option by reducing the number; if the voltage is less than 1.299V your CPU will be fine.
I mean, I get that there is lower power consumption at the cost of performance, but if I'm considering the top-end CPU then I want the ultra performance! I won't care about the power consumption, because the other high-end components I'll be running already consume a hell of a lot of power, so I'm already aware of the electricity bill!
Arrow Lake is a definite regression, but it's not 100% bad: the power consumption is much lower, and in real-world gaming the performance difference will be minimal.
What Intel should do is lower the price immediately, so at least it will sell some units. You are still better off buying this than an unpredictable CPU like the 14900K, which degrades after a year.
I am more excited to see the 9950X3D; it is alleged to have 100MB of V-Cache for each CCD, so it should have a total of 200MB. That would make it a gaming and productivity monster.
Error Lake architecture 😂
In every single test (except Starfield), the 14900K was like 3-5% faster and used 25% more power.
I'm wondering if the 285K, when up-clocked, may beat the 14900K, but Intel just wanted to sell the power efficiency.
There's no point in getting an Intel CPU nowadays for gaming. 🤦 AMD really owns gaming CPUs now.
They're doing well, and the 9700X is a heck of a CPU.
If you only play at 144Hz, or even less, then Ryzen is cheaper, lower power, and still high fps.
If the price is right, that's all that matters.
-5 to -8% performance with 50% less consumption in the best-case scenario.
Please add Warhammer Space Marines 2 to your tests!
Thank you for a proper comparison, but in several games (Starfield for example) you need to set 720p cause GPU load is 96+% for both CPUs. Waiting for 1080p with a new 5090 card.
Seems to be more fireproof than the 14900K, so that's nice.
Ultra processors, ULTRA PERFORMANC.... Wait
I think everyone is waiting for 9800x3d
Excellent move by Microsoft and Intel. Release patches for PCs so that all 13-14 generation processors start to malfunction. Raise hype by scaring people. Now everyone will be afraid to take 13-14 generation and buy only new CPUs, although in fact these are the same 13-14 only with HT disabled and cut frequencies. What can I say, beautiful. The guys from Intel clearly understood that they would not be able to jump above their heads. Bravo!
Include AMD too; they also released trash CPUs this year.
I like the improvement in energy efficiency, but at the very least I need performance to be maintained, not made worse.
What I don't like about the 285k is the P-Cores only have eight threads. I'm sure some of these thread-gobbling games will take a small performance hit with this processor. I don't see this processor aging very well. Also, the power figure is tricky because these processors like to suck down more power from the 24-pin connector, which throws off the reported power figures in the overlay.
I think the 285k will end up being faster than the 14900k once new platform teething issues are ironed out (bios and os). For one thing there's a lot more thermal room for overclocking. Once we figure out how to oc these things...
5:31 clearly shows the Intel 14900K PC running Blender.
I love Intel but this is disappointing.
How could you “love” intel. Their 14th gen and now this gen have been an utter joke. There’s nothing to love that they’ve released in years.
How the fuck can anyone love a tech company that's anti-consumer and doesn't want to progress its technology for the greater good, only for money? Don't forget the time Intel stagnated CPU performance because they had no competition; but hey guys, new architecture, new chipset you need to buy every time, even though it's only 5% higher clocks every single time. And I'm currently using Intel 13th gen; 14th gen was a nostalgic move from them, along with all the f-ing problems that came from it.
@@ESKATEUK How is it a joke? It's a refresh of 13th gen, same as AMD with Ryzen 9000. 12th gen and 13th gen were a huge improvement. "Nothing released in years", average AMD fanboy.
@@XFXGX An average AMD fanboy who has never owned an AMD CPU or GPU in his life 😂 I've always owned Intel and Nvidia. I'm just not a biased fool like yourself.
@@ESKATEUK I'm not biased, I just look at the performance: product over brand. It's objective that their 12th and 13th gen were big performance improvements, and it's not true that they haven't released anything good in a while; years ago they were stagnating. 15th gen isn't much though, so far at least. Probably because they're nerfed in clock speed, 500MHz slower.
People talk about U9 285K loses vs 14900K but…
There is a 14900KS xD
Lul, but really, OCing a top-tier chip like the 9900K is always hard, and a factory-stable bin like the KS is noticeably different.
14900K not 14900KS
@@gabrielebarreca2111 I know but 14900KS exists
@@daemonx867 Yeah bro, nobody talks about the KS one.
Everyone is missing that this is a 20-30W power-draw reduction with only 2-5% performance lost, which is the right track right now.
If people are buying new CPUs, they're going to want absolute powerhouses. The Core Ultra 9 is like $600 and the R7 9800X3D is like $480, plus way lower power usage and way higher performance.
@@westernonion3338 Yeah, those are the rich and noisy people on YT (they are noisy, and I bet they don't have the top-end CPU, just making fun of Intel for the sake of the meme), but many people in the real world prefer lower wattage with a powerful CPU, since many countries have expensive electricity and hot environments.
Out of curiosity, if it's possible I'd like to see the performance of these new Core Ultra CPUs with the E-cores disabled. I don't think these CPUs will fare very well without being able to offload onto the E-cores (assuming games will even do that). Not having Hyper-Threading was a huge mistake in my opinion; that's the main reason why many old CPUs are still viable today.
Having Flight Simulator at full HD with a 4090 at just 70 fps is fckin wild.
That's because the game is CPU limited at 1080p.
Look at the 4090 utilisation, it's at 45%
look at gpu usage
Will you add the new Call of Duty to the tests later?
I wonder how they'd compare once 285K has 5.7GHz all core and matched ring bus clocks. Seems they're slowing the chip down. Need to see more OC comparisons
Sooo... when we are limited by the GPU, they are equal, but once we actually compare them in a less GPU intensive gaming scenario, the 14900k is considerably faster.
Not necessarily. They're essentially tied in Flight Simulator which is an incredibly CPU bound game. And the 14900k draws 40% more power.
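The simplest model behind this GPU-bound vs CPU-bound back-and-forth: the frame rate you see is roughly the lower of the CPU's limit and the GPU's limit, which is why two very different CPUs can look identical at 4K and then separate at 1080p or 720p. A sketch with hypothetical numbers:

```python
# First-order bottleneck model: displayed fps ~= min(CPU limit, GPU limit).
# All fps values below are hypothetical.

def displayed_fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

# 4K, GPU is the bottleneck: a faster CPU changes nothing.
print(displayed_fps(cpu_limit=160, gpu_limit=90))   # 90
print(displayed_fps(cpu_limit=140, gpu_limit=90))   # 90

# 720p/1080p, GPU limit rises: the CPU gap finally shows.
print(displayed_fps(cpu_limit=160, gpu_limit=300))  # 160
print(displayed_fps(cpu_limit=140, gpu_limit=300))  # 140
```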
Intel really went "Fine, I'll do it myself" when it comes to burying itself in this whole Intel vs AMD thing. And all they had to do was, not fuck up for like 3 times in a row.
Can't make this shit up man...
they should release Ultra Instinct next year..
1 game out of 10😂😂
Intel💪
and that game is starfield... using an engine from the year 2000
It would be interesting to see this Ultra 9 against my 9900K. A 6-year difference, after all.
The 9900K is more powerful xD
The 9900K will lose, of course.
Even the 14100 is more powerful in games than the 9900.
Yeah, yeah, with 4 freezing bottlenecks @@Leon26039
@@angrynimbus270 Its single-core performance is stronger than the 9900K's.
The power used by the CPU is impressive. This gen is not a leap, but it sets a great foundation for the next one.
If you have the 14900K, wait four more years to upgrade; right now it's not worth it if you only use it for gaming ✌
Can you test integrated GPUs of the 285K and 14900K?
The new generation of Intel CPUs is not worth it!!! The 13900KS is still the king!!!!
Thank you for this video! Could you do a test another time with the 285K OCed to 5.7GHz, with and without the E-cores, to see if there is any difference?
I'd keep the E-cores, really; they're very useful for low-load tasks, which will run at very low wattage.
The difference in most games is 5 fps (-10%), but with a much higher price. Don't always trust performance-percentage leaks.
Did you try the test with an OC, like the ring ratio and the other stuff that Roman der8auer tries?
People might not agree but this is a step in the right direction. The CPU pulls much less power, performance is worse but performance per watt is increased. The technology is improved for sure. It is all about what Intel would do with the next-gen.
I must say their 7nm is very impressive, like their 10nm when the 12100F was introduced, which beat most Ryzens at the time.
Everybody sees bad performance; I see nearly equal performance with better efficiency.
Normally, power efficiency is nice. It probably means better stability with cheaper parts. Though I can imagine companies just making the newer motherboards as expensive as the last model while supporting even lower wattage.
This is their first version of chiplets, or in their terms, tiles. Yes, it is bad; I think they have a problem with the latency between the tiles. In all honesty, I expected a cost reduction due to the chiplet design, but the pricing isn't great. Unless you are using it for production or are a hardcore fanboy, there is no reason to get this Intel generation.
I imagine using an external foundry - TSMC at the cutting edge node - 3nm is going to be pretty expensive.
I was comparing this video with the Core Ultra 5 245K vs Core i5 14600K one.
How do a Core Ultra 9 and an i9-14900K get less FPS than an i5? It's 75fps vs 119fps, quite a big difference, in Microsoft Flight Simulator. Am I missing something?
I have a quick question, can you please respond? With an Intel i9 14900KF and a 4080 Super on a 1080p 540Hz monitor, how much fps should I be getting?
Is APO working?
If it's anything like the 12XXX series, the Ultra 9 385 will be a beast.
True, OC capability is what Ultra is best at, probably why Intel removed HT for Ultra. But yeah, not everyone can OC, though OCing a K series is really easy.
@@iikatinggangsengii2471 I remember when the 12900K was barely any better than the 11900K, which was a massive letdown in the first place (in some scenarios the 11900K was worse than the 10900K, mainly due to being 8/16 instead of 10/20 on the 10900K). Then the 13900K came out and it blew the 12900K out of the water.
Well done Intel! You have a great future behind you!
Now run all the same tests again, but without ray tracing!!!
It always happens! It will get better with a couple of updates! It's not fully optimized yet! Remember, it's a new architecture.
It is the case; even optimizing one PC and game, such as mine, took years. Imagine all the systems out there.
Intel and Nvidia have significantly larger human resources to do this, so they're always ahead on day-1 optimization/performance.
So... same performance with less energy?
Must say that the 285K is an interesting one...
I'm so curious what went wrong:
1. Software sending work to the E-cores instead of the P-cores?
2. The new tile design has bad latency?
3. The new architecture is just bad for gaming?
I guess it's a combination of 2, or even all 3.
Core Ultra is your best CPU for gaming atm, now that they've been released.
@@iikatinggangsengii2471 sure.
They're way better than the 7800X3D...
they use way less power than the 7800X3D and give way more FPS.
Ohh wait...
Yeah, these new CPUs at their current price are gonna go great.......
Hey! Don't be so harsh on it, guys! At least the Ultra 9 has lower power consumption, isn't that something? RIGHT?!
Still ridiculously higher than 7800X3D tho--
Core Ultra runs at a lower wattage than most (all) Ryzen high-end chips.
A great year for Intel, 2024, one for the history books 😁 You really have to try to screw up like that!
I appreciate the much lower power consumption, but this CPU should be $150 cheaper.
Will you do 245 vs 14600 ?
It’s more expensive yet slower 😂
A great big NO THANK YOU to Intel, you really dropped the ball.
Looks great, almost the same performance but 50-60W less power. That's really good for a CPU.
Looking at the power draw makes me wanna buy the new 285K, but looking at its cost... nah...
I feel like in a year or 2 it will be good, since this new architecture is made from scratch. Windows needs to get patched for the CPU and motherboard drivers need to mature. Ryzen in 2017 was the same: they had to mature the BIOS and get better driver support on Windows. How the tables have turned.
If the power draw is a selling point for you, just get Ryzen; they're still nearly 40-100% more efficient in gaming than this new 285K, and faster. Intel is a joke.
@@Definedd I even use a negative PBO offset of 20, and now my Ryzen 5 5500 uses 41W max under load and only 12-15W in gaming. That is while still keeping the same default clock speed of 4.25GHz.
Intel continues to troll the living hell out of us lmao
but Ultra!
The number of cores and the type of architecture probably have an impact, but it's too early to say
🤔💭
looks like a good processor, power efficiency has increased a lot, but performance has dropped slightly
At least it's 2fps faster in Starfield, the most important game on the list.
Does the new Core Ultra CPU have the same issues with DX12 games crashing? It might be a bit slower, but if that issue is resolved with this new CPU, it might be worth it.
Keeping my 13900k another 2 years I see
my i9 13900k watching this video between laughs
Better 1% and 0.1% lows, less power usage (sometimes up to 50%), almost the same fps but beating it most of the time, runs cooler, and has better value compared to the older gen. It's now a LOT better in work tasks and rendering, stays cooler, and costs 140 more.
- What is this, Barrymore?
- Sanctions, sir.
Same performance for 40w less, not bad. THIS is what the 14 series should have been though.