👇❗PC Builds for Creators [Latest]❗👇
➡ $750 BB4B (Best-Bang-For-Buck) PC geni.us/750-1300BB4B
➡ $1500 BB4B Creator PC geni.us/1500-2300BB4B
➡ $2500 BB4B Creator PC geni.us/2500-3500BB4B
➡ $4000 BB4B Creator PC geni.us/4000-5kBB4B
One of the rare channels that cares about productivity.
Because it is rare to care
It's funny because right now the only workload that uses the CPU at 100% is productivity. The only sad news is that the review is with the 9900X, not the 9950X...
There are a lot more creator channels, but Tech Notice is by far the best.
@@christophermullins7163 if you had a job you'd care
@@Felale yeah I dont have a job
30 seconds in… ‘no gaming news’. Well done on not focusing on gaming, there’s a ton of other sites who do that but very few who focus on productivity/creativity/editing. Well done sir and keep up the (non gaming) good work 👍
He can't say right now that Intel is no good or AMD is no good, because then they won't sponsor him anymore... so we have to wait a while to see the truth!!!
Exactly ❤
I agree!
Fax
agreed, i too am a creator and this channel is great
Which means that if you have a Ryzen system and want XAVC 4:2:2 decode support, you can just toss in an Arc A310 and get the same decode performance.
This is very interesting, but what if you're running an Nvidia GPU? Can you dual GPU or something?
@@Healthy_Toki It's possible, but the PCIe lanes make it hard: both cards can perform worse because the 16 lanes get split to 8 each, unless you have a Threadripper or similar.
@@Healthy_Toki Should work just like if you had an Intel CPU with integrated graphics, and you'll have two GPUs. This is what I'll be doing, as it's looking like the 9950X3D might work best for my productivity workloads as well as gaming. Right now I'm working with XAVC 4:2:0, which both NVIDIA and AMD GPUs can decode. But if I upgrade to the A6700 then I'll toss in the A310, which I already have, or maybe Intel will have a Battlemage replacement I can go for.
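For anyone trying this dual-GPU route: here's a minimal sketch to check that decode actually lands on Quick Sync, assuming an ffmpeg build with QSV support is on PATH. The adapter index and the sample filename are placeholders, not known-good values.

```python
# Try to hardware-decode a clip via Quick Sync (QSV) and report success.
import subprocess

def qsv_decode_ok(clip: str, adapter: int = 0) -> bool:
    """True if Quick Sync hardware-decodes `clip` without errors.

    `adapter` picks the GPU on multi-GPU systems (e.g. 1 if an Arc A310
    sits next to a GeForce) -- assumes Windows/DirectX adapter ordering.
    Older ffmpeg builds may also need an explicit "-c:v h264_qsv" (XAVC)
    or "-c:v hevc_qsv" placed before "-i".
    """
    cmd = [
        "ffmpeg", "-v", "error",
        "-init_hw_device", f"qsv=hw,child_device={adapter}",
        "-hwaccel", "qsv",
        "-i", clip,
        "-f", "null", "-",  # decode only, discard the frames
    ]
    return subprocess.run(cmd, capture_output=True).returncode == 0

print(qsv_decode_ok("xavc_422_sample.mxf", adapter=1))
```

If it prints False, the clip fell back to CPU decode or the adapter index is wrong.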
I'm thinking of upgrading from an 11900K to the 285K; just hope it's worth it for what I use my PC for. I initially wanted the 14900K, but that CPU is a ticking time bomb. So glad I skipped 13th/14th gen.
what do you mean ticking bomb?
@gungunsaxena8340 meaning that the 14900K will fail sooner or later and it doesn't matter if you undervolt it or not.
@@robicelus Is it? Plenty of people have had them since launch on manual voltages with no issues. Wasn't it high voltages degrading them anyway? My 13900KF is still fine with an OC; I've had it since launch and it gets stressed a lot too, loads of R23 runs, OCCT runs, and benchmarks.
@@TheBURBAN111 Just because your 13900KF hasn't failed yet doesn't mean it won't fail in the future. Those chips have had problems from the very beginning, and they still have them. Good that it still performs for you. A buddy of mine has the 14900K and it failed after 1.5 years of normal use. Good luck.
A huge thank you for using official memory supported speeds. Subscribed.
The thing I hate about Blender CPU benchmarks is that nobody uses Blender to render on CPU -- the GPU will always be preferred because it's so much faster. Blender benefits more from higher IPC than from increased multi-core performance. Blender CPU benchmarks are irrelevant for understanding 3D/render performance because, in practice, nobody renders that way. Also, checking Blender's Open Data benchmarks -- it seems NOBODY has uploaded a single test using oneAPI/iGPU in Blender, so we could see how well the integrated Arc-based iGPU performs in the mix.
This. CPU rendering benchmarks in Blender are kinda useless. Something like baking fluid/cloth/rigid-body sims would be a better measure of CPU performance. And it's no harder or more time-consuming than comparing Cycles render times.
x2
Exactly.
@@nenoman3855 Problem is: these youtubers don't use Blender and don't know how to test the machine on those kinds of tasks. That's why they use the rendering demos and the Open Data benchmark. It's easier.
Not true. 3D software like Blender or C4D uses the GPU for rendering and the CPU for all the steps before that, like modelling and animating. Geometry needs a good CPU to be calculated.
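On the oneAPI/iGPU gap in Open Data: a minimal sketch of how you could point Cycles at the Arc iGPU through Blender's Python API, assuming a recent Blender (3.3+) with Intel's oneAPI-capable GPU driver installed.

```python
# Run inside Blender, e.g.: blender -b scene.blend -P enable_oneapi.py -f 1
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "ONEAPI"  # Cycles backend for Intel Arc / iGPU
prefs.get_devices()                   # refresh the detected device list
for dev in prefs.devices:
    dev.use = True                    # enable every oneAPI device found
    print(dev.name, dev.type, dev.use)

bpy.context.scene.cycles.device = "GPU"  # render on GPU instead of CPU
```

Render times from that would be directly comparable to the CPU and CUDA numbers people already upload.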
Tyvm for the review. Guess I'm going 14900K for DaVinci Resolve. Gonna wait till Black Friday to see if prices drop more. I do imagine the 285 will get a performance boost with updated BIOS, Windows updates, and RAM, but I don't imagine it's going to be worth the extra $300-400 over the reduced price of a 14900K build.
What about the 13th/14th gen Intel problems?
@@seendrome My thoughts exactly… those stability issues are a major reason why I'd not want to buy into the 14900 and waited for the new ones… but I'm holding off to see if there are any fixes for the 285K
Please do a comparison in timeline performance with different codecs between these cpus. Editors spend a lot of time on the timeline, so any performance increase and better smoothness is useful. Also 9950x comparisons would be really useful - could you get a used one and then return it?
X2 "Editors spend a lot of time on the timeline, so any performance increase and better smoothness is useful" same for AE or other compositing software
I keep rewatching videos again and again. Your chapters are a lifesaver. Thanks a lot
Will you cover 245K and 265K? Thank you for the amazing work that you do.
Your channel is so useful for the creative end of hardware. Thanks mate
That's exactly why I love this channel so much. There are so many other things you can do besides gaming. When I look at the reviews of the other tech-tubers, I get the impression that there are only gamers out there. Massive thanks for not testing the gaming performance like all the other lemmings out there.
When you see gaming benchmarks on this channel, feel free to unsubscribe! :)
No 9950x benchmark, no igpu comparisons with AMD…why?
Most creators use either a Mac or a laptop, and gamers are mostly the only people looking to buy a PC, so no wonder everyone just tests gaming benchmarks.
I was looking for this exact kind of video!
😉
I have an old Dell 8930 with an i9 9900 and RTX 2080 GPU with an Eizo 4K monitor, which I use to process photographs. This computer processes images just fine in Photoshop. However, it is challenged when using Capture One, as manifested by a noticeable delay when changing settings in the program. Programs such as DxO PureRAW 4 that draw on the GPU require shutting down Photoshop and Bridge to run efficiently. My point is that Photoshop isn't the weak link in the chain when it comes to limiting performance in applications that photographers frequently use. It would be much more useful to know how well Capture One, DxO PureRAW 4, and Lightroom's noise reduction algorithms run on a 285K. Thank you for your consideration of this topic.
Watched most of the review videos about Core Ultra, but I was waiting for your video to drop. Thanks
Glad I bought the 14700K CPUs when I did. Thanks for this Lauri.
Thanks for pushing oxidized CPUs on people who are not familiar with that problem.
Be sure to save the receipt and RMA it when it starts freezing and dropping clocks.
I bought a 13600K before the issues surfaced on the web. So far no problems, but it's the last time I'm buying Intel CPUs.
@@Alex-wg1mb I've never OC'd mine, and I use them for work, so I'm not a teenager trying to max out my FPS in some game.
I actually use them as working tools and earn a living with my hardware, and have done so for several decades now.
Thanks for the childish sarcasm though; it reinforces my low opinion of internet tools who think they know everything and like to prove the Dunning-Kruger effect correct regularly.
@@WireHedd My supervisor replaced his two 13900Ks because of the issues at stock specs. What's your argument then?
I've been Intel all these years. It's time to go to AMD.
Which CPU are you upgrading from? Zen 5 wasn't really worth upgrading to either, unless you want X3D for gaming.
@@oneanother1 not upgrading. Building one in a few weeks. Going with the 7800x3d.
Now sitting on an 11-year-old system, and age is starting to take its toll,
so I'm going to build a whole new system; I was just waiting for Zen 5 and Intel's new CPUs to come out.
I play extremely little, but use the computer for photo processing.
I'm leaning towards switching to AMD and Zen 5, partly because they don't change the socket every time, and Zen 5 seems to be the best all-round CPU.
@@6XCcustom Nice! My wife's been wanting a PC for light gaming and watching YouTube on the 2nd monitor, so she won't be needing anything hardcore. So AMD is the way!
Not everyone games @@oneanother1
Thanks for the video! It will be interesting to see the timeline performance of the Intel Core Ultra 9 vs the i9 14900K vs an Apple chip. Keep going!
Agree and disagree. If you are looking for an upgrade, buy AMD Zen 5. The E-cores are a BIG problem in my work: I always have to open Task Manager and disable the E-cores to make my programs run "correctly". When Intel offers a CPU with only P-cores, I will be interested.
They offer P-Core only CPUs
@@kovacspis In the Xeon lineup, of course.
@@sophustranquillitastv4468😂
Intel's 14th Gen Desktop CPUs Now Available In P-Core Only Flavors, Core i9-14901KE Leads The Pack With 8 P-Cores & 5.8 GHz Clocks
Core i9-14901KE.
Core i9-14901E.
Core i9-14901TE.
Core i7-14701E.
Core i7-14701TE.
Core i5-14501E.
Core i5-14501TE.
Core i5-14401E.
AMD said it best: E-cores = economy-class cores
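If you're tired of the Task Manager dance every session: a minimal sketch that pins a program to the P-cores instead, assuming Windows enumerates the 8 hyper-threaded P-cores as logical CPUs 0-15 with the E-cores after (common on Raptor Lake, but check your layout in Task Manager first). The executable name is a placeholder.

```python
# Pin every running instance of an app to P-cores only (needs: pip install psutil).
import psutil

P_CORE_CPUS = list(range(16))  # logical CPUs 0..15 = 8 P-cores with HT (assumption)

def pin_to_p_cores(exe_name: str) -> None:
    """Set CPU affinity to the P-cores for every process named `exe_name`."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == exe_name:
            proc.cpu_affinity(P_CORE_CPUS)
            print(f"pinned PID {proc.pid} to CPUs {P_CORE_CPUS}")

pin_to_p_cores("MyDAW.exe")  # hypothetical executable name
```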
The price of upgrading the mobo and paying twice as much for these CPUs is way too much for what you get. 12th gen is such a good bargain right now.
I was going to get a 13600, an upgraded version of the 12700 with 2 more cores, but I read about the problems 13th/14th gen had. I actually only found out about the problems last week when I watched benchmark videos; I only watch the numbers, so I don't usually know about a GPU or CPU having problems. I just happened to be reading on a forum about whether I should buy a 12700 or a 13600, and people were talking about the problems. Even people with new CPUs still get those problems even with the BIOS update, so I thought I might as well stay away from those and get a 12700. I bought it last week but haven't used it yet because the cooler arrives today.
12th gen was the last good jump in chips while staying stable.
@@cmoneytheman 12700K early adopter here; glad I went with it. Built my daughter a system with it just last Christmas as well. Will ride or die (maybe try X3D for the next build) with this CPU. Now it's time for me to buckle up for this wild Nvidia ride that's about to start!
@@jlp8648 I was getting drops in Killzone on RPCS3 at 10K resolution with my 12400; my fps would drop to the 40s and back to 60. With my 12700, I hardly get any drops at the same resolution. TimeSplitters on Dolphin at 8K gave me drops with the 12400; I no longer get those with the 12700. L4D2 would drop from 120fps to the 60s with multiple graphics mods on a 3080 RTX/12400; I no longer get those big drops with the 12700. This CPU is a monster. I didn't expect it to stabilize my fps so much, but I was hoping it would.
Even if you don't like AMD, just grab a 7600X; still faster than the 13600K and also cheaper. Not being a fanboy, it's just that the truth hurts! 🤦♂😅😅
@@cmoneytheman It's mainly i9s that were affected, due to high stock auto vcore, MCE blasting an all-core OC, and the microcode issues; it degraded a bunch of 13900Ks and so on. My 13900KF is still fine with an OC, but then again I use a manual vcore and LLC so voltages stay safe. Still scoring well above stock in benchmarks, and it doesn't use 300W under full load; it sits around 278W in stress tests.
This is nuts; it's almost like Adobe doesn't have a single PC in their building. For Lightroom, are the slower speeds a result of software optimization that could get better down the line, or is it just the chip's architecture?
The ugly part of the situation is... 14th gen was faster but degrading... so with the new "upgrade", people would be downgrading performance while also scrapping the expensive CPU+motherboard combo from 14th gen. It literally sucks.
Another issue is whether Arrow Lake is a one-and-done platform or will support more CPU gens. As cheap as last-gen platforms have become, Arrow Lake will need to come down in price. The power efficiency is nice to see.
@@oneanother1 Maybe... but Intel would have won the situation if they'd stuck with the same LGA 1700 socket for the newer-gen CPUs too; people wouldn't have had a hard time quickly ditching 13th/14th gen for the new ones... an easy upgrade, especially when some customers are getting cash back for their RMA'd CPUs, and the motherboard wouldn't be wasted. But sadly, the entire combo goes in the trash.
@@realracing3specter295 But is it worth it for a new build though?
@@neetkun5331 Absolutely not... but my point is, what will you do if your existing 13th/14th gen CPU degrades to the point that it's not usable? Obviously you'd RMA it, and the replacement might still be faulty.
@@oneanother1 I believe it is a one-and-done socket. Next gen is supposed to be the good one.
Will you be making a Premiere Pro timeline performance video with this CPU? I'd love to see how smooth video playback is with the new iGPU.
This man is big time, whenever you get an item early that the average person can't get
Good video, I like that you focused on something other than gaming…
Thank you so much. Love your videos, love your benchmarks. So good! 🍻
12:50 please make the positive numbers green next time, otherwise great stuff
I’d like to see you talk about Intel’s idle and low load power draw vs AMD.
It is an interesting experiment; waiting for the overclocking video. There is also a new Xeon lineup that looks like it could take Threadripper's crown; it would be so cool to see you review those as well!
Can't wait for your testing of the new MacBook M4 Max. That has some savage numbers.
Oh yeah...
Thank you, I don't need to know how this does in Final Fantasy!
Now we can see the difference with the new Core Ultra, thanks Lauri 🎉
NO. MY Z690 GIGABYTE MOTHERBOARD FOR MY 14900K WAS ONLY $300 AND CAME WITH 10GB ETHERNET. THE NEW Z890 BOARDS ARE $700 TO $1000, AND IN SOME CASES YOU DON'T EVEN GET THUNDERBOLT 5
The Aorus Elite Z890 and MSI Tomahawk Z890 only cost 350 euros.
How is it an upgrade if it's a downgrade? 🤔
For sellers it's an upgrade 😂
These are the fixed 13th/14th gen chips, it's easy to see.
So here's a question. I am waiting for the RTX 5090 next year and keen to avoid the 14900K series. Do we think Intel will follow up the Core Ultra 9 with something better than the 285K next year that's worth holding off for? Seems the 285 is the entryway into something bigger.
The fact that the title says "upgrade" is a lie. Literally nothing except the Intel-specific AI is faster at all.
the only video I was waiting for.... non-gaming review
Sitting on an 11-year-old system, and age is starting to take its toll,
so I'm going to build a whole new system; I was just waiting for Zen 5 and Intel's new CPUs to come out.
I play extremely little, but use the computer for photo processing.
I'm leaning towards switching to AMD and Zen 5, partly because they don't change the socket every time, and Zen 5 seems to be the best all-round CPU.
Zen 5 is the better option.
@@iLegionaire3755 Agreed - especially with the new platform - you will still be able to plop in a new CPU 1 or 2 gens down the road (if it makes sense to do so)... I think it's a no-brainer.
My 14900KS is all I need 😊
Core Ultra 7 vs Ryzen 9 7900X.
Next video on this, please.
Since I just have a Ryzen 2600 six-core CPU... this will be my first big leap of an upgrade
My Intel Core i7-7740X on the X299 chipset still works great. :D I did hope for much better reviews/performance from the new Intel 200 series CPUs though… So I would consider a 14700 / 14th gen CPU, for less money and sometimes better performance than the 200 series. If going for the 200 series, I am looking at the 265K. A bit of gaming, but 3D design and printing is the current focus.
Ahhh - I've been waiting for this! I actually have a 14900/ProArt system in the shopping cart and need to know if I should cancel it or not!
On Z890 you can upgrade to the next-gen Intel Core Ultra 300 or 400.
Cancel and go for the Ryzen 9 9950X.
Cancel and build one. Same functionality, same power, much less expensive. Slapping "creator" or "gaming" on things doesn't change one thing about the parts. If you would rather just buy a prebuilt though, go ahead and order it. I would explore other options, but for a non-RGB prebuilt, eh, sure, I understand.
I don't trust 14th gen anymore. So go for Arrow Lake. I think overall it is still better for creators than AMD.
Do you recommend 14th gen even after all the issues this year? I know they recently released another update that allegedly resolves all the issues, but can we trust Intel? I was considering upgrading my rig this year right when the Intel issues popped up, then considered switching to the 7950X, but the 14900K is such a great bargain at its price. I guess I could buy an extended warranty with Newegg and hopefully never have to use it, which might give me peace of mind. The only reason I'm considering the 14900K is to avoid switching platforms, so it would be cheaper than hopping to AMD. Either way, I was planning on pairing it with a 4080 Super.
I mostly do 2D and 3D illustrations with Clip Studio Paint, dabbling in Blender. I also recently started fiddling with animation, but these are super short, like GIF length, just to play around with it.
You should be fine with the new updates.
I have a dead 14900K on my carpet; trust me, it's not worth it. Get the 9950X for productivity, and Blender especially, it smokes Blender.
Sure………upgrade to downgrade 🤪. That’s a no-brainer.
That 2fps difference is driving you crazy
Hot garbage. Let's be honest, I'm keeping my 14900K.
Well done. It's not a gaming processor and it should not be advertised as if it were.
The single core speed is very enticing to me because I spend a lot of time working with older financial software.
I am waiting for Intel and Microsoft to release a patch soon to make these results consistent across the whole performance line.
I have an i9 14900k with a Noctua NH-U12A air cooler. In Cinebench R24 I have a single core score of 136 and multicore 2117. Your multicore score for the i9 14900k was 1906, which seems a little bit low for a liquid cooler 🤷♂️
He didn't have it on "self destruct" power limits 🤣 But to be serious - it really depends on mobo defaults with 13th and 14th gen.
Where is the 9950X? Why the 9900X?
He explained in the video that AMD didn't send him one. Also, I believe the 9950X is $700-ish and the 9900X is $500-ish. Either would be out of the price bracket, and Intel deliberately priced into those gaps.
And I highly doubt his AMD scores.
@@TheTechFolk The 9950X still falls behind.
@@alderlake12th No it doesn't; check other benchmarks as well. Intel is getting blown out of the water, with worse power draw at that.
@@KrYPTzx I'm not going for the gaming. It's still a monster for multicore, and the 285K is still great at power efficiency.
Have you checked the DaVinci Resolve LongGOP score in the Puget review for the Core Ultra 285K? There it's pretty much on par with the 14900K. Also: AV1 encode was added to the media engine. Might be worth mentioning.
Great review!
I don't have any recent computer; should I future-proof with the 285K or save money with the 14900K??
The key for me is Resolve Studio editing with RED RAW and the Canon R5 C codecs, as that's what my material is in. Puget's results showed little RED RAW improvement when the 14900 came out. Also, is there still an Intel hardware-support advantage for the H.264/H.265 codecs over AMD? What's the reality for codecs in Resolve, as asked above, since the general results don't reflect the flat RED RAW numbers?
TechNotice missed the value of the NPU for creators. Play your games on a 9800X3D. Do your editing on the Intel Core Ultra 9 285K with NPU-enabled software.
I'm very interested in upgrading to the Ultra 9, but I have a 5950X in my system already. I'm wondering if I would see any performance upgrades?
You definitely would, but nonetheless the competition from the 7950x and 9950x can’t be ignored.
I just upgraded to the 285K from the 14900K. It's worth it for the energy efficiency alone; my AIO is much quieter now.
Why not just go AMD?
@@Valk-Kilmer It's a good upgrade for me: much better performance and a new socket. Always buy new.
AMD 9000 would be better if you're after energy efficiency
@@Valk-Kilmer AMD is for gaming.
@@r25012501 AMD uses way more power at idle and low load, like 3-4x as much as Intel. This is something they've tried fixing but can't.
For single-threaded work, a good choice.
Would it be worth upgrading to this from a 12700K? I produce music and play games every now and then... The fact that the new Z890s come with Thunderbolt ports makes using Universal Audio interfaces a dream without expansion cards.
I'm coming from 4th gen Haswell (i7 4790K). Two options I have in mind currently: Core Ultra 5 245K or Ryzen 9700X. The power efficiency and iGPU (4 Alchemist Xe-cores) are plus points for Arrow Lake. The AMD Zen 5 CPU has better fps output and much lower power consumption, but the iGPU (RDNA 2, 2 CUs) is lacking. Tough choice, lol.
It depends on your usage. If you mostly game, get an X3D from AMD; if you mostly work, I think Intel. And there will be more performance from these chips as the BIOS, firmware, and drivers mature. Also, in the future, faster memory (up to 10,000 MT/s) should bring some extra performance too.
I think the results in the 3D benchmarks are very good, but the problem is the crazy wattage that comes with them; at around 250W, that's not ideal at all.
i7-5820K here, waiting for a new CPU with at least 28 PCIe lanes that isn't a heavily overpriced workstation-tier platform.
Hi, I came across another review that had a different perspective.
The production quality is soooo good!
Now you’re talking!
I have a 3900X and an RTX 3080. Recently I've been working a lot with Sony HEVC 10-bit 4:2:2, which only Intel can hardware decode. I was really waiting for this CPU to get over my troubles with editing, but the reviews are not convincing enough. I think I will wait 2 more months until all the necessary updates are released, and then I will go for it. Anyone else struggling with 10-bit 4:2:2?
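Same boat here. If it helps, a minimal sketch to check which clips fall in the "Intel-only hardware decode" class (H.264/HEVC 4:2:2 10-bit), assuming ffprobe is on PATH; the filename is a placeholder.

```python
# Classify a clip by codec and pixel format using ffprobe.
import subprocess

def decode_class(clip: str) -> str:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "csv=p=0", clip],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    codec, pix_fmt = out.split(",")
    if codec in ("h264", "hevc") and "422" in pix_fmt and "10" in pix_fmt:
        return "4:2:2 10-bit: Quick Sync (or CPU) only, no NVDEC/AMD decode"
    return f"{codec} {pix_fmt}: most GPUs can hardware-decode this"

print(decode_class("C0001.MP4"))  # e.g. a Sony HEVC/XAVC clip
```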
My i7 4790k needs this.
Intel is releasing new sockets at the pace of a German sinking beers at Oktoberfest. Getting fed up.
And I had to replace one of my 14700Ks after it completely failed on me. My next socket is AMD AM5 for sure.
Off-topic question: what monitor would you suggest for photo and video editing, with a max budget of 500 euros?
Doesn't matter, they won't sell well, unfortunately
So I'm considering getting either of these CPUs or the 9800X3D. I will be doing very light editing but will only use the free version of DaVinci for the moment; does that mean it doesn't matter whether I have the Intel iGPU or not?
Price is not a differentiator; I can get a discount on the Intel CPUs, so it's similar to the 9800X3D.
I could see myself using this CPU for Blender but then I would be using my RTX 4070 in my Linux machine for Cycles rendering.
Yes you should upgrade. The 14900k is cheaper with better performance
Upgrade from what to what?
what? 😂😂😂😂 Intel fanboys ahahaha
These are basically fixed versions of the 13th/14th gen chips
@@cmoneytheman I would not buy them either. Or wait at least 4-6 months so they iron out bugs and other stuff.
The AMD 7000 series is OK with a discount.
@@Alex-wg1mb I was getting drops in Killzone on RPCS3 at 10K resolution with my 12400; my fps would drop to the 40s and back to 60. With my 12700, I hardly get any drops at the same resolution. TimeSplitters on Dolphin at 8K gave me drops with the 12400; I no longer get those with the 12700. L4D2 would drop from 120fps to the 60s with multiple graphics mods on a 3080 RTX/12400; I no longer get those big drops with the 12700. This CPU is a monster. I didn't expect it to stabilize my fps so much, but I was hoping it would.
Where's the 9950X? Anyone getting Ryzen 7000 or 9000 is getting DDR5-6000 CL30 at a minimum. People using the 14900K have to worry about the longevity of the silicon, and people using the Ultra 9 285K are dealing with BSODs right now, but are also using DDR5-8000. This review was "bleh" at best, and that's being generous.
Maybe Intel, AMD (Global Foundries) and Nvidia can pool their resources to build more advanced fabs that could eliminate the TSMC's practical monopoly. Maybe they can get funding from China and Russia who would be open to having a chip manufacturing base of their own. Business opportunities are there, only if someone notices them.
First up, looking to replace my Gigabyte Elite Z390 MB: bent pins, stopped working after 4 months of daily blue screens from the Aug 2023 updates. WTF, all I did was clean the GPU watercooler radiator and the bastid wouldn't boot. Took days of sweat and tears; in the end only one 8GB stick in RAM slot 1 worked, and now it's finally fooked, won't boot at all. The end of 6 years' worth; now back on the old DDR3 PC for now.
Now either spend $200 for another DDR4 MB, or? Spent hours looking at a 13600K (AU$309), a B760 MB ($160) and DDR5 RAM ($150) = $630 approx. That's an extra $430, but at least it's faster than my old i7-9700K & 2080S mobo with 32GB RAM. "Crazy upgrades", more trouble than betting on a 3-legged horse race these days.
Can't blame AMD for ghosting
Why not include the more comparable 9950X?
Explained in the video? 🤔
You said the Ultra 200 has the same iGPU as the 14900K. Intel's new notebook CPUs were released with a much better iGPU. What is the future of Intel desktop?
Interesting how you didn't put the 9950X in the comparison but made sure to include the 9900X, which has literally HALF the total core count of the 285K, you shill.
9950X costs a lot more so it’s in a different class and AMD didn’t send one.
@@Felale It doesn't cost a lot more; they're comparable in price by MSRP, and the 9950X has already received a price cut, and you know it. Those two processors are direct competitors, and it makes no sense to leave out the 9950X when you're reviewing the 285K.
So, just because AMD didn't send one, he won't include it, but will include a 7950X and a 9900X? Interesting.
@@Felale At current prices it's a $10 difference: $599 for the 9950X and $589 for the 285K... not sure of your financial status, but $10 isn't a lot to me, and I'm poor.
What about the Ryzen 9 9950X??
Why don't you show the idle power consumption anymore?
6:47 is my GPU or monitor broken, or is it just the video?
The video; uff, I thought I was the only one :DDD
Thanks, thought my laptop was dying.
😂 Just built my PC, and when that happened I thought my PC was having problems too
@@MariusNinjai haha that's a nightmare
Nope, it's a flaw in the video. Thanks for asking this question, because I would have wondered myself.
I think they should test memory at up to 10,000 MT/s, not the old 6400. These chips will benefit from faster memory.
Wanting to upgrade from a 12700K to the 285 or 14900K... After Effects 90% and gaming 10%. Is it really worth the extra money? Would be grateful for an opinion.
The 14900K is much better for your money; no need to change RAM and mobo! :)
@@theTechNotice Thanks, I intend to though; gonna sell the 12th gen case and go DDR5.
Very poor choice of memory... it doesn't make sense to select the speeds you chose... the "officially supported memory speed" argument doesn't have any technical merit when, for example, DDR5-6000 works on every AM5 platform.
For Unreal Engine 5, which one should I choose between the 265K and the 14700K?
I'll save everyone 25 minutes and 15 seconds. Buy the 9950x.
If you only game, sure.
@@thetheoryguy5544 Wrong. If you game AND do productivity, then the 9950X is the best option, even over the 14900K, believe it or not.
@@JynxedKoma Not what the results say.
@@thetheoryguy5544 If you're looking at 1080p maybe, but at 1440p the Core Ultra 9 285K gets destroyed by the 9950X.
@@JynxedKoma Wrong. The 285K is on par with the more expensive 9950X and it beats the 9900X, and that's without any software optimization, whereas AMD has been out for some time now and devs have had more time to optimize their programs for it. In a couple of months the 285K will beat the 9950X as well.
'SHOULD YOU UPGRADE?' you mean downgrade 😂
So
We gotta wait for DaVinci Resolve optimization for the new Intel chips... 😅
What happened to the 9950X? It should be there.
Bro did the Arrow Lake review before Zen 5, when those chips have been out for almost 3 months 💀
One thing I've never quite got my head around: if I have an NVIDIA GPU, do I need to tell Premiere to use the iGPU for decoding H.265 video, or will Premiere use the iGPU for playback and the NVIDIA GPU for encoding?
It seems like most reviewers I've seen have only gotten the Intel Core Ultra 9 (U9) and Ultra 5 (U5) processors. What's with the Ultra 7 (U7) processors? I've only seen one reviewer, Techtesters, review the U7. Can we just drop the "Ultra" and call it "U" and save the energy of one extra syllable? :)
Here in Canada, the Core Ultra 5 245K is $440 + 12% tax on top. Is the LGA 1851 socket a dead-end platform that's only for this generation? If it weren't, I'd think Intel would use it as a marketing plus point. Power usage is not always a key point everywhere in the world. I am looking for a productivity rather than a gaming system for a friend whose system sorely needs replacing. The issues this CPU had with some motherboards and benchmarks were, I hope, down to engineering samples and fixed in time for the retail release.
I have been building AMD & Intel systems since Pentium 133 days, and quite frankly these days the system-building flame is wavering. It seems both Intel and AMD have fubared their product releases.
Why did Intel choose such complex naming? "i9 14900K" is much easier to understand than "Ultra 9 285K".
Is the U5 245K viable for productivity in its price range, or is there anything better?
This is what you get; these are fixed 13th/14th gen chips.