I'm also thinking about trying an AM5 build now, since I've had the 5800X and am using a 5700X now; I'm giving my nephew a 5700X PC build next month! The 5800X killed a 240 AIO before I learned how to undervolt it. It made the cooler actually whistle, the fans were spinning so fast, and then it smelled like coolant and I didn't dare risk a leak. I liked the undervolt performance and tried to add extra case fans, but forgot to turn the power switch off. I tried to lift the cooler off gently but accidentally yanked it, and heard the power-slap sound as the CPU fried. Didn't bend any pins, just fried!
The 245K is a really embarrassing chip. I don't understand why Intel even released it. Maybe they should have waited one more year and released a product that is more powerful in at least 80% of cases versus the equivalent prior-gen CPU.
Intel has much less software support, and this new architecture is waiting for software to catch up. The problem is always too much competition in the industry: software is propelling AMD's chiplets while the more complicated chiplet design with more potential is left in the dust.
167W for the 14600K? My 12600K pulls 160W stock in Cinebench; weird stuff, I thought 14th gen had higher TDPs. In any case, the power readings for the new Intel gen are a bit complicated, and actual draw turns out to be higher than what HWMonitor shows. I'd advise watching Gamers Nexus on that; they made a whole video and a section of another video about it.
The 7800X3D is $477 on Amazon right now, more than a 14900K. It makes no sense unless you're a gamer who must have the best and can't wait two weeks for the 9800X3D.
@@c.m.7037 Not really. The 7800X3D outperforms the 7950X3D in games half the time and is less complicated because of its single CCD with extra cache. Games don't typically utilize more than 8 cores anyway (that's what the consoles run), so on the 7950X3D you'd want to coerce them to run only on the lower-clocked V-Cache CCD and avoid the higher-clocked CCD with the smaller cache, which has to be done via software because the scheduler will otherwise pick the higher-clocked cores. Even if there were extra cache on BOTH CCDs (as rumored), the performance penalty of crossing the Infinity Fabric (or otherwise dealing with cache locality) would probably negate any boost from extra cores, at least in most modern games. The 7800X3D is currently the best gaming CPU regardless of cost (hence the scalper prices), and the 9800X3D will almost certainly wear that crown next.
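To make the "coerce them via software" point concrete, here's a minimal sketch (Python) of working out which logical CPUs belong to the V-cache CCD so a game process could be pinned to them. The CCD-by-CCD numbering is an assumption, not a guarantee; check your actual topology (e.g. with `lscpu`) before pinning anything.

```python
# Sketch only: assumes logical CPUs are numbered CCD by CCD, which is
# a common but not universal layout. Verify the real topology on your
# machine before applying any affinity mask.

def vcache_ccd_cpus(cores_per_ccd=8, threads_per_core=2, ccd_index=0):
    """Return the set of logical CPU numbers for one CCD."""
    n = cores_per_ccd * threads_per_core
    start = ccd_index * n
    return set(range(start, start + n))

# On Linux, the stdlib can then pin a process to that CCD:
#   import os
#   os.sched_setaffinity(game_pid, vcache_ccd_cpus())
# On Windows, tools like Process Lasso (or AMD's chipset driver with
# its game detection) do the equivalent job automatically.
```

On a 7950X3D with CCD0 carrying the V-cache, `vcache_ccd_cpus()` would give logical CPUs 0-15 under this assumed layout.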
Come on, man. Put up a link and buy a 9700X or 7900X; it's not hard. You can resell it in 30 days; you're out of pocket $50-75 and 30 minutes of extra hassle. BTW, not getting free stuff would let you experience what everyone else has to, and it would also lead to less bias, because it would mean you don't have personal relationships with these companies.
Intel & AMD both need more improvement, and Intel's price is too high for this spec 👎 The Mac M4 is rocking the field at a lower cost; you can buy a Mac mini M4 for the price of an Intel CPU and graphics card.
No, they should buy a damn Arrow Lake, the new motherboards coming out for it, and the new CUDIMMs, and laugh all the way to the bank. Then when Intel drops their new Arcs, buy that too.
The fact that you basically begged AMD to send you a 9xxx sample multiple times in this video was not funny at all; it made me stop the vid and move on. I like your reviews/tests, but a 230k-subscriber YT channel begging for free stuff is kind of lame. No offense intended. I will continue to watch your videos, but man, you can afford a 9xxx AMD CPU: just go buy one and do a proper review.
Actually it's a bit more than that; for some reason AMD chooses to leave some creators out... The thing is, they worked with me on the Threadripper launch, which was 10x more expensive if not more, but this time, cold silence. Why this annoys me is that it makes me look Intel-biased, simply because team red samples are harder to get. I'd be just as annoyed the other way around, if Intel were keeping me out of the loop. It stings especially when you've got a relationship with a company and have worked with them in the past, but they're blanking you at the moment. Hope this makes more sense!? Appreciate your comment nevertheless!
I'm amazed that the 7800X3D could win anything against Intel CPUs. I have one, and it has low clock speeds and a lot fewer cores, regardless of SMT. Intel is superior in productivity because they're throwing 2x+ the physical cores and a lot more watts at these applications. I'll stick to gaming 😂💪
Intel is gaining more and more interest among people who work... Ultra 200 hasn't even gone on sale yet; there's no optimization, zero optimization, no drivers... but it's already fighting monsters in work tasks.
@@groenevinger3893 I think this guy thinks Tech Notice is biased towards Intel, or he's just an AMD fanboy who can't stand AMD losing at anything, since the benchmarks clearly show Intel beating AMD in work loads but losing in gaming. This channel is all about work, not gaming, and that's where this new-gen Intel mostly beats AMD, even the 9000 series...
But why not use a Ryzen 7700X or 9700X instead of the 7800X3D? Creators don't care about 3D V-Cache processors; as far as we know, they're not great for editing.
He didn't get 9000-series chips from AMD, so I think he created the comparison with whatever he had.
@@Happy-g3e5d agree
@@Happy-g3e5d A $150 8700F from AliExpress is going to have similar performance to this 7800X3D in work loads.
Guys, in the new benchmarks the 9700X shines; see Tom's Hardware's and Hardware Unboxed's new Intel benchmarks. The 9700X beats many expensive CPUs (with the newly updated BIOS microcode for the 9000 series).
@@bijoyhossain93 Editing "what", specifically? 16K HDR 120 FPS footage in Premiere Pro or DaVinci Resolve? Throw in some After Effects running alongside Premiere Pro as well. Actually, the 7800X3D feels good enough to do that, as long as you're pairing it with an RTX 4090 to take advantage of hardware acceleration. No idea what your problem is.
Great round-up; it's really interesting to see creator- and professional-focused benchmarks for the Core Ultra 245K. Clearly it's better suited to that than gaming (right now, at least).
There's no point in getting a "gamer" CPU like the 7800X3D... unless you still play at 1080p. Lol!
As someone who uses Blender, Fusion, and Plasticity every day, I'm excited to upgrade to a 245K from my 6850K, despite what everyone seems to think.
I'm upgrading from a 3900X to the 265K. Those of us on older hardware can upgrade with a clear mind; those on modern ones, not so much.
Very Good
Wait 4-6 months before buying this platform. Let the vendors fix the bugs and hitches. Prosumers require a stable environment. The AMD 7000 series is quite good now.
@@Alex-wg1mb I preordered the CPU, but it's gonna take months to save up for the motherboard I want. I'm sure any bugs will get fixed long before I ever actually get to use the CPU.
@@Alex-wg1mb Thanks man, but my mind was made up after the first review I saw showing Blender performance. I've been spending the last few days tuning and learning about all the new parts of the architecture. The BIOS is like being a kid in a brand-new candy store.
12th-gen Intel has been selling at super low prices lately. Right now 12th gen is the biggest creator bang-for-buck, with proven reliability, IMO.
Also, they're very good at gaming as well (with DDR5); DDR4 holds them back even in games. The i7-12700K is my favorite CPU I've ever owned; the performance per watt is crazy for such a beast.
A 14600KF for $200 looks like really good value, but I'm waiting to see how the BIOS updates fix the instability and degradation.
@@xanira6367 My 12700K has AVX512 instructions because it's an early 2021 batch 🤪
@@xanira6367 bro it's already fixed
@@GameOnBoy0228 yeah I think it might be safe to go ahead, but better to be 100% sure
I was watching Coreteks' review of the 285K, down to how the E-cores and P-cores are positioned, and I thought the 285K could be a great starting ground for Intel. But there are so many issues with the 285K in terms of latencies and how the L3 is accessed that I believe this will be a one-off design. It's unlikely this will be built upon in the future; they will need to rebuild the architecture into something else, even the tile concept.
Cost for cost, the 265K is the competitor for the 7800X3D, though. If I'm spending $400-430 on a CPU, I'm going to take a tiny gaming FPS hit to get 2x the performance in multicore rendering, live audio processing, video editing, etc. Every time.
This is, in my opinion, absolutely the best choice one could make. That 10 fps difference in games is basically nothing when these CPUs offer so many better features and speedups in productivity.
This is why it confuses me how often the 7800X3D is recommended. It's the best-selling CPU everywhere, being purchased by people who would probably be better off with something like a 13600K or a 7900X.
No one goes with a 7800X3D for content creation. You should have tested with the similarly priced 7900X instead, which of course crushes the 245K in everything, including gaming.
I will wait until all of the "next" generations of hardware are released before deciding if I want to upgrade anything in my system.
Never thought AMD would overtake Intel, but they have. I'm a creator and gamer, so I will stick with my i9-13900K.
Same here; I'm sticking with my Intel i7 850H 2.60GHz.
@@lopserdff It's the worst time for Intel, and they're laying off lots of staff.
I love seeing videos like this for creators! Thanks
I guess the thing with AMD's upcoming 2nd-gen V-Cache is that it won't be such a kneecap on productivity and creator use. Hoping the 9800X3D could actually be a good CPU for people who want not only good gaming performance but also good productivity performance.
X3D was held back by temperatures; the stacked cache design has a lower TJmax and is harder to cool because of its design. If AMD fixes the temperature issues and restores the clock speeds for the X3D chips, they will be basically the same as their non-X3D parts in productivity.
It's really nice to see!!! Big thanks again for your very interesting video. BUT, I can't WAIT for the new Ryzen 9 9950X review. It's been a while now since its release; when can we expect a review of this CPU? I'm looking to switch to it in November, and I asked you some questions about it on Minnect (big thanks again for your answers). It will be very interesting to see the benchmark results. BIG BIG THANKS again for your channel; I appreciate it so much. Maybe I spend too much time on it, but that's my problem 😅
Would love to see a set of benchmarks and comparisons for programmers: compilation times and power consumption. I notice that many "productivity" benchmarks specifically target large data sets that are visited sequentially and only once, almost as if to minimize loops in an effort to defeat the effect of a large cache. The fact is, most programs run loops, and the inner loops are often where most of the processing time is spent. Caches are effective for a reason, and most use cases can take advantage of large caches. It seems fairly obvious to me that the Photoshop benchmark is the only one that constrains the data set to be reasonably cacheable.
Another desirable metric, in the dizzying world of so many cores all sharing the same memory bandwidth, would be responsiveness under load for other programs and the operating system. Personally, I would favour idle-like latency while a few cores are loaded over any individual-core performance that shuts down responsiveness for other processes. That's the kind of test that would really show how phone processor specs are largely illusions that can't deliver once any one core gets a little more than lightly loaded.
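The loop-versus-streaming point above can be illustrated with a toy experiment. This is just a hedged sketch in Python (where interpreter overhead mutes cache effects compared to compiled code): it touches the same data sequentially versus with a large stride, doing identical arithmetic either way.

```python
# Toy cache-behaviour sketch: same total work, different access pattern.
# Absolute timings depend entirely on your machine; the point is the
# relative gap, which streaming "visit once" benchmarks never expose.
import time

def strided_sum(data, stride):
    """Sum every element of data, visiting it with the given stride."""
    total = 0
    n = len(data)
    for start in range(stride):
        for i in range(start, n, stride):
            total += data[i]
    return total

data = list(range(1 << 20))  # ~1M integers

t0 = time.perf_counter(); s_seq = strided_sum(data, 1);    t1 = time.perf_counter()
t2 = time.perf_counter(); s_str = strided_sum(data, 4096); t3 = time.perf_counter()

assert s_seq == s_str  # identical work, only the ordering differs
print(f"sequential: {t1 - t0:.3f}s  strided: {t3 - t2:.3f}s")
```

In a compiled language the strided version falls off a cliff once the working set exceeds L2/L3, which is exactly the regime where extra cache (V-Cache or otherwise) pays off.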
You should do another real world power draw comparison when you get your hands on the new amd ryzen 9000 series cpus 👀
Should've compared it to something at the same price, as that's what's important to everyone: how much performance you're getting for a defined budget. In that case, the 245 loses to almost all other chips.
His audience is smart; they know this is not a fair comparison and that these are two different use cases.
Please do a review on core ultra 7 265k.
I would love to see a world where we shift technology to focus on the longevity and lifespan of products instead of "featuring" AI in everything, now that we are nearing the peak of what is possible. Once that market exists instead of the one we have, we might actually make it as a species.
I have a couple of 30+year old computers that are still used. Never overclocked.
The comparison is incorrect; the 7800X3D is comparable to the 265K at purchase price. 😉
The best buy in your comparison is the 14600K. Why? Because it's cheap. For the same money I can buy a 14600K + 7900 GRE instead of a 7800X3D + 7700X. The first configuration will definitely win.
Please test/benchmark these processors in a DAW (digital audio workstation), because no one out there does.
Hey, it would be great to see some rendering and viewport benchmarks in applications like V-Ray for 3ds Max, Maya, and Unreal Engine when you are running the tests. I know this is probably a big ask.
Such a conundrum for someone like me. I've never had AMD; I've always relied on Intel, and I'm coming from a 12700KF which I honestly haven't enjoyed at all compared to my 8700K. But... I'm 50% gamer / 50% creator: DaVinci, Revit, Lumion, UE5, etc. Gaming is mainly MSFS, and I really need a good all-rounder, which I was really hoping Arrow Lake would be... but no dice. So if you match the RAM speed across all platforms (and we know Zen 5 can reach 8400 MHz), I think it's going to be the 9950X3D. Delid it, OC with direct-die cooling, and it will literally destroy everything out there in everything, for the next two years minimum. Why does it have to be this way? Why is Intel, a historical CPU-making monster, now such trash?
You have to think twice: Intel supports many more video codecs than AMD. Benchmarks are always just a result; they don't really account for that.
@@BillyRazOr2011 Anyone who cares about productivity is going to be using a GPU. Rendering on CPUs is such a 90s thing.
1. All power usage should be compared to the work done. I mean, you could be running at 200W for 10 minutes or at 170W for 60 minutes to complete the same task.
2. What about idle usage, or light usage like browsing and YouTube, or generally any activity other than heavy lifting? Many people who actually pay bills may not utilize the CPU all the time, while they still want it to perform whenever needed. You used to have nice measurements right from the wall.
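Point 1 in numbers, as a small sketch: convert draw and duration into energy per completed task (watt-hours), which is what actually shows up on the bill.

```python
def energy_wh(watts, minutes):
    """Energy consumed for a task: power (W) x duration (h) = watt-hours."""
    return watts * minutes / 60.0

# The commenter's example: the higher-draw chip that finishes faster
# uses far less total energy for the same job.
fast = energy_wh(200, 10)  # 200 W for 10 min
slow = energy_wh(170, 60)  # 170 W for 60 min
print(f"fast chip: {fast:.1f} Wh, slow chip: {slow:.1f} Wh")
```

So the "200W" chip costs about 33 Wh per task versus 170 Wh for the "170W" one; peak wattage alone tells you almost nothing.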
The Ultra 5 245K is a great processor.
Just get the best cpu for your budget
6:05 I've had 6000MHz DDR5 in my 7900X build, rock solid for almost a year. Am I missing something here? Or is this test perhaps rigged in Intel's favor for some reason?
A huge performance uplift is coming soon in the next generation.
Hmmm...... Have to think a lot before purchase....
You meant: is it worth DOWNGRADING to the Intel Core Ultra 5 245k?
I am upgrading a friend's system from an old Intel build. I returned parts for an Intel build back in July due to Intel's handling of the 13th and 14th gen fiasco, and this is from someone who has built systems since the P133 days. What goodwill Intel had with me has been lost. Intel has not indicated that the 1851 socket is not a dead-end socket. My friend doesn't game. I doubt I will go with Intel Core Ultra, and I still do not trust Intel 13th/14th gen even with the two BIOS updates. I will probably go with an AMD 9700X or 7700X, which are both about the same price as the Core Ultra 5 245K.
No, we have not reached the peak of how fast processors can be or how fast they develop. Intel had a development team working on Beast Lake. These were meant to be genuinely high-performance cores, with a design that allowed something they called "rent-a-core" or similar: if a processor had eight of these high-performance cores, one or more could be "rented out" as four efficiency cores, making a flexible design where the number and type of cores could be changed easily. These performance cores were meant to be really fast, with very good IPC. The project was on time and on budget, but it was shut down on the grounds that "we don't need high-performance cores", and the resources were instead spent on developing AI cores. This might be good for Intel in the long run: their new AI processors seem to have very good price/performance and work exceptionally well with their latest Xeon processors to optimize performance. But for the rest of us, who don't live on AI or run massive servers, it means we won't get the cores the Beast Lake team was developing. As I understand it, they were almost finished with the first version and were working on the second and third versions when the project was shut down.
It looks like the 265K is a safe and optimal choice for a creator, then. I'm upgrading from a 7700K.
Let's see if they optimize all these segregated tiles and chiplets, and whether going even lower in node size helps (though I doubt we'll get better performance from node size alone).
Big fan of the vids
Yes, but you barely touched on the extra Gen5 PCIe lanes. Previous-gen Intel ditched four lanes, meaning you often lost a PCIe slot completely, which really sucked when mobos have so few these days anyway...
I suppose I will keep my 5800X for the moment. I use my PC for gaming, 3D animation, and video editing, but I don't see a real, worthy reason to upgrade to this new Core Ultra generation, plus pay a lot for a Z890 motherboard. I'm more excited for the new 9800X3D and 9900X3D CPUs.
Unfortunately I had the 5800X; on an X570 mobo I upgraded from the old 3700X... The 5800X is the worst processor I've ever had in a computer. It consumes about 50W when idle, it heats up like crazy, and the SSD lanes are a disaster: two SSDs crashed and about 2 TB of data was lost. The 3700X is great, but the 5800X sucks.
@@igorbalen3389 What cooler do you have? I have a Be Quiet! Dark Rock Pro 4. Max temperatures are around 70-72°C while gaming at 1440p 160FPS, and around 78°C while video encoding.
With the recent BIOS updates for Intel, I was able to get 4800 with four RAM sticks on a 13500. But to your question, 7800X3D vs. any Core Ultra... I would go with the X3D. No idea what Intel is planning with the Ultra lineup, but I'mma take a hard pass on it.
The Blender benchmark shows the 9950X scoring about +50 median versus the Ultra 9 285K, and the 7950X, 7900X, and 7950X3D also beat the 14900K; the only CPUs that outperform the 9950X are the Threadrippers. Then ditch Windows: run your 9950X on a good X870 board, and if you're not very knowledgeable, just run 2x48GB of DDR5 RAM. It's up to you, but I found that dual ASUS ProArt 4080 Supers perform much better than a single ROG Strix 4090, and you'll have a BEAST workstation for 3D and animation work. It can still work if you're too dependent on Windows, but be aware that Win11 is a horrible OS for work; Linux will increase your PC's performance by around 50% compared with Win11. A good beginner distro is the latest version of Linux Mint, which gets you set up easily and fast, with a UI that works similarly to Win11 (actually more pleasant and customizable).
Dude, no one renders on their CPU if they have a GPU (which is basically everyone who purchased such a high-end CPU for Blender). But for the things you're actually likely to use — video editing, encoding, and BVH calculations in Blender — these CPUs are far better than what AMD has to offer with Ryzen, and also far cheaper than Threadrippers.
@@qwertyuiop8334 Not if you take price and maintenance cost into account. I lost thousands thanks to the 14900K — every single week I had a 14900K or a 4090 melt. The 285K is a cheap-to-produce CPU with the same price tag as an i9, and you expect something cheaper to perform better? For a freelancer with only one PC anything will work, but if you run a fleet of 50+ machines things are different. I'm not a gamer, I'm a producer, so the Threadrippers are my battle horses. And just look at how many bugs and instabilities the 285K showed right at launch — I can't trust Intel anymore.
All of these new generation motherboards look so good.. Smh...
On my side, I can get AMD samples far more easily than Intel ones.
Please reach out to me on email, I'd love to learn more!
@@theTechNotice LOL no, I meant on my side here in Malaysia — the AMD team in Malaysia is more active with samples and I get them all the time. Can't say the same for Intel, they're very limited.
My apologies for the misunderstanding caused.
Ummm. Why is it that every time I watch one of your shows it's Intel-biased? Nice speech at the beginning, though. I've done some score comparisons from the places that are good at this stuff and don't have a favorite, and your differences are all larger in favour of Intel. I don't think you'll get on AMD's mailing list any time soon.
What are the places where you found the comparisons? I'd also like to take a look.
Obvious Intel shill stuck in the early 2000s, when people still thought Intel Xeons were the peak of computing power when they're absolute garbage.
@@7eave1 There is this thing called youtube. They have video reviews of products. You should familiarize yourself with it.
i love amd just because of low power. you should have compared with 9700x
No, he shouldn't compare it to a Ryzen 7, he should compare it to a 9600X. What's the use of comparing an i5 to an R7?
I'm also thinking about trying an AM5 build now, since I've had the 5800X and am using a 5700X now — giving my nephew a 5700X PC build next month! The 5800X killed a 240 AIO before I learned how to undervolt it. It made the cooler actually whistle, the fans were spinning so fast! Then it smelled like detergent fluids and I didn't dare risk a leak. I liked the undervolt performance and tried to put extra case fans in, but forgot to turn the power switch off. Tried to lift the cooler off gently, but accidentally yanked it off, and heard the power-slap sound as the CPU fried. Didn't bend any pins — just fried!
It's great to have more choices. Am I right?
Next time do ultra 285k vs amd 2700x 😏
yes, but the 14000 series runs very hot and are literally melting...
The 245K is a really embarrassing chip. I don't understand why Intel even released it. Maybe they should have waited one more year and released a product that is more powerful in at least 80% of cases vs the equivalent prior-gen CPU.
Intel has much less software support, and this new architecture is still waiting for software to catch up with it. The problem is always the intense competition in the industry: software is propelling AMD's chiplets forward, while the more complicated chiplet design with more potential is left in the dust.
167W for the 14600K? My 12600K pulls 160W stock in Cinebench — weird stuff, I thought 14th gen had higher TDPs.
In any case, the power readings for the new Intel gen are a bit complicated, and it turns out the draw is higher than what HWMonitor shows. I'd advise watching Gamers Nexus on that — they made a whole video, plus a section of another video, about it.
GET AN M4 MAX MACBOOK PRO WITH THUNDERBOLT 5 SUPPORT.
More L2 than L3 cache? Oops, eh?
the tsmc bit was pretty funny😅
The 7800x3d is $477 on Amazon right now. More than a 14900k. It makes no sense unless you're a gamer and must have the best and can't wait 2 weeks for the 9800x3d.
True, but I think to be specific gamers want the 9950x3d right? Because of the dual ccd's?
@@c.m.7037 Not really. The 7800x3d outperforms the 7950x3d in games half the time and is less complicated because of its single CCD with extra cache. Games don't typically utilize more than 8 cores anyway (that's what the consoles run), so on the 7950x3d you'd want to coerce them to run only on the lower-clocked vcache CCD and avoid the higher-clocked CCD with the smaller cache, which has to be done via software because the scheduler will otherwise pick the higher clocked core. Even if there was extra cache on BOTH CCDs (as rumored), the performance penalty crossing the infinity fabric (or otherwise dealing with cache locality) would probably negate any boost from extra cores, at least with most modern games.
The 7800X3D is currently the best gaming CPU regardless of cost (hence the scalper prices), and the 9800X3D will almost certainly wear that crown next.
Maybe buy one and test it?
Come on man, just go buy a 9700X or 7900X. It's not hard — you can resell it in 30 days. You're out of pocket $50-75 and 30 minutes of extra hassle. BTW, not getting free stuff would let you experience what everyone else has to, and would also mean less bias, because you wouldn't have personal relationships with these people.
Still waiting for someone to scream "the 7800X3D is a gaming CPU, not a productivity CPU," etc. etc.
Why would creators be using a Ryzen 7 class CPU or a 5 series Intel?
*Laughs in Ryzen 9000*
Not bad this AMD Core Ultra 5 245K! 😂
Intel & AMD both need more improvement. Intel's price is also too much for this spec 👎
The Mac M4 is rocking the field at less cost. You can buy a Mac mini M4 for the price of an Intel GPU.
You lose, Intel! Finish him, Ryzen!😮😮😮
Pretty sure it's the same performance as the $120 7500F from AliExpress... in games, btw.
Gaming ruins PCs
Are you running out of ideas for content? When has 3D been relevant for creators?
Did you know there are also 7900x3D and 7950x3D?
I don't think he's that smart...
They made creator CPUs, not gaming ones :))
No, they should buy a damn Arrow Lake chip, plus the new motherboards coming out for it and the new CUDIMMs, and laugh their a** off all the way to the bank. Then when Intel drops their new Arcs, buy that too.
Those new Intel cpu's are a flat out joke.
Great
The fact that you basically begged AMD to send you a 9xxx sample multiple times in this video was not funny at all — it made me stop the vid and move on. I like your reviews/tests, but a 230k YT channel begging for free stuff is kind of lame. No offense intended. I'll continue to watch your videos, but man, you can afford a 9xxx AMD CPU — just go and buy one and do a proper review.
Actually it's a bit more than that — for some reason AMD chooses to leave some creators out. The thing is, they worked with me on the Threadripper launch, which was 10x more expensive if not more, but this time it's cold silence. Why this annoys me is that it makes me look Intel-biased, just because team red samples are harder to get. I'd be equally annoyed the other way around, if Intel were keeping me out of the loop — especially when you've got a relationship with them and have worked with them in the past, and then one or the other blanks you. Hope this makes more sense!?
Appreciate your comment nevertheless!
Please use CUDIMM RAM
Why? It's expensive vaporware 😂😂😂
I'm amazed that the 7800X3D could win in anything against Intel CPUs. I have one and it has low clock speeds and a lot fewer cores, regardless of SMT. Intel is superior in productivity because they're throwing 2x+ the physical cores and a lot more watts at these applications. I'll stick to gaming 😂💪
Question: are there any instabilities with the iGPU that force you to turn it off to fix them?
Intel is gaining more and more interest among people who actually work... the Ultra 200 hasn't even gone on sale yet, there's zero optimization for it, no drivers... and it's already fighting monsters in work tasks.
No one is buying this shit
Second first comment
Unsubscribed
Not sure I like this video either... but what made you unsubscribe from this channel? It's still one of the better ones.
@@groenevinger3893 I think this guy believes Tech Notice is biased towards Intel, or he's just an AMD fanboy who can't stand AMD losing on anything. The benchmarks clearly show Intel beating AMD in workloads while losing in gaming. This channel is all about work, not gaming, and that's where this new Intel gen mostly beats AMD, even the 9000 series...
@@arthur_pd This. Brand fanboys of any kind are hard to deal with sometimes — ignore and move on, I guess.
Ryzen 7 7700 as a creator — what do you think? Moderate 4K video editing and a bit of everything.
i7-12700K overshadows that Ryzen 7.
@@saricubra2867 Why are you even bringing up the 12700K?
@@coffee7180 Because its IPC is higher than Zen 4's, it's very cheap, it has 20 threads, and it gets Quick Sync from the UHD 770.
The 7700 is aimed at gaming.
@@saricubra2867 overshadows in power consumption and heat, also.
@@edengate1 The i7-12700K stays sub-70 degrees in Cinebench and is way faster at the same power draw, meanwhile AMD told us that the 7700X running at 95°C is "normal".