Hey Ali, seeing that you shoot on Sony cameras: their compressed codecs wreak havoc on CPUs and GPUs. For some reason, unlike the FX9/FX6, which have the proper XAVC-I implementation that decodes very easily, the FX3/FX30 and A7S III don't use those codecs and chug a lot. If you record externally to ProRes, you'll be surprised by how well it runs. You've used Blackmagic RAW before, so I'm sure you know :)
I have a question: do you think you could make a video on the Razer Huntsman V3 Pro keyboard? From what I've seen it's faster than the Wooting and the Apex Pro. Also, great video!
Ali - why do you still use the font Calibri in your videos? Seems at odds with your otherwise super premium feeling & amazingly made videos to use the default MS Office font, don't you think? Why not just stick to Inter for everything?
@@zoomzabba452 Oh, I don't need to upgrade is what I'm saying lol. The OG Ryzen 7 1700 still works fine, I've never even hit 70% usage yet 😂 PC is fast as f still
Threadripper has always been my dream CPU to own. One day I will, when they come down in price and there's second-hand stock in mint condition. But do I need it? No, but I'd love to have it. It's been my dream because of what it can do: gaming, game making, running a server, anything. And with internet speeds today, you could run a nice gaming server and have all your buddies get on, and that's what pleases me about technology. It's so amazing, god bless the makers and the people who continue this fantastic gift ❤
So the upcoming AMD EPYC "Turin" Zen 5 CPU with 256C/512T, up to 600W on TSMC 3nm, would supposedly be just as quick as an Nvidia 4090 at rendering. Also check out IBM NorthPole, which reportedly blows everything out of the water: 8x faster than an Nvidia A100 and 6x faster than their newest H100. Just wow.
The reasoning for the "cheaper" board supporting the best CPUs is that CPUs generally get bricked far less often than motherboards over the long term. That's why all the old server CPUs are so cheap now, but not the motherboards. So in the long run people can upgrade if they want, once the "best" becomes cheap.
My grand son is playing Apex a lot. What is the best PC and monitor for that game? He is a good boy and he deserves the best. Thank you for all the answers.
Great video as usual. One thing tho: for someone who uses Threadrippers, those gaming numbers are great. They'd only compare against the current one they have; the cheaper CPUs aren't even an option, so the comparison is pointless for them.
Nice and informative video. I'm curious about the maximum temperature of the 7980X during rendering at 100% load. In 3ds Max/Corona Renderer it goes up to 93°C with a 360 mm Thermaltake cooler; is this normal?
The point of a CPU like this in Eevee is that you can run multiple instances of your render at once (as many as you can fit into VRAM). ...which realistically is 24GB, unless you're crazy.
Great video. I am wondering: if I wanted 256GB of DDR5, what other motherboard could I get? The ASUS Pro WS TRX50-SAGE only has 4 DDR5 slots :( and there aren't any 64GB single DIMMs on the market.
I'm happy with my 7800X3D, that thing is just "chef's kiss". Optimum, I wonder if we can see a video on the new Razer Huntsman V3 Pro lineup of keyboards, as well as DrunkDeer gaming keyboards; both are new to the rapid-trigger lineup. I know you'll be busy, but I'm just hoping to hear back from you
Dude you can fit a small indie game on the cache of the 7995WX. The times.
I wonder how software rendering would actually work out for older games.
You should also be able to run Crysis OG on the CPU cores alone at 15-18fps via SwiftShader.
Linus got 7-9fps on a previous 64C/128T AMD CPU 😎
@@kuroneko9270 Word. I remember playing and beating Crysis as a young teen with 20fps average. On the lowest quality settings, no less. I didn't even complain.
Kids, man.
@@keres993 Yeah, of course, kids nowadays grow up with this kind of hardware. Most games need to run at 30fps at 4K on consoles, and at a minimum of 60fps on PCs. That's the new standard for gaming and you need to accept it.
I don't know if you mean this "kids, man" in a negative way.
Now imagine if AMD brought 3D V-Cache to the Threadripper line. The Epyc 9684X has 1152 MB of L3 cache.
I was working in scientific computing for a couple of years and this is where these CPUs shine. Brute force numerical calculations - in my case, modeling of chemical reactions using approximations to Schroedinger's equation.
Scientific computing is a good use case for such a CPU, but from what I've seen there aren't many programs/frameworks/environments (whatever you call them) that efficiently utilize multi-core processing, just like the video shows. Yes, a high-performance CPU is helpful, but for a chip as powerful as Threadripper, the software is the real bottleneck. Coming from a computer engineering perspective.
couldn't you use CUDA?
This is more like the optimal usage of this monster CPU: heavy, intense calculation tasks or modeling with complex algorithms. It was so fun back in my college days at our HPC lab; we'd stress test the new cluster computer with our programs based on different algorithms.
@@holderua Some types of calculations run better on CPU architecture. Not sure if that's the case for him, tho
@@deansmits006 I imagine so, seeing as I doubt the company he works at would drop that much on machines that aren't the best fit for their workflow
Kinda crazy to think that at some point 10 years from now, this will be retro tech going for $18 per CPU on eBay like older Xeons are now.
Haha I had the exact same thought! That and there'll always be those people trying to milk it for way more because "original MSRP was $5k bro!"
@@MystMyth Sounds like people on Facebook Marketplace trying to sell GPUs for 2-3k because they got hoodwinked and finessed during the pandemic
x86 might be dead by then.
@@Volker_A4 In ten years?? Extremely doubtful considering how much stuff is built around x86, I'd think it would take decades of gradual change before a complete switch like that happens, at least in the consumer space.
@@Dell-ol6hb ARM CPUs walked out from behind the tree rubbing their palms and go "yeahh bwoyyy"
I'm a Mechanical Engineer and these things are made for FEA and CFD. You're utilizing the multi-core capabilities to the max, and RAM is king in models that easily run on hundreds of gigs of RAM. You're also writing terabytes at a time, so fast PCIe and storage make a huge difference. When people ask who would pay that much for a CPU: it's the engineers who can use this hardware to iterate faster, save development costs, and even use generative design solutions.
I believe it's better for companies to rent cloud computing services, because down the line chips like these quickly become redundant.
Years down the line, when these chips become extremely cheap (due to datacenters selling off older processors), they're quite a bargain for students.
"generative design solutions" == AI bullshit
Generative design doesn't use AI. It just uses FEA to calculate the stress distribution in a part given certain boundary conditions, then iteratively removes material in low-stress areas until it hits the weight or stress target defined by the engineer. Very cool stuff; thankfully it doesn't depend on AI, just math
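The loop described in that comment can be sketched in a toy form. To be clear, this is not a real FEA solver: `fake_stress` below is a made-up placeholder (real tools re-solve the full model every iteration), so only the remove-the-weakest-material control flow is illustrated.

```python
# Toy sketch of the generative-design loop: fake "stress" on a 2D grid of
# cells, then iteratively remove the lowest-stress cell until a mass target
# is reached. Real generative design runs an actual FEA solve each pass.

def fake_stress(grid):
    # Placeholder: pretend cells near the left edge (the "support") carry
    # the load, so stress falls off with column index.
    return {(r, c): 1.0 / (1 + c) for (r, c) in grid}

def generative_design(rows, cols, target_cells):
    grid = {(r, c) for r in range(rows) for c in range(cols)}
    while len(grid) > target_cells:
        stress = fake_stress(grid)          # "re-solve" the model
        weakest = min(grid, key=lambda cell: stress[cell])
        grid.remove(weakest)                # carve away low-stress material
    return grid

part = generative_design(4, 8, target_cells=12)
print(len(part))  # 12 cells remain, all near the loaded edge
```

With this fake stress field the loop deterministically strips the far columns first, which is the same qualitative behavior as real topology optimization hollowing out low-stress regions.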
I'm about to attend college to study mechanical engineering
@@AA-bh3bz Bro, relax. Some people want to make money from a real job instead of joining a pyramid scheme and risking blowing away their life savings.
Bread ripper, the way it's so goddamn expensive
enough power draw to burn toast as well
Not for gamers
to be fair, this is barely even meant for consumers.
No poors allowed
As a software eng. I would love to see some compile times (Firefox or Chromium, maybe Linux kernel with all modules) in the benchmarks section. Maybe next time. :)
Gamers Nexus did Chromium compile, it completely shreds it.
Well, unless he's using Gentoo, probably not. However, in terms of giving you an idea of how performant something is for compiling/transpiling, building a kernel or a browser actually is a good benchmark, and it's not entirely synthetic either.
I’m sure those who need this capacity know its capability. However this is good knowledge for those who do not need the speed/capacity but want to understand what is possible with the computing power available on the market.
This dude is a gamer, what did you expect? I wonder why he even got the threadripper, what a waste.
@@Gone1229 fr he doesn't even need a workstation
It is not for everybody, but it is excellent for some workflows:
- Virtualization
- VFX (exr formats...)
- Storage and connectivity. You can easily add a lot of NVMe drives running at their full speed, plus several 100Gb (or 200Gb) Ethernet boards.
- Multi GPU applications
The thing is, with GPU rendering I can't use even a 4090 for my 3D scenes: not enough RAM, not even on an A5000. As soon as you use a UDIM workflow extensively with a large number of textures, the GPU is done. For motion design or product viz, GPU rendering is very good; for full CG scenes, RIP.
For the 3D artists out there: don't fall into the "fastest rendering hardware" trap, buy the hardware that can actually render what you need it to.
Exactly. People act like GPU rendering is the only way forward. But as long as RAM is as limited as it's historically been on GPUs, there's no way the more demanding 3D workers will switch.
And that hardware is?
@@hanzevisualsCPUs with a shit ton of RAM?
What about an RTX 6000 with 48GB of memory? If the highest-end enterprise cards can't handle your scene, it sounds like you need to start using render farms/clusters
@@D1zZit 48GB is nothing compared to 512GB-2TB of RAM that you can have on Threadripper
the camera angles in your videos are crazyyy just love how you get creative man!
I wish the mainstream regular motherboards looked as slick and cool as these pro motherboards
They could, even at the same price; they'd just need to use some (a lot of) polycarbonate 🫣
I mean, there are slick and cool regular motherboards, but you have to pay for it lol. It's like saying "I wish $10k cars looked like Ferraris." The pro motherboards are way more expensive to build
fr
Yeah it’s sick as! So industrial and minimal. Love it.
@@tayk-47usa41 Look at some expensive ASUS mobos; they're non-slick and extremely uncool, one could even say cringe. It's not the price that's the issue. These companies just don't respect gamers the way they respect their professional clients.
Honestly, I'm pretty tempted to build a massive workstation and run all my family's PCs virtualized from a server rack, with fiber all over the house. It won't be the best for gaming, but it'll be good enough and it'll do just about anything. I can also host my servers and containers, all at the same time.
Bro is way too passionate about his tech! Ah, to be young and optimistic again.
Congrats again for a simple and objective review!
Thank you for the practical reviews/comparisons. Especially for us 3D artists it helps A LOT to see it. Keep up the top notch work!
I still fondly remember my 7980XE, 18c/36t. I had a golden sample and was able to run it at 1.25V at 5.1 GHz all-core with direct-die cooling. What's actually so fascinating is that the CPU alone drew 700 watts at that all-core 5GHz in Cinebench. If you look at how big the die is, and consider that this 2011-3-era CPU still handles the voltage conversion inside the CPU... what Intel built in 2017 is really fascinating!
Clean video and nice editing as always. Keep it up !
Threadripper 7000 is a MONSTER. Makes regular Ryzen and Intel Core look like cheap toys (and that's even with the 7980X being heavily memory bandwidth bottlenecked on TRX50 with 1 channel per 16 cores).
One word for your videos: *Aesthetic*
I like his videos before I even watch it, this guy is by far the best in this area of youtube. Love you bro
the red light lines in your video was a nice touch
Love your new hair dude. Its inspiring
Awesome review, but it's a bit strange to test this productivity product on rendering. It's not designed for that, and GPUs are far more efficient at it. I'm a power user myself and would kill for one of these. Rendering 3D scenes is a GPU activity, but generating geometry (in Houdini, for example) is a CPU task, and one that requires huge amounts of RAM. I would love to see some file compression and decompression for testing raw CPU power.
Lots of production renderers use the CPU though, like V-Ray (with more features than on GPU), Houdini Karma, and RenderMan (with help from the GPU for certain tasks). The stability, scalability and feature-complete nature of CPU rendering is still king for many pipelines
well, that's what he said. For his workflow it doesn't do much.
Came here to say this. However I agree heavy cpu loads should be great to see.
It depends. Let's say you're a VFX studio and you're contracted to work at a Pixar/Disney level of animation quality. Then you'll need much more RAM than any consumer GPU has; 24/48GB simply won't cut it for some scenes. In that case you're probably looking at 256GB+ Threadripper workstations
@@ThunderingRoar Some of Nvidia's DGX-series GPUs (such as the A100, H100) can have up to 128GB of RAM. Though granted, they're AI/compute-focused cards, so I'm not sure how they'd fare in rendering performance
Really enjoyed this video.
The GPU-vs-this-CPU question comes down to project size. In Blender, for example, if you make a massive project, like for a movie, you'll use a bunch of crazy stuff that will kill your GPU in terms of memory, but these CPUs can keep up and won't have those kinds of constraints.
Both the dude and the CPU are insane
No nonsense, direct to the point, very good quality videos. Keep it up sir!!!
Dude build the most compact monster build ever, would fit your audience so well!
GPUs aren't kings when it comes to rendering. In some cases yes, but far from all. This CPU would be perfect for something like Corona in architectural visualization.
What's the point of running GPU rendering tests on a CPU made for CPU rendering? In general, with CPU rendering you'd put the most money into the CPU and just get the cheapest GPU with the most RAM. That way you still get good viewport performance, but great rendering power for finals.
It's worth it for context. A lot of rendering can now be done with a top-end mainstream platform and a 4090.
Threadripper is for when you're serious enough to have multiple 4090s and NVMe RAID.
I would absolutely love to see you build a workstation with that, considering your build quality it would be amazing
bro your studio looks like batman headquarters
I'd really love to have a dual-socket, 8-channel setup. Just for the sheer insanity of it! 🙂 I really like the Threadripper lineup.
I think you guys should all add simulation projects when testing CPUs. I don't know about everybody else, of course, but I'm only considering upgrading to a TR at the end of the year or early next year (ideally the 8970X, I think) because I work with simulations as a 3D animator: water, smoke, etc. These CPUs are designed for stupidly big math, scientific work and simulations. So I find it kind of sad that this is never really talked about, since most people interested in these CPUs are actually using them for those reasons!
But Cinebench is a nice benchmark, I think; idk how it translates to simulations and all, tho. Thank you for the video!
4:35 My understanding was that they used Zen 4c cores, which are physically more compact than Zen 4 at the cost of high clock speeds, hence why you can have 96 of them, but with lower single-core performance. So you should not expect them to perform the same.
Very nice video which illustrates how this CPU is a niche item for specific CPU-heavy loads since most mainstream productivity apps such as video editing or 3D modelling & animation apps use GPU acceleration these days, where a beast CPU would be redundant.
Love the Amen Break in the intro haha
Thanks a lot for your work! Mate, you have to introduce some deep learning tests in videos like this. GPU performance is critical for NN training, but the CPU is also critical for preparing and sending data to the GPU, so it's the combination that usually gives you the best result
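That CPU-feeds-GPU pattern can be sketched with just the standard library. Real frameworks (e.g. PyTorch's DataLoader workers) do the same thing with multiple processes; everything below is illustrative, with made-up function names, and a list append stands in for the GPU step.

```python
import queue
import threading

def preprocess(sample):
    # Stand-in for CPU-side work: decode, augment, batch, etc.
    return sample * 2

def producer(samples, q):
    # CPU worker fills a bounded prefetch queue ahead of the training loop.
    for s in samples:
        q.put(preprocess(s))
    q.put(None)  # sentinel: no more data

def train(samples):
    q = queue.Queue(maxsize=8)  # prefetch buffer between CPU and "GPU"
    threading.Thread(target=producer, args=(samples, q), daemon=True).start()
    processed = []
    while (batch := q.get()) is not None:
        # Stand-in for the GPU consuming an already-prepared batch.
        processed.append(batch)
    return processed

out = train(range(5))
print(out)  # [0, 2, 4, 6, 8]
```

The point of the bounded queue is exactly the commenter's: if the CPU side can't keep this buffer full, the consumer stalls, no matter how fast the GPU is.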
reviewing tech that speeds up the process that you can make tech reviews is the craziest infinite money glitch
That red light on his black shirt looks pretty dope
As a 3D artist myself, I think you should test simulations, which rely heavily on the CPU: liquid simulation, cloth simulation, etc. in Houdini, or Chaos Phoenix liquid simulation. Simulation is heavily used in movies, and these CPUs seem to be made more for studios than for individuals
awesome content, straight to the point, love it.
Imagine how fast you can play minesweeper with this CPU.
amazing tech but arm workout drops when??? 👀
The 12 core combined with the HEDT MB would make sense for something like a heavy duty NAS / small office server.
Crazy power. Amazing video. Imagine an ITX build with this; it would be difficult to cool, but it may be possible.
The CPU socket alone is nearly as big as an ITX board.
lmao it's impossible and impractical; part of the point of the Threadripper series is the expansion slots with all the PCIe lanes
Doesn't make too much sense for ITX, since the CPU alone consumes like 550W at full load. Air-cooling that in a tiny case would be very loud; a custom water loop is almost mandatory.
Here is to hoping someone makes an ITX board that supports it.... I really enjoy the small PC life.
4:12 Actually, it looks like the 7970X could be a sweet spot. I'm looking at it this way: you could get two systems with a 7950X (without GPU) for about the price of JUST the 7970X, but you'd need the space etc., and you'd have more points of failure, more maintenance and so on, so I see the appeal of a single 7970X system.
When it comes to the 7980X, however, you could probably get four 7950X systems and then have more rendering power, plus a heck of a lot of flexibility to distribute your rendering and other tasks, kind of like a small compute farm, which I personally would much rather have than one massive workstation. And I guess that also offsets the downsides I mentioned previously, at least imo
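The back-of-envelope math in that comment is easy to write down. Every dollar figure below is a rough launch-MSRP guess, not a current price quote, so treat all the numbers as assumptions:

```python
# Rough launch-price ballparks (assumptions, check current street prices).
CPU_PRICE = {"7950X": 700, "7970X": 2500, "7980X": 5000}
PLATFORM_COST = 550  # guess: board + RAM + PSU + case for one 7950X box

# Cost of one complete GPU-less 16-core system.
small_box = CPU_PRICE["7950X"] + PLATFORM_COST  # ~ $1250

# How many complete small boxes fit into just the big chip's price?
print(CPU_PRICE["7970X"] / small_box)  # 2.0 -> two 7950X systems per 7970X chip
print(CPU_PRICE["7980X"] / small_box)  # 4.0 -> four 7950X systems per 7980X chip
```

On these assumed numbers, the 7970X chip alone costs about two whole 7950X systems and the 7980X about four, which is the farm-of-small-boxes vs one-big-workstation trade-off the comment describes.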
Technically speaking, HEDT is overkill for us hardware nerds. Yeah, it's amazing that AMD or Intel can make a CPU this huge and this powerful, but honestly, we have no need for it. I mean we, the PC enthusiasts.
Huge thanks for these tests!
What about running a local AI (LLM) model on that Threadripper vs the others?
Your video quality is amazing... I really like your videos
Man, I would love to buy a Threadripper build for my mom. She's an interior designer and would hugely benefit from having an actual workstation instead of her midrange gaming laptop
Great video! You presented their performance against mainstream CPUs really well, as that's the info people will be after. I was looking forward to seeing an SFF Threadripper build teaser
An SFF build for a CPU that consumes 500W sustained and spikes up to 550W under load? Yes, a great fit for an SFF system, as long as you have an external 1600W PSU hooked up *and* you don't mind setting your house on fire.
Hope no one gets the 7980X confused with the 7980XE! Sheesh AMD.
Or the 7960X, which has the exact same name, because the Intel 7960X isn't an Extreme Edition, so it doesn't have the E at the end.
A bit like Threadripper 3970X and i7-3970X, or Threadripper 3960X and i7-3960X, because back then the X indicated the Extreme Edition
hey brother!! I love your videos! Can we see a comparison between the Wooting 60HE and the new Razer Huntsman Mini V3 Pro Analog?
Glad I won't need a room heater for winter anymore.
In honor of Threadripper HEDT return, I'm going to rebuild my 2990WX TR / X399 MEG Creation rig. I sort of moved to TRX40 with a 3960X a couple years ago, and instead of getting caught up in the new super expensive TR of today, I'll satisfy my geeky needs by building another HEDT system, my 2990WX.
Finally a worthy cpu for Cities Skylines 2
Oh fuck yeah boys I can't wait to play Oldschool Runescape with this MONSTER.
Damn, man!!! I'm a huge fan of your videos. Could you share your shooting gear for such clean videos? Very eager for that video.
Maybe a late response, but he posted on Instagram that he has a Sony FX3 (same sensor as the A7SIII he linked in the description) and an FX30. But it's all about the lighting.
Seems like this Threadripper is probably good for game developers?
Good mix of rendering and compiling code... it'd be nice to see compile time stats too.
Bro is ripped
In a future mouse video, could you test motion sync on and off? With it on, I feel like I tend to aim slightly short, but idk if it's just me or if motion sync is causing some kind of negative acceleration effect. Love your videos!
Manny the Mammoth reviewing Tech again!
yo Optimum love the vids bro. Would love to see a video on that new mouse latency device you got. Really curious to see what the fastest mouse clicks are out there
You should collab with a woodworking channel to make a desk with you, since I remembered that you wanted a change in an earlier video. Cheers.
I absolutely need this for Excel data processing😂
Is there really a reason to go Threadripper over Epyc?
Great video! What PSU (wattage) should I have for a 7960X and an RTX 4090?
Hey Ali, seeing that you shoot on Sony cameras: their compressed codecs wreak havoc on CPUs and GPUs. For some reason, unlike the FX9/FX6, which have the proper XAVC-I implementation that is very easily decoded, the FX3/FX30 and A7S3 don't use those codecs and chug a lot.
If you record externally to ProRes, you'll be surprised by how well it runs. You've used Blackmagic RAW before so I'm sure you know :)
just the casual "(monster)" in the title got me rolling.
Never thought about optimizing media at the end of a project in DR. Gonna have to try that. 🤔
istg I want this cpu so bad 💀
That can be your server, smart TV, game PC, work PC. All your home computing needs, connected to this single monster; it's actually really cheap.
Gaming sucks on those kinds of CPUs
I need his workout routine ASAP
Another great video, but wait, what is Intel's version of Threadripper?
Maya's Arnold renderer and Bifrost simulations will benefit from TR, as those can cripple ANY CPU/GPU quite easily
I'd love to see something that's CPU-only, like compiling Firefox. It should scale really well
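A minimal sketch of why a big compile scales so well: build systems spawn one job per hardware thread, so core count translates almost directly into throughput. The `mach` and `make` invocations below are commented-out assumptions, not something from the video; only the core-count query actually runs.

```shell
# Query the available hardware threads, e.g. 128 on a 7980X.
CORES="$(nproc)"
echo "building with ${CORES} parallel jobs"

# Hypothetical invocations (placeholders, left commented out):
# ./mach build                 # Firefox's build system parallelizes on its own
# make -j "${CORES}"           # generic make-based project equivalent
```

The practical caveat is that link steps and I/O are often serial, so real-world speedup falls short of perfectly linear scaling.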
I have a question: do you think you could make a video on the Razer Huntsman V3 Pro keyboard? From what I see, it's faster than the Wooting and Apex Pro. Also, great video!
A monster, with a monster price tag.
Ali - why do you still use the font Calibri in your videos? Seems at odds with your otherwise super premium feeling & amazingly made videos to use the default MS Office font, don't you think? Why not just stick to Inter for everything?
9:30 : the good old-looking BIOS
I'm still chilling with a Ryzen 7 1700 since launch day and it's still going strong
You can buy used 3rd gens if you're ok with that
@@rocky-zx6kq huh?
@@O6ias an upgrade. The latency benefits are worth it
@@zoomzabba452 oh, I don't need to upgrade is what I'm saying lol. The OG Ryzen 7 1700 still works fine, I've never even hit 70% usage yet 😂 PC is fast as f still
I would love to see your monster setup with this CPU
Optimum, what paint do you use in your recording studio? I love that gray color
Threadripper has always been my dream CPU to own. One day I will, when they come down in price and there's second-hand in mint condition. But do I need it? No, but I'd love to have it. It's been my dream because it can do anything: game, make games, run a server, anything. And with internet speeds today, you could run a nice gaming server and have all your buddies get on, and that's what pleases me about technology. It's so amazing, god bless the makers and the people who continue this fantastic gift❤
So the upcoming AMD EPYC 'Turin' Zen 5 CPU with 256C/512T, up to 600W on TSMC 3nm, would be just as quick as an NVIDIA 4090 at rendering. Also check out IBM NorthPole, which just blows everything out of the water. It's 8x faster than the NVIDIA A100 and 6x faster than their newest H100. Just wow.
What about AMD's SAM (Smart Access Memory) using EXPO DDR5 and an AMD GPU? If that's supported with the 7980X...
That's not how SAM works
The reasoning for the "cheaper" board supporting the best CPUs is that CPUs generally get bricked far less often than motherboards over time.
That's why all the old server CPUs are so cheap now, but not the motherboards. So in the long run, people can upgrade if they want once the "best" becomes cheap.
My grand son is playing Apex a lot. What is the best PC and monitor for that game? He is a good boy and he deserves the best. Thank you for all the answers.
I have a server with two 128-core Epyc CPUs and 8 GPUs for rendering and simulations
Pretty sure that the board supports 2 PSUs for redundancy and not necessarily overclocking 😅
Great video as usual. One thing tho: for someone who uses Threadrippers, those gaming numbers are great. They only compare to the current one they have. The cheaper CPUs aren't even an option, so the comparison is pointless for them
Would love to see how these compare to Apple's new M3 line up. Hope someone makes a comparison video soon.
Nice and informative video. I'm curious about the maximum temperature of the 7980X during rendering at 100% load. In 3ds Max/Corona Renderer it goes up to 93 degrees; is this normal? With 360 mm Thermaltake cooling
It's a good day when Optimum uploads
The point of a CPU like this in Eevee is that you can run multiple instances of your render at once (as many as you can fit into VRAM).
...which realistically is 24GB, unless you're crazy.
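A minimal sketch of that multi-instance idea: split the frame range evenly and hand each chunk to its own headless Blender process. The frame counts, the `scene.blend` filename, and the Blender invocation are placeholder assumptions, not from the video; the actual render command is left commented out.

```shell
# Split 240 frames across 4 render instances, 60 frames each.
TOTAL=240; JOBS=4; CHUNK=$((TOTAL / JOBS))
for i in $(seq 0 $((JOBS - 1))); do
  START=$((i * CHUNK + 1))
  END=$(((i + 1) * CHUNK))
  echo "instance $i: frames $START-$END"
  # Hypothetical headless render of one chunk, backgrounded:
  # blender -b scene.blend -E BLENDER_EEVEE -s "$START" -e "$END" -a &
done
wait   # each instance keeps its own copy of the scene in VRAM
```

Hence the 24GB VRAM caveat above: every instance loads the full scene, so the number of parallel renders is bounded by scene size times instance count fitting in GPU memory.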
good video man!
Wish you did a code compile benchmark as well; many people would pay the price to cut that time in half
Great video. I am wondering, if I wanted 256GB of DDR5, what other motherboard could I get? The ASUS Pro WS TRX50-SAGE only has 4 DDR5 slots :( and there aren't any 64GB single modules on the market.
Turns on 7980X, city blackout immediately follows.
I'm happy with my 7800X3D, that thing is just "chef's kiss". Optimum, I wonder, can we see a video on the new Razer Huntsman V3 Pro lineup keyboards as well as DrunkDeer gaming keyboards, which are both new to the rapid-trigger lineup? I know you'll be busy, but I'm just hoping to get a certificate from you
This would be great for Minecraft