I'd have gone with a 40-series RTX, as they support the AV1 codec, and learned to use DaVinci Resolve (I got a key with a small keyboard). Good setup for YT videos if you upgrade it later.
@@aChairLeg Checked out a few of these systems on ebay and can't find anything close to the price you paid. Nice work as it's quite a workstation. I was thinking of replacing my x299 socketed i9-9900X with one of these, however that PSU is a real downer. I recently purchased an ASUS ProArt 4080 Super and it's powered by 3x 8 pin PCIe (with the included adaptor). Strange that it has so many full length PCIe slots with nowhere near as many PCIe power cables as you'd need.
For your passive cooling test, look into getting a Tesla M40 24GB PG600 (the PG number indicates the internal generation, which means better write speeds, so the PG600 is the last and fastest of the line). They are only about $80, but they're server grade with no display outputs, so you need to pair one with a Quadro 600: you designate the program to run on the Tesla and route the output through the Quadro. It basically shares the same architecture and frame as the Titan X, so use those drivers, and any third-party stuff like water coolers meant for the Titan X will fit. Hope this helps; it took me a fair amount of research to piece this together to build a VDI gaming server with a few of them linked via NVLink for shits n giggles. It's probably the cheapest $/GB GPU solution currently, and it's wild to build out ~100GB of GPU for the price of a 3060. But you definitely will need more cooling for them; they are power hungry fuckers at 300W each (around ~220W nominal in my workloads), so I highly recommend the 3D-printed fan adapters for them on eBay. You'll also want special splitters that turn TWO 8-pin feeds into one 8-pin per GPU, since each PCIe 8-pin only gives around 150W, and you can't use 6-pin converters because 6-pin is only around 75W. Power is probably the main reason they're so cheap and people haven't caught on; well, that and not many motherboards support multi-GPU/NVLink operation anymore. But realistically it's easy to build out on something like you've built here with two PSUs, set up to only run what you actually need, because frankly even one is a beast, let alone two paired with NVLink, or FOUR. So run two GPUs per cheaper PSU, with one PSU as the master for the motherboard and everything else, and the other only for the GPUs.
In a configuration like yours, I'd have one PSU power the first GPU on each motherboard and the second PSU power the second. It shouldn't hurt them not to have the PSU controlling power via motherboard communications; they're power-hungry server-grade fuckers with their own built-in power management, because they're designed to be plugged into hosts that might not support controlling power for them. This also keeps them from all trying to pull the potential 75 watts from the PCIe slot at once, reduces heat on the motherboard, and with a modular PSU you can route it outside the case to redirect that extra heat elsewhere. Food for thought for anyone going down this dark road. I suggest getting a v4 though; they've really been coming down lately, unless you want to attempt the EPYC dark road. Frankly, there are arguments for getting a dual EPYC SP3 board and only populating one socket at first, with the expectation of adding a second CPU when you need more power, making the machine more future proof. Bear in mind the EPYC CPUs with a P in their name don't like being configured for dual-socket use, for instance the EPYC 7502P vs the 7502. Anyways, that's just what I've been playing with lately for an RSPS.
The thing about old workstations is that they were once worth thousands and thousands for a reason! A 10-year-old workstation can go further than most people realize.
The other side is that these beasts are designed to be able to work 24/7 for a few years. And because of that, often they have been run 24/7 for years. I had a Z800 (well I still have it, but it's dead), and eventually all the RAM went bad and had to be replaced. Then suddenly it just died on me.
Until they break down, past warranty and no vendor support. We had to tell a client we can't support his system any longer. Motherboard issues along with part failure.
Dell T5500, 144GB RAM, 2x X5670, 2060 12GB GPU. Got the box for $100 AUD, the RAM for $25 per stick (9 sticks), plus an NVMe drive and a USB 3.1 card; total cost in 2021 to build the system was about $800, with Win 10 Pro OEM (Dell). Fantastic video editing machine.
@@aChairLeg It is. I got a Chinese motherboard and built a dual Xeon E5-2698 v3 system: 32 cores total at 3.6GHz, 256GB of DDR4, a 1TB M.2 plus a 3.2TB 12Gb U.2 drive, and a 1080 Ti to top it all off, running Linux. The CPUs have taken every single large task I throw at them no problem; I can be rendering in Blender and running local AI at the same time while web browsing with no slowdown whatsoever. I only paid $500 for my system too, since I reused the graphics card and power supply. It is simply a dream to work with, and I'm saving up to upgrade to a 3090 Ti since I need the VRAM and ray tracing for 3D animation and AI (RTX accelerates AI and 3D renders, and my 1080 Ti isn't cutting it anymore with only 11GB of VRAM and no RT). Old Xeons absolutely rip, even against new AMD CPUs with fewer cores; I've seen a single Xeon like the 18-core 2699 v3 compete with a Ryzen 7 3700 despite being old X99.
You can also feel above most enterprise hardware by buying DIY prosumer/workstation-class hardware instead, like the ASUS WS C621E SAGE (C621), ASUS Z10PE-D8 WS (X99) or ASRock X79 Extreme11 (X79), plus the various Chinese-brand X79 and X99 boards, which are good value now: the name-brand boards on the used market, and the Chinese ones new. I guarantee many of these boards are above the quality of, say, many Supermicro and Tyan boards and a good chunk of HP, Lenovo and Dell boards (and many also have IPMI, etc.). Only the very high-end and truly proprietary boards are really different enough. =)
Literally cannot go back once you went workstation grade hardware. That stability is amazing. I got a Xeon w5-2465x with 128GB DDR5 ECC RAM, plus a RTX A4500. Storage I went with Optane 800GB U.2 drive, and a Micron 7450 6.4TB drive. Pricy yes, but stable and reliable.
You probably should spread RAM chips evenly between the CPUs. Otherwise, the second CPU is starved, as it has to stall whenever it needs data. The second CPU board has memory slots for a reason.
I too got a T7820 ($200), and after upgrades it's running dual 6126s for 24c/48t and 12x32GB (384GB) of RAM. It's a very capable platform for workstation and home server duties!
Holy moses you got a good deal!!!! A T7820 here is about $1000 with some pedestrian CPUs and 64GB RAM. Just the 12x32GB would cost $600 if not more (€50 is about the lowest per 32GB RDIMM). I got the cheapest T5810 I could find for €180; it came with an E5-1620 and 32GB in 8GB sticks. Now it has 96GB (€120) and a 2683 v3 (€30). If I could find a T7820 for €600 I'd be pretty happy.
@@jainayrogeorge2924 Like most PCs, with the appropriate software, yes. Some hardware you need to be a bit selective about, but basically all PCs "support" virtualization. You'll want enough RAM, CPU cores and supporting hardware for your VMs though; my box right now has 14 CPU cores, 96GB RAM and 5 gigabit NICs, which is enough to run a few decent virtualized servers.
Dude, I feel you. The pain I receive daily trying to shoot videos is incomprehensible to my old self. I was a passenger in a car accident in 2018. Almost died, but was blessed to walk away with broken ribs, herniated discs and vertebrae knocked out of alignment through my back and neck. I had to medically retire out of the military and adjust to a new life. I'm glad you were able to walk away and have a solid support base. Just found you, but love the video and your perseverance. Keep going.
@@RushinVr6 yeah that's pretty amazing. Kratom is insanely useful though. I know a couple people that went through hellish accidents that have used it for over a decade now. It's finally getting studied thoroughly in depth in the states for the compounds that are found in the leaves, but it will never be used to treat pain because it would ruin big pharma's opioid dealing
Lay a red light strip on your back; it regressed my disc hernias. Not far-infrared, because that will burn your eyes, just LED red light. 10 to 15 minutes a day for 14 days straight. Repeat as necessary, and wear goggles.
I rarely sub off of one great video, but with you confirming this is the norm for this channel, I have subbed. Also, very rare, I added this video as a FAVORITE. A FAVORITE! A truly rare event! Why? Because I might build one of these.
I just got a T7820 w/ a 4110 and 32GB of RAM that I won a bid on for $178. After upgrading the CPU to a 6140 and 48GB of RAM and selling the WX 2100, I'm still under $200. Love these older workstations.
I asked this question on your Discord as well. Apologies for the repeated wall of text. I like the idea of the 40 core budget monster for video editing and transcoding. That leads me to my question here. Why don't tech tubers ever show things like numbers of simultaneous transcodes with Plex, or Handbrake HEVC veryslow transcodes or AV1 preset 1 or 0 transcodes? Most just say something like "Handbrake transcode speed in XX minutes," but never say their settings. I really want to build this 40 core 80 thread beast, but I need more info and stats.
I wish there were an easier way to cool these. I had to design custom ducting, fan mounting, and a service for Linux to dynamically adjust the fan speed.
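For anyone attempting something similar, here is a rough sketch of what such a dynamic fan-speed service can look like. Everything hardware-specific is an assumption: the hwmon PWM path, the temperature thresholds, and the use of `nvidia-smi` for temperature readout all need adjusting for your own machine.

```python
"""Sketch of a user-space fan-speed service for a passively cooled GPU.

Assumptions (adjust for your hardware): GPU temperature comes from
`nvidia-smi`, and the chassis fan is exposed as a PWM sysfs node such
as /sys/class/hwmon/hwmon2/pwm1 -- that path is hypothetical.
"""
import subprocess
import time


def temp_to_pwm(temp_c: int) -> int:
    """Map GPU temperature (deg C) to a PWM duty value (0-255).

    Quiet floor of 60 below 40 C, linear ramp to full speed at 80 C.
    """
    if temp_c <= 40:
        return 60
    if temp_c >= 80:
        return 255
    return 60 + (temp_c - 40) * (255 - 60) // 40


def read_gpu_temp() -> int:
    """Read the GPU core temperature via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader"])
    return int(out.split()[0])


def run(pwm_node: str = "/sys/class/hwmon/hwmon2/pwm1") -> None:
    """Main loop: poll the temperature and write the fan duty every 5 s."""
    while True:
        with open(pwm_node, "w") as f:
            f.write(str(temp_to_pwm(read_gpu_temp())))
        time.sleep(5)
```

Wrapped in a systemd unit, something like this is all the "service" amounts to; the real work is the ducting.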
@@execration_texts Hi, sorry for my bad English. I have a Tesla M40 and installed the heatpipe cooler from an NVIDIA 980 Ti with 3 fans (max 4,000 RPM); under stress it reaches a maximum of 65°C :) Now I'm trying to cool the Tesla P100 16GB :)
@@execration_texts I thought there were aftermarket powered fans that fit on them, so the only thing needed is a spare fan power connector... and a wide chassis.
I saw this and ended up getting this same barebones 7820 and OH MY GOD. If they didn't use all proprietary parts I would never build DIY PCs again, and personally it's gonna take a lot to get me off the used workstation train from now on. Awesome video.
You should replace the GPU with an Arc A750. Hear me out: QuickSync beats the crap out of CUDA on encoding, and you can encode in VP9 directly, so you save time on the YouTube conversions. Try it in a video with this nice encoding CPU setup and you'll see what I mean. I replaced my Quadro RTX A4000 with an Arc A750 and I now get 40+ simultaneous 4K HEVC (H.265) to 1080p H.264 transcodes in Plex with zero buffering. However, I also have 4x 4TB PCIe 3.0 NVMe SSDs (bifurcation, which your motherboard supports) in RAID 5, so 12TB total as my storage, and my transcode drive is a 128GB RAM drive that dumps to disk before a reboot. This RAM drive dump happens lightning fast too. I might increase the RAM drive to 192GB, because I have 256GB of DDR3 ECC total and I can never seem to go over 50GB of RAM usage with 200 Chrome tabs open and other apps.
What is your base workstation setup based on? What RAM disk? What RAID card? How much did your setup cost? Could you please elaborate? Sounds like you use it as a home media / lab server setup, exactly what I'm thinking of! Any feedback on how it handles virtualization, and *especially* what its power consumption is like and how loud it gets under different loads? Thanks in advance!
@ The RAM disk is probably just system RAM used as a storage device with tmpfs or ramfs. Likely not a RAID card; probably a PCIe x16 to 4x M.2 splitter with software RAID 5.
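If anyone wants to replicate that guess on Linux, it looks roughly like this. A sketch, not a recipe: the device names, mount points, and tmpfs size are all assumptions, and every command needs root.

```shell
# Sketch only -- device names (nvme0n1..nvme3n1) and sizes are assumptions.

# 1) RAM disk: a tmpfs mount uses system RAM as a scratch/transcode drive.
mount -t tmpfs -o size=64G tmpfs /mnt/ramdisk

# 2) Software RAID 5 across four NVMe drives sitting on a PCIe x16 ->
#    4x M.2 bifurcation card (no hardware RAID controller involved).
mdadm --create /dev/md0 --level=5 --raid-devices=4 \
      /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1
mkfs.ext4 /dev/md0
mount /dev/md0 /mnt/storage

# Persist the array config so it assembles on boot.
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
```

The "dump to disk before reboot" part would just be an rsync from /mnt/ramdisk to /mnt/storage in a shutdown hook.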
@@davidmcken I’ve never tried, but my personal opinion is that the A580 or A750 are just priced better for the performance you get. If you wanted to squeeze every possible transcode out of the setup? Probably, but it’s not worth the $100 premium over the A750 and certainly not the $140-ish premium over the A580. Maybe a 10% increase in transcodes? Maybe 20% increase? Definitely not worth double the price over the A580 or the 50% price increase over the A750. A580 appears to be the sweet spot or the A750 if you have the extra $30 budget is a no brainer. If you have a desktop Intel CPU, remember to disable the integrated GPU completely prior to installing Plex. And for me, Windows appeared to give me better transcode performance than Linux, just another hint that could hopefully save you some time. Intel still has work to do on their Linux Arc drivers.
@@bleeb1347 OK, np. I'm seeing A770s for about $300 atm, and I'm looking at one since Level1Techs are hinting it can be cross-flashed to a Flex, opening up the possibility of SR-IOV as well, and 16GB of VRAM seems reasonable for the price. If ffmpeg transcoding performance comes free for the ride, it's a good card for a multipurpose home lab server; at the current price you might as well "splurge" for the highest model so there's room to run stuff like OpenAI's Whisper or anything else I might want to throw at it on the AI front.
My first Dell Precision was a T5600, which I got for free from my employer (gifted/fully written off) after it was decommissioned in 2017; plus an extra 825W PSU that had been trashed along with another T5600 chassis. Still have it as a backup unit at our second apartment. I upgraded it to dual E5-2680's (from dual E5-2667) back in 2018. The only investment was a 0.5TB SSD, single use Win10 Pro license, and repurposed RX590.
Hello, fellow bad-back haver! Do whatever exercises your doctor/physical therapist gave you, on whatever schedule they gave you, or on some regular schedule if you didn't get a schedule. Also, get a back brace. Also also, if your car doesn't have lumbar support for its seat, consider getting a lumbar support pillow. I found one at an auto parts store and I wanna buy like ten more, because that one took me from not being able to comfortably drive to often feeling better *after* driving than before!
I got my 5810 from the same place a couple years ago. Not a powerhouse but I was super happy with the purchase - it has been my "living room"/media PC ever since... and I play pretty modern games on it all the time. 1650 v4. I also do some music production and it handles a crapton of channels and effects quite well, especially after a small ram upgrade.
Crazy that the 14700K gets 36,000 points in Cinebench R23, and for a while you could get CPU, RAM and motherboard for $500 in a bundle deal at Micro Center. Of course, the final PC build would end up being twice the cost of this, but it's pretty crazy how far hardware has advanced.
I got the single-CPU ThinkStation because it had a good 8-pin for a 3060 12GB. I'm never going back. Came with a Xeon 2135 and 64GB of RAM for $190. This thing just runs beautifully. Congrats on the build; I definitely used your earlier videos to look further into this, so thank you.
@@kevboost I needed cores over speed. Nothing I do is terribly CPU hungry, but I do a lot of different operations at once. The 2135 is a 6-core and all I could really afford at the time. I'm looking hard at a higher core count processor, or going to 128GB of RAM next.
I understand that the Xeon Gold 6138 only has AVX-512, unlike the 6148, which has more instructions. Is the difference in instruction sets a problem, or is the Xeon Gold 6138 still pretty good with just the one? Regardless, incredible video 🔥
Just so you know, a 7950X3D benchmarks around 38k in Cinebench when optimized (it's my personal workstation CPU), above a Threadripper for a little less; it's an under-$500 CPU now. But that's a killer value build, I love it!
Arrived for the tech, but fell in love with your cats. My twin brother has a tuxedo-marked cat; who knows what his breed is, he was a Humane Society rescue. But Toby-cat Martinez was supposed to be for my brother's 11-year-old son. Like his dad, he would have preferred a dog, but Toby-cat was what he got. My nephew did none of the upkeep work for his new pet, and Toby-cat quickly figured out what side of his bread was buttered. He soon became my brother's alarm clock, and I watched my twin go from a dyed-in-the-wool dog dude to a Cat Dad. Pretty soon he was serving Toby-cat 3 cheeseburgers a day (kidding, that is our fav food). Through COVID, my brother's office became his home office, with Toby-cat either at his feet or making his presence known when he wanted back into the house. Both your cats are adorable--says the twin who has always loved cats.
As a reference, while this system can handle 768 GB of RAM across two CPUs, a Threadripper 7xxx system can take 2TB of RAM. And that's on a single socket board. If you went dual socket EPYC, the max RAM would be 4 TB.
I've seen a dual socket mobo bundled with two 7501s on aliexpress for $800. Not sure if it's a scam or we're entering an era of cheap old epyc workstations
I got a T5820 during covid when everything was unobtanium. Replaced the CPU with a 12core v3 CPU and that workstation did everything I needed until I built myself a new machine. Since then it's been my proxmox server and I'm just getting around to replacing it with an Epyc system. For the ~$400 I put into it it was definitely worth it.
@@dimidimi6243 I know it's months later but in case anyone else is interested a Xeon E5-2678V3 was the replacement CPU. Cost me $80 back in 2020 you can get them now for $20.
The video memory issue can be fixed with NVIDIA's Tesla series; for example, a Tesla M40 goes for around $100 and has 24 gigs of GDDR5, although you will need to tinker with cooling.
Last year I bought a barebones Precision 7920 (a wider 7820 with a bit better cooling and more expansion slots) for $460, a pair of Xeon Gold 6146s for $230, and 128GB of RAM to use as a home server. I loaded it up with 8 3.5" HDDs and a few GPUs, and installed Proxmox. I ran Cinebench R23 and got 1084 SC and 27794 MC. I ended up selling it (for a profit at least) because I really just didn't need it lol. It was a beast though, and I wish I had found a reason to keep it. Cheers
Very nice. I also considered a very similar route with either the Dell or the Lenovo dual-CPU machines, but there is that issue with a lot of software being unable to make use of both CPUs. I watched a LOT of Miyconst videos as well. Turns out that there is a Xeon chip with 18 cores and 36 threads, but not all cores can Turbo Boost. Except Miyconst and his friends found a way to unlock Turbo on ALL cores. It's the Xeon E5-2699 v3. This is only a 3.6GHz boost clock, but with 18 cores, even at that speed you need some pretty aggressive cooling. HOWEVER, it turns out there is a special version of this CPU. It's NOT in the Intel Ark database, which means it was a custom configuration for a single OEM; I'm guessing for a Mac Pro, but I have no idea. What I DO know is that this variation, the E5-2696 v3, is the same silicon as the 2699, except it has a max boost clock of 3.9GHz, plus some additional instruction sets not found on the commercial version. And thanks to the BIOS injection revealed by Miyconst et al., with water cooling that means 36 threads at 3.9GHz, which should be buttery smooth for editing. I'm a little miffed that my 3080 FE only has 10GB of VRAM, but everything I've read about the cards packed with RAM chips from this generation forward tells me that they are just running too hot. They will work fine for a while, but sooner or later they are going to fail. Not like my GTX 980 Ti, which is as good as the day I bought it. In fact, it's going into the same rig; I can keep the VRAM on the 3080 at 80°C max under stress tests. And aside from using the 980's analog output just for dedicated gaming on a CRT (that never gets old), I can still do things like dedicate OBS capture to it, separate from the 3080, and use its 6GB of VRAM for additional compute for rendering or whatever. The machine is 3/4 built right now; just modifying the case for an external drive bay and adding some additional fans.
I'm making the case have positive air pressure to keep out dust. All said and done, it will have an MSI X99A MB with the OEM version of the 2699 v3, 64GB of quad-channel ECC RAM, a 1500W Seasonic PSU, a Corsair AIO, the RTX 3080 and GTX 980 Ti, Wi-Fi, a 500GB NVMe Samsung 980 system drive, a 2TB Samsung 970 EVO working drive, and a 16TB enterprise HDD for cold storage, all in a modified mid-tower whose name just left my mind. The popular one with the metal bar that hides the cables? And I've started a separate NAS device with a couple of enterprise-level HDDs that will only serve as backup; once any backups are complete, it will be disconnected from the network and the internet. But it's fun as hell building these kinds of rigs and seeing just how much performance you can squeeze out while doing it on a budget. I love what you've done, and I'd REALLY like to talk you out of that Titan X. That is the VERY BEST card ever made with an analog output, almost identical to my 980 Ti but with double the VRAM. I have a nice water cooler if you ever want to let it go....
Same here, dual 2673 v4s with a radeon rx6800xt and 128g of ram. Love my Z640. Thinking of going with a HP Z4 G6 build for gaming...once again on the cheap.
Nice! I just recently got a T7910, so a little bit older, but 28 cores / 56 threads + 64GB of DDR4, a GTX 1080, and a 500GB NVMe drive with adapter. It's quite a beast, I might add. I may eventually drop my RTX 2080 Super in there.
I was racking my brain trying to figure out how you got 40 cores under $800…when I have a 36 core T7920 under my desk. Lmao I’m stoked Skylake-EP is getting super affordable. Big core counts, lots of PCIe lanes, and common bifurcation support makes it a nice upgrade over Haswell/Broadwell. Mine’s running a ProxMox lab after I pulled it from NAS duty in favor of a 4c Kaby Lake Xeon.
Older tech is cool. I got a Dell PowerEdge R620, 2x CPU, 144GB RAM, 4x 1.2TB SAS drives, altogether around $410. I also got a Dell PowerEdge T410 for $50 and put 6x 6TB SAS drives in it. I also found a Lenovo B490 laptop at my brother's office with no power supply; I happened to have a spare Lenovo X220 power supply and it worked. $29 for an SSD and 16GB of RAM pulled from old laptops, and it's all good.
Hey, if you use ZFS + zstd compression you can achieve (# of drives) x (speed of a single drive) x (compression ratio); I get between 1.7x and 3.64x compression. The result is crazy fast storage, which also happens to have a RAM disk cache (i.e. the ZFS ARC). I'm getting nearly 14GB/s read speeds off a laptop with 2x Gen3 M.2 SSDs...
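That back-of-the-envelope formula is easy to sanity-check in a few lines; the 3.5 GB/s per-drive figure below is my assumption for a typical Gen3 M.2 SSD, not a number from the comment.

```python
def effective_read_speed(n_drives: int, drive_gbps: float, ratio: float) -> float:
    """Rough effective read throughput (GB/s) for a striped pool with
    inline compression: each GB read from disk decompresses into
    `ratio` GB of logical data, so apparent throughput scales with it
    (ARC cache hits would push it even higher)."""
    return n_drives * drive_gbps * ratio


# Two Gen3 M.2 SSDs at ~3.5 GB/s each with ~2x zstd compression lands
# right around the ~14 GB/s figure reported above.
print(effective_read_speed(2, 3.5, 2.0))
```

On the ZFS side, enabling this is a single property: `zfs set compression=zstd pool/dataset`.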
I'm *REALLY* interested in this setup's power consumption & how loud does it get under different loads. I would give it a try without GPUs as a home lab server with virtualization. Would you 🙏please consider making such video? Thanks!
Funny that you bring Miyconst up. I had a little talk with him about a nas server solution not long ago. I think he is very knowledgeable in this field
I use an 8U array: two IBM x3850 X5s configured as a 2-node system using QPI cable interlinks, with 8x Intel Xeon E7-8870 v1 (octa-socket, deca-core). Took me ages to source the QPI cables. An NVIDIA GTX 770 2GB is an entry-level GPU for 4K desktop real estate, for which I use a 43" Hisense smart TV.
I like the Quadro GP100s. Why? HBM memory (lower temps) and if you add another GP100 and NVLink bridges the two cards become two Tesla P100's with on board cooling. :)
Old workstations are amazing. I've used an old HP Z200 with a Xeon X3440 and 16GB to host a Minecraft server and home NAS. Upgraded it to an HP Z400 with a Xeon W3565, which now has 40GB; I still need another two 8GB sticks for the full 48GB you can fit.
I typically stay away from that gen of Xeon for budget workstations and instead go first-gen EPYC, as they are super easy to OC, and the Enermax TR4 (gen 2) has no problem keeping them cool. My 32-core is at 3.8GHz, and you get more expandability and can go 64-core if need be. For storage I use 2 ASUS Hyper M.2 adapters with 8 NVMe drives (all 8 in RAID 0 with immense bandwidth/speed, since I don't store anything permanent there, just work and games), and I have 4x 8TB HDDs that I speed up immensely with PrimoCache, using RAM plus an extra NVMe as cache (apart from the 8 NVMe on the ASUS adapters, the motherboard has 2 NVMe slots, plus I have extra PCIe slots free). This is my baseline config for all my budget EPYC builds.
So glad you mentioned Miyconst; so far he and Tech Yes City are the only ones that routinely revisit Xeon cpus for pro-sumer applications. I wouldn't have done any of my computer builds without them.
Xeon Gold 6148s are going for as low as $80, which is the same core count but a higher clock speed and higher single-core performance too; 6138s are going for as low as $35, which isn't bad at all.
Good video. I just wonder how close that an upgrade to a 5950x from a 3700x on your original rig would have gotten you to this Xeon tower's performance.
I want to eventually test that, but I needed a second PC anyways so I wanted to try something new. I'm guessing the performance isn't too far off honestly
Damn, I get absolute shit performance on my 5900x in Eco Mode, I'm tempted to run R15 in PBO and see if the space heater amount of heat it generates is worth it
A 5950x would have better performance for majority of things. The Ryzen has a higher single core. Most games and workstations do not need 40C/80T. Also, not sure if you know or care, but your idle consumption is probably in the 130w range now, with that system easily pulling 250w under load. Once again, awesome pcie expansion of you need it, but completely overkill for what you use it for. I have the 5950x, a dual Xeon scalable gold 6138, Xeon e-2146g and the i5-14500. The 5950x with dual Nvme drives, AMD 6700xt with 3 monitors pulls about 90w idle, 150w under load. Golds are at about 70w idle (just boot drive, 64gb ram,10gbe nic), and goes to about 250w under load. 2146g pulls 48w idle, 80w load with dual ssd mirror boot, dual mirror nvme, dual mirror 18tb enterprise drives, 128gb ram, with 10gbe nic for the truenas server. 14500 has single nvme, 64gb ddr5 and 2.5gbe nic. It is used for Plex server/transcoding and home assistant under virtualized proxmox. Idle is 20w, load about 40w. Either way, I understand the cool factor of enterprise gear. Realistically, it is not efficient nor suited for most tasks, nor really great for a home server either.
XEON GANG LET'S GOOOOOOOOOOOO! I bought a used Dell T5810 workstation, added a second 200-something (240 I think?) GB SATA SSD, slapped in an RX 6600, initially ran it with 12 gigs of DDR4 but recently raised that to 20GB, and replaced the Xeon E5-1650 V3 it came with with an E5-1650 V4, and I'm gaming pretty well. My next planned upgrades are a Xeon 2697 V3, more SSD (maybe a 2TB M.2 using a PCIe-to-M.2 converter), and maybe in a couple of years a 6650 XT or 6700 XT.
I have the following as my main music making rig Dell Precision T3600 : Intel Xeon E5-2690 @ 2.9Ghz (8C/16T) // 32GB ECC RAM// Gigabyte Nvidia RTX2060 6GB//2TB Samsung 860//2TB WD Black// it does so good
Was there a noticeable difference upgrading from the 1650 v3 to the 1650 v4? I built an X99 platform gaming PC/media server last year and decided to stick with v3-gen CPUs since they have overclocking capabilities for gaming performance. I was super lucky and won a bid on a 1680 v3 for around $60 on eBay not too long ago, so I swapped out the 1660 v3 I had bought a few weeks prior, and it definitely made a small increase in performance, but idk if it was justifiable. This YouTube channel called "Miyconst" is like the official X99/Xeon tech guru of YouTube, and his testing shows the best CPU for this platform is ultimately the Xeon 2697 v3 locked at max frequency boost, which usually requires a true X99 chipset motherboard. I thought about going with a Xeon 2689 v4 if I can find one for a reasonable price, but I really do appreciate the ability to tinker with overclocking the v3 CPUs. Just curious what your opinion is on going with the v4 over the v3 CPUs?
Bought an HP 620 workstation off Ebay, upgraded the Xeon x2 cpu(s) and Video Card and "Boom" a powerful workstation that games. Only down side is it produces a lot of heat. People underestimate old workstations. Best value/$.
I would've loved to see how much power it draws at its maximum potential, but regardless, amazing video, nice cinematic shots and great storywriting skillz. Hope you have a safe recovery.
I got myself a home server, an HP Z640 with dual Xeon E5-2680 v4s, so I have 28c/56t of Broadwell (not Skylake) and 128GB of RAM. I'm also running TrueNAS, which makes it so all programs are NUMA-aware and will run across all CPU cores. All in, I spent less than 300 USD, and that thing is a beast for anything I throw at it. My average CPU utilization is about 2 percent with a boatload of services running.
@@RaduRadonys Mate, it was less than $300. I also use it as a Blender render node, so it'll max out occasionally. The RAM is for when I get around to it: I'll get a pair of 10-gig NICs for my main PC and the server and just use it as an external drive, with speeds up to 2.5GB/s, which is on par with some lower-end NVMe drives.
@@RaduRadonys Also do you use 100% of your car? I doubt all seats are full constantly all cargo space is full always, in day to day life you probably could get away with a smart car, but no one buys one because it's nice to have overhead. It applies even more if you own a pickup and use it as a daily driver.
Will it run Stable Diffusion? SDXL? SVD? Very curious if I can go this route with an old 1080 Ti and get SDXL running. I got hit like that outside Kuala Lumpur, by a giant dump truck trying to pass on the left while we were in the right lane waiting to make a right (right-side driver; I was in the passenger seat). The seatback broke and sent us into oncoming traffic, which was light, so no further impact. I was flying home that night and had to sit on a metal milk crate in a cinder-block, tin-roof "police station" to do my witness report. I made the flight and stood in the aisle for most of it because my back was killing me. I still wonder today what damage was done that is nagging me now, some 30-ish years later. So very good luck to you, sir, on your recovery.
Yep, I'm running A1111 on a similar machine, but with 80GB RAM and an RTX 3060 12GB. Keep in mind some tweaking may be necessary due to weird AVX support; Xeons sometimes lack the consumer instruction extensions, but I think the Xeons shown are probably OK.
Gah, I have a thing for these massive CPUs, and you've just informed me that they're relatively accessible... Stop giving me ideas, I'm already spending too much money!
This is the same thing I did. I was looking for a PC to run my firewall, went on eBay, and found so many prebuilt Lenovos and Dells that were much cheaper than building yourself; eBay is flooded with these workstations. The only downside is that power supply, like you mentioned.
I would use a server as the base, something like a Dell R720 or R730 or an equivalent tower model. (I may go with an EPYC 7D12 and an H11 board.) For GPU, I would choose a P40 or P100: they have tons of memory with Pascal cores and are much cheaper due to the lack of video output. If not installed in a server, you need to 3D print a fan shroud for it. For NVMe, I like the PM983 and PM9A3.
Hi, amazing info. I am doubtful that both Xeon Gold 6148s will be utilized/recognized by Win 11/10; some say there is a CPU clustering issue with the OS. I am not aware of the exact problem, but some people switched to Linux. Did you have a similar issue with your build?
The problem is that barebones workstations are still expensive in some countries, and the motherboards Dell makes are a custom design. Is there a motherboard out there that is cheap and widely available?
5:38 It's 6-channel RAM (not quad, not dual), so you should run a RAM speed test and tell us the results. (I assume you have 12 sticks of RAM so all slots are populated?)
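A quick way to sanity-check that: compare a measured copy speed against the theoretical peak for the channel count. A rough sketch in Python with numpy; the DDR4-2666 transfer rate is an assumption here, so substitute your actual DIMM speed.

```python
import time
import numpy as np

def peak_bandwidth_gbs(mt_per_s: int, channels: int) -> float:
    """Theoretical peak: transfer rate x 8-byte bus width x channels."""
    return mt_per_s * 8 * channels / 1000  # MT/s -> GB/s

def measured_copy_gbs(size_mb: int = 512, reps: int = 5) -> float:
    """Time a large memcpy; counts both the read and the write traffic."""
    src = np.ones(size_mb * 1024 * 1024 // 8, dtype=np.float64)
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        np.copyto(dst, src)
        best = min(best, time.perf_counter() - t0)
    return 2 * src.nbytes / best / 1e9

# One socket of DDR4-2666 with all 6 channels populated
print(f"peak: {peak_bandwidth_gbs(2666, 6):.0f} GB/s")
print(f"measured copy: {measured_copy_gbs():.1f} GB/s")
```

If the measured number is far below peak with only a few sticks installed, unpopulated channels are the usual culprit.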
I did one similar to the first idea you had, a water-cooling build, back in 2016, and I regretted every single decision: dual E5-2690 v2s that chug power like it's nothing. I used it as my NAS + web host + VM test bed, and it was the worst build I've made value-wise, besides those abandoned water-cooled PCs I made even earlier. At least it runs fine.
Put an Intel Arc in to decode the H.264 or H.265 via Quick Sync and you can use this system or your previous system without problems; worked wonders for me.
I got that SSD; make sure to do the firmware update on it from Solidigm. Mine was 6.4TB for $350, PCIe 4, rated for 5 DWPD for 5 years. Dell still lists the exact same drive as a BTO option on their server builds for $18K. Mine had zero hours. I have been so frustrated by computers in recent years, mostly expensive for slow (single-threaded) server-grade hardware. Last summer I deployed an EPYC 9124 (Zen 4, 3.7GHz) with a Supermicro motherboard, so I get 12 DDR5 channels, 112 PCIe lanes, and decent single-threaded performance: about 2x faster single-threaded than the fastest frequency-optimized Zen 3 EPYC.
I used to be a big proponent of re-using workstations, but with the impending Win 11 hardware apocalypse (Win 10 EoL in October 2025) it will be harder to justify older hardware. This is a shame, as I have given away an old Z620 and a Z820 in recent months that were still going strong. It means you'll likely have to swap to Linux if you can, if you're using DaVinci, but that leaves the question of other software support under Linux. You could use the rig in an air-gapped configuration, but it's only a matter of time before applications cease to support Win 10. Sadly, even the Z840 I just decommissioned doesn't run Win 11. It may not be as big an issue next year when picking up workstations, but you're likely looking at systems from 2021/2022 onwards and the mismatch of which Xeons are actually supported. I know there are workarounds for Win 11, but MS is closing them down.
My guess...with the impending economic crisis and collapse of new HW purchases (see latest AMD sales), there will be HUGE pressure on MSFT to relax the TPM 2.0 requirement.
A mate did something similar. He was looking at new Intel desktop chips and I just put it to him: have you looked at old servers? 40 cores, 80 threads, 256GB of ECC RAM, redundant power supplies, and dual M.2 NVMe drives in a single-slot rack mount cost him less than just the i9 CPU and motherboard he was looking at would have.
Bought an HP Z620 years ago for $200. Upgraded the Xeons to E5-2667 v2s. It came with 64GB ECC, two 2TB hard drives, and a 6GB video card. For maximum productivity at minimum cost, old workstations are the way to go.
@@dave7244 If you aren't computer literate, you buy a machine from Best Buy or an iMac. John Wayne said: "Life is tough; it's tougher if you're stupid."
I got a 6148 on an ASUS C621 Sage mobo with an A4000 that I use as my home server. The 6148 has the same single-core turbo as what you have, and it's not really snappy in Windows. I'd go with a 5950X and 64 gigs of RAM if I were you.
Join the discord!!!!!!!!!!!!!!!! discord.gg/2Wj8WanUzn
I'd have gone with a 4xxx-series RTX as they support the AV1 codec, and learn to use DaVinci Resolve (I got a key with a small keyboard); good setup for YT videos if you upgrade it later.
@@mrsrhardy ANYTHING newer than the Titan is a good choice. Most likely an A770 or RTX card will end up in this workstation
@@aChairLeg Checked out a few of these systems on ebay and can't find anything close to the price you paid. Nice work as it's quite a workstation.
I was thinking of replacing my x299 socketed i9-9900X with one of these, however that PSU is a real downer. I recently purchased an ASUS ProArt 4080 Super and it's powered by 3x 8 pin PCIe (with the included adaptor).
Strange that it has so many full length PCIe slots with nowhere near as many PCIe power cables as you'd need.
For your passive cooling test... look into getting a Tesla M40 24GB PG600 (the PG model indicates internal generation, which means better write speeds, so the PG600 is the last of the line and fastest). They are only about $80, but they're server-grade with no peripherals, so you need a Quadro 600 to pair with: you designate the program to run on the Tesla and route the output through the Quadro. It's basically the same architecture and frame as the Titan X, so use those drivers, and any 3rd-party stuff like water coolers meant for the Titan X will fit. Hope this helps; it took me a fair amount of research to piece this together to build a VDI gaming server with a few of them linked via NVLink for shits n giggles. It's probably the cheapest $/GB GPU solution currently, and it's wild to build out ~100GB of GPU memory for the price of a 3060.* But you definitely will need more cooling for them; they are power-hungry fuckers at 300W each (around ~220W nominal in my workloads), so I highly recommend the 3D-printed fan adapters for them on eBay. You'll also want special splitters that turn TWO 8-pin PSU cables into one 8-pin per GPU, as each 8-pin only gives around 150W, and you can't use 6-pin converters because a 6-pin is only around 75W. Power is probably the main reason they're so cheap and people haven't caught on, that and not many motherboards support multi-GPU/NVLink operation anymore. Realistically, it's easy to build out on something like you've built here: give them 2 PSUs and set it up to only run what you actually need, because frankly even 1 is a beast, let alone 2 paired with NVLink, or FOUR. So run 2 per cheaper PSU, with one PSU as the master for the motherboard and everything else, and the other only for the GPUs themselves.
In a configuration like yours, I'd have one PSU power the first GPU on each motherboard, and the 2nd PSU power the second. It shouldn't hurt them not to have that PSU's power controlled via motherboard communication; they're power-hungry server-grade fuckers with their own built-in power management, because they're designed to be plugged into hosts that might not necessarily control power for them. This also keeps them from all pulling the potential 75W from the PCIe slot at once, reduces heat on the motherboard, and with a modular PSU you can route it outside the case to redirect that extra heat elsewhere. Food for thought for anyone going down this dark road. I suggest getting a v4 though, they've really been coming down lately, unless you want to attempt the EPYC dark road. Frankly, there are arguments to be made for getting a dual-EPYC SP3 board and only building out one socket at first, with the expectation of adding a 2nd CPU when you need to expand, to make the machine more future-proof. Bear in mind that the EPYC CPUs with a P in the name don't support dual-socket configurations, for instance the EPYC 7502P vs the 7502. Anyway, that's just what I've been playing with lately for an RSPS.
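The connector math above is easy to get wrong, so here it is as a small sketch (Python; the wattage figures are the usual PCIe rules of thumb the comment cites: 75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Rule-of-thumb PCIe power budgets in watts
CONNECTOR_W = {"slot": 75, "6pin": 75, "8pin": 150}

def available_watts(connectors: list[str]) -> int:
    """Total power the listed connectors can safely deliver."""
    return sum(CONNECTOR_W[c] for c in connectors)

def power_ok(tdp_w: int, connectors: list[str]) -> bool:
    """True if the connector budget covers the card's TDP."""
    return available_watts(connectors) >= tdp_w

# A 300 W Tesla M40 off one 8-pin plus the slot falls short...
print(power_ok(300, ["slot", "8pin"]))
# ...which is why the two-8-pin-into-one splitter trick is needed
print(power_ok(300, ["slot", "8pin", "8pin"]))
```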
Discord is for kids.
The thing about old workstations is that they were once worth thousands and thousands for a reason! A 10-year-old workstation can go further than most people realize.
The other side is that these beasts are designed to be able to work 24/7 for a few years. And because of that, often they have been run 24/7 for years. I had a Z800 (well I still have it, but it's dead), and eventually all the RAM went bad and had to be replaced. Then suddenly it just died on me.
Until they break down, past warranty and no vendor support. We had to tell a client we can't support his system any longer. Motherboard issues along with part failure.
Got a 2010 MacPro Tower, with two Xeons. Thing is still usable for high-end 3D work today.
Dell T5500, 144GB RAM, 2x X5670, 2060 12GB GPU. Got the box for $100 AUD, the RAM for $25 per stick (9 sticks), plus NVMe and a USB 3.1 card; total cost to build the system in 2021 was about $800. Win 10 Pro OEM Dell.
Fantastic video editing machine.
@@tomleykisfan7280 And you got it on a budget. Maybe I spent too much on mine two years ago.
now you have ascended to the enterprise hardware realm and can look look down at all the plebeians with their consumer hardware
It's just simply so much better
@@aChairLeg It is. I got a Chinese motherboard and built a dual Xeon E5-2698 v3 system: 32 cores total at 3.6GHz, 256GB of DDR4, a 1TB M.2, a 3.2TB 12Gb U.2 drive, and a 1080 Ti to top it all off, running Linux. The CPUs have taken every single large task I throw at them no problem. I'll be rendering in Blender and doing local AI at the same time while web browsing, with no slowdown whatsoever. I only paid $500 for my system too, since I reused the graphics card and power supply. It's simply a dream to work with, and I'm saving up to upgrade to a 3090 Ti since I need the VRAM and ray tracing for 3D animation and AI; RTX accelerates AI and 3D renders, and my 1080 Ti isn't cutting it anymore with only 11GB VRAM and no RT. Old Xeons absolutely rip against even newer AMD CPUs that lack the core count; a single 18-core Xeon like the 2699 v3 can compete with a Ryzen 7 3700 despite being old X99.
You can also feel above most enterprise hardware by buying DIY prosumer/workstation-class hardware instead, like the ASUS WS C621E SAGE (C621), ASUS Z10PE-D8 WS (X99) or ASRock X79 Extreme11 (X79), plus multiple Chinese-brand X79 and X99 boards (new),
for example. These are more affordable now: good value on the used market, and the Chinese ones are new. I'd argue many of these boards are higher quality than, say, many Supermicro and Tyan boards, and a good chunk of HP, Lenovo and Dell boards (many also have IPMI, etc.). Only the very high-end and truly proprietary boards are really different enough. =)
I bought a rig with one of these threadripper pro 5995wx .... is that enterprise?
Literally cannot go back once you went workstation grade hardware. That stability is amazing. I got a Xeon w5-2465x with 128GB DDR5 ECC RAM, plus a RTX A4500. Storage I went with Optane 800GB U.2 drive, and a Micron 7450 6.4TB drive. Pricy yes, but stable and reliable.
Sorry to hear about your accident. Hope you get better soon.
I'm getting there thankfully!
I know this was for the incredible video maker, but as someone who had a nasty car accident in January, this positively impacts me as well.
@@tradingnichols2255 I hope you get better as well. Vehicle accidents suck!
@@ewasteredux Thank you. I greatly appreciate it!
@@aChairLeg womp womp
*"Two Hundred and FIFTY DOLLARS"* Aaaaand thanks to this video they are now 400 and up, _sigh._
You probably should spread RAM chips evenly between the CPUs. Otherwise, the second CPU is starved, as it has to stall whenever it needs data. The second CPU board has memory slots for a reason.
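To illustrate the point about starving the second CPU, here's a toy checker (Python, purely illustrative; the 6-channels-per-socket figure matches the Skylake-SP Xeons discussed in the video):

```python
def population_warnings(dimms_per_socket: list[int], channels: int = 6) -> list[str]:
    """Flag DIMM layouts that starve a socket or leave channels dark."""
    warnings = []
    if len(set(dimms_per_socket)) > 1:
        warnings.append("uneven split: the light socket pays remote-NUMA latency")
    for s, n in enumerate(dimms_per_socket):
        if n == 0:
            warnings.append(f"socket {s}: no local memory at all")
        elif n < channels:
            warnings.append(f"socket {s}: only {n}/{channels} channels populated")
    return warnings

print(population_warnings([12, 0]))  # everything on CPU0: two warnings
print(population_warnings([6, 6]))   # balanced: no warnings
```

With all sticks on one board, every access from the second CPU crosses the socket interconnect, which is exactly the stall described above.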
I too got a T7820 ($200), and after upgrades it's running dual 6126s for 24C/48T and 12x32GB (384GB) of RAM. It's a very capable platform for workstation and home server duties!
Nice!
Holy Moses, you got a good deal!!!! A T7820 here is about $1000 with pedestrian CPUs and 64GB RAM; just 12x32GB would cost $600 if not more (€50 is about the lowest per 32GB RDIMM). I got the cheapest T5810 I could find for €180; it came with an E5-1620 and 32GB in 8GB sticks. Now it has 96GB (€120) and a 2683 v3 (€30).
If I could find a T7820 for €600 I'd be pretty happy.
Does this PC support virtualization???
@@jainayrogeorge2924 indeed. Proxmox is running great.
@@jainayrogeorge2924 Like most PCs, with the appropriate software, yes. Some hardware you need to be a bit selective about, but basically all PCs "support" virtualization. You'll want enough RAM, CPU cores and supporting hardware for your VMs though; my box right now has 14 CPU cores, 96GB RAM, and 5 gigabit NICs, which is enough to run a few decent virtualized servers.
Thanks! Nice.
Dude, I feel you. The pain I receive daily trying to shoot videos is incomprehensible to my old self. I was a passenger in a car accident in 2018. Almost died, but was blessed to walk away with broken ribs, herniated discs and vertebrae knocked out of alignment through my back and neck. I had to medically retire out of the military and adjust to a new life. I'm glad you were able to walk away and have a solid support base. Just found you, but love the video and your perseverance. Keep going.
All without an opioid addiction? Ur blessed
Damn man. Glad you're alive. Hope things get better somehow.
@@RushinVr6 yeah that's pretty amazing. Kratom is insanely useful though. I know a couple people that went through hellish accidents that have used it for over a decade now. It's finally getting studied thoroughly in depth in the states for the compounds that are found in the leaves, but it will never be used to treat pain because it would ruin big pharma's opioid dealing
Lay a red light strip on your back; it regressed my disc hernias. Not far-infrared, because that will burn your eyes, just LED red light. 10 to 15 minutes a day for 14 days straight. Repeat as necessary, and wear goggles.
Hot Take: Dell (USED) Workstations are the best value to price. Especially during "refresh" cycles where companies liquidate them on eBay.
Dell servers are even better. Like a 930 with 4x 8880 v4 + 192GB RAM for $1000.
I love this approach- "We have the 1950X at home."
Your focus on older hardware reminds me of Iceberg Tech. Subbed!
I rarely sub off of one great video, but with you confirming this is the norm for the channel, I have subbed. Also, very rare: I added this video as a FAVORITE. A FAVORITE! A truly rare event!
Why? Because I might build one of these.
I just got a T7820 with a 4110 and 32GB of RAM that I won a bid on for $178. After upgrading the CPU to a 6140 and 48GB of RAM, and selling the WX 2100, I'm still under $200. Love these older workstations.
I took similar route and am in love with my HP Z840 workstation. Nvidia 3090 and it is a great AI monster for little money.
Never seen stacked cpus before, that’s really super neat
I asked this question on your Discord as well. Apologies for the repeated wall of text.
I like the idea of the 40 core budget monster for video editing and transcoding. That leads me to my question here. Why don't tech tubers ever show things like numbers of simultaneous transcodes with Plex, or Handbrake HEVC veryslow transcodes or AV1 preset 1 or 0 transcodes? Most just say something like "Handbrake transcode speed in XX minutes," but never say their settings. I really want to build this 40 core 80 thread beast, but I need more info and stats.
You can get an Nvidia Tesla P40 with 24GB of VRAM for about $200; great when video editing needs the VRAM.
I wish there were an easier way to cool these. I had to design custom ducting, fan mounting, and a service for Linux to dynamically adjust the fan speed.
@@execration_texts You're right, the Teslas are a little problematic to cool because of the passive cooling system.
@@execration_texts Hi, sorry for my bad English. I have a Tesla M40 and installed the heatpipe cooler from an Nvidia 980 Ti with 3 fans (max 4,000 RPM), and under stress it reaches a maximum of 65°C :) Now I'm trying to cool the Tesla P100 16GB :)
@@execration_texts I thought there were aftermarket powered fans that fit on them, so the only thing needed is a spare fan power connector... and a wide chassis.
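For anyone attempting the Linux fan-service approach mentioned above, the core of it is just a temperature-to-duty-cycle curve polled in a loop. A hedged sketch in Python: the `nvidia-smi` query flags are real, but the hwmon PWM path is an assumption that varies by board, so locate your own fan header under `/sys/class/hwmon` first.

```python
import subprocess
import time

def pwm_for_temp(temp_c: float, t_min=40, t_max=85,
                 pwm_min=60, pwm_max=255) -> int:
    """Linear fan curve: quiet below t_min, full blast at t_max."""
    if temp_c <= t_min:
        return pwm_min
    if temp_c >= t_max:
        return pwm_max
    frac = (temp_c - t_min) / (t_max - t_min)
    return int(pwm_min + frac * (pwm_max - pwm_min))

def gpu_temp_c() -> int:
    """Read the GPU core temperature via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.split()[0])

def run(pwm_path="/sys/class/hwmon/hwmon3/pwm1", interval_s=5):
    # pwm_path is a placeholder; check which hwmon device is your fan
    while True:
        with open(pwm_path, "w") as f:
            f.write(str(pwm_for_temp(gpu_temp_c())))
        time.sleep(interval_s)
```

Wrapped in a systemd unit, this is roughly the "service that dynamically adjusts fan speed" described a few comments up.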
I saw this and ended up getting the same barebones 7820 and OH MY GOD. If they didn't use all proprietary parts I would never build DIY PCs again, and personally it's gonna take a lot to get me off the used workstation train from now on. Awesome video.
Those PCIe slots support Bifurcation... You can get a 4X M.2 carrier board and put four NVMe drives in one slot.
I do this with mine at work and it’s nuts in RAID on Linux
You should replace the GPU with an Arc A750. Hear me out: QuickSync beats the crap out of CUDA on encoding, and you can encode in VP9 directly, so you save time on the YouTube conversions. Try it in a video with this nice encoding CPU setup and you'll see what I mean. I replaced my Quadro RTX A4000 with an Arc A750 and I now get 40+ simultaneous 4K HEVC H.265 to 1080p H.264 transcodes in Plex with zero buffering. However, I also have 4x 4TB PCIe 3.0 NVMe SSDs (bifurcation, which your motherboard supports) in RAID 5, so 12TB total as my storage, and my transcode drive is a 128GB RAM drive that dumps to disk before a reboot. That dump happens lightning fast, too. I might increase the RAM drive to 192GB, because I have 256GB DDR3 ECC total and can never seem to go over 50GB of RAM usage, even with 200 Chrome tabs open and other apps.
What is your base workstation setup (based on)? What RAM disk? What RAID card? How much did your setup cost? Could you please elaborate? Sounds like you use it as a home media / lab server setup, exactly what I'm thinking of! Any feedback how it handles virtualization & *especially* what's its power consumption like & how loud it gets under different loads? Thanks in advance!
@ RAM disk is probably just system RAM used as a storage device with tmpfs or ramfs. Likely not a RAID card, probably a PCIE x16 to M.2x4 splitter with software RAID 5.
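The 12TB figure in the parent comment checks out: RAID 5 gives you the capacity of all drives minus one, which goes to distributed parity. A trivial sketch:

```python
def raid5_usable_tb(drives: int, size_tb: float) -> float:
    """RAID 5 sacrifices one drive's worth of space to distributed parity."""
    if drives < 3:
        raise ValueError("RAID 5 needs at least 3 drives")
    return (drives - 1) * size_tb

# 4x 4TB NVMe on a bifurcated x16 slot, as described above
print(raid5_usable_tb(4, 4.0))  # 12.0
```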
Do you know if the A770 would be able to run more streams? Especially the 16GB version?
@@davidmcken I’ve never tried, but my personal opinion is that the A580 or A750 are just priced better for the performance you get. If you wanted to squeeze every possible transcode out of the setup? Probably, but it’s not worth the $100 premium over the A750 and certainly not the $140-ish premium over the A580. Maybe a 10% increase in transcodes? Maybe 20% increase? Definitely not worth double the price over the A580 or the 50% price increase over the A750. A580 appears to be the sweet spot or the A750 if you have the extra $30 budget is a no brainer. If you have a desktop Intel CPU, remember to disable the integrated GPU completely prior to installing Plex. And for me, Windows appeared to give me better transcode performance than Linux, just another hint that could hopefully save you some time. Intel still has work to do on their Linux Arc drivers.
@@bleeb1347 OK, np. I'm seeing A770s for about $300 atm and I'm looking at one, since Level1Techs are hinting it can be cross-flashed to a Flex, opening up the possibility of SR-IOV as well, and 16GB of RAM seems reasonable for the price. If ffmpeg transcoding performance comes free for the ride, it's a good card for a home-lab server. Being that multipurpose, at the current price I might as well "splurge" for the highest model so I have room to do stuff like OpenAI's Whisper or anything else I might want to throw at it on the AI front.
would be interested to see how performance would have compared with a straightforward CPU swap-out to a Ryzen 5950X
Thought the same thing.... Guessing same or better performance?
As the owner of a 22 core Xeon, 64 GB RAM and RTX A4000 graphics card, I understand your early impressions of using such a system.
My first Dell Precision was a T5600, which I got for free from my employer (gifted/fully written off) after it was decommissioned in 2017; plus an extra 825W PSU that had been trashed along with another T5600 chassis. Still have it as a backup unit at our second apartment. I upgraded it to dual E5-2680's (from dual E5-2667) back in 2018. The only investment was a 0.5TB SSD, single use Win10 Pro license, and repurposed RX590.
Hello, fellow bad-back haver!
Do whatever exercises your doctor/physical therapist gave you, on whatever schedule they gave you, or on some regular schedule if you didn't get a schedule.
Also, get a back brace.
Also also, if your car doesn't have lumbar support for its seat, consider getting a lumbar support pillow. I found one at an auto parts store, and it took me from not being able to comfortably drive to often feeling better *after* driving than before! I wanna buy like ten more.
I got my 5810 from the same place a couple years ago. Not a powerhouse but I was super happy with the purchase - it has been my "living room"/media PC ever since... and I play pretty modern games on it all the time. 1650 v4. I also do some music production and it handles a crapton of channels and effects quite well, especially after a small ram upgrade.
Crazy that the 14700k gets 36000 points in cinebench r23, and for a while you could get cpu ram and motherboard for $500 in a bundle deal at microcenter. Of course, the final PC build would end up being twice the cost of this, but pretty crazy how far hardware has advanced.
For LGA 1700, only W680 has ECC support.
I got the single-CPU ThinkCentre because it had a good 8-pin for a 3060 12GB. I'm never going back. Came with a Xeon 2135 and 64GB of RAM for $190. This thing just runs beautifully. Congrats on the build; I definitely used earlier videos you made to look further into this, so thank you.
I'm trying to do the same thing, but I'm having a hard time deciding which Xeon CPU to use. There are so many options.
@@kevboost I needed cores over speed. Nothing I do is terribly CPU-hungry, but I do a lot of different operations at once. The 2135 is a 6-core and all I could really afford at the time. I'm seriously looking at a higher-core-count processor or going to 128GB of RAM next.
I understand the Xeon Gold 6138 only has AVX-512, unlike the 6148, which has more instructions. Is the difference in instructions a problem, or is the 6138 still pretty good regardless? Anyway, incredible video 🔥
What's the lowest idle power this draws?
Did you find out? I'm dying to know.
@@tradingnichols2255 tbh I never rechecked, but I don't think he's done a video on it yet that I know of.
@@tradingnichols2255 Don't even try to find out, as it is high. Gas guzzler.
Just so you know, a 7950X3D (my personal workstation CPU) benchmarks around 38K in Cinebench when optimized, above a Threadripper for a lot less; it's an under-$500 CPU now. But that's a killer value build, I love it!
@12:48 You're not turning proxies off to scrub; that smooth movement around the timeline is after you turned proxies on.
I use one of these similar workstations at work. I was amazed just how much they can handle.
Arrived for the tech, but fell in love with your cats. My twin brother has a tuxedo-marked cat; who knows what his breed is, he was a Humane Society rescue. Toby-cat Martinez was supposed to be for my brother's 11-year-old son. Like his dad, the kid would have preferred a dog, but Toby-cat was what he got. My nephew did none of the upkeep for his new pet, and Toby-cat quickly figured out what side of his bread was buttered. He soon became my brother's alarm clock, and I watched my twin go from a dyed-in-the-wool dog dude to a Cat Dad. Pretty soon he was serving Toby-cat 3 cheeseburgers a day (kidding, that's our fav food). Through COVID, my brother's office became his home office, with Toby-cat either at his feet or making his presence known when he wanted back into the house. Both your cats are adorable, says the twin who has always loved cats.
As a reference, while this system can handle 768 GB of RAM across two CPUs, a Threadripper 7xxx system can take 2TB of RAM. And that's on a single socket board. If you went dual socket EPYC, the max RAM would be 4 TB.
I've seen a dual socket mobo bundled with two 7501s on aliexpress for $800. Not sure if it's a scam or we're entering an era of cheap old epyc workstations
And it would be 6000 MT/s... not even the same league.
I got a T5820 during covid when everything was unobtanium. Replaced the CPU with a 12core v3 CPU and that workstation did everything I needed until I built myself a new machine. Since then it's been my proxmox server and I'm just getting around to replacing it with an Epyc system. For the ~$400 I put into it it was definitely worth it.
Hi, what model CPU did you use?
@@dimidimi6243 E5-2678 V3. You can get them a lot cheaper now than I did then.
@@dimidimi6243 I know it's months later but in case anyone else is interested a Xeon E5-2678V3 was the replacement CPU. Cost me $80 back in 2020 you can get them now for $20.
Video memory issues can be fixed with Nvidia's Tesla series. For example, the Tesla M40 goes for around $100 and has 24 gigs of GDDR5, although you will need to tinker with cooling.
I LOVE WORKSTATIONS
Last year I bought a barebones Precision 7920 (a wider 7820 with a bit better cooling and more expansion slots) for $460, a pair of Xeon Gold 6146s for $230, and 128GB of RAM to use as a home server. I loaded it up with 8 3.5" HDDs and a few GPUs, and installed Proxmox. I ran Cinebench R23 and got 1084 SC and 27794 MC. I ended up selling it (for a profit at least) because I really just didn't need it lol. It was a beast though, and I wish I had found a reason to keep it. Cheers
13:02 The storage warning is perfect.
I am amused: the Xeons you bought had an MSRP of $3K each and you got them for $150 or so. Buying old tech is so nice...
I think old Xeons are dirt cheap, but replacing the board is more of an issue.
@@quintrapnell3605 The other thing is that I can get a stupid amount of memory: 64GB DIMMs for $85? And I can get 12?
In my country they still sell for MSRP.
Very nice. I also considered a very similar route with either the Dell or the Lenovo dual-CPU machines. But there is that issue of a lot of software being unable to make use of both CPUs.
I watched a LOT of Miyconst videos as well.
Turns out that there is a Xeon chip with 18 cores and 36 threads, but not all cores can Turbo Boost. Except Miyconst and his friends found a way to unlock Turbo on ALL cores.
It's the Xeon E5-2699 v3. This is only a 3.6GHz boost clock. But with 18 cores, even at that speed you need some pretty aggressive cooling.
HOWEVER, it turns out that there is a special version of this CPU. It's NOT on the Intel Ark database, which means that this was a custom configuration for a single OEM. I'm guessing for a Mac Pro, but I have no idea.
What I DO know is that this variation, the E5-2696 v3 is the same silicon as the 2699, except that it has a max boost clock of 3.9GHz. Plus it has some additional instruction sets not found on the commercial version.
And thanks to the BIOS injection revealed by Miyconst, et al, with water cooling that means 36 threads at 3.9GHz, which should be buttery smooth for editing.
I'm a little miffed that my 3080 FE only has 10GB of VRAM, but everything I've read about the cards packed with RAM chips from this generation onward tells me that they just run too hot.
They will work fine for a while, but sooner or later they are going to fail. Not like my GTX 980 Ti, which is as good as the day I bought it. In fact, it's going into the same rig, and I can keep the VRAM on the 3080 at 80°C max under stress tests.
And aside from using the 980's analog output just for dedicated gaming on a CRT (that never gets old), I can still do things like dedicate OBS capture to it, separate from the 3080, and use its 6GB of VRAM for additional compute power for rendering or whatever.
The machine is 3/4 built right now, just modifying the case for an external drive bay and add some additional fans. I'm making the case have positive air pressure to keep out dust.
All said and done, it will have an MSI X99A MB with the OEM version of the 2699 v3, 64GB of quad-channel ECC RAM, a 1500W Seasonic PSU, a Corsair AIO, the RTX 3080 and GTX 980 Ti, 2.5GHz Wi-Fi, a 500GB NVMe Samsung 980 system drive, plus a 2TB Samsung 970 EVO working drive and a 16TB enterprise HDD for cold storage, in a modified mid-tower whose name just left my mind. The popular one with the metal bar that hides the cables?
And I've started a separate NAS device also with a couple of Enterprise level HDDs that will only serve as backup. And once any backups are complete, it will be disconnected from the network and the internet.
But it's fun as hell building these kinds of rigs, and seeing just how much performance you can squeeze out while doing it on a budget.
I love what you've done, and I'd REALLY like to talk you out of that Titan X. That is the VERY BEST card ever made with an analog output. Almost identical to my 980 Ti, with double the VRAM.
I have a nice water cooler if you ever want to let it go....
I myself have a 40 core 80 thread build with dual Xeon E5-2673 V4s and i love it
Same here, dual 2673 v4s with a Radeon RX 6800 XT and 128GB of RAM. Love my Z640. Thinking of going with an HP Z4 G6 build for gaming... once again on the cheap.
@gleighteen7525 Mine is custom built with 64 gigs of DDR4-2133 ECC and a 22GB RTX 2080 Ti (modded for double VRAM).
Looks like you've got a Dell server the "big iron" model.
I rarely ever watch a sponsor segment the full way through - amazing camera work, was truly mesmerizing.
Absolute Monster. I just got the Lenovo P520 and now I’m rethinking.
Nice! I just recently got a T7910 so a little bit older, but 28 cores, 56 threads + 64GB of DDR4, and a gtx 1080. oh and a 500GB nvme drive with adapter. Its quite a beast I might add. I may eventually drop my RTX 2080 Super in there.
I was racking my brain trying to figure out how you got 40 cores under $800…when I have a 36 core T7920 under my desk. Lmao
I’m stoked Skylake-EP is getting super affordable. Big core counts, lots of PCIe lanes, and common bifurcation support makes it a nice upgrade over Haswell/Broadwell.
Mine’s running a ProxMox lab after I pulled it from NAS duty in favor of a 4c Kaby Lake Xeon.
Older tech is cool. I got a Dell PowerEdge R620 with 2 CPUs, 144GB RAM, and 4x 1.2TB SAS drives, altogether around $410. I also got a Dell PowerEdge T410 for $50 and put 6x 6TB SAS drives in it.
I also found a Lenovo B490 laptop at my brother's office with no power supply. I happened to have a spare Lenovo X220 power supply and it worked. $29 for an SSD, 16GB of RAM from old laptops, and it's all good.
These sorts of videos are like butter on my OCD brain. The level of soothing they provide.
That's an awesome build! The Dell T7820 is surprisingly good, now I want to build one myself!
dell pro gear is a dream to work on, and cheap as hell
@@dercooney I would agree if they used standard ATX PSUs and motherboards.
I like how they went 3d cpu mounting.
Wild design!
The Lenovo was, fittingly, old-school: a single board with dual CPUs. 😂😂😂
I bought a used Dell T7810 and upgraded it to 44 cores with 512GB of RAM. I've never been happier for imaging analysis. Everything else crawls.
Hey, if you use ZFS + zstd compression you can achieve (# of drives) x (speed of a single drive) x (compression ratio); I get between 1.7x and 3.64x compression.
Resulting in crazy fast storage, which also happens to have a RAM disk cache (i.e. the ZFS ARC).
I'm getting nearly 14GB/s read speeds off a laptop with two Gen3 M.2 SSDs...
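The formula above in code form; a minimal sketch where the per-drive speed and compression ratio are assumptions you'd substitute with your own pool's numbers:

```python
def effective_read_gbs(drives: int, drive_gbs: float, ratio: float) -> float:
    """Logical ZFS read throughput: striped raw bandwidth times the
    compression ratio, since each physical byte expands on decompress."""
    return drives * drive_gbs * ratio

# e.g. two Gen3 M.2 drives at ~3.5 GB/s each with ~2x zstd compression
print(effective_read_gbs(2, 3.5, 2.0))  # 14.0
```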
I'm *REALLY* interested in this setup's power consumption and how loud it gets under different loads. I would give it a try without GPUs as a home-lab server with virtualization. Would you 🙏please consider making such a video? Thanks!
Excellent video, enterprise grade workstations are awesome and cheap on the used market.
Funny that you bring Miyconst up. I had a little talk with him about a nas server solution not long ago. I think he is very knowledgeable in this field
I use an 8U array: two IBM x3850 X5s configured as a 2-node system using QPI interlink cables, with 8x Intel Xeon E7-8870 v1 (octa-socket, deca-core). Took me ages to source the QPI cables. An Nvidia GTX 770 2GB is an entry-level GPU for 4K desktop real estate, for which I use a 43" Hisense smart TV.
I like the Quadro GP100s. Why? HBM memory (lower temps), and if you add another GP100 and NVLink bridges, the two cards become two Tesla P100s with onboard cooling. :)
This was way more fascinating than I expected it being. Your enthusiasm for it really was gripping and now I want to build one of these!
Old workstations are amazing. I've used an old HP Z200 with a Xeon X3440 and 16GB to host a Minecraft server and home NAS. Upgraded to an HP Z400 with a Xeon W3565 which now has 40GB; still need another two 8GB sticks for the full 48GB you can fit.
Very cool video. What's the power usage?
I typically stay away from that gen of Xeon for budget workstations and instead go first-gen EPYC, as they are super easy to OC and the Enermax TR4 (gen 2) has no problem keeping them cool. My 32-core is at 3.8GHz, and you get more expandability and can go 64-core if need be. For storage I use 2 ASUS Hyper M.2 adapters with 8 NVMe drives (all 8 in RAID 0 for immense bandwidth/speed, since I don't store anything permanent there, just work and games). I also have 4 8TB HDDs, but I speed them up immensely with PrimoCache, using RAM plus an extra NVMe as cache (apart from the 8 NVMe drives on the ASUS adapters, the motherboard has 2 NVMe slots, plus I have extra PCIe slots free). This is my baseline config for all my budget EPYC builds.
So glad you mentioned Miyconst; so far he and Tech Yes City are the only ones that routinely revisit Xeon cpus for pro-sumer applications. I wouldn't have done any of my computer builds without them.
I'm currently waiting till 2nd-gen Scalable becomes affordable so I can use Optane PMem.
It seems like LGA 3647 is the new LGA 2011 of used server hardware for homelabs.
It's getting there
Xeon Gold 6148s are going for as low as $80 — same core count, but higher clocks and higher single-core performance too. 6138s are going for as low as $35, which isn't bad at all.
This is amazing. I never thought about looking at second-hand corporate servers! Brilliant!
Good video. I just wonder how close an upgrade from the 3700x to a 5950x on your original rig would have gotten you to this Xeon tower's performance.
I want to eventually test that, but I needed a second PC anyways so I wanted to try something new. I'm guessing the performance isn't too far off honestly
@@aChairLeg I mean sure, performance may be similar, but I'm sure the expandability of that workstation is insane. PCIe bandwidth for days.
Damn, I get absolute shit performance on my 5900x in Eco Mode; I'm tempted to run R15 with PBO and see if the space-heater amount of heat it generates is worth it.
A 5950x would have better performance for the majority of things; the Ryzen has higher single-core. Most games and workstation tasks don't need 40C/80T. Also, not sure if you know or care, but your idle consumption is probably in the 130W range now, with that system easily pulling 250W under load.
Once again, awesome PCIe expansion if you need it, but completely overkill for what you use it for.
I have the 5950x, a dual Xeon scalable gold 6138, Xeon e-2146g and the i5-14500. The 5950x with dual Nvme drives, AMD 6700xt with 3 monitors pulls about 90w idle, 150w under load. Golds are at about 70w idle (just boot drive, 64gb ram,10gbe nic), and goes to about 250w under load. 2146g pulls 48w idle, 80w load with dual ssd mirror boot, dual mirror nvme, dual mirror 18tb enterprise drives, 128gb ram, with 10gbe nic for the truenas server. 14500 has single nvme, 64gb ddr5 and 2.5gbe nic. It is used for Plex server/transcoding and home assistant under virtualized proxmox. Idle is 20w, load about 40w.
Either way, I understand the cool factor of enterprise gear. Realistically, though, it's not efficient, not suited to most tasks, and not really great for a home server either.
@@bhume7535 Power draw is also insane, good luck running it for like half a day every day.
XEON GANG LET'S GOOOOOOOOOOOO!
I bought a used Dell T5810 workstation, added a second 200-something GB (240 I think?) SATA SSD, slapped in an RX 6600, and initially ran it with 12 gigs of DDR4, but recently raised that to 20GB. I also replaced the Xeon E5-1650 v3 it came with with an E5-1650 v4, and I'm gaming pretty well. My next planned upgrades are a Xeon 2697 v3, more SSD (maybe a 2TB M.2 on a PCIe-to-M.2 adapter), and maybe in a couple of years a 6650 XT or 6700 XT.
I have the following as my main music-making rig, a Dell Precision T3600: Intel Xeon E5-2690 @ 2.9GHz (8C/16T) // 32GB ECC RAM // Gigabyte Nvidia RTX 2060 6GB // 2TB Samsung 860 // 2TB WD Black // it does so good.
Was there a noticeable difference upgrading to the 1650 v4 over the 1650 v3?
I built an X99-platform gaming PC/media server last year and decided to stick with v3-gen CPUs, since they have overclocking capabilities for gaming performance. I was super lucky and won a bid for a 1680 v3 for around $60 on eBay not too long ago, so I swapped out the 1660 v3 I had bought a few weeks prior; it definitely made a small increase in performance, but idk if it was justifiable.
This YouTube channel called "Miyconst" is like the official X99/Xeon tech guru of YouTube, and his testing shows the best CPU for this platform is ultimately the Xeon 2697 v3, but locked at max frequency boost, which usually requires a true X99-chipset motherboard.
I thought about going with a Xeon 2689 v4 if I can find one for a reasonable price, but I really do appreciate the ability to tinker with overclocking the v3 CPUs.
Just curious what your opinion is on going with the v4 over the v3 CPUs?
Bought an HP 620 workstation off eBay, upgraded the dual Xeon CPUs and the video card, and "boom", a powerful workstation that games. Only downside is it produces a lot of heat. People underestimate old workstations. Best value per dollar.
I got that stove they call a 5800X for under $200, which was an upgrade from a 3700X. I need to benchmark it now that the TIM has settled in.
this is one of my favorite vids this year
I would've loved to see how much power it draws at its maximum potential, but regardless, amazing video, nice cinematic shots, and great story-writing skillz. Hope you have a safe recovery!
I'll check after work and let you know
@@aChairLeg killer long day huh? 😂
I like these chairs but I really think we need more kneeling and push up chairs. You get some benefits of sitting and standing or you can just stand
I got myself a home server, an HP Z640 with dual Xeon E5-2680 v4s, so I have 28C/56T of Broadwell (not Skylake) and 128GB of RAM. I'm also running TrueNAS, which makes it so all programs are NUMA-aware and will run across all CPU cores. All in, I spent less than $300, and that thing is a beast for anything I throw at it; my average CPU utilization is about 2 percent with a boatload of services running.
Wow! That's a hell of a setup, especially for $300
So you bought something that you use 2% of. Not the best purchase since you are not using 98% of what you bought.
@@RaduRadonys Mate, it was less than $300. I also use it as a Blender render node, so it'll max out occasionally. The RAM is for when I get around to it: I'll get a 2x 10-gig NIC for my main PC and the server and just use it as an external drive with speeds up to 2.5GB/s, which is on par with some lower-end NVMe drives.
@@RaduRadonys It's also just for fun it's useful if I want to play around with llm's or image generation.
@@RaduRadonys Also, do you use 100% of your car? I doubt all the seats are constantly full or the cargo space is always full. In day-to-day life you could probably get away with a Smart car, but no one buys one, because it's nice to have overhead. It applies even more if you own a pickup and use it as a daily driver.
What is that MSI thing you plugged in the PCIE slot?
Will it run Stable Diffusion? SDXL? SVD? Very curious if I can go this route with an old 1080 Ti and get SDXL running. I got hit like that outside Kuala Lumpur... by a giant dump truck that was trying to pass on the left while we were in the right lane waiting to make a right. Right-side driving... I was in the passenger seat... the seatback broke... sent us into oncoming traffic... which was light, so no further impact. I was flying home that night... had to sit on a metal milk crate in a cinder-block, tin-roof "police station" to do my witness report. I made the flight and stood in the aisle for most of it because my back was killing me. I still wonder today what damage was done that is nagging me now, some 30-ish years later... so very good luck to you, sir, on your recovery.
Yep, I'm running A1111 on a similar machine, but with 80GB RAM & an RTX 3060 12GB. Keep in mind tweaking may be necessary due to weird AVX support; strangely, some Xeons lack consumer instruction-set features. I think the Xeons shown here are probably OK.
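If you want to check a machine's AVX situation before installing a Stable Diffusion stack, a quick Linux-only sketch (the helper name is mine, not from A1111 or any SD tool):

```python
# Read CPU feature flags from /proc/cpuinfo (Linux only).
# Prebuilt PyTorch wheels commonly assume at least AVX2, so older
# Xeons without it may need different builds or tweaks.
def cpu_flags() -> set[str]:
    """Return the kernel-reported CPU feature flags, or an empty set."""
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass  # not Linux, or /proc unavailable
    return set()

flags = cpu_flags()
print("avx: ", "avx" in flags)
print("avx2:", "avx2" in flags)
```

On Windows, CPU-Z or `coreinfo` shows the same information.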
Gah, I have a thing for these massive CPUs, and you've just informed me that they're relatively accessible... Stop giving me ideas, I'm already spending too much money!
They're so cool looking though!!
This is the same thing I did. I was looking for a PC to run my firewall, went on eBay, and found so many different prebuilt Lenovos and Dells that were much cheaper than building it myself; eBay is flooded with these workstations.
Only downside is that power supply, like you mentioned.
I would use a server as the base, something like a Dell R720 or R730 or an equivalent tower model. (I may go with an EPYC 7D12 and an H11 board.)
For the GPU, I would choose a P40 or P100. They have tons of memory on Pascal cores, and they're much cheaper due to the lack of video output. If it's not installed in a server, you'll need to 3D print a fan shroud for it.
For NVMe, I like the PM983 and PM9A3.
Hi, amazing info.
I'm doubtful that both Xeon Gold 6148s will be utilised/recognised by Win 11/10. Some say there's a CPU-grouping issue in the OS; I'm not aware of the exact problem, but some people switched to Linux. Did you have a similar issue with your build?
Very cool, love seeing these old workstation rebuilds. Curious what wattage it's pulling on average?
500 GW/h.
Someone gave me some E5-2650v4s, a TON of 16GB/32GB ECC DDR4 and a pile of SAS3 disks. I've been on a journey like yours, ever since.
Been using dual 6148s for a sixth year already. Not a single problem, except replacing the dead 1080 Ti with a 3090 in 2022.
What a steal! I love it when used hardware performs this well.
Love that you tried BeamNG on it, really like seeing how the sim performs on high core count systems like yours
waiting for the water cooling video! have fun with that 2-tier cpu system
The problem is that barebones workstations are still expensive in some countries, and the motherboards Dell makes are custom designs. Is there a motherboard out there that's cheap and widely available?
5:38 It's 6-channel RAM (not quad, not dual), so you should run a RAM speed test and tell us the results. (I assume you have 12 sticks of RAM so all slots are populated?)
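For a back-of-the-envelope expectation before running that test: theoretical peak bandwidth is channels x transfer rate x 8 bytes per 64-bit transfer. A sketch assuming DDR4-2666, the speed these Xeons support:

```python
# Theoretical peak DDR4 bandwidth: channels x MT/s x 8 bytes per transfer.
# Real-world numbers land well below this, but scaling with channel
# count is the point of the comment above.
def peak_bw_gbs(channels: int, mts: int, bytes_per_transfer: int = 8) -> float:
    """Peak memory bandwidth in GB/s (decimal)."""
    return channels * mts * bytes_per_transfer / 1000

print(peak_bw_gbs(2, 2666))  # dual channel      -> 42.656 GB/s
print(peak_bw_gbs(6, 2666))  # six channel/socket -> 127.968 GB/s
```

A dual-socket board doubles that again, as long as the workload is NUMA-friendly.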
This is way better than the usual high-core-count Xeon builds with poor IPC.
I did something similar to your first idea with a watercooled build back in 2016, and I regretted every single decision: dual E5-2690 v2s that chug power like it's nothing. The thing is, I used it as my NAS + web host + VM test bed, and value-wise it was the worst build I've made, besides those abandoned watercooled PCs from even earlier. At least it runs fine.
I got mine, a single-socket 14-core, for roughly $400 plus duties from the UK. I plan to upgrade to 28 cores as I go.
Glad you're OK man! That car crash was intense.
Put in an Intel Arc card to decode H.264 or H.265 via Quick Sync, and you can use this system or your previous one without problems. Worked wonders for me.
I got that SSD; make sure to do the firmware update on it from Solidigm. Mine was 6.4TB for $350, PCIe 4, rated for 5 DWPD over 5 years. Dell still lists the exact same drive as a BTO option on their server builds for $18k. Mine had zero hours.
I have been so frustrated by computers in recent years — mostly expensive for slow (single-threaded) server-grade hardware. Last summer I deployed an EPYC 9124 (Zen 4, 3.7GHz) with a Supermicro motherboard, so I get 12 DDR5 channels, 112 PCIe lanes, and decent single-threaded performance. Its single-threaded performance was 2x faster than the fastest frequency-optimized Zen 3 EPYC.
I used to be a big proponent of re-using workstations, but with the impending Win 11 hardware apocalypse (Win 10 EoL is October 2025), it will be harder to justify older hardware. It's a shame, as I've given away an old Z620 and a Z820 in recent months that were still going strong. It means you'll likely have to swap to Linux if you can, if you're using DaVinci, which leaves the question of other software support under Linux. You could run the rig air-gapped, but it's only a matter of time before applications cease to support Win 10. Sadly, even the Z840 I just decommissioned doesn't run Win 11. It may be less of an issue next year when picking up workstations, but then you're looking at systems from 2021/2022 onwards and the mismatch of which Xeons are actually supported. I know there are workarounds for Win 11, but MS is closing them down.
My guess: with the impending economic crisis and collapse of new hardware purchases (see AMD's latest sales), there will be HUGE pressure on MSFT to relax the TPM 2.0 requirement.
A mate did something similar. He was looking at new Intel desktop chips, and I just put it to him: have you looked at old servers? 40 cores, 80 threads, 256GB of ECC RAM, redundant power supplies, and dual M.2 NVMe drives in a single rackmount unit cost him less than just the i9 CPU and motherboard he was looking at would have.
Bought an HP Z620 years ago for $200. Upgraded the Xeons to E5-2667 v2s. It came with 64GB ECC, two 2TB hard drives, and a 6GB video card. For maximum productivity at minimum cost, old workstations are the way to go.
I had the Z820. Amazing machine. I tell people they should buy them, but people don't listen to me and get meme machines or a Mac instead.
@@dave7244 If you aren't computer literate, you buy a machine from Best Buy or an iMac. John Wayne said: "Life is tough; it's tougher if you're stupid."
Recover well ❤
I'm on a Ryzen 5 3600 system and just ordered a Ryzen 9800X3D; now I need a mobo/RAM/PSU, since my 750W ain't gonna be even close to enough.
I got a 6148 on an ASUS C621 Sage mobo with an A4000 on it that I use as my home server. The 6148 has the same single-core turbo as what you have, and it's not really snappy in Windows. I'd go with a 5950X and 64 gigs of RAM if I were you.