I'd have gone with a 4xxx-series RTX as they support the AV1 codec, and learned to use DaVinci Resolve (I got a key with a small keyboard). Good setup for YT videos if you upgrade it later.
@@aChairLeg Checked out a few of these systems on ebay and can't find anything close to the price you paid. Nice work as it's quite a workstation. I was thinking of replacing my x299 socketed i9-9900X with one of these, however that PSU is a real downer. I recently purchased an ASUS ProArt 4080 Super and it's powered by 3x 8 pin PCIe (with the included adaptor). Strange that it has so many full length PCIe slots with nowhere near as many PCIe power cables as you'd need.
For your passive cooling test, look into getting a Tesla M40 24GB, PG600 (the PG number indicates the internal generation, which means better write speeds; the PG600 is the last and fastest of the line). They're only about $80, but they're server grade with no peripherals, so you need a Quadro 600 to pair with: you designate the program to the Tesla and route the output to the Quadro. It's basically the same architecture and frame as the Titan X, so use those drivers, and any third-party stuff like water coolers meant for the Titan X will be compatible. Hope this helps; it took me a fair amount of research to piece this together to build a VDI gaming server with a few of them linked via NVLink for shits n giggles. It's probably the cheapest $/GB GPU solution currently, and it's wild to build out ~100GB of GPU for the price of a 3060.* But you definitely will need more cooling for them; they're power-hungry fuckers at 300W each (around ~220W nominal in my workloads), and I highly recommend the 3D-printed fan adapters for them on eBay. You'll also want special splitters that turn TWO 8-pin PSU leads into one 8-pin for each GPU, as each 8-pin only gives around 150W, and you can't use 6-pin converters because 6-pin is only around 75W. Power is probably the main reason they're so cheap and people haven't caught on; that, and not many people have motherboards that support multi-GPU/NVLink operation anymore. But realistically it's easy to build out on something like you built here: give it 2 PSUs and set it up to only run what you actually need, because frankly even one is a beast, let alone two paired with NVLink, or FOUR. So run two per cheaper PSU, with one PSU as the master for the motherboard and everything else, and the other only for the GPUs themselves.
In a configuration like yours, I'd have one PSU configured to power the first GPU on each motherboard, then the second PSU powers the second GPU on each. It shouldn't hurt them not to have the PSU controlling power via motherboard communications; they're power-hungry server-grade fuckers with their own built-in power management, because they're designed to be plugged into hosts that might not necessarily support controlling power for them. This also keeps them from all trying to pull the potential 75W from the PCIe slot at once, reduces heat on the motherboard, and with a modular PSU you can route the cables outside the case to redirect that extra heat elsewhere. Food for thought for anyone going down this dark road. I suggest getting a v4 though; they've really been coming down lately. Unless you want to attempt the EPYC dark road: frankly, there are arguments for getting a dual EPYC SP3 board and building out only one CPU at first, with the expectation of using the other socket for a second CPU when you need to expand, to make the machine more future-proof. Bear in mind the EPYC CPUs with a P in the name don't support dual-socket use, for instance the EPYC 7502P vs the 7502. Anyways, that's just what I've been playing with lately for an RSPS.
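The connector math in the two comments above can be sanity-checked with a quick sketch. The 75W slot, 75W 6-pin, and 150W 8-pin limits are the nominal PCIe figures; the 300W card TDP is the commenter's number, and the helper name is illustrative:

```python
# Rough PCIe power-budget check for a Tesla M40-style card.
# Nominal limits: 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def budget(n_eight_pin: int, n_six_pin: int = 0, use_slot: bool = True) -> int:
    """Total watts a card can draw within spec for a given cabling setup."""
    return (SLOT_W if use_slot else 0) + n_eight_pin * EIGHT_PIN_W + n_six_pin * SIX_PIN_W

# One 8-pin (150 W) plus the slot (75 W) tops out at 225 W, short of a
# 300 W TDP card -- which is why the comment suggests splitters feeding
# each GPU from TWO 8-pin PSU leads.
assert budget(1) == 225   # not enough headroom for 300 W
assert budget(2) == 375   # two 8-pins + slot covers the 300 W TDP
```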
The thing about old workstations is that they were once worth thousands and thousands for a reason! A 10-year-old workstation can go further than most people realize.
The other side is that these beasts are designed to be able to work 24/7 for a few years. And because of that, often they have been run 24/7 for years. I had a Z800 (well I still have it, but it's dead), and eventually all the RAM went bad and had to be replaced. Then suddenly it just died on me.
Until they break down, past warranty and no vendor support. We had to tell a client we can't support his system any longer. Motherboard issues along with part failure.
Dell T5500, 144GB RAM, 2x X5670, 2060 12GB GPU. Got the box for $100 AUD, RAM for $25 per stick (9 sticks), plus NVMe and a USB 3.1 card; total cost in 2021 to build the system was about $800. Win 10 Pro OEM (Dell). Fantastic video editing machine.
@@aChairLeg it is. I got a Chinese motherboard and built a dual Xeon E5-2698 v3 system: 32 cores total at 3.6GHz, 256GB of DDR4, a 1TB M.2, a 3.2TB 12Gb U.2 drive, and a 1080 Ti to top it all off, running Linux. The CPUs have taken every single large task I throw at them, no problem; I can render in Blender and run local AI at the same time while web browsing with no slowdown whatsoever. I only paid $500 for my system too, since I reused the graphics card and power supply. It's simply a dream to work with, and I'm saving up to upgrade to a 3090 Ti since I need that VRAM and ray tracing for 3D animation and AI. RTX accelerates AI and 3D renders, and my 1080 Ti isn't cutting it anymore since it only has 11GB of VRAM and no RT. Old Xeons absolutely rip even newer AMD CPUs that lack the cores; a single Xeon like the 2699 v3 with 18 cores can compete with a Ryzen 7 3700 despite being old X99.
You can also feel above most enterprise hardware by buying DIY prosumer/workstation-class hardware instead, like the ASUS WS C621E SAGE (C621), ASUS Z10PE-D8 WS (X99) or ASRock X79 Extreme11 (X79), plus the various Chinese-brand X79 and X99 boards (sold new). They're affordable now on the used market (and the Chinese ones are brand new) and good value. I guarantee many of these boards are better quality than, say, many Supermicro and Tyan boards, or a good chunk of HP, Lenovo and Dell boards (and many also have IPMI, etc.). Only the very high-end and truly proprietary boards are really different enough. =)
Literally cannot go back once you've gone workstation-grade hardware. That stability is amazing. I got a Xeon w5-2465X with 128GB DDR5 ECC RAM, plus an RTX A4500. For storage I went with an Optane 800GB U.2 drive and a Micron 7450 6.4TB drive. Pricey, yes, but stable and reliable.
I too got a T7820 ($200) and after upgrades it's running dual 6126 for 24c/48t and 12x32 (384GB) RAM. It's a very capable platform for workstation and home server duties!
Holy Moses, you got a good deal!!!! A T7820 here is about $1000 with some pedestrian CPUs and 64GB RAM. Just the 12x32GB would cost $600 if not more (€50 is about the lowest per 32GB RDIMM). I got the cheapest T5810 I could find for €180; it came with an E5-1620 and 32GB in 8GB sticks. Now it has 96GB (€120) and a 2683 v3 (€30). If I could find a T7820 for €600 I'd be pretty happy.
@@jainayrogeorge2924 Like most PCs, with the appropriate software, yes. Some hardware you need to be a bit selective about, but basically all PCs "support" virtualization. You'll want enough RAM, CPU cores and supporting hardware for your VMs though; my box right now has 14 CPU cores, 96GB RAM, and 5 gigabit NICs, which is enough to run a few decent virtualized servers.
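If anyone wants to check whether their own box qualifies: hardware virtualization shows up as a CPU flag ("vmx" for Intel VT-x, "svm" for AMD-V), visible in /proc/cpuinfo on Linux. A minimal sketch of the check, with made-up example flag strings:

```python
# Check whether a CPU's flag string advertises hardware virtualization.
# Intel VT-x appears as "vmx", AMD-V as "svm" (see /proc/cpuinfo on Linux).
def supports_virtualization(cpu_flags: str) -> bool:
    flags = set(cpu_flags.split())
    return "vmx" in flags or "svm" in flags

# Abbreviated, hypothetical flag strings for illustration:
xeon_flags = "fpu vme msr sse4_2 vmx aes avx"
old_atom_flags = "fpu vme msr sse3"
assert supports_virtualization(xeon_flags) is True
assert supports_virtualization(old_atom_flags) is False
```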
Dude, I feel you. The pain I receive daily trying to shoot videos is incomprehensible to my old self. I was a passenger in a car accident in 2018. Almost died, but was blessed to walk away with broken ribs, herniated discs, and vertebrae knocked out of alignment through my back and neck. I had to medically retire out of the military and adjust to a new life. I'm glad you were able to walk away and have a solid support base. Just found you, but love the video and your perseverance. Keep going.
@@RushinVr6 yeah, that's pretty amazing. Kratom is insanely useful though. I know a couple of people who went through hellish accidents and have used it for over a decade now. It's finally getting studied in depth in the States for the compounds found in the leaves, but it will never be used to treat pain because it would ruin big pharma's opioid dealing.
You probably should spread RAM chips evenly between the CPUs. Otherwise, the second CPU is starved, as it has to stall whenever it needs data. The second CPU board has memory slots for a reason.
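To make the point above concrete: on a dual-socket board each CPU has its own memory controller, so DIMMs should be split evenly or the second CPU has no local memory and every access crosses the inter-socket link. A tiny illustrative sketch (the helper and its name are mine, not from any tool):

```python
# Split a set of DIMMs evenly across sockets, the balanced population
# the comment above recommends for dual-CPU workstations.
def dimms_per_socket(total_dimms: int, sockets: int = 2) -> list[int]:
    base, extra = divmod(total_dimms, sockets)
    return [base + (1 if i < extra else 0) for i in range(sockets)]

assert dimms_per_socket(12) == [6, 6]  # balanced: both CPUs have local RAM
assert dimms_per_socket(8) == [4, 4]
# Putting all 8 DIMMs on socket 0 would leave socket 1 with zero local
# DIMMs -- the "starved" case the comment warns about.
```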
I saw this and ended up getting this same barebones 7820 and OH MY GOD. If they didn't use all proprietary parts I would never build DIY PCs again, and personally it's gonna take a lot to get me off the used-workstation train from now on. Awesome video.
I rarely sub off of one great video, but with you indicating this is the norm for this channel, I have subbed. Also, very rare: I added this video as a FAVORITE. A FAVORITE! A truly rare event! Why? Because I might build one of these.
My first Dell Precision was a T5600, which I got for free from my employer (gifted/fully written off) after it was decommissioned in 2017; plus an extra 825W PSU that had been trashed along with another T5600 chassis. Still have it as a backup unit at our second apartment. I upgraded it to dual E5-2680's (from dual E5-2667) back in 2018. The only investment was a 0.5TB SSD, single use Win10 Pro license, and repurposed RX590.
I just got a T7820 w/ a 4110 and 32GB of RAM that I won a bid on for $178. After upgrading the CPU to a 6140 and 48GB of RAM, and selling the WX 2100, I'm still under $200. Love these older workstations.
I got my 5810 from the same place a couple years ago. Not a powerhouse but I was super happy with the purchase - it has been my "living room"/media PC ever since... and I play pretty modern games on it all the time. 1650 v4. I also do some music production and it handles a crapton of channels and effects quite well, especially after a small ram upgrade.
I got the single-CPU ThinkCentre because it had a proper 8-pin for a 3060 12GB. I'm never going back. Came with a Xeon 2135 and 64GB of RAM for $190. This thing just runs beautifully. Congrats on the build; I definitely used your earlier videos to look further into this, so thank you.
@@kevboost I needed cores over speed. Nothing I do is terribly CPU-hungry, but I do a lot of different operations at once. The 2135 is a 6-core and all I could really afford at the time. I'm looking hard at a higher-core-count processor or going to 128GB of RAM next.
Arrived for the tech, but fell in love with your cats. My twin brother has a tuxedo-marked cat; who knows what his breed is, he was a Humane Society rescue cat. Toby-cat Martinez was supposed to be for my brother's 11-year-old son. Like his dad, he would have preferred a dog, but Toby-cat was what he got. My nephew did none of the upkeep work for his new pet, and Toby-cat quickly figured out what side of his bread was buttered. He soon became my brother's alarm clock, and I watched my twin go from a dyed-in-the-wool dog dude to a Cat Dad. Pretty soon he was serving Toby-cat 3 cheeseburgers a day (kidding, that is our fav food). Through COVID, my brother's office became his home office, with Toby-cat either at his feet or making his presence known when he wanted back into the house. Both your cats are adorable, says the twin who has always loved cats.
Crazy that the 14700K gets 36,000 points in Cinebench R23, and for a while you could get CPU, RAM, and motherboard for $500 in a bundle deal at Micro Center. Of course, the final PC build would end up being twice the cost of this, but it's pretty crazy how far hardware has advanced.
Hello, fellow bad-back haver! Do whatever exercises your doctor/physical therapist gave you, on whatever schedule they gave you, or on some regular schedule if you didn't get one. Also, get a back brace. Also also, if your car doesn't have lumbar support in its seat, consider getting a lumbar support pillow. I found one at an auto parts store and I wanna buy like ten more, because that one took me from not being able to comfortably drive to often feeling better *after* driving than before!
You should replace the GPU with an Arc A750. Hear me out: QuickSync beats the crap out of CUDA on encoding, and you can encode in VP9 directly, so you save time on the YouTube conversions. Try it in a video with this nice encoding CPU setup and you'll see what I mean. I replaced my RTX A4000 with an Arc A750 and I now get 40+ simultaneous 4K HEVC (H.265) to 1080p H.264 transcodes in Plex with zero buffering. However, I also have 4x 4TB PCIe 3.0 NVMe SSDs (bifurcation, which your motherboard supports) in RAID 5, so 12TB total as my storage, and my transcode drive is a 128GB RAM drive that dumps to disk before a reboot. The RAM drive dump happens lightning fast too. I might increase the RAM drive to 192GB because I have 256GB of DDR3 ECC total, and I can never seem to go over 50GB of RAM usage even with 200 Chrome tabs open and other apps.
What is your base workstation setup based on? What RAM disk? What RAID card? How much did your setup cost? Could you please elaborate? Sounds like you use it as a home media / lab server setup, exactly what I'm thinking of! Any feedback on how it handles virtualization, and *especially* what its power consumption is like and how loud it gets under different loads? Thanks in advance!
@ The RAM disk is probably just system RAM used as a storage device with tmpfs or ramfs. Likely not a RAID card; probably a PCIe x16 to quad-M.2 splitter with software RAID 5.
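For anyone curious, a tmpfs RAM disk really is just a mount on Linux. This hedged sketch only builds the command string; the 128G size and /mnt/ramdisk mountpoint are example values, not from the comments:

```python
# Build the mount command for a tmpfs-backed RAM disk of a given size.
# Running it requires root; this helper only constructs the string.
def tmpfs_mount_cmd(size_gb: int, mountpoint: str = "/mnt/ramdisk") -> str:
    return f"mount -t tmpfs -o size={size_gb}G tmpfs {mountpoint}"

assert tmpfs_mount_cmd(128) == "mount -t tmpfs -o size=128G tmpfs /mnt/ramdisk"
```

Anything written there lives in page cache and vanishes on reboot, which is why the commenter above dumps the RAM drive to disk before rebooting.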
@@davidmcken I’ve never tried, but my personal opinion is that the A580 or A750 are just priced better for the performance you get. If you wanted to squeeze every possible transcode out of the setup? Probably, but it’s not worth the $100 premium over the A750 and certainly not the $140-ish premium over the A580. Maybe a 10% increase in transcodes? Maybe 20% increase? Definitely not worth double the price over the A580 or the 50% price increase over the A750. A580 appears to be the sweet spot or the A750 if you have the extra $30 budget is a no brainer. If you have a desktop Intel CPU, remember to disable the integrated GPU completely prior to installing Plex. And for me, Windows appeared to give me better transcode performance than Linux, just another hint that could hopefully save you some time. Intel still has work to do on their Linux Arc drivers.
@@bleeb1347 OK, np. I'm seeing A770s for about $300 atm, and I'm looking at one since Level1Techs is hinting it can be cross-flashed to a Flex, opening up the possibility of SR-IOV as well, and 16GB of VRAM seems reasonable for the price. If ffmpeg transcoding performance comes free for the ride, it's a good card for a multipurpose home-lab server; at the current price I might as well "splurge" for the highest model so I have room for things like OpenAI's Whisper or anything else I might want to throw at it on the AI front.
Last year I bought a barebones Precision 7920 (a wider 7820 with slightly better cooling and more expansion slots) for $460, a pair of Xeon Gold 6146s for $230, and 128GB of RAM to use as a home server. I loaded it up with 8 3.5" HDDs and a few GPUs, and installed Proxmox. I ran Cinebench R23 and got 1084 SC and 27794 MC. I ended up selling it (for a profit at least) because I really just didn't need it, lol. It was a beast though, and I wish I had found a reason to keep it. Cheers
I got a T5820 during covid when everything was unobtanium. Replaced the CPU with a 12core v3 CPU and that workstation did everything I needed until I built myself a new machine. Since then it's been my proxmox server and I'm just getting around to replacing it with an Epyc system. For the ~$400 I put into it it was definitely worth it.
@@dimidimi6243 I know it's months later but in case anyone else is interested a Xeon E5-2678V3 was the replacement CPU. Cost me $80 back in 2020 you can get them now for $20.
I wish there were an easier way to cool these. I had to design custom ducting, fan mounting, and a service for Linux to dynamically adjust the fan speed.
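For anyone attempting the same, the core of such a fan-speed service is just a temperature-to-PWM curve. A minimal sketch; the thresholds and the 0-255 duty range (the usual scale for Linux hwmon pwm files) are assumptions, not values from the comment:

```python
# Linear fan curve: map a GPU temperature (C) to a PWM duty cycle (0-255).
# Holds a quiet floor below the low setpoint, full duty above the high one.
def temp_to_pwm(temp_c: float, low=(40, 80), high=(80, 255)) -> int:
    t0, p0 = low    # below t0 C, run at duty p0
    t1, p1 = high   # above t1 C, run at duty p1
    if temp_c <= t0:
        return p0
    if temp_c >= t1:
        return p1
    return round(p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0))

assert temp_to_pwm(30) == 80    # idle: quiet floor
assert temp_to_pwm(90) == 255   # full load: max duty
```

A real service would poll the GPU temperature in a loop and write the result to the appropriate hwmon pwm file (which file that is depends on the fan hardware).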
@@execration_texts Hi, sorry for my bad English. I have a Tesla M40 and installed the heatpipe cooler from an Nvidia 980 Ti with 3 fans (max 4,000 rpm); under stress it reaches a maximum of 65°C :) Now I'm trying to cool the Tesla P100 16GB :)
@@execration_texts I thought there were aftermarket powered fans that fit on them, so the only thing needed is a spare fan power connector... and a wide chassis.
Just so you know, a 7950X3D benchmarks around 38k in Cinebench when optimized (my personal workstation CPU), above a Threadripper for a little less; it's an under-$500 CPU now. But that's a killer value build, I love it!
As a reference, while this system can handle 768 GB of RAM across two CPUs, a Threadripper 7xxx system can take 2TB of RAM. And that's on a single socket board. If you went dual socket EPYC, the max RAM would be 4 TB.
I've seen a dual socket mobo bundled with two 7501s on aliexpress for $800. Not sure if it's a scam or we're entering an era of cheap old epyc workstations
XEON GANG LET'S GOOOOOOOOOOOO! I bought a used Dell T5810 workstation, added a second 200-something GB (240 I think?) SATA SSD, slapped in an RX 6600 and initially ran it with 12 gigs of DDR4, but recently raised that to 20GB of RAM, and replaced the Xeon E5-1650 V3 it came with with an E5-1650 V4, and I'm gaming pretty well. My next planned upgrades are a Xeon 2697 V3, more SSD (maybe a 2TB M.2 using a PCIe-to-M.2 converter), and maybe in a couple of years a 6650 XT or a 6700 XT.
I have the following as my main music making rig Dell Precision T3600 : Intel Xeon E5-2690 @ 2.9Ghz (8C/16T) // 32GB ECC RAM// Gigabyte Nvidia RTX2060 6GB//2TB Samsung 860//2TB WD Black// it does so good
Was there a noticeable difference upgrading from the 1650 v3 to the 1650 v4? I built an X99-platform gaming PC/media server last year and decided to stick with v3-gen CPUs since they have overclocking capabilities for gaming performance. I was super lucky and won a bid on a 1680 v3 for around $60 on eBay not too long ago, so I swapped out the 1660 v3 I had bought a few weeks prior; it definitely made a small increase in performance, but I don't know if it was justifiable. A YouTube channel called "Miyconst" is like the official X99/Xeon tech guru of YouTube, and his testing shows the best CPU for this platform is ultimately the Xeon 2697 v3 locked at max frequency boost, which usually requires a true X99-chipset motherboard. I thought about going with a Xeon 2689 v4 if I can find one at a reasonable price, but I really do appreciate being able to tinker with overclocking on the v3 CPUs. Just curious what your opinion is on going v4 over v3?
Nice! I just recently got a T7910, so a little bit older, but 28 cores / 56 threads + 64GB of DDR4, a GTX 1080, and a 500GB NVMe drive with adapter. It's quite a beast, I might add. I may eventually drop my RTX 2080 Super in there.
Gah, I have a thing for these massive CPUs, and you've just informed me that they're relatively accessible... Stop giving me ideas, I'm already spending too much money!
So glad you mentioned Miyconst; so far he and Tech Yes City are the only ones that routinely revisit Xeon cpus for pro-sumer applications. I wouldn't have done any of my computer builds without them.
I asked this question on your Discord as well. Apologies for the repeated wall of text. I like the idea of the 40 core budget monster for video editing and transcoding. That leads me to my question here. Why don't tech tubers ever show things like numbers of simultaneous transcodes with Plex, or Handbrake HEVC veryslow transcodes or AV1 preset 1 or 0 transcodes? Most just say something like "Handbrake transcode speed in XX minutes," but never say their settings. I really want to build this 40 core 80 thread beast, but I need more info and stats.
Funny that you bring Miyconst up. I had a little talk with him about a nas server solution not long ago. I think he is very knowledgeable in this field
Older tech is cool. I got a Dell PowerEdge R620, 2x CPU, 144GB RAM, 4x 1.2TB SAS drives, altogether around $410. I also got a Dell PowerEdge T410 for $50, then put 6x 6TB SAS drives in it. I also found a Lenovo B490 laptop at my brother's office with no power supply; I happened to have an extra Lenovo X220 power supply and it worked. $29 for an SSD and 16GB of RAM from old laptops, and it's all good.
Old workstations are amazing. I've used an old HP Z200 with a Xeon X3440 and 16GB to host a Minecraft server and home NAS. Upgraded it to an HP Z400 with a Xeon W3565, which now has 40GB; I still need another two 8GB sticks for the full 48GB you can fit.
I would've loved to see how much power it draws at its maximum potential, but regardless, amazing video, nice cinematic shots and great storytelling skillz. Hope you have a safe recovery.
I was racking my brain trying to figure out how you got 40 cores under $800…when I have a 36 core T7920 under my desk. Lmao I’m stoked Skylake-EP is getting super affordable. Big core counts, lots of PCIe lanes, and common bifurcation support makes it a nice upgrade over Haswell/Broadwell. Mine’s running a ProxMox lab after I pulled it from NAS duty in favor of a 4c Kaby Lake Xeon.
not to rain on anyone's parade but when you do the math, skylake IPC is so much lower than zen 5 you can just get a 9950X and have equivalent multithreaded power, and the system would actually be efficient and useful for lightly threaded tasks and gaming. this is the problem with budget CPUs imo. CPU tech has been advancing so fast it's just not worth it to go back to prior generations.
Bought an HP 620 workstation off eBay, upgraded the dual Xeon CPUs and video card, and "boom": a powerful workstation that games. Only downside is it produces a lot of heat. People underestimate old workstations. Best value per dollar.
Very nice. I also considered a very similar route with either the Dell or the Lenovo dual-CPU machines, but there is that issue with a lot of software being unable to make use of both CPUs. I watched a LOT of Miyconst videos as well.

Turns out there is a Xeon chip with 18 cores and 36 threads where not all cores can Turbo Boost, except Miyconst and his friends found a way to unlock Turbo on ALL cores. It's the Xeon E5-2699 v3, with only a 3.6GHz boost clock; but with 18 cores, even at that speed you need some pretty aggressive cooling. HOWEVER, it turns out there is a special version of this CPU. It's NOT in the Intel Ark database, which means it was a custom configuration for a single OEM; I'm guessing a Mac Pro, but I have no idea. What I DO know is that this variant, the E5-2696 v3, is the same silicon as the 2699, except it has a max boost clock of 3.9GHz, plus some additional instruction sets not found on the commercial version. And thanks to the BIOS injection revealed by Miyconst et al., with water cooling that means 36 threads at 3.9GHz, which should be buttery smooth for editing.

I'm a little miffed that my 3080 FE only has 10GB of VRAM, but everything I've read about the cards packed with RAM chips from this generation forward tells me they are just running too hot. They will work fine for a while, but sooner or later they are going to fail. Not like my GTX 980 Ti, which is as good as the day I bought it; in fact, it's going into the same rig. I can keep the VRAM on the 3080 at 80°C max under stress tests. And aside from using the 980's analog output for dedicated gaming on a CRT (that never gets old), I can still do things like dedicate OBS capture to it, separate from the 3080, and use its 6GB of VRAM as additional compute for rendering or whatever. The machine is 3/4 built right now; I'm just modifying the case for an external drive bay and adding some additional fans.

I'm making the case have positive air pressure to keep out dust. All said and done, it will have an MSI X99A motherboard with the OEM version of the 2699 v3, 64GB of quad-channel ECC RAM, a 1500W Seasonic PSU, a Corsair AIO, the RTX 3080 and GTX 980 Ti, 2.5GHz Wi-Fi, a 500GB NVMe Samsung 980 system drive, a 2TB Samsung 970 EVO working drive, and a 16TB enterprise HDD for cold storage, in a modified mid-tower whose name just left my mind (the popular one with the metal bar that hides the cables?). And I've started a separate NAS, also with a couple of enterprise-level HDDs, that will only serve as backup; once any backups are complete, it gets disconnected from the network and the internet.

But it's fun as hell building these kinds of rigs and seeing just how much performance you can squeeze out while doing it on a budget. I love what you've done, and I'd REALLY like to talk you out of that Titan X. That is the VERY BEST card ever made with an analog output, almost identical to my 980 Ti with double the VRAM. I have a nice water cooler if you ever want to let it go....
I really like when people reuse old(er) hardware and make it viable again. And a lot of people are willing to provide their knowledge on what to get, where to look etc. You can save a lot of money buying the proper "old" workstation system instead of throwing your money at the state of the art top consumer hardware available.
@@aChairLeg Remember how you used Linux? Now use Arch Linux, more specifically *BlackArch*, even if you won't use the tools. I want you to go through the pain. (At this point, I think I'm so evil even Satan fears me.)
Old workstations are where it's at! I'm running a T5810 with an E5-2667 v3 (with an E5-2687W v4 waiting to go in), 128GB of ECC registered RAM, an RTX 2060 w/ 16GB, and an M.2 NVMe on an adapter card, running Ubuntu LTS. Before this I used a Dell T3500 for years. The parts are old but they're cheap, well built and have great real-world performance. Welcome to the enterprise-grade club.
I actually bought a Dell 7920 rack mount a few weeks ago (hasn't arrived yet), and this video has me even more hyped about the performance I can hopefully get out of it. 27k on Cinebench? Damn. Although I only got two 12-core Xeons; I don't remember their model names.
Bought an HP Z620 years ago for $200. Upgraded the Xeons to E5-2667 v2. It came with 64GB ECC, two 2TB hard drives, and a 6GB video card. For maximum productivity at minimum cost, old workstations are the way to go.
@@dave7244 If you aren't computer literate, you buy a machine from Best Buy or an iMac. John Wayne said: "Life is tough; it's tougher if you're stupid."
Good video. I just wonder how close that an upgrade to a 5950x from a 3700x on your original rig would have gotten you to this Xeon tower's performance.
I want to eventually test that, but I needed a second PC anyways so I wanted to try something new. I'm guessing the performance isn't too far off honestly
Damn, I get absolute shit performance on my 5900x in Eco Mode, I'm tempted to run R15 in PBO and see if the space heater amount of heat it generates is worth it
A 5950X would have better performance for the majority of things; the Ryzen has higher single-core performance, and most games and workstations don't need 40C/80T. Also, not sure if you know or care, but your idle consumption is probably in the 130W range now, with that system easily pulling 250W under load. Again, awesome PCIe expansion if you need it, but completely overkill for what you use it for.

I have the 5950X, a dual Xeon Scalable Gold 6138, a Xeon E-2146G and an i5-14500. The 5950X with dual NVMe drives, an AMD 6700 XT and 3 monitors pulls about 90W idle, 150W under load. The Golds are at about 70W idle (just boot drive, 64GB RAM, 10GbE NIC) and go to about 250W under load. The 2146G pulls 48W idle, 80W load with a mirrored SSD boot pair, a mirrored NVMe pair, mirrored 18TB enterprise drives, 128GB RAM, and a 10GbE NIC for the TrueNAS server. The 14500 has a single NVMe, 64GB DDR5 and a 2.5GbE NIC; it's used as a Plex server/transcoder and for Home Assistant, virtualized under Proxmox. Idle is 20W, load about 40W.

Either way, I understand the cool factor of enterprise gear. Realistically, it's not efficient, not suited for most tasks, and not really great for a home server either.
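For anyone weighing those idle figures, the running cost is simple arithmetic. A quick sketch; the $0.15/kWh price is an assumed example, not from the comment:

```python
# Annual electricity cost for a machine drawing a constant wattage 24/7.
def annual_cost(watts: float, price_per_kwh: float = 0.15, hours: int = 8760) -> float:
    """watts -> kWh over a year (8760 h) -> dollars at the given rate."""
    return watts / 1000 * hours * price_per_kwh

# A dual-Xeon box idling around 130 W, running around the clock:
assert round(annual_cost(130), 2) == 170.82
# vs. a 20 W idle mini server:
assert round(annual_cost(20), 2) == 26.28
```

So a ~110W idle gap works out to roughly $145/year at that rate, which is the efficiency trade-off the comment is pointing at.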
The video memory issue can be fixed with a Tesla-series card from Nvidia; for example, the Tesla M40 goes for around $100 and has 24GB of GDDR5. Although you will need to tinker with cooling.
Great video of how to use second hand hardware! :D The price of electricity in Denmark is pretty high. If I lived somewhere where the cost was low, I would build a cluster of old dell servers in a heartbeat. They are somewhat cheap and are great for homelabs.
I'm only a few moments into the video; the guy behind you must have thought you had a Tesla, trying to beat the light. Pretty spooky to see the light change color after the accident.
Personally tested passively cooled 24GB Tesla M40 cards, and the temps stay at or around 70-80°C under full load. These workstations are essentially desktop server towers.
Put an Intel Arc in to decode H.264 or H.265 via Quick Sync and you can use this system, or your previous system, without problems. It worked wonders for me.
Xeon Gold 6148s are going for as low as $80: same core count, but higher clocks and higher single-core performance too. 6138s are going for as low as $35, which isn't bad at all.
Just be careful: now that you own one of these bad boys, you'd better get a good UPS to protect your power supply, and don't forget to grab a spare while they're cheap and available, because once they're gone, good luck finding another one that fits. Yeah, that shit happens. GJ though; I'll bet this is the most power-hungry machine you'll ever own :)
I did something similar to your first idea with the watercooling build back in 2016, and I regretted every single decision. The dual E5-2690 v2s chug power like it's nothing, and I used it as my NAS + web host + VM test bed; value-wise it was the worst build I've made, besides those abandoned watercooled PCs from even earlier. At least it runs fine.
Join the discord!!!!!!!!!!!!!!!! discord.gg/2Wj8WanUzn
@@mrsrhardy ANYTHING newer than the Titan is a good choice. Most likely an A770 or RTX card will end up in this workstation
@@aChairLeg Checked out a few of these systems on ebay and can't find anything close to the price you paid. Nice work as it's quite a workstation.
I was thinking of replacing my x299 socketed i9-9900X with one of these, however that PSU is a real downer. I recently purchased an ASUS ProArt 4080 Super and it's powered by 3x 8 pin PCIe (with the included adaptor).
Strange that it has so many full length PCIe slots with nowhere near as many PCIe power cables as you'd need.
for your passive cooling test.. look into getting a tesla m40 24gb pg600 (the pg model will indicate internal generation which means better write speeds, so the pg600 is the last of the line and fastest. They are only about $80, but are server grade with no peripherals so you need a quadro 600 to pair with. You designate the program to the Tesla, and route the output to the Quadro. It basically has the same architecture and frame as the Titan X so use those drivers, and any 3rd party stuff like water coolers meant for the Titan X to get compatibility. Hope this helps, it took me a fair amount of research to piece this stuff together to build a VDI gaming server with a few linked together with NVlink for shits n giggles. It's probably the cheapest $/GB gpu solution currently, and wild to build out ~100GB GPU for the price of a 3060 * but you definitely will need more cooling for them, they are power hungry fuckers at 300W each, I highly recommend getting the 3d printed fan adapters for them on ebay; they operate around ~220wat nominal in my work loads, and youll want to get special splitters that turn TWO 8 pin power supplies into one 8pin for each GPU, as each 8 pin is only giving around 150W, and you can't use 6 pin converters because 6 pin is only around 75W. Power is probably the main reason they're so cheap and people haven't caught on to it as a solution, well that and not many people have motherboards that support multiple GPU/NVLink operations anymore but realistically it's so easy to build it out on something like you'd built here and give them 2 PSU, and set it up to only run what you actually need because frankly even 1 is a beast let alone 2 paired with NVLink or FOUR, so just set it up to run 2 per cheaper PSU, with one PSU as the master for everything else for a motherboard, and the other only for the GPU's themselves. 
In a configuration like yours, I'd have one PSU power the first GPU on each motherboard, and the second PSU power the second. It shouldn't hurt them not to have that PSU's power controlled via motherboard communications; they're power-hungry server-grade cards with their own built-in power management, designed to be plugged into hosts that might not support controlling their power. This also helps keep them from all pulling the potential 75 W from the PCIe slot at once, reduces heat on the motherboard, and with a modular PSU you can route it outside the case to redirect that extra heat elsewhere. Food for thought for anyone going down this dark road. I'd suggest getting a v4 though, they've really been coming down lately. Unless you want to attempt the EPYC dark road: there are benefits and arguments for getting a dual-socket EPYC SP3 board and only populating one socket at first, leaving the other half for a future second CPU when you need to expand, which makes the machine more future-proof. Bear in mind the EPYC CPUs with a P suffix (e.g. the EPYC 7502P vs the 7502) are single-socket parts and won't run in a dual configuration. Anyway, that's just what I've been playing with lately for an RSPS.
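The connector math in the comment above can be sanity-checked in a few lines. A sketch, assuming the standard PCIe figures for slot and cable limits; the 300 W and ~220 W numbers are the ones the commenter quotes:

```python
# Rough power-budget check for a Tesla M40 fed by a dual-8-pin-to-one adapter.
# Connector limits are the standard PCIe figures; TDP numbers are from the comment.
PCIE_SLOT_W = 75      # power available from the PCIe slot itself
PCIE_8PIN_W = 150     # per 8-pin PCIe cable
PCIE_6PIN_W = 75      # per 6-pin cable (why 6-pin converters aren't enough)

def budget(cables_8pin: int) -> int:
    """Total watts available to one card from the slot plus its cables."""
    return PCIE_SLOT_W + cables_8pin * PCIE_8PIN_W

M40_TDP = 300         # worst case per the comment
M40_NOMINAL = 220     # typical load per the comment

# One 8-pin alone (225 W) can't cover a 300 W peak; two can (375 W).
assert budget(1) < M40_TDP <= budget(2)
```

This is why the comment recommends doubling up 8-pin leads per card rather than adapting from 6-pin.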
Discord is for kids.
The thing about old workstations is that they were once worth thousands and thousands for a reason! A 10-year-old workstation can go further than most people realize.
The other side is that these beasts are designed to be able to work 24/7 for a few years. And because of that, often they have been run 24/7 for years. I had a Z800 (well I still have it, but it's dead), and eventually all the RAM went bad and had to be replaced. Then suddenly it just died on me.
Until they break down, out of warranty and with no vendor support. We had to tell a client we couldn't support his system any longer: motherboard issues along with part failures.
Got a 2010 MacPro Tower, with two Xeons. Thing is still usable for high-end 3D work today.
Dell T5500, 144 GB RAM, 2× X5670, 2060 12GB GPU. Got the box for $100 AUD, the RAM for $25 per stick (9 sticks), plus NVMe and a USB 3.1 card; total cost to build the system in 2021 was about $800, with Win 10 Pro OEM (Dell).
Fantastic video editing machine.
@@tomleykisfan7280 And you got it on a budget. Maybe I spent too much on mine two years ago.
now you have ascended to the enterprise hardware realm and can look down at all the plebeians with their consumer hardware
It's just simply so much better
@@aChairLeg It is. I got a Chinese motherboard and built a dual Xeon E5-2698 v3 system: 32 cores total at 3.6 GHz boost, 256 GB of DDR4, a 1 TB M.2, a 3.2 TB 12Gb U.2 drive, and a 1080 Ti to top it all off, running Linux. The CPUs have taken every single large task I throw at them no problem; I'll be rendering in Blender and doing local AI at the same time while web browsing, with no slowdown whatsoever. I only paid $500 for my system too, since I reused the graphics card and power supply. It's simply a dream to work with, and I'm saving up to upgrade to a 3090 Ti since I need that VRAM and ray tracing for 3D animation and AI. RTX accelerates AI and 3D renders, and my 1080 Ti isn't cutting it anymore with only 11 GB of VRAM and no RT. Old Xeons absolutely rip even against newer AMD CPUs that lack the core count; I've seen a single Xeon like the 18-core 2699 v3 compete with a Ryzen 7 3700 despite being old X99.
You can also feel above most enterprise hardware by buying DIY prosumer/workstation-class hardware instead, like the ASUS WS C621E SAGE (C621), ASUS Z10PE-D8 WS (X99), or ASRock X79 Extreme11 (X79), plus several Chinese-brand X79 and X99 boards (new). They're more affordable now and good value, since they're on the used market and the Chinese ones are new. I'd argue many of those boards are better quality than, say, many Supermicro and Tyan boards and a good chunk of HP, Lenovo, and Dell boards (and many also have IPMI, etc.). Only the very high-end and truly proprietary boards are really different enough. =)
I bought a rig with one of these threadripper pro 5995wx .... is that enterprise?
Literally cannot go back once you went workstation grade hardware. That stability is amazing. I got a Xeon w5-2465x with 128GB DDR5 ECC RAM, plus a RTX A4500. Storage I went with Optane 800GB U.2 drive, and a Micron 7450 6.4TB drive. Pricy yes, but stable and reliable.
*"Two Hundred and FIFTY DOLLARS"* Aaaaand thanks to this video they are now 400 and up, _sigh._
Sorry to hear about your accident. Hope you get better soon.
I'm getting there thankfully!
I know this was for the incredible video maker, but as someone who had a nasty car accident in January, this positively impacts me as well.
@@tradingnichols2255 I hope you get better as well. Vehicle accidents suck!
@@ewasteredux Thank you. I greatly appreciate it!
@@aChairLeg womp womp
I too got a T7820 ($200) and after upgrades it's running dual 6126 for 24c/48t and 12x32 (384GB) RAM. It's a very capable platform for workstation and home server duties!
Nice!
Holy Moses, you got a good deal!!!! A T7820 here is about $1000 with some pedestrian CPUs and 64 GB RAM. Just 12× 32 GB would cost $600 if not more (€50 is about the lowest per 32 GB RDIMM). I got the cheapest T5810 I could find for €180; it came with an E5-1620 and 32 GB in 8 GB sticks. Now it has 96 GB (€120) and a 2683 v3 (€30).
If I could find a T7820 for €600 I'd be pretty happy.
Does this PC support virtualization???
@@jainayrogeorge2924 indeed. Proxmox is running great.
@@jainayrogeorge2924 Like most PCs, with the appropriate software, yes. Some hardware you need to be a bit selective about, but basically all PCs "support" virtualization. You'll want enough RAM, CPU cores, and supporting hardware for your VMs though; my box right now has 14 CPU cores, 96 GB RAM, and 5 gigabit NICs, which is enough to run a few decent virtualized servers.
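A minimal sketch of how you'd actually check this on Linux: the `vmx` (Intel VT-x) or `svm` (AMD-V) flag in `/proc/cpuinfo` indicates hardware virtualization support. The sample string below is illustrative:

```python
def supports_virtualization(cpuinfo_text: str) -> bool:
    """True if any CPU flags line advertises Intel VT-x (vmx) or AMD-V (svm)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags or "svm" in flags:
                return True
    return False

# On a real box: supports_virtualization(open("/proc/cpuinfo").read())
sample = "flags\t\t: fpu vme de pse tsc msr pae vmx sse2"
assert supports_virtualization(sample)
```

(You'd still enable VT-x/VT-d or SVM in the BIOS for hypervisors like Proxmox to use it.)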
Dude, I feel you. The pain I deal with daily trying to shoot videos is incomprehensible to my old self. I was a passenger in a car accident in 2018. I almost died, but was blessed to walk away with broken ribs, herniated discs, and vertebrae knocked out of alignment through my back and neck. I had to medically retire from the military and adjust to a new life. I'm glad you were able to walk away and have a solid support base. Just found you, but I love the video and your perseverance. Keep going.
All without an opioid addiction? Ur blessed
Damn man. Glad you're alive. Hope things get better somehow.
@@RushinVr6 yeah that's pretty amazing. Kratom is insanely useful though. I know a couple people that went through hellish accidents that have used it for over a decade now. It's finally getting studied thoroughly in depth in the states for the compounds that are found in the leaves, but it will never be used to treat pain because it would ruin big pharma's opioid dealing
You should probably spread the RAM modules evenly between the CPUs. Otherwise the second CPU is starved, as it has to stall whenever it needs data from the other socket's memory. The second CPU board has memory slots for a reason.
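The point about spreading DIMMs can be sketched as simple round-robin placement across sockets, so each CPU ends up with local memory on its own channels (slot counts here are illustrative):

```python
def populate(dimms: int, sockets: int = 2, channels_per_socket: int = 6):
    """Round-robin DIMMs across sockets so each CPU gets local memory."""
    assert dimms <= sockets * channels_per_socket, "more DIMMs than slots"
    counts = [0] * sockets
    for i in range(dimms):
        counts[i % sockets] += 1
    return counts

# 8 DIMMs -> [4, 4]: both CPUs keep their memory controllers fed locally,
# instead of CPU1 stalling on every access over the inter-socket link.
print(populate(8))
```

On a running system, `numactl --hardware` shows how much memory each NUMA node actually has.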
Hot Take: Dell (USED) Workstations are the best value to price. Especially during "refresh" cycles where companies liquidate them on eBay.
Dell servers are even better. Like an R930 with 4× E7-8880 v4 + 192 GB RAM for $1000.
I saw this and ended up getting this same barebones 7820 and OH MY GOD. If they didn't use all proprietary parts, I would never build DIY PCs again, and personally it's gonna take a lot to get me off the used-workstation train from now on. Awesome video.
Never seen stacked cpus before, that’s really super neat
I love this approach- "We have the 1950X at home."
Your focus on older hardware reminds me of Iceberg Tech. Subbed!
I rarely sub off of 1 great video, but you supporting that this is a norm for this channel, I have subbed. Also, very rare, I added this video as a FAVORITE. A FAVORITE! A truly rare event!
Why? Because I might build one of these.
I took similar route and am in love with my HP Z840 workstation. Nvidia 3090 and it is a great AI monster for little money.
My first Dell Precision was a T5600, which I got for free from my employer (gifted/fully written off) after it was decommissioned in 2017; plus an extra 825W PSU that had been trashed along with another T5600 chassis. Still have it as a backup unit at our second apartment. I upgraded it to dual E5-2680's (from dual E5-2667) back in 2018. The only investment was a 0.5TB SSD, single use Win10 Pro license, and repurposed RX590.
I just got a T7820 with a 4110 and 32 GB of RAM that I won on a bid for $178. After upgrading the CPU to a 6140 and 48 GB of RAM, and selling the WX 2100, I'm still under $200. Love these older workstations.
I got my 5810 from the same place a couple years ago. Not a powerhouse but I was super happy with the purchase - it has been my "living room"/media PC ever since... and I play pretty modern games on it all the time. 1650 v4. I also do some music production and it handles a crapton of channels and effects quite well, especially after a small ram upgrade.
I got the single-CPU Lenovo ThinkCentre because it had a good 8-pin for a 3060 12GB. I'm never going back. Came with a Xeon 2135 and 64 GB of RAM for $190. This thing just runs beautifully. Congrats on the build; I definitely used earlier videos you made to look further into this, so thank you.
I'm trying to do the same thing, but i'm having a hard time deciding which xeon CPU to use. THere are so many options.
@@kevboost I needed cores over speed. Nothing I do is terribly CPU hungry, but I do a lot of different operations at once. The 2135 is a 6-core, and all I could really afford at the time. I'm seriously looking at a higher-core-count processor or going to 128 GB of RAM next.
I myself have a 40 core 80 thread build with dual Xeon E5-2673 V4s and i love it
Arrived for the tech, but fell in love with your cats. My twin brother has a tuxedo-marked cat; who knows what his breed is, he was a Humane Society rescue cat. But Toby-cat Martinez was supposed to be for my brother's 11-yo son. Like his dad, he would have preferred a dog, but Toby-cat was what he got. My nephew did none of the upkeep work for his new pet, and Toby-cat quickly figured out what side of his bread was buttered. He soon became my brother's alarm clock, and I watched my twin go from a dyed-in-the-wool dog dude to a Cat Dad. Pretty soon he was serving Toby-cat 3 cheeseburgers a day (kidding, that is our fav food). Through COVID, my brother's office became his home office, with Toby-cat either at his feet or making his presence known when he wanted back into the house. Both your cats are adorable, says the twin who has always loved cats.
Crazy that the 14700k gets 36000 points in cinebench r23, and for a while you could get cpu ram and motherboard for $500 in a bundle deal at microcenter. Of course, the final PC build would end up being twice the cost of this, but pretty crazy how far hardware has advanced.
For LGA 1700, only W680 has ECC support.
Hello, fellow bad-back haver!
Do whatever exercises your doctor/physical therapist gave you, on whatever schedule they gave you, or on some regular schedule if you didn't get a schedule.
Also, get a back brace.
Also also, if your car doesn't have lumbar support for its seat, consider getting a lumbar support pillow. I found one at an auto parts store, and I wanna buy like ten more, because that one took me from not being able to comfortably drive to often feeling better *after* driving than before!
You should replace the GPU with an Arc A750. Hear me out: QuickSync beats the crap out of CUDA on encoding, and you can encode VP9 directly, so you save time on the YouTube conversions. Try it in a video with this nice encoding CPU setup and you'll see what I mean. I replaced my Quadro RTX A4000 with an Arc A750, and I now get 40+ simultaneous 4K HEVC (H.265) to 1080p H.264 transcodes in Plex with zero buffering. However, I also have 4× 4 TB PCIe 3.0 NVMe SSDs in RAID 5 (bifurcation, which your motherboard supports), so 12 TB total as my storage, and my transcode drive is a 128 GB RAM drive that dumps to disk before a reboot; the dump happens lightning fast too. I might increase the RAM drive to 192 GB, since I have 256 GB of DDR3 ECC total and can never seem to go over 50 GB of RAM usage even with 200 Chrome tabs open and other apps.
What is your base workstation setup (based on)? What RAM disk? What RAID card? How much did your setup cost? Could you please elaborate? Sounds like you use it as a home media / lab server setup, exactly what I'm thinking of! Any feedback how it handles virtualization & *especially* what's its power consumption like & how loud it gets under different loads? Thanks in advance!
@ The RAM disk is probably just system RAM used as a storage device with tmpfs or ramfs. Likely not a RAID card; probably a PCIe x16 to 4× M.2 splitter with software RAID 5.
Do you know if the A770 would be able to run more streams? Especially the 16GB version?
@@davidmcken I’ve never tried, but my personal opinion is that the A580 or A750 are just priced better for the performance you get. If you wanted to squeeze every possible transcode out of the setup? Probably, but it’s not worth the $100 premium over the A750 and certainly not the $140-ish premium over the A580. Maybe a 10% increase in transcodes? Maybe 20% increase? Definitely not worth double the price over the A580 or the 50% price increase over the A750. A580 appears to be the sweet spot or the A750 if you have the extra $30 budget is a no brainer. If you have a desktop Intel CPU, remember to disable the integrated GPU completely prior to installing Plex. And for me, Windows appeared to give me better transcode performance than Linux, just another hint that could hopefully save you some time. Intel still has work to do on their Linux Arc drivers.
@@bleeb1347 OK, np. I'm seeing A770s for about $300 atm, and I'm looking at one since Level1Techs hinted it can be cross-flashed to a Flex card, opening up the possibility of SR-IOV as well, and 16 GB of VRAM seems reasonable for the price. If ffmpeg transcoding performance comes free for the ride, it's a good card for a multipurpose home-lab server; at the current price I might as well "splurge" on the highest model so I have room to do stuff like OpenAI's Whisper or anything else I might want to throw at it on the AI front.
As the owner of a 22 core Xeon, 64 GB RAM and RTX A4000 graphics card, I understand your early impressions of using such a system.
I am amused: the Xeons you bought had an MSRP of $3k each, and you got them for $150 or so. Buying old tech is so nice...
I think old xeons are dirt cheap but replacing the board is more of an issue
@@quintrapnell3605 The other thing is that I can get a stupid amount of memory: 64 GB DIMMs for $85? And I can fit 12?
In my country they still sell for msrp
Last year I bought a barebones Precision 7920 (a wider 7820 with a bit better cooling and more expansion slots) for $460, a pair of Xeon Gold 6146s for $230, and 128 GB of RAM to use as a home server. I loaded it up with 8 3.5" HDDs and a few GPUs, and installed Proxmox. I ran Cinebench R23 and got 1084 SC and 27794 MC. I ended up selling it (for a profit at least) because I really just didn't need it lol. It was a beast though, and I wish I had found a reason to keep it. Cheers
I got a T5820 during covid when everything was unobtanium. Replaced the CPU with a 12core v3 CPU and that workstation did everything I needed until I built myself a new machine. Since then it's been my proxmox server and I'm just getting around to replacing it with an Epyc system. For the ~$400 I put into it it was definitely worth it.
Hi, what model CPU did you use?
@@dimidimi6243 E5-2678 V3. You can get them a lot cheaper now than I did then.
@@dimidimi6243 I know it's months later but in case anyone else is interested a Xeon E5-2678V3 was the replacement CPU. Cost me $80 back in 2020 you can get them now for $20.
I use one of these similar workstations at work. I was amazed just how much they can handle.
You can get an Nvidia Tesla P40 with 24 GB of VRAM for around $200; good for VRAM-heavy video editing.
I wish there were an easier way to cool these. I had to design custom ducting and fan mounting, plus a Linux service to dynamically adjust the fan speed.
@@execration_texts You're right, the Teslas are a little problematic to cool because of the passive cooling design.
@@execration_texts Hi, sorry for my bad English. I have a Tesla M40 and installed the heatpipe cooler from an Nvidia 980 Ti with 3 fans (max 4,000 rpm); under stress it reaches a maximum of 65°C :) Now I'm trying to cool a Tesla P100 16GB :)
@@execration_texts I thought there were aftermarket powered fans that fit on them, so the only thing needed is a spare fan power connector... and a wide chassis.
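For anyone attempting the dynamic fan-speed service mentioned earlier in this thread: the core of it is just a temperature-to-duty-cycle curve. A hypothetical sketch; the curve points and the idea of polling `nvidia-smi` for the temperature are assumptions, not anyone's actual setup:

```python
# Hypothetical fan curve for a passively cooled Tesla with bolted-on fans:
# below 40°C idle at 20% duty, ramp linearly to 100% at 80°C.
CURVE = [(40, 20), (80, 100)]

def duty_cycle(temp_c: float) -> int:
    """Map a GPU temperature to a fan PWM duty cycle (percent)."""
    (t0, d0), (t1, d1) = CURVE
    if temp_c <= t0:
        return d0
    if temp_c >= t1:
        return d1
    # linear interpolation between the two curve points
    return round(d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0))

# A real service would poll `nvidia-smi --query-gpu=temperature.gpu
# --format=csv,noheader` every few seconds and write the result to the
# fan's PWM interface.
```
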
Just so you know, a 7950X3D benchmarks around 38k in Cinebench when optimized (my personal workstation CPU), above a Threadripper for a little less; it's an under-$500 CPU now. But that's a killer value build, I love it!
As a reference, while this system can handle 768 GB of RAM across two CPUs, a Threadripper 7xxx system can take 2TB of RAM. And that's on a single socket board. If you went dual socket EPYC, the max RAM would be 4 TB.
I've seen a dual socket mobo bundled with two 7501s on aliexpress for $800. Not sure if it's a scam or we're entering an era of cheap old epyc workstations
And it would run at 6000 MT/s... not even the same league.
I like how they went 3d cpu mounting.
Wild design!
The Lenovo was fittingly old school single board dual CPU. 😂😂😂
I rarely ever watch a sponsor segment the full way through - amazing camera work, was truly mesmerizing.
would be interested to see how performance would have compared with a straightforward CPU swap-out to a Ryzen 5950X
Thought the same thing.... Guessing same or better performance?
I LOVE WORKSTATIONS
XEON GANG LET'S GOOOOOOOOOOOO!
I bought a used Dell T5810 workstation, added a second 200-something (240 I think?) GB SATA SSD, slapped in an RX 6600, initially ran it with 12 gigs of DDR4 (recently raised to 20 GB), and replaced the Xeon E5-1650 v3 it came with with an E5-1650 v4, and I'm gaming pretty well. My next planned upgrades are a Xeon 2697 v3, more SSD (maybe a 2 TB M.2 using a PCIe-to-M.2 converter), and maybe in a couple of years a 6650 XT or 6700 XT.
I have the following as my main music making rig Dell Precision T3600 : Intel Xeon E5-2690 @ 2.9Ghz (8C/16T) // 32GB ECC RAM// Gigabyte Nvidia RTX2060 6GB//2TB Samsung 860//2TB WD Black// it does so good
Was there a noticeable difference upgrading to the 1650 v4 over the 1650 v3?
I built an X99-platform gaming PC/media server last year and decided to stick with v3-gen CPUs, since they have overclocking capabilities for gaming performance. I was super lucky and won a bid for a 1680 v3 at around $60 on eBay not too long ago, so I swapped out the 1660 v3 I had bought a few weeks prior. It definitely made a small increase in performance, but I don't know if it was justifiable.
This YouTube channel called "miyconst" is like the official X99/Xeon tech guru of YouTube, and his testing shows the best CPU for this platform is ultimately the Xeon 2697 v3, locked at max frequency boost, which usually requires a true X99-chipset motherboard.
I thought about going with a Xeon 2689 v4 if I can find one for a reasonable price, but I really do appreciate being able to tinker with overclocking the v3 CPUs.
Just curious what your opinion is on going with the v4 over the v3 CPUs?
That's an awesome build! The Dell T7820 is surprisingly good, now I want to build one myself!
dell pro gear is a dream to work on, and cheap as hell
@@dercooney I would agree if they used standard ATX PSUs & motherboards.
Nice! I just recently got a T7910 so a little bit older, but 28 cores, 56 threads + 64GB of DDR4, and a gtx 1080. oh and a 500GB nvme drive with adapter. Its quite a beast I might add. I may eventually drop my RTX 2080 Super in there.
Gah, I have a thing for these massive CPUs, and you've just informed me that they're relatively accessible... Stop giving me ideas, I'm already spending too much money!
They're so cool looking though!!
Love that you tried BeamNG on it, really like seeing how the sim performs on high core count systems like yours
Absolute Monster. I just got the Lenovo P520 and now I’m rethinking.
So glad you mentioned Miyconst; so far he and Tech Yes City are the only ones that routinely revisit Xeon cpus for pro-sumer applications. I wouldn't have done any of my computer builds without them.
13:02 The storage warning is perfect.
I asked this question on your Discord as well. Apologies for the repeated wall of text.
I like the idea of the 40 core budget monster for video editing and transcoding. That leads me to my question here. Why don't tech tubers ever show things like numbers of simultaneous transcodes with Plex, or Handbrake HEVC veryslow transcodes or AV1 preset 1 or 0 transcodes? Most just say something like "Handbrake transcode speed in XX minutes," but never say their settings. I really want to build this 40 core 80 thread beast, but I need more info and stats.
Funny that you bring Miyconst up. I had a little talk with him about a nas server solution not long ago. I think he is very knowledgeable in this field
Mate this vid was so good that I went and bought a 7820 as well. Posts and boots a bit slow but it is CORE CENTRAL. Mental and massive. cheers
Excellent video, enterprise grade workstations are awesome and cheap on the used market.
I bought a used dell t7810 and upgraded it to 44cores with 512gb ram. Never been happier for analysis for imaging. Everything else crawls.
This was way more fascinating than I expected it being. Your enthusiasm for it really was gripping and now I want to build one of these!
These sorts of videos are like butter on my oc brain. The level of soothing it provides
This is waay better than the usual high core xeon builds with poor ipc.
this is amazing.i never thought about looking at second hand corporate servers! brilliant!
Those PCIe slots support Bifurcation... You can get a 4X M.2 carrier board and put four NVMe drives in one slot.
I do this with mine at work and it’s nuts in RAID on Linux
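For sizing such an array: software RAID 5 sacrifices one drive's worth of capacity to parity, which is where figures like "4× 4 TB → 12 TB usable" (mentioned in another comment) come from. A quick sketch:

```python
def raid5_usable_tb(drives: int, size_tb: float) -> float:
    """RAID 5 keeps one drive's worth of parity; the rest is usable."""
    assert drives >= 3, "RAID 5 needs at least three drives"
    return (drives - 1) * size_tb

assert raid5_usable_tb(4, 4.0) == 12.0   # 4x 4 TB NVMe -> 12 TB usable
# On Linux the array itself would be built with something like:
#   mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/nvme[0-3]n1
```
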
Very cool video! What's the power usage?
Older tech is cool. I got a Dell PowerEdge R620: 2× CPU, 144 GB RAM, 4× 1.2 TB SAS drives, altogether around $410. I also got a Dell PowerEdge T410 for $50 and put 6× 6 TB SAS drives in it.
I also found a Lenovo B490 laptop at my brother's office with no power supply. I happened to have a spare Lenovo X220 power supply and it worked. $29 for an SSD, plus 16 GB of RAM from old laptops, and it's all good.
Old workstations are amazing. I've used an old HP Z200 with an Xeon X3440 and 16GB to host a minecraft server and home NAS. Upgraded it to a HP Z400 with Xeon W3565 which now has 40GB, still need to get another two 8GB sticks for the full 48GB you can fit.
I would've loved to see how much power it draws at its maximum potential, but regardless, amazing video, nice cinematic shots and great story writing skillz, hope you have a safe recovery
I'll check after work and let you know
@@aChairLegkiller long day huh?😂
Glad you're OK man! That car crash was intense.
Someone gave me some E5-2650v4s, a TON of 16GB/32GB ECC DDR4 and a pile of SAS3 disks. I've been on a journey like yours, ever since.
I was racking my brain trying to figure out how you got 40 cores under $800…when I have a 36 core T7920 under my desk. Lmao
I’m stoked Skylake-EP is getting super affordable. Big core counts, lots of PCIe lanes, and common bifurcation support makes it a nice upgrade over Haswell/Broadwell.
Mine’s running a ProxMox lab after I pulled it from NAS duty in favor of a 4c Kaby Lake Xeon.
Not a bad option for people who have a tight budget.
waiting for the water cooling video! have fun with that 2-tier cpu system
not to rain on anyone's parade but when you do the math, skylake IPC is so much lower than zen 5 you can just get a 9950X and have equivalent multithreaded power, and the system would actually be efficient and useful for lightly threaded tasks and gaming. this is the problem with budget CPUs imo. CPU tech has been advancing so fast it's just not worth it to go back to prior generations.
Bought an HP 620 workstation off eBay, upgraded the two Xeon CPUs and the video card, and "boom", a powerful workstation that games. The only downside is it produces a lot of heat. People underestimate old workstations. Best value per dollar.
Hope you are better now after your accident ❤
this is one of my favorite vids this year
@12:48 You're not turning proxies off to scrub; that moving around the timeline is after you turned proxies on.
Very nice. I also considered a very similar route with either the Dell or the Lenovo dual CPU machines. But there is that issue with a lot of software unable to make use of both CPUs.
I watched a LOT of Miyconst videos as well.
Turns out that there is a Xeon chip with 18 cores and 36 threads, but not all cores can Turbo Boost. Except Miyconst and his friends found a way to unlock Turbo on ALL cores.
It's the Xeon E5-2699 v3. This is only a 3.6GHz boost clock. But with 18 cores, even at that speed you need some pretty aggressive cooling.
HOWEVER, it turns out that there is a special version of this CPU. It's NOT on the Intel Ark database, which means that this was a custom configuration for a single OEM. I'm guessing for a Mac Pro, but I have no idea.
What I DO know is that this variation, the E5-2696 v3 is the same silicon as the 2699, except that it has a max boost clock of 3.9GHz. Plus it has some additional instruction sets not found on the commercial version.
And thanks to the BIOS injection revealed by Miyconst, et al, with water cooling that means 36 threads at 3.9GHz, which should be buttery smooth for editing.
I'm a little miffed that my 3080 FE only has 10 GB of VRAM, but everything I've read about the cards packed with RAM chips from this generation forward tells me that they just run too hot.
They'll work fine for a while, but sooner or later they're going to fail. Not like my GTX 980 Ti, which is as good as the day I bought it; in fact, it's going into the same rig. And I can keep the VRAM on the 3080 at 80°C max under stress tests.
And aside from using the 980 analog output just for dedicated gaming on CRT (that never gets old), I can still do things like dedicate OBS capture to it, separate from the 3080, and use its 6Gb of VRAM for additional compute power for rendering or whatever.
The machine is 3/4 built right now, just modifying the case for an external drive bay and add some additional fans. I'm making the case have positive air pressure to keep out dust.
All said and done, it will have an MSI X99A motherboard with the OEM version of the 2699 v3, 64 GB of quad-channel ECC RAM, a 1500 W Seasonic PSU, a Corsair AIO, the RTX 3080 and GTX 980 Ti, 2.5 GHz Wi-Fi, a 500 GB NVMe Samsung 980 system drive, plus a 2 TB Samsung 970 EVO working drive and a 16 TB enterprise HDD for cold storage, in a modified mid-tower whose name just left my mind. The popular one with the metal bar that hides the cables?
And I've started a separate NAS device also with a couple of Enterprise level HDDs that will only serve as backup. And once any backups are complete, it will be disconnected from the network and the internet.
But it's fun as hell building these kinds of rigs, and seeing just how much performance you can squeeze out while doing it on a budget.
I love what you've done, and I'd REALLY like to talk you out of that Titan X. That is the VERY BEST card ever made with an analog output. Almost identical to my 980 Ti, with double the VRAM.
I have a nice water cooler if you ever want to let it go....
Very cool, love seeing these old workstation rebuilds. Curious what was the wattage it's pulling on average?
500 GW/h.
I too was hit as a pedestrian, the driver was drunk. Hope you have a speedy recovery. Great video and amazing build for the cost, liked and Subbed. 😇
I really like when people reuse old(er) hardware and make it viable again. And a lot of people are willing to provide their knowledge on what to get, where to look etc.
You can save a lot of money buying the proper "old" workstation system instead of throwing your money at the state of the art top consumer hardware available.
Wake up babe new chairleg video dropped
I was already awake wym
@@aChairLeg Remember how you used Linux? Now use Arch Linux, more specifically *BlackArch*, even if you won't use the tools. I want you to go through the pain. (At this point, I think I'm so evil even Satan fears me.)
hi abdo
@@nikolan123 hi Niko
@@Facade866 hiii facade
This beast must heat up your room real good
Id have to try and Hackintosh that
Old workstations are where it's at! I'm running a T5810 with an E5-2667 v3 (and have an E5-2687W v4 waiting to go in), 128 GB of ECC registered RAM, an RTX 2060 w/16GB, and an M.2 NVMe on an adapter card. It's running Ubuntu LTS. Before this I used a Dell T3500 for years. The parts are old, but they're cheap, well built, and have great real-world performance. Welcome to the enterprise-grade club.
What's the lowest idle power this draws?
Did you find out? I'm dying to know.
@@tradingnichols2255 tbh I never rechecked, and I'm not sure he's done a video on it yet, not that I know of.
@@tradingnichols2255 Don't even try to find out; it's high. Gas guzzler.
never had any problems using my 3700x when I was editing videos and streaming, nice cat btw
I actually bought a dell 7920 rack mount a few weeks ago (hasnt arrived yet) and this video has me even more hyped about the performance I can hopefully get out of it
27k on cinebench? Damn
Although I only got 2× 12-core Xeons; I don't remember their model names.
The nice part is you can always upgrade!
Recover well ❤
Bought an HP Z620 years ago for $200. Upgraded the Xeons to E5-2667 v2s. It came with 64 GB ECC, two 2 TB hard drives, and a 6 GB video card. For maximum productivity at minimum cost, old workstations are the way to go.
I had the Z820. Amazing machine. I tell people they should buy them but people don't listen to me and get meme machines instead or a mac.
@@dave7244 If you aren't computer literate, you buy a machine from Best Buy or an iMac. John Wayne said: "Life is tough; it's tougher if you're stupid."
That's a dope Monster PC
What a steal! I love this, when used Hardware performs so good.
Good video. I just wonder how close that an upgrade to a 5950x from a 3700x on your original rig would have gotten you to this Xeon tower's performance.
I want to eventually test that, but I needed a second PC anyways so I wanted to try something new. I'm guessing the performance isn't too far off honestly
@@aChairLeg I mean sure, performance may be similar, but I'm sure the expandability of that workstation is insane. PCIe bandwidth for days.
Damn, I get absolute shit performance on my 5900x in Eco Mode, I'm tempted to run R15 in PBO and see if the space heater amount of heat it generates is worth it
A 5950X would have better performance for the majority of things; the Ryzen has higher single-core. Most games and workstations don't need 40C/80T. Also, not sure if you know or care, but your idle consumption is probably in the 130 W range now, with that system easily pulling 250 W under load.
Once again, awesome PCIe expansion if you need it, but completely overkill for what you use it for.
I have the 5950X, a dual Xeon Scalable Gold 6138 box, a Xeon E-2146G, and an i5-14500. The 5950X with dual NVMe drives, an AMD 6700 XT, and 3 monitors pulls about 90 W idle, 150 W under load. The Golds are at about 70 W idle (just a boot drive, 64 GB RAM, 10 GbE NIC) and go to about 250 W under load. The 2146G pulls 48 W idle, 80 W load with a dual-SSD mirrored boot, dual mirrored NVMe, dual mirrored 18 TB enterprise drives, 128 GB RAM, and a 10 GbE NIC for the TrueNAS server. The 14500 has a single NVMe, 64 GB DDR5, and a 2.5 GbE NIC; it's used for a Plex server/transcoding and Home Assistant, virtualized under Proxmox. Idle is 20 W, load about 40 W.
Either way, I understand the cool factor of enterprise gear. Realistically, it's neither efficient nor well suited for most tasks, and not really great for a home server either.
@@bhume7535 Power draw is also insane, good luck running it for like half a day every day.
Congratulations for making the best of a tough situation!
I've been running dual 6148s for a sixth year already. Not a single problem, except replacing the dead 1080 Ti with a 3090 in 2022.
The video memory issue can be fixed with NVIDIA's Tesla series. For example, a Tesla M40 goes for around $100 and has 24 GB of GDDR5, although you will need to tinker with cooling.
Dual E5-2699s for $40 each. The whole build, including a K80, was under $500, with 32 TB of storage and 10GbE.
Hope you get better, back issues suck!
Great video on how to use second-hand hardware! :D The price of electricity in Denmark is pretty high. If I lived somewhere where the cost was low, I would build a cluster of old Dell servers in a heartbeat. They are somewhat cheap and great for homelabs.
Myconst is GOAT.
I'm only a few moments into the video, but the guy behind you must have thought you had a Tesla, trying to beat the light. Pretty spooky to see the light change color after the accident.
I've personally tested passively cooled 24GB Tesla T40 cards, and the temps stay at or around 70-80°C under full load. These workstations are essentially desktop server towers.
Put in an Intel Arc card to decode H.264 or H.265 via Quick Sync and you can use this system or your previous one without problems. Worked wonders for me.
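For anyone wanting to try this, a minimal sketch of what the comment describes, using ffmpeg's QSV (Quick Sync Video) support. Assumes an ffmpeg build with QSV enabled and an Intel GPU exposed at the default render node; `input.mp4`/`output.mkv` are placeholder filenames:

```shell
# Check that this ffmpeg build lists the QSV hardware accelerator
ffmpeg -hide_banner -hwaccels

# Hardware-decode H.264 on the Intel GPU via Quick Sync, then
# hardware-encode to H.265 (HEVC), keeping the work off the CPU.
ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mp4 -c:v hevc_qsv output.mkv
```

This is hardware-dependent, so exact decoder/encoder names available (`h264_qsv`, `hevc_qsv`, etc.) can be confirmed with `ffmpeg -codecs | grep qsv`.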
Xeon Gold 6148s are going for as low as $80. Same core count, but higher clocks and higher single-core performance too. 6138s are going for as low as $35, which isn't bad at all.
Just be careful: now that you own one of these bad boys, you'd better get a good UPS to protect your power supply, and don't forget to get a spare one while they're cheap and available, because once they're gone, good luck finding another one that fits. Yeah, that shit happens. GJ though. I'll bet this is the most power-hungry machine you'll ever own :)
You finally put something together that meets your needs. Now what? No more scrap builds with sarcasm. Now I'm sad 😢 Take care.
Trust me, there's a lot of fun stuff out there still I want to cover.
Man, that was an awesome video. Time to retire the T410!!
Oh my word, it really just clicks, hehe. BTW, cool build. 40 cores is beastly, and truly insane for the price.
I like these chairs, but I really think we need more kneeling and push-up chairs. You get some of the benefits of sitting and standing, or you can just stand.
There are auctions for decommissioned super computers. Maybe another upgrade is in order!
I did one similar to the first idea you had, with the watercooling build, back in 2016. I regretted every single decision I made with the dual E5-2690 v2s that chugged power like it was nothing, and the thing is I used it as my NAS + web host + VM test bed. That was the worst build I ever made value-wise, besides those abandoned watercooled PCs I made even earlier. At least it runs fine.