I picked one up on eBay today, and found your video by searching for cooling solutions. :)
I've been wanting one forever, and I've been looking for a while now for my media server, and what tipped me over was playing with Stable Diffusion and running out of VRAM on my 2080 Ti. The 24GB Tesla is going in my server with twin Xeon 12-cores and mucho RAM. Super stoked, can't wait to play around with this. 😍
Yo, what motherboard you got in it? Because I've been trying to find a compatible motherboard.
@@straysum7865 It's currently in a 4U chassis using an MSI motherboard with an H110 chipset, but I don't recall the exact model off-hand. It's a Skylake chipset, but I put an i7-7700 in it since the only Skylake chip I have unused was an i5-6500.
I also converted the card from passive to active cooling using the parts from a dead eBay 980 Ti so I didn't have to worry about rigging up a blower to it or anything.
@@DavidHansen725 thx
Did SD run well on it?
@@abudyawad1843 It depends on the settings, really. Generally, it will churn out a 1,024 x 1,024 image with 200 steps in 45-50 minutes.
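For anyone curious what that looks like in practice, here's a minimal sketch of generating at that size with the Hugging Face diffusers library. The model ID and settings are my own assumptions for illustration, not this commenter's exact workflow; on Maxwell you'd likely stay in FP32 since its FP16 throughput is poor.

```python
# Minimal sketch: Stable Diffusion on a Tesla M40 24GB (assumed setup, not the
# commenter's exact workflow). Requires: pip install torch diffusers transformers
import torch
from diffusers import StableDiffusionPipeline

# Hypothetical model choice; any SD 1.x checkpoint that fits in VRAM works.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float32,  # Maxwell has weak FP16, so stick to FP32
)
pipe = pipe.to("cuda")

# 1024x1024 at 200 steps -- the settings mentioned above; expect it to be slow.
image = pipe(
    "a watercolor painting of a homelab server rack",
    height=1024,
    width=1024,
    num_inference_steps=200,
).images[0]
image.save("output.png")
```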
Nice video! Yeah, the most practical application for these cards is video editing with Adobe or DaVinci (or any application that utilizes CUDA cores and huge amounts of VRAM alongside a CPU workload as well). There is a 24GB version too, probably more applicable to Blender rendering.
It's a Maxwell chip, so it has a pretty early version of the NVENC hardware, something software can't really fix, and encode quality will be worse than on a newer chip. This is the same line of chip as the 900-series desktop cards, though it has more compute and memory than even a 980 Ti, putting it closer to a 1070 in framerate at the cost of 100W more power.
24GB makes it a good option for people who want to play with machine learning. Slow on compute by modern standards (like 1/8th the speed of a 3090) but it will let you create rather large images in AI art creation tools and such with the huge memory footprint.
Right now you can get them for about ~$65 before shipping and tax.
12GB or 24GB?
@@terjeoseberg990 24gb, I bid on a bunch of them tho and they all work.
Your video reached the Italian market! Nice idea and nice workaround for gpu shortages! Thank you!
Please tell me which driver version is best for the M40 24GB on a PC? I couldn't install the Nvidia driver for the Tesla. It keeps failing. :(((
I have a K80, an M40, and a P40. I render with iRay and use Topaz AI apps along with AI video upscaling. I love them! Being able to render in the background almost as fast as my 2080 Ti is a GODSEND! I use the "crap" shrouds, but with a server fan, without issue.
Is it all three together that run almost as fast as a 2080 Ti, or each one individually?
Did you try these out under Win XP? :D With some IGP or Radeon GPU?
Honestly, the "Subscribe please" right into the mic gave me a good lol. Subbed.
It's crazy that these are $150 or less in the US. You could hook a Kraken G13 to it and cool it with an existing AIO or use a 3D printed fan shroud and a 6000 RPM delta fan to cool it.
In other countries, these are like $500 minimum, so we have to buy the K20X, which have shit DX12 and Vulkan support unlike Maxwell cards :(
I use this card and the P40 to render. Octane and iRay work flawlessly on these.
You could also get a Raijintek or a used Accelero 4 for cooling this thing. I cooled it with the 10-year-old Scythe Setsugen 2, which to my surprise was compatible, and it didn't get that hot; it only hit 82°C with the fan running on 5V.
Just saw the Raijintek on sale on Newegg the other day and let my Discord know about it.
This is quite literally the only card which can do what I want that I can afford.
But my concerns over the lack of display outputs and the hackiness of the entire affair really are holding me back (I have zero recourse if something doesn't work). Knowing how some games like Halo Infinite are so finicky with drivers...
I can confirm it plays halo infinite multiplayer comfortably at 60fps now.
I have an R720 with a pair of E5-2690v1s and 2 of these running 2 gaming VMs simultaneously.
There is some registry hacking required.
I have an A4000, but it's not passively cooled; in a Z840 the airflow is sufficient for gaming even at 99% usage. I don't know about passive cooling.
Respect to anyone who keeps Campari in their home bar. Cheers!
I just ordered an M40, the power cable adapter, and an Arctic Accelero Xtreme III cooler. Do you think a GT 240 will work as the display out?
Yessir
I really want to thank you for giving me this option. My GPU is dying and I need a new one; I think this is the best option.
Just bought a 12GB M40 off eBay for 48 bucks. My plan is to install it in my Supermicro unRAID server to use for Plex transcoding. If successful, I plan on upgrading from the dual Xeon 2695 v2 setup on the Supermicro board to something a bit lighter on power requirements but still strong enough to handle the rest of the Docker containers and VMs that I run on the server.
A new case like the Lian Li Lancool 216, with a dedicated fan on the PCIe expansion slots, would also work for cooling a passive GPU like an Nvidia Tesla or AMD Instinct.
There are real budget alternatives available. The Tesla M2090 frequently goes for 20-30 bucks; it works for esports titles and has 6GB of VRAM.
Then there's the K20X, which you can easily cool with a 3D-printed fan duct and a Delta PWM fan. It also has 6GB of VRAM and GTX Titan (Kepler) performance at a $70 cost. In the current market, that's pretty damn good.
Pretty sure the m2090 is fermi
Do you know an AMD variant?
@@shalokshalom AMD has video ports on theirs. There's the AMD Instinct Mi8, the FirePro S7000 or Radeon Sky 500, the S7100 and S7150. You can look them up on Techpowerup to figure out which ones make most sense to you. Personally I don't think any besides maybe the Mi8 are worth it because they're all too overpriced.
12:00 I'm curious about your settings for the GPU encoding. With mine I can re-encode an h264 1080p 8-hour recording with a file size of about 12GB down to h264 720p at 2.18 GB in under an hour using h264_nvenc VBR on the slow preset; with the CPU (libx264, slow preset) it'd squash it down to 1.73 GB, but it took more than 10 hours to complete (more than 15 hours on the very slow preset, with the same file size).
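For reference, a command along those lines would look roughly like the sketch below (wrapped in Python just to keep the examples in one language). The bitrate, CRF, and scaling values are illustrative guesses, not the commenter's actual settings.

```python
# Rough sketch of the kind of ffmpeg invocation described above (GPU vs CPU).
# The bitrate/CRF/scale values are illustrative, not the commenter's settings.
import subprocess

src = "recording_1080p.mkv"  # hypothetical input file

# NVENC path: hardware H.264 encode on the Tesla, VBR, slow preset, scaled to 720p
subprocess.run([
    "ffmpeg", "-i", src,
    "-vf", "scale=-2:720",
    "-c:v", "h264_nvenc", "-preset", "slow", "-rc", "vbr", "-b:v", "600k",
    "-c:a", "copy", "out_nvenc_720p.mkv",
], check=True)

# CPU path: libx264 slow preset, smaller files but far longer encode times
subprocess.run([
    "ffmpeg", "-i", src,
    "-vf", "scale=-2:720",
    "-c:v", "libx264", "-preset", "slow", "-crf", "23",
    "-c:a", "copy", "out_x264_720p.mkv",
], check=True)
```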
Quick question for you. I know this is an older video and you may not have this GPU any longer; however, I was wondering, with all the new locally runnable LLMs (i.e., ChatGPT-like local apps), would this work for them as a viable alternative to something like an Nvidia 40-series, since VRAM seems to be the major bottleneck?
ChatGPT needs several A100s with 80GB RAM each. You can’t run ChatGPT on this one, but there are other LLMs that would run on this. I just ordered one from eBay. It’s going to arrive next week.
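To make the "other LLMs would run on this" point concrete: a 24GB card can hold a quantized mid-size model. Here's a minimal sketch assuming the llama-cpp-python bindings and a GGUF model file you've already downloaded; both are my choices for illustration, not anything mentioned in the thread.

```python
# Minimal sketch: running a local LLM on a 24GB card with llama-cpp-python.
# Assumes llama-cpp-python was built with CUDA support and a GGUF model exists.
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-13b-q4.gguf",  # hypothetical quantized model file
    n_gpu_layers=-1,   # offload all layers to the GPU
    n_ctx=2048,        # modest context window to stay well inside 24GB
)

out = llm("Explain what NVENC is in one sentence.", max_tokens=128)
print(out["choices"][0]["text"])
```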
Have you checked out looking glass? I was wondering if it would have any use with an application like this? I use it in my vfio setup and love it.
That def looks usable. I’ve never heard of it, but I’ll be sure to check it out. Thanks!
@@RaidOwl Looking forward to it; it's a pretty awesome solution for a single monitor, keyboard, and mouse setup driving a low-latency Windows VM.
Looking glass is pretty popular in the VM gaming community.
First vid of yours I have seen and instant fan. Speaking of fans, which type of 40mm fan did you try with the shroud? I have just printed the shroud since I found this. But if you used those little 80mA fans, maybe some thicker 180mA ones might do it. Worst case, I have a 3A centrifugal fan kicking around which will fit on a shroud... I think about $15 for that. Will let you know...
I love your animated logo. It's like entering a Mega Man stage. Also it kinda reminds me of those brown fans.
thanks for the video! I just ordered mine and can't wait for the project ahead
Is there a way to replicate this on linux?
Hi, I am interested in trying a Tesla card for video encoding, either using Handbrake or RipBot264, but most of the YT clips about them are based on gaming... So my question is, do Teslas support HEVC x265 software encoding? Or are you "stuck" with NVENC? I won't be using VMs or anything "fancy" like that, just wanting the Tesla to do most of the work. I hope you can help. Cheers
Nah, the Tesla is gonna be hardware nvenc only, which in my testing wasn’t very optimal for proxies. It would be decent for real time transcoding though.
@@RaidOwl Thank you very much for your prompt reply and answer... OK, well, they're not for me, then. Cheers
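If anyone wants to check what their ffmpeg build and card actually expose before buying, you can list the NVENC encoders and run a tiny test encode; on a Maxwell M40 the HEVC one is expected to fail because its NVENC block predates HEVC. A quick sketch, again in Python only for consistency with the other examples:

```python
# Sketch: check which NVENC encoders your ffmpeg build exposes, then test one.
import subprocess

encoders = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout
print([line for line in encoders.splitlines() if "nvenc" in line])

# Try a 1-second synthetic HEVC encode; on Maxwell (M40) this should fail.
test = subprocess.run([
    "ffmpeg", "-hide_banner",
    "-f", "lavfi", "-i", "testsrc=duration=1:size=1280x720:rate=30",
    "-c:v", "hevc_nvenc", "-f", "null", "-",
], capture_output=True, text=True)
print("hevc_nvenc works" if test.returncode == 0
      else "hevc_nvenc not supported on this GPU")
```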
Using it on a Dell PowerEdge R720 that is my dedicated Plex server.
You're very likable, good content. Keep it up
Always excellent videos from this channel!
☺️☺️☺️
So with one of the SFF ones, which go for 85 bucks right now... I could turn my Dell 7050 into a server and then use my laptop to game like GeForce Now... with Parsec... or just use the iGPU out on my desktop. I may do that. Is it worth getting if I'm using an RX 550 right now?
Did you use a high-pressure 40mm fan with that shroud bro... or just a regular 40mm fan?
So I got a question, I know the Tesla series do not have SLI but would it be possible to run 2 m40 cards for better performance?
Nah they won’t work off each other. You can run two independently of each other and assign different tasks to them, though.
@@RaidOwl ah okay, that's what I thought, but I was hoping there would be chance that it could be done.
@@APersonPeople1 No, and it doesn't work on the newer A4000 either. You can divide computations at the software level, but for gaming it's another story; you would probably have to write your own drivers for checkerboard rendering across 2 cards, and that's not something easy to do...
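For the non-gaming case, splitting work across two M40s is straightforward: each process just gets pinned to one GPU via CUDA_VISIBLE_DEVICES. A rough sketch below; the worker script names are placeholders, not anything from this thread.

```python
# Sketch: run two independent CUDA workloads, one per M40, by pinning each
# process to a single GPU via CUDA_VISIBLE_DEVICES. Script names are placeholders.
import os
import subprocess

def launch_on_gpu(gpu_index: int, script: str) -> subprocess.Popen:
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)  # this process only sees one card
    return subprocess.Popen(["python", script], env=env)

jobs = [
    launch_on_gpu(0, "transcode_job.py"),   # hypothetical workload on card 0
    launch_on_gpu(1, "sd_render_job.py"),   # hypothetical workload on card 1
]
for job in jobs:
    job.wait()
```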
OK, I got my M40 and the PSU adapter cable, but unfortunately my motherboard isn't compatible. What motherboard did you use?
It should be compatible with pretty much any motherboard. What issue are you having?
I recently purchased two Nvidia Tesla M40 12GB video cards on eBay with delivery to the Russian Federation at a price of $130 apiece. The plan is to set up some beggar-tier gaming.
I have a problem. I have a GTX 1050 Ti and the Tesla. I downloaded the drivers for the Tesla and it automatically worked after that, but the resolution is locked at 4K and I can't lower it, and the 1050 Ti also loses its drivers, so I can't choose to use it in some applications; I only have the Tesla M40. If I download the 1050 Ti drivers again, it just does the same thing: the Tesla loses its drivers and I can only use the 1050 Ti, but then I can change the resolution with the 1050 Ti. Please help.
I just want to set the resolution to 2K and use the Tesla.
Did you solve the problem?
@@flobermainmc I sold the GPU and bought a new one.
@@MarcoKatITA Ah, okay, thank you for the answer.
Hello, I have a Huananzhi X79 dual Xeon motherboard with 32 GB of RAM. Will a Tesla M10 work with this setup? From what I read it's a quad-instance GPU, 8 GB x 4 = 32 GB total. Thanks in advance
Yep there is no reason it shouldn’t work
Hi, can I add this graphics card to a DL380 G10? After installing it, the driver won't install on Server 2022.
I want to use parsec, I really do, but every time I try to connect to an instance I can't see my mouse cursor, which makes using parsec very difficult. I have tried the fixes I could find online, but they didn't work. Very discouraged. Have you experienced a disappearing cursor before?
I actually have experienced this. Check this article from Parsec: support.parsec.app/hc/en-us/articles/115002623892-Mouse-and-Keyboard-Isn-t-Working-Correctly-When-Connected
What worked for me was installing the Wacom driver in the "The cursor isn't visible when I try moving" section.
@@RaidOwl I tried the Wacom driver, it didn't work. I also tried the cursor trails. But those shouldn't be the problems because the machines I connect to have physical mice. I just connect to them remotely. It just has me bummed out.
Did you try Immersive Mode in the Client settings of the PC you’re using to connect?
@@RaidOwl I thought I did, but I'll double check when I get home. Maybe that is it? Thanks for the suggestion. 👍
Fingers crossed 🤞🏻
Anyone know if it's possible to run this GPU alongside a 1080 Ti? Not sure how driver compatibility will work.
A good rendering card as well... If you can find a 24gb model👍
8:23 You can't cool this with that tiny fan; you need a blower fan, preferably from Delta. I have one from an old OptiPlex from 2003 that is temperature controlled. By design that doesn't work on the Tesla, but if you short the temp sensor pins it goes full blast, which is what you need for these.
Right but I wanted something low noise so that was not an option for me.
@@RaidOwl So you went with a small Delta fan? That doesn't make sense. Watercooling is better, but you don't get quite the same amount of surface area. If you turned down the voltage on the blower fan and made sure there is a good seal on the adapter, you could achieve acceptable noise levels and temps.
@@raycert07 I went with a small 40mm Noctua fan, not a delta fan
@@RaidOwl Same theory: high speed and small, but not high static pressure. The fans in a server chassis are similar but probably larger, are Delta-branded, have a ton of ducting inside the chassis, and there are many more of them.
I know…which is why I said my solution didn’t work…which is also why I went with the AIO to resolve it.
Hi guys, I have an MSI B450M-A Pro Max with a Ryzen 5 3400G (integrated video). With this config I use dual monitors thanks to the motherboard's dual HDMI outputs. I would like to try installing a Tesla K20X, but I need to keep using dual monitors. Do you think that's possible, or will this new config have problems managing 2 video outputs? Thanks to all
Hi, just wondering your opinion on the dislike count being removed and how it affects relatively smaller YouTubers (not the big ones with like 3-5M subs)?
I initially didn’t like it at all. As a viewer… I still hate it. However, as a small creator, I’m leaning more towards liking it. The reason is that I was personally way too obsessed with my like/dislike ratio. Now when I go to my video I don’t even see it (yes, I can still see it in Studio), so that relieves unwanted anxiety. It’s a tough call and there’s more to it, but those are my basic thoughts.
Need to probably use a server 40mm fan, something with some real speed
Little late on this, but doesn't the Epyc CPU cost like $14k? So the value vs. cost is kind of alright in my mind, even out here in 2022.
Your desktop is about $3k (just off the top of my head) and you get 3-minute, 800MB files with either CPU or GPU encoding.
Your server CPU is roughly $1.5k and gets 8-minute, 300MB files.
Your server GPU is around $150 and gets 12-minute, almost-gig files.
I still think a GPU would work great to start this off.
Love the tests though. Made me want to buy a couple of these cards to cram in my server! I don't have Epyc processors, but I do have dual Xeons, so I might try a Handbrake test just to see how that works on those as well.
7:41 "I know what some of you are probably thinking, [shows the wrong model, but one that's at least dual 40mm] 'Oh I've seen shrouds, like 3D printed shrouds you can put on the Tesla M40 to help cool it', but those don't work. The reason I know that is because I tried [pulls out the worst possible model, a single 40mm fan mount] I don't know why this exists. [...]"
My man, there are models out there for fitting just about any fan you could think of onto one of these cards, but just because you can doesn't mean you should. Use some of those critical thinking and executive function skills they taught you in school, or even just pay attention to the video you referenced, because he showed in no uncertain terms there was no way a single 40mm fan could handle that card by itself under any circumstance. The models you'll see recommended all over the forums mount a 60mm or 120mm blower fan. Yes, it's much louder, but it's also pushing 4-6 times the air with no modifications to the stock card; there's a reason these cards are down to $100 while a Titan X in good condition still runs 3x that.
My man, that was the whole point of the video. “Look this didn’t work so I used an aio which works great”.
well put together, but that won't fit in my hp omen from 2018 :P
I got a K20Xm for $22 shipped on eBay. With a few nvidia-smi commands combined with DXVK I have this card running Crysis 3, Watch Dogs Legion, The Witcher 3, and Lego Star Wars, and I haven't tried anything else... yet. Getting the cards to game is NOT THE HARD PART!!! THE HARD PART IS COOLING THESE BIG BASTARDS!!!
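For anyone wondering what "a few nvidia-smi commands" tends to mean here: on Windows, Tesla cards usually come up in TCC (compute-only) mode, and the commonly cited step is flipping them to the WDDM driver model so games can see them. Treat the sketch below as a hedged illustration; support varies by card and driver, and it must be run as Administrator.

```python
# Sketch: switch a Tesla from TCC to WDDM driver model on Windows so games/DXVK
# can use it. Run as Administrator; a reboot is typically required afterwards.
import subprocess

# Show each GPU's index, name, and current driver model
subprocess.run(["nvidia-smi",
                "--query-gpu=index,name,driver_model.current",
                "--format=csv"], check=True)

# Request WDDM for GPU 0 (-dm 0 = WDDM, 1 = TCC); some cards need -fdm to force it
subprocess.run(["nvidia-smi", "-i", "0", "-dm", "0"], check=True)
```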
I've picked up a T600 for ~135 EUR, which is ~$153 US. It will replace the GT 1030 in my Plex server.
Nice desktop wallpaper!
how would this pair with a i5 9600k
There is a video out showing how to cool it with two fans that cost $16.00.
Dooooh ... You don't encode on your GPU if you have a decent CPU.
You run the input filter on the GPU and encode on the CPU.
Sorry, I'm missing the price point argument. The GTX 1070 is $400 and your card plus fan is $300, but your setup time is worth more than $100. I'm not getting it, other than that it's a cool setup.
trying this gpu as a gaming gpu. any motherboards that are compatible with the M40?
I have one of these in my server and this thing is a straight up beast
Hmm definitely interesting. Definitely something to consider for my unRAID server to transcode Plex
Yeah I think it can be useful…for certain setups haha
@@RaidOwl Looks like prices are already going up. I've been putting off buying any GPU and just using the onboard video on my server motherboard, but it is so slow and kind of frustrating to use. Waiting for the world to change.
Was thinking the same but no H265 per the chart Raid Owl shared near the end.
What about Linux and virt-manager? Is there a tutorial for that?
Will you do something about it?
Ok, how was it supposed to remain cool originally, without the AIO?
They’re designed to sit in a rack mounted server that has lots of airflow from front to back.
Now that ETH PoW is over and GPU prices are lower, is it still worth it?
Ehhh if you need a bunch of VRAM, maybe...otherwise prob not
@@RaidOwl $65 on eBay right now... cheaper than the Tesla P4. As you said, similar performance to a 1070, and the M40 is still less than half the price.
Looking for a few GPUs of this caliber; the more VRAM the better for Topaz Video Enhance.
"Can someone help me? I have an MSI X370 Gaming Plus motherboard, a Ryzen 7 1800X, and an EVGA 80 Platinum power supply. I bought a 24 GB Tesla M40, but when I connect it to the power supply, it doesn't turn on. Any ideas?
Yeah, try paying your electricity bill.🤣🤣🤣🤣
Thanks owl!
You're welcome sean!
Saw one of these today for $59; came here to see if I could use it.
A 6-pin for a 250-watt card is not a great idea, right?
You need to fill all the pins
Rendering! You forgot rendering in the poll, unless transcoding counts as rendering? Anyways, great video.
Saw one in an RF shop and then watched your video. Definitely not worth it for more than 90€. Too much work for so little payoff.
Agreed nice
Effort XD nice review
$170? Never seen it that low.
AMD variant?
Great video, nice job.
Someone hasn't heard of the laptop trick
It would be nice if you tried it on VR games 😈
Yeeeh Haw!!!
🤠🤠🤠
Thanks for your efforts. I did a video on the P100 (on my channel); Teslas are amazing cards.
I really want to buy this card and downgrade from my 1080 Ti, but they cost the same here in my country (Philippines). I want to tinker with it, then slap a waterblock on it (Titan X waterblocks are compatible with these).
Yeah I was thinking of going with a water block if this didn’t work.
The layout of the board is the same, but all waterblocks may not work. Maxwell Titan X and the m40 are the same PCB, just like the 1080 TI reference/Titan X/P40 are the same in the pascal cards.
The BIG BIG BIG DOES NOT FIT difference is if you have a waterblock with the incredibly stupid extension all the way to the end of the card. There will be a cutout for the 2x PCI-E power connectors. The tesla cards move them to the end edge of the card around the corner from where you find them on a Titan X.
You'd need to find a shorter block. If you see only one row (a line of three standoffs) of "screw goes through board and into the waterblock" between the actual 8 that tighten the block to the die/memory/VRM and the end, it should be short enough. If it uses the two holes/standoffs at the end of the card, it will hit your power connector (won't fit).
For example, the Heatkiller IV for the 980 Ti/Titan X should work (weird arrow-shaped termination just after the first set of non-functional holes/standoffs), but the Heatkiller IV for the 1080 Ti/Titan X changed to extend the full length of the card.
I think your cpu is bottlenecking the gpu
You MUST overclock it, otherwise it runs very crappy.
Overclocking gives you 45% more performance
I dunno what you mean lol, it runs at titan x speeds stock. Are you talking about tesla k80?
Subbed, good content
Hello
Hi there 👋🏼
hahahahahahahaaa
Love your video..