This is really interesting, thanks for sharing! At the end when you displayed Headbot I literally LOL'd, a very novel idea. Cool project.
Glad you found it interesting!
Nice video dude, keep it up 👍
Hope that vacuum cleaner was ESD safe, all that air moving around builds up a significant amount of static in normal vacuums and kills components all the time. Same with air compressors. Just a tip! Keep the videos coming 👌
Oops haha had a few people tell me about this! I will be more careful in the future!
It's a shame Nvidia entirely killed off 2-slot coolers for the 3090 and 4090. That was done only to stop people from using these relatively cheap gaming cards in workstations and servers instead of their overpriced Quadro RTX 5000, which is the same card at a higher price. These hilariously oversized 4-slot coolers don't fit in normal PC cases anymore.
2-slot 3090s/4090s only rarely appear on eBay nowadays, and at steep prices.
Nvidia really milking every last cent!
You can always go for the RTX A2000 12GB, which costs around 600 bucks. If you're desperate enough, you can get the A6000, which is better designed than the 4090, dissipates less heat, and only needs a single EPS connector for power instead of the badly designed pipe dream that is the 12VHPWR connector.
Why did they make a 3090 with a single fan? That thing sounds like a leaf blower.
I often question whether some of Nvidia's product decisions are in the best interests of the customers haha
@thegalah For me, I'm going to go with a 7900 XTX from XFX.
Interested to see how the build turns out
So you can fit more in a case.
@nuck477 The answer is NO.
Damn that's nice. How many GPUs do you have?
Nice, I'm gonna code the hell out of my lab.
What are you building?
Nice build. Curious what models Headbot is actually running behind the scenes.
SDXL
That 3090 blower card alone is over 1000 USD in 2024. The best-value 3090 now is around 750 USD.
What about 2x 2080 Ti in SLI? Cool solution man...
Why not just get an adapter for the second GPU power connector?
Good question! I didn't want to risk frying the 700 dollar card to save on a 120 dollar component, since PCIe power cable pinouts are non-standard.
What is a cluster? Can you provide me some details? I'm using local LLMs and want to create a server on my machine, and also want to use my PC at the same time.
My specs are:
i5 10
48 GB DDR4 RAM
RTX 3060 12 GB
I'm running all my workloads in a Kubernetes cluster which orchestrates all my services across my physical machines. I suggest you look into getting CUDA running on your machine.
Check out Ollama for running LLMs on your hardware.
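For anyone curious, a minimal sketch of talking to a local Ollama server over its REST API (default port 11434). The model name "llama3" is just an example; this assumes you've already pulled the model and the server is running:

```python
# Build a request for Ollama's /api/generate endpoint.
# Sending it (commented out below) requires a running Ollama server.
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama3", "Why is the sky blue?")
# with urllib.request.urlopen(req) as resp:  # uncomment with the server up
#     print(json.loads(resp.read())["response"])
```

The 3060's 12 GB is enough for quantized 7B–8B models, which is what Ollama pulls by default.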
Nice Video
Thanks!
Just $1195.36?
Bro, instead of $70k+
@louishauger3057 😂😂🤣🤣🤣🤣 Never for that old crap
Thought you left the vacuum running or something lol, ridiculous.
Haha, that's the sound of the fans spinning under max GPU load.
Couldn't you just buy one of those Nvidia Tesla cards? They're designed for AI stuff, have more VRAM, and are cheaper.
Great suggestion. I did some simple comparisons and for my workloads 3090s fit the bill. There is usually a premium on the tesla cards. They are more powerful for certain tasks but for my specific workloads these were the best dollar to power ratio.
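That kind of comparison can be sketched as a quick script. The prices and FP16 TFLOPS figures below are made-up placeholders, not the video's actual numbers; substitute real listings before deciding:

```python
# Back-of-the-envelope dollar-per-TFLOP comparison between two used
# cards. All numbers here are illustrative assumptions, not quotes.
def dollars_per_tflop(price_usd: float, fp16_tflops: float) -> float:
    """Price paid per TFLOP of FP16 throughput."""
    return price_usd / fp16_tflops

cards = {
    "used RTX 3090": dollars_per_tflop(750.0, 35.6),    # assumed price/specs
    "used Tesla V100": dollars_per_tflop(2000.0, 31.3), # assumed price/specs
}
best_value = min(cards, key=cards.get)  # card with the lowest $/TFLOP
```

VRAM per dollar matters just as much for LLM workloads, so a real comparison would weigh both metrics.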
What’s the point of an AI server?
loud music
Thanks for the feedback