Building My Own AI Server for Just $1195.36: A Homelab Journey

  • Published 27 Sep 2024

COMMENTS • 44

  • @ChrisDaveGG 11 months ago +6

    This is really interesting, thanks for sharing! And at the end when you displayed Headbot I literally LOL'd, a very novel idea. Cool project.

    • @thegalah 11 months ago

      Glad you found it interesting!

  • @Avose243 1 year ago +9

    Nice video dude, keep it up 👍
    Hope that vacuum cleaner was ESD safe; all that air moving around generates a significant amount of static in normal vacuums, which kills components all the time. Same with air compressors. Just a tip! Keep the videos coming 👌

    • @thegalah 1 year ago +1

      Oops haha had a few people tell me about this! I will be more careful in the future!

  • @ProjectPhysX 9 months ago +6

    It's a shame Nvidia entirely killed off 2-slot coolers for the 3090 and 4090. That was only to prevent people from using these relatively cheap gaming cards in workstations and servers, rather than their overpriced Quadro RTX 5000, which is the same card but more expensive. These hilariously oversized 4-slot coolers don't fit in normal PC cases anymore.
    2-slot 3090/4090s only rarely appear on eBay nowadays, and at very steep prices.

    • @thegalah 8 months ago +1

      Nvidia really milking every last cent!

    • @BrunodeSouzaLino 6 months ago +1

      You can always go for the RTX A2000 12GB, which costs around 600 bucks. If you're desperate enough, you can get the A6000, which is better designed than the 4090, dissipates less heat, and only needs a single EPS connector for power instead of the badly designed pipe dream that is the 12VHPWR connector.

  • @Kingfinite 1 year ago +10

    Why did they make a 3090 with a single fan? That thing sounds like a leaf blower.

    • @thegalah 1 year ago +2

      I often question whether some of Nvidia's product decisions are in the best interests of the customers haha

    • @Kingfinite 1 year ago +1

      @thegalah For me, I'm going to go with a 7900 XTX from XFX

    • @thegalah 1 year ago +1

      Interested to see how the build turns out

    • @nuck477 6 months ago

      so you can fit more in a case

    • @Kingfinite 6 months ago

      @nuck477 the answer is NO.

  • @henryl7421 2 months ago

    Damn that’s nice. How many GPUs do you have?

  • @anthonyrubio1194 1 year ago +2

    Nice, I’m gonna code the hell out of my lab

    • @thegalah 1 year ago

      What are you building?

  • @xorcody 10 months ago +1

    Nice build. Curious what models Headbot is actually running behind the scenes.

  • @joelg1318 1 month ago

    That 3090 blower card is over 1000 USD in 2024, card alone. The best-value 3090 now is around 750 USD.

  • @shepardcoronel1980 3 months ago

    What about 2x 2080 Ti in SLI? Cool solution man...

  • @paultalanoa9513 1 year ago +1

    Why not just get an adapter for the second GPU power connector?

    • @thegalah 1 year ago

      Good question. I didn't want to risk frying the 700-dollar card over a 120-dollar component, since the PCIe pinout cables are non-standard.

  • @talhagaming7204 8 months ago +1

    What is a cluster? Can you provide some details?
    I'm using local LLMs and want to create a server on my machine,
    and also want to use my PC at the same time.
    My specs are:
    i5 10th gen
    48 GB DDR4 RAM
    RTX 3060 12 GB

    • @thegalah 8 months ago

      I'm running all my workloads in a Kubernetes cluster, which orchestrates all my services across my physical machines.
      I suggest you look into getting CUDA running on your machine.
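
      As a minimal illustration of that CUDA suggestion, here is a quick check that Python can actually see the GPU. This is a sketch only: PyTorch is an assumed library choice, not something the reply names.

      ```python
      # Hypothetical quick check (assumes a CUDA-enabled PyTorch build
      # and a working Nvidia driver; PyTorch itself is an assumption).
      import torch

      if torch.cuda.is_available():
          name = torch.cuda.get_device_name(0)
          vram_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
          print(f"CUDA OK: {name}, {vram_gb:.1f} GB VRAM")
      else:
          print("CUDA not available - check the driver and torch build")
      ```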

    • @bilbobeutlin3405 2 months ago

      Check out Ollama for running LLMs on your hardware.
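
      As a minimal sketch of that suggestion: Ollama exposes an HTTP API on port 11434 by default, so once `ollama serve` is running and a model has been pulled (the model name below is illustrative), a local LLM can be queried like this:

      ```python
      # Hypothetical example: query a locally running Ollama server.
      # Assumes Ollama is serving on localhost:11434 and that "llama3"
      # was pulled beforehand with `ollama pull llama3`.
      import requests

      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={
              "model": "llama3",   # any model you have pulled locally
              "prompt": "Why is the sky blue?",
              "stream": False,     # ask for one complete JSON response
          },
          timeout=120,
      )
      resp.raise_for_status()
      print(resp.json()["response"])
      ```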

  • @hwole7895 1 year ago +1

    Nice video

  • @jj-icejoe6642 4 months ago +2

    Just $1195.36?

    • @louishauger3057 1 month ago

      Bro, instead of 70k+

    • @jj-icejoe6642 1 month ago

      @louishauger3057 😂😂🤣🤣🤣🤣 Never for that old crap

  • @TheGameGuruTv 8 months ago +1

    Thought you left the vacuum running or something lol, ridiculous

    • @thegalah 8 months ago

      Haha that's the sound of the fans spinning under max GPU load

  • @genericgoon3748 9 months ago +1

    Couldn't you just buy one of those Nvidia Tesla cards? They're designed for AI stuff, have more VRAM, and are cheaper.

    • @thegalah 8 months ago

      Great suggestion. I did some simple comparisons, and for my workloads 3090s fit the bill. There is usually a premium on the Tesla cards. They are more powerful for certain tasks, but for my specific workloads these had the best dollar-to-power ratio.

  • @riffdex 1 month ago

    What’s the point of an AI server?

  • @yousefabdulrhman 1 year ago +2

    loud music

    • @thegalah 1 year ago

      Thanks for the feedback