Dual 3090Ti Build for 70B AI Models

  • Published Mar 15, 2024
  • In this video, I take you through my exciting journey of upgrading my computer setup by adding an additional Nvidia RTX 3090Ti, with the ultimate goal of running highly demanding 70B local LLM models and other GPU-intensive applications. For those who share a passion for pushing the boundaries of AI research and computational power, you know how crucial having the right hardware can be. That's exactly why I embarked on this upgrade mission.
    After extensive research and monitoring the market for the best deals on GPUs, I stumbled upon a golden opportunity at my local Micro Center. To my surprise, they had refurbished Nvidia 3090 and 3090Ti Founders Edition cards on offer at prices that undercut even the second-hand market. This was a deal too good to pass up, especially for a high-performance enthusiast like myself looking to bolster my system's capabilities for handling some of the most compute-intensive tasks out there.
    In this detailed build log, I'll show you every step of the process, from the decision-making to the installation and eventual performance testing. We'll explore why the Nvidia 3090Ti is a game-changer for anyone interested in deep learning, AI model training, and running sophisticated algorithms that demand significant GPU resources.
    Furthermore, I'll share insights on how to spot great deals on high-end hardware, the importance of considering refurbished components, and tips for ensuring your system is ready to take on the challenges of next-generation computing. Whether you're a seasoned AI developer, a deep learning enthusiast, or simply someone fascinated by the capabilities of modern technology, this video is packed with valuable information.
    Join me as I boost my computer's performance to new heights, making it capable of running 70B local LLM models and beyond. Don't forget to like, share, and subscribe for more content on AI, technology, and high-performance computing builds. Your support helps me bring you more of these in-depth guides and tutorials. Let's dive into the world of high-end computing together!
  • Entertainment

COMMENTS • 30

  • @UpNorth937 • 13 days ago

    Great video!

  • @cybermazinh0 • 1 month ago

    The video is very cool; the 3090 would look even better in a nicer case.

    • @OminousIndustries • 1 month ago +1

      Thanks very much! I am going to be swapping everything over into a Thermaltake View 71 case very soon.

    • @jamesvictor2182 • 10 days ago

      Unlike the inside of that case!

  • @mcdazz2011 • 1 month ago +1

    One of the best things you can do in the short term, is to clean the front air filters. I can see one at 11:48, and there's a fair amount of dust between the filter and the fan. You'll get better air intake just by cleaning them, which will help with any heat generated in that case (which is a BIG heat trap).
    Longer term, definitely look at getting a new case with better air flow.
    The way it is at the moment, that case is going to act like an oven and you'll likely find that the CPU/GPUs might thermal throttle and rob you of performance.
    Thermaltake make some pretty big cases (on wheels if that's your thing), so you might like the Core W100 or Core W200.

    • @OminousIndustries • 1 month ago

      Excellent advice. Ironically enough, I recently purchased a Thermaltake View 71 to transfer all the components into. I am excited to do the swap.

  • @jamesvictor2182 • 10 days ago

    I am awaiting my second 3090 ti, probably going to end up water cooling. How has it been for you with heat management?

    • @OminousIndustries • 10 days ago

      I have not seen crazy temps while running local LLaMA models. I did render something in KeyShot Pro that made the cards far too hot, but for any LLM stuff it hasn't been too bad at all.

  • @mbe102 • 1 month ago

    What is the aim of using OpenDalle? Is it just... for fun, or is there some monetary gain to be had through this?

    • @OminousIndustries • 1 month ago

      Personally I just use it for fun. Some people use these uncensored image models to generate NSFW images that they then release on Patreon, etc., to make some money, but that is not in my wheelhouse.

  • @mixmax6027 • 1 month ago +1

    How'd you increase your swap file? I have the same issue with 72B models running dual 3090s.

    • @OminousIndustries • 1 month ago +1

      These instructions should work, though I have only used them on Ubuntu 22.04: wiki.crowncloud.net/?How_to_Add_Swap_Space_on_Ubuntu_22_04#Add+Swap
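For anyone who doesn't want to click through, the linked wiki's procedure boils down to a few commands. A minimal sketch for Ubuntu 22.04 — the 8G size and /swapfile path are illustrative choices, not from the video:

```shell
# Create and enable an 8 GB swap file (size and path are example choices)
sudo fallocate -l 8G /swapfile        # reserve space for the file
sudo chmod 600 /swapfile              # swap files must not be world-readable
sudo mkswap /swapfile                 # format the file as swap
sudo swapon /swapfile                 # enable it for the current boot
# Persist across reboots by adding an fstab entry:
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
swapon --show                         # verify the new swap is active
```

These commands need root and modify system state, so run them on the machine you actually want the swap on.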

  • @m0ntreeaL • 1 day ago

    BIG price... I guess $200 too high.

  • @MikeHowles • 1 day ago

    Bro, use nvtop. You're welcome.

    • @OminousIndustries • 1 day ago

      I'm going to install that tonight for my Intel GPU build; I previously hadn't found a monitoring tool for that GPU on Linux.
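For reference, nvtop ships in the standard Debian/Ubuntu repositories, and recent versions monitor Intel and AMD GPUs as well as NVIDIA. A minimal install-and-run sketch, assuming an apt-based distro:

```shell
sudo apt update && sudo apt install -y nvtop   # install from the distro repos
nvtop   # htop-style live view of GPU utilization, VRAM, and temperatures
```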

  • @atabekkasimov9702 • 1 month ago +1

    Do you plan to use NVLink with the new Ryzen setup?

    • @OminousIndustries • 1 month ago +2

      It is something I would like to add once I swap over to a Threadripper. I have seen conflicting opinions on how much it helps, but I would like it for "completeness" if nothing more.

  • @M4XD4B0ZZ • 1 month ago +1

    OK, so I am very interested in local LLMs and found that my system is way too weak for my liking. But I really have to ask: what are you doing with this technology? I have no "real" use case for it and wouldn't consider buying two new GPUs for it. What are actual beneficial use cases for it? Maybe coding?

    • @OminousIndustries • 1 month ago +2

      I have a business that utilizes LLMs for some of my products, so it is a 50/50 split between business-related research and hobbyist tinkering. The requirements to run LLMs locally depend heavily on the type and size of model you want to run. You don't need a large-VRAM setup like this to fool around with them; I just went for this so that I could run larger models like 70B. Some of the smaller models would run fine on an older card like a 3060, which can be had without breaking the bank. Some of the model "curators" post the VRAM requirements for their models on Hugging Face, bartowski being one who lists them.
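The VRAM guidance in the reply above can be sanity-checked with back-of-envelope arithmetic: the weights alone need roughly parameters × bits-per-weight ÷ 8 bytes, and the KV cache plus runtime overhead come on top, so these are lower bounds. A rough sketch (my illustrative numbers, not from the video):

```shell
# Rough GB of VRAM for model weights alone: params (billions) * bits / 8.
# Ignores KV cache and runtime overhead, so treat results as lower bounds.
vram_gb() {
  echo $(( $1 * $2 / 8 ))
}
vram_gb 70 16   # fp16 70B: far beyond any single consumer card
vram_gb 70 4    # 4-bit 70B: close to the combined 48 GB of two 3090 Tis
vram_gb 7 4     # 4-bit 7B: easily within a 12 GB 3060
```

This is why 4-bit quantization is what makes 70B models feasible on a dual-3090Ti box, while small models fit comfortably on midrange cards.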

    • @M4XD4B0ZZ • 1 month ago

      @@OminousIndustries Thank you for the insights, really appreciate it.

    • @OminousIndustries • 1 month ago

      @@M4XD4B0ZZ Of course!

  • @codescholar7345 • 26 days ago

    What CPU and motherboard? What is the temperature of the cards? Thanks!

    • @OminousIndustries • 26 days ago

      The CPU is an i7-12700K and the mobo is an MSI PRO Z690-A. I purchased them as a Micro Center bundle about a year ago. I have not seen the card temps get over about 75°C when using text-generation-webui. I was using KeyShot Pro for something and decided to use both cards to render the project, and they got far too hot, so cooling is the first priority to be upgraded.

    • @codescholar7345 • 25 days ago

      @@OminousIndustries Okay, thanks. Yeah, there's not much space in that case. I have a bigger case, and I'm looking to get another 3090 or 4090 and possibly water-cool them. Would be nice to get an A6000, but that's too much right now.

    • @OminousIndustries • 25 days ago

      @@codescholar7345 I have a Thermaltake View 71 to swap them into when I get the time. The A6000 would be awesome, but yeah, that price could get you a dual-4090 setup. A water-cooling setup would be very cool and a good move for these situations.

  • @emiribrahimbegovic813 • 1 day ago

    Where did you buy your card?

    • @OminousIndustries • 1 day ago

      I got it at Micro Center, they were selling them refurbished. Not sure if they still have any in stock. They also had 3090s.

  • @skrebneveugene5918 • 5 days ago

    What about Llama 3?

    • @OminousIndustries • 4 days ago

      I tested a small version of it in one of my more recent videos!