How to Install Llama3 On Any Computer

  • Published 2 Jun 2024
  • In this video, I walk you through the process of installing Llama 3 on your computer.
    This enables you to use the power of AI with no limits, even if your favorite chatbot's servers go offline. Plus, none of your potentially private data is sent who-knows-where; it all stays on your local machine.
    You can copy the links below into LM Studio's search function to directly access these models.
    Links:
    lmstudio.ai/
    huggingface.co/meta-llama/Met...
    huggingface.co/cognitivecompu...
    ---------------------------------------------------------------------------------
    Chapters:
    0:00 Intro
    0:23 LM Studio
    0:54 Download a Model
    1:41 Chat Offline
    2:09 More Options
    #ai #chatgpt #llama3
    ---------------------------------------------------------------------------------
    🔑 Get My Free ChatGPT Templates: myaiadvantage.com/newsletter
    🎓 Join the AI Advantage Course + Community: myaiadvantage.com/community
    🤯 Unlock ChatGPT's true potential: shop.myaiadvantage.com/produc...
    🐦 Twitter: / theaiadvantage
    📸 Instagram: / ai.advantage
    🛒 AI Advantage Shop: shop.myaiadvantage.com/
  • Science & Technology
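The "Chat Offline" step from the video can also be scripted: LM Studio can expose a downloaded model through a local OpenAI-compatible server (default port 1234). A minimal Python sketch, assuming that server is running and a model is loaded; the model name here is a placeholder, so use whatever LM Studio shows as loaded:

```python
import json
import urllib.request

def build_chat_request(prompt, model="llama-3-8b-instruct"):
    """Assemble an OpenAI-style chat payload for LM Studio's local server."""
    return {
        "model": model,  # placeholder name; use the model LM Studio shows as loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local_llama(prompt, base_url="http://localhost:1234/v1"):
    """POST the chat request to the local server; requires LM Studio's server running."""
    data = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage (with the server running): print(ask_local_llama("Say hello."))
```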

COMMENTS • 34

  • @findmeinthecarpet
    @findmeinthecarpet 1 month ago

    Thank you! I've been waiting for this with dolphin on a basic Mac.

  • @aigriffin42604
    @aigriffin42604 1 month ago +4

    I love the way you talk!❤

    • @helicopterway
      @helicopterway 1 month ago +1

      Yes, yes, me too. Personally, it's the only reason I watch the channel. My attorney teaches me the rest.

  • @SpearSilver
    @SpearSilver 1 month ago

    Thanks, very useful.

  • @Salionca
    @Salionca 1 month ago

    Dark mode and zoom in. Perfect. Thanks.

  • @joshuawilson1396
    @joshuawilson1396 1 month ago +4

    Are you able to upload documents (PDFs for example) in this local application of the LLM? If not, is there any way to do that? Great video as usual!

    • @yesla4tesla
      @yesla4tesla 1 month ago +1

      Hah, this is what I just asked too. I didn't see your question. I'm in the same boat as you: I upload stuff to GPT-4, and that is one of the main functions I require.

    • @paelnever
      @paelnever 1 month ago +3

      Not inside LM Studio (that is a closed-source application), but yes, by combining other open-source applications to set up what is commonly known as RAG (Retrieval-Augmented Generation). There are tutorials on how to set up RAG locally on YouTube (not on this channel, obviously). If you want a really simple way, you can do it with an RTX card (not on Mac hardware, obviously, since Macs don't let you plug in GPUs) and Nvidia's "Chat with RTX" application, which comes with RAG integrated.

    • @aiadvantage
      @aiadvantage 1 month ago +3

      Not this simply, no. As the other comment by @paelnever suggests, this would be done with RAG, which makes this way more complicated. I will research whether there are any simpler ways besides Chat with RTX, which requires an Nvidia card.

    • @paelnever
      @paelnever 1 month ago +1

      @@aiadvantage Probably the easiest way would be to combine an open-source LLM framework like Ollama with Open Interpreter; that is actually much more than RAG, but hey, still open source and free.

    • @aiadvantage
      @aiadvantage 1 month ago +1

      @@paelnever Yes, true. It should just be noted that even Ollama requires a basic understanding of a command-line interface. For most users, using GPT-4 with Code Interpreter is the way to go for now.
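The RAG idea discussed in this thread can be sketched without any framework: retrieve the document chunks most relevant to the question, then stuff them into the prompt. A toy stdlib-only Python sketch using word-overlap scoring in place of real embeddings; `build_prompt` and the sample `docs` are illustrative, and a local model (via LM Studio or Ollama) would consume the resulting prompt:

```python
def score(chunk, question):
    """Crude relevance: count chunk words that also appear in the question."""
    q_words = set(question.lower().split())
    return sum(1 for w in chunk.lower().split() if w in q_words)

def retrieve(chunks, question, k=2):
    """Return the k chunks that best match the question (the 'R' in RAG)."""
    return sorted(chunks, key=lambda c: score(c, question), reverse=True)[:k]

def build_prompt(chunks, question):
    """Augment the question with retrieved context (the 'A' in RAG)."""
    context = "\n".join(retrieve(chunks, question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Illustrative document store; a real setup would chunk your PDFs here.
docs = [
    "Llama 3 8B fits comfortably in 8 GB of RAM when quantized.",
    "LM Studio lists downloadable models in its search tab.",
    "RAG retrieves relevant text and adds it to the prompt.",
]
prompt = build_prompt(docs, "How does RAG use retrieved text?")
# In a real setup this prompt would go to the local model's chat endpoint.
```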

  • @yesla4tesla
    @yesla4tesla 1 month ago

    OK, here is what I am looking for. I use ChatGPT-4 because it can look at certain files: I can upload a file and ask it to curate the data or pull other info. Is this possible in LM Studio? Thanks for your dedication to the videos you make... long-time subscriber here.

  • @fidgetspinner343
    @fidgetspinner343 1 month ago

    Can you run this on an AWS Lightsail instance? I noticed a brief flash on the screen of Microsoft, Google, and Amazon.

  • @1edber
    @1edber 1 month ago

    Awesome, thank you for this! Would it be possible to run Llama 3 70B locally on an M1 Mac Studio? Could I do it with 32GB RAM? (If not, what would I need hardware-wise with a Mac to run it locally?) Thanks!

    • @paelnever
      @paelnever 1 month ago

      No, but for half the price you can do it on a PC with Linux.

    • @aiadvantage
      @aiadvantage 1 month ago +1

      Hmm, it will be close. Just download LM Studio, search for Llama-3-70B, and it will show you right away. Let me know how it went.
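Whether a 70B model fits in 32 GB can be estimated rather than guessed: weight memory ≈ parameter count × bytes per parameter, plus overhead for the KV cache and the OS. A quick back-of-the-envelope in Python; the quantization levels are common GGUF options, and the overhead remark is a rough assumption:

```python
def weights_gb(params_billion, bits_per_param):
    """Approximate weight memory in GB: parameters * bits per parameter / 8."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for bits in (16, 8, 4):
    print(f"70B at {bits}-bit: ~{weights_gb(70, bits):.0f} GB of weights")
# 4-bit already needs ~35 GB for the weights alone, so a 32 GB machine is
# over budget once the KV cache and the OS are accounted for; an even more
# aggressive ~3-bit quant might squeeze in, at a quality cost.
```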

  • @pranjal9830
    @pranjal9830 1 month ago +3

    Mobile phone when?

  • @David-yq2dp
    @David-yq2dp 1 month ago +2

    Admin privileges required??

  • @Jaysearching
    @Jaysearching 1 month ago

    Is there anything even close to an equivalent that could be run on an M2 iPad Pro?

  • @CM-zl2jw
    @CM-zl2jw 1 month ago

    nice... 👏🍎🍎🍏🍏 So much for my excuse for buying an H100.

  • @mrd6869
    @mrd6869 1 month ago

    1:40
    What does that laptop hat lady have to do with anything?
    🤣

    • @aiadvantage
      @aiadvantage 1 month ago +3

      Don't underestimate her. She is running Llama 70B in a vineyard.

  • @HuynhLuong227
    @HuynhLuong227 1 month ago

    Free GPT-4 or GPT-3?

    • @aiadvantage
      @aiadvantage 1 month ago

      Llama 3 8B is better than GPT-3.5 but worse than GPT-4. Hope that answers your question.

    • @paelnever
      @paelnever 1 month ago

      The 8B-parameter version that he's running in the video is actually better than GPT-3.5 Turbo. If you want something equivalent to GPT-4, you need to run the 70B version, but for acceptable inference speed at that size you'd better use a graphics card with 40GB of VRAM or more, and then obviously we are not talking about Apple hardware but PC hardware, preferably with Linux.

  • @tar-yy3ub
    @tar-yy3ub 1 month ago

    Misleading title, not any computer

  • @BobDowns
    @BobDowns 1 month ago

    Title is misleading - won’t run on “any” computer. LM Studio won’t run on Intel-based MacBooks, like mine. 😢

    • @aiadvantage
      @aiadvantage 1 month ago +1

      Try jan dot ai. Same thing, fully open source, works on Intel Macs, just not updated as frequently.

    • @Content_Supermarket
      @Content_Supermarket 1 month ago +1

      What about for mobiles, like Phi-3?

    • @paelnever
      @paelnever Місяць тому +1

      Anyway trying to run even a small model in such a crap of computer would be a pain, get ready to wait 10 seconds to write every single word. You can get a medium grade PC with a second hand nvidia card with 24Gb vram for less than 500 bucks and with that getting inference speeds around 15 tokens/second.