How To Run ANY Open Source LLM LOCALLY In Linux

  • Published 5 Jul 2024
  • In this video, I will show you how to run ANY open source LLM (large language model) locally on Linux using Ollama & LMStudio. Ollama & LMStudio are the best tools that allow you to run various models such as llama3, Gemma, Mistral, codellama & much more. Watch this video and learn how to run LLMs locally on a Linux computer. A quick command sketch of the steps is included at the end of the description below.
    Timestamps
    00:00 Introduction
    00:38 Pre-requisites
    01:16 Installing Ollama
    02:18 Download LLM
    03:01 Testing LLAMA3 & Gemma
    05:31 Customizing Model
    06:55 Installing LMStudio
    Download
    Ollama: ollama.com/download
    LMStudio: lmstudio.ai/
    Relevant Tech Videos
    Dual Boot Ubuntu 24.04 LTS And Windows 11 - • How to Dual Boot Ubunt...
    Clean Install Ubuntu 24.04 LTS - • How TO Install Ubuntu ...
    Install Ubuntu 24.04 LTS On VirtualBox - • How To Install Ubuntu ...
    ~ Buy Me A Coffee - buymeacoffee.com/kskroyal
    ~ Connect On Instagram - @KSKROYALTECH
    ~ For Business Enquiries ONLY - business.ksktech@yahoo.com
    ~ My Website - kskroyal.com/
    © KSK ROYAL
    MereSai
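    Quick Commands
    A minimal sketch of the steps shown in the video, assuming a standard Linux shell; the install script and model names below are the ones documented on ollama.com, so adjust to the model you actually want.
      curl -fsSL https://ollama.com/install.sh | sh   # install Ollama with the official script
      ollama pull llama3                              # download a model (names are lowercase)
      ollama run llama3                               # chat with llama3 in the terminal
      ollama run gemma                                # same for Gemma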
  • Science & Technology

COMMENTS • 23

  • @0xd3addev
    @0xd3addev 23 days ago +3

    For anyone having trouble with the 'ollama create', you have to spell the model's name in lowercase according to the ollama documentation. So the first line would be 'FROM llama3'
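    For context, a minimal Modelfile sketch of what this comment describes; the file name Modelfile and the model name my-assistant are just placeholder examples, while FROM, PARAMETER and SYSTEM are standard Modelfile instructions:
      # Modelfile
      FROM llama3
      PARAMETER temperature 0.7
      SYSTEM "You are a concise, helpful assistant."
    Then build and run the customized model:
      ollama create my-assistant -f Modelfile
      ollama run my-assistant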

  • @seventhtenth
    @seventhtenth 24 days ago

    Running LLMs locally is cool, but what is the best training set?

  • @Zer0YT
    @Zer0YT 23 days ago

    Very nice video 🙏🏼
    But is there also a free AI you can host locally for picture generation? Maybe that would be worth a video 😊🙌🏼 I would be interested 💯🙌🏼

  • @shrirammadurantakam
    @shrirammadurantakam 24 days ago

    The Ollama system configuration is very useful for agentic workflows.
    Need to learn how to make LLMs talk to each other.
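    For anyone exploring that, Ollama also serves a local HTTP API (default port 11434), which is one way to pipe one model's answer into another; a rough sketch, with the model names and prompt as made-up examples rather than anything from the video:
      # ask the first model, with streaming off so the reply comes back as one JSON object
      curl -s http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Suggest a debate topic.", "stream": false}'
      # take the "response" field from that JSON and send it as the prompt of a
      # second /api/generate call, e.g. with "model": "gemma"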

  • @AgentX-dh3lf
    @AgentX-dh3lf 24 days ago +2

    Any good LLM for low end hardware?

  • @wolfisraging
    @wolfisraging 24 days ago +1

    Alpaca is the best GUI for LLMs. It's on Flatpak as well. Clean & simple UI.

  • @Amit-hb9ex
    @Amit-hb9ex 24 days ago +2

    Which is your main system for your work? Also, what are you doing in your life, like from an education point of view?

    • @kskroyaltech
      @kskroyaltech  23 days ago +2

      I use Linux and macOS as my primary OSes. macOS I use for building iOS apps,
      but mostly I spend my time with Linux. I love tinkering with open source stuff.
      Education: I dropped out of B.Tech long back. I do natural farming part time and YouTube full time.

    • @Arador1112
      @Arador1112 23 days ago

      ​@@kskroyaltech great bro

    • @Amit-hb9ex
      @Amit-hb9ex 23 days ago

      ​@@Arador1112 Nice dp 🙂

  • @Arador1112
    @Arador1112 23 days ago

    Hey, how can one delete a model from Ollama?

    • @kskroyaltech
      @kskroyaltech  19 days ago

      ollama list           # shows all downloaded models
      ollama rm MODEL_NAME  # removes the chosen model

    • @Arador1112
      @Arador1112 19 days ago

      @@kskroyaltech I mean how to completely delete it; it still takes up space even after running this command.
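      If space is still in use after ollama rm, the remaining blobs may belong to another user or to the system service; on a typical Linux install the models live under ~/.ollama/models, or under /usr/share/ollama/.ollama/models when Ollama runs as a systemd service (paths assumed from the standard install script):
        du -sh ~/.ollama/models                       # models pulled by your user
        sudo du -sh /usr/share/ollama/.ollama/models  # models pulled by the ollama service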

  • @PTRAARON
    @PTRAARON 19 days ago

    How do I uninstall Ollama from my computer? I have no graphics card.
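    For reference, the Linux uninstall steps are roughly the ones in the Ollama docs, assuming the standard install script was used; adjust paths if your setup differs:
      sudo systemctl stop ollama
      sudo systemctl disable ollama
      sudo rm /etc/systemd/system/ollama.service
      sudo rm $(which ollama)        # usually /usr/local/bin/ollama
      sudo rm -r /usr/share/ollama   # also removes models stored by the service
      sudo userdel ollama
      sudo groupdel ollama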

  • @chef2654
    @chef2654 24 days ago

    So what exactly makes Linux superior for AI?
    You do realise that you can run Ollama & LM Studio just as easily on macOS & Windows. Not to mention, they also work with AMD GPUs, not just Nvidia.

    • @kskroyaltech
      @kskroyaltech  23 days ago

      Of course.

    • @MrRom079
      @MrRom079 13 days ago

      Yeah, but Windows sucks balls 😂😂😂😂