How to Run Any GGUF AI Model with Ollama Locally

  • Published 22 Oct 2024
  • This video is a step-by-step tutorial to install and run any LLM in GGUF format with Ollama locally.
    🔥 Buy Me a Coffee to support the channel: ko-fi.com/fahd...
    🔥 Get 50% Discount on any A6000 or A5000 GPU rental, use following link and coupon:
    bit.ly/fahd-mirza
    Coupon code: FahdMirza
    ▶ Become a Patron 🔥 - / fahdmirza
    #gguf #ollama
    PLEASE FOLLOW ME:
    ▶ LinkedIn: / fahdmirza
    ▶ YouTube: / @fahdmirza
    ▶ Blog: www.fahdmirza.com
    RELATED VIDEOS:
    ▶ Resource ollama.com/
    All rights reserved © 2021 Fahd Mirza
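
The video's workflow can be sketched as the following shell session, assuming a GGUF file has already been downloaded (the filename `my-model.gguf` and the model name `my-model` are placeholders, not names from the video):

```shell
# Minimal sketch of importing a local GGUF file into Ollama.
# Assumes ollama is installed and my-model.gguf sits in the current directory.

# 1. Write a Modelfile that points Ollama at the downloaded GGUF weights:
cat > Modelfile <<'EOF'
FROM ./my-model.gguf
EOF

# 2. Import the weights into Ollama's local model store:
#    ollama create my-model -f Modelfile
# 3. Chat with the imported model:
#    ollama run my-model
```

The `FROM` directive in a Modelfile accepts a path to a local GGUF file, which is what makes this work without pulling anything from the Ollama registry.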

COMMENTS • 11

  • @golden--hand, 2 months ago

    I don't like working in the console. That said, this is the only video or description I have come across that managed to get me to understand how to actually do this, so thank you. So many others seem to skip steps because they assume people are already familiar with them.
    It works in Ollama now, and in some other apps I have that attach to it; it doesn't show up in Open WebUI for some reason, but I don't really need custom models for how I use that app at the moment. So again, thanks for the way you described this.

  • @cesarmelchior5858, 1 month ago

    Hey Fahd... Thanks for sharing the knowledge. Hugs from Brazil.

    • @fahdmirza, 1 month ago

      Thanks, you made my day.

  • @kuntoajibukanpenyanyi, 2 months ago

    Great, will you share one for the safetensors type?

  • @TheYuriTS, 4 months ago

    Thanks for your tutorial. I follow only you, and I learn a lot from you.

    • @fahdmirza, 4 months ago

      Thanks, my friend, you just made my day. Cheers.

  • @ZVCi, 3 months ago

    How can I fix "Error: this model is not supported by your version of Ollama. You may need to upgrade" when I have already upgraded to the latest version?

    • @Hotboy-q7n, 3 months ago

      Update your Ollama.

  • @chandraprakash6196, 4 months ago

    I didn't even know a 48 GB VRAM Nvidia card existed for personal use.

    • @fahdmirza, 3 months ago

      sure

    • @CasanovaSan, 2 months ago

      You can rent 48 GB VRAM or even higher GPUs on websites like RunPod.