Ollama: How To Create Custom Models From HuggingFace (GGUF)

  • Published 11 Oct 2024

COMMENTS • 34

  • @SIR_Studios786 · 4 months ago · +2

    GGUF files can be run directly with Ollama, but how can models in other formats be converted for use with Ollama?
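    A rough sketch of the usual conversion route goes through llama.cpp; the repository URL is real, but the script name, paths and model names below are illustrative and may differ between versions:

      # llama.cpp ships the Hugging Face -> GGUF conversion script
      git clone https://github.com/ggerganov/llama.cpp
      cd llama.cpp && pip install -r requirements.txt

      # convert a downloaded Hugging Face model directory to GGUF
      python convert_hf_to_gguf.py /path/to/hf-model --outfile model.gguf

      # the resulting GGUF can then be wired into Ollama via a Modelfile
      echo 'FROM ./model.gguf' > Modelfile
      ollama create my-model -f Modelfile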

  • @SantK1208 · 7 months ago · +1

    Thanks for sharing it. Could you please create a video where we can do Q&A with local documents using Ollama models?

    • @SantK1208 · 7 months ago

      Kindly do it, it's really needed.

    • @datasciencebasics · 7 months ago

      Please check the other videos on this channel. Here you go ->
      Chat With Documents Using ChainLit, LangChain, Ollama & Mistral 🧠
      ua-cam.com/video/2IL0Sd3neWc/v-deo.html

  • @albuquerqueroger · 2 months ago

    Congratulations, excellent content! Thanks for sharing.

  • @ph3ll3r · 4 months ago

    It would be interesting to see if you could use Open-webUI instead of "Chat UI using ChainLit, LangChain, Ollama and Gemma"

  • @BiranchiNarayanNayak · 7 months ago

    Excellent tutorial !!

  • @amiranvarov · 6 months ago

    Hey mate, very good video with a clear explanation. Do you mind sharing what terminal app you are using? It seems very convenient with all that autocompletion and the hints.

    • @datasciencebasics · 6 months ago · +1

      I am using Warp, but you can watch this video on making your terminal better:
      HOW To Make Your Mac Terminal Amazing
      ua-cam.com/video/ycapVWVl98M/v-deo.html

  • @ScrantonStrangler19 · 6 months ago

    Good explanation! Is there a list of model architectures that are supported by Ollama?

    • @datasciencebasics · 6 months ago · +1

      I don't think there is any architecture restriction for Ollama; you can download any model and use it with Ollama. Give it a try!!

  • @build.aiagents · 7 months ago

    Phenomenal

  • @nat.amato-10 · 5 months ago

    How can I edit the Modelfile so that it includes a context, a personality, or a precise way of answering questions?

  • @mixp1x · 5 months ago · +1

    Is there any way we can use Exl2 on Ollama?

  • @hackedbyBLAGH · 2 months ago

    Thank you

  • @Pedro-mq7kt · 7 months ago

    Nice video. Can you make a video on changing the system prompt and temperature, if that's possible?
    Thank you

    • @datasciencebasics · 7 months ago

      It's possible, just provide those parameters in the Modelfile.
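      For example, a minimal Modelfile along these lines (the base file model.gguf and the parameter values are illustrative placeholders):

        # base model: a local GGUF file (an Ollama library name such as llama3 also works)
        FROM ./model.gguf

        # sampling parameters
        PARAMETER temperature 0.3
        PARAMETER top_p 0.9

        # system prompt that sets the persona / answering style
        SYSTEM """You are a concise assistant that answers in short bullet points."""

      Rebuilding with ollama create my-model -f Modelfile then picks up the changes.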

  • @SantK1208 · 7 months ago

    Could you please use LlamaIndex with Hugging Face?

  • @121Gamerscom · 6 months ago

    How do I change the Chainlit logo to my own, please?

  • @zamanganji1262 · 3 months ago

    I want to fine-tune Llama 3, but I need to create the special_tokens_map.json as follows:
    {
      "bos_token": {
        "content": "",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
      },
      "eos_token": {
        "content": "",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
      },
      "pad_token": {
        "content": "",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
      }
    }
    How can I do that? Moreover, I want to run the model with ollama so I can chat with it.
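    For the "chat with it in ollama" part, a minimal sketch of the usual GGUF workflow (the file name llama3-finetuned.gguf and the model name my-llama3-ft are placeholders):

      # Modelfile pointing at the exported GGUF of the fine-tuned model
      echo 'FROM ./llama3-finetuned.gguf' > Modelfile

      # register it with Ollama under a custom name
      ollama create my-llama3-ft -f Modelfile

      # open an interactive chat session
      ollama run my-llama3-ft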

  • @terry-bn7bh · 6 months ago

    Good, how to create the Modelfile on Windows? 😁

  • @terry-bn7bh · 6 months ago

    How to create the Modelfile on Windows?

    • @AnudeepKolluri · 5 months ago

      Just open VS Code and create a file called Modelfile (not sure about the capitalization) and insert the content into it. It doesn't need any extension.
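      As a sketch, the Modelfile contents can be as small as a single line (the GGUF path below is a placeholder):

        FROM ./model.gguf

      Save it as Modelfile next to the .gguf file, then run ollama create my-model -f Modelfile from that folder in a Windows terminal.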

  • @adhyanmishra0197 · 6 months ago

    Hello sir, can we deploy this model?

    • @datasciencebasics · 6 months ago

      Yes you can !!

    • @adhyanmishra0197 · 6 months ago

      @datasciencebasics Sir, if possible, could you please create a tutorial for deploying this model?

    • @datasciencebasics · 6 months ago

      Hello, deploying a model is use-case specific. It can be deployed locally, in different cloud services, etc. Please refer to the other videos on my channel for help.
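      As one example of the local option: Ollama exposes an HTTP API on port 11434, so a created model can be queried from other apps on the same machine (the model name my-model is a placeholder):

        # query Ollama's local REST API (the ollama service must be running)
        curl http://localhost:11434/api/generate -d '{
          "model": "my-model",
          "prompt": "Summarize what a Modelfile is.",
          "stream": false
        }'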

  • @AlanCarrOnlineHypnosis · 4 months ago

    Far more hassle than it's worth. Just use any other app that uses normal GGUF files, like normal people.

    • @datasciencebasics · 4 months ago

      Thanks for the comment. There are many ways to do it, and it's just a preference which one to use 🙂

  • @IamalwaysOK · 7 months ago

    Excellent tutorial !!