Cracking the Enigma of Ollama Templates

  • Published 27 Dec 2024

COMMENTS •

  • @aurielklasovsky1435 · a month ago

    Let's go! This is the video I've been waiting for. Thank you again for this wonderful course

  • @ErolErten · 26 days ago

    Thank you so much for this video and content. I've been looking for exactly this information.

  • @kaushalkanakamedala6886 · a month ago

    I think the template just takes in parameters and generates the input to be fed to the model. I want to know if Ollama can do inference-time reasoning like o1, and whether the template can be used to drive that reasoning. Maybe a template where, given the query, the model generates reasoning using something like chain of thought or tree of thought and then outputs the result? This is easily achieved with LangChain or Python code on top; I just want to know if running it this way is possible, or if it would be faster.

    • @technovangelist · a month ago

      That’s not a function of the Template but rather the model.

    • @kaushalkanakamedala6886 · a month ago

      @@technovangelist Not a function of the template, I agree. But when employing techniques like 'chain of thought' or 'tree of thought', the model generates intermediate 'thinking tokens', which are necessary for the computation but might not be needed by the end user. Since the template can define the structure, I was just wondering whether the template could start the generation of thinking steps but output only the response or the final answer.

    • @technovangelist · a month ago

      It would be more appropriate in the system prompt

  • @GrandpasPlace · a month ago

    Thank you for these great videos!
    I would like to make a request: N8N now has an AI Agent that supports tool calls. I've been working with it, and I can set it up with Ollama and configure a tool that it calls, using the returned information to formulate the answer. The problem is that no one seems to know how to get it to pass information to the tool. I've asked on the N8N message board and even had others say they are having the same issue. With your knowledge of Ollama, and having used N8N, do you think you could make a working example and explain how to pass information from the model to the tool? For example, the tool looks up a stock price but needs to know which stock symbol to look up. The model is asked what the price of Google is and needs to pass that to the tool.
    Thank you

  • @ByronBennett · a month ago

    Do we need to use these templates if we're using the OpenAI-compatible REST API? I'm trying to understand how they relate to each other.

    • @technovangelist · a month ago +1

      All models use a template, but if you're using a model from Ollama, it's already there.

    • @Mum40535RBX · 18 days ago

      @@technovangelist How do these templates differ from the template I feed my LLM using something like the LangChain ChatOllama API? Does that template get put inside the Ollama template? In other words, when I'm telling llama3.2 to perform sentiment analysis, I show it a few example prompts and then leave a space for the tweet; that is my template. How does it interact with the Ollama template?

    • @technovangelist · 17 days ago +1

      I don’t know. For a long time LangChain broke this: they used both even though there should be only one. Thankfully there are very few reasons to ever use LangChain; in most cases you can simplify by not using it.

    • @Mum40535RBX · 17 days ago

      @@technovangelist I've found that too, went down a rabbit hole of trying to find the 'right' framework to work with. Silly me.
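To make the distinction in this thread concrete: Ollama applies the model's template server-side, wrapping whatever messages a client (the OpenAI-compatible API, LangChain's ChatOllama, etc.) sends. The real templates are Go `text/template` definitions; the sketch below is a Python approximation of what that server-side step produces for a llama3-style model (the special-token names come from the llama3 family; treat the exact layout as illustrative, not authoritative):

```python
def render_llama3_style(messages):
    """Rough sketch of Ollama's server-side template step for a
    llama3-style model: each chat message is wrapped in the model's
    special tokens, then an open assistant header cues the reply."""
    parts = []
    for m in messages:
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Leave the assistant turn open so the model generates the answer.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = render_llama3_style([
    {"role": "system", "content": "Classify tweet sentiment."},
    {"role": "user", "content": "Tweet: 'I love this phone.' Sentiment:"},
])
```

So a few-shot prompt built in a client framework arrives as ordinary message content and lands inside the message slots of the Ollama template: the two layers nest rather than replace each other.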

  • @JNET_Reloaded · a month ago

    All models can have a model file. For example, I have a template-maker script I made to make any local model work with CrewAI.

  • @user-wr4yl7tx3w · a month ago

    Was it called Modelfile before?

    • @technovangelist · a month ago

      The modelfile is still the modelfile. A template is one of the things that goes into a modelfile to build a model. You only need to define the template if you're importing a new model-weights file that doesn't have a template defined, which would be most of them.
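As a sketch of how those pieces fit together, a minimal Modelfile might look like the fragment below. The file names and the template body are illustrative only; each model family expects its own template tokens, so in practice you would copy the template the model developer published:

```
FROM ./my-model-weights.gguf
SYSTEM "You are a helpful assistant."
TEMPLATE """{{ if .System }}<|system|>{{ .System }}<|end|>
{{ end }}<|user|>{{ .Prompt }}<|end|>
<|assistant|>"""
```

You would then build the model with `ollama create mymodel -f Modelfile`. Models pulled from the Ollama library already ship with this filled in, which is why most users never write one.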

  • @HassanAllaham · a month ago

    Thanks for the very good content. I was waiting for this video for sooooooo long.

    One thing I noticed (I do not know if it is true): if you download a model like llama3.2 and create a new model from it using a simple template, then you can NOT use tools as described in the Ollama API, i.e. you cannot pass tools to the client even though the model originally supports tool calling. This means that Ollama checks for something in the template to decide whether the model supports tools or not.

    If you download llama3.2 from the Ollama hub, it uses the default template the uploader used, and if you read that default llama3.2 template on the hub you will discover that it forces the model to always call a tool unless it has received the tool response, i.e. if you call llama3.2 (with tools passed to the client) with the message "Hello", it will use one of the tools, returning something not useful at all. I believe it is a very bad idea to tie the ability to pass tools to the client to something in the template. I also believe this is what makes you and me prefer the old way of building tooled agents and consider it more reliable. Thanks again for the good content 🌹

    • @technovangelist · a month ago

      The models from ollama in the official library already have the template defined correctly as per the model developers.

    • @technovangelist · a month ago

      If you send a request with tools then it will respond with the tool to use. If you don’t want it to use a tool don’t send it tools to use.

    • @HassanAllaham · a month ago

      @@technovangelist So if I pass a tool, the model CANNOT decide when to use it and when not to; it will always use the tool, even if I invoke it with a message like "Hello"?
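A sketch of the round trip discussed in this thread, using the OpenAI-style function schema that Ollama's chat endpoint accepts for its `tools` parameter. `get_stock_price` and the `dispatch` helper are hypothetical names for illustration; in a real run you would pass `TOOLS` to the Ollama client's chat call alongside your messages, then feed the tool's result back as a follow-up message so the model can phrase the final answer. Whether the model calls a tool on an unrelated message like "Hello" depends on the model and its template, as noted above:

```python
def get_stock_price(symbol: str) -> str:
    # Hypothetical lookup; a real tool would call a market-data API.
    prices = {"GOOG": "172.50"}
    return prices.get(symbol.upper(), "unknown")

# OpenAI-style function schema, the shape Ollama's chat endpoint accepts.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_stock_price",
        "description": "Look up the current price for a stock ticker symbol.",
        "parameters": {
            "type": "object",
            "properties": {
                "symbol": {"type": "string", "description": "Ticker, e.g. GOOG"},
            },
            "required": ["symbol"],
        },
    },
}]

def dispatch(message: dict) -> str:
    """If the model asked for a tool, run it; otherwise return the text reply."""
    for call in message.get("tool_calls") or []:
        fn = call["function"]
        if fn["name"] == "get_stock_price":
            # The model extracts the arguments (e.g. the ticker) from the
            # user's question; that is how information reaches the tool.
            return get_stock_price(**fn["arguments"])
    return message.get("content", "")
```

The key point for the N8N question earlier in the thread is the `arguments` field: the model, not the client, fills it in from the user's question, guided by the parameter descriptions in the schema.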

  • @AliAlias · a month ago

    Thanks ❤

  • @MarincaGheorghe · a month ago

    Maybe what was not spelled out in many of these videos is that a template is the formatting used, i.e. the way one decides how to structure the data sent to the model: the format of the data used for inference.

  • @LOSTOfficial_ww · a month ago

    It looks like you're wearing Malaysian Batik or something like that. Nice, love it! ❤ Love from Malaysia 🫡

    • @technovangelist · a month ago

      I used to spend a lot of time in KL. But this one is from Amazon.

  • @60pluscrazy · a month ago +2

    🎉

  • @StudyWithMe-mh6pi · a month ago +1

    🤩🤩🤩

  • @K600K300 · a month ago +1

    Your explanations are always like drinking a glass of ice water in hot weather.

  • @user-wr4yl7tx3w · a month ago +1

    What is a template in Ollama?

    • @technovangelist · a month ago +7

      perhaps you should watch the video

    • @user-wr4yl7tx3w · a month ago

      @ No offense, but I watched the first 5 minutes and it went straight into process rather than a high-level explanation of what it is, so I was lost at the outset and didn't expect that to change.

    • @sad_man_no_talent · a month ago

      thing

    • @technovangelist · a month ago

      If you're having to read them and use them, you know.

    • @technovangelist · a month ago

      This is one of the advanced topics and assumes you have a basic knowledge of how Ollama works.