Give Internet Access to your Local LLM (Python, Ollama, Local LLM)

  • Published May 1, 2024
  • Give your local LLM internet access using Python. The LLM will be able to use the internet to find information relevant to the user's questions.
    Join the Discord: / discord
    Library Used:
    github.com/emirsahin1/llm-axe
  • Film & Animation

COMMENTS • 14

  • @Oxxygen_io · 2 months ago +3

    This was great, thanks for the introduction.
    I'd love to see a deep dive on adding this as an extra, where you get a command-prompt chat with internet search, like running "ollama, but now with internet".

    • @polymir9053 · 2 months ago +1

      Thanks! I appreciate the suggestion.
      There is a small demo showing how this can be used to make a command-prompt chat where you can talk to the online agent. Here is the link: github.com/emirsahin1/llm-axe/blob/main/examples/ex_online_chat_demo.py
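      The linked demo follows the usual read-ask-print pattern. Below is a minimal sketch of such a loop; the `answer` callable is a placeholder standing in for the online agent (the real demo uses llm-axe's OnlineAgent, whose exact API is shown in the linked example):

      ```python
      def chat_loop(answer, get_input=input, output=print):
          """Simple command-prompt chat: read a question, answer it, repeat.

          `answer` is any callable mapping a question string to a reply string;
          in the real demo it would wrap the online agent's search call.
          """
          while True:
              question = get_input("You: ")
              if question.strip().lower() in ("quit", "exit"):
                  break
              output("Agent: " + answer(question))

      # Stub "agent" for illustration (no LLM or internet needed):
      def stub_agent(question):
          return f"(pretend web answer to: {question})"
      ```

      Swapping `stub_agent` for a function that queries the online agent turns this into the "ollama, but with internet" prompt described above.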

  • @_areck_ · 26 days ago +1

    Amazing video, extremely underrated channel. Good work. I needed this to complete my assistant program using Ollama, which can create files, run files, edit file contents, search the web, and maintain a persistent memory. This was the second-to-last thing I needed; now I just have to finish the run-files part.

  • @ViralComparison · 2 months ago +1

    Perfect! We need more videos like these.

  • @irkedoff · 2 months ago +1

    💜

  • @coachvalente · 1 month ago

    Is it better to open separate chats with the same LLM when I want to ask about different subjects, like we do in ChatGPT? Or can I use a single chat for everything I want to do?

    • @polymir9053 · 1 month ago +1

      You can use a single Agent for multiple subjects. While agents do keep track of history, chat history is only used if it is passed in along with the question.
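      That point can be illustrated generically: keep your own list of role/content messages and include it with a new question only when you want context carried over (the structure below is the common chat-message convention, not llm-axe's exact parameter names; see its docs for those):

      ```python
      def ask_with_history(llm, history, question):
          """Append the question to the history, query the model with the
          full message list, and record the reply so later turns see it."""
          history.append({"role": "user", "content": question})
          reply = llm(history)  # llm: any callable taking a message list
          history.append({"role": "assistant", "content": reply})
          return reply

      # Stub model for illustration: replies with how many user turns it has seen.
      def stub_llm(messages):
          n = sum(1 for m in messages if m["role"] == "user")
          return f"seen {n} user message(s)"
      ```

      Passing a fresh, empty history behaves like opening a new chat tab; reusing the same list carries the context forward.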

  • @fakhrun4038 · 1 month ago +1

    Can you make a video for the web ui?

    • @polymir9053 · 1 month ago +1

      There is no web UI for this, but with some coding you could easily tie it into any existing open-source chat UI.

  • @paleostressmanagement · 1 month ago +1

    Can this be used with the Ollama API? If so, how?

    • @polymir9053 · 1 month ago +1

      Yes, I'm using Ollama in the video. It has built-in support for the Ollama API through the OllamaChat class. See this example: github.com/emirsahin1/llm-axe/blob/main/examples/ex_online_agent.py

    • @paleostressmanagement · 1 month ago

      @polymir9053 Thanks! But I am still a bit confused about how to use this with the Ollama API's chat-completion example:
      curl localhost:11434/api/chat -d '{
        "model": "llama3",
        "messages": [
          {
            "role": "user",
            "content": "why is the sky blue?"
          }
        ],
        "stream": false
      }'
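      The curl call above maps directly onto a plain Python request; presumably this is the kind of call llm-axe's OllamaChat class issues under the hood. A stdlib-only sketch, assuming an Ollama server on localhost:11434 with the llama3 model pulled (the payload mirrors the curl example exactly):

      ```python
      import json
      import urllib.request

      def build_chat_payload(model, question, stream=False):
          """Build the JSON body for Ollama's /api/chat endpoint,
          mirroring the curl example above."""
          return {
              "model": model,
              "messages": [{"role": "user", "content": question}],
              "stream": stream,
          }

      def chat(payload, url="http://localhost:11434/api/chat"):
          """POST the payload to a running Ollama server and return the reply text."""
          req = urllib.request.Request(
              url,
              data=json.dumps(payload).encode(),
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as resp:
              return json.load(resp)["message"]["content"]
      ```

      With `ollama serve` running, `chat(build_chat_payload("llama3", "why is the sky blue?"))` returns the assistant's answer string, the same text the curl command prints inside the "message" field.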

  • @brenden_Li · 1 month ago

    What app are you executing the code with?

    • @polymir9053 · 1 month ago

      It's just Python and Ollama.