Give Internet Access to your Local LLM (Python, Ollama, Local LLM)
- Published 1 May 2024
- Give your local LLM internet access using Python. The LLM will be able to use the internet to find information relevant to the user's questions.
Join the Discord: / discord
Library Used:
github.com/emirsahin1/llm-axe
This was great, thanks for the introduction.
I'd love to see a deep dive on adding this as an extra where you get a command-prompt chat with internet search. Like running "ollama, but now with internet".
Thanks! I appreciate the suggestion.
There is a small demo showing how this can be used to make a command-prompt chat where you can talk to the online agent. Here is the link: github.com/emirsahin1/llm-axe/blob/main/examples/ex_online_chat_demo.py
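For anyone curious, a rough sketch of what that command-prompt demo boils down to is below. The OllamaChat and OnlineAgent names come from the repo, but the exact import paths and method signatures here are assumptions, so treat the linked example file as the source of truth.

# Minimal command-prompt chat loop using llm-axe's online agent.
# Assumes the OllamaChat and OnlineAgent classes from the repo; the exact
# import path and method names may differ from the linked example.
from llm_axe import OllamaChat, OnlineAgent

llm = OllamaChat(model="llama3:instruct")   # local model served by Ollama
agent = OnlineAgent(llm)                    # agent that can search the web

while True:
    prompt = input("You: ")
    if prompt.lower() in ("exit", "quit"):
        break
    answer = agent.search(prompt)           # fetches web info, then answers
    print("Assistant:", answer)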
Amazing video, extremely underrated channel. Good work. I needed this to complete my program for an assistant model using Ollama that can create files, run files, edit the contents of files, search the web, and maintain a persistent memory. This was the second-to-last thing I needed to finish it up; now I just need to finish the run-files part.
perfect! need more videos like these
💜
Is it better to open various tabs from the same LLM in case I want to ask about different subjects, like we do in ChatGPT? Or can I use only one chat for everything I want to do?
You can use a single Agent for multiple subjects. While agents do keep track of history, chat history is only used if passed in along with the question.
Can you make a video for the web ui?
There is no web UI for this, but with some coding you could easily tie it into any existing open-source chat UI.
Can this be used with the Ollama API? If so, how?
Yes, I'm using Ollama in the video. It has built-in support for the Ollama API through the OllamaChat class. See this example: github.com/emirsahin1/llm-axe/blob/main/examples/ex_online_agent.py
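Roughly, OllamaChat wraps requests to your local Ollama server, and you hand it to the online agent. A minimal one-shot sketch follows; the class names come from the linked example, but the import path and default server settings are assumptions, so check the repo for the exact usage.

# One-shot online question answered through a local Ollama model via llm-axe.
# OllamaChat talks to the local Ollama server; import path is an assumption.
from llm_axe import OllamaChat, OnlineAgent

llm = OllamaChat(model="llama3:instruct")
agent = OnlineAgent(llm)
print(agent.search("What is the latest version of Python?"))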
@polymir9053 Thanks! But I am still a bit confused as to how to use this with the Ollama API chat-completion example:
curl localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ],
  "stream": false
}'
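For what it's worth, that curl command hits Ollama's plain /api/chat endpoint directly and doesn't involve llm-axe at all. The same chat-completion request sent from Python with the requests library would look roughly like this:

# Same /api/chat request as the curl command above, sent from Python.
# Requires a local Ollama server running on the default port 11434.
import requests

payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "why is the sky blue?"}
    ],
    "stream": False,
}
resp = requests.post("http://localhost:11434/api/chat", json=payload)
print(resp.json()["message"]["content"])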
What app are you executing the code with?
It's just Python and Ollama.