How to run an LLM Locally on Ubuntu Linux

  • Published 18 Jun 2024
  • Here's a step-by-step guide to running a ChatGPT-like LLM on your own machine with Ollama. It covers how to install Ollama, how to access it with curl and Python (a minimal sketch of the curl route follows this list), and how to install a web interface.
    Perfect for beginners to LLMs.
    Instructions from this article:
    www.jeremymorgan.com/blog/gen...
    Ollama website:
    ollama.com/
    Open WebUI:
    github.com/open-webui/open-webui
    NVIDIA Container Toolkit:
    docs.nvidia.com/datacenter/cl...
    ---
    Follow me on YouTube: ua-cam.com/users/jeremymorgan?...
    Yell at me on Twitter: / jeremycmorgan
    Check out my Blog: www.jeremymorgan.com
  • Howto & Style
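
As a quick reference for the install and curl steps listed in the description, here is a minimal sketch of installing Ollama on Linux and querying it over HTTP. It assumes Ollama is listening on its default port 11434; llama3 is only an example model name, so substitute whichever model you pull.

    # Install Ollama on Linux (the install command published on ollama.com).
    curl -fsSL https://ollama.com/install.sh | sh

    # Pull a model, then query the local Ollama API on its default port 11434.
    # "llama3" is an example; use whatever model you pulled.
    ollama pull llama3
    curl http://localhost:11434/api/generate -d '{
      "model": "llama3",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'

The Python route shown in the video talks to the same HTTP endpoint, so the request body is identical.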

COMMENTS • 2

  • @dipanwitadutta3008 5 days ago

    It would be very helpful to show how to close the Ollama Web UI. Will that port always stay occupied? And once I close the browser tab, how do I restart everything?
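
    A minimal sketch for the question above, assuming Open WebUI was started as a Docker container named open-webui (the name used in the project's README): the port is held by the container, not the browser tab, so it stays occupied until the container is stopped.

      # Assumption: Open WebUI runs in a Docker container named "open-webui",
      # as in the project's README. Closing the browser tab does not stop it.
      docker ps                 # check whether the container is running
      docker stop open-webui    # stop it and free the published port
      docker start open-webui   # restart it later on the same port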

  • @mufeedco 19 days ago

    Thank you, great guide. How can I configure Oobabooga text-generation-webui with Open WebUI?