Easy Open-WebUI + LM Studio Tutorial: Free & Local ChatGPT Alternative

  • Published Jun 26, 2024
  • In this video, we'll walk you through the simple steps to install Open WebUI, set up a dedicated environment, and connect it to LM Studio for a fully functional, locally run AI assistant. You'll learn how to:
    Install Open WebUI and its dependencies
    Set up a dedicated environment with Miniconda
    Connect to LM Studio for open-source language models
    Use your local AI assistant for text-to-speech, image generation, document Q&A, and more!
    With over 29,000 stars on GitHub, the open-source community loves Open WebUI. Join the movement and take the first step towards a more private and customizable AI experience.
    Let us know in the comments if you'd like to see more tutorials on what you can do with Open WebUI. Don't forget to like, subscribe, and hit that notification bell for more exciting AI adventures!
    Local Lab Twitter - / thelocallab_
    buymeacoffee.com/thelocallab
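The setup steps listed above can be sketched as a short shell session. This is a sketch, not the video's exact commands: the environment name `openwebui` and Python version are assumptions, while `pip install open-webui` and `open-webui serve` are Open WebUI's documented install and launch commands.

```shell
# Create and activate a dedicated Miniconda environment
# (the name "openwebui" and Python 3.11 are assumptions, not from the video)
conda create -n openwebui python=3.11 -y
conda activate openwebui

# Install Open WebUI from PyPI and start the server
pip install open-webui
open-webui serve
```

Once the server is up, the UI is reachable in a browser (port 8080 is Open WebUI's default).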
  • Science & Technology

COMMENTS • 16

  • @TheLocalLab
    @TheLocalLab  21 days ago

    🔴 How To Run Your Llama 3.1 Models With Open WebUI Web Search Locally 👉 ua-cam.com/video/eBLvRV73xzU/v-deo.html

  • @jackreacher8632
    @jackreacher8632 4 days ago

    Thanks so much for your easy-to-understand tutorials 🙏

  • @tlumme
    @tlumme 5 days ago

    Thank you for the video. I finally figured it out: I put the correct URLs in the correct places to connect Ollama, then managed to download the Llama 3.1 model.

  • @Habenskii
    @Habenskii 11 days ago

    So this solution only helps with working with documents? That can also be done directly in LM Studio...
    It would be interesting to understand how to complement a conversation with a model with an Internet search; that's really important.

  • @user-cn7nd4cq4q
    @user-cn7nd4cq4q 16 days ago

    Can we install Open WebUI on Android? With Termux?

  • @SCUTXD
    @SCUTXD A month ago

    Hey, I have tried this. In the API key field, "none" does not work; the API key is "lm-studio".

    • @TheLocalLab
      @TheLocalLab  A month ago

      That's a bit strange, but if it works, it works. Normally, if you're using a local OpenAI-compatible API, you don't need an API key at all, and "none" usually works for me, as demonstrated in the video.

    • @SyamsQbattar
      @SyamsQbattar 26 days ago

      I have the same problem.

    • @JohnSmith-iv5zy
      @JohnSmith-iv5zy 17 days ago

      Just here to say it worked here. Say your URL is "localhost:1234/v1": paste that in the first box and type "none" in the second. I did this on PC, so I hit Enter on both fields just to make sure, then saved, then watched it give me the verified notification.
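Before pasting the URL into Open WebUI, you can confirm that LM Studio's OpenAI-compatible server is actually reachable. This is a sketch assuming LM Studio's local server is running on its default port 1234, as in the comment above:

```shell
# List the models LM Studio is currently serving; a JSON response here
# confirms that http://localhost:1234/v1 is the right base URL for Open WebUI
curl http://localhost:1234/v1/models
```

If this request fails, start the server from LM Studio's Developer/Server tab first, then retry.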

  • @rakibislam6918
    @rakibislam6918 A month ago

    How do I run it a second time?

    • @TheLocalLab
      @TheLocalLab  A month ago

      Activate the conda environment and run "open-webui serve". That's it.
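The restart described in this reply amounts to two commands. The environment name `openwebui` is an assumption; substitute whatever name you chose during setup:

```shell
# Re-activate the Miniconda environment created during the initial install
# (the name "openwebui" is an assumption, not from the video)
conda activate openwebui

# Start the Open WebUI server again
open-webui serve
```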

  • @setyoufree2726
    @setyoufree2726 17 days ago

    I don't get it. Why do we need Open WebUI if the prompt chat can be done directly in LM Studio?

    • @JohnSmith-iv5zy
      @JohnSmith-iv5zy 17 days ago

      Personally, I was looking for this exact video because I wanted to be able to get LLMs at whatever quality (/quant) and size I want, know whether my machine can run them or not, and download them. Besides, I have Ollama connected to it, so what's wrong with having two quality sources to pull from? I can test the model in LM Studio; if I like it, I move it to Open WebUI and tweak it to what I need it to be. One of the many reasons, lol.

    • @hedonicas
      @hedonicas 15 days ago

      @@JohnSmith-iv5zy It may be a good fit for the open-source models, but for the OpenAI API it's meaningless. You can add the API key directly instead of getting a key via LM Studio.