2. Installing Ollama

  • Published 26 Nov 2024

COMMENTS • 26

  • @fabriai 3 months ago +1

    Matt, thanks for the course. So far so good.

  • @ReidKimball 3 months ago

    Excited for the next videos! I'd like to start developing my own tools and small apps using LLMs. I'm technical and have made small python scripts, but not an experienced software engineer.

  • @usiala 2 months ago

    Great Series. Thanks.

  • @zwelimdlalose1059 3 months ago

    Thank you Matt, I've been struggling to run Ollama for a minute now; you just showed me what I was doing wrong 😅

  • @solyarisoftware 3 months ago +1

    Thanks, Matt. I'm upvoting all your videos. Is there a way to know if Ollama is using a GPU in the background? Perhaps a command-line command? More generally, a session that dives deeper into GPU usage would be great.

    • @solyarisoftware 3 months ago

      Oh sorry, you already answered a similar question in the first comment.

  • @ambarrlite 3 months ago

    lol, I was thinking of moving the .ollama folder to another drive and using a symbolic link, but now I will wait for those environment instructions. :)
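
    The relocation this comment describes can also be done with Ollama's OLLAMA_MODELS environment variable, which tells the server where to keep its model store. A minimal sketch; the target path is an example, not a required location:

    ```shell
    # Point Ollama's model store at a bigger drive instead of the default
    # ~/.ollama/models. The path below is an example; adjust to your system.
    NEW_STORE="$HOME/bigdrive/ollama-models"
    mkdir -p "$NEW_STORE"
    export OLLAMA_MODELS="$NEW_STORE"   # the server reads this on startup

    # The symlink approach from the comment also works once the server is
    # stopped:
    #   mv ~/.ollama/models "$NEW_STORE" && ln -s "$NEW_STORE" ~/.ollama/models
    echo "$OLLAMA_MODELS"
    ```

    Restart the Ollama server after setting the variable so it picks up the new location.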

  • @incogveto 3 months ago +1

    I appreciate your videos 🦾. Running Ollama in Docker Desktop on Windows 10 using WSL 2 Ubuntu integration. I have a single 4090 and it smokes with 8b fp16 models. Running Ollama directly on Windows was terribly slow even with 8b q4 models. It took forever to load models into VRAM 🫵🤓👍

    • @technovangelist 3 months ago +1

      You have multiple levels of abstraction there. It's always going to be best to run Ollama directly on Windows rather than in Docker in an Ubuntu container on the WSL VM. If it was slower, there must have been an issue with the drivers. I would solve that first.

    • @incogveto 3 months ago

      @technovangelist I reinstalled Ollama on Windows and it's just as fast, if not faster. Must have had an old driver, idk... Thanks for the info Matt!

    • @technovangelist 3 months ago

      Nice. Thanks for letting me know it's all good.

  • @thibdub2752 3 months ago +2

    Is it possible to have a video on Paperspace and Brev?

    • @technovangelist 3 months ago +1

      That’s a great idea. I hadn't considered doing it before, but I probably should.

  • @kellysnodgrass2236 1 month ago

    I am building an ollama environment for my small business. I am way more familiar with Windows than Linux. Should I install ollama on Windows or Linux? If Linux, I plan to use WSL. I come from a development background so I'm sure I can get around in Linux. Any recommendations? Thanks for sharing your knowledge with us!

  • @claudioguendelman 3 months ago

    Thanks so much! I want to add the AI to a PHP system.

  • @alanrussell6678 3 months ago +1

    How do I know whether Ollama is seeing/using my GPU?

    • @znzbest2004 3 months ago +1

      Run ollama ps in the terminal. There is a "PROCESSOR" column which shows CPU or GPU.

    • @technovangelist 3 months ago

      After asking a question, you can run ollama ps to see if Ollama used the GPU and how much of the model could be offloaded to it.
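
      The PROCESSOR column mentioned above can also be pulled out programmatically. The sample output below is illustrative (the model ID is made up); in practice you would pipe the live command instead:

      ```shell
      # Extract the PROCESSOR column from `ollama ps` output with awk.
      # A captured sample stands in for the live command here; for real use:
      #   ollama ps | awk 'NR>1 {print $5, $6}'
      sample='NAME       ID            SIZE    PROCESSOR    UNTIL
      llama3:8b  365c0bd3c000  5.4 GB  100% GPU     4 minutes from now'
      proc=$(printf '%s\n' "$sample" | awk 'NR==2 {print $5, $6}')
      echo "$proc"   # → 100% GPU
      ```

      "100% GPU" means the whole model fit in VRAM; a split such as "48%/52% CPU/GPU" means part of the model was offloaded to system RAM.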

  • @Francotujk 3 months ago

    Hi!
    Do you know how to “package” Ollama and an LLM inside one app (for example an Electron/React app)?
    So the end user doesn't need to install Ollama or the LLM and doesn't need to use the terminal: just download the Electron app and start using the LLM.

    • @technovangelist 3 months ago

      I don't, but Bruce on the Ollama team did it with chatd. Well, he bundled Ollama, not a model. You can find the source on GitHub.

    • @technovangelist 3 months ago

      Here is a link to a message about this in the Ollama Discord: discord.com/channels/1128867683291627614/1261421971615252480

    • @Francotujk 3 months ago

      @technovangelist Thank you so much!! I will take a look at it!
      I really appreciate it! 👍🏻
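
      The chatd approach mentioned above boils down to shipping the ollama binary inside the app bundle, starting it on a private port, and pointing the UI at the local API. A rough launcher sketch under those assumptions; the paths and port are hypothetical and it will not run without a bundled binary:

      ```shell
      # Hypothetical launcher an app bundle might ship. Paths and port are
      # illustrative; the ollama binary would be packaged in the app's
      # resources rather than installed system-wide.
      APP_DIR="$(cd "$(dirname "$0")" && pwd)"
      export OLLAMA_HOST=127.0.0.1:11435   # private port, no system install
      "$APP_DIR/resources/ollama" serve &  # start the bundled server

      # Wait until the API answers before launching the UI.
      until curl -sf "http://127.0.0.1:11435/" >/dev/null; do sleep 0.2; done
      ```

      Using a non-default port keeps the bundled server from colliding with any Ollama instance the user may already have installed.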

  • @Leon-AlexisSauer 2 months ago

    Is it possible to get a graphical UI?

    • @technovangelist 2 months ago

      Sure, there are a lot of choices on the Ollama repo.
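
      One popular community front-end from that list is Open WebUI. A typical way to run it, assuming Docker is installed and Ollama is serving on its default port 11434:

      ```shell
      # Run the Open WebUI front-end in Docker, pointed at the host's Ollama.
      # Assumes Docker is available and Ollama is listening on port 11434.
      docker run -d -p 3000:8080 \
        --add-host=host.docker.internal:host-gateway \
        -v open-webui:/app/backend/data \
        --name open-webui \
        ghcr.io/open-webui/open-webui:main
      # Then open http://localhost:3000 in a browser.
      ```

      The --add-host flag lets the container reach the Ollama server running on the host machine.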