Ollama has a Python library!

  • Published 2 Oct 2024

COMMENTS • 31

  • @gerykis
    @gerykis 8 months ago +12

    You need to run the Ollama server first, otherwise you will get a connection refused error.

    • @learndatawithmark
      @learndatawithmark  8 months ago +1

      Yeah - I should have mentioned that at the start. Will do in future videos.

    • @musictube6672
      @musictube6672 7 months ago +3

      pin this message

    • @federicogentile90
      @federicogentile90 6 months ago

      Could you please tell me where I should run it? What should I type in the code?

    • @learndatawithmark
      @learndatawithmark  6 months ago

      @federicogentile90 You need to run ollama serve from a terminal. The other commands in the video you can run from a Python shell/REPL, Jupyter Notebook, or a Python script.
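
A quick way to confirm that the server is actually up before calling the library is to probe its HTTP endpoint. This is a minimal sketch, assuming Ollama's default address of http://localhost:11434 (the address and port are the library's documented defaults, but hedge accordingly if you've configured something else):

```python
# Sketch: probe the local Ollama server before using the Python library.
# Assumes the default endpoint http://localhost:11434, started via `ollama serve`.
import urllib.error
import urllib.request


def ollama_is_running(url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if a server answers at `url`, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused here means `ollama serve` is not running yet.
        return False


print(ollama_is_running())
```

If this prints False, start the server in a separate terminal with ollama serve and try again.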

  • @NextGenSellPOS
    @NextGenSellPOS 7 months ago

    Great work! I have been struggling to figure out how to use Python with language models.

  • @PreshanSumanadeera
    @PreshanSumanadeera 7 months ago +1

    Does this ollama Python library utilize the GPU?

  • @zakariadrif3744
    @zakariadrif3744 8 months ago +1

    Hello, thank you for sharing! Is it possible to run it on Colab?
    Following your tutorial to a T, I received ConnectionRefusedError: [Errno 111] Connection refused

    • @learndatawithmark
      @learndatawithmark  8 months ago

      That's something I need to figure out. Ollama has a binary that you need to install - do you know how to do that on Colab?

    • @mavert17
      @mavert17 8 months ago

      I have the same error!

    • @learndatawithmark
      @learndatawithmark  8 months ago

      @mavert17 Sorry - I should have mentioned that you also need to install Ollama via the binary - ua-cam.com/video/NFgEgqua-fg/v-deo.html
      On this Stack Overflow post they show how you can install Ollama when you're using Google Colab. I haven't tried it out yet but might be worth checking out - stackoverflow.com/questions/77697302/how-to-run-ollama-in-google-colab

    • @zakariadrif3744
      @zakariadrif3744 8 months ago

      @learndatawithmark Thanks for answering. No, I don't have any idea.

  • @forestevans5453
    @forestevans5453 3 months ago

    Very helpful, thank you. Is there any way to use my GPU to run the model using CUDA?

    • @learndatawithmark
      @learndatawithmark  3 months ago

      I think it should do that automatically - is that not what you're seeing?

  • @the-notorious-khaki
    @the-notorious-khaki 3 months ago

    It errors for me.
    Error:
    Exception has occurred: AttributeError
    partially initialized module 'ollama' has no attribute 'pull' (most likely due to a circular import)

    • @learndatawithmark
      @learndatawithmark  3 months ago

      Maybe try uninstalling the Ollama library and then re-installing?

    • @the-notorious-khaki
      @the-notorious-khaki 3 months ago

      @learndatawithmark Nah, I fixed it - I had just called my Python script Ollama.py.
      Silly mistake I made.
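
That AttributeError is the classic module-shadowing trap: a script named after the library sits earlier on sys.path than the installed package, so the import picks up your own file instead. A minimal reproduction, using a throwaway directory so it works without Ollama installed:

```python
# Reproduces the "partially initialized module" trap: a local file named
# ollama.py shadows the installed ollama package on sys.path.
import importlib
import pathlib
import sys
import tempfile

# Create a stand-in ollama.py in a temporary directory.
shadow_dir = tempfile.mkdtemp()
(pathlib.Path(shadow_dir) / "ollama.py").write_text("greeting = 'impostor'\n")

sys.path.insert(0, shadow_dir)            # the front of sys.path wins the import
impostor = importlib.import_module("ollama")

print(impostor.greeting)                  # the local file, not the library
print(hasattr(impostor, "pull"))          # False - hence AttributeError on ollama.pull
```

Renaming the script to anything other than ollama.py (and deleting any stale ollama.pyc/__pycache__ next to it) fixes the error.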

  • @PrashantKumar.GPT3.5
    @PrashantKumar.GPT3.5 5 months ago

    Does ollama python require ollama server to be running?

  • @hilal53426
    @hilal53426 8 months ago +1

    Beautiful!

  • @GizmoTheDev
    @GizmoTheDev 5 months ago

    Thx bro, I was struggling and then I just searched it up and it worked :D

  • @SomethingSpiritual
    @SomethingSpiritual 6 months ago

    ModuleNotFoundError: No module named 'ollama'

    • @learndatawithmark
      @learndatawithmark  6 months ago

      Did you do pip install ollama?

    • @SomethingSpiritual
      @SomethingSpiritual 6 months ago

      @learndatawithmark Yes I did, still the same issue.

    • @TheOfficialArcVortex
      @TheOfficialArcVortex 5 months ago

      @SomethingSpiritual Python cannot find the installed library. Make sure, if you are running a venv, that it is installed in the venv as well.
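
When pip install succeeds but the import still fails, pip and Python are usually pointing at different environments. A small diagnostic sketch (the module name ollama is just the example here; this works for any package):

```python
# Diagnose ModuleNotFoundError: check which interpreter is running and
# whether it can see a package, without actually importing it.
import importlib.util
import sys


def can_import(module_name: str) -> bool:
    """True if `module_name` is visible to *this* interpreter."""
    return importlib.util.find_spec(module_name) is not None


# The interpreter running this script; pip must belong to the same one.
print(sys.executable)
print(can_import("ollama"))  # False => install with: python -m pip install ollama
```

Using python -m pip install instead of bare pip install guarantees the package lands in the environment of the interpreter you are actually running.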

  • @onoff5604
    @onoff5604 8 months ago

    So exciting.

  • @mexo100
    @mexo100 8 months ago

    Can I add my own document to chat with it?

    • @learndatawithmark
      @learndatawithmark  8 months ago

      Yes we could do that as well. I guess the easiest way would be to load the text into a message that gets passed into the array of messages passed to the chat function. I showed how to do this with litellm (similar API to Ollama) over here - ua-cam.com/video/MiJQ_zlnBeo/v-deo.html
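
One way to sketch the approach described above: inline the document text into the messages list that gets passed to the chat function. The model name llama3.2 and the document string below are placeholders, and the call is guarded so the sketch degrades gracefully when the library or server is missing:

```python
# Sketch: "chat with your own document" by inlining its text into the prompt.
# The document text and model name are placeholders, not from the video.
document = "Ollama is a tool for running large language models locally."

messages = [
    {"role": "system", "content": f"Answer using only this document:\n{document}"},
    {"role": "user", "content": "What does Ollama do?"},
]

try:
    import ollama  # requires `pip install ollama` and a running `ollama serve`

    reply = ollama.chat(model="llama3.2", messages=messages)
    print(reply["message"]["content"])
except Exception as exc:  # no library / no server - show the payload instead
    print(f"(could not reach Ollama: {exc})")
    print(messages)
```

For documents longer than the model's context window you would need chunking and retrieval rather than inlining, but for short texts this pattern is enough.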