Expert Guide: Installing Ollama LLM with GPU on AWS in Just 10 Mins

  • Published 28 Feb 2024
  • Learn how to install Ollama LLM with GPU on AWS in just 10 minutes! Follow this expert guide to set up a powerful virtual private LLM server for fast and efficient deep learning. Unlock the full potential of your AI projects with Ollama and AWS.
    #ai #llm #gpu

COMMENTS • 19

  • @fastandsimpledevelopment
    @fastandsimpledevelopment  4 months ago +3

    Need a heavy GPU machine? Check out this video on setting up an AWS EC2 GPU instance. If you like this one, check out my video on setting up a full RAG API with Llama3, Ollama, LangChain and ChromaDB - ua-cam.com/video/7VAs22LC7WE/v-deo.html

  • @christague2084
    @christague2084 4 months ago +1

    Cannot wait for part two with LangChain! This video was fantastic

  • @ExpertKNowledgeGroup
    @ExpertKNowledgeGroup 4 months ago

    What a simple way to set up Ollama LLM with GPU support in only a few minutes, thanks!

  • @bingbingxv
    @bingbingxv 4 months ago

    Thank you so much! Your video helps me a lot. I am looking forward to your new video.

  • @123arskas
    @123arskas 4 months ago

    Thank you. This was helpful

  • @danilchurko2882
    @danilchurko2882 2 months ago +1

    Thanks a lot, man! Great video!

  • @hebertgodoy5039
    @hebertgodoy5039 a month ago

    Excellent. Thank you very much for sharing.

  • @ctoxyz
    @ctoxyz 2 months ago

    good vid!

  • @yashshinde8185
    @yashshinde8185 4 days ago

    The video was awesome and pretty helpful, but can you cover the security point of view too? Anyone with the IP and port number can access it, so how can we avoid that?
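
    One common way to lock this down (a sketch, not something shown in the video): restrict the instance's security group so only your own IP can reach Ollama's default port, 11434. The group ID and IP below are hypothetical placeholders.

```shell
# Hypothetical values -- replace with your own security-group ID and public IP.
GROUP_ID="sg-0123456789abcdef0"
MY_IP="203.0.113.7"
OLLAMA_PORT=11434   # Ollama's default listen port

# Allow only this one IP to reach the Ollama port; all other traffic is
# blocked because AWS security groups deny inbound by default.
aws ec2 authorize-security-group-ingress \
  --group-id "$GROUP_ID" \
  --protocol tcp \
  --port "$OLLAMA_PORT" \
  --cidr "$MY_IP/32"
```

    An alternative is to leave the port closed entirely and reach Ollama over an SSH tunnel instead.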

  • @paulluka7594
    @paulluka7594 2 months ago +1

    Thanks a lot for the video!!
    Question: is it possible to start the instance only when a request hits the server? It could be useful for limiting costs.
    I think it's feasible with Kubernetes and Docker, but I would enjoy a video about it :)!
    Thanks again, very good video
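
    A fully request-triggered setup would need something in front of the instance (e.g. a small proxy or a Lambda) to wake it up, which the video doesn't cover. The start/stop part itself, though, can be scripted with the AWS CLI. A minimal sketch, with a hypothetical instance ID:

```shell
# Hypothetical instance ID -- replace with your own.
INSTANCE_ID="i-0123456789abcdef0"

# Start the GPU instance on demand and block until it is actually running.
aws ec2 start-instances --instance-ids "$INSTANCE_ID"
aws ec2 wait instance-running --instance-ids "$INSTANCE_ID"

# ... run your inference workload ...

# Stop it again when done, so you stop paying for idle GPU time.
aws ec2 stop-instances --instance-ids "$INSTANCE_ID"
```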

  • @ferasalawadi4273
    @ferasalawadi4273 3 months ago

    thanks buddy

  • @sachin1250
    @sachin1250 2 months ago

    How do you add Open WebUI to it, and expose Open WebUI so it's accessible from a MacBook browser?
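
    Not covered in the video, but one common approach is to run Open WebUI in Docker on the same instance and point it at the local Ollama. A sketch, assuming Docker is installed and port 3000 is opened in the security group:

```shell
# Run Open WebUI and point it at the Ollama server on the host.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

    After that, browsing to http://&lt;instance-public-ip&gt;:3000 from the MacBook should show the Open WebUI login page (assuming the security group allows your IP on port 3000).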

  • @Gerald-iz7mv
    @Gerald-iz7mv 3 months ago +1

    Can you also use the Ubuntu 22.04 image and install CUDA etc.? Why use this deep learning image?

    • @fastandsimpledevelopment
      @fastandsimpledevelopment  3 months ago

      I only selected this AMI since it already has the other code I need, like Python

    • @Gerald-iz7mv
      @Gerald-iz7mv 3 months ago

      @@fastandsimpledevelopment If I understand correctly, you can select the base Ubuntu 22.04 image and install everything yourself: NVIDIA driver, CUDA driver, TensorFlow, Python, etc.?
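
    For reference, a rough sketch of that do-it-yourself path on a stock Ubuntu 22.04 GPU instance (driver and toolkit versions vary by instance type; the deep learning AMI simply ships these preinstalled):

```shell
# Assumes an EC2 instance type with an NVIDIA GPU (e.g. g4dn/g5).
sudo apt-get update
sudo apt-get install -y ubuntu-drivers-common
sudo ubuntu-drivers autoinstall          # picks a matching NVIDIA driver
sudo apt-get install -y nvidia-cuda-toolkit python3 python3-pip
sudo reboot

# After the reboot, verify the GPU and driver are visible:
nvidia-smi
```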

  • @pushkarpadmnav
    @pushkarpadmnav 3 months ago +1

    How do you make it scalable?

    • @fastandsimpledevelopment
      @fastandsimpledevelopment  3 months ago

      By itself it is not; you need to add a front end like Nginx and run several Ollama servers behind it. That is the only way I am aware of today. There are new updates all the time, so keep track of Ollama releases
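
    That Nginx-in-front setup could look roughly like this (a sketch with assumed private IPs and ports, not a config from the video):

```nginx
# Two Ollama backends behind one Nginx front end.
upstream ollama_pool {
    least_conn;                  # route each request to the least-busy backend
    server 10.0.1.10:11434;      # Ollama instance 1 (hypothetical private IP)
    server 10.0.1.11:11434;      # Ollama instance 2
}

server {
    listen 80;
    location / {
        proxy_pass http://ollama_pool;
        proxy_read_timeout 300s; # model generation responses can be slow
    }
}
```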

  • @emineyoubah7418
    @emineyoubah7418 3 months ago

    Cannot wait for part two with LangChain! This video was fantastic