Now You Can Easily Host Your Ollama Models on SaladCloud for Just $0.30

  • Published Sep 7, 2024
  • In this video, let's host your Ollama models in the cloud.
    Create a chatbot for just $0.30 using Ollama, SaladCloud, and Open WebUI.
    I'll take you through it step by step; a minimal API sketch follows this description.
    Let's do this!
    Join the AI Revolution!
    #SALAD #SALADGPU #customollama #custommodels #noushermes #functioncalling #jsonstructuredoutput #AGI #openai #autogen #windows #ollama #ai #llm_selector #auto_llm_selector #localllms #github #streamlit #langchain #webui #python #llm #largelanguagemodels
    CHANNEL LINKS:
    🕵️‍♀️ Join my Patreon: / promptengineer975
    ☕ Buy me a coffee: ko-fi.com/prom...
    📞 Get on a call with me ($125) via Calendly: calendly.com/p...
    ❤️ Subscribe: / @promptengineer48
    💀 GitHub Profile: github.com/Pro...
    🔖 Twitter Profile: / prompt48
    TIMESTAMPS:
    0:00 Intro
    🎁 Subscribe to my channel: / @promptengineer48
    If you have any questions, comments or suggestions, feel free to comment below.
    🔔 Don't forget to hit the bell icon to stay updated on our latest innovations and exciting developments in the world of AI!
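
Here is the minimal sketch mentioned above: how you might query the hosted model once it is deployed. The access domain is a hypothetical placeholder (substitute the URL SaladCloud assigns to your container group), and it assumes the container exposes Ollama's standard REST API:

```python
import requests

# Hypothetical placeholder: substitute the access domain that SaladCloud
# assigns to your container group deployment.
OLLAMA_URL = "https://your-access-domain.example.com"

# Ollama's standard generate endpoint; "stream": False returns one JSON object.
resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama3",  # any model pulled inside the container
        "prompt": "Hello! Introduce yourself in one sentence.",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```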

COMMENTS • 12

  • @kaviarasana7584 • 1 month ago +1

    I can't find the Deployment URL as illustrated. Where do I check it?

  • @YashDesai95 • 4 months ago +1

    Best video

  • @AClotheswoman • 2 months ago +1

    Nice video. One question: do you know how I can set Ollama to allow multiple requests?

    • @PromptEngineer48 • 2 months ago

      Yes, check out this video:
      ua-cam.com/video/8r_8CZqt5yk/v-deo.htmlsi=TDCcO0gksibb57P_

    • @AClotheswoman • 2 months ago

      @PromptEngineer48 Yes, that works. But I don't know how I can set this on Salad.
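
For reference: Ollama controls request concurrency through environment variables, and as far as I know SaladCloud lets you set environment variables on a container group in the deployment configuration. A minimal sketch of launching the server with those variables set; the values are illustrative:

```python
import os
import subprocess

# OLLAMA_NUM_PARALLEL: simultaneous requests served per loaded model.
# OLLAMA_MAX_LOADED_MODELS: how many models may stay in memory at once.
env = os.environ.copy()
env["OLLAMA_NUM_PARALLEL"] = "4"        # illustrative value
env["OLLAMA_MAX_LOADED_MODELS"] = "2"   # illustrative value

# Start the Ollama server with the concurrency settings applied.
subprocess.run(["ollama", "serve"], env=env)
```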

  • @mulkproject687 • 4 months ago

    Bro, follow-up question. Let's say it's deployed and running. If I don't use the app, will it still charge me the per-hour rate? And second question: is the token response unlimited? No limit?

    • @PromptEngineer48 • 4 months ago

      Hi, yes, this will keep charging you the per-hour rate. If you want to be charged only when you actually use it, you need to explore serverless architecture; search for RunPod serverless.
      The response tokens are limited by the LLM. Which LLM are you using? It is not unlimited.
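
To expand on the token point: output length is bounded by the model's context window, and Ollama's generate API also accepts a num_predict option to cap the tokens produced per request. A minimal sketch; the endpoint and model name are placeholders:

```python
import requests

# Hypothetical placeholder endpoint; use your deployment's URL.
OLLAMA_URL = "http://localhost:11434"

resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama3",  # placeholder model name
        "prompt": "Summarize what this deployment does.",
        "stream": False,
        # num_predict caps how many tokens the model may generate for
        # this request; the context window remains the hard limit.
        "options": {"num_predict": 128},
    },
    timeout=120,
)
print(resp.json()["response"])
```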

  • @josephj6802 • 2 months ago +1

    Just $0.394... per hour 😏 = $283.68 a month?
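
The arithmetic checks out for round-the-clock usage; since billing is per running hour, actual cost scales with how long the container group stays up. A quick check:

```python
# Back-of-the-envelope: hourly rate quoted above, assuming 24/7 uptime.
hourly_rate = 0.394            # USD per hour
hours_per_month = 24 * 30      # 720 hours in a 30-day month
print(f"${hourly_rate * hours_per_month:.2f} per month")  # -> $283.68 per month
```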