Local LLM with Ollama, LLAMA3 and LM Studio // Private AI Server

  • Published 23 Jan 2025

COMMENTS • 20

  • @kenmurphy4259 · 7 months ago

    Thanks Brandon, nice review of what’s out there for local LLMs

  • @SteheveRodriguez · 7 months ago

    It's a great idea, thanks Brandon. I will test it on my homelab.

  • @fermatdad · 7 months ago

    Thank you for the helpful tutorial.

  • @trucpham9772 · 7 months ago

    How do I run llama3 with Ollama on macOS? I want to expose localhost publicly so I can use it with nextchatgpt. Can you share the commands for this setup?
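
    One minimal way to do that on macOS, sketched under the assumption of a Homebrew install of Ollama and its default port 11434:

        # Install Ollama and pull the model
        brew install ollama
        ollama pull llama3

        # Bind the server to all interfaces instead of 127.0.0.1 so other
        # devices and web chat clients on the network can reach it
        OLLAMA_HOST=0.0.0.0 ollama serve

        # Quick sanity check from another terminal
        curl http://localhost:11434/api/generate -d '{"model":"llama3","prompt":"hello"}'

    In the chat client you would then point its custom API endpoint at http://<your-mac-ip>:11434; the exact setting name depends on the client.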

  • @mjes911 · 6 months ago

    How many concurrent users can this support for business cases?
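
    There is no single number; it depends on model size, VRAM, and context length. A rough sketch of the parallelism knobs Ollama exposes (the values here are illustrative, not recommendations):

        # Let each loaded model serve up to 4 requests in parallel,
        # and keep up to 2 models resident in memory at once
        export OLLAMA_NUM_PARALLEL=4
        export OLLAMA_MAX_LOADED_MODELS=2
        ollama serve

    Beyond a handful of simultaneous users, throughput is mostly bounded by the GPU, so load testing with your own prompts is the only reliable answer.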

  • @kderectorful · 6 months ago

    I am accessing the OpenAI-compatible server via a Mac, and my guess is that the netsh command applies to the Windows workstation you are accessing the server from. Is there a similar command that would need to be run, or if I do this on my Linux server via Firefox, will I still have the same issue? I cannot seem to get llama3:latest installed for Open WebUI. Any insight would be greatly appreciated, as this was the most concise video I have seen on the topic.
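
    For what it's worth, the netsh step only applies when Ollama runs inside WSL on a Windows host; it forwards a Windows port to the WSL network. A sketch of both cases, with placeholder addresses:

        # Windows host (elevated prompt): forward port 11434 into WSL;
        # get the WSL address with `wsl hostname -I`
        netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=11434 connectaddress=172.20.48.1 connectport=11434

        # Plain Linux server: no netsh equivalent is needed, just bind
        # Ollama to all interfaces and open the firewall port
        OLLAMA_HOST=0.0.0.0 ollama serve
        sudo ufw allow 11434/tcp

    For the model itself, `ollama pull llama3:latest` on the server should make it appear in Open WebUI's model list.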

  • @LibyaAi · 7 months ago

    Nobody has explained how to install Ollama and run it the proper way; it should be step by step. Is Docker required before installing Ollama? I tried to install Ollama on its own and it didn't install completely!! I don't know why.

    • @kironlau · 7 months ago · +1

      1. You should mention what your OS is.
      2. Read the official documentation.
      3. If you run it on Windows, just download the exe/msi installer and install it with one click (and click yes...). A Linux sketch is below.
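
      A minimal sketch for Linux, assuming the official install script; Docker is not required for Ollama itself and only comes into play if you want the Open WebUI frontend:

          # Install Ollama directly (no Docker needed for this step)
          curl -fsSL https://ollama.com/install.sh | sh

          # Verify the install and pull/run a model
          ollama --version
          ollama run llama3

      If the install still fails, the error message printed by the script is the thing to post.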

  • @totallyperfectworld · 11 days ago

    How many CUDA cores do you need to run this without getting frustrated? I know, the more the better. But what really makes sense? Just trying to find out what graphics card I should get without busting my bank account…

  • @romayojr · 7 months ago · +1

    This is awesome and I can't wait to try it. Is there a mobile app for Open WebUI?

    • @jjaard · 6 months ago · +1

      I suspect technically it can easily run via any browser
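
      A sketch assuming the documented Docker deployment of Open WebUI with its default port mapping; the LAN address is a placeholder:

          # Run Open WebUI, published on port 3000 of the host
          docker run -d -p 3000:8080 \
            --add-host=host.docker.internal:host-gateway \
            -v open-webui:/app/backend/data \
            --name open-webui --restart always \
            ghcr.io/open-webui/open-webui:main

          # Then open it from a phone browser on the same LAN, e.g.
          #   http://192.168.1.50:3000

      There is no dedicated native app as far as I know, but the site can be added to the phone's home screen as a web app.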

  • @TheTricou · 4 months ago

    so this "guide" is missing some key things like how to change the ip for wsl then how to run ollama like a service. even in his written guide is not telling on how to do this.

  • @thiesenf · 5 months ago

    GPT4All is another good locally running chat interface... it can run on both the CPU and the GPU (using Vulkan)...

  • @SyamsQbattar · 6 months ago

    Is LM Studio better than Ollama?

    • @camsand6109 · 6 months ago

      No, but it's a good option.

    • @SyamsQbattar · 6 months ago

      @camsand6109 Then Ollama is better?
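
      They overlap a lot: LM Studio is GUI-first with a built-in local server, Ollama is CLI-first and easier to script. Both expose an OpenAI-compatible HTTP API, so client code barely changes between them; a sketch assuming the default ports (1234 for LM Studio's server, 11434 for Ollama):

          # LM Studio's local server (started from its server tab);
          # the model name is a placeholder for whichever model is loaded
          curl http://localhost:1234/v1/chat/completions \
            -H "Content-Type: application/json" \
            -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello"}]}'

          # Ollama's OpenAI-compatible endpoint
          curl http://localhost:11434/v1/chat/completions \
            -H "Content-Type: application/json" \
            -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'

      "Better" mostly comes down to whether you prefer a desktop app or a command-line tool.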

  • @nobody-P · 7 months ago

    😮 I'm gonna try this now

  • @klovvin · 5 months ago

    This would be better content if done by an AI

    • @thiesenf · 5 months ago

      At least we got the usual extremely boring stock videos as B-roll... *sigh*...