LM Studio - User-friendly way to run language models locally

  • Published Nov 14, 2024

COMMENTS • 22

  • @megaaziib
    @megaaziib 3 months ago +5

    I'm using this too because it has a more user-friendly interface.

  • @snintendog
    @snintendog 3 months ago +8

    They did just update from OpenCL to Vulkan, so NICE.

    • @Six-mg5vs
      @Six-mg5vs 3 months ago +4

      I'm so happy about this update. I have an AMD card unsupported by ROCm, but I was able to get models running on my GPU thanks to Vulkan.

  • @ИчовтвВьвтвььв
    @ИчовтвВьвтвььв 3 months ago +10

    Hello, I have an important question. Supposedly video cards degrade after long-term work with LLMs, because the card sits idle at first and then, when a request comes in from the user, is instantly loaded to 100%, which degrades the video chip. Is it possible to somehow keep the video card at a constant 70-100% load? A steady 70-100% would simply be gentler on a video card than jumping from 0% to 100% in an instant. Thanks for your reply (this comment was written with the help of a translator).

    • @stop_tryharding
      @stop_tryharding 3 months ago +10

      This is not true. GPU chips aren't machinery; they don't degrade from load cycling. You are looking for a solution to a problem that doesn't exist. If you're concerned about *thermal cycling*, which could in theory wear out the GPU fan and cause the card to overheat when the fan fails, then set a power limit on the card to keep temps down (see the sketch after this thread).

    • @ИчовтвВьвтвььв
      @ИчовтвВьвтвььв 3 months ago

      @@stop_tryharding Thanks for the information. I appreciate it.
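
For reference on that last suggestion: a minimal sketch, assuming an NVIDIA card, of capping GPU power by shelling out to nvidia-smi from Python (needs admin rights; the 200 W figure is just an example value, and `nvidia-smi -q -d POWER` shows the range your card accepts):

```python
import subprocess

def set_gpu_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Cap the GPU's power draw so temperatures stay lower under bursty LLM loads."""
    # nvidia-smi ships with the NVIDIA driver; -pl sets the power limit in watts.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_gpu_power_limit(200)  # example value; pick one valid for your card
```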

  • @ElaraArale
    @ElaraArale 3 months ago +1

    yeah THE GOAT

  • @CursedStorybookTales
    @CursedStorybookTales 3 months ago +2

    Brother, can you make a tutorial about using LM Studio to do screen sharing with a character? With LLaVA or some LLM we can run offline on my rig? I can use LLaVA inside LM Studio, but I can't connect it to SillyTavern.
    I didn't have any problem connecting LM Studio to VPet via the API.
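
On the API point: LM Studio's local server exposes an OpenAI-compatible endpoint (port 1234 by default), so any frontend that speaks Chat Completions can point at it. A minimal sketch with the openai Python client; "local-model" is a placeholder, since LM Studio answers with whichever model is loaded:

```python
from openai import OpenAI

# LM Studio's local server speaks the OpenAI API; the key can be any string.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio uses the currently loaded model
    messages=[{"role": "user", "content": "Hello! Stay in character."}],
)
print(response.choices[0].message.content)
```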

  • @GothxRegi
    @GothxRegi 3 months ago

    Thanks! This looks pretty interesting! Will you cover Jan AI sometime as well?

  • @AnotherDnBFanhere
    @AnotherDnBFanhere 3 months ago +2

    Wowie, looks good 👍

  • @VongolaChouko
    @VongolaChouko 1 month ago

    Can we use LM Studio as the backend for vision-capable models and then connect it to SillyTavern? I don't understand how to set it in the image captioning extension, since it's not in the options.
    Do we need to use an OpenAI-compatible API and set it as Chat Completions in SillyTavern? Is that possible with LM Studio?
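
To make the Chat Completions route concrete: a sketch, assuming LM Studio's default port and a vision model (e.g. a LLaVA build) loaded, that sends an image in the standard Chat Completions image format (the file name is hypothetical):

```python
import base64
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# Encode a local image as a data URL, the format Chat Completions expects.
with open("screenshot.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the vision model loaded in LM Studio
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe what is in this image."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```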

  • @vmen1436
    @vmen1436 3 months ago +2

    Ooba with a cooler UI. But if you have a potato PC, you have a potato PC XD
    I use Infermatic for online LLMs.

  • @eugene-bright
    @eugene-bright 3 months ago +2

    For Linux users, ollama can be preferable.

  • @nekotoru
    @nekotoru 3 months ago

    Cool, but is it possible to connect ollama and ST?
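
For what it's worth, ollama serves a local HTTP API on port 11434 by default, which frontends like SillyTavern can target. A minimal sketch with the requests library; "llama3" is a placeholder for whatever model you've pulled:

```python
import requests

# ollama's local server listens on port 11434 by default.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",  # placeholder; any model pulled with `ollama pull`
        "messages": [{"role": "user", "content": "Hi there!"}],
        "stream": False,    # return one JSON object instead of a stream
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```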

  • @madnessteapot
    @madnessteapot 3 months ago

    Please tell me, what is the best model for RP that I can run locally on my 3070 Ti?

    • @MustacheAI
      @MustacheAI  3 months ago +1

      ua-cam.com/video/uFcEMfYbh9c/v-deo.html

  • @Noeminya
    @Noeminya 3 months ago

    What are the components of your PC?

  • @NoidoDev
    @NoidoDev 3 months ago

    But not open source?

    • @snintendog
      @snintendog 3 months ago +4

      It's literally on GitHub...

    • @NoidoDev
      @NoidoDev 3 months ago

      @@snintendog
      Oh, okay.

    • @strkn25
      @strkn25 3 months ago +3

      @@snintendog No, it's not; only the prompt templates are open source. The app itself is closed.

    • @snintendog
      @snintendog 3 months ago

      @@strkn25 The app is just a Python wrapper... right, can't talk to people who won't do basic research.