STOP using LLM APIs incorrectly - use this instead!

  • Published Feb 8, 2025
  • I talk about the problems that arise when connecting to multiple LLM (large language model) providers directly. I explore LiteLLM, a new piece of open source software designed to solve the problems associated with connecting to multiple AI providers. LiteLLM handles cost tracking, logging and more! I install LiteLLM on my laptop using Docker Compose and give my reaction to the UI and main features of the software.
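
To make the "connecting to multiple providers directly" problem concrete, here is a minimal sketch (not from the video) of what direct integration tends to look like in TypeScript: each provider ships its own SDK, its own auth, and its own response shape, which is the duplication LiteLLM is meant to absorb. The SDK calls are real, but the model names and the idea of wiring both into one function are illustrative assumptions.

// Illustrative sketch only: calling two providers directly means two SDKs,
// two API keys, and two different response shapes to unpack.
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

async function askBoth(question: string) {
  // OpenAI response shape: choices[0].message.content
  const gpt = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: question }],
  });

  // Anthropic response shape: an array of content blocks, and max_tokens is required
  const claude = await anthropic.messages.create({
    model: "claude-3-5-sonnet-20241022",
    max_tokens: 512,
    messages: [{ role: "user", content: question }],
  });

  return {
    gpt: gpt.choices[0].message.content,
    claude: claude.content[0].type === "text" ? claude.content[0].text : "",
  };
}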

COMMENTS • 6

  • @BenjaminMaggi • 2 days ago • +1

    This is awesome, exactly what I need to make my agents more professional!

  • @Baker-zn1xh • 5 days ago • +1

    AC/DC once said “lock up your daughters, lock up your wives.” This quote comes to mind every time a Tommy Codes vid drops

  • @kotakcloud • 2 days ago • +1

    Thanks buddy. Seems like it only comes with Python; do we have an alternative like this for Node as well?

    • @tommy_codes_5 • 1 day ago • +1

      I'm not aware of a NodeJS library that covers all of the litellm client library's functionality, but if you run the litellm proxy server you can just use the openai NodeJS client (a sketch of that setup follows this thread).

    • @kotakcloud • 1 day ago

      @tommy_codes_5 Great, I'll explore this option soon. And if I come across such a library, I'll share it here for sure. Thanks 🙏
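
As a follow-up to the reply above, here is a minimal sketch of calling a locally running LiteLLM proxy from Node with the official openai npm package. The base URL, port 4000, the placeholder key, and the model alias are assumptions about a typical local Docker Compose setup, not details confirmed in the video; the point is only that the proxy speaks the OpenAI API, so the standard client works unchanged.

// Minimal sketch, assuming a LiteLLM proxy is running locally on port 4000
// and "sk-1234" is whatever proxy/virtual key you configured for it.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:4000", // point the client at the LiteLLM proxy instead of api.openai.com
  apiKey: "sk-1234",                // proxy key, not a real OpenAI key
});

async function main() {
  const response = await client.chat.completions.create({
    model: "gpt-4o-mini", // must match a model name defined in your proxy config
    messages: [{ role: "user", content: "Say hello through the LiteLLM proxy." }],
  });
  console.log(response.choices[0].message.content);
}

main().catch(console.error);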