STOP using LLM APIs incorrectly - use this instead!
- Published 8 Feb 2025
- I talk about the problems that arise when connecting to multiple LLM (large language model) providers directly, and explore LiteLLM, a new piece of open-source software designed to solve them. LiteLLM handles cost tracking, logging, and more! I install LiteLLM on my laptop using Docker Compose and give my reaction to the UI and the main features of the software.
This is awesome, exactly what I need to make my agents more professional!
AC/DC once said “lock up your daughters, lock up your wives.” This quote comes to mind every time a Tommy Codes vid drops
@@Baker-zn1xh 🫡
Thanks buddy. Seems like it only comes with Python; do we have such an alternative for Node as well?
I'm not aware of a Node.js library that has all of the litellm client library's functionality, but if you run the litellm proxy server you can just use the openai Node.js client (see the sketch at the end of this thread).
@@tommy_codes_5 Great, I'll explore this option soon. Also, if I come across such a library, I'll share it here for sure. Thanks 🙏
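For anyone following along in Node, here is a minimal sketch of the approach described in the reply above: the official openai npm package pointed at a locally running LiteLLM proxy instead of the provider's API. The base URL and port, the virtual key, and the "gpt-4o" model alias are all assumptions; adjust them to match your own proxy configuration.

```typescript
// Minimal sketch: use the official OpenAI Node.js client against a LiteLLM proxy.
// Assumptions: the proxy is reachable at http://localhost:4000, "sk-1234" is a
// LiteLLM virtual key you created yourself, and "gpt-4o" is a model alias that
// your proxy is configured to route.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:4000", // LiteLLM proxy instead of api.openai.com
  apiKey: "sk-1234",                // LiteLLM virtual key, not a provider key
});

async function main() {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // whichever model name the proxy maps to a provider
    messages: [{ role: "user", content: "Hello from Node via LiteLLM!" }],
  });
  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```

Because the proxy speaks the OpenAI-compatible API, no LiteLLM-specific SDK is needed on the Node side; cost tracking and logging happen on the proxy.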