How To Run Open Canvas Locally

  • Published Dec 17, 2024

COMMENTS •

  • @user-wr4yl7tx3w • 17 minutes ago

    Why do we need LangGraph if it doesn't involve agents?

  • @Truizify • 15 hours ago +1

    What if you're running the LLM locally as well? Is that supported?

    • @LangChain • 14 hours ago +1

      This is possible, but it'll require forking the repo and updating the code to support calling a local model provider (e.g. Ollama). See this section on how to add support for more LLM providers: github.com/langchain-ai/open-canvas?tab=readme-ov-file#troubleshooting

    • @vaidphysics • 9 hours ago

      @LangChain That seems like a very complicated way of doing something that should be as simple as specifying the LLM endpoint, which could be either local or remote.
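
      The fork-and-edit approach described in the reply above might look roughly like this sketch, which uses the `@langchain/ollama` package. The model name and the exact place in the Open Canvas code where the chat model is constructed are assumptions, not taken from the repo:

      ```typescript
      // Sketch only: pointing a LangChain chat model at a local Ollama server.
      // Assumes Ollama is running locally (default endpoint http://localhost:11434)
      // and that a model has been pulled beforehand, e.g. `ollama pull llama3.1`.
      import { ChatOllama } from "@langchain/ollama";

      const model = new ChatOllama({
        model: "llama3.1",                 // hypothetical local model name
        baseUrl: "http://localhost:11434", // Ollama's default local endpoint
        temperature: 0,
      });

      // Wherever the fork constructs its hosted chat model
      // (OpenAI, Anthropic, etc.), substitute `model` instead.
      ```

      Because LangChain chat models share a common interface, the rest of the graph code should not need to change once the provider is swapped.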

  • @Texa8 • 10 hours ago +3

    Not a big fan of LangSmith. Too convoluted, and I'm not sure it adds any value over open-source offerings.