Why do we need LangGraph if it doesn't involve agents?
What if you're running the LLM locally as well, is that supported?
This is possible; however, it'll require forking the repo and updating the code to support calling a local model provider (e.g., Ollama). See this section on how to add support for more LLM providers: github.com/langchain-ai/open-canvas?tab=readme-ov-file#troubleshooting
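For reference, a minimal sketch of what that fork change might look like, assuming the swap happens wherever the hosted chat model is initialized. It uses `ChatOllama` from `@langchain/ollama`, which implements the same chat-model interface, so downstream code can stay unchanged; the model name and endpoint below are placeholders:

```typescript
import { ChatOllama } from "@langchain/ollama";

// Hypothetical replacement for the hosted-model initialization.
const model = new ChatOllama({
  model: "llama3",                   // any model pulled via `ollama pull`
  baseUrl: "http://localhost:11434", // default Ollama endpoint
  temperature: 0,
});

const res = await model.invoke("Hello from a local model");
console.log(res.content);
```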
@LangChain that seems like a very complicated way of doing something that should be as simple as specifying the LLM endpoint, which could be either local or remote.
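Something close to what this commenter describes can work without touching provider-specific code when the local server speaks the OpenAI wire format (Ollama exposes one at `/v1`). A minimal sketch, assuming an existing `ChatOpenAI` initialization is pointed at a configurable base URL; the env var names here are placeholders, not anything open-canvas reads today:

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical: one env var decides whether the endpoint is local or remote.
const model = new ChatOpenAI({
  model: process.env.LLM_MODEL ?? "llama3",
  apiKey: process.env.OPENAI_API_KEY ?? "ollama", // local servers typically ignore the key
  configuration: {
    // Defaults to Ollama's OpenAI-compatible endpoint; unset it to use api.openai.com.
    baseURL: process.env.LLM_BASE_URL ?? "http://localhost:11434/v1",
  },
});
```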
Not a big fan of LangSmith. Too convoluted, and I'm not sure it adds any value over open-source offerings.