Tool Calling with LangChain is awesome!

  • Published 26 Dec 2024

COMMENTS • 44

  • @vijaynadkarni
    @vijaynadkarni 2 months ago +1

    Great video! I learned a lot about the structure of LLM responses as well as the circumstances under which the model will go to the LLM with tools vs the baseline LLM. The depth of understanding that you provide regarding the various LLM and tool operations is what makes your videos stand out.

    • @codingcrashcourses8533
      @codingcrashcourses8533  2 months ago

      Thank you for that comment. I had to explain that multiple times to my coworkers, so at some point it became second nature to explain it 😀

  • @Shashikantzz
    @Shashikantzz 7 months ago +3

    Excellent video! You made life simple for non-coders like us to actually solve complex tasks. Kudos 🎉

  • @tane_ma
    @tane_ma 7 months ago +2

    Amazing video. I was looking for this information for some time; it was hard to find a clear explanation. Thank you for the summarized info and code.

  • @rutvikjaiswal4986
    @rutvikjaiswal4986 4 months ago +1

    Oh man, I really got a masterpiece today. I've been searching for a long time and finally here it is.
    One request: please make a video on advanced RAG using LangGraph.

  • @GeorgAubele
    @GeorgAubele 6 months ago +1

    Great tutorial!
    I've got one question, though:
    At 6:02 and 6:54: does the model decide which tool to use on the basis of the docstring?
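
    The short answer appears to be yes: the @tool decorator turns a function's docstring and type hints into the tool description and argument schema that the model sees, so the docstring is what the model uses to pick a tool. A minimal sketch, with a made-up example tool (not from the video):

      from langchain_core.tools import tool

      @tool
      def get_weather(city: str) -> str:
          """Return the current weather for a given city."""  # this docstring becomes the tool description
          return f"Sunny in {city}"

      # These are the pieces that bind_tools later sends to the model:
      print(get_weather.name)         # "get_weather"
      print(get_weather.description)  # "Return the current weather for a given city."
      print(get_weather.args)         # {'city': {'title': 'City', 'type': 'string'}}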

  • @surajvardhan8490
    @surajvardhan8490 5 months ago +3

    Thanks!! You have ended 2 days of my misery :)

    • @codingcrashcourses8533
      @codingcrashcourses8533  5 months ago +1

      @@surajvardhan8490 Great! What happened? :)

    • @surajvardhan8490
      @surajvardhan8490 4 months ago +1

      @@codingcrashcourses8533 All I was worried about was that my result was an empty string when using tools, and no one anywhere actually explained how to handle it.

    • @AWhite_
      @AWhite_ 2 months ago

      @@codingcrashcourses8533 He was trying to figure out how to call tools with LLMs, like me... Thanks man.
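
    The empty-string result mentioned in this thread is expected behaviour: when the model decides to call a tool, the AIMessage usually has an empty content field and the call itself lands in the tool_calls attribute. A rough sketch, assuming a simple add tool and gpt-4o-mini as stand-ins:

      from langchain_core.tools import tool
      from langchain_openai import ChatOpenAI

      @tool
      def add(a: int, b: int) -> int:
          """Add two integers."""
          return a + b

      llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([add])
      response = llm_with_tools.invoke("What is 3 + 4?")

      print(response.content)     # often "" -- the model chose to call a tool instead of answering directly
      print(response.tool_calls)  # e.g. [{'name': 'add', 'args': {'a': 3, 'b': 4}, 'id': '...'}]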

  • @kyudechama
    @kyudechama 5 months ago +1

    Thanks for the video! I could not get my AIMessage to return multiple tool calls. It doesn't matter how many different questions I put in or how many tools I bind; the model always chooses only one tool to call in its response. Any idea why that happens?
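
    Whether more than one entry shows up in tool_calls depends on the model supporting parallel tool calling and on a single prompt actually needing several tools; binding many tools by itself is not enough. A rough sketch of how several calls can appear in one AIMessage (tools and model name are placeholders):

      from langchain_core.tools import tool
      from langchain_openai import ChatOpenAI

      @tool
      def add(a: int, b: int) -> int:
          """Add two integers."""
          return a + b

      @tool
      def multiply(a: int, b: int) -> int:
          """Multiply two integers."""
          return a * b

      llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([add, multiply])

      # A single question that clearly needs both tools may produce two tool calls:
      response = llm_with_tools.invoke("What is 3 + 4, and what is 5 * 6?")
      for tool_call in response.tool_calls:
          print(tool_call["name"], tool_call["args"])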

  • @Obinna-ai
    @Obinna-ai 3 months ago

    Great video.
    Quick question. How would you handle including a knowledge base in the application? I want the chatbot to reference a knowledge base (like Supabase) for responses, and then tools for specific functionality (like the frequency of a word in the context).
    Keep up the good work!

    • @codingcrashcourses8533
      @codingcrashcourses8533  3 months ago

      I have multiple videos on that, with LangChain and also with Agents (LangGraph) :).

  • @b18181
    @b18181 5 months ago

    Awesome video! Would you consider adding a module to discuss how to do tool calling with other LLMs (such as Llama3 70B via Groq or Mistral)?

    • @codingcrashcourses8533
      @codingcrashcourses8533  5 months ago +1

      Question upfront: does it not work with other models? LangChain normally provides a standardized interface for all models.

    • @b18181
      @b18181 5 months ago

      @@codingcrashcourses8533 - Thanks for the reply. Perhaps I was doing something incorrectly because it is working with Groq now.
      FYI your videos are probably the best I've found. Seriously great work. Thanks so much for creating this channel!

    • @codingcrashcourses8533
      @codingcrashcourses8533  5 months ago +1

      @@b18181 No worries, those questions are totally fine. That's just the biggest benefit of using LangChain: you don't have to worry about the APIs, you can just switch classes and it will (should) work ;-).
      Thank you for your kind comment.
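
    The "just switch classes" point can be sketched as follows: the same bind_tools interface works on a different chat-model class, for example ChatGroq from the langchain-groq package (the model name here is only an assumption and may need adjusting):

      from langchain_core.tools import tool
      from langchain_groq import ChatGroq  # pip install langchain-groq

      @tool
      def add(a: int, b: int) -> int:
          """Add two integers."""
          return a + b

      # Same code shape as with ChatOpenAI: only the class and model name change.
      llm_with_tools = ChatGroq(model="llama3-70b-8192").bind_tools([add])
      response = llm_with_tools.invoke("What is 19 + 23?")
      print(response.tool_calls)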

  • @Leonid.Shamis
    @Leonid.Shamis 7 months ago +1

    Thank you very much for the explanation. Does it apply only to OpenAI models (ChatOpenAI)? I tried using your code with the Ollama-powered local Llama3-8B model and it looks like the tools are not bound to the model, or there is some other issue: the response does not contain "tool_calls".

    • @codingcrashcourses8533
      @codingcrashcourses8533  7 months ago

      From the docs: many LLM providers, including Anthropic, Cohere, Google, Mistral, OpenAI, and others, support variants of a tool calling feature.
      To be honest, I don't know if Llama supports tool/function calling. I would also have to google that :)

    • @Leonid.Shamis
      @Leonid.Shamis 7 months ago

      @@codingcrashcourses8533 Thank you for your response. Meta-Llama-3-8B-Instruct is #28 on the Berkeley Function-Calling Leaderboard, but indeed it does not have the "FC" (native support for function/tool calling) indicator. I guess I'll have to try Gorilla-OpenFunctions-v2 (FC), which is Apache 2.0 licensed and ranked #5, just behind the GPT-4 models.

  • @AritraSen
    @AritraSen 7 months ago +1

    Excellent demo as usual. Just curious: is the tool_mapping dict mandatory to create, or can't we just use tool_call['name']?

    • @codingcrashcourses8533
      @codingcrashcourses8533  7 months ago

      I'll ask you: what would happen if you called tool_call['name'] without the mapping? ;-)

    • @udaasnafs
      @udaasnafs 7 months ago

      🎉🎉 excellent as always
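
    To spell out the hint above: tool_call['name'] is only a string, so something still has to map that string back to a callable tool object, which is exactly what the tool_mapping dict does. A rough sketch with a single placeholder tool:

      from langchain_core.messages import ToolMessage
      from langchain_core.tools import tool
      from langchain_openai import ChatOpenAI

      @tool
      def add(a: int, b: int) -> int:
          """Add two integers."""
          return a + b

      llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([add])
      response = llm_with_tools.invoke("What is 3 + 4?")

      # tool_call["name"] is just the string "add"; the mapping turns it into the tool object.
      tool_mapping = {"add": add}
      for tool_call in response.tool_calls:
          selected_tool = tool_mapping[tool_call["name"]]
          result = selected_tool.invoke(tool_call["args"])
          tool_message = ToolMessage(content=str(result), tool_call_id=tool_call["id"])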

  • @udaasnafs
    @udaasnafs 7 months ago +2

    excellent as always🎉

  • @GeorgAubele
    @GeorgAubele 5 months ago +1

    As far as I understand, this does not work with Ollama at the moment, does it?

    • @codingcrashcourses8533
      @codingcrashcourses8533  5 months ago

      Not sure to be honest.

    • @surajvardhan8490
      @surajvardhan8490 5 months ago +1

      In ChatOpenAI, set the base_url option and host your Ollama models behind litellm. Worked for me and it should work for you too.
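
    The workaround above relies on ChatOpenAI accepting a custom base_url, so any OpenAI-compatible endpoint (such as a litellm proxy in front of Ollama) can be used. A rough sketch; the URL, API key, and model name are placeholders for whatever the proxy exposes:

      from langchain_openai import ChatOpenAI

      # Point the OpenAI-compatible client at a local litellm proxy instead of api.openai.com.
      llm = ChatOpenAI(
          base_url="http://localhost:4000/v1",  # placeholder: wherever the litellm proxy runs
          api_key="not-needed-locally",         # the proxy may not check this
          model="ollama/llama3",                # placeholder: model name as configured in litellm
      )
      # From here on, llm.bind_tools([...]) is used exactly as with the hosted OpenAI models,
      # provided the underlying model actually supports tool calling.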

  • @olivergattermayr
    @olivergattermayr 7 months ago

    Great video.
    I'm developing something where I have a database of courses and general info, with prices, availability and bookings.
    I was trying to build a hybrid RAG pipeline with SQL and semantic search, but perhaps this could replace it altogether?
    Also, I bought a few of your courses a while ago, but I'm missing a full-fledged implementation of SQL. In your previous RAG video you have one table, but how would you implement something that is connected to a Postgres DB with dozens or hundreds of tables? Perhaps using Supabase, which is pretty newbie friendly.
    Happy to buy a course where you go into more detail about keeping DBs in sync / updated and also about working with LangSmith Evals.

    • @codingcrashcourses8533
      @codingcrashcourses8533  7 months ago

      My 2 cents on what to use:
      RAG: text data
      SQL: tabular data
      Functions/Tools: calling third-party tools/APIs

  • @FarzanaBanu-li8yo
    @FarzanaBanu-li8yo 7 months ago +1

    Can you provide the code link so we can test it for our use case?

  • @frag_it
    @frag_it 7 months ago

    How would you use it with LCEL?

    • @AWhite_
      @AWhite_ 2 months ago

      Seems like automated API calls are not implemented in LCEL yet.
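
    Binding tools composes with LCEL like any other runnable, but LCEL only produces the AIMessage; executing the returned tool calls is still a separate step (or a job for an agent). A rough sketch, with the prompt and tool made up for illustration:

      from langchain_core.prompts import ChatPromptTemplate
      from langchain_core.tools import tool
      from langchain_openai import ChatOpenAI

      @tool
      def multiply(a: int, b: int) -> int:
          """Multiply two integers."""
          return a * b

      prompt = ChatPromptTemplate.from_messages([
          ("system", "You are a helpful assistant."),
          ("human", "{input}"),
      ])

      # The model with bound tools is just another runnable in the chain.
      chain = prompt | ChatOpenAI(model="gpt-4o-mini").bind_tools([multiply])
      response = chain.invoke({"input": "What is 7 times 12?"})
      print(response.tool_calls)  # the chain returns the tool calls; running them happens afterwards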

  • @debatradas1597
    @debatradas1597 24 days ago

    Thanks

  • @Kabayel
    @Kabayel 7 months ago

    It would have been better if you had shown it with the other LLMs.

    • @codingcrashcourses8533
      @codingcrashcourses8533  7 months ago

      Why? OpenAI currently offers the best support, and LangChain provides a standardized interface for function calling.