LangChain Expression Language (LCEL) | Langchain Tutorial | Code

  • Published 25 Jan 2025

COMMENTS • 15

  • @techwithsaketh
    @techwithsaketh 3 months ago +1

    Great tutorial - keep up the good work

  • @prasad_yt
    @prasad_yt 8 months ago +1

    Nice simplified explanation ❤

  • @benepstein3970
    @benepstein3970 1 year ago +1

    Thanks, subscribed!

  • @andaldana
    @andaldana 1 year ago +1

    Great tutorial - thanks!

  • @MuhammadFaizanMumtaz3
    @MuhammadFaizanMumtaz3 1 year ago +1

    Sir! You're doing a great job

  • @KEVALKANKRECHA
    @KEVALKANKRECHA 7 months ago

    Great video!

  • @muhammedaslama9908
    @muhammedaslama9908 11 months ago

    AzureOpenAi chat model doesn't seem to support LCEL. Am I doing something wrong?
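
The Azure chat model does support LCEL: AzureChatOpenAI from langchain-openai implements the Runnable interface, so it composes with the pipe operator like any other chat model. A minimal sketch, assuming a hypothetical deployment name and the standard Azure environment variables:

```python
from langchain_openai import AzureChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Deployment name and API version are assumptions; AZURE_OPENAI_API_KEY and
# AZURE_OPENAI_ENDPOINT are read from the environment.
llm = AzureChatOpenAI(
    azure_deployment="gpt-4o-mini",
    api_version="2024-02-01",
)

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")

# AzureChatOpenAI is a Runnable, so it pipes like any other chat model.
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"text": "LCEL composes runnables with the | operator."}))
```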

  • @thumarzeel
    @thumarzeel 11 months ago

    Awesome buddy

  • @Tushii
    @Tushii 1 year ago

    Is there a way I could batch invoke a list of text files?
    I want to extract certain text from each file using OpenAI.
    Or would I have to do it one by one in a loop?

    • @FutureSmartAI
      @FutureSmartAI  1 year ago

      You can extract the full text from each file and pass the list in a batch.

    • @Tushii
      @Tushii 1 year ago

      @@FutureSmartAI cool, thanks, I shall try it out
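
A minimal sketch of the approach suggested in the reply above: read the full text of each file, build one input per file, and pass the whole list to chain.batch() instead of looping. The docs/ folder, prompt wording, and model name are assumptions:

```python
from pathlib import Path
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template(
    "Extract the key dates mentioned in the following text:\n\n{text}"
)
chain = prompt | ChatOpenAI(model="gpt-4o-mini") | StrOutputParser()

files = sorted(Path("docs").glob("*.txt"))         # hypothetical folder of text files
inputs = [{"text": f.read_text()} for f in files]  # one input dict per file

# .batch() runs the inputs concurrently instead of looping one by one.
results = chain.batch(inputs, config={"max_concurrency": 5})
for f, r in zip(files, results):
    print(f.name, "->", r)
```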

  • @mushinart
    @mushinart 10 months ago

    First of all, thank you for the amazing way you explain stuff... it's elegant. Now I have only one question: when we used the retriever to assign its value to the context variable, I assume the retriever passes all of the documents in the RAG list to the LLM, so that when it answers the question it uses all of the RAG documents as context in the prompt. Am I right? Would it be a good idea to create a chain that first takes the question to query the RAG store and get just the needed context, then passes it with the question again to get a human-understandable answer from the LLM? I'm asking to see if I'm understanding it right or not... Thanks, man
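
A minimal sketch addressing the question above: the retriever does not pass every document in the store to the LLM. It runs a similarity search on the question and returns only the top-k matching chunks, and only those are formatted into the prompt's context slot, which is essentially the two-step chain the commenter describes. The embedding model, vector store, sample texts, and k value here are assumptions:

```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

docs = [
    "LCEL composes runnables with the | operator.",
    "Retrievers return only the most relevant chunks.",
    "Batching and streaming come for free with LCEL.",
]
vectorstore = FAISS.from_texts(docs, OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 2})  # only the 2 best matches


def format_docs(retrieved_docs):
    # Join just the retrieved chunks into a single context string.
    return "\n\n".join(d.page_content for d in retrieved_docs)


prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

# The question flows to the retriever, which selects just the needed context;
# the full document set is never sent to the LLM.
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)

print(chain.invoke("How do retrievers work with LCEL?"))
```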