6-Building Advanced RAG Q&A Project With Multiple Data Sources With Langchain

  • Published 27 Dec 2024
  • Hello all, we are going to build an advanced RAG project with multiple data sources such as arxiv, Wikipedia and others. Here we will learn about agents, tools, toolkits and the agent executor; a minimal code sketch of this setup is included at the end of the description below.
    Code Github: github.com/kri...
    ---------------------------------------------------------------------------------------------
    Support me by joining the membership so that I can upload these kinds of videos
    / @krishnaik06
    -----------------------------------------------------------------------------------
    Fresh Langchain Playlist: • Fresh And Updated Lang...
    ►LLM Fine Tuning Playlist: • Steps By Step Tutorial...
    ►AWS Bedrock Playlist: • Generative AI In AWS-A...
    ►Llamindex Playlist: • Announcing LlamaIndex ...
    ►Google Gemini Playlist: • Google Is On Another L...
    ►Langchain Playlist: • Amazing Langchain Seri...
    ►Data Science Projects:
    • Now you Can Crack Any ...
    ►Learn In One Tutorials
    Statistics in 6 hours: • Complete Statistics Fo...
    End To End RAG LLM APP Using LlamaIndex And OpenAI- Indexing And Querying Multiple Pdf's
    Machine Learning In 6 Hours: • Complete Machine Learn...
    Deep Learning 5 hours : • Deep Learning Indepth ...
    ►Learn In a Week Playlist
    Statistics: • Live Day 1- Introducti...
    Machine Learning : • Announcing 7 Days Live...
    Deep Learning: • 5 Days Live Deep Learn...
    NLP : • Announcing NLP Live co...
    ---------------------------------------------------------------------------------------------------
    My Recording Gear
    Laptop: amzn.to/4886inY
    Office Desk : amzn.to/48nAWcO
    Camera: amzn.to/3vcEIHS
    Writing Pad: amzn.to/3OuXq41
    Monitor: amzn.to/3vcEIHS
    Audio Accessories: amzn.to/48nbgxD
    Audio Mic: amzn.to/48nbgxD
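    ---------------------------------------------------------------------------------------------
    A minimal sketch of the multi-source agent setup described above (an assumption of the overall wiring, not a copy of the repository code; it assumes the langchain, langchain-community, langchain-openai, wikipedia and arxiv packages are installed and OPENAI_API_KEY is set):

    from langchain_community.tools import WikipediaQueryRun, ArxivQueryRun
    from langchain_community.utilities import WikipediaAPIWrapper, ArxivAPIWrapper
    from langchain_openai import ChatOpenAI
    from langchain import hub
    from langchain.agents import create_openai_tools_agent, AgentExecutor

    # Wrap Wikipedia and arxiv as tools the agent can choose between
    wiki = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=300))
    arxiv = ArxivQueryRun(api_wrapper=ArxivAPIWrapper(top_k_results=1, doc_content_chars_max=300))
    tools = [wiki, arxiv]

    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    prompt = hub.pull("hwchase17/openai-functions-agent")  # standard agent prompt from the LangChain hub

    # The agent picks a tool per query; the executor runs the reasoning/tool loop
    agent = create_openai_tools_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

    print(agent_executor.invoke({"input": "Tell me about retrieval-augmented generation"}))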

COMMENTS •

  • @nishantchoudhary3245
    @nishantchoudhary3245 8 months ago +9

    One of the best LangChain series. Thank God I am able to find such good content.

  • @rajamailtome
    @rajamailtome 8 months ago +22

    Instead of OpenAI, please use another downloadable open-source LLM which can be run locally 😊

  • @jayashreesanthanam816
    @jayashreesanthanam816 10 hours ago

    Very nice tutorial, very helpful. Please add a note about including the LangChain API key when using prompts from the hub. I am learning a lot, thanks so much Krish.

  • @dhanashrikolekar-j7e
    @dhanashrikolekar-j7e 8 months ago +1

    Thank you sir. Great videos you are making.

  • @carlosbelleza5104
    @carlosbelleza5104 8 months ago +2

    Your videos are amazing... state of the art.

  • @athulroby3082
    @athulroby3082 5 months ago

    Simple and the best LangChain series, keep up the good work. 👏

  • @jayanthAILab
    @jayanthAILab 8 months ago +1

    Sir, understood the complete flow. Great explanation. ❤❤

  • @deepaksingh9318
    @deepaksingh9318 3 months ago

    Amazing information, Krish. Thanks for making this series.

  • @DoomsdayDatabase
    @DoomsdayDatabase 8 months ago +7

    Wow, this is what I wanted!
    Can I use the same code with an open-source model by just changing the way it loads, sir?
    I wrote an email to you yesterday and the video came today! This is next level, Krish sir! Thank you ❤!

    • @dharmikmehta5593
      @dharmikmehta5593 2 months ago

      Yes, you can use Ollama with Llama 3. I did the same thing with an open-source model; a quick sketch is below.
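
      A minimal sketch of that swap, assuming Ollama is installed and the llama3 model has been pulled locally (only the LLM line changes; the tools and agent wiring stay the same):

      from langchain_community.chat_models import ChatOllama

      # Local model served by Ollama replaces the paid ChatOpenAI instance
      llm = ChatOllama(model="llama3", temperature=0)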

  • @canyouvish
    @canyouvish 8 months ago

    Very comprehensive and super helpful!

  • @tintintintin576
    @tintintintin576 8 months ago +1

    God bless you, sir!

  • @shankarpentyala2390
    @shankarpentyala2390 6 months ago

    Thanks for introducing agents.

  • @THOSHI-cn6hg
    @THOSHI-cn6hg 8 months ago +5

    You can use Llama 2 or another open-source API instead...

  • @varindanighanshyam
    @varindanighanshyam 8 months ago

    Krish, fantastic work. Can you explain it keeping Ollama in context?

  • @Nishant-xu1ns
    @Nishant-xu1ns 8 months ago

    Waiting for the next video.

  • @hemanthram7907
    @hemanthram7907 8 months ago +1

    This video is very comprehensive and easy to understand; really grateful for your efforts, sir. However, could you please create a session on how to achieve function calling, tools and agents using Gemini Pro or any other open-source LLM? Unfortunately, there is no alternative to the OpenAI version (create_openai_tools_agent). Please explain the workaround to use other LLMs.

    • @aj.arijit
      @aj.arijit 6 months ago +2

      Exactly, I spent one full day on this and read a lot of documentation, although I gained a lot of knowledge.
      I found a function someone wrote (with Gemini) as an ad-hoc agent-formation workaround for Ollama, which supports chat generation using different tools and Ollama together at the same time:

      def process_user_request(user_input):
          # Parse user input for potential tool usage
          if "{" in user_input and "}" in user_input:
              # Extract tool name and arguments
              tool_call = user_input.split("{")[1].split("}")[0]
              tool_name, arguments = tool_call.split(":")
              arguments = eval(arguments)
              # Find the corresponding tool function
              for tool in tools:
                  if tool.__name__ == tool_name:
                      # Execute the tool with user arguments
                      tool_output = tool(arguments)
                      return tool_output
          # User request doesn't involve a tool, respond normally
          return f"I understand, but I can't use a tool for this request. {user_input}"

      while True:
          # Get user input
          user_input = input("User: ")
          # Process user request and generate response with Llama 2
          response = model.generate(
              input_ids=model.tokenizer.encode(
                  prompt.format(
                      user_input=user_input,
                      list_of_available_tools="\n* ".join([t.__name__ for t in tools]),
                  )
              ),
              max_length=1024,
              num_beams=5,
              no_repeat_ngram_size=2,
              early_stopping=True,
          )
          # Extract and format the generated response
          generated_text = model.tokenizer.decode(response[0]["generated_tokens"], skip_special_tokens=True)
          tool_output = process_user_request(user_input)
          final_response = generated_text.replace("{generated_response}", tool_output)
          # Print the final response to the user
          print(final_response)

      There is something called binding of functions which I could not understand.

    • @asadpanhwar634
      @asadpanhwar634 5 months ago

      I used create_openai_tools_agent with Llama 3 and it worked fine. I think you can use it; even though the prompt loaded from the hub is meant for an OpenAI model, it worked fine with the Llama 3 70B model.

    • @dharmikmehta5593
      @dharmikmehta5593 2 months ago

      @@aj.arijit you can use create_react_agent instead of create_openai_tools_agent to prepare the agent with Ollama and the llama3.2 model. I built the same thing with an open-source model; see the sketch below.
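
      A rough sketch of that workaround, assuming the wiki/arxiv/retriever tools from the video are already defined in a list named tools (the hub prompt name is the standard ReAct prompt, an assumption rather than something shown in the video):

      from langchain import hub
      from langchain.agents import create_react_agent, AgentExecutor
      from langchain_community.chat_models import ChatOllama

      llm = ChatOllama(model="llama3.2")
      prompt = hub.pull("hwchase17/react")  # generic ReAct prompt with {tools} and {tool_names} placeholders

      # A ReAct agent works with any chat model; no OpenAI-style tool calling is required
      agent = create_react_agent(llm, tools, prompt)
      agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)
      agent_executor.invoke({"input": "Summarise what Wikipedia says about LangChain"})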

  • @santhiyac8252
    @santhiyac8252 8 months ago +1

    Hello Krish,
    I built a PDF bot using Google PaLM 2, but it does not give proper responses because the prompt behaves too generically. How do I handle the prompt for a specific dataset? Please reply.

  • @tekionixkeshavag.452
    @tekionixkeshavag.452 8 months ago +31

    Please don't use OpenAI in this project, as its API is paid so we can't access it. Instead use Gemini or any other open-source model, so that we can also try it at our end.

    • @datatalkswithchandranshu2028
      @datatalkswithchandranshu2028 8 months ago +1

      Not Gemini, as they are removing free features quickly.

    • @tekionixkeshavag.452
      @tekionixkeshavag.452 8 months ago

      @@datatalkswithchandranshu2028 OK.

    • @quezinmark8225
      @quezinmark8225 8 months ago

      Break your data into chunks smaller than the token limit so that you can use the free version even for big data; a minimal sketch is below.
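
      A minimal sketch of that chunking idea (chunk sizes are illustrative, and docs stands for whatever documents were loaded earlier):

      from langchain.text_splitter import RecursiveCharacterTextSplitter

      # Keep each chunk comfortably below the model's token limit
      splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
      chunks = splitter.split_documents(docs)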

    • @theinhumaneme
      @theinhumaneme 8 months ago

      Switch the LLM in LangChain.

    • @tekionixkeshavag.452
      @tekionixkeshavag.452 8 months ago +1

      That is somewhat complicated and we need help with that from Krish sir @@theinhumaneme

  • @CyberSavvyMind
    @CyberSavvyMind 3 months ago

    Amazing videos, Krish! I have a question that you haven't covered yet. After getting results from the similarity search in RAG mode, you attach them to the prompt and send them to the LLM. Given the character limit when querying the LLM, what approaches do you take if this limit is exceeded? Please explain this or create a video with code on this topic.

  • @piyush_nimbokar_07
    @piyush_nimbokar_07 8 months ago +1

    Sir, can you please elaborate on using a Neo4j knowledge graph to build a RAG application?

  • @cartolla
    @cartolla 3 months ago

    Hi, very interesting video! How do I get the documents and their metadata returned by the retriever? I would like to show the user, for example, the Wikipedia links or articles related to the answer.

  • @yashthakkar2629
    @yashthakkar2629 6 months ago

    I tried adding an SQLDatabase tool to the tools list. I got an error, because I think QuerySQLDataBaseTool is not really returning a tool. What am I supposed to do if I want to add a SQL database to the list of tools without any error? Kindly help.

  • @kannansingaravelu
    @kannansingaravelu 7 months ago

    Hi Krish, if we use "create_conversational_retrieval_agent", how do we pass the prompt? Is the prompt mandatory?

  • @lalaniwerake881
    @lalaniwerake881 7 months ago

    Amazing, thank you.

  • @yerasam
    @yerasam 5 months ago

    Hi, can you make a video on multitenancy using agents and tools?

  • @AsifKhan-cc3ye
    @AsifKhan-cc3ye 8 months ago

    Hey Krish, I developed an app using RAG for QC of manually populated data in Excel, but the model is not performing accurately. I used Llama 2; is there any other better open-source LLM for this?

  • @binayashrestha4131
    @binayashrestha4131 8 months ago

    Thank you so much.

  • @AIConverge
    @AIConverge 8 months ago

    Great video. Any alternatives to LangSmith?

  • @ayonbanerjee1969
    @ayonbanerjee1969 4 months ago

    Is this updated LangChain content available in your Udemy course? Or is the Udemy course in need of updates?

  • @divya-ob3jq
    @divya-ob3jq 8 months ago

    Sir, please make a video on a virtual car assistant using LLMs.

  • @123arskas
    @123arskas 7 months ago

    You're awesome.

  • @naudua9272
    @naudua9272 2 months ago

    The PDF upload agent is not implemented correctly.

  • @ashishmalhotra2230
    @ashishmalhotra2230 8 months ago

    Hi Krish, can you make a video on a conversational chatbot trained on your own data source?

  • @muhammedyaseenkm9292
    @muhammedyaseenkm9292 8 months ago

    How can we extract multi-column tabular data, especially from images?

  • @Mabzone-q4p
    @Mabzone-q4p 5 months ago

    Hi @Krish Naik, I have seen many videos on generative AI, but I feel there are many missing connections between basic-level understanding and coding understanding. I think everyone is capable of loading the libraries, using the classes and getting the code done. Please also cover the core concepts of prompts, chat models, tools, agents, memory and chains, with their types and where to use them, through code. The basic knowledge in these videos is scattered across different parts, times and places. Hope you will find this comment.

    • @vinayaksharma3650
      @vinayaksharma3650 5 months ago

      Learn from the blog posts on their websites; from there it will be easy to understand things like agents, tools, etc.

    • @Mabzone-q4p
      @Mabzone-q4p 5 months ago

      @@vinayaksharma3650 Hi, thanks for the suggestion. I read it and tried to understand, but some of the concepts are only a high-level overview and need to be made understandable. My point in the comment above was that if someone is explaining things that are already explained in the documentation, the explanation should go beyond the document and present it in an easier way.

  • @karansingh-fk4gh
    @karansingh-fk4gh 7 months ago

    Hi Krish,
    Can you please create a video on LangGraph?

  • @mohsenghafari7652
    @mohsenghafari7652 8 months ago

    Hi dear friend,
    thank you for your efforts.
    How can I use this tutorial with PDFs in another language (for example Persian)?
    What would the approach be?
    I have made many attempts and tested different models, but the results when asking questions about the PDFs are not good or accurate!
    Thank you for the explanation.

  • @atharvsakalley9633
    @atharvsakalley9633 8 months ago

    How can we measure the accuracy of our search, with an implementation?

  • @thetagang6854
    @thetagang6854 8 months ago +1

    Great video. Ignore the comments about not using OpenAI; if they don't want to pay, they wouldn't be the ones developing actual apps anyway.

  • @arID3371ER
    @arID3371ER 7 months ago

    Man I thought you got your hair back! 😂😂😂❤❤❤

  • @DamanjeetGTBIT
    @DamanjeetGTBIT 8 months ago

    I am learning stats, SQL and ML from miscellaneous videos. I want to start with a clean course. Which one is better for pursuing a data science career: IBM Machine Learning or Google Advanced Data Analytics?

    • @slayer_dan
      @slayer_dan 6 months ago

      Find a roadmap and follow it loosely, without any big reroutes.
      Then find a person who teaches the concepts in a way you can absorb, and practice whether they recommend it or not.
      For example, I found these people very much to my taste:
      - Codebasics and Krish for ML concepts with analogies and practical implementation.
      - StatQuest for visual interpretation and understanding of statistical concepts.
      I hope this is useful for you.

  • @mvuyisogqwaru2409
    @mvuyisogqwaru2409 8 months ago

    Hey guys, is anyone else having an issue on the invoke call when using Ollama (llama2 and llama3)? I get the following error: ValueError: Ollama call failed with status code 400. Details: {"error":"invalid options: tools"}

    • @aj.arijit
      @aj.arijit 6 months ago

      Exactly, I spent one full day on this and read a lot of documentation, although I gained a lot of knowledge.
      I found a function someone wrote (with Gemini) as an ad-hoc agent-formation workaround for Ollama, which supports chat generation using different tools and Ollama together at the same time; it is the same process_user_request loop I posted in the thread above.
      There is something called binding of functions which I could not understand, but which could solve the problem using the langchain.agents function create_tool_calling_agent; see the sketch below.
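
      A hedged sketch of what that tool-binding approach might look like; it assumes a model and Ollama/LangChain versions that actually support tool calling (for example the newer langchain-ollama integration with llama3.1) and the tools list defined earlier. These names are assumptions, not something shown in the video:

      from langchain import hub
      from langchain.agents import create_tool_calling_agent, AgentExecutor
      from langchain_ollama import ChatOllama

      llm = ChatOllama(model="llama3.1")  # a local model that supports tool calling
      prompt = hub.pull("hwchase17/openai-functions-agent")  # any chat prompt with an agent_scratchpad placeholder

      # create_tool_calling_agent binds the tools to the model for you
      agent = create_tool_calling_agent(llm, tools, prompt)
      AgentExecutor(agent=agent, tools=tools, verbose=True).invoke({"input": "Search arxiv for RAG survey papers"})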

  • @omkarjamdar4076
    @omkarjamdar4076 8 months ago

    I decided not to use the retriever tool, so
    tools = [wiki, arxiv]
    and the following error occurred:
    TypeError: type 'Result' is not subscriptable

    • @mohamed_deshaune
      @mohamed_deshaune 8 months ago

      Can you provide the whole code? I can help you if you want.

    • @dear_nidhi
      @dear_nidhi 7 months ago

      This works for me, and I have also created a UI for it; it is working fine. Please send the code or more details about it... we will help you.

  • @SrinithiMalar
    @SrinithiMalar 8 months ago

    Is blockchain a good career to start in 2024, and what is its future scope?

  • @AlanSabuJohn
    @AlanSabuJohn 8 months ago

    Sir, can you do the Embedchain tutorials?

  • @rishiraj2548
    @rishiraj2548 8 months ago

    🙏🙂👍

  • @dear_nidhi
    @dear_nidhi 8 months ago +1

    Please don't use any paid API key... we can't access it.

  • @thelifehackerpro9943
    @thelifehackerpro9943 7 months ago

    The code explanation is not good; each class and component should be explained properly. It is very confusing; you are just writing the code that you have been working with.

  • @rakeshkumar-pf6yu
    @rakeshkumar-pf6yu 8 months ago

    With due respect, you are a little fast here... I don't know why you are in a hurry these days. Please be a little slower, maybe 80% of your current speed. Thanks.

  • @venky433
    @venky433 8 months ago

    Did anyone face the "orjson.orjson" module error below?
    error: ModuleNotFoundError: No module named 'orjson.orjson'
    It comes after running the "from langchain_community.tools import WikipediaQueryRun" line.

    • @venky433
      @venky433 8 months ago

      This error comes with Python 3.12, but it worked with a lower version, i.e. 3.10.