LangChain: Giving Memory to LLMs

  • Published Jul 26, 2024
  • LangChain offers a significant advantage by enabling the development of chat agents capable of managing their memory. In this video, we explore different LangChain memory types and provide guidance on integrating them into a LangChain ConversationChain.
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬ CONNECT ▬▬▬▬▬▬▬▬▬▬▬
    ☕ Buy me a Coffee: ko-fi.com/promptengineering
    |🔴 Support my work on Patreon: Patreon.com/PromptEngineering
    🦾 Discord: / discord
    ▶️️ Subscribe: www.youtube.com/@engineerprom...
    📧 Business Contact: engineerprompt@gmail.com
    💼Consulting: calendly.com/engineerprompt/c...
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    LINKS:
    LangChain Memory: python.langchain.com/en/lates...
    Google Notebook: colab.research.google.com/dri...
    ▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
    All Interesting Videos:
    Everything LangChain: • LangChain
    Everything LLM: • Large Language Models
    Everything Midjourney: • MidJourney Tutorials
    AI Image Generation: • AI Image Generation Tu...
    #langchain #openai #chatgpttutorial
  • Science & Technology
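The pattern the video describes, a conversation chain that carries its own memory, can be sketched without LangChain itself. The following is a minimal stdlib illustration: the class and method names mirror LangChain's `ConversationBufferMemory` API but are reimplemented here, and the echo "LLM" is a stand-in so the loop runs offline.

```python
class BufferMemory:
    """Minimal stdlib stand-in for LangChain's ConversationBufferMemory:
    stores every turn verbatim and replays it as a 'history' string."""

    def __init__(self):
        self.turns = []  # list of (human, ai) message pairs

    def save_context(self, inputs, outputs):
        # Mirrors memory.save_context({"input": ...}, {"output": ...})
        self.turns.append((inputs["input"], outputs["output"]))

    def load_memory_variables(self):
        history = "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)
        return {"history": history}


def conversation_step(memory, user_input, llm):
    """One turn of a ConversationChain-style loop: history + new input -> LLM."""
    history = memory.load_memory_variables()["history"]
    prompt = (history + "\n" if history else "") + f"Human: {user_input}\nAI:"
    answer = llm(prompt)
    memory.save_context({"input": user_input}, {"output": answer})
    return answer


def echo_llm(prompt):
    # Stand-in "LLM" that echoes the last human line, so the flow is runnable.
    last_human = [l for l in prompt.splitlines() if l.startswith("Human:")][-1]
    return f"(reply to:{last_human.removeprefix('Human:')})"
```

Each turn renders the accumulated history into the prompt before calling the model, which is exactly why memory grows with conversation length.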

COMMENTS • 43

  • @Unicorn-qg8mz
    @Unicorn-qg8mz 1 year ago +1

    Wow, can't wait to see more!

  • @DJPapzin
    @DJPapzin 1 year ago +1

    Great video. You're such a great teacher.

  • @SWARAJKS10
    @SWARAJKS10 11 months ago

    Your videos are super helpful. As a beginner, I find it easy to follow the steps. They provide everything I need to execute the project end to end (most of the time).

    • @engineerprompt
      @engineerprompt  11 months ago

      Great to hear! Enjoy the learning :)

  • @TheAmit4sun
    @TheAmit4sun 11 months ago

    Dude, you just earned a new subscriber. Thank you so much.

  • @brezl8
    @brezl8 1 year ago

    This was great. Clear and understandable, thanks a lot!

  • @eaugustine
    @eaugustine 1 year ago

    Good video.

  • @robxmccarthy
    @robxmccarthy 1 year ago +10

    Nice video 💯. I'm interested in more long-term memory and vector storage. Mainly, how to keep track of memories over weeks, months, or years.

  • @jirikosek3714
    @jirikosek3714 1 year ago +3

    Very good video. Please go into further detail on LangChain; e.g., working with an LLM plus tabular data could be very interesting (SQL, pandas agents).

    • @engineerprompt
      @engineerprompt  1 year ago +1

      Will be making a lot more videos on LLMs. Stay tuned!

    • @J3R3MI6
      @J3R3MI6 1 year ago

      Yes, (OpenAI API + tabular data) connected to the internet to create an AI data scientist.

  • @user-gs6cq2rg9w
    @user-gs6cq2rg9w 1 month ago +1

    thanks

  • @yazanrisheh5127
    @yazanrisheh5127 1 year ago

    Hey, I was wondering if there's a way to make my custom ChatGPT display the answer letter by letter as it's being written, rather than waiting a few seconds and then showing it all at once. Thanks!

  • @TheAstroengineer
    @TheAstroengineer 1 year ago

    Thank you for the wonderful video. How do I implement memory functionality for vector index search? I have developed a Q&A chatbot based on my documents, and I would like to implement memory to remember the past few conversations.

  • @xevenau
    @xevenau 1 year ago

    Is it possible to add this memory to the multiple-PDF setup you also provide, so that I can track all the questions I asked about the PDFs and have it retain the memory of all of those questions?

  • @sunnylee6001
    @sunnylee6001 1 year ago

    thannnnnnnnk you

  • @svyatglukhov
    @svyatglukhov 11 months ago

    Hello, brother! I liked your video and would like to ask about one thing: I have a lot of dialogs, so how do I pass a specific dialog to the message chain?

  • @binstitus3909
    @binstitus3909 6 months ago

    How can I keep the conversation context of multiple users separate?
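A common pattern for the question above (my suggestion, not an answer given in the thread) is to key a separate memory buffer per user or session id, which is roughly what LangChain's session-history wrappers do. A stdlib sketch:

```python
from collections import defaultdict


class PerUserMemory:
    """Keeps an isolated conversation buffer per user id so that
    different users' contexts never mix."""

    def __init__(self):
        self._buffers = defaultdict(list)  # user_id -> [(human, ai), ...]

    def save_context(self, user_id, user_input, ai_output):
        self._buffers[user_id].append((user_input, ai_output))

    def history(self, user_id):
        # Render only this user's turns into a prompt-ready history string.
        return "\n".join(f"Human: {h}\nAI: {a}"
                         for h, a in self._buffers[user_id])
```

At request time you look up (or lazily create) the buffer for the caller's id and pass only that history into the chain.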

  • @maazbinmustaqeem
    @maazbinmustaqeem 9 months ago

    How do I override the default prompt in ConversationChain ("The following is a friendly conversation...")?

  • @dikshyakasaju7541
    @dikshyakasaju7541 1 year ago +1

    Have you figured out how to retain memory when the app is built on Streamlit? Just curious, because that'd be super helpful.

    • @engineerprompt
      @engineerprompt  1 year ago

      I haven't tested this approach with Streamlit, but these approaches should work in theory.

    • @gr8ston
      @gr8ston 1 year ago

      @@engineerprompt For some reason it doesn't. Memory gets reset every time someone enters a query in chat.
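A likely explanation for the reset reported above (my assumption, not confirmed in the thread): Streamlit reruns the whole script on every interaction, so a memory object created at module level is rebuilt, and therefore wiped, on each query; keeping it in `st.session_state` preserves it across reruns. Sketched here with a plain dict standing in for `st.session_state` so it runs without Streamlit:

```python
# A plain dict stands in for st.session_state, which Streamlit preserves
# across the full-script reruns triggered by every user interaction.
session_state = {}


def handle_query(query):
    # Bug pattern: `memory = []` here would rebuild (reset) the buffer on
    # every rerun. Fetch-or-create from session state instead.
    memory = session_state.setdefault("memory", [])
    memory.append(query)
    return len(memory)  # number of queries remembered so far
```

In a real app the same fetch-or-create line would hold the LangChain memory object instead of a list.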

  • @hacking4078
    @hacking4078 9 months ago

    Is it also possible to add author_ids?

  • @thecoxfamily7324
    @thecoxfamily7324 1 year ago

    Is there a solution when using the ChatGPT API?

  • @kavibharathi1547
    @kavibharathi1547 1 year ago +1

    How do I add memory to a load_qa chain or RetrievalQA chain?

    • @engineerprompt
      @engineerprompt  1 year ago +1

      Next video :)

    • @kavibharathi1547
      @kavibharathi1547 1 year ago

      @@engineerprompt Thank you very much :) I was working on chat-with-PDF with memory using load_qa and RetrievalQA, but I wasn't able to add a memory object directly. Can you suggest a solution? I need it urgently.

  • @DatTran-rb4lv
    @DatTran-rb4lv 1 year ago

    Hi @PromptEngineering,
    If I have a list of products and a list of orders, is it possible to add them to memory? If so, how can I do it?
    Thanks!!!

    • @engineerprompt
      @engineerprompt  1 year ago

      Yes, just save them using memory.save_context, or you can add them as context using the document-retrieval approach. Watch my localGPT video.

    • @DatTran-rb4lv
      @DatTran-rb4lv 1 year ago

      @@engineerprompt Thank you. My data is in a DB now; could you please suggest how to prepare that data as the input format when ingesting?
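The save_context approach suggested in this thread takes an inputs dict and an outputs dict, so a structured record like a product or an order can be serialized into those fields. A hedged stdlib sketch (the toy buffer mimics LangChain's save_context signature, and the serialization format and sample rows are my own illustration, not from the video):

```python
import json


class SeededMemory:
    """Toy conversation buffer showing how structured records (products,
    orders, DB rows) can be written into memory via save_context-style calls."""

    def __init__(self):
        self.turns = []

    def save_context(self, inputs, outputs):
        self.turns.append((inputs["input"], outputs["output"]))

    def history(self):
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)


mem = SeededMemory()
# Hypothetical rows, e.g. fetched from a database.
products = [{"sku": "A1", "name": "Widget"}, {"sku": "B2", "name": "Gadget"}]
for p in products:
    # Serialize each record into the "input" side of a remembered turn.
    mem.save_context({"input": f"Remember this product: {json.dumps(p)}"},
                     {"output": "Noted."})
```

For more than a handful of rows, the retrieval approach also mentioned in the reply (embedding the records and fetching relevant ones per query) scales better than stuffing everything into memory.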

  • @abusufyanvu
    @abusufyanvu 7 months ago

    How can I save this buffer memory in MongoDB?

  • @relaxandlearn7996
    @relaxandlearn7996 1 year ago

    How big would the memory get after a one-week conversation where only facts are saved and validated only by me? 1 TB? 5 TB?

    • @engineerprompt
      @engineerprompt  1 year ago +1

      That will depend on the amount of conversation, BUT keep in mind that all these LLMs have limited context windows (16k tokens for gpt-3.5), so if the memory holds anything beyond that, it's not going to be useful. You probably want to look at embeddings at that point.
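The context-window constraint in this reply is why bounded memory variants exist; for example, LangChain's ConversationBufferWindowMemory keeps only the last k turns. A stdlib sketch of that trimming behavior (the class is a mimic, not the LangChain implementation, and k=2 is just an illustrative choice):

```python
from collections import deque


class WindowMemory:
    """Keeps only the last k turns so the rendered history stays within a
    fixed budget, mimicking ConversationBufferWindowMemory's k parameter."""

    def __init__(self, k=3):
        self.turns = deque(maxlen=k)  # old turns fall off automatically

    def save_context(self, inputs, outputs):
        self.turns.append((inputs["input"], outputs["output"]))

    def history(self):
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)


mem = WindowMemory(k=2)
for i in range(5):
    mem.save_context({"input": f"msg {i}"}, {"output": f"ack {i}"})
# Only the last two turns survive in the rendered history.
```

Summary-based memory takes the other route: instead of dropping old turns, it compresses them into a running summary that fits the window.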

  • @REALVIBESTV
    @REALVIBESTV 10 months ago

    Can this work with a voice OpenAI chatbot in Python?

  • @MarshallMelnychuk
    @MarshallMelnychuk 11 months ago

    15:28 Hi, I have checked out your Calendly schedule and would like to have a conversation with you, but I need a preliminary conversation before I pay your consulting fee for a 45-minute session. What you have described in this video is very close to the problem I am trying to solve. I would like to discuss it, and if you're able to solve it, I will gladly pay you for your time.

  • @Gamla123
    @Gamla123 11 months ago

    Thanks for trying, but the video quality is very poor.

  • @gnosisdg8497
    @gnosisdg8497 1 year ago

    Do you think you can also train an LLM model using the memory module?