LangChain: Giving Memory to LLMs

  • Published Dec 3, 2024

COMMENTS • 45

  • @Unicorn-qg8mz • 1 year ago +1

    Wow, can't wait to see more!

  • @SWARAJKS10 • 1 year ago

    Your videos are super helpful. As a beginner I find it easy to follow the steps. They provide everything I need to execute a project end to end (most of the time).

  • @TheAmit4sun • 1 year ago

    Dude, you just earned a new subscriber. Thank you so much.

  • @DJPapzin • 1 year ago +1

    Great video. You're such a great teacher.

  • @brezl8 • 1 year ago

    This was great: clear and understandable, thanks a lot!

  • @robxmccarthy • 1 year ago +10

    Nice video 💯. I'm interested in more long term memory and vector storage. Mainly, how to keep track of memories over weeks, months, or years.

  • @PratimMallick-o3z • 5 months ago +1

    thanks

  • @jirikosek3714 • 1 year ago +3

    Very good video. Please go into further detail on LangChain; e.g., working with an LLM plus tabular data (SQL, pandas agents) could be very interesting.

    • @engineerprompt • 1 year ago +1

      Will be making a lot more videos on LLMs. Stay tuned!

    • @J3R3MI6 • 1 year ago

      Yes, (OpenAI API + tabular data) connected to the internet to create an AI data scientist.

  • @maazbinmustaqeem • 1 year ago

    How do I override the default prompt in ConversationChain ("The following is a friendly conversation...")?

  • @TheAstroengineer • 1 year ago

    Thank you for the wonderful video. How do I implement memory for vector index search? I have developed a Q&A chatbot over my documents and would like it to remember the past few conversations.

  • @binstitus3909 • 10 months ago

    How can I keep the conversation contexts of multiple users separate?

  • @svyatglukhov • 1 year ago

    Hello brother! I liked your video and would like to ask you one thing: I have a lot of dialogs, so how do I feed a specific dialog into the message chain?

  • @kavibharathi1547 • 1 year ago +1

    How do I add memory to a load_qa chain or RetrievalQA chain?

    • @engineerprompt • 1 year ago +1

      Next video :)

    • @kavibharathi1547 • 1 year ago

      @@engineerprompt Thank you very much :) I was working on chat-with-PDF with memory, using load_qa and RetrievalQA, but I couldn't add the memory object directly. Can you suggest a solution? I need it urgently.

  • @hacking4078 • 1 year ago

    Is it also possible to add author_ids?

  • @dikshyakasaju7541 • 1 year ago +1

    Have you figured out how to retain memory when the app is built on Streamlit? Just curious, because that'd be super helpful.

    • @engineerprompt • 1 year ago

      I haven't tested this approach with Streamlit, but it should work in theory.

    • @gr8ston • 1 year ago

      @@engineerprompt For some reason it doesn't. Memory gets reset every time someone enters a query in chat.

  • @xevenau • 1 year ago

    Is it possible to add this memory setup to the multiple-PDF script you also provide, so that I can track all the questions I ask about the PDFs and have it retain the memory of all of them?

  • @abusufyanvu • 11 months ago

    How can I save this buffer memory in MongoDB?

  • @eaugustine • 1 year ago

    Good video.

  • @thecoxfamily7324 • 1 year ago

    Is there a solution when using the ChatGPT API?

  • @yazanrisheh5127 • 1 year ago

    Hey, I was wondering if there's a way to make my custom ChatGPT display the answer letter by letter as it's being written, like the real one does, rather than waiting a few seconds and then showing it all at once. Thanks!
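
The display side of this is just printing chunks as they arrive instead of waiting for the full answer. A minimal sketch of that loop; on the LangChain/OpenAI side you would enable streaming (e.g. `ChatOpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()])`), which needs an API key, so only the display loop is runnable here:

```python
import sys
import time
from typing import Iterable

def stream_print(chunks: Iterable[str], delay: float = 0.0) -> str:
    """Echo each chunk immediately; return the full text for later use."""
    parts = []
    for chunk in chunks:
        sys.stdout.write(chunk)
        sys.stdout.flush()   # flush so the user sees each piece right away
        parts.append(chunk)
        time.sleep(delay)    # optional pacing for a typewriter effect
    sys.stdout.write("\n")
    return "".join(parts)

# In a real app the chunks would come from the streaming LLM callback:
answer = stream_print(["Lang", "Chain ", "streams ", "token ", "by ", "token."])
```

The key detail is flushing stdout (or the UI widget) per chunk; buffered output is what makes the answer appear all at once.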

  • @sunnylee6001 • 1 year ago

    thannnnnnnnk you

  • @paarttipaabhalaji336 • 3 months ago

    I have one query here. Are conversation memory and context length different things? If the input context length of the LLM is 32k, then prompt input + conversation memory should not exceed 32k, right? Please correct me if I'm wrong.

    • @engineerprompt • 3 months ago

      You are right, with one addition: input + conversation memory + output tokens should not exceed the context window.
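
The budget in the reply above is simple arithmetic; the token counts below are illustrative numbers, not measurements:

```python
# Prompt + memory + reserved output must fit in the context window.
CONTEXT_WINDOW = 32_000      # e.g. a 32k-context model

prompt_tokens = 1_200        # system prompt + current user message
memory_tokens = 28_000       # serialized conversation history
max_output_tokens = 4_000    # room reserved for the model's answer

total = prompt_tokens + memory_tokens + max_output_tokens
fits = total <= CONTEXT_WINDOW
print(total, fits)  # over budget here, so history must be trimmed or summarized
```

When the check fails, the usual fixes are windowed memory (keep only the last k turns) or summary memory (compress older turns).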

  • @relaxandlearn7996 • 1 year ago

    How big would the memory get after a one-week conversation where only facts are saved and validated, and only by me? 1 TB? 5 TB?

    • @engineerprompt • 1 year ago +1

      That will depend on the amount of conversation, BUT keep in mind that all these LLMs have limited context windows (16k tokens for gpt-3.5), so anything in memory beyond that is not going to be useful. You probably want to look at embeddings at that point.

  • @gnosisdg8497 • 1 year ago

    Do you think you can also train an LLM using the memory module?

  • @REALVIBESTV • 1 year ago

    Can this work with a voice OpenAI chatbot in Python?

  • @MarshallMelnychuk • 1 year ago

    15:28 Hi, I have checked out your Calendly schedule and would like to have a conversation with you, but I need a preliminary conversation before I pay your consulting fee for a 45-minute session. What you have described in this video is very close to the problem I am trying to solve. I would like to discuss it, and if you're able to solve it I will gladly pay you for your time.

  • @DatTran-rb4lv • 1 year ago

    Hi @PromptEngineering, if I have a list of products and a list of orders, is it possible to add them to memory? If so, how can I do it? Thanks!

    • @engineerprompt • 1 year ago

      Yes, just save them using memory.save_context, or add them as context using the document retrieval approach. Watch my localGPT video.

    • @DatTran-rb4lv • 1 year ago

      @@engineerprompt Thank you. My data is in a DB now; could you please suggest how to prepare that data in the right input format for ingestion?

  • @Gamla123 • 1 year ago

    Thanks for trying, but the video quality is very poor.