Langchain Memory Model | How can LLM AI hold a ChatGPT-like conversation?
- Published 9 Dec 2024
- In this video, I'll cover the Langchain Memory API, using ConversationBufferMemory and ChatMessageHistory as examples. I'll share some of my thoughts on why this is cool and essential for a developer to learn. The code examples I show and run are from the Langchain tutorials, so it's sufficient to follow the links below to keep up.
As always, do show some love by clicking that like button and subscribing to my channel if this kind of content interests you. Also, feel free to comment, share the link to my videos, or request future content.
Here are the links covered in the video:
• Getting Started With L... (Getting started with Langchain)
• How to run ChatGPT in ... (GPT4All a free local model)
• Chat with your blogs |... (Chat with your blogs via Langchain)
• How To Create ChatGPT ... (Build an OpenAI Virtual assistant with speech interface)
python.langcha...
api.python.lan...
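The core pattern from the video — accumulating the chat into a buffer and passing the whole thing back to the stateless model as context on every turn — can be sketched in plain Python. The class and method names below mirror Langchain's ChatMessageHistory for familiarity, but this is an illustrative sketch, not the library's actual implementation:

```python
# Minimal sketch of the buffer-memory idea: the LLM itself is stateless,
# so each request carries the full accumulated conversation as context.

class ChatHistory:
    """Accumulates (role, text) messages, loosely mirroring ChatMessageHistory."""

    def __init__(self):
        self.messages = []

    def add_user_message(self, text):
        self.messages.append(("human", text))

    def add_ai_message(self, text):
        self.messages.append(("ai", text))

    def as_prompt(self):
        # Flatten the whole buffer into the context sent with each request.
        return "\n".join(f"{role}: {text}" for role, text in self.messages)

history = ChatHistory()
history.add_user_message("hi!")
history.add_ai_message("what's up?")
print(history.as_prompt())
# human: hi!
# ai: what's up?
```

Every new user message would be appended here first, and `as_prompt()` (a hypothetical helper) shows what actually travels to the model.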
I recently started with Langchain and your videos are well explained, thank you.
We are so early.
Thank you, I agree with that. Langchain is not perfect but it's a great learning tool right now.
Thanks. How do you create a conversational agent with Langchain?
great!!
Thank you! :)
How can I keep the conversation context of multiple users separately?
Hi, very good question! You would need to handle that outside Langchain. A minimal solution would be an array/list of user identities along with their histories, but that kind of feature would probably also require authenticating users and storing the chat context more permanently.
So, traditional development work. For the LLM model's purposes, you simply pass in the relevant context (the chat history for the current user).
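The per-user bookkeeping described in this reply can be sketched in plain Python, with no Langchain involved — a dict keyed by user id, where only that user's history is handed to the model. The `fake_llm` stand-in and the function names are illustrative only:

```python
# Sketch of keeping each user's conversation context separate.
# In production you'd add authentication and persistent storage.
from collections import defaultdict

histories = defaultdict(list)  # user_id -> list of (role, text) messages

def chat_turn(user_id, user_text, llm):
    history = histories[user_id]
    history.append(("human", user_text))
    # Only this user's history is passed to the model as context.
    reply = llm(history)
    history.append(("ai", reply))
    return reply

# A stand-in "model" that just counts the user turns it has seen.
def fake_llm(history):
    return f"reply #{sum(1 for role, _ in history if role == 'human')}"

chat_turn("alice", "hello", fake_llm)
chat_turn("bob", "hi there", fake_llm)
chat_turn("alice", "how are you?", fake_llm)

print(len(histories["alice"]))  # alice has 4 messages: 2 human + 2 ai
print(len(histories["bob"]))    # bob has 2, untouched by alice's chat
```

The point is simply that the contexts never mix: Bob's reply counter restarts at 1 even though Alice chatted first.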
Why doesn't OpenAI implement this feature with ChatGPT? They could have the API store conversations on the end user's HDD, under a temp folder. It makes so much sense, I don't get it.
Well, there are ways to do that. However, ownership of the data is still a bit of a moot point, as in this model you still need to transmit the conversation back to the model as context/input every time.
But this is awesome when you run a local model, and open source local lightweight models are becoming more awesome every day.
Since I made this video, ChatGPT has rolled out custom models, or GPTs, that allow you to do similar things online, packaged. Not quite the same, but definitely aiming at similar use cases.
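The client-side storage idea discussed in this thread — history kept in a local file and re-sent as context each session — is easy to sketch. The file name and JSON structure below are illustrative assumptions, not anything OpenAI or Langchain prescribes:

```python
# Sketch of storing chat history client-side in a local JSON file.
# The history survives across runs and is re-sent as context each session.
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")  # illustrative location

def load_history():
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_history(history):
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

history = load_history()
history.append({"role": "user", "content": "hello again"})
# A model call would receive `history` as its context right here.
history.append({"role": "assistant", "content": "welcome back"})
save_history(history)

print(len(load_history()))
```

Note this illustrates the author's point above: even with local storage, the full history still has to travel to the model on every request.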
@DevXplaining it would be no more moot than the information OpenAI is already privy to. I know they say all the prompts you feed into GPT are never stored, read, manipulated, etc., but come on — they certainly have the ability to live-monitor and even adjust anything that comes through their servers, despite the data-security spiel they give the public. All they have to do is tell the public that all chat history is stored solely client side, and that the API only has access to said data per session, and only when granted permission by the end user. Make an EULA you agree to in order to use the feature, stating such, and problem solved.
@DevXplaining and yes, I've seen the custom GPT-3 models that have the chain feature, but I think something built in is not only a good idea but, if AI is to move forward, inevitable.
Lol make a video on how to earn with chatgpt
Haha, cannot, still dirt broke here :)
@@DevXplaining be my mentor?🛐💀
Are you an AI?
As an AI language model, I can neither confirm nor deny questions related to my identity.
@@DevXplaining kkkkkkk