I recently got started with LangChain, and your videos are well explained. Thank you.
We are so early.
Thank you, I agree with that. LangChain is not perfect, but it's a great learning tool right now.
Thanks. How do you create a conversational agent with LangChain?
How can I keep the conversation context of multiple users separately?
Hi, very good question! You would need to do that outside LangChain. A minimal solution would be an array/list of user identities along with their history, but that kind of feature would probably also create a need for authenticating users and storing the chat context more permanently.
So traditional development work. For the purposes of the LLM, you simply pass in the relevant context (the chat history for the current user), along the lines of the sketch below.
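Here is a minimal sketch of that idea, with a placeholder `call_llm` standing in for your real chain or model call (the names here are illustrative, not LangChain APIs): each user's history lives under their own key, and only that history is sent to the model as context.

```python
# Minimal sketch: per-user conversation context kept outside LangChain.
# `call_llm` is a stand-in for whatever actually invokes the model
# (a LangChain chain, the OpenAI client, a local model, ...).

from collections import defaultdict

# user_id -> list of (role, content) messages
chat_histories: dict[str, list[tuple[str, str]]] = defaultdict(list)

def call_llm(messages: list[tuple[str, str]]) -> str:
    # Placeholder: replace with your actual model invocation.
    return f"(model reply based on {len(messages)} messages)"

def chat(user_id: str, user_message: str) -> str:
    history = chat_histories[user_id]
    history.append(("user", user_message))
    # Only this user's history goes to the model as context.
    reply = call_llm(history)
    history.append(("assistant", reply))
    return reply

print(chat("alice", "Hi, remember me?"))
print(chat("bob", "Hello, different user here."))
print(chat("alice", "What did I just say?"))  # Alice's context stays separate from Bob's.
```

In a real application you would swap the in-memory dict for a database keyed by an authenticated user ID, which is exactly the traditional development work mentioned above.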
great!!
Thank you! :)
Why doesn't OpenAI implement this feature with ChatGPT? They could have the API store the conversation on the end user's HDD, under a temp folder. It makes so much sense, I don't get it.
Well, there are ways to do that. However, ownership of the data is still a bit superfluous, as in this model you still need to transmit the conversation back to the model as context/input every time (the sketch after this reply shows what I mean).
But this is awesome when you run a local model, and open-source local lightweight models are becoming more awesome every day.
Since I made this video, ChatGPT has rolled out custom models, or GPTs, that allow you to do similar things online, packaged. Not quite the same, but definitely aiming at similar use cases.
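To make that trade-off concrete, here is a rough, hypothetical sketch of the client-side storage idea (nothing here is an existing ChatGPT or OpenAI feature; `call_llm` is a placeholder): the history is persisted only in a temp file on the user's machine, but it still has to be sent back as input on every call.

```python
# Hypothetical sketch: conversation history stored on the end user's disk,
# re-sent to the model as context on every request.

import json
import tempfile
from pathlib import Path

HISTORY_FILE = Path(tempfile.gettempdir()) / "chat_history.json"

def load_history() -> list[dict]:
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_history(history: list[dict]) -> None:
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

def call_llm(messages: list[dict]) -> str:
    # Placeholder for the actual API request.
    return f"(reply given {len(messages)} prior messages)"

def chat(user_message: str) -> str:
    history = load_history()
    history.append({"role": "user", "content": user_message})
    reply = call_llm(history)   # the whole conversation still travels to the model
    history.append({"role": "assistant", "content": reply})
    save_history(history)       # but it is persisted only on the user's own disk
    return reply

print(chat("Hello again"))
```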
@@DevXplaining It would be no more superfluous than the information OpenAI is already privy to. I know they say all the prompts you feed into GPT are never stored, read, manipulated, etc... but come on, they certainly have the ability to live-monitor and even adjust anything that comes through their servers, despite the data-security spiel they give the public Karen crowd. All they have to do is tell the public that all chat history is stored solely client-side and the API only has access to said data per session, and only when granted permission by the end user. Make a EULA you agree to, in order to use the feature, stating as much, and problem solved.
@@DevXplaining And yes, I've seen the custom GPT-3 models that have the chain feature, but I think something built in is not only a good idea but, if AI is to move forward, inevitable.
Are you an AI?
As an AI language model, I can neither confirm nor deny questions related to my identity.
@@DevXplaining kkkkkkk
Lol, make a video on how to earn with ChatGPT.
Haha, cannot, still dirt broke here :)
@@DevXplaining be my mentor?🛐💀