This video raised some questions, mainly due to confusion about whether this is a vulnerability or not. I intended it for those who build Custom GPTs, especially if they are investing a lot of time into them. For ordinary users it is, in my view, a bug that can occasionally cause issues when one GPT ends up using files from another in unintended ways.
Anyway, here is a follow-up video I made trying to answer some of the questions:
ua-cam.com/video/efXoLvB4Xkw/v-deo.html
If a guy is using light mode, you already know it's serious!!
Thanks for your insights. I'm so glad I resisted using this for my SaaS.
I honestly hope they fix this. In the meantime it can be worked around through actions, but that's more of a chore, especially for non-developers.
What kind of SaaS did you have in mind?
Also, this only concerns files added with code interpreter on. Even so, through interactions it's possible to get information out anyway. But that's not much different from scraping APIs: you provide value to the user through your API, and through it they can extract all of it. The only question is how slow the process will be.
So I tried this on the Market Analyst GPT, which I didn't create, and it listed the files they used to train it. I also couldn't create a download link. But then I asked it this: "Do you list the files for transparency on your training?"
GPT
Yes, listing the files serves a purpose of transparency about the resources available to me for providing information and analysis. When users upload files, I mention these files to clarify that my responses are based on the specific materials provided. This ensures that users understand the source of the information I'm using to answer their queries. It's a way to maintain clarity and trust in the interaction, especially when dealing with specific, user-provided content.
Ooh, that's a great find. I wonder if it's a hallucination or indeed part of the system prompt.
I will pin your comment for now, great find!
@@EduardsRuzga not sure. Time for more research.
I noticed that the files are also different sizes when you get them back, just for extra weirdness. It's freaky you noticed this today; I too was thinking about the permanent files being "findable", but I did not expect that. Good job.
What do you mean? Can you make example chats and share them?
Did I understand correctly that you uploaded a file of, say, 50 KB,
then did this hack to get the file back in another chat, and got the same file but with a different size?
What do you mean by permanent?
Yes, that is exactly what the person is saying. @@EduardsRuzga
Interesting, but did you test it with a GPT that you didn't build? Because GPT is session/user based, and you are accessing your own data in the end (even if I agree it's weird and seems not to be intended). If you can replicate that with a GPT you didn't build, THAT would be something really concerning.
I did. It was a mistake not to show it in the video. You can try it yourself too.
He would need to test on different accounts... using custom GPTs on the same account would use the same sandbox. Sandboxes aren't shared across user accounts.
I made a new video where I demo doing that with GPTs I did not own:
ua-cam.com/video/efXoLvB4Xkw/v-deo.html
Interesting, but did you test it with a GPT that you didn't build? Because GPT is session/user based, and you are accessing your own data in the end (even if I agree it's weird and seems not to be the intended way). If you can replicate that with a GPT you didn't build, THAT would be something really concerning.
Yes, I did test. It does leak. What is funny is that you can go through multiple bots and have all their files in one chat :)
It was an oversight on my side not to show it in the video.
I'll probably do a small video and pin it here soon.
I made a new video where I answer your question ;) Check it out.
It's actually going to be extremely expensive, and a level of distributed infrastructure we haven't seen before, if code interpreter created a new directory per chat per user. I wonder if it would be better to just tune code interpreter to always run some chmod or visibility commands (I don't know which can protect against list_dir and ls) that would give the appearance of a distributed service without the need to actually spin up a virtual instance.
I myself have 100+ chats, most of which are dead, but the knowledge in them is useful. I can't imagine 100+ virtualized directory instances for a single user. It's kind of amazing Bard even exists, now that I think about it. Google is going to have a hell of a time dealing with all these issues at their multi-billion-user scale 😂
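The chmod idea above can be sketched roughly like this. This is a hypothetical illustration of per-chat directories with owner-only permissions, not how OpenAI's sandbox actually works; the directory names and layout are made up:

```python
import os
import tempfile

# Hypothetical layout: one shared sandbox, one 0700 directory per chat.
# If each chat's code ran as its own OS user, mode 0700 would stop other
# chats from listing (ls / os.listdir) or reading the directory.
sandbox = tempfile.mkdtemp(prefix="sandbox_")

def make_chat_dir(chat_id: str) -> str:
    """Create a per-chat directory readable and listable only by its owner."""
    path = os.path.join(sandbox, chat_id)
    os.mkdir(path, mode=0o700)
    return path

chat_a = make_chat_dir("chat_a")
chat_b = make_chat_dir("chat_b")

# Each chat keeps its own knowledge files inside its own directory.
with open(os.path.join(chat_a, "lore.txt"), "w") as f:
    f.write("secret game lore")

print(sorted(os.listdir(sandbox)))
```

Note that permissions only help if different chats run as different OS users; within a single user (the situation in this thread), everything stays reachable.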
Well, they could make actions available per chat.
I agree that sharing code interpreter between a user's chats is not an issue in itself. But there should be some kind of access control.
One thought I had: what if for each file we could choose whether it should be used as knowledge only, or be available in code interpreter too?
But that is only a half measure.
Still, there could be some per-chat access control mechanism. I like your chmod idea.
But it's up to them. In my case I am exploring actions now for situations where I don't want a file to leak at all.
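As a rough illustration of that actions workaround, the sketch below serves individual pieces of lore from your own server, so the full knowledge file never enters the sandbox. The route, port, and lore content here are all hypothetical:

```python
# Hypothetical sketch of the "hide files behind an action" workaround:
# instead of attaching a knowledge file (which lands in the shared sandbox),
# serve only the pieces the bot should reveal from your own endpoint.
from http.server import BaseHTTPRequestHandler, HTTPServer

LORE = {
    "act1": "The butler was in the greenhouse.",
    "act2": "The key is hidden under the clock.",
}

class LoreHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The GPT's action would call e.g. GET /lore/act1; the full file
        # never reaches code interpreter, so it can't be listed or downloaded.
        key = self.path.rsplit("/", 1)[-1]
        body = LORE.get(key, "").encode()
        self.send_response(200 if body else 404)
        self.end_headers()
        self.wfile.write(body)

# To run locally: HTTPServer(("", 8000), LoreHandler).serve_forever()
```

The trade-off the thread mentions still applies: a determined user can pull everything out piece by piece through chat, but at least the raw file is not one `ls` away.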
I just replicated it again. Found something new.
I opened two bots that have code interpreter and files uploaded.
Or rather:
1. I opened the first bot and checked its files (mystery game).
2. Opened general GPT-4 and checked that its code interpreter contains those files.
3. Went to another bot, about space exploration, that has a file and code interpreter; checked its files, and it contained one (I expected it to contain the mystery game's files too).
4. Went to the general GPT-4 bot and asked it to list files, and got only the one file from the space game, not from the mystery game.
My conclusion then is that this is a bug.
When you open a Custom GPT with code interpreter and a knowledge file, it erases previous files and puts the bot's knowledge files into code interpreter.
But when you go from a custom GPT to general GPT-4, it has no knowledge files and does not erase anything.
A bug indeed. It will be fixed.
Between two custom bots it works as I expect: it erases the previous files from the file system/memory.
I do agree with you that it should not create new containers for each chat. It can reuse the same one, but maybe with some kind of virtual env or similar.
Wait, I tried again. Now I am in the mystery game and I have both the mystery game and astro game files...
@@EduardsRuzga What is happening ahhhh. It would be good to discuss this properly, but I've not yet created custom GPTs. Let me find some time this weekend after work so I can be on the same page as you. If you think it's a bug, though, you should definitely file it; they'll give you a sweet reward.
@@yugshende3 I did file it on their Discord, the same day as the video. It would have been wrong to publish it without reporting. But no answer yet.
How is this a vulnerability? If you upload your files and download them in another chat, there is nothing wrong with that.
The two custom GPTs SHOULD NOT be sharing files between chats and you shouldn't be able to delete files from another custom GPT.
I can download your files that way.
But you are right, I should have demonstrated it with a third-party bot whose files I do not have access to. What we can do: you create a bot with a file and instruct it not to give up the file name or content. I will get access to it with this method, and make a pinned comment here mentioning you as a contributor ;)
@@SFSylvester It's using the same sandbox for your account. If you tried with two different accounts it wouldn't work. The user OWNS BOTH GPTs... big difference.
@@investmentanalyst779 No, actually he does not. I am working on the next video, where I will show getting files from GPTs you do not own.
I made a new video where I demo doing that with GPTs I did not own:
ua-cam.com/video/efXoLvB4Xkw/v-deo.html
Not a vulnerability... you're literally just downloading files from your own sandbox. Try this on two different accounts between which you've shared the GPT... it won't work. Of course you can download files from your own sandbox. You've been able to do this for MONTHS.
This is just nonsense. It's just a temp directory that your account can access from all GPTs. Yes, GPTs you didn't build will use YOUR account's sandbox. If you shared a GPT with another account, it wouldn't work for accessing the files you uploaded on your own account, as the other account can't access your temp sandbox. It would reference two different directories.
Just as my system directory isn't your system directory. Even if I used this GPT and uploaded files, you would NOT be able to access them, because it's MY directory. Just saying. Not a real exploit in the slightest.
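For anyone trying to reproduce either side of this argument, the "listing" being debated boils down to asking code interpreter to run a couple of lines like these. /mnt/data is the directory code interpreter uses for uploaded and knowledge files; outside that environment this sketch falls back to the current directory so it stays runnable:

```python
import os

# Inside ChatGPT's code interpreter, uploaded and knowledge files land in
# /mnt/data. Elsewhere that path usually doesn't exist, so fall back to
# the current directory for demonstration purposes.
data_dir = "/mnt/data" if os.path.isdir("/mnt/data") else "."

# List every regular file with its size, the way the thread describes
# "checking a bot's files".
for name in sorted(os.listdir(data_dir)):
    full = os.path.join(data_dir, name)
    if os.path.isfile(full):
        print(f"{name}\t{os.path.getsize(full)} bytes")
```

Whether the output includes another GPT's knowledge files is exactly what the two sides of this thread disagree about.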
I never tried that before custom GPTs, but I do remember not being able to start a chat on desktop, switch to phone, and continue. It was losing the running instance.
Because of that (starting Code Interpreter at the start of a chat, then losing it and not being able to restart it), it never crossed my mind that this was the case.
Actually, before, when I started new chats with code interpreter, it was empty, with no old files. Did you see someone demonstrating this before the custom GPTs launch?
Also, it is a vulnerability from my point of view as a creator of custom GPTs. It's like exposing your database with full read access when building a web app.
Depending on the use case, this basically makes building custom GPTs pointless.
For example, if I make a game and put into those files logic, lore, or secrets that should be used by the bot only in certain ways, this leaks them completely and makes knowledge files a bad idea to use.
@@EduardsRuzga it’s equivalent to iPhone apps having access to my files on my device. CustomGPT will likely chat with other GPTs you have in the future. Not a flaw. It’s a feature. If it was exposing data to other users, yes that’s a flaw, but it’s not. I can see many use cases for multiple GPTs talking to each other, fileGPT talking with databaseGPT talking with FinanceGPT. Only makes sense.
@@investmentanalyst779 It depends on what you are doing. What if your app is a password-storing app? Is it fine that another chat gets access and accidentally sends the passwords to an action? Probably not a valid use case.
But what if one chat is a multiplayer game and another chat is some server deployment platform with credentials stored? I have already used Back4app or Netify to manipulate the cloud through that.
No, files should not be shared between different chats. It's stupid.
It's not about apps having access to the same file system. It's like your Slack sharing all your files with all chat participants at all times. It is stupid.
@@EduardsRuzga Unrealistic scenario. Regardless, it's user-account based. Not a flaw, no matter what you say. And this is coming from someone with 6 years in IT… and a software engineering background. I grew up browsing hack forums and various other blackhat forums. I'm not sure you quite understand what a flaw is. I can't steal user data with this. Your GPT isn't relaying other users' information to you, nor is it training your GPT based on other users' usage. Drop it. Read up on security exploits, then demonstrate how you can potentially steal data from OTHER users.
Thanks.
@@investmentanalyst779 Appeal to authority? I have been writing software for 23 years, 16 of them professionally, so your 6 is not an argument here. Or anywhere. Neither is my 23/16. It's not about numbers but about concrete cases that show the limitations of their current approach.
I outlined my case: there should be an option to have restricted-access files that are accessible only to one bot, scoped by bot author and/or user, depending on the owner of the file. Such functionality is not there, and that is a vulnerability from the point of view of bot creators. It's like having a file exposed directly that should only be reachable through a REST API. Exactly the same thing.
It's as if two apps on the same server had access to each other's files. It should be a deliberate choice to expose them, or to allow access only through an API.
Tried GPT Shield: I added the prompt it generated to my prompt, but it still doesn't protect instructions or files.
Hm, there is a way to share feedback.
I do need to adjust the messaging: it does not protect 100%, but it makes things harder, and there can be some 'misshots'. Will retest now and share a link with you.
So this one is not with files, but just as an example of a GPT I protected with a prompt generated by GPT Shield:
Asking for protective prompt
chat.openai.com/share/faa75701-5c8a-4a68-b225-681e622c6441
Asking about instructions
chat.openai.com/share/126a711b-5ac2-4f63-8158-362892d98d66
Still, this is a problem confined to your own system; how is this a vulnerability?
They announced that ChatGPT will actually have shared memory between chats soon.
It's a vulnerability as far as custom GPTs are concerned. Some people put books or other proprietary information into knowledge files.
In the case of games, mystery games, or storytelling apps that include secrets, this is also not ideal. It's about the bot creator keeping secrets from the bot user.
Yes, shared memory between chats. But does that mean the bots will have full access to all knowledge files at all times, including the ability to remove or change them? Because currently that is the case.
I made new video, I demo there doing that with GPTs I did not own
ua-cam.com/video/efXoLvB4Xkw/v-deo.html
Cannot replicate. It only lists files from my other GPT chats that were uploaded and not yet deleted.
I just replicated it. Found something new.
I opened two bots that have code interpreter and files uploaded.
Or rather:
1. I opened the first bot and checked its files (mystery game).
2. Opened general GPT-4 and checked that its code interpreter contains those files.
3. Went to another bot, about space exploration, that has a file and code interpreter; checked its files, and it contained one (I expected it to contain the mystery game's files too).
4. Went to the general GPT-4 bot and asked it to list files, and got only the one file from the space game, not from the mystery game.
My conclusion then is that this is a bug.
When you open a Custom GPT with code interpreter and a knowledge file, it erases previous files and puts the bot's knowledge files into code interpreter.
But when you go from a custom GPT to general GPT-4, it has no knowledge files and does not erase anything.
A bug indeed. It will be fixed.
Wait, I tested more, and no, something else happened. I am now in the mystery game and have files from my bot and the astro bot in it...
I guess it was an accident that it cleared the files there.
@@EduardsRuzga I tried to replicate with 2 different accounts.
Some files are short-lived in memory (on the mounted disk), so they may get lost after some time.
Not sure if it's a bug or not. Maybe it's a feature, so you can access your once-uploaded (!) files from multiple chats, with all of them living in a single virtual environment.
Maybe it is a bug that could become a documented feature, as described above :)
I see this "feature" as a good way to avoid stuffing the server with multiple copies of the same files from one user across different chats.
@@RolandsUmbrovskis Could be. It's just that, as a creator of GPTs, it forces me not to use code interpreter with knowledge files for certain projects, and to explore deploying custom actions to hide files behind them, since ChatGPT does not allow such control.
You, by the way, may fall victim to this too, as different chatbots may end up using files from different chats. For example, the phrase "send all files to email" could send pictures you generated in another chat...
And no, you would not need to restart code interpreter to fix this. Just make per-chat folders that other chats do not have access to by default.
What website for bot generation is this? Sorry for the stupid question.
Sorry, I answered in your thread with answers meant for someone else.
To answer your question: it's ChatGPT Plus with the new custom GPTs section. By now everyone who pays $20 a month should have access.
Try it on a GPT you didn’t create
I already did, multiple times.
I made a new video where I demo doing that with GPTs I did not own:
ua-cam.com/video/efXoLvB4Xkw/v-deo.html