Dude! You are a unicorn 😂
😅
Of all the YouTube videos I watched to understand Custom GPTs vs GPT Assistants, as well as the Actions inside Custom GPTs, yours is by far the best and loaded with genuine help, as I am not a programmer (nor am I a gamer!) but would love to leverage ChatGPT with a custom GPT for my business. Is there any way I could learn how to create a custom GPT to parse a 1000+ page PDF containing a bilingual table to create one flat table in a large XLSX as output? Again, much appreciate your video and wish you the best! You are indeed a gem among YouTubers! Thank you!👍🤘
Thank you so much for your kind words 😊 For your use case I'd probably approach it by writing a script (like in JS or Python) rather than a custom GPT. I am not sure that a GPT can respond with something so large. Here's what I'd try to include in the script:
1. Start with some PDF parsing and pass chunks of it to the OpenAI API to extract necessary data
2. Take the output and write it into a .csv file
I know you're not a programmer but don't worry! You can still get started on this by having ChatGPT write the code for you! Here's a video I have on that: ua-cam.com/video/KjLg9x0WKkY/v-deo.html
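In case it helps, here's a rough sketch of what that script could look like in Python. This is just an illustration: it assumes the pypdf and openai packages, and the file names, model, and extraction prompt are placeholders you'd adapt to your PDF.
```python
# Rough sketch: parse a large PDF in chunks, extract table rows via the OpenAI API,
# and write everything into one flat CSV. File names, model, and prompt are placeholders.
import csv
from pypdf import PdfReader
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
reader = PdfReader("bilingual_tables.pdf")  # placeholder file name

PROMPT = (
    "Extract every row of the bilingual table from the text below. "
    "Return one line per row in the form: source_term;target_term"
)

pages = [page.extract_text() or "" for page in reader.pages]
rows = []
chunk_size = 5  # a few pages per request keeps each call within context limits
for i in range(0, len(pages), chunk_size):
    chunk = "\n".join(pages[i:i + chunk_size])
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"{PROMPT}\n\n{chunk}"}],
    )
    for line in response.choices[0].message.content.splitlines():
        if ";" in line:
            rows.append(line.split(";", 1))

with open("flat_table.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["source_term", "target_term"])
    writer.writerows(rows)
```
The resulting .csv opens directly in Excel, so you can save it as .xlsx from there.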
@VoloBuilds Wow! Thanks! I honestly did not think you'd reply, but you did! :-) I really appreciate you taking the time to respond. You are a very good teacher and a genuinely helpful person. I will be following you regularly. Wish you the best!
I need this deployed for my academic research project. I want to build some OpenAI Assistants and use something like this to allow low-code users to build their own custom GPTs that connect to a knowledge base and the Assistants.
What sort of knowledge bases are you looking to connect to? You could potentially just export data and import it into the Assistants UI or create an automation like the one I describe in this video: ua-cam.com/video/JzxUW0ZT4to/v-deo.html
Actions aren't necessary for knowledge retrieval - but perhaps you have something bigger in mind!
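If you do end up automating the import rather than using the UI, a minimal sketch of the upload step with the OpenAI Python SDK could look like the snippet below. The file name is a placeholder, and how you then attach the uploaded file to a specific Assistant depends on which version of the Assistants API you're on, so check the current docs for that part.
```python
# Minimal sketch: upload an exported knowledge file so an Assistant can use it.
# The file name is a placeholder; attaching it to a particular assistant afterwards
# depends on the Assistants API version you're using.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

uploaded = client.files.create(
    file=open("knowledge_export.pdf", "rb"),
    purpose="assistants",  # marks the file as usable by Assistants
)
print(uploaded.id)  # keep this id to attach the file to your assistant
```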
I was looking around to find out more about the usage/application of Custom GPT knowledge and Actions. Most of what I found was very general, until I saw your video, which opened up other dimensions and inspired possibilities. I like that you provide free or budget-friendly solutions, while many others try to sell a vendor solution or a pricey one (even if it may be more convenient). Really appreciate your HOT (honest, open, and transparent), informative content.
I am considering setting up an "external", locally hosted (in a folder) knowledge base to store all past and latest research reports, so that the Custom GPT (via an Action) can search for information and make recommendations based on the stored reports. Disclaimer: I am not a programmer. Will greatly appreciate your input and guidance.
Hey there, thanks for your feedback! In terms of using a CustomGPT with local files, I'm not sure how that would work. Actions are just API calls, and perhaps you'd be able to create an API on top of some local files, but that seems a bit unusual. I think typically you would upload the files to the GPT so that it can answer questions without the actions. I'm not entirely sure if custom GPTs allow you to query localhost, but let me know if you try!
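If anyone does want to experiment with that, here's a very rough sketch (not from the video) of a tiny search API over a folder of report files using FastAPI. FastAPI serves an OpenAPI schema at /openapi.json, which is what a custom GPT Action consumes, but you'd still have to expose the API over public HTTPS (e.g. through a tunnel), since the Action requests come from OpenAI's side rather than your machine. The folder layout and endpoint are just placeholder assumptions.
```python
# Sketch: tiny search API over a local folder of .txt reports (assumed layout).
# Run with: uvicorn main:app --port 8000
# A custom GPT Action would need this reachable over public HTTPS (e.g. via a tunnel).
from pathlib import Path
from fastapi import FastAPI

app = FastAPI(title="Local report search")
REPORTS_DIR = Path("reports")  # placeholder folder containing .txt reports

@app.get("/search")
def search(q: str):
    """Return snippets from report files that contain the query string."""
    results = []
    for path in REPORTS_DIR.glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        if q.lower() in text.lower():
            idx = text.lower().index(q.lower())
            results.append({"file": path.name, "snippet": text[max(0, idx - 100): idx + 100]})
    return {"query": q, "results": results}
```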
@VoloBuilds Thank you for your response. The reason for this idea is the knowledge limitation of just 20 files.
Very detailed content, thank you!
Thanks for watching :))
I am thinking of building a GPT that suggests the slide format to be used when a user queries it asking for suggestions. The slide formats are stored as images. I was thinking of storing the images in a bucket, and then upon request the cloud function could access them and render them for the user, along with a "similar slides" feature. What's your opinion on this?
Hey, that's a pretty cool idea! Thanks for sharing :) To generalize this further, I think the idea of having a bucket with files and retrieving them or linking to them through a custom action like this makes a lot of sense!
If there's a limited number of slides/formats, you could potentially just create a file containing a list of each slide + URL for the image of that slide, and feed that into the GPT as knowledge. But if you have a large number of slides/files and plan to update that list semi-frequently, it could become a burden updating GPT knowledge all the time, so the Actions approach would be better then!
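If you go the Actions route, a rough sketch of the cloud function side could look something like this. It assumes Google Cloud Functions with the functions-framework package and a hypothetical bucket called "slide-images", and the filename-based matching is just a placeholder for whatever lookup logic you'd actually use.
```python
# Sketch: HTTP cloud function that looks up slide images in a storage bucket
# and returns their URLs so the GPT can link to or describe them.
# Bucket name and filename-based matching are placeholder assumptions.
import json

import functions_framework
from google.cloud import storage

BUCKET_NAME = "slide-images"  # placeholder bucket of slide format images

@functions_framework.http
def suggest_slides(request):
    query = (request.args.get("q") or "").lower()
    client = storage.Client()
    matches = []
    for blob in client.list_blobs(BUCKET_NAME):
        # Assumes slide images are named after their format, e.g. "two-column-comparison.png"
        if query in blob.name.lower():
            matches.append({"slide": blob.name, "url": blob.public_url})
    return (json.dumps({"query": query, "slides": matches}), 200,
            {"Content-Type": "application/json"})
```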
Hey, I'm just scraping the latest jobs from job boards, and I want to run the code in the cloud to scrape and update the data daily. Is this useful for that case?
You can certainly create cloud functions to do scraping, but you'll need to schedule it separately, like with Google Cloud Scheduler. Also, if you're doing a lot of scraping, it might make sense to deploy your scraper on something like Google App Engine instead of cloud functions, so check that out! Do you plan to use the scraped data in a GPT somehow?
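In case it helps, here's a very rough sketch of what the cloud function side of a daily scraper could look like. The job board URL and CSS selector are made up - every board's markup is different - and Cloud Scheduler would call this endpoint on a daily cron.
```python
# Sketch: HTTP cloud function that scrapes a job board; Cloud Scheduler would
# call it once a day. URL and CSS selector below are placeholders, and many
# boards need pagination or an official API instead.
import json

import functions_framework
import requests
from bs4 import BeautifulSoup

JOB_BOARD_URL = "https://example.com/jobs"  # placeholder

@functions_framework.http
def scrape_jobs(request):
    html = requests.get(JOB_BOARD_URL, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    jobs = [tag.get_text(strip=True) for tag in soup.select(".job-title")]  # placeholder selector
    # In a real version you'd write these somewhere persistent (Firestore, a bucket, a DB)
    return (json.dumps({"count": len(jobs), "jobs": jobs}), 200,
            {"Content-Type": "application/json"})
```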
Excellent! Thank you for posting this!
Thanks for watching!! :)
Your videos are 🔥 - it's so hard to find quality content. You'll rise to the top, man.
Is there a Discord discussing any of this? A space for project collaborators to actually collaborate on all this would be great.
Thank you so much 😊 this made my day! I don't have a Discord yet, but will put one together in the coming months!
Thanks, this is great!
Thanks for watching! :)
So sad I get an error in my VSCode after I try the npm install - super sad face.
Try asking ChatGPT to help troubleshoot things!
@VoloBuilds I won't lie to you, that is the worst answer ever, but I hear you.
@tylanmillertech LOL fair enough, I just don't have any info to go off of (the error message, etc.) - and I can't help in real time, so I figured it was the easiest way haha
@VoloBuilds You're good, boss man, I was just making a funny. I am still learning VS Code myself. I do think sometimes there should just be a bit more detail, as some things still need to be installed on the computer, etc. Love your vids though.
@tylanmillertech Yeah, I hear you, the initial learning curve can be frustrating for sure. Wishing you all the best!