This is genius and amazing. I use Supabase with Swift for iOS. I use Apple's Playgrounds as my Swift sandbox for testing things, isolating views from the rest of the project, etc. Now I finally have a Postgres sandbox!!
Love this. I struggled to come up with a db design; now you can brainstorm it here and move on to the next step in your stack. I imagine deploy is a little harder, but this is already very usable! Thanks so much
This is so so amazing... we are using Apache Superset, and the chart creation and data handling flow is tough. We could integrate this into Superset to simplify the flow.
Just to be clear, as a complete newbie to this: I can just drop in a CSV file, it reads the CSV and generates the table and other necessary components, and then I can link the table with my Next.js projects or anything. Am I correct? Thanks!
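That is roughly the flow shown in the demo: drop a CSV, get a table. As a toy illustration (this is an assumed sketch, not the actual postgres.new implementation), type inference from a CSV can be as simple as sampling each column's values and guessing a Postgres type:

```typescript
// Hypothetical sketch of CSV-to-table inference: sample values per column,
// guess a Postgres type, and emit a CREATE TABLE statement.
type Row = Record<string, string>;

function guessType(values: string[]): string {
  if (values.every((v) => /^-?\d+$/.test(v))) return "bigint";
  if (values.every((v) => /^-?\d+(\.\d+)?$/.test(v))) return "numeric";
  return "text";
}

function inferCreateTable(table: string, rows: Row[]): string {
  const columns = Object.keys(rows[0]).map((name) => {
    const type = guessType(rows.map((r) => r[name]));
    return `"${name}" ${type}`;
  });
  return `create table "${table}" (${columns.join(", ")});`;
}

const rows: Row[] = [
  { id: "1", title: "Buy milk", price: "3.50" },
  { id: "2", title: "Walk dog", price: "0" },
];
// inferCreateTable("todos", rows)
//   → create table "todos" ("id" bigint, "title" text, "price" numeric);
```

Once a table exists in a real Supabase project, linking it to a Next.js app is the usual supabase-js client setup.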
Ready to put your database into production? 👉 database.new
my next video "how I deployed 100 databases in 24 hours"
this is bananas. 🍌
i'm learning Postgres right now and this is like a cheat code + private tutor + brainstorming partner + etc...💝
Absolutely incredible work 🤯
Amazing work guys! Appreciate all you are doing for the community as a whole!
Impressive Demo 😊
Wow. To do this for absolutely free is incredible. Thank you guys.
This is awesome. Thank you! The pgvector inclusion is so cool.
Amazing for prototyping and fast starts, great work
Absolutely awesome
Mindbogglingly cool demo!
This is awesome. Also Greg! The channel needs more regular doses of Greg.
Nice UI, we live in the golden age of frontend
This looks nice for start to operate with data! Thanks
love it! saves me time figuring out the structure of the database for my project
Fantastic service. I cannot wait to use it
Love it. Great work!
Very cool feature, I'm trying it and absolutely in love with it
Best part is it's free... I don't know how to express my appreciation for your efforts. I really like your service
So cool!
This is amazing! Woooow.
damn you guys snatched up all the good domains lmao
😂
Beautiful demo! What tool did you guys use to record the demo?
www.screen.studio/
this is amazing!
awesome supabase tech stack
Insane development tool!! Love it ❤
🤯 whoa! this is amazing!
I can see myself using this. Thanks👍
Bro what?! Amazing!
Amazing demo 🎉🎉
Good job!
This is crazy!
This is nefarious work❤
Well done awesome!
This is a GAME. CHANGER.
this is just.. wow!!!
I'm impressed
POC game changer.
It's like a dream ✨
Y'all be crazy 💚
Amazing
Do you send columns to OpenAI for processing in every query?
Which LLM does it use in the background? The input query is being sent to an LLM, so how do we protect sensitive information?
+
Currently GPT-4o, but you can run the project locally and point it to whichever LLM you like (including local Ollama).
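Since Ollama exposes an OpenAI-compatible API, pointing a local deployment at it mostly means swapping the base URL. A minimal sketch, assuming Ollama's default `http://localhost:11434/v1` endpoint; `buildChatRequest` is a hypothetical helper, not part of the real project:

```typescript
// Hedged sketch: build an OpenAI-style chat request aimed at a local Ollama
// server instead of api.openai.com. Ollama ignores the API key, but the
// OpenAI wire format requires the Authorization header to be present.
const OLLAMA_BASE_URL = "http://localhost:11434/v1";

function buildChatRequest(model: string, prompt: string) {
  return {
    url: `${OLLAMA_BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer ollama", // placeholder key
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

const req = buildChatRequest("llama3", "Design a schema for a todo app");
// await fetch(req.url, req.init) would hit the local Ollama server
```

With this setup, no query or schema data leaves your machine, which addresses the sensitive-information concern above.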
How do I restore the downloaded tar file and integrate it into a local or remote project? Maybe there should be a community for us to discuss this.
Is it possible to do that for BigQuery?
I cannot access the website functions, it tells me about the remaining
Great, great, great 👍
Mindblowing ❤❤❤🎉🎉🎉
Hey, if my CSV file has around 1000 columns and a query comes in, how are the relevant columns selected? Can you describe it in detail?
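The Supabase folks point at pgvector elsewhere in this thread, so one plausible answer (this is an assumed sketch, not their confirmed implementation) is embedding-based retrieval: embed each column name/description, embed the question, and keep only the top-k most similar columns in the LLM prompt. A toy token-overlap score stands in for a real embedding model here:

```typescript
// Hedged sketch of relevant-column selection for very wide tables.
// score() is a stand-in for cosine similarity between real embeddings.
function score(query: string, column: string): number {
  const q = new Set(query.toLowerCase().split(/\W+/));
  return column
    .toLowerCase()
    .split(/[_\W]+/)
    .filter((token) => q.has(token)).length;
}

// Rank all columns against the question and keep the k best matches.
function topColumns(query: string, columns: string[], k: number): string[] {
  return [...columns]
    .sort((a, b) => score(query, b) - score(query, a))
    .slice(0, k);
}

const cols = ["order_id", "customer_email", "order_total", "shipped_at", "notes"];
// topColumns("total revenue per order", cols, 2) keeps the order-related
// columns and drops the other 998 from the prompt in the 1000-column case.
```

The point of the design is that the LLM never needs to see all 1000 columns; only the shortlisted ones are sent along with the question.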
video about the sync, please.
Whoa!
If it needs OpenAI, the solution is not free. Does it work with a local open-source LLM?
Amazing
incredible
🔥🔥🔥🔥🔥
great!
Sorry if I missed it (I DID watch the whole video)...
Can we use this to create everything and then port it over to our SB account?
That's the deploy feature (on the way) that Greg mentioned at the end
@Supabase Awesome. Love it.
Impressive
oh my god ^^
Wish You Luck!
game changer
Mind blowing mind boggling 🫨
Can we give it direct access to our database without uploading it as a CSV? If not, how can we accomplish that?
How can I deploy this locally?
wow
good job
That sounds like an episode of Silicon Valley
I can't believe this is free.
I'm confused as to how it is local-only, since it appears to be doing some AI processing, e.g. when you run queries. Where are the queries going?
read about pgvector
It goes to OpenAI for gpt-4o currently (according to Supabase CEO)
I don't like it going to OpenAI... what's the use in that...
The problem with AI is that it's really good with trivial tasks but fails badly with complex tasks.
So no need to worry that AI is gonna replace developers 😅
But the AI generated 4 tables; it's a complex system after all.
Whatttttt?
Bye bye analyst, soon data Eng too
this is ridiculous
please do a tutorial to explain how the hell you made this!
Incredible 🤯🤌
Mind blowing
game changer