Nice video. I am a junior data analyst and was wondering if it is possible to use the same concept on an existing db with hundreds of tables. I'm asking because I got some (hard) business inquiries but little explanation about the db itself, and was thinking about a way of having a conversation with your db. Let me know your thoughts.
Maybe index the columns and rows in a vector db.
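A minimal sketch of that idea: instead of sending all 100+ table schemas to the LLM, score each table's description against the user's question and keep only the top-k matches for the prompt. A real setup would use an embedding model and a vector db; here a plain bag-of-words cosine similarity (stdlib only) stands in for the embedding step, and the table names and columns are made up for illustration.

```python
import math
from collections import Counter

# Hypothetical schema descriptions; in practice there could be 100+ of these.
SCHEMAS = {
    "orders": "orders table: order_id, customer_id, order_date, total_amount",
    "customers": "customers table: customer_id, name, email, signup_date",
    "products": "products table: product_id, name, price, category",
    "shipments": "shipments table: shipment_id, order_id, shipped_date, carrier",
}

def vectorize(text: str) -> Counter:
    # Crude tokenizer standing in for a real embedding model.
    return Counter(text.lower().replace(",", " ").replace(":", " ").split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_tables(question: str, k: int = 2) -> list[str]:
    # Rank tables by similarity to the question; a vector db would do
    # this lookup with approximate nearest-neighbor search instead.
    q = vectorize(question)
    ranked = sorted(SCHEMAS, key=lambda t: cosine(q, vectorize(SCHEMAS[t])),
                    reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    # Only the retrieved schemas go into the prompt, keeping it small.
    schema_text = "\n".join(SCHEMAS[t] for t in top_k_tables(question))
    return f"Given these tables:\n{schema_text}\nWrite a SQL query for: {question}"
```

The point is that the prompt stays roughly constant in size no matter how many tables the database has, because retrieval happens before the LLM call.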
Hi, I am trying to reproduce the project shown in the video, but I am unsure how to write a general prompt and then generate SQL by passing table and column information.
Can you elaborate?
@@scientificcoding3153 I am trying to implement this logic on a database that has approximately 100+ tables. How can I make sure the generated SQL query is correct? We cannot pass all 100 tables to the LLM because of the context length.
You will need to use an LLM that allows for longer prompts. With OpenAI's API this is currently not possible, as far as I know. Alternatively, you can host your own LLM and fine-tune it with information about your schema. That way you can use the prompt space exclusively for the query you are trying to build.
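Another way to work within a fixed context limit, sketched below under assumptions: a two-step flow where the model first picks the relevant tables from a compact list of names, and only those tables' columns are sent in the second, SQL-generating prompt. `ask_llm` is a hypothetical stub standing in for any chat-completion call, and the table names are invented for illustration.

```python
def ask_llm(prompt: str) -> str:
    # Stub: a real implementation would call an LLM API here. This fixed
    # answer just lets the flow run end to end for demonstration.
    return "orders, customers"

# Step 1 only needs table names, which stay cheap even with 100+ tables.
TABLE_NAMES = ["orders", "customers", "products", "shipments"]
COLUMNS = {  # hypothetical columns per table
    "orders": ["order_id", "customer_id", "total_amount"],
    "customers": ["customer_id", "name", "email"],
    "products": ["product_id", "name", "price"],
    "shipments": ["shipment_id", "order_id", "carrier"],
}

def select_tables(question: str) -> list[str]:
    # Step 1: ask which tables matter, then validate the model's answer
    # against the known table list.
    prompt = (
        "Which of these tables are needed to answer the question? "
        f"Tables: {', '.join(TABLE_NAMES)}\nQuestion: {question}\n"
        "Answer with a comma-separated list of table names."
    )
    answer = ask_llm(prompt)
    return [t.strip() for t in answer.split(",") if t.strip() in COLUMNS]

def sql_prompt(question: str) -> str:
    # Step 2: full column details, but only for the selected tables.
    tables = select_tables(question)
    schema = "\n".join(f"{t}({', '.join(COLUMNS[t])})" for t in tables)
    return f"Schema:\n{schema}\nWrite a SQL query for: {question}"
```

This trades one extra LLM call for a prompt that fits regardless of how large the schema is, and it avoids fine-tuning entirely.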
@@scientificcoding3153 Are you saying OpenAI can't read a schema with 100 tables? Or just a large join which might return more than 100 columns?