I watched and read several articles on how to add a .csv file but wasn't able to do it. I finally figured out from your video that I should have checked the header option. Thanks, mate!
Is it really necessary to create each table manually? I have a MySQL database with 120 tables and tens of thousands of rows of data, and some tables have 20-100 columns. Will this take a week just to create them?
Wow, a great, fun, and brief lesson!
What if you are trying to import a CSV file that has 300 columns? Do you have to manually type in all the column names and column types? There has to be an easier way.
exactly
You could do it programmatically with a script to create the fields.
@JeffreyJohnsonChow please
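A minimal sketch of that scripted approach (assuming Python and a CSV with a header row; the table name, sample data, and the naive type-guessing here are all illustrative, not a production tool):

```python
import csv
import io

def guess_type(values):
    """Naively guess a PostgreSQL column type from sample values."""
    try:
        for v in values:
            int(v)
        return "INTEGER"
    except ValueError:
        pass
    try:
        for v in values:
            float(v)
        return "NUMERIC"
    except ValueError:
        return "TEXT"

def create_table_sql(csv_file, table_name):
    """Build a CREATE TABLE statement from a CSV header and a sample of rows."""
    reader = csv.reader(csv_file)
    header = next(reader)
    sample = list(reader)[:100]  # inspect up to 100 rows per column
    cols = []
    for i, name in enumerate(header):
        values = [row[i] for row in sample if i < len(row)]
        cols.append(f'"{name}" {guess_type(values)}')
    return f'CREATE TABLE "{table_name}" ({", ".join(cols)});'

# Example with an in-memory CSV:
data = io.StringIO("id,price,city\n1,9.99,Boston\n2,12.50,Austin\n")
print(create_table_sql(data, "sales"))
# prints: CREATE TABLE "sales" ("id" INTEGER, "price" NUMERIC, "city" TEXT);
```

You would run the generated DDL once per file, then use the normal COPY / import dialog to load the data.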
That is a gigantic problem with PostgreSQL, and yet they shamelessly try to compare this garbage to MSSQL.
Good question! Your guess is as good as mine.
Thank you very, very much! From Argentina.
May I ask a question? What if I have to import a CSV file with 300 or 500 columns, provided by the company I am working with. Do I really have to do all of that manually? It would take a while before they get the insights they wanted. And what if there are 10 tables with 400 columns each? Do I really need to do that to provide the insights the company is expecting?
Good question! Your guess is as good as mine.
Is there a way to import CSV files without defining the table structure first? Upon importing, it would automatically detect the headers and create the columns.
Yes, I am looking for the same solution.
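pgAdmin's import dialog itself needs the table to exist first, but outside of it, pandas can infer the columns from the CSV header and create the table for you with `to_sql`. A minimal sketch (using an in-memory SQLite database so it runs anywhere; for PostgreSQL you would pass a SQLAlchemy engine instead, and the connection string shown in the comment is an assumption):

```python
import io
import sqlite3

import pandas as pd

# Read the CSV; pandas infers column names from the header and dtypes from the data.
csv_data = io.StringIO("id,price,city\n1,9.99,Boston\n2,12.50,Austin\n")
df = pd.read_csv(csv_data)

# to_sql creates the table automatically -- no manual CREATE TABLE needed.
# (SQLite here for a self-contained demo; for PostgreSQL you would use a
# SQLAlchemy engine, e.g. create_engine("postgresql://user:pass@localhost/db"),
# which is an assumed example, not from the video.)
conn = sqlite3.connect(":memory:")
df.to_sql("sales", conn, index=False)

print(conn.execute("SELECT * FROM sales").fetchall())
```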
I'm from Brazil, thanks for the help
I'm using this in Docker and I can't access my local files. Do you have any resources?
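If the PostgreSQL server runs inside Docker, the CSV has to be visible to the server process, not just to your host machine. Two common options (the container name `my_postgres` and the paths here are assumptions for illustration):

```shell
# Option 1: copy the file into the running container, then point the
# pgAdmin import dialog (or COPY) at /tmp/data.csv inside the container.
docker cp data.csv my_postgres:/tmp/data.csv

# Option 2: mount a host folder into the container when you start it,
# so files in ./csv on the host appear under /import inside the container.
docker run -d --name my_postgres -v "$(pwd)/csv:/import" postgres
```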
Sir, thank you very much for the technique. I can apply it in my office job.
Thanks mate, helped me a lot.
Thank you so much!
Thank you
Good tutorial, thanks for that
Thank you so much, sir. I was trying to import data using COPY and it gave me an error. After watching your tutorial, it really helped me.
Thank you!
Good tutorial.
Hi
I tried but it keeps failing every time
The "Process started" step is successful, but the one after it isn't.
I also had similar problems, but if you check the error message you should see what went wrong. In my own case I made a mistake in the datatype. Check your CSV file properly to ensure that each column has the same datatype in all rows and that there are no errors in the CSV file.
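A quick way to find the offending cell before importing: scan the file and flag rows whose cells don't parse as the same type as the first data row. A minimal sketch (stdlib only; the sample data is made up):

```python
import csv
import io

def infer(value):
    """Classify a single cell as 'int', 'float', or 'text'."""
    try:
        int(value)
        return "int"
    except ValueError:
        pass
    try:
        float(value)
        return "float"
    except ValueError:
        return "text"

def find_inconsistent_cells(csv_file):
    """Report (row, column, value) where a cell's type differs from row 2's."""
    reader = csv.reader(csv_file)
    header = next(reader)
    expected = None
    problems = []
    for row_num, row in enumerate(reader, start=2):  # row 2 = first data row
        types = [infer(v) for v in row]
        if expected is None:
            expected = types  # treat the first data row as the baseline
            continue
        for col, (got, want) in enumerate(zip(types, expected)):
            if got != want:
                problems.append((row_num, header[col], row[col]))
    return problems

data = io.StringIO("id,price\n1,9.99\n2,oops\n3,4.50\n")
print(find_inconsistent_cells(data))  # → [(3, 'price', 'oops')]
```

Each hit tells you the line number, column name, and bad value, which is usually exactly what the COPY error is complaining about.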
I tried doing something like this and it failed.
Thanks