Thank you for uploading this wonderful session.
I have a question about 11:57: why should we create a normal (managed) Hive table for the CSV? We could have created an external table, so that the data simply stays in HDFS, and then copied the data from that external table into the ORC table. Creating a normal Hive table for the CSV copies the whole CSV into /user/hive/warehouse, whereas an external table would not do that.
Please suggest.
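A sketch of the external-table approach described above (the table names, columns, and HDFS path are hypothetical):

```sql
-- External table: Hive records only the schema; the CSV stays at its HDFS location.
CREATE EXTERNAL TABLE sales_csv (
  id INT,
  amount DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/sales';  -- hypothetical HDFS directory holding the CSV

-- Managed ORC table in the warehouse, then copy the rows across.
CREATE TABLE sales_orc (
  id INT,
  amount DOUBLE
)
STORED AS ORC;

INSERT OVERWRITE TABLE sales_orc
SELECT * FROM sales_csv;

-- Dropping the external table removes only its metadata; the CSV stays in HDFS.
```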
Can we insert data into a table from a local file using the LOAD DATA command?
Yes. LOCAL means the file is on the local filesystem of the machine running the Hive client, so we can run LOAD DATA LOCAL INPATH ... INTO TABLE. Without LOCAL, the path is read from HDFS.
We can move the file from Windows to the Hadoop cluster's edge node using the WinSCP tool.
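The two LOAD DATA variants discussed above could look like this (the paths and table name are hypothetical):

```sql
-- From the client's local filesystem: the file is COPIED into the table's
-- warehouse directory, so the original stays where it was.
LOAD DATA LOCAL INPATH '/home/user/sales.csv' INTO TABLE sales_csv;

-- From HDFS: the file is MOVED into the table's directory.
LOAD DATA INPATH '/data/sales.csv' INTO TABLE sales_csv;
```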
Where do we need to add the JAR from: HDFS or local?
We add it from the local filesystem; we don't copy it into HDFS.
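The ADD JAR step being discussed, with a hypothetical local path (the JAR is registered for the current session only):

```sql
-- Registers the SerDe JAR from the local filesystem for this Hive session.
ADD JAR /home/user/jars/json-serde.jar;  -- hypothetical path

-- Verify the JAR was added to the session classpath.
LIST JARS;
```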
SerDe (compared to Java serialization):
- create a table with JSON row format (table_json)
- copy the data from the normal table to table_json
- view the underlying data in JSON format

Real-time (optimized format):
- load a file in normal text format
- create a table with an optimized format
- copy the data from the normal table to the optimized table
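The SerDe steps in these notes could be sketched as follows (the table names and columns are assumptions; the JsonSerDe shown ships with Hive's HCatalog, though some setups use a third-party SerDe instead):

```sql
-- Plain text staging table, loaded from a delimited file.
CREATE TABLE users_txt (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- JSON-backed table using the built-in HCatalog JsonSerDe.
CREATE TABLE table_json (id INT, name STRING)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE;

-- Copy rows across; the files under table_json's warehouse directory
-- will now contain one JSON record per line.
INSERT OVERWRITE TABLE table_json
SELECT * FROM users_txt;
```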
@nanda kishore How do we create an optimised table?
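One common way to create an optimized table in Hive is STORED AS ORC, matching the optimized-format step in the notes above (the table names are hypothetical):

```sql
-- Optimized columnar table; ORC stores data with built-in indexes and compression.
CREATE TABLE users_orc (id INT, name STRING)
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'SNAPPY');  -- optional: choose the codec

-- Populate it from the plain text table.
INSERT OVERWRITE TABLE users_orc
SELECT * FROM users_txt;
```

Parquet (STORED AS PARQUET) is the other widely used optimized format; the pattern is the same.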