Hive Practical - 1 | Hindi | Internal and External Tables

  • Published 7 Sep 2024

COMMENTS • 97

  • @TechnologicalGeeksHindi  3 years ago

    For online classes, visit our website:
    technologicalgeeks.com/
    Course details: ua-cam.com/video/KBK85ETH5nI/v-deo.html

  • @Xavierpng  11 months ago

    hats off to the explanation. amazed to see the explanation. ❤️

  • @neelbanerjee7875  5 years ago +2

    Best.. even some paid courses can't match this level of explanation..!! Need much more from you..!! Thanks buddy..

  • @ajaykushwaha4233  5 years ago +2

    Awesome, brother! I had been struggling to understand HiveQL since this morning, and your tutorial broke the ice.

  • @aditya19945  3 years ago +1

    Major appreciation from a finance grad struggling as a data analyst. 👍

  • @shubhamhavale4941  2 years ago

    Brother.. top class.. I wish you had been my mentor; what I couldn't grasp in 3 months is now clear..✌️

  • @sharanmk7847  5 years ago +3

    Your video solved many of my doubts. Keep up the good work

  • @kritikagoswami4876  6 years ago +2

    Awesome!!!! You explained the concept in a wonderful way! Thank u!

  • @souravmadhesiya9307  3 years ago

    What an explanation and practical session. You are amazing, brother...

  • @sr678-u7t  5 years ago

    Thanks brother, this is a top-class video. I watched it a day before my exam and understood everything.

  • @chetanamohapatro4174  6 years ago +1

    Awesome... an IT person needs such explanations... seriously, please upload the complete Spark videos.. it will be a great help...

  • @clgurumurthy  6 years ago +1

    Brother! Superb... I became your fan... the way you speak... like in the movies... :)

  • @vaibhav_data  2 years ago +1

    Best explanation

  • @AnkitSawantt  4 years ago

    Awesome. There are still 4 people who think this video wasn't useful. They are either jealous of this guy being awesome or have no idea of big data.

  • @abhishektripathi2028  6 years ago +1

    I like the way you teach. Thank you so much for the valuable tutorials.

  • @rajb.9178  1 year ago

    Very good explanation, it helped me understand the difference!

  • @creative_humans  3 years ago

    Great, brother... keep sharing

  • @SumanYadav-vf8dn  6 years ago

    I like the way you explain the topics. Looking forward to HBase and Kafka videos; it will be a great help.

  • @shashanksaini  3 years ago

    Hi Sandip, your videos are amazing and easy to understand. I pray that you achieve whatever you have set your mind to. I am new to this field.

  • @junkingjunking8477  3 years ago +1

    Awesome explanation. Thank you. :)

  • @mohammedameen5210  2 years ago

    Outstanding explanation

  • @harshadchopade4655  2 years ago

    I wish I had a teacher like him throughout my academic life

  • @ayushcomputex1278  7 years ago +2

    Absolutely brilliant...

  • @NiharRanjanSamantaray  4 years ago

    Very Nice Video Bro... Helped me a Lot

  • @dhruuv7772  6 years ago +1

    best on YouTube

  • @abhishekkumar-es1wl  5 years ago

    extraordinary !!! waiting for streaming & Kafka videos...

  • @imohitr888  4 years ago

    You rock, brother. Please come up with more tutorials on Hive and Spark.

  • @kalpanajain933  3 years ago

    Awesome explanation, Thanks a lot!

  • @ArpitSingh-cf3yu  6 years ago +5

    Brother, please make a complete video on one live use case: live streaming with Kafka, processing it in Spark, querying it in Hive, etc. Build that complete pipeline, i.e. one use case from a live project.
    Waiting for your response, bro!!!

    • @TechnologicalGeeksHindi  6 years ago +1

      Sure brother, I'll upload it soon.

    • @phanikishoreavadhanula8652  5 years ago

      Brother, haven't you been uploading videos in the meantime? Please upload an end-to-end live project video as requested in the comment above.

  • @Saniragroup  2 years ago

    Very nice sir

  • @shubhamselkari1969  2 years ago

    Sir, you explain so wonderfully! Please create some more content on big data.

  • @AbdulKadir-sq4mf  2 years ago

    Outstanding, sir

  • @RizwanShaikh-xs8bq  4 years ago

    Amazing, buddy. Please upload more detailed videos on Hadoop projects and ecosystem components 👍

  • @soumitrapathakbitsizecompu9070  5 years ago

    Brother, aren't you uploading new videos anymore?
    You have made really good videos..
    You nailed it..

  • @devendrakumar-ve4wv  7 years ago +1

    Brother, I've become your fan

  • @nonie1107  6 years ago

    Awesome video! !!!

  • @prasadsasane9139  5 years ago

    Very nice explanation, brother

  • @abhishekvaidya1797  6 years ago

    Hi brother...you are doing a great job...please create Apache spark videos for beginners...

  • @yogitakarande1233  5 years ago

    Nice videos Sir...

  • @vineetmimrot3202  5 years ago

    Whatever else, seeing "Bhai's" photo on the wallpaper was great fun.. :D

  • @akshaylanjudkar5167  1 year ago

    Thanks boss

  • @sreyashjaiswal6830  6 years ago

    big fan of yours

  • @ashwanishukla8029  4 years ago

    Thanks for uploading this 👏👏

  • @harshsaxena2179  6 years ago

    good job sir

  • @soumyastar5274  7 years ago

    Very nice, dear friend... please upload more practical videos on Hadoop

  • @mahesh84  3 years ago

    Explanation on another level... lol

  • @prathammishra2881  2 years ago

    Sir, if we want to see the external table on localhost the way we see the internal table, then what should we do?

  • @ameytsen8158  6 years ago

    Very nice explanation... keep posting videos like this. Thanks a lot. I had a question: what would happen if we don't specify any location while creating an external table? Please clarify this doubt of mine.

    • @TechnologicalGeeksHindi  6 years ago

      +Amit Sen We have to specify the location while creating an external table, as the framework needs to know which file is to be loaded into the table.
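
      For reference, a minimal HiveQL sketch of that idea; the table name, columns, delimiter, and HDFS path below are hypothetical, not taken from the video:

      -- External table: Hive records only the schema and a pointer to the data (hypothetical names).
      CREATE EXTERNAL TABLE student_ext (
        id   INT,
        name STRING
      )
      ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ','
      LOCATION '/Demo/student_ext';  -- HDFS directory that holds the data files (hypothetical path)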

  • @maheshparab452  6 years ago

    I liked your Hive sessions.
    Can you upload videos on Flume practicals?

  • @shivambhardwaj9009  6 years ago

    Sir, I am from Haryana and have been following you for quite some time. Please make a video on how to install Hadoop and how to run all these commands on our own laptop... i.e. what to install and where to get it from... please give all the info.

  • @snehakavinkar2240  4 years ago +1

    Considering we can create multiple tables pointing to the same location, can we create both internal and external tables that point to the same location? Thank you!

  • @datascienceds7965  6 years ago

    Your videos are a good resource for completing my project work. As a non-Hindi speaker (I mentioned this on another video), I could follow a bit of what you were saying since you used English words. But I couldn't get what you meant by the commands 'use default;' and 'use demo;'. Can I ask what that was? And could you also explain the concept of internal and external tables again, please?

    • @sahilpandey1043  11 months ago

      'USE demo' switches to the demo database; otherwise Hive works in the default database.
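
      A short sketch of how those statements are typically used in the Hive shell; the demo database is the one created in the video, the rest is illustrative:

      SHOW DATABASES;   -- lists the available databases, e.g. default and demo
      USE default;      -- subsequent statements run against the default database
      USE demo;         -- switch to the demo database
      SHOW TABLES;      -- now lists the tables that belong to demo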

  • @ahmadali-cz4gu  7 years ago +1

    Brother, please add more videos :-D waiting :D

    • @TechnologicalGeeksHindi  7 years ago +1

      +ahmad ali Sure, brother 😊. I'm a bit busy, but I'll try to upload them soon 😊

  • @naynadhone5908  5 years ago

    Thanks ....

  • @Raj780able  4 years ago

    You have not shown where the external table gets stored; all the data is under HiveData only.
    As per my understanding, with an external table we are using data from external files, which is then visible through the external table. That's why even after dropping the table only the metadata gets deleted.

    • @TechnologicalGeeksHindi  4 years ago

      In an external table we are just mentioning the location where the data is stored, so the external table is only pointing at the data, and that data lives in HDFS.
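
      A hedged HiveQL sketch of that behaviour; the table name, columns, and HDFS path are assumptions for illustration:

      -- The table only points at whatever files sit under /data/employee in HDFS (hypothetical path).
      CREATE EXTERNAL TABLE emp_ext (id INT, name STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/data/employee';

      DROP TABLE emp_ext;   -- removes only the metadata; the files stay in /data/employee

      -- Re-creating the table over the same location makes the surviving data queryable again.
      CREATE EXTERNAL TABLE emp_ext (id INT, name STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/data/employee';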

  • @rohitpandey8  6 years ago

    Hello Sandeep, I have a question in my mind. Big data consists of structured, unstructured and semi-structured data. As taught by you, for structured data we can use Hive, and for unstructured data we need to write MapReduce code. But how will semi-structured data be processed, as it mainly consists of Excel data? Curious to know this, as you already mentioned that if data type discrepancies arise the query will fetch NULL, and Excel data has no data types.

    • @TechnologicalGeeksHindi  6 years ago

      +Rohit Pandey Semi-structured data is data for which we don't have a schema, but we do know its structure; as shown in the example, CSV is semi-structured data. While loading this semi-structured data into Hive we specify a schema, and the semi-structured data is converted into structured data. We can export CSV/TSV from Excel and load it into Hive by specifying datatypes for the fields.
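
      A small sketch of that workflow in HiveQL; the file path, table name, and column types are assumptions for illustration:

      -- The CSV exported from Excel carries no types; Hive imposes the schema at load time (hypothetical table).
      CREATE TABLE sales (
        order_id INT,
        product  STRING,
        amount   DOUBLE
      )
      ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ',';

      -- Load the exported CSV (hypothetical path); values that don't fit the declared types show up as NULL.
      LOAD DATA LOCAL INPATH '/home/cloudera/sales.csv' INTO TABLE sales;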

  • @amitkumar-wh6mx  3 years ago

    When I create a table, this error appears; I am using Cloudera Manager.
    NoViableAltException(26@[1750:103: ( tableRowFormatMapKeysIdentifier )?])

  • @harshadborkar2550  3 years ago

    What is the difference between the LOAD DATA LOCAL INPATH and hdfs dfs -put commands?

  • @user-qc7dc5sl3d  1 year ago

    Where is the practical video on Sqoop? Please share the link in the comment section.

  • @nikhilsinha3601  6 years ago

    The external table is created, but it is not showing up in HDFS. Please help me.

  • @abcdefghi9776  5 years ago

    Hi Sir, please make a video on Flume too.

  • @Rohit-cz5un  7 years ago

    Sir, while creating an internal table, if we give a location as well, then where will my data get stored? Is it stored in the /user/hive/warehouse directory, or will it point to that location? I have some doubts regarding this; please clarify.

    • @TechnologicalGeeksHindi  7 years ago

      The LOCATION clause is given to override the default location (/user/hive/warehouse), so when we specify LOCATION '/hdfs_directory' in our CREATE TABLE command, all the files that we import into that table will be copied into the hdfs_directory we specified.
      Still, there is a difference between external and internal tables: when we drop an internal table, no matter where the data (data + metadata) is stored, it gets deleted. That won't be the case with an external table; when we drop it, only the metadata is deleted.
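
      A brief HiveQL sketch of that contrast; the table names and the directories are hypothetical:

      -- Managed (internal) table with a custom location: DROP removes metadata AND files.
      CREATE TABLE demo_internal (id INT, name STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/hdfs_directory/demo_internal';

      DROP TABLE demo_internal;   -- the files under /hdfs_directory/demo_internal are deleted too

      -- External table at a custom location: DROP removes only the metadata.
      CREATE EXTERNAL TABLE demo_external (id INT, name STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/hdfs_directory/demo_external';

      DROP TABLE demo_external;   -- the data files under /hdfs_directory/demo_external remain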

  • @maheshparab452  6 years ago

    I want to work on big data and Hadoop; could you give me some suggestions?

  • @rohitdesai183  2 years ago

    I created the database and used it, and my CREATE TABLE worked. Then I ran this command: LOAD DATA LOCAL INPATH '/home/cloudera/employee_data.csv' INTO TABLE employee_data; after it ran, when I issued the command to show the table, the details were all NULL.

  • @varultyagi  6 years ago

    Please upload HBase and Sqoop videos too, brother.

  • @ishasaini2872  7 years ago

    Sir, these files student and student1 are only a few bytes in size, but the blocks being created are in MB, so aren't we using more memory than needed? And will the free space inside those blocks ever be freed???...
    And does this also mean that even if a file is smaller than the block size, a block of the same (full) size will be created!!!!

    • @TechnologicalGeeksHindi  7 years ago

      File blocks are created only as large as the file itself; you might see 128 MB blocks in the browser window. For a detailed description of a particular file and its blocks, execute the following command:
      $ hadoop fsck /Demo/student -files -blocks
      After this command executes, the average block size is shown next to the total size, and you won't see 128 MB there...

    • @ishasaini2872  7 years ago

      Technological Geeks Hindi OK, I will check... and thanks

    • @ishasaini2872  7 years ago

      Sir, can you also provide some video or link for Spark...
      if you have one, please share

    • @TechnologicalGeeksHindi  7 years ago +1

      +Isha Saini I will be uploading a video series on Spark after completing Hadoop 😊
      I would suggest you have a look at the official Spark documentation:
      spark.apache.org/documentation.html

  • @piyush98182  7 years ago

    Sir,
    you used start-dfs.sh and start-yarn.sh,
    but if we use start-all.sh instead,
    a warning appears: 'this script is deprecated, instead use start-dfs.sh and start-yarn.sh separately'.
    Why does that happen?

    • @TechnologicalGeeksHindi  7 years ago

      +Piyush Ghildiyal Because that command has become outdated and may not be there in the next version, so users should use the updated commands 😊

    • @australiatravel  6 years ago

      In Hadoop 1.x the command start-all.sh was used. But in the upgraded Hadoop 2.x, which includes YARN, it has been split into start-dfs.sh and start-yarn.sh, in order to run the HDFS daemons and the YARN daemons in an efficient manner.

  • @pkcommence4601  2 years ago

    Now, how do we stop Hive?

  • @meswapnilspal  6 years ago

    @sandeep what are those "~" files for ?

  • @kiva1823  7 years ago

    Hi friend,
    please send me links to some videos for system administrator or Linux administrator.

    • @TechnologicalGeeksHindi  7 years ago +1

      +K.G Choudhry Sure, brother 😊. Most of Edureka's videos are available on YouTube.
      Still, if you want to do a certification, I would suggest that joining an institute would be better. Also, share your email ID; if I find the practical videos, I will share them 😊

    • @kiva1823  7 years ago

      kailashchoudhry23@gmail.com

  • @AvinashGale  5 years ago

    Ultimate explanation