12 Understand Spark UI, Read CSV Files and Read Modes

  • Published 11 Sep 2024

COMMENTS • 23

  • @pradishpranam9108
    @pradishpranam9108 7 days ago

    highly underrated series. Keep doing the good work

    • @easewithdata
      @easewithdata  5 days ago

      Thank you so much for your lovely comment! ❤️ I hope my playlist made it easier for you to learn PySpark.
      To help me grow, please make sure to share with your network over LinkedIn 👍

  • @manishkumar1450
    @manishkumar1450 4 months ago +1

    crisp and clear👌

    • @easewithdata
      @easewithdata  4 months ago

      Thanks ❤️ Please make sure to share with your network over LinkedIn

  • @bidyasagarpradhan2751
    @bidyasagarpradhan2751 8 months ago

    Lots of new things learned today 👍

  • @sambatammavarapu2280
    @sambatammavarapu2280 10 months ago

    really good sessions

    • @easewithdata
      @easewithdata  10 months ago

      Glad you like them! Please make sure to share with your network on LinkedIn ❤️

  • @yo_793
    @yo_793 8 months ago

    AWESOME!

  • @vineethreddy.s
    @vineethreddy.s 4 months ago

    3:00 What do you mean by identifying the metadata? What's the use of it in this context?

    • @easewithdata
      @easewithdata  4 months ago +1

      Metadata means the information about the column names and their datatypes.
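
      A minimal PySpark sketch of how this metadata is picked up when reading a CSV (the file path and options here are hypothetical, just to illustrate the idea):

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("read-csv-metadata").getOrCreate()

      # With inferSchema=True, Spark runs a small job that scans the file
      # to collect the metadata: column names (from the header) and datatypes
      df = spark.read.csv("data/flights.csv", header=True, inferSchema=True)

      # The inferred metadata is visible as the DataFrame schema
      df.printSchema()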

  • @user-nv6ho7uk8b
    @user-nv6ho7uk8b 7 months ago

    Hi Shubham,
    Great content, I am following your series in a Databricks environment. When we read a file, it generates a job to get the metadata. When I check the execution metrics in the Databricks UI, it does not show input size/records, but in your Docker container it does. Where can we check that info in Databricks?

  • @BnfHunterr
    @BnfHunterr 10 months ago +1

    Please make a video on how to write production-grade code and unit testing; these things are not available on YouTube. Can you please make it?

    • @easewithdata
      @easewithdata  10 months ago

      Will surely make a video on that. Thanks for following ❤️

    • @yo_793
      @yo_793 8 months ago

      PySpark Interview Series of Top Companies
      ua-cam.com/play/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0.html&si=m82ejHBVkhSLWFET

  • @abdulraheem2874
    @abdulraheem2874 10 months ago +1

    Can you make some videos about PySpark interview questions?

    • @easewithdata
      @easewithdata  10 months ago

      Sure, will definitely create some on it. Make sure to share this with your network.

    • @yo_793
      @yo_793 8 months ago

      PySpark Interview Series for the top companies
      ua-cam.com/play/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0.html&si=m82ejHBVkhSLWFET

    • @abdulraheem2874
      @abdulraheem2874 8 months ago

      @@yo_793 thank you

  • @omkarm7865
    @omkarm7865 10 months ago

    Can you please do it in Databricks?

    • @easewithdata
      @easewithdata  10 months ago +1

      Hello,
      You can lift and shift the same code to Databricks and it will work. The only difference is that you don't need to create a Spark Session in a Databricks notebook, as it generates one for you.
      Hope this helps.
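
      A minimal sketch of that difference, assuming a hypothetical file path; locally (e.g. in the Docker setup) you build the session yourself, while a Databricks notebook already provides one named spark:

      from pyspark.sql import SparkSession

      # Local / Docker setup: create the session yourself.
      # In a Databricks notebook, skip this line and use the provided spark object.
      spark = SparkSession.builder.appName("read-csv").getOrCreate()

      # Everything below works unchanged in both environments
      df = spark.read.csv("/path/to/file.csv", header=True)  # hypothetical path
      df.show()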

  • @yo_793
    @yo_793 8 months ago

    PySpark Interview Series for the Top Companies
    ua-cam.com/play/PLqGLh1jt697zXpQy8WyyDr194qoCLNg_0.html&si=m82ejHBVkhSLWFET

  • @omkarm7865
    @omkarm7865 10 months ago

    Such a long gap 😅

    • @easewithdata
      @easewithdata  10 months ago

      The series has now resumed. New videos are being published every 3 days. Thanks for following ❤️