Data Validation with Pyspark || Real Time Scenario

  • Published 20 Aug 2024
  • In this video we will discuss how to perform data validation with PySpark dynamically (a rough sketch of the idea follows the description).
    Data Sources Link:
    drive.google.c...
    #pyspark #databricks #dataanalytics #data #dataengineering
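
A rough sketch of the dynamic validation idea described above, assuming a hypothetical reference.csv that lists each source file and its expected columns (pipe-separated); the file paths and column names are placeholders, not the exact files from the video:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("data-validation").getOrCreate()

    # Hypothetical reference metadata: one row per source file,
    # e.g. file_name="sales.csv", expected_columns="id|name|amount"
    reference_df = spark.read.csv("reference.csv", header=True)

    for row in reference_df.collect():
        expected = set(row["expected_columns"].split("|"))
        actual_df = spark.read.csv(row["file_name"], header=True)
        actual = set(actual_df.columns)
        missing, extra = expected - actual, actual - expected
        status = "PASS" if not (missing or extra) else "FAIL"
        print(row["file_name"], status, "missing:", missing, "extra:", extra)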

COMMENTS • 14

  • @mohitupadhayay1439 3 months ago

    Amazing content. Please keep up the playlist of real-time industry scenarios.

  • @ajaykiranchundi9979 3 months ago

    Very helpful! Thank you

  • @vamshimerugu6184 3 months ago

    Great explanation ❤. Keep uploading more content on PySpark.

  • @ArabindaMohapatra 2 months ago

    I just started watching this playlist. I'm hoping to learn how to deal with schema-related issues in real time. Thanks!

  • @listentoyourheart45 8 months ago

    Nice

  • @ComedyXRoad 20 days ago

    Do we apply these techniques to Delta tables as well?
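
A note on the Delta question: the same column-level checks can be run on a Delta table, since it is read into an ordinary DataFrame. A minimal sketch, assuming a placeholder table name (bronze.orders) and placeholder expected columns:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Placeholder expected columns, not from the video.
    expected = {"order_id", "customer_id", "amount"}

    # Read the Delta table like any other source; a path-based read
    # (spark.read.format("delta").load("/path/to/table")) works as well.
    delta_df = spark.read.table("bronze.orders")

    actual = set(delta_df.columns)
    print("missing:", expected - actual, "extra:", actual - expected)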

  • @World_Exploror 6 months ago

    How did you define reference_df and control_df?

    • @DataSpark45 6 months ago

      We would define them as tables in a database. As of now, I have used them as CSV files.
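
To make the reply concrete: the two DataFrames can be loaded from CSV files (as in the video) or from database tables over JDBC; the paths, table names, and connection details below are placeholders:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Option 1: CSV files (placeholder paths).
    reference_df = spark.read.csv("/data/reference.csv", header=True, inferSchema=True)
    control_df = spark.read.csv("/data/control.csv", header=True, inferSchema=True)

    # Option 2: the same data kept as a database table, read via JDBC
    # (placeholder connection details).
    # reference_df = (spark.read.format("jdbc")
    #                 .option("url", "jdbc:postgresql://host:5432/metadata")
    #                 .option("dbtable", "reference_table")
    #                 .option("user", "user")
    #                 .option("password", "password")
    #                 .load())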

  • @skateforlife3679 8 months ago

    Cool, but is it like this every time? Like, you have a reference DataFrame containing all the columns and the file name/path, and you have to iterate over it to see if it matches?
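
Not the only way: instead of iterating over a reference DataFrame, the expected schema can also be declared directly in code and compared to what was read. This is an alternative pattern, not the video's approach, and the column names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.getOrCreate()

    # Expected schema declared in code (hypothetical columns).
    expected_schema = StructType([
        StructField("id", StringType(), True),
        StructField("name", StringType(), True),
        StructField("amount", DoubleType(), True),
    ])

    df = spark.read.csv("/data/sales.csv", header=True, inferSchema=True)

    # Compare field names and types between the declared and the inferred schema.
    expected = {(f.name, f.dataType.simpleString()) for f in expected_schema.fields}
    actual = {(f.name, f.dataType.simpleString()) for f in df.schema.fields}
    print("expected but not found (name, type):", expected - actual)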

  • @OmkarGurme 5 months ago

    While working with Databricks, we don't need to start a Spark session, right?

    • @DataSpark45 4 months ago +1

      No need, brother; we can continue without defining a Spark session. I just kept it for practice.
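
For reference: Databricks notebooks expose a ready-made spark session, and SparkSession.builder.getOrCreate() simply returns that existing session rather than starting a new one, so keeping the line is harmless there and still makes the script runnable locally (the app name is arbitrary):

    from pyspark.sql import SparkSession

    # In a Databricks notebook `spark` already exists; getOrCreate() returns it.
    # Outside Databricks, this creates a new local session.
    spark = SparkSession.builder.appName("data-validation-demo").getOrCreate()
    print(spark.version)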