19. Databricks & Pyspark: Real Time ETL Pipeline Azure SQL to ADLS

  • Published 30 Jan 2025

COMMENTS • 99

  • @maheshwarkuchana190
    @maheshwarkuchana190 2 years ago +4

    A one stop solution to understand basics of ETL in Databricks. Thanks Mr. Raja for such amazing tutorials on your channel. We really benefit from them. Many thanks again.

  • @gopireddybhargavi2844
    @gopireddybhargavi2844 1 year ago +1

    Thank you so much, sir. I actually had support-type experience in Azure; I just followed all of your videos and now I got placed in 2 MNCs.

    • @rajasdataengineering7585
      @rajasdataengineering7585  1 year ago

      Thanks Gopi for sharing your experience.
      It is really amazing to know that you got placed in MNCs. All the very best!
      If you find this channel helpful, just spread the word across data engineering communities so that others can benefit too.

  • @OurOrangeKitchen
    @OurOrangeKitchen 2 years ago +2

    Beautiful explanation and a very good example of ETL. Thanks a lot for this video. It helped a lot in gaining a clear picture of the ETL process in Databricks.

  • @archichaudhari9283
    @archichaudhari9283 10 months ago +2

    Best video series.

  • @tejaswinisunkavalli2086
    @tejaswinisunkavalli2086 11 months ago +1

    Great explanation and easily understandable 👏👏

  • @ranaumershamshad
    @ranaumershamshad 7 months ago

    I read data from a huge table in Azure SQL DB and wrote it to ADLS. It created one 900 MB file instead of multiple partitions. Is there any parameter we can change to create the partitions?
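
    A hedged sketch of what the commenter is asking about: a JDBC read without partitioning options yields a single-partition DataFrame, so the write emits one large file. Repartitioning before the write, or partitioning the JDBC read itself, splits the output. All names (`OrderID`, paths, bounds) are placeholder assumptions, not values from the video.

    ```python
    def jdbc_partition_options(column, lower, upper, num_partitions):
        """Options for a parallel JDBC read, partitioned at the source.
        The column should be numeric (or a date) with known min/max bounds."""
        return {
            "partitionColumn": column,
            "lowerBound": str(lower),
            "upperBound": str(upper),
            "numPartitions": str(num_partitions),
        }

    def write_in_partitions(df, target_path, num_partitions=16):
        """Redistribute rows across partitions before writing so Spark
        emits multiple files instead of one large file."""
        (df.repartition(num_partitions)
           .write
           .mode("overwrite")
           .option("maxRecordsPerFile", 500_000)  # optional extra cap per file
           .parquet(target_path))
    ```

    The read-side options (`partitionColumn` etc.) are merged into the usual `spark.read.jdbc(...)` call; the write-side `repartition` works even when the read was single-partition.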

  • @ranajaymondal390
    @ranajaymondal390 3 years ago +2

    Nice explanations, and this series is really awesome. Please create more videos on Databricks solving some real-time ingest/export requirements using PySpark.

  • @mitusa1234566
    @mitusa1234566 8 months ago +1

    Great video with nice and clear instructions. Keep it up. Thanks.

  • @priyankapushp8271
    @priyankapushp8271 9 months ago +1

    Wow very nicely explained. Thanks a lot for your efforts.

  • @Rk-mv8sz
    @Rk-mv8sz 2 years ago +1

    It's really very helpful. Please make a video on end to end project with ADF and ADB. Thank you for giving wonderful videos.

  • @anonymous-254
    @anonymous-254 1 year ago +1

    Instead of ADLS... can we put that data in a Synapse dedicated SQL pool?

    • @rajasdataengineering7585
      @rajasdataengineering7585  1 year ago +1

      Yes, we can load into Azure Synapse and Azure SQL as well. Please watch video no. 87 on this channel.

    • @anonymous-254
      @anonymous-254 1 year ago

      @@rajasdataengineering7585 ok thanks
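
    For reference, a hedged sketch of the Synapse write mentioned in the reply, using the Databricks Synapse connector (`com.databricks.spark.sqldw` format), which stages data in ADLS before loading the dedicated SQL pool. The workspace, pool, and path names are placeholders, not values from the video.

    ```python
    def synapse_jdbc_url(workspace, database):
        """Hypothetical helper: typical JDBC URL shape for a dedicated SQL pool."""
        return (f"jdbc:sqlserver://{workspace}.sql.azuresynapse.net:1433;"
                f"database={database};encrypt=true;loginTimeout=30")

    def write_to_synapse(df, jdbc_url, staging_dir, table_name):
        """Write a DataFrame to a Synapse dedicated SQL pool via the connector.
        staging_dir is an ADLS location (abfss://...) the connector uses."""
        (df.write
           .format("com.databricks.spark.sqldw")
           .option("url", jdbc_url)
           .option("tempDir", staging_dir)
           .option("forwardSparkAzureStorageCredentials", "true")
           .option("dbTable", table_name)
           .mode("overwrite")
           .save())
    ```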

  • @manikiranthota2811
    @manikiranthota2811 2 years ago +2

    In the Azure SQL DB itself we can do the null handling, joins and duplicate deletion. Why are we using DataFrames? Is there any specific reason for that? Thanks in advance.

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago +3

      The requirement is that we need to move the data from Azure SQL to ADLS after performing these transformations.
      Could you please explain how you would do these transformations in Azure SQL itself as part of this requirement?

    • @manikiranthota2811
      @manikiranthota2811 2 years ago +1

      @@rajasdataengineering7585 Yes sir, I got it. I thought of using an ISNULL operation on product, deleting the duplicates in fact and joining the 2 tables in the Azure SQL DB itself, but that would take too much time. Thanks.

    • @shot_freeze
      @shot_freeze 2 years ago +1

      @@rajasdataengineering7585 Hi Raja, could you please share a few requirements like this for us to think through logically, so that we have a clear idea of what we will get in our projects?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago

      Sure, will do.

    • @shot_freeze
      @shot_freeze 2 years ago

      @@rajasdataengineering7585 You can just share it here, or post it as a comment and pin it.
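
    A hedged sketch of the transformations this thread discusses (null handling on the product dimension, duplicate removal on the fact table, then a join) expressed as PySpark DataFrame operations. The column names (`Color`, `ProductID`) are illustrative assumptions, not necessarily the columns used in the video.

    ```python
    def transform(product_df, fact_df):
        """Null handling, dedupe, and join, as DataFrame operations."""
        product_clean = product_df.fillna({"Color": "Unknown"})  # replace nulls
        fact_clean = fact_df.dropDuplicates()                    # drop exact duplicate rows
        # left join keeps every fact row even when no matching product exists
        return fact_clean.join(product_clean, on="ProductID", how="left")
    ```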

  • @dianadai4616
    @dianadai4616 7 months ago

    Do you have your code posted somewhere? It is very important for us to follow along.

  • @PriyaDarshiniNeverSayNever
    @PriyaDarshiniNeverSayNever 1 year ago

    Getting a "driver not found" error at that step. Please help, how do I solve this?

  • @what_worldgot
    @what_worldgot 11 months ago

    Where can I find the dataset for this table?

  • @kneelakanta8137
    @kneelakanta8137 2 years ago +1

    How did you get the tables in the Azure SQL database?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago

      We can create tables using a CREATE TABLE statement. Otherwise you can use the ready-made AdventureWorks database by choosing it in the additional settings while creating the database.

  • @navk4960
    @navk4960 1 year ago

    It would be really amazing if links to the topics you mention as references were added in the description. You are an amazing tutor.

  • @alluchandrasekhar2992
    @alluchandrasekhar2992 2 years ago +1

    What if multiple tables (more than 10) need to be copied from Azure SQL DB to the data lake?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago

      We can create multiple DataFrames reading multiple tables and load them into ADLS.

    • @alluchandrasekhar2992
      @alluchandrasekhar2992 2 years ago +1

      @@rajasdataengineering7585 Yes, but can it be parameterized? If yes, then how?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago +1

      Yes, parameters can be set up using widgets.

    • @alluchandrasekhar2992
      @alluchandrasekhar2992 2 years ago

      @@rajasdataengineering7585 Okay, and BTW your videos are very informative... keep making such great videos.
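
    A hedged sketch of the widget-driven parameterization suggested in this thread: a notebook widget holds a comma-separated table list, and a loop copies each table to ADLS. `dbutils` only exists inside Databricks, so it is passed in here; the default table names and path layout are assumptions for illustration.

    ```python
    def parse_table_list(raw):
        """Turn a comma-separated widget value into a clean list of table names."""
        return [t.strip() for t in raw.split(",") if t.strip()]

    def copy_tables(spark, dbutils, jdbc_url, props, base_path):
        """Copy every table named in the 'tables' widget to ADLS, one folder each."""
        dbutils.widgets.text("tables", "SalesLT.Product,SalesLT.SalesOrderDetail")
        for table in parse_table_list(dbutils.widgets.get("tables")):
            df = spark.read.jdbc(url=jdbc_url, table=table, properties=props)
            # one Parquet folder per table under the ADLS base path
            df.write.mode("overwrite").parquet(f"{base_path}/{table.replace('.', '_')}")
    ```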

  • @pawanukey2990
    @pawanukey2990 2 years ago

    Hi Raja, you have explained this in detail. Thanks for that, but can you please provide the dataset, to do the hands-on activity?

  • @RAHULGUPTA-wy2zb
    @RAHULGUPTA-wy2zb 1 year ago

    Hi Raja... would you please tell us why you used a left outer join and not an inner join?

    • @pavanaditya2309
      @pavanaditya2309 8 months ago

      Since this is only sample logic he is demonstrating, I think for demonstration purposes it doesn't matter which join he uses.

  • @arrooow9019
    @arrooow9019 2 years ago

    No words to express my feelings. What a great tutorial, sir. Thanks for this video 👍. Also, could you make a video on how to trim leading spaces and characters from strings while cleansing data?

  • @naren06938
    @naren06938 2 months ago

    Could you put all these commands in your GitHub repo and share the repo link to practice with?

  • @mujeebrehman1146
    @mujeebrehman1146 2 years ago +1

    Great tutorial, but I have a question. Can we connect an Oracle database to Databricks?
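
    For reference: yes, Databricks can read Oracle over plain JDBC, provided the Oracle JDBC driver JAR (ojdbc) is installed on the cluster. A hedged sketch follows; the host, service, and credential values are placeholders.

    ```python
    def oracle_jdbc_url(host, port, service):
        """Hypothetical helper: thin-driver URL for an Oracle service name."""
        return f"jdbc:oracle:thin:@//{host}:{port}/{service}"

    def read_oracle_table(spark, host, port, service, user, password, table):
        """Read an Oracle table into a DataFrame over JDBC."""
        return (spark.read.format("jdbc")
                .option("url", oracle_jdbc_url(host, port, service))
                .option("driver", "oracle.jdbc.OracleDriver")  # needs ojdbc JAR on the cluster
                .option("dbtable", table)
                .option("user", user)
                .option("password", password)
                .load())
    ```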

  • @VanakkamSQL
    @VanakkamSQL 11 months ago +1

    Great work!!!!!!!!!

  • @sriharichennupati5
    @sriharichennupati5 7 months ago +1

    Thanks for sharing this info

  • @alwalravi
    @alwalravi 1 year ago +1

    Great, thanks for sharing the video.

  • @mustafakamal5945
    @mustafakamal5945 6 months ago +1

    This is a very informative video. Do you also have a video on connecting to SQL DB via managed identity?

    • @rajasdataengineering7585
      @rajasdataengineering7585  6 months ago

      Thank you for your comment! No, I don't have a video on managed identity.

    • @mustafakamal5945
      @mustafakamal5945 6 months ago

      @@rajasdataengineering7585 I am struggling to find a resource for implementing it. Do let me know if you know of any resources for guidance on this topic apart from the MS Learn site.

  • @zubairmushtaq7912
    @zubairmushtaq7912 6 months ago +2

    Read from Azure SQL DB and write it back into Azure SQL DB. Please make a video on it.

  • @phanisrikrishna
    @phanisrikrishna 1 year ago

    Hi sir, it is a great, well-structured series with regular topics and interview questions. Can you also share the notebooks for reference and practice? Thanks a lot in advance.

  • @sravankumar1767
    @sravankumar1767 3 years ago

    Superb bro 👌 👏

  • @BhakthiYoutube
    @BhakthiYoutube 1 year ago +1

    Can you please provide some big data end-to-end projects involving all the components?

    • @rajasdataengineering7585
      @rajasdataengineering7585  1 year ago

      Hi, I have already created a video on an end-to-end project using multiple components.
      Please refer to video number 87:
      ua-cam.com/video/dxxXWe4gNTo/v-deo.html

  • @nagulmeerashaik5336
    @nagulmeerashaik5336 2 years ago +1

    When we learn Azure, are we also required to learn PySpark?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago +1

      Not really. Azure has many data engineering services like ADF, Synapse Analytics, Databricks etc. PySpark is mainly needed for Databricks developers and the Spark pool inside Synapse.
      If your project is on ADF, PySpark does not play any role.

    • @nagulmeerashaik5336
      @nagulmeerashaik5336 2 years ago

      Thanks.

  • @paarthiban8452
    @paarthiban8452 1 year ago +1

    A nice video. Can you create another video to automate this pipeline using Airflow?

  • @prathapganesh7021
    @prathapganesh7021 1 year ago +1

    Thank you for the content

  • @shalinikumari-qx9tn
    @shalinikumari-qx9tn 3 years ago +1

    I want the dataset that you used. How do I get it?

    • @rajasdataengineering7585
      @rajasdataengineering7585  3 years ago

      Hi Shalini, I have used the sample database AdventureWorks in this exercise. It is an open-source dataset.

    • @kartikeshsaurkar4353
      @kartikeshsaurkar4353 3 years ago +1

      While creating the Azure SQL DB you'll see the option to include sample tables/databases. After creation, a few tables will be present by default.

    • @shot_freeze
      @shot_freeze 2 years ago +1

      @@kartikeshsaurkar4353 Does that table have any data in it by default? Have you checked?

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago

      Yes Ajay, it has sample data as well to practice with.

    • @shot_freeze
      @shot_freeze 2 years ago

      @@rajasdataengineering7585 Thanks for your reply, Raja! Your videos are really helpful for preparing for the Azure Data Engineer role.

  • @srinivasanmadeshwaran1130
    @srinivasanmadeshwaran1130 2 years ago

    Hi sir, I am getting the below error when I try to connect via JDBC:
    java.sql.SQLException: No suitable driver
    Please help with this.

    • @ayushbhatt9469
      @ayushbhatt9469 7 months ago +1

      Add this option in the second step as well to resolve your issue:
      .option("driver", jdbcDriver)
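
    A hedged sketch of that fix in context: setting the driver class explicitly on the JDBC read resolves "java.sql.SQLException: No suitable driver". The server, database, and credential values here are placeholders, not the ones from the video.

    ```python
    # Driver class for SQL Server / Azure SQL (ships with the Databricks runtime)
    JDBC_DRIVER = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

    def read_azure_sql(spark, server, database, table, user, password):
        """Read an Azure SQL table with the driver class set explicitly."""
        url = (f"jdbc:sqlserver://{server}.database.windows.net:1433;"
               f"database={database};encrypt=true")
        return (spark.read.format("jdbc")
                .option("url", url)
                .option("driver", JDBC_DRIVER)  # the fix from the reply above
                .option("dbtable", table)
                .option("user", user)
                .option("password", password)
                .load())
    ```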

  • @susobhanghosh5093
    @susobhanghosh5093 1 year ago

    It would be very helpful if you could share the notebook in HTML format as well.

  • @AdexDurojaiye
    @AdexDurojaiye 1 year ago

    Do you have a class where you can train me? Thanks for your video.

  • @Umerkhange
    @Umerkhange 2 years ago

    How do I apply multiple rules in a single statement, like SUM("UnitPrice"), SUM("TotalLine"), AVG(Price) etc.?
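
    For reference, a hedged sketch of how several aggregate rules go into one statement with PySpark's `.agg()`. The column names are taken from the question above and may not match the video's schema.

    ```python
    def summarize(df):
        """Apply several aggregations in a single .agg() call."""
        from pyspark.sql import functions as F  # lazy import; requires pyspark
        return df.agg(
            F.sum("UnitPrice").alias("sum_unit_price"),
            F.sum("TotalLine").alias("sum_total_line"),
            F.avg("Price").alias("avg_price"),
        )
    ```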

  • @mohsintamboli4394
    @mohsintamboli4394 2 years ago +1

    Best example

  • @chinna4549
    @chinna4549 1 year ago +1

    Nice anna

  • @gunasekar_vs
    @gunasekar_vs 2 years ago

    Thanks for the great explanation 🙏🙏... I couldn't see the code clearly, so if you don't mind could you please share it? We can also try it once by following your videos 🙏🙏

  • @anil6328
    @anil6328 3 years ago +1

    Explained very well 👌
    Can someone help with how to set up JDBC without showing the password in the code?

    • @rajasdataengineering7585
      @rajasdataengineering7585  3 years ago +1

      Hi Anil, yes that is possible.
      I have already explained that concept in the Key Vault integration video. Please go through it once:
      ua-cam.com/video/c2EmTS_s5zw/v-deo.html
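
    A hedged sketch of the approach the reply points to: pull the credentials from a Databricks secret scope (which can be backed by Azure Key Vault) via `dbutils.secrets.get`, so the password never appears in the notebook. The scope and key names here are placeholder assumptions.

    ```python
    def jdbc_props_from_secrets(dbutils, scope="kv-scope"):
        """Build JDBC connection properties from a secret scope instead of
        hard-coded credentials. `dbutils` is only available inside Databricks."""
        return {
            "user": dbutils.secrets.get(scope=scope, key="sql-user"),
            "password": dbutils.secrets.get(scope=scope, key="sql-password"),
            "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
        }
    ```

    The returned dict plugs straight into `spark.read.jdbc(url, table, properties=...)`.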

  • @Learn2Share786
    @Learn2Share786 2 years ago

    @raja, nice explanation. Can you please share the notebook?

  • @nivasnivi
    @nivasnivi 2 years ago

    Can you help me with one small assignment, sir, please?

  • @sujitunim
    @sujitunim 2 years ago +1

    👍👍

  • @hozefakanchwala8720
    @hozefakanchwala8720 2 years ago +2

    It's a nice use case for batch processing, but you shouldn't call it real-time ETL.

    • @rajasdataengineering7585
      @rajasdataengineering7585  2 years ago

      "Real time" does not mean streaming data here. It means this is one of the real-world use cases for an ETL requirement.

  • @sravankumar1767
    @sravankumar1767 3 years ago

    How do we handle bad records in Azure Databricks?

    • @rajasdataengineering7585
      @rajasdataengineering7585  3 years ago +2

      Hi Sravan, I have already posted a video on how to handle bad records. You can refer to it:
      ua-cam.com/video/w_AWaXnaI94/v-deo.html

    • @sravankumar1767
      @sravankumar1767 3 years ago

      @@rajasdataengineering7585 thanks

  • @MRCyberstriker
    @MRCyberstriker 1 year ago +1

    No intermediate steps are explained; a beginner will find it difficult to follow! Please make it beginner-friendly. Take it as feedback.

    • @rajasdataengineering7585
      @rajasdataengineering7585  1 year ago

      The intermediate steps are already explained in videos 17 and 18. Please watch them as a prerequisite to this video:
      ua-cam.com/video/bZzh7kfBcx4/v-deo.html
      ua-cam.com/video/xxN88Ca4ues/v-deo.html

  • @nagulmeerashaik5336
    @nagulmeerashaik5336 2 years ago +1

    Most companies are asking for it, that's why.