35. Compare source data with target data using data flows in Azure Data Factory or Azure Synapse

  • Published 14 Oct 2024
  • In this video, I discussed comparing data between source and target using data flows in Azure Data Factory or Azure Synapse.
    Link for Azure Synapse Analytics Playlist:
    • 1. Introduction to Azu...
    Link for Azure Databricks Playlist:
    • 1. Introduction to Az...
    Link for Azure Functions Playlist:
    • 1. Introduction to Azu...
    Link for Azure Basics Playlist:
    • 1. What is Azure and C...
    Link for Azure Data Factory Playlist:
    • 1. Introduction to Azu...
    Link for Azure Data Factory Real-time Scenarios:
    • 1. Handle Error Rows i...
    Link for Azure Logic Apps Playlist:
    • 1. Introduction to Azu...
    #Azure #AzureDatafactory #DataFactory

COMMENTS • 43

  • @sambathnarayananparthasara9403 10 months ago

    I am using Synapse. As you mentioned, ADF is integrated into that. I am able to replicate your demos in Synapse Analytics. Thanks. Good job!

    • @JesI-lf4jy 9 months ago

      Does DQ stand for data quality in Synapse big data? What is BG DQ?

  • @ranjansrivastava9256 8 months ago +1

    Dear Maheer, a really very informative video. Could you please prepare a video on data reconciliation in ADF, the way you have created this one, from which we can send an email to the customer or stakeholder confirming we have checked the data quality before sharing the report? That would be really appreciated.

  • @sumanyarlagadda6271 2 years ago

    The shortcut to a clear-cut explanation is your knowledge sharing videos on ADF. Great to follow your channel or playlist. Thank you so much🙇 🎆

    • @WafaStudies 2 years ago +1

      Thank you for your kind words ☺️

  • @annukumari9629 2 years ago +2

    How did I miss this video? This is amazing 😊 Thanks

  • @akashsharma4769 2 years ago

    Hello sir. Firstly, I want to thank you for this amazing series on ADF; it's been helping a lot. Sir, will you be adding more real-world scenarios to this playlist, or is this the end of the series?

  • @sunilpandey7197 1 year ago

    Hi Maheer, thank you for the good-quality videos on Azure!
    Could you please help me with the following:
    1. I need to test two Azure SQL DBs -
    - Need to compare whether all the data rows are the same, and output any differing or missing rows in the target DB table
    2. All general ETL scenarios - schema, table counts, table data, etc.
    3. Transformation testing - if the table has any transformation, then test it on the target Azure DB table with respect to the source table
    The overall idea is to do proper ETL testing.
    Please let me know if that's possible. If yes, then please point me in the right direction with some links or references to go through.
    Regards,
    Sunil

  • @sonamkori8169 2 years ago +1

    Thank you so much for the wonderful explanation.

  • @yedukondalu9100 2 years ago

    Hi sir, I hope I get a job by watching your videos. They really look good and have a lot of good knowledge in them. I really thank you.

  • @brahmareddypotlapalli4060 2 years ago +1

    Hi, how can we get updated records as part of an incremental load?

  • @ayocs2 1 year ago

    Hello sir, is it appropriate to use Exists against two database sources (SQL Server and Postgres)?
    I want to insert only the non-existing rows from one database to the other.
    Also note that I have to do this on production data, and the sink is Postgres.

  • @srikanthreddy8779 2 years ago +1

    It is helpful for my scenario..

  • @NeumsFor9 1 year ago

    When I have a small column set like this, I like to concatenate a delimiter to minimize collisions as a safe practice. Been burned before when not using it.
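
    For reference, a minimal sketch of that tip as a derived column expression in the mapping data flow expression language (the column names col1, col2, col3 and the '|' delimiter are only illustrative; sha2, concat and toString are standard data flow expression functions):

        RowHash = sha2(256, concat(toString(col1), '|', toString(col2), '|', toString(col3)))

    With an explicit delimiter, value pairs like ("ab", "c") and ("a", "bc") no longer concatenate to the same string, so they produce different hashes.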

  • @ObservePeace 1 year ago

    Which ETL tool can I use to compare data between on-prem SSMS and ADLS Gen2?

  • @spranavshanker 2 years ago +1

    Can you please create this scenario using pipelines and activities, without using a mapping data flow? I think this will not be the best approach with large data, and it will be costlier.

  • @guddu11000 2 years ago +1

    How to exclude a column from the columns() function? Let's say I have 5 columns and want to use just 4, but I don't want to select individual columns; rather, I want to use the columns() function and exclude a column from it.

  • @srinivasarao416 2 years ago

    Hi brother, this is just like SCD2. Shall we use the same SCD2 logic here? Please let me know.

  • @durgalakshmi3213 1 year ago

    Hi, so when you add the sink at the end, will only the new column get added?

  • @pankajjagdale-t5n 3 months ago

    Great

  • @kiranpatil4968 9 months ago

    When can you start a new course on Azure Databricks with PySpark?

  • @ksnaveenkumar1 2 years ago

    Have you tried this approach with a large volume of data? I am curious to see the performance with large datasets.

    • @CoolGuy 2 years ago +1

      If the volume is large, use ADB.

  • @ravulapallivenkatagurnadha9605 2 years ago

    Continue these videos.

  • @katha169 2 years ago +1

    How to replace null records in a table with some other value using ADF?

    • @WafaStudies 2 years ago

      You can use data flows for this. Use the derived column transformation.
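
      For reference, a minimal sketch of such a derived column expression (the column names someColumn and cleanedColumn and the replacement value 'Unknown' are only illustrative; iifNull is a standard data flow expression function, and here it yields 'Unknown' when someColumn is null and someColumn otherwise):

          cleanedColumn = iifNull(someColumn, 'Unknown')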

  • @JesI-lf4jy 9 months ago

    Does DQ mean data quality in analytics? What is BG DQ?

  • @harithad1757 2 years ago +1

    Too good

  • @ayushtiwari2649 2 years ago

    Sir, please make a video on Incremental Data Loading Using ADF and Change Tracking for multiple tables.

    • @maheshreddy732 2 months ago

      Did you get any info on this for multiple tables?

  • @mankaransaggu15 2 years ago

    Hi sir, how can we connect with you? I need some help with a use case.

  • @hnii99 2 years ago

    Good work bro 👍

  • @NaveenKumar-kb2fm 1 year ago

    Can you please do the same using ADB and post it?

  • @sunilveerendravattikuti3848 2 years ago

    Hi Maheer,
    I have a query here. If I use the Exists transformation (doesn't exist rows) directly, without using any other transformation before it, I think we will get the same result, right? Please clarify. Thanks.

    • @WafaStudies 2 years ago +4

      In that case, for the condition inside Exists, you would need to add multiple conditions, one for each column to check. Instead, generating one unique column (a hash) based on all the columns and then using that single column inside the condition is the best way.

    • @sunilveerendravattikuti3848 2 years ago

      @WafaStudies Yeah, thanks Maheer.
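
    For reference, a minimal sketch of the hash-based comparison described in the reply above, in the mapping data flow expression language (the column name RowHash is only illustrative; sha2 and columns are standard data flow expression functions, and sha2(256, columns()) fingerprints every column of the row):

        Derived column on the source stream:  RowHash = sha2(256, columns())
        Derived column on the target stream:  RowHash = sha2(256, columns())
        Exists (doesn't exist) condition:     a single comparison of the two RowHash columns instead of one condition per column

    Rows whose source fingerprint has no match in the target are then routed on to the sink as new or changed data.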

  • @Lazy_tuber_Ankita 1 year ago

    Great