Azure Data Factory (ADF) How To Tip: Metadata-driven Copy Task = Flexible Copy Activity

  • Published 14 Oct 2024

COMMENTS • 24

  • @ASHOKKUMAR-vx9rh 2 years ago

    Really nice explanation, Murugan. Without creating any pipeline activities manually in Data Factory, the tool generates the scripts and activities automatically, which greatly reduces the time needed to finish the task of copying data from SQL Server to ADLS. Thank you so much for sharing this information.

    • @dataplatformcentral3036 2 years ago

      Thanks Ashok
      Glad that you liked it :)
      Feel free to share it around. Also feel free to subscribe and hit the bell icon to get notifications on new videos.
      A new video will be released each week on Azure / Power Platform topics.

  • @srikanthvarma5026 2 years ago

    Super explanation, Murugan. Thank you for the video.

    • @dataplatformcentral3036 2 years ago

      Thanks :)
      Feel free to like, share and subscribe.
      Make sure you press the bell icon to get notifications on the new videos every week.

  • @cedriclabuschagne5978 2 years ago

    Great video. Now to do this programmatically rather than through the user interface: generation rather than configuration, to be truly metadata driven, especially when one wants to do this for multiple sources and hundreds of tables.
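
    As a rough sketch of that generation idea: instead of registering tables by clicking through the copy data tool, the control table could be populated straight from the source's catalog. The dbo.CopyControl table and its columns below are illustrative assumptions, not the exact schema the tool generates.

    -- Hypothetical, simplified control table (names are assumptions for illustration).
    CREATE TABLE dbo.CopyControl (
        Id             INT IDENTITY(1,1) PRIMARY KEY,
        SourceType     NVARCHAR(50),   -- e.g. 'SqlServer', 'Oracle', 'Db2'
        SourceSchema   NVARCHAR(128),
        SourceTable    NVARCHAR(128),
        SinkFolderPath NVARCHAR(400),
        IsEnabled      BIT NOT NULL DEFAULT 1
    );

    -- "Generation rather than configuration": enumerate the source catalog
    -- and register every base table in a single statement.
    INSERT INTO dbo.CopyControl (SourceType, SourceSchema, SourceTable, SinkFolderPath)
    SELECT 'SqlServer',
           TABLE_SCHEMA,
           TABLE_NAME,
           CONCAT('raw/sqlserver/', TABLE_SCHEMA, '/', TABLE_NAME)
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE';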

  • @DilipDiwakarAricent 2 years ago

    Nice explanation, Murugan. Did you create any video on building a Databricks Lakehouse solution using ADF?

  • @prabhatratnala6589 1 year ago

    Wonderful. Quick question: how does version control work for the control table? Does it auto-integrate with Azure Repos? If yes, how?

  • @arindomjit 3 years ago

    Thanks for the in-depth video. It was very helpful. If I have different sources (SQL, Oracle, DB2), how would you recommend making entries into the control table for each of these different sources?

    • @dataplatformcentral3036 3 years ago +1

      If you have different sources you need multiple metadata-driven pipelines. Properties like the IR name, database type and file format type cannot be parameterized as of today, so you would need a separate parameterized pipeline targeting each source. They can all use the same control table though, and each will store its own metadata based on the tables included.
      This is documented here as well:
      docs.microsoft.com/en-us/azure/data-factory/copy-data-tool-metadata-driven
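
      To illustrate the shared control table: assuming a simple table like dbo.CopyControl(SourceType, SourceSchema, SourceTable, SinkFolderPath, IsEnabled), which is an illustrative shape rather than the exact schema the copy data tool generates, each source-specific parameterized pipeline can simply look up its own rows.

      -- Rows for different source systems sitting side by side in one control table.
      INSERT INTO dbo.CopyControl (SourceType, SourceSchema, SourceTable, SinkFolderPath) VALUES
          ('SqlServer', 'Sales', 'SalesOrderHeader', 'raw/sqlserver/Sales/SalesOrderHeader'),
          ('Oracle',    'HR',    'EMPLOYEES',        'raw/oracle/HR/EMPLOYEES'),
          ('Db2',       'FIN',   'GL_POSTINGS',      'raw/db2/FIN/GL_POSTINGS');

      -- The Lookup activity in, say, the Oracle pipeline fetches only its own rows:
      SELECT SourceSchema, SourceTable, SinkFolderPath
      FROM dbo.CopyControl
      WHERE SourceType = 'Oracle' AND IsEnabled = 1;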

  • @terryliu3635 2 years ago

    Great demo. Thanks for sharing! Quick question...do you know if this preview feature will be able to support SAP Table Connector, as we're trying to load the data from SAP ECC? Thanks.

    • @dataplatformcentral3036 2 years ago

      Sorry, not too sure on that. I've not worked with SAP yet!
      But since it is supported as a source for the Copy activity, I would expect it to work fine for the metadata-driven copy task too.

  • @vibhaskashyap8247 2 years ago

    Nice video

    • @dataplatformcentral3036 2 years ago

      Thanks
      Feel free to like, share and subscribe
      Click the bell icon to get notifications for new videos.

  • @snad256 3 years ago

    Hi,
    From where did you get the Sales table?

    • @dataplatformcentral3036 3 years ago

      Sales is a schema under which different tables exist within the test database. It's taken from the AdventureWorks sample database.
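
      For instance, the tables under that schema can be listed with a standard catalog query (the schema is named Sales in the full AdventureWorks sample; the lightweight AdventureWorksLT sample for Azure SQL uses SalesLT instead):

      -- List the tables that live under the Sales schema of the sample database.
      SELECT TABLE_SCHEMA, TABLE_NAME
      FROM INFORMATION_SCHEMA.TABLES
      WHERE TABLE_SCHEMA = 'Sales' AND TABLE_TYPE = 'BASE TABLE';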

  • @Deepaksharma-we4eo 2 years ago

    Hi sir, I need your help.
    Is there any other approach for tables that have foreign key relationships?
    I am getting a foreign key violation error message.

  • @ketanmehta3058 2 years ago

    How can we add a WHERE clause to the query?

    • @dataplatformcentral3036 2 years ago

      A WHERE clause for your data extraction query or for the control table query?

    • @ketanmehta3058 2 years ago

      @@dataplatformcentral3036 In the metadata-driven copy task we can select only the table name, but it does not give the option to apply filter criteria.
      As an example: select * from emp where dept = 'HR'
      I want to include that WHERE clause when copying the table. How can I achieve it?

    • @nigelnaicker7948 2 years ago

      @@ketanmehta3058 When you select the tables/views you want to copy, you are given the option to configure the individual tables/views or use the same config for all of them. If you click on configure individually, there is an advanced expander; when you click it, you will see the query in use, which is generally 'select * from table', and you can edit this query however you want. Hope this helps.
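
      In other words, for that one table you would edit the generated query under the advanced expander, roughly changing the first statement below into the second (the emp table and dept column are just the example names from the question above):

      -- Query the tool generates for the table by default (roughly):
      SELECT * FROM dbo.emp;

      -- Edited per-table query with the filter applied:
      SELECT * FROM dbo.emp WHERE dept = 'HR';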