21. Dynamic Column mapping in Copy Activity in Azure Data Factory

  • Published Jan 22, 2025

COMMENTS • 120

  • @vijaybodkhe8379
    @vijaybodkhe8379 1 year ago +1

    Thanks

  • @papachoudhary5482
    @papachoudhary5482 3 years ago +5

    May God grant you a long life...
    Sir, long live! Long live!

    • @WafaStudies
      @WafaStudies  3 years ago +1

      Thank you so much for such kind words 🙂☺️☺️

  • @Funwithkiddoss
    @Funwithkiddoss 3 years ago +4

    Good scenario and superb demonstration. Thanks for your videos

  • @vijaybodkhe8379
    @vijaybodkhe8379 1 year ago

    This is a really useful video. Thanks for sharing and explaining in easy-to-understand language.

  • @neha1075
    @neha1075 3 years ago +1

    How do I pass this dynamic mapping if my JSON is an array type?

  • @ajaikumars6948
    @ajaikumars6948 19 days ago

    I could have saved 3 weeks if I had seen this video earlier. Thanks bro

  • @ReelsVibe1
    @ReelsVibe1 1 year ago

    Sir, if we have already done the mapping manually to copy the JSON object, then why do we need to do it dynamically? Please reply, sir

  • @satyaraj21
    @satyaraj21 7 months ago

    I have a scenario of handling an empty string in the source that needs to populate a DateTime column in the target, using TabularTranslator logic.
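    For reference, TabularTranslator does expose type-conversion settings; a minimal sketch is below (the dateTimeFormat and culture values are illustrative). Note this alone does not treat an empty string as NULL; that usually comes from the source dataset's nullValue setting, assuming the sink DateTime column is nullable:

        {
          "type": "TabularTranslator",
          "typeConversion": true,
          "typeConversionSettings": {
            "allowDataTruncation": false,
            "treatBooleanAsNumber": false,
            "dateTimeFormat": "yyyy-MM-dd HH:mm:ss",
            "culture": "en-US"
          }
        }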

  • @acharjyadebasis
    @acharjyadebasis 7 months ago

    If the Excel column name keeps changing (like Emp_name/emp_name/EMP_NAME), how do we handle that scenario?

  • @yashaswinividwanmani2143
    @yashaswinividwanmani2143 2 years ago

    Data read and data written are not the same, i.e. my source contains details of 20 employees, but when loaded into the SQL DB only the first employee's details are loaded... could you please help me resolve this issue?

  • @bagamanocnon
    @bagamanocnon 2 years ago

    Hi. How do I change the data type of a certain column in the sink/destination dataset? Say I want to change Emp_Id from int to nvarchar; I can't see any button to change the data type. Thanks.

  • @NeumsFor9
    @NeumsFor9 3 months ago

    Is there a video on doing the same with JSON sources: dynamically passing the collection reference, the dynamic mapping itself, etc.? I wasn't sure what the syntax is for it, or whether it can handle flattening nested or complex types on the fly into a CSV. With CSV sources we ignore headers if they are present and pass "Prop_N" as the source column headers, along with the number of rows to skip. I wasn't sure what the equivalent of "Prop_N" is for JSON... perhaps the .[n] syntax?
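    For reference, the documented shape for a hierarchical JSON source is a collectionReference on the translator plus path-based source entries: absolute paths ($[...]) address fields above the collection, and relative paths ([...]) address fields inside each array item. The field and column names here are hypothetical:

        {
          "type": "TabularTranslator",
          "mappings": [
            { "source": { "path": "$['orderNumber']" }, "sink": { "name": "OrderNumber" } },
            { "source": { "path": "['productName']" }, "sink": { "name": "Product" } },
            { "source": { "path": "['price']" }, "sink": { "name": "Price" } }
          ],
          "collectionReference": "$['orderLines']"
        }

    As far as I know there is no positional "Prop_N" equivalent for JSON; fields are addressed by name/path rather than by index.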

  • @himanshutrivedi4956
    @himanshutrivedi4956 3 years ago +1

    Wonderful... you rock as always, my Azure rockstar!

  • @sanketchakane7745
    @sanketchakane7745 1 year ago

    @WafaStudies I am still confused about whether this is truly dynamic, because we are storing the data types in a physical table and reading them as input during the load. If the input data types change, table_mapping still has the old values; how do we update that table before the load starts?

  • @MarthaLigiaEstebanRamos
    @MarthaLigiaEstebanRamos 1 year ago

    Hi, can you guide me please? I'm loading a CSV file, but when the data lands in SQL it is in a different order; I need the data in the same order as the source. What can I do about this?

  • @deepureddy6567
    @deepureddy6567 1 year ago +1

    Great video sir, these seem to be real-time scenarios that are faced in projects

  • @karnatimanikantareddy9969
    @karnatimanikantareddy9969 2 years ago

    Anna, I am able to map only up to 19 columns; if I add one more column, I get an error like "Failed to run pipeline1". Can you please tell me if there is any limitation on the number of columns? Please suggest, anna

  • @gpriyanka811
    @gpriyanka811 3 years ago

    How do I concatenate two column values from the source into one column value in the destination, in the Copy activity under mapping?

  • @jebakanifamilyjyojay33
    @jebakanifamilyjyojay33 2 years ago

    Hi, how can we manipulate/transform the source column in the Copy data activity's additional columns mapping?

  • @generaltalksoflife
    @generaltalksoflife 3 years ago +4

    Hi Maheer, if you recapped at the end of the video all the activities you used in the pipeline, that would be very nice. Thanks. Keep it up. Best of luck.

  • @priyankadp5777
    @priyankadp5777 9 months ago

    Hi Maheer, another wonderful video.
    I have a scenario: read from an on-prem DB and write to an Azure SQL DB (upsert operation). I am able to do this using the Copy activity. I have 2 extra columns only in the sink (create date and update date). Create date should be set to the current date when inserting a new record, and update date should be set only when updating an existing record (not when inserting). How do I achieve this with an upsert Copy activity? I don't find any way to distinguish between new and existing records in Copy

  • @suktvm
    @suktvm 9 months ago

    Will it fail if the 2 tables contain similar column names?

  • @RahulKumar-jg5ly
    @RahulKumar-jg5ly 3 years ago +1

    Great explanation and presentation 👍👍👍... it is really helpful.

  • @sujitunim
    @sujitunim 1 year ago

    Hi Maheer, I have a use case where I get data from a REST API, but the order of the columns is not the same every time, and I need to write the response to a CSV file with the columns in a specific order. Is there any way to handle this in the ADF Copy activity?

  • @mayurkrish75
    @mayurkrish75 1 year ago

    Kudos to this tutorial! Very well explained.

  • @vijaymulimath6519
    @vijaymulimath6519 1 year ago +1

    I have one doubt: you had already hardcoded the mapping, and that is how you got the JSON. So my question is, is this really dynamic mapping?

  • @shanavajshanu7196
    @shanavajshanu7196 2 years ago

    Hi sir, is it possible to add an if condition on the source column in the JSON mapping?

  • @roshankumargupta46
    @roshankumargupta46 3 years ago +1

    Great, Maheer!
    It would be great if you could make a video on how to perform unit testing on the Copy activity, to check whether rows and columns were copied correctly

  • @venkatrajak6277
    @venkatrajak6277 3 years ago +1

    Hi sir, thank you so much for this video.
    I have one doubt: we are already mapping manually, so it will be fixed across multiple runs, right? Then what is the use of passing this dynamically, since we have to do the mapping and capture the JSON anyway the first time? Please help me with this.

    • @vickym3193
      @vickym3193 3 years ago +1

      One reason I can think of: if in the future you get more columns, you can directly add more items to the JSON instead of modifying your ADF pipeline.
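      To make that concrete: if the mapping JSON lives in a control table and is fetched with a Lookup, the Copy activity's translator property can be set from dynamic content, so a new column only needs a new entry in the table. A minimal sketch, with hypothetical activity and column names:

          "translator": {
            "value": "@json(activity('LookupMapping').output.firstRow.ColumnMapping)",
            "type": "Expression"
          }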

  • @geoffchaddock
    @geoffchaddock 1 year ago

    Great tutorial, proved very helpful in my real-time implementation

  • @Sandeep.Gupta27
    @Sandeep.Gupta27 3 years ago +2

    Thanks for the video, it's very informative.
    I have one doubt: suppose we execute the pipeline once, and for the 2nd execution the source file has some additional or missing columns. Will this give an error?
    Thanks

    • @ajaykhedkar5940
      @ajaykhedkar5940 3 years ago +1

      Yes, I have the same doubt.
      If my files come daily and one of the files has some additional columns, how can we cope with that?
      Secondly, if I want to drop those extra columns, can I do that? If yes, then how? @maheer

    • @SairamV17
      @SairamV17 2 years ago

      Did you get any answer for this?

    • @SairamV17
      @SairamV17 2 years ago

      #Wafastudios, please answer this.

  • @hindi-english1664
    @hindi-english1664 2 years ago

    Does this work for delta loading?

  • @gurumoorthy5321
    @gurumoorthy5321 5 months ago +1

    Hi Maheer,
    Thank you... the solution is very interesting.
    I have 2 questions:
    1. Is it possible to run the ForEach in parallel (can we uncheck "Sequential")?
    2. Is this JSON the only possible way to map the columns dynamically, or is there any alternate logic?

  • @santhidhanuskodi8668
    @santhidhanuskodi8668 3 years ago

    Do you have the JSON code for the overall pipeline and the DB changes, so we can download and refer to it?

  • @rodrigoalejandronunezcabre3150
    @rodrigoalejandronunezcabre3150 3 years ago +3

    Hello master!! Thanks for the video.
    Is it possible to perform a dynamic mapping from a web source with the destination in a data lake?
    Thank you!

    • @parthgovekar8693
      @parthgovekar8693 10 months ago

      Hi, did you find any solution for dynamic mapping from a web source?

  • @Benwooduk85
    @Benwooduk85 2 years ago

    Amazing tutorial, very well explained, and it works a treat for me using CRM as a source. My only issue is that I have to match source and sink field names exactly, including case. Very odd, as you clearly don't have that issue in your demonstration.

  • @vasavig4612
    @vasavig4612 3 years ago +1

    Hi, thanks for this video. Can we read Excel sheets dynamically in Azure Data Factory?

    • @WafaStudies
      @WafaStudies  3 years ago

      You need to maintain the file and sheet names somewhere as configuration, and then look up that data to dynamically point to the sheet names
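      A minimal sketch of that idea: parameterize the Excel dataset's sheetName and feed it from the looked-up configuration. All names here are hypothetical:

          {
            "name": "ExcelDynamic",
            "properties": {
              "type": "Excel",
              "linkedServiceName": { "referenceName": "BlobStorageLS", "type": "LinkedServiceReference" },
              "parameters": { "pSheetName": { "type": "string" } },
              "typeProperties": {
                "sheetName": { "value": "@dataset().pSheetName", "type": "Expression" },
                "firstRowAsHeader": true,
                "location": { "type": "AzureBlobStorageLocation", "container": "input", "fileName": "employees.xlsx" }
              }
            }
          }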

    • @deepjyotimitra1340
      @deepjyotimitra1340 3 years ago

      Hi,
      I have a requirement to load all Excel files, whether CSV or XLS, into a SQL DB. The file names are not known in advance because they change every time; only the file extension is fixed. Can we read files from blob using *.xls or *.csv?

    • @deepjyotimitra1340
      @deepjyotimitra1340 3 years ago

      For my requirement, I think the answer will be the Get Metadata activity.

  • @0shaan0
    @0shaan0 3 years ago

    Hello Maheer,
    All your videos are simply superb, and anyone can learn Azure very easily from them. I would like to request that you please create some videos on Azure architecture.

  • @sureshthippani5163
    @sureshthippani5163 3 years ago +2

    Hi, this tutorial is very good. Can you please prepare a video on creating dynamic datasets for different environments, and also create some videos on Cosmos DB (NoSQL)? Once again, thanks sir.

  • @annukumari9629
    @annukumari9629 3 years ago +1

    Wow! Very nicely presented. Thank you so much for your constant effort. May you receive many more accomplishments and blessings; you deserve it all. Keep going. :)

  • @vijaysagar5984
    @vijaysagar5984 2 years ago

    Hi bro,
    Is there any workaround for CSV files that have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some have multiple headers.

    • @manasam7777
      @manasam7777 1 year ago

      Hi @vijaysagar5984, any solution to this question?

  • @priyankapatimidi2392
    @priyankapatimidi2392 3 years ago +1

    Great content, very helpful in real-time implementation

  • @nivedabaskar316
    @nivedabaskar316 2 years ago +1

    Thank you very much sir, very useful content. 👍

  • @nitagawade3330
    @nitagawade3330 2 years ago

    Outstanding Maheer!!!

  • @udaychodagiri252
    @udaychodagiri252 10 months ago

    Nice video. Please keep doing this kind of stuff.

  • @prasangisreenwas9088
    @prasangisreenwas9088 1 year ago

    Could you do this same task, dynamically setting column names, using mapping data flows?

  • @battulasuresh9306
    @battulasuresh9306 3 years ago +1

    Masterpiece.
    Please add more videos to this playlist

    • @WafaStudies
      @WafaStudies  3 years ago

      Thank you 🙂

    • @battulasuresh9306
      @battulasuresh9306 3 years ago

      @@WafaStudies Anna, could you tell me the prerequisites for learning ADB?
      Currently working at Deloitte on ADF

    • @HanumanSagar
      @HanumanSagar 3 years ago

      @@battulasuresh9306 Hi bro, which project are you working on at Deloitte?

  • @sudheermattapally2791
    @sudheermattapally2791 3 years ago +1

    Awesome... very useful article.

  • @ragharaj3367
    @ragharaj3367 2 years ago +1

    This is a great video! Thanks for that :)
    I have a scenario where I have to map one source column to multiple sink columns. Could you please let me know if there is a way to do this?
    I would really appreciate your help on this.

    • @nagasandeepkumarkapa1561
      @nagasandeepkumarkapa1561 2 years ago

      Hi, can I know if you have found any resolution for this one?

    • @ragharaj3367
      @ragharaj3367 2 years ago +1

      @@nagasandeepkumarkapa1561 Microsoft says there is no way to do this, but we had to make tweaks to the JSON to get it working

    • @nagasandeepkumarkapa1561
      @nagasandeepkumarkapa1561 2 years ago

      Thanks for that. Any chance you can share what you did? Please let me know the idea, if it is done.

  • @MaheshReddyPeddaggari
    @MaheshReddyPeddaggari 3 years ago +1

    Great content
    Thanks Maheer

  • @anjireddy5931
    @anjireddy5931 3 years ago +1

    Maheer, if you don't mind, please tell me how to create an Azure free account. I created an account, but it told me I'm not eligible for an Azure free account

    • @WafaStudies
      @WafaStudies  3 years ago

      youtube.com/watch?v=qzHrz7Q5474

  • @nuwanmenuka
    @nuwanmenuka 1 year ago

    Amazing explanation

  • @shekareddy5146
    @shekareddy5146 3 years ago +1

    I was looking for the same 👍

  • @AnandKumar-dc2bf
    @AnandKumar-dc2bf 3 years ago +1

    Excellent video...

  • @rk-ej9ep
    @rk-ej9ep 2 years ago +1

    Very nice..👍

  • @dibassimohamed5514
    @dibassimohamed5514 3 years ago +1

    It's magical. Thanks a lot

  • @shubhampawade2933
    @shubhampawade2933 2 years ago +1

    Instead of a SQL table, can we store the mappings in a file? If yes, how do we query the data from that file? Any response is appreciated! BTW, thanks for the amazing video series!

    • @vasdecabeza2
      @vasdecabeza2 2 years ago

      Yes, it can be a file stored in Blob Storage; use the Metadata activity to retrieve that.

    • @shubhampawade2933
      @shubhampawade2933 2 years ago

      @@vasdecabeza2 But the Get Metadata activity gives us information about the file, right? Not the content of the file?
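      That is correct as far as I know: Get Metadata returns file properties (size, lastModified, structure, etc.), not contents. Reading a mapping from a file is usually done with a Lookup activity over a JSON or delimited-text dataset, which does return the file's content (within the Lookup limits of 5,000 rows / 4 MB). If the file holds the whole translator object, a hypothetical expression like this can feed the Copy activity's mapping:

          "translator": {
            "value": "@activity('LookupMappingFile').output.firstRow",
            "type": "Expression"
          }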

  • @avirozenboim6446
    @avirozenboim6446 3 years ago +4

    Thanks for this, it was so helpful... If I may: instead of manually creating the JSON, I used the STRING_AGG function to generate it automatically from a simple mapping table that maps source column to target column for each entity
    SELECT EntityId,
           '{ "type": "TabularTranslator", "mappings": [ '
           + STRING_AGG(
               '{"source":{"name":"' + SourceColumn + '"},"sink":{"name":"' + TargetColumn + '"}}',
               ','
             ) + ' ] }' AS ColumnMapping
    FROM dwh_control.entity_column_mapping em
    GROUP BY em.EntityId
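    Each output row is then a ready-to-use mapping; with hypothetical column names it looks like this:

        {
          "type": "TabularTranslator",
          "mappings": [
            { "source": { "name": "Emp_Id" }, "sink": { "name": "EmployeeId" } },
            { "source": { "name": "Emp_Name" }, "sink": { "name": "EmployeeName" } }
          ]
        }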

    • @WafaStudies
      @WafaStudies  3 years ago +2

      Great, nice approach

    • @SQL4ALL
      @SQL4ALL 2 years ago

      @@WafaStudies Great video. Thank you

    • @SQL4ALL
      @SQL4ALL 2 years ago

      Thanks Avi, this is really helpful. You might need to add a leading { in the query... tested, works for me

    • @anshuldubey9395
      @anshuldubey9395 1 year ago

      Great finding, Avi

  • @skselva403
    @skselva403 3 years ago +1

    Super brother 👌👌👌

  • @potrunaresh2241
    @potrunaresh2241 3 years ago

    Requirement: create a POC for dynamic pipelines with a config table and stored proc. The attached datasets are to be loaded into SQL tables, and the pipelines should load the data dynamically from the config table. Please tell me the better approach.
    Thanks
    I have 8 Excel files

  • @anuragkhare3821
    @anuragkhare3821 3 years ago

    It's a very helpful video for everyone; thanks for sharing. I have a similar kind of requirement, but my data is in JSON format.
    I have JSON data in ADLS Gen2 and need to load it into an Azure SQL table. I have a collection reference, under which I need to take all the data points.
    Challenge: the data points (number of columns) may vary over time. How can we handle that dynamically?
    Example: under the collection reference we have 90 columns; after a year it may be 95 or 100.

    • @sanketchakane7745
      @sanketchakane7745 1 year ago

      In data flows there are options for this: schema drift and infer drifted column types

  • @GrowthMindsetGlobal
    @GrowthMindsetGlobal 3 years ago +1

    Really useful

  • @jolyjuju
    @jolyjuju 1 year ago

    It's really a very useful video for many, no doubt. Just one more thing: for example, if the source file has a DOB or a Birthdate column that needs to sink to the BirthDate column, giving multiple source tags does not work. Does anyone have an idea how to achieve this?

  • @parthpatwardhan3450
    @parthpatwardhan3450 1 year ago

    Good work, but most of the things were already created beforehand; how should we get them, yaar?

  • @sonamkori8169
    @sonamkori8169 2 years ago +1

    Thank You Sir

  • @KostikV
    @KostikV 1 year ago

    Many thanks!

  • @ayushibansal7947
    @ayushibansal7947 4 months ago

    Awesome.

  • @snvp786
    @snvp786 2 years ago +1

    Nice

  • @nightfury5967
    @nightfury5967 2 years ago +1

    Thanks. :)

  • @dbasalo
    @dbasalo 1 year ago

    Excellent video, you got a like from me, and I would add a subscription if I wasn't subscribed already!

  • @cloudfitness
    @cloudfitness 3 years ago

    👍🏻

  • @sandakelumhalkewela7854
    @sandakelumhalkewela7854 3 months ago

    Thank you very much sir, very clearly explained ❤

  • @srinubathina7191
    @srinubathina7191 1 year ago

    Thank You Sir