Azure Data Factory - Iterate over a data collection using Lookup and ForEach Activities

  • Published Jan 31, 2025

COMMENTS • 59

  • @isuruwickramasinghe9119
    @isuruwickramasinghe9119 3 years ago

    One of the best lectures in the industry!

  • @pbidax2453
    @pbidax2453 4 years ago +1

    Great explanation and great command of the subject. I've never seen any trainer explain with this much clarity and command... :) All the best.

  • @sarthaks
    @sarthaks 6 years ago +1

    Thanks a lot for this video. It really helped me in understanding some of the key concepts around ADF V2. Eagerly waiting for more advanced videos around ADF V2.

  • @mukundakkoor4799
    @mukundakkoor4799 4 years ago

    Good explanation with real life examples. Thank you.

  • @heyitsjillio
    @heyitsjillio 6 years ago +3

    This is great, thank you for creating this content and sharing!

  • @vanidendukuri303
    @vanidendukuri303 6 years ago

    Excellent demo with good examples for better understanding.

  • @gvsivaprasad851
    @gvsivaprasad851 5 years ago

    Hi Dinesh, I am new to ADF. This helped me a lot in learning what's available. It would be very helpful if you made videos about all the activities. Thank you, GV

  • @vedanthasm2659
    @vedanthasm2659 5 years ago

    Excellent Video Dinesh.

  • @gayathrirokkam4652
    @gayathrirokkam4652 6 years ago +1

    I am a big follower of yours; it's an excellent job you are doing… waiting for the next video.

  • @LouRainaldi
    @LouRainaldi 5 years ago

    Great videos. Very informative and well presented! Wish I could have found it a month ago, but now I might refactor based on what I am learning from what you have presented. :) Thanks!

  • @ravichandrajl5366
    @ravichandrajl5366 6 years ago +4

    Hi Dinesh, your videos are very good. One small request: if possible, could you please upload videos on the Custom activity, Web activity, Azure Functions, and HDInsight? That would be helpful.

  • @PRABHATKUMAR-sc4qo
    @PRABHATKUMAR-sc4qo 5 years ago

    Thank you sir ... Please continue making this kind of demonstrative video. God bless you :)

  • @raviv5109
    @raviv5109 4 years ago

    Detailed, good work!! Keep them coming!

  • @chamilam
    @chamilam 4 years ago

    Thanks Dinesh, this is very insightful

  • @rohitchawla6645
    @rohitchawla6645 5 years ago

    This tutorial is really great... Thanks bro

  • @kanchivishnuvardhan1231
    @kanchivishnuvardhan1231 5 years ago

    Great commitment, really appreciable. More videos on ADF V2 please.

  • @NeerajKumar-yh5lf
    @NeerajKumar-yh5lf 5 years ago

    Thanks for making this great video.. really helpful.

  • @anjanikumar4302
    @anjanikumar4302 5 years ago +1

    This is great, thank you very much; it is very helpful.

  • @sudhakarwadekar9251
    @sudhakarwadekar9251 5 years ago +1

    Very useful. Thanks a lot.

  • @guptaashok121
    @guptaashok121 5 years ago

    Dinesh, I am doing a similar ForEach loop to load multiple SQL table datasets from source to destination. We are also logging the table load start and completion just before and after the Copy activity. We want to use the rowsCopied and rowsRead properties of the copy to populate our log table. We also want to load the tables in parallel, as they have no dependencies. I suspect that due to parallel processing the rowsCopied/rowsRead data may conflict among different tables, as we refer to "@activity('ActivityName').output.rowsRead" — how would it distinguish this value across different loads?
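As a sketch of the logging pattern described above (the activity, procedure, and column names are hypothetical, and the parameter types are an assumption): inside a ForEach, an @activity() reference resolves against the activity run of the current iteration, so each table's row counts land with that table's log row.

```json
{
  "name": "LogRowCounts",
  "type": "SqlServerStoredProcedure",
  "dependsOn": [
    { "activity": "CopyTable", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "storedProcedureName": "dbo.usp_LogTableLoad",
    "storedProcedureParameters": {
      "TableName":  { "value": "@{item().TableName}", "type": "String" },
      "RowsRead":   { "value": "@activity('CopyTable').output.rowsRead", "type": "Int64" },
      "RowsCopied": { "value": "@activity('CopyTable').output.rowsCopied", "type": "Int64" }
    }
  }
}
```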

  • @manniedigicast
    @manniedigicast 2 years ago

    Hi Dinesh. How can I copy the output of a lookup activity to blob storage or ADLS?

  • @ahish1111
    @ahish1111 4 years ago

    @Dinesh Priyankara it's a great and informative video. I have a scenario where I have to pass dynamic schema names into a stored procedure, run that procedure in a loop, and combine the results in a SQL table.

  • @swetadani4060
    @swetadani4060 5 years ago

    Hello Dinesh, thanks a lot for this video. I have a scenario where I have an ADB notebook in the ForEach loop whose output I want to copy into an ADB folder. Can you please suggest how? Using the Copy activity I am not able to.

  • @mikem8915
    @mikem8915 5 years ago

    Outstanding.

  • @PATRICKCHUAD
    @PATRICKCHUAD 5 years ago

    Great video! Thank you for explaining ADF. Can you please create a video on updating a Dynamics entity from on-prem SQL based on one field, say model number? I don't know how to update the GUID if the record already exists. I want to insert new records but update only existing ones.

  • @sanishthomas2858
    @sanishthomas2858 5 years ago

    Nice.. Share more such videos on ADF. Also, what is the best source or guide for installing and working with Data Factory on our system?

  • @amitjaju9060
    @amitjaju9060 4 years ago

    Hello Sir,
    I have a scenario where, in a particular folder, we have 2 files with different formats and 2 external tables created to read those files. Now I get multiple files with the same format and file name, except the timestamp is different for each. How can I read all of those same-format files with the same naming convention (differing only in timestamp) using an external table?
    I don't want to create a new external table or use the subfolder option.
    Can we read all the same-format files with a wildcard?
    Please help me here.

  • @sudarshant2340
    @sudarshant2340 2 years ago

    Data in Excel file, 2 columns:
    Country-name, Flag
    India, 1
    Netherlands, 1
    Romania, 0
    I have an Excel file with the data above stored in Azure Data Lake Gen2.
    Requirement: I want only flag 1 data from the Excel file using Azure Data Factory, without using SQL Server or a data flow.
    I'm trying to implement the requirement using the Lookup, Set Variable, and ForEach activities in ADF, but I'm unable to find a solution.
    Can you please give me your suggestions or ideas on how to implement the pipeline in ADF?

  • @amadoryranon7741
    @amadoryranon7741 5 years ago

    Can we use a stored procedure instead of the Lookup? I thought the Lookup has the same functionality as the Lookup in SSIS.
    Can you use Table Storage as a destination or source, or blob Excel to Table Storage?

  • @rorymcmanus3017
    @rorymcmanus3017 6 years ago

    Hi Dinesh, great video! :) Can you take a column from the source of the Copy activity and use it in the stored proc?

  • @naveenshindhe2893
    @naveenshindhe2893 3 years ago

    Hi, can we execute an insert SQL statement like the one below in the Lookup activity? Please let me know.
    insert into Table1 (Column1) Select ('test1');
    select x;

  • @rahulgoud3092
    @rahulgoud3092 5 years ago

    Hi, thank you for posting such nice videos!!!
    I am trying to implement Azure Data Factory, and if I have 3 different environments, I am using Azure import/export ARM templates, but then I see we need to supply a different parameter file for the 3 environments manually... I'd like to know from you if there is a way to automate this process of passing the parameter.json file???
    Thank you in advance, and I will be looking forward to your reply.

  • @gowthamtamilvendan4522
    @gowthamtamilvendan4522 6 years ago +1

    Awesome Dinesh, thank you so much!! What will the next one be?

    • @DineshPriyankara
      @DineshPriyankara  6 years ago

      Thanks Gowtham, a few have been planned, most probably on one of the transformation activities.

  • @Randyminder
    @Randyminder 6 years ago

    Thank you, very helpful and instructive. One minor suggestion. The word "iterate" is not pronounced "eye terate". It's pronounced "it erate".

    • @DineshPriyankara
      @DineshPriyankara  6 years ago

      Thanks for the suggestion Randy, I will keep it in mind.

  • @chrishowes755
    @chrishowes755 5 years ago +1

    Fantastic - thank you. PS: You have very strange compression settings on your voice - the attack is far too slow - it sounds like it's at about 500 ms when it should be more like 50 ms.

    • @DineshPriyankara
      @DineshPriyankara  5 years ago

      Thanks for the comment Chris, I am trying to figure out a way of improving the sound quality.

  • @shashank_1180
    @shashank_1180 4 years ago

    I'm looking for incremental load using ADF. TIA

  • @sunderrajan6172
    @sunderrajan6172 5 years ago +1

    Great work!
    What language is used in ADF, and where can I get all the functions/statements that are allowed? For example @{items()} etc.

    • @DineshPriyankara
      @DineshPriyankara  5 years ago

      Hi Sunder,
      Thanks for the comment. I have not come across a single page that has all the functions, but you can see some on this page:
      docs.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions#functions
      I will surely add a comment on this if I find a complete list.
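For illustration, a few expressions in the ADF expression language referenced above (the names MyLookup, RunDate, and TableName are hypothetical):

```
@concat('file_', pipeline().parameters.RunDate, '.csv')
@item().TableName
@activity('MyLookup').output.value
@{string(utcnow())}
```

The @{...} form interpolates an expression into a string, while a value starting with a bare @ is evaluated as a whole expression.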

  • @smaranda235
    @smaranda235 6 years ago

    very useful! thank you!

  • @philipcoppage3592
    @philipcoppage3592 5 years ago

    Super helpful

  • @maheshwarkonda1836
    @maheshwarkonda1836 5 years ago

    Good video... We need assistance with the below requirement on our migration project.
    Requirement:
    We are working on delta data to be copied from online SQL Server to CRM. We have created the pipelines using the Copy activity and are able to push the data from the source (online SQL Server) to the destination (Dynamics CRM).
    1. After copying completes successfully, we execute a stored procedure to update the successful records. How can we pass only the successful records to the stored procedure as input? My stored procedure only has an update statement to set a flag on the DB side.
    2. After copying completes, if some of the records fail, how can we skip those records and pass the failed records to a Web activity in order to get those details in a list via email?

  • @swassab217
    @swassab217 4 years ago

    Hi, how do we handle the Lookup activity with more than 5000 rows?

  • @ankurgarg5474
    @ankurgarg5474 6 years ago

    Looking for a quick video on incremental copy of folders from a file system to Azure Data Lake using a self-hosted integration runtime. Can we do this without code, incrementally copying the folders every 1 hour from the file system? Please explain the logic if not in a video.

  • @ravikumar-hn5wt
    @ravikumar-hn5wt 6 years ago

    Hi, this is regarding a technical issue in ADF V2. What I am trying to do via ADF V2 is load/update dimensions via one pipeline. The way I am doing it is by taking all records (dim name, the SP which will load the dim) from a table using a Lookup task and iterating over them using a ForEach activity. Inside the ForEach activity I am calling the SP activity, and as the SP name I am passing the value @{item().updateSp}. updateSp is a column in the table which holds the SP name. While running this pipeline, it fails in the SP activity. So my question is: in V2, is it possible to run the SP activity in a loop? If yes, can you please suggest how to do it?

    • @DineshPriyankara
      @DineshPriyankara  6 years ago

      Hi Ravi,
      Sorry for not replying immediately. Of course you can call SPs in a ForEach activity. There is no special action to be taken; just like the way you execute SPs on the main canvas, you call SPs within ForEach. And if you need to pass a value to the SP that is based on values related to the iteration, they can be referred to as @item().ColumnName.
      What is the error you get? If you have not fixed it yet, let us know; we will try to find the reason for your error.
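A minimal sketch of the pattern discussed in this thread (the lookup name, linked service, and updateSp column come from the question; the other names are hypothetical): a ForEach iterating over a Lookup's output, with a Stored Procedure activity whose procedure name is taken from the current item.

```json
{
  "name": "ForEachDimension",
  "type": "ForEach",
  "typeProperties": {
    "items": { "value": "@activity('LookupDims').output.value", "type": "Expression" },
    "activities": [
      {
        "name": "RunUpdateSp",
        "type": "SqlServerStoredProcedure",
        "linkedServiceName": { "referenceName": "AzureSqlDb", "type": "LinkedServiceReference" },
        "typeProperties": {
          "storedProcedureName": { "value": "@item().updateSp", "type": "Expression" }
        }
      }
    ]
  }
}
```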

    • @ravikumar-hn5wt
      @ravikumar-hn5wt 6 years ago

      Hi Dinesh, it's okay, and thanks for posting the videos on ADF V2. Unfortunately, I have not been able to solve the error by myself. When I run the pipeline, the inner activity is failing. The expression that I have written for the SP name in the SP activity is the same as you explained in the video, i.e. using @item()columnName. What I have done is create multiple SP activities in the pipeline, each meant for updating one dimension. There is of course a flaw in this design, i.e. it's not scalable. Each time a new dim is added I need to update the pipeline too. It's not good, but I couldn't find how to loop the SP activity in ADF V2 yet.

    • @joostbazelmans6093
      @joostbazelmans6093 6 years ago

      I have the same issue. When I assign a dynamic filename to the output file in the inner pipeline, it says: "item() is not a recognized function". My expression for the filename = @concat('DancerData',item().Date,'.txt')

    • @ravikumar-hn5wt
      @ravikumar-hn5wt 6 years ago

      Try using @concat('DancerData', @item().Date,'.txt') in your pipeline.

    • @DineshPriyankara
      @DineshPriyankara  6 years ago

      Hi Joost,
      Did you try what Ravi said? Sorry, I was busy and could not reply quickly.
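For reference, a sketch of how this expression is usually written: in the ADF expression language, nested function calls after the leading @ do not repeat the @, and item() is only in scope inside the ForEach activity itself, so an inner pipeline invoked via Execute Pipeline would receive the value through a parameter (the parameter name Date here is hypothetical).

```
Inside the ForEach (e.g. as an Execute Pipeline parameter value):
  @concat('DancerData', item().Date, '.txt')

Inside the inner pipeline, referencing that parameter:
  @concat('DancerData', pipeline().parameters.Date, '.txt')
```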

  • @srinivasulureddy7265
    @srinivasulureddy7265 4 years ago

    How do we track which files succeeded and which did not?

  • @gowthamtamilvendan4522
    @gowthamtamilvendan4522 6 years ago

    Dinesh, I have one scenario. I am getting 4 values from my Lookup and I have to pass them into the Copy activity.
    Example:
    Lookup step:
    Select personid as ID from sourcetable where personid in (1,2,3,4)
    Copy activity:
    Select * from sourcetable where personid in (@{item().ID})
    I have to capture these four values in my where clause. In this case, how do I write my parameter? Do I have to use an IN condition, or are there different conditions for this?
    Please correct me on the code written above.
    Thanks in advance.

  • @gowthamtamilvendan4522
    @gowthamtamilvendan4522 6 years ago

    Could you please post the parallel execution method also?

    • @DineshPriyankara
      @DineshPriyankara  6 years ago

      Will surely be discussing it in one of the next videos, Gowtham.

  • @farhanahmedsyed
    @farhanahmedsyed 6 years ago

    awesome thanks