Azure Data Factory Parametrization Tutorial

  • Published 10 Jun 2024
  • Parametrization in Azure Data Factory is essential for good design, reusability, and a low cost of solution maintenance. Using parameters also speeds up the implementation of new features in your pipelines.
    In this video I cover the basics of Data Factory parametrization using a common Blob to SQL loading scenario (a short code sketch of triggering such a parametrized pipeline follows this description).
    Source: github.com/MarczakIO/azure4ev...
    Want to connect?
    - Blog marczak.io/
    - Twitter / marczakio
    - Facebook / marczakio
    - LinkedIn / adam-marczak
    - Site azure4everyone.com
    more to come..
    Next steps for you after watching the video
    1. Check data factory docs docs.microsoft.com/en-us/azur...
    2. Linked service parametrization example
    docs.microsoft.com/en-us/azur...
    3. System variables
    docs.microsoft.com/en-us/azur...
    See you next time!
  • Science & Technology
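The video drives the parametrized pipeline from the Data Factory UI (Debug / Trigger Now). Purely as an illustration of the same idea, here is a minimal sketch of starting such a run programmatically with the Azure SDK for Python (azure-mgmt-datafactory); the resource group, factory, pipeline and parameter names are hypothetical placeholders, not taken from the video.

```python
# Minimal sketch (not from the video): trigger a parametrized ADF pipeline run from Python.
# Resource group, factory, pipeline and parameter names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<subscription-id>"
client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Pass the same values you would otherwise type into the Debug / Trigger Now dialog.
run = client.pipelines.create_run(
    resource_group_name="rg-demo",         # hypothetical
    factory_name="adf-demo",               # hypothetical
    pipeline_name="Copy Blob to SQL",      # hypothetical pipeline with two parameters
    parameters={"InputFileName": "cars.csv", "TargetTableName": "Cars"},
)
print("Started pipeline run:", run.run_id)
```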

COMMENTS • 370

  • @raviv5109
    @raviv5109 4 years ago +14

    Are you real? You really explain and demo in such a flawless manner. Thank you so much; your Azure videos are a delight at every step.

  • @rajdeepmandal1470
    @rajdeepmandal1470 4 years ago +3

    The quarantine days wouldn't be this useful without your teaching :) thank you Adam. You are the best!!!

  • @nikhilpatil2021
    @nikhilpatil2021 4 years ago +6

    Thank you Adam, this is the best tutorial on Azure Data Factory I have seen! I am definitely going to try these demos. Thank you!

  • @tatianebrandao5487
    @tatianebrandao5487 6 months ago

    I'm just starting with Synapse/DF and I found your video extremely helpful for beginners. I managed to go through your video, complete the exercise on my own and everything made so much sense! Your way of explaining is very clear! Thank you very much for creating the video :)

  • @nicholasgregory4774
    @nicholasgregory4774 3 years ago +10

    Another great one Adam, consider me a new student! Short enough in length while fitting all the relevant information in + easy to digest, without overly complicating things. Your PowerPoints with your explanations are second to none.

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Awesome, thank you!

    • @eerosiljander4622
      @eerosiljander4622 1 year ago

      Thanks Adam for these hands-on videos. Best wishes, ES, Data Engineer & Scientist.

  • @nupoorsingh3323
    @nupoorsingh3323 2 years ago

    It's great support for people like me who do not have a mentor. Thanks for uploading these videos and helping me to understand the workflow and survive.

  • @carl33p
    @carl33p 4 years ago +1

    Demos are the best way to learn, and your demo is as good as it gets. Build, run, build, run: the incremental style lets the viewer understand step by step.

  • @johnd3601
    @johnd3601 2 years ago +1

    Thanks, this is a big help; it's amazing how much you don't know you know. This has opened up a lot of possible automation and will save me loads of time once I manage to apply it to my situation.

  • @ghay3
    @ghay3 3 years ago

    My go-to YouTube channel for anything Azure - you are doing the greatest service by sharing your knowledge and responding to all the queries! Many thanks, Adam. Hope you can also enlighten the community on AWS :)

  • @lehai9172
    @lehai9172 3 years ago +1

    Fantastic tutorials, I've learned a lot from your videos. Thank you for sharing your experiences, Adam.

  • @santanughosal9785
    @santanughosal9785 1 year ago

    Awesome, simple and hands on video to understand the concept. Thank you

  • @barryjohnson8915
    @barryjohnson8915 4 years ago +1

    Really excellent tutorials. Very simply explained, easy to follow, fantastic stuff.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks Barry :) Glad to have you here.

  • @mashagalitskaia8642
    @mashagalitskaia8642 2 months ago

    Thanks a ton for this video! It helped me a lot. Very precise and detailed, I had no questions left:)

  • @joshiabhinav
    @joshiabhinav 4 years ago

    Man, you are awesome. You present so well and explain so well. Love the way you teach!

  • @sankaji20
    @sankaji20 3 years ago

    You are doing a great job Adam, 20-30 minute videos of yours are equal to 60 minutes of others. And the quality of your videos is unmatchable. I'm a GCP guy now but also very much interested in learning Azure from your videos only.. keep uploading Azure videos. Thanks so much.. you're doing a wonderful job for learners.. you will get a lot of blessings.. Thumbs-up for you..

  • @mayur6816
    @mayur6816 3 years ago +1

    Thanks, Adam. This is the best tutorial on ADF.

  • @IntelliCloudTech
    @IntelliCloudTech 2 years ago

    Clearly explained at every point. thanks for the video.

  • @iChrisBirch
    @iChrisBirch 6 months ago

    Great tutorial, answered my questions as I had them, great logical flow.

  • @TeymurRzayev
    @TeymurRzayev 2 years ago

    Actually liked it before watching the video ;) Brilliant content, perfect explanation!

  • @SuperSanjays
    @SuperSanjays 2 years ago

    Great information you have shared Adam. It helps me a lot.

  • @igiq
    @igiq 4 years ago +1

    Keep it up Adam, your videos are very helpful !

  • @ashishjuneja6574
    @ashishjuneja6574 2 years ago

    Super awesome !! Great going Adam.

  • @eatingnetwork6474
    @eatingnetwork6474 3 years ago +1

    great video, once again very clear and concise, thanks Adam

  • @checomax1980
    @checomax1980 3 years ago

    Thank you Adam,
    Excellent tutorial.

  • @srinivasreddysankasani5140
    @srinivasreddysankasani5140 3 years ago +1

    Thank you Adam for your videos. Your explanation is really awesome. Looking forward to more videos.

  • @jagadishparale8632
    @jagadishparale8632 4 years ago +5

    Hey Adam, it was an excellent tutorial. I would like you to do more videos on deployment of Azure Data Factory pipelines when we use parameters.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +2

      Thanks! You are in luck, new video tomorrow is also on ADF, stay tuned!

  • @sraoarjun
    @sraoarjun 3 years ago +1

    One word - FANTASTIC!! Thank you Adam, keep up the good work educating so many people. Also, please put out some videos on Synapse Analytics...

  • @sazidaanjumsheikh6817
    @sazidaanjumsheikh6817 1 year ago

    Thank you Adam, in a very short video you made me understand ADF with a demo instead of theory.. Nice content... :-)

  • @arnabmondal5596
    @arnabmondal5596 4 years ago

    Awesome clip. Better content than much of what's currently available.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thank you so much :) glad that you enjoyed it.

  • @dabay200
    @dabay200 4 years ago +1

    This is a really amazing video, so much better than Microsoft's tutorials.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks again Dinal. You are too kind :).

  • @angelamessina7187
    @angelamessina7187 4 years ago

    Thanks a lot Adam, this will be really helpful for today's task. Be blessed.

  • @jayong2370
    @jayong2370 1 year ago

    Awesome tutorial! Thank you.

  • @MarkDiamond
    @MarkDiamond 2 years ago

    Excellent content, thank you so much for the effort!

  • @docondelta
    @docondelta 4 years ago +2

    This is a really amazing tutorial.. Thank you

  • @baladenmark
    @baladenmark 3 years ago +1

    Wow. Thank you Adam. I really can't thank you enough.

  • @snmailist1470
    @snmailist1470 3 years ago +1

    Thank you Adam. I really appreciate your effort 👍👍👍👍
    You've given me a picture of what's going on in the cloud.

  • @asrarhaq3234
    @asrarhaq3234 4 years ago

    Wonderful - practical and useful. Thank you

  • @kalangarahouse
    @kalangarahouse 2 years ago

    nice tutorial...please continue making such videos

  • @venkataramakrishnamagapu7645
    @venkataramakrishnamagapu7645 3 years ago +1

    Excellent content. Simple and informative.

  • @fsfehico
    @fsfehico 4 years ago

    Very well structured content. Thank you

  • @joseantonioestebanrodrigue448
    @joseantonioestebanrodrigue448 2 years ago

    Perfect explanation!

  • @pavankumar-nm9yu
    @pavankumar-nm9yu 3 years ago +1

    Thank You Adam, Amazing Explanation...!!!

  • @ricardorocha7882
    @ricardorocha7882 1 year ago

    This video is simply incredible. It's going to help me a lot.

  • @soumenchandra7633
    @soumenchandra7633 4 years ago

    What a fascinating presentation !!!

  • @Bubu020174
    @Bubu020174 2 years ago

    Thank you so much for sharing this valuable information!

  • @sagarnegi
    @sagarnegi 3 years ago

    Great explanation. I understood everything.

  • @BojackCartman
    @BojackCartman 3 years ago

    nice tutorial, very clear and concise

  • @malayku8781
    @malayku8781 4 years ago

    Very nicely explained Adam!

  • @GhernieM
    @GhernieM 4 years ago

    Thanks Adam! Very helpful

  • @vishwanathr3507
    @vishwanathr3507 4 years ago

    Hi Adam
    Thank you for the excellent tutorial and your time. Good luck
    Cheers!
    Vishwanath

  • @jayantKadam100
    @jayantKadam100 4 years ago

    This is a really wonderful tutorial and very useful, thanks

  • @MrVoltelen
    @MrVoltelen 3 years ago +1

    Thanks for the great explanation

  • @oscararmandocisnerosruvalc8503

    Your videos are amazing bro !!!!!

  • @ambrishl9273
    @ambrishl9273 4 years ago

    Thanks a lot, This was very helpful!!!

  • @RohkeaNox
    @RohkeaNox 1 year ago

    Very helpful, thank you :) !

  • @oscarsalas6271
    @oscarsalas6271 2 years ago

    Excellent video, it helped me a lot

  • @_indrid_cold_
    @_indrid_cold_ 4 years ago

    Best content! Thank you so much!

  • @Unbox747
    @Unbox747 2 years ago

    I love your videos!

  • @lorenzodelsoldato6899
    @lorenzodelsoldato6899 4 months ago

    Thank you, really good content

  • @shaileshsondawale2811
    @shaileshsondawale2811 1 year ago

    Great work...!!!!

  • @luisabaiano8689
    @luisabaiano8689 4 years ago

    Thank you! From Italy

  • @TheSQLPro
    @TheSQLPro 3 years ago +1

    Great tutorial!

  • @danceandmanymorewithprisha5599
    @danceandmanymorewithprisha5599 4 years ago +1

    great explanation !

  • @devopsexpert9906
    @devopsexpert9906 3 years ago +1

    Excellent Video, Best tutoring.......

  • @leonkriner3744
    @leonkriner3744 4 years ago

    very helpful. Thank you!

  • @ivanovdenys
    @ivanovdenys 4 years ago

    Thanks a lot !!! Really really good !!

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks! And thank you for watching Denys :)

  • @priyankapatel9461
    @priyankapatel9461 3 years ago +1

    Really very helpful!

  • @kannan114
    @kannan114 2 years ago

    Thanks Adam for the great video. I have a question: how do we put this entire pipeline in a loop? Let's say I create a list of source files and destination tables and run this pipeline for each set of values in that list, so I don't have to enter the values manually during every run/debug. Is this something that can be done with a trigger? Any suggestions on that?
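There is no reply in the thread, but one common approach inside ADF is a parent pipeline with a ForEach activity over an array parameter that calls Execute Pipeline for each (file, table) pair. Purely as an illustration, the same effect can also be achieved from outside ADF by looping over the pairs in a script; all resource, pipeline and parameter names below are hypothetical placeholders.

```python
# Hypothetical sketch: start one run of the parametrized pipeline per (file, table) pair
# instead of typing the values by hand. Inside ADF itself, a ForEach activity over an
# array parameter plus an Execute Pipeline activity achieves the same thing.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

datasets = [
    ("cars.csv", "Cars"),      # (blob file, target SQL table) -- example values
    ("planes.csv", "Planes"),
]

for file_name, table_name in datasets:
    run = client.pipelines.create_run(
        resource_group_name="rg-demo",       # hypothetical
        factory_name="adf-demo",             # hypothetical
        pipeline_name="Copy Blob to SQL",    # hypothetical
        parameters={"InputFileName": file_name, "TargetTableName": table_name},
    )
    print(f"{file_name} -> {table_name}: run {run.run_id}")
```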

  • @bookko2377
    @bookko2377 4 years ago

    Thank you From Thailand

  • @tarvinder91
    @tarvinder91 4 years ago

    very informative and helpful video

  • @amitsingh8412
    @amitsingh8412 3 years ago

    Thanks Adam.

  • @Peter-cd9rp
    @Peter-cd9rp 4 years ago

    Cool dude. Especially pipeline parameters! :) I can easily start pipelines with Logic Apps! :D Wohoo

  • @rishikabapat6737
    @rishikabapat6737 4 years ago +1

    Thank you Adam for such great videos! I'm not sure if you have covered this in any video, but my question is around transformation of the csv before it's loaded into the SQL DB. For example, I have a csv file with years as columns and I need to translate them into data fields. Is there any way to do that using Data Factory, or is the only way to re-format in Excel before I load it?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Yes, absolutely! Check out my video on Data Factory Mapping Data Flows; it shows exactly what you need ;) Thanks for watching!
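Adam's reply points to Mapping Data Flows, which include an Unpivot transformation for exactly this reshape. Only to illustrate the reshape being asked about (year columns turned into rows), here is a tiny pandas sketch with made-up column names; it is not the ADF method shown in the videos.

```python
# Illustration only (not the video's method): turning year columns into (year, value) rows
# is an "unpivot". In ADF this is the Unpivot transformation in Mapping Data Flows; the
# same idea in pandas looks like this.
import pandas as pd

df = pd.DataFrame({
    "Country": ["PL", "DE"],   # made-up columns
    "2018": [10, 20],
    "2019": [15, 25],
})

long_df = df.melt(id_vars=["Country"], var_name="Year", value_name="Value")
print(long_df)
#   Country  Year  Value
# 0      PL  2018     10
# 1      DE  2018     20
# 2      PL  2019     15
# 3      DE  2019     25
```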

  • @geethamca88
    @geethamca88 1 year ago

    Excellent

  • @sumashree6351
    @sumashree6351 2 years ago

    Thank you very much, good explanation. I have one doubt: which parameter is used to improve the performance of the company for the past 6 years?

  • @julianramirezvasquez3164
    @julianramirezvasquez3164 2 years ago

    Hi Adam, great video as always! I have a question: is there a way to configure Data Factory to send files automatically from Blob storage and then transform the files automatically as well? If that's possible, how can I do that? Thank you so much!

  • @chen5576
    @chen5576 4 years ago

    Hi Adam! This was a great tutorial! Do we always need to pre-define a table in SQL before moving data from blob? Could we create a new table if the target sink name does not exist inside the pipeline?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Thanks! For the table, you can select the checkbox on the copy activity to create the table if it doesn't exist. Just make sure that the schema mapping is there, otherwise your table schema will be pretty bad. That said, this option is not recommended, as you should change your DB schema in a more controlled manner.

  • @mumair1979
    @mumair1979 2 years ago

    Hi Adam, this is fantastic. How can we take parameters from a SQL table, by using a stored procedure, instead of entering them into the data sources each time? Can you please advise.. Thanks

  • @sonamkori8169
    @sonamkori8169 2 years ago

    Thank You Sir

  • @Viqram1
    @Viqram1 1 year ago

    Hi Adam, thank you for making such informative videos. I was trying to follow the instructions that you've mentioned in the video. However, I wasn't able to load both files using parameterisation.
    When I try to load cars.csv, it loads properly, and then when I try to load planes.csv, it throws an error saying columns not found. In order to resolve it, I just clicked on "Import Schemas" under the sink tab and I was able to load planes.csv.
    Why do I have to import schemas when changing source files?

  • @sameerdongare1113
    @sameerdongare1113 4 years ago

    Awesome video

  • @tanmoybhattacharjee4966
    @tanmoybhattacharjee4966 3 years ago +1

    Thank you, amazing tutorials, really helpful

  • @samama1975
    @samama1975 4 years ago

    Hi Adam, thank you for shedding light on the world of Azure. I enjoy following your tutorials. A question though: could you please point me to where I can download the PowerPoint doc you used for this tutorial? It's simply the road map that is needed to guide you as you navigate your project. Thank you

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +2

      Thank you for watching and commenting. Unfortunately, while the source code and samples are open source on GitHub and the entire content is free, I've decided that at this point in time I'm not sharing the PowerPoint presentations, as I want to maintain copyright over my materials. Thank you for understanding.

  • @maxgobel
    @maxgobel 3 years ago +1

    Hi Adam, thanks for the great tutorial, it was very helpful. Would I be able to follow this procedure with a data flow instead of a copy example? I am struggling with implementing this between a rawinputBlob and a cleanedOutputBlob

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      It's a little different, but mapping data flows support parametrization too: docs.microsoft.com/en-us/azure/data-factory/parameters-data-flow?WT.mc_id=AZ-MVP-5003556 Thanks for stopping by!

  • @alfredsfutterkiste7534
    @alfredsfutterkiste7534 2 years ago

    Love it

  • @srinivasareddy3336
    @srinivasareddy3336 2 years ago

    Hi Adam, thanks for the great video tutorial. May I know how to pass variable values from one pipeline to another pipeline in Azure Data Factory?

  • @mahendramanchekar
    @mahendramanchekar 1 year ago

    Thank you

  • @kavishv.8101
    @kavishv.8101 2 years ago

    Do we always have to create an individual table in the SQL database for every csv file we copy from blob to database? Also, you have amazing content, thank you!

  • @mrboloban25
    @mrboloban25 2 years ago

    Great video Adam, thanks. If I am able to get the schema and table list from a db with a Lookup, how could I export this to a csv file? Thanks.

  • @jonahnio5642
    @jonahnio5642 3 years ago +1

    Thanks a lot Adam for the video. Would you be able to show me how I can pass variables across different pipelines? Thanks very much

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks. Check out the MS blog entry on that: cloudblogs.microsoft.com/industry-blog/en-gb/technetuk/2020/03/19/enterprise-wide-orchestration-using-multiple-data-factories/

  • @atulsalunke6675
    @atulsalunke6675 3 years ago

    Awesome content and delivery Adam!! Thanks!!
    Please, if possible, add real project-based content using ADF

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Great suggestion! I do plan to have a few videos on actual implementations :)

  • @rajexpert8544
    @rajexpert8544 2 years ago

    Hi Adam,
    Nice tutorial. How can I parameterize the pipeline if I want to loop through all the containers in blob storage with the corresponding SQL table names?
    I don't want to enter parameter values manually at run time.

  • @H2B-World
    @H2B-World 2 years ago

    Hi Adam, it looks flawless but I have a doubt about the present scenario. Instead of passing names manually as parameters, why can't it read the files one by one and process them accordingly?

  • @sarveshpandey1125
    @sarveshpandey1125 3 years ago

    You saved me a lot of time. Thank you! Can you please clarify one thing: what's the use case for pipeline variables?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks! Parameters are for pipeline input; variables are for intermediate results and temporary or calculated values that change during a pipeline run. They have many use cases.

  • @krishnakoirala2088
    @krishnakoirala2088 2 years ago

    Adam, thanks for all those great videos. A question: I have a parameterized stored procedure as an activity and I am passing parameter values within ADF. I want to dynamically capture the value of one parameter that is passed to the stored procedure; how do I do that? I need to capture this value within a Databricks notebook via a widget. I have defined a base parameter in the Databricks notebook activity, something like this: (@activity('DeleteDaysBack').output), where DeleteDaysBack is the stored procedure activity whose parameter I am trying to capture in the Databricks activity. Only the Input has that parameter, but ADF does not support referencing Input and throws an error. Any insight would be helpful, thanks in advance.

  • @yyy168-wt1zq
    @yyy168-wt1zq 3 months ago

    Hi Adam, thanks for the video, that's a clear explanation. Can I ask: what if I want to put Cars and Planes into a variable array, so the pipeline can have a for-loop execution that copies Cars and then Planes? Is that possible?

  • @blse2000
    @blse2000 4 years ago

    awesome bro

  • @juliestudy1475
    @juliestudy1475 7 months ago

    This is great! Instead of typing in the source and destination one at a time, is there a way to simply loop through the files folder, pick up the filenames, pass them as the input (filename) parameter and use them again as the output (tablename) parameter?
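Not something covered in this particular video, but inside ADF this pattern is typically a Get Metadata activity (child items) feeding a ForEach that calls Execute Pipeline. The same idea scripted against the storage and Data Factory SDKs might look like the hypothetical sketch below; the account, container, folder, pipeline and parameter names are all placeholders.

```python
# Hypothetical sketch: list csv files in a folder, derive the table name from each file
# name, and start one parametrized pipeline run per file. Inside ADF the equivalent is
# Get Metadata (child items) -> ForEach -> Execute Pipeline.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()
blobs = BlobServiceClient("https://mystorageacct.blob.core.windows.net", credential=credential)  # hypothetical account
adf = DataFactoryManagementClient(credential, "<subscription-id>")

container = blobs.get_container_client("input")                # hypothetical container
for blob in container.list_blobs(name_starts_with="files/"):   # hypothetical folder
    file_name = blob.name.split("/")[-1]
    if not file_name.endswith(".csv"):
        continue
    table_name = file_name.removesuffix(".csv").capitalize()   # e.g. cars.csv -> Cars
    adf.pipelines.create_run(
        resource_group_name="rg-demo",       # hypothetical
        factory_name="adf-demo",             # hypothetical
        pipeline_name="Copy Blob to SQL",    # hypothetical
        parameters={"InputFileName": file_name, "TargetTableName": table_name},
    )
```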

  • @Go_Bremerhaven
    @Go_Bremerhaven 3 years ago +1

    What a great video. One question though: does the Copy Data activity automatically map the columns of the csv to the columns of the SQL table based on the column names? What if the column names are different between the csv and SQL tables?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Thanks! If they are different it will throw an error, but as a workaround you can create a SQL view with the right column names. SQL Server allows you to perform inserts on a view :)

  • @sougatasil6064
    @sougatasil6064 1 year ago

    Thanks for the demo. But what should we do in the case of scheduled execution, since it's not possible to change the parameters manually during a scheduled run? Is there a way to pass the parameters dynamically so that one run is scheduled for cars and the next for planes, without manual intervention?