Learn N Relearn
Azure Data Factory || Control Flow || ForEach Loop Activity
Content: ForEach Loop Activity concept and implementation in Azure Data Factory control flow
Audience: Beginner
Next Video:
1. Data flow transformations
Views: 3,311

Videos

Azure Data Factory || Control Flow || Wait Activity
950 views · 3 years ago
Content: Wait Activity concept and implementation in Azure Data Factory control flow. Audience: Beginner. Next Video: 1. ForEach Loop Activity
Azure Data Factory || Control Flow || Append Activity
1.1K views · 3 years ago
Content: Append Activity concept and implementation in Azure Data Factory control flow. Audience: Beginner. Next Video: 1. Wait Activity 2. ForEach Loop Activity
Azure Data Factory || Control Flow || If Activity
1.5K views · 3 years ago
Content: If Activity concept and implementation in Azure Data Factory control flow. Audience: Beginner. Next Video: 1. Append Activity 2. Wait Activity 3. ForEach Loop Activity
Azure Data Factory || Control Flow Activity || Filter Activity
1.6K views · 3 years ago
Content: Filter Activity concept and implementation. Audience: Beginner. Next Video: If Activity
Azure Data Factory || Control Flow Activity || Get Metadata Activity
3.1K views · 3 years ago
Content: Get Metadata Activity concept and implementation. Next Video: 1. Filter Activity 2. Append Activity 3. If Activity 4. Lookup Activity 5. Wait and Until
Azure Data Factory || Slowly Changing Dimension Type 2 || SCD Type 2 || Part 2
6K views · 3 years ago
Content: Azure Data Factory live scenario. Slowly Changing Dimension Type 2 concept explanation and hands-on implementation. Activities & Transformations Used: 1. Mapping Data Flow 2. Join 3. Lookup 4. Conditional Split 5. New Branch 6. Derived Column 7. Filter 8. Alter Row 9. Sink Update/Insert
Azure Data Factory || Slowly Changing Dimension Type 2 || SCD Type 2 || Part 1
8K views · 3 years ago
Content: Azure Data Factory live scenario. Slowly Changing Dimension Type 2 concept explanation and hands-on implementation. Activities & Transformations Used: 1. Mapping Data Flow 2. Join 3. Lookup 4. Conditional Split 5. New Branch 6. Derived Column 7. Filter 8. Alter Row 9. Sink Update/Insert
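A minimal T-SQL sketch of the SCD Type 2 pattern these two parts implement with Mapping Data Flow: expire the current row, then insert the new version. The dim_employee/stg_employee tables and the city column are hypothetical placeholders, not names from the videos.

    -- Step 1: expire the currently active row for keys whose attributes changed.
    UPDATE d
    SET d.is_active = 0, d.end_date = GETDATE()
    FROM dim_employee d
    JOIN stg_employee s ON s.employee_id = d.employee_id
    WHERE d.is_active = 1 AND d.city <> s.city;

    -- Step 2: insert the new version as the active row. Brand-new keys are
    -- covered too, since changed keys no longer have an active row after step 1.
    INSERT INTO dim_employee (employee_id, city, is_active, start_date, end_date)
    SELECT s.employee_id, s.city, 1, GETDATE(), NULL
    FROM stg_employee s
    LEFT JOIN dim_employee d
        ON d.employee_id = s.employee_id AND d.is_active = 1
    WHERE d.employee_id IS NULL;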
Azure Data Factory || Slowly Changing Dimension || SCD Type 1 Concept and Implementation
8K views · 3 years ago
Content: Azure live scenario example. Understanding the concept of SCD Type 1 and implementing it through Azure Data Factory. 1. Data Flow activity 2. Source, reference table, sink 3. Exists transformation 4. Alter Row transformation. Next Video: Slowly Changing Dimension Type 2 concept and implementation. Please share your feedback and subscribe to our channel.
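For contrast, a minimal T-SQL sketch of the SCD Type 1 idea: overwrite changed attributes in place and keep no history. Table and column names are again hypothetical placeholders.

    -- SCD Type 1: update matching keys in place, insert brand-new keys.
    MERGE dim_employee AS tgt
    USING stg_employee AS src
        ON tgt.employee_id = src.employee_id
    WHEN MATCHED AND tgt.city <> src.city THEN
        UPDATE SET tgt.city = src.city
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (employee_id, city) VALUES (src.employee_id, src.city);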
Azure Tutorial || Azure Data Warehouse Solution || Part 2
5K views · 3 years ago
Content: How to use Azure services in a data warehouse solution, with basic hands-on examples: Azure Storage, Azure SQL Database, Azure Data Factory, Azure Databricks, Azure Synapse DB, Azure Synapse Analytics workspace. Audience: Azure beginners. Part 1: ua-cam.com/video/7NRfzwf-SAo/v-deo.html Next Video: Azure Data Factory live scenario with concept and hands-on practice
Azure Tutorial || Azure Data Warehouse Solution || Part 1
29K views · 3 years ago
Content: How to use Azure services in a data warehouse solution, with basic hands-on examples: Azure Storage, Azure SQL Database, Azure Data Factory, Azure Databricks, Azure Synapse DB, Azure Synapse Analytics workspace. Audience: Azure beginners. Part 2: ua-cam.com/video/yZlpGCWMsfI/v-deo.html Next Video: Azure Data Factory live scenario with concept and hands-on practice
Azure Data Factory || Incremental Load or Delta Load from SQL to File Storage
54K views · 4 years ago
About: In this video you will understand how we can perform an incremental or delta load from Azure SQL to File Storage using a watermark table. Links: Azure portal - portal.azure.com | Learn n Relearn Channel - ua-cam.com/channels/9Q0FgZv2MM_B_Jq3HieGUQ.html | Create and connect to Azure SQL: ua-cam.com/video/1pvOT9IqkqQ/v-deo.html | ADF Basics: ua-cam.com/video/0vkzn5mwtZU/v-deo.html If you are like...
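A minimal T-SQL sketch of the watermark pattern this video describes: look up the old watermark, look up the new maximum, copy only the rows in between, then advance the watermark. The src_orders table and last_modified column are hypothetical placeholders.

    -- Watermark table: stores the high-water mark per source table.
    CREATE TABLE watermarktable (
        TableName NVARCHAR(255) NOT NULL,
        WatermarkValue DATETIME NOT NULL
    );

    -- Lookup 1: the old watermark. Lookup 2: the new maximum from the source.
    SELECT WatermarkValue FROM watermarktable WHERE TableName = 'src_orders';
    SELECT MAX(last_modified) AS NewWatermarkValue FROM src_orders;

    -- Copy activity source query: only rows changed since the last run.
    SELECT * FROM src_orders
    WHERE last_modified > '<old watermark>' AND last_modified <= '<new watermark>';

    -- After a successful copy, advance the watermark for the next run.
    UPDATE watermarktable
    SET WatermarkValue = '<new watermark>'
    WHERE TableName = 'src_orders';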
Azure Data Factory || Delete Activity and Stored Procedure Activity Example
2.3K views · 4 years ago
About: Let's understand a few more ADF activities, 'Delete' and 'Stored Procedure', with examples. Links: Azure portal - portal.azure.com | Learn n Relearn Channel - ua-cam.com/channels/9Q0FgZv2MM_B_Jq3HieGUQ.html | ADF Basics - ua-cam.com/video/0vkzn5mwtZU/v-deo.html | Create and Connect to Azure SQL - ua-cam.com/video/1pvOT9IqkqQ/v-deo.html If you are like me and love to learn, then this channel is for...
Azure Data Factory Tutorial | Why and What Is Data Factory, Components, Create ADF Pipeline
12K views · 4 years ago
About: Let's understand why and what Data Factory is and a few of its components, create an ADF pipeline, and load data from a CSV file to Azure SQL. Links: Azure portal - portal.azure.com | Learn n Relearn Channel - ua-cam.com/channels/9Q0FgZv2MM_B_Jq3HieGUQ.html | Create and connect to Azure SQL: ua-cam.com/video/MtNepTS24io/v-deo.html If you are like me and love to learn, then this channel is for y...
Azure Tutorial | Create Azure SQL Database, Firewall Setup, Connect SQL Database from SSMS
9K views · 4 years ago
About: In this video we will understand how to create a SQL database, set up the firewall, and connect to the SQL database from SSMS. Links: Azure portal - portal.azure.com | Learn n Relearn Channel - ua-cam.com/channels/9Q0FgZv2MM_B_Jq3HieGUQ.html | Create SQL Server: ua-cam.com/video/MtNepTS24io/v-deo.html If you are like me and love to learn, then this channel is for you. We can share a lot of learnings and ideas ...
Azure Tutorial || What is a Resource Group and a Resource in Azure with Example
2.4K views · 4 years ago

COMMENTS

  • @gulchehraamirjonova2339
    @gulchehraamirjonova2339 a month ago

    Thank you so much! You can't imagine how helpful it was.

  • @Sistematizei
    @Sistematizei 3 months ago

    A clean and clear tutorial; congratulations on the video and on the mastery of the explanation, with a wealth of detail.

  • @aadityasharma63353
    @aadityasharma63353 5 months ago

    Ma'am, can you please provide the source code shown at the start of the video?

  • @williamtenhoven8405
    @williamtenhoven8405 7 months ago

    Hi, it looks like this doesn't work with Parquet. If I do the same in the pipeline expression builder in the Copy Data sink and press Validate all, it says Syntax error: Opening brace has no closing brace. Thing is..... there is a closing brace...

  • @averychen4633
    @averychen4633 7 months ago

    Thank you. This is the clearest explanation of Azure cloud, but you did not show us how to install SSMS. Is it an on-premises server? Where does SSMS come from? I am lost; I do not quite understand that part.

    • @SushilChauhan
      @SushilChauhan 4 months ago

      You can connect SSMS to an Azure SQL DB; all you need is the Azure SQL server address and the database login and password.

  • @Dhrumilthakkar210
    @Dhrumilthakkar210 a year ago

    Very nice explanation

  • @anonymous-254
    @anonymous-254 a year ago

    You just keep proceeding and proceeding ahead... please tell us what an incremental load is and in which real-life scenarios we can use it; you haven't explained any background info.

  • @zramzscinece_tech5310
    @zramzscinece_tech5310 a year ago

    Do we need to create one data flow for each dimension? Any alternatives?

  • @souranwaris142
    @souranwaris142 a year ago

    Hello, I have a problem with the incremental load. I want to create an incremental pipeline from an on-premises Oracle server to Azure Data Lake (Blob Storage); I don't have Azure SQL. I just want to land the data in Blob Storage as a CSV file. In my case, I am confused about where I should create the watermark table. Someone told me that in my case I have to use Parquet data. Please help me with this; I have been stuck for many days.

  • @dataguy6700
    @dataguy6700 a year ago

    Thanks a lot, well explained.

  • @rohitsethi5696
    @rohitsethi5696 a year ago

    What is a watermark table? I know of staging tables and have worked with staging tables.

  • @Bgmifortimepass
    @Bgmifortimepass a year ago

    In a real-time scenario, how do we maintain tables in the SQL DB? Do we need to create a new table, or can we use an existing one?

  • @kandulagaming8186
    @kandulagaming8186 a year ago

    Simple English language: even non-IT persons are able to understand.

  • @kandulagaming8186
    @kandulagaming8186 a year ago

    Excellent explanation, good. This is wonderful, 100 percent; I have seen many other people's videos.

  • @WelcomeToDataverse
    @WelcomeToDataverse a year ago

    The dimension table should also have an end-date column.

  • @arihantsurana3671
    @arihantsurana3671 a year ago

    I followed all the steps, but while debugging I am facing a problem running the pipeline: my lookups are running, but the Copy Data one is failing... please help.

  • @MR-gp4zp
    @MR-gp4zp a year ago

    Where is part 3?

  • @rahulchavan7822
    @rahulchavan7822 2 years ago

    Your voice is very, very low.

  • @arupnaskar3818
    @arupnaskar3818 2 years ago

    Great, ma'am, excellent. 🧚💐💐

  • @abhijeethivarkar1329
    @abhijeethivarkar1329 2 years ago

    There is one issue I feel with this implementation. If we load the same record file without any changes multiple times, it will keep adding new active entries and marking the old entries as 0, so the total record count of the sink table will grow with a complete duplicate set of the existing records below the max-key value. Secondly, you could execute the sink activities in sequence, which would eliminate the need for the max-key logic to track the updatable records.

  • @lucaslira5
    @lucaslira5 2 years ago

    Thanks for the video. Is it possible to keep just one file and append the new records to it, so you don't have to create multiple files in the blob?

  • @madanl2208
    @madanl2208 2 years ago

    Please upload more videos, madam; your teaching is very good.

  • @sharavananp5570
    @sharavananp5570 2 years ago

    Hi, awesome explanation. I did the same for my pipeline and found an error due to the T and Z coming in the output of the lookup values, whereas in SQL the datetime format has no T and Z. I tried using formatDateTime('2000-07-07T23:34:32Z', 'yyyy-MM-ddTHH:mm:ssZ') and am still facing issues. Could you kindly suggest the best approach to solve this?

    • @sidsan000
      @sidsan000 a year ago

      @sharavananp5570 Hi, can you please tell me how you resolved this issue without a timestamp, using datetime only?
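      A likely fix, sketched in ADF expression syntax: the format string must be a single quoted literal, and a custom pattern without the T and Z separators produces a SQL-style datetime. The lookup activity name Lookup_watermark and the maxdate column here are hypothetical placeholders.

          @{formatDateTime(activity('Lookup_watermark').output.firstRow.maxdate, 'yyyy-MM-dd HH:mm:ss')}

      For example, formatDateTime('2000-07-07T23:34:32Z', 'yyyy-MM-dd HH:mm:ss') evaluates to 2000-07-07 23:34:32.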

  • @hemantvitkar6646
    @hemantvitkar6646 2 years ago

    Very useful and informative, Part 1 and Part 2; very important for beginners, as they can visualise the process. I saw a lot of content on UA-cam for DP-900, but both of these parts are awesome. 👍👍 I request you, ma'am, please keep it up.

  • @madanl2208
    @madanl2208 2 years ago

    Thanks, madam; keep posting more.

  • @mummz3371
    @mummz3371 2 years ago

    Your tutorials are so good. Why did you stop making videos :(

  • @dileeprajnarayanthumula4652
    @dileeprajnarayanthumula4652 2 years ago

    Hi ma'am, the explanation is very good. I would like to know if there is any chance of you providing personal training on ADF?

  • @mummz3371
    @mummz3371 2 years ago

    Hey, just watched this tutorial and it's really good. I see that you don't upload anymore. Please start to upload again; your knowledge and videos are really good and you are an amazing teacher ❤️ Hope you are well and we see you soon!

  • @pallarajesh
    @pallarajesh 2 years ago

    Hi, can you take mock interviews for Azure Databricks and Data Factory aspirants?

  • @sureshmekala2828
    @sureshmekala2828 2 years ago

    Well explained, madam. Thank you.

  • @SatishBurnwal
    @SatishBurnwal 2 years ago

    The video is really informative; just that your voice is too faint to hear.

  • @musketeers3344
    @musketeers3344 2 years ago

    Can we use the same watermark table for different tables, so that multiple tables' watermark values get updated in the same table?

  • @garlaamar8973
    @garlaamar8973 2 years ago

    Good explanation. Thank you for sharing this 😊

  • @kartikeshsaurkar4353
    @kartikeshsaurkar4353 2 years ago

    Why is a cross join used, and how does it work here?

  • @kartikeshsaurkar4353
    @kartikeshsaurkar4353 2 years ago

    You should have enabled data flow debug so that it would be easier for us to understand.

  • @ARUNSRAJ44
    @ARUNSRAJ44 2 years ago

    If the IP address keeps changing, how do I resolve the connectivity issue, since I cannot add every IP address to my firewall?

  • @cliffordraygentiles
    @cliffordraygentiles 2 years ago

    Hello, thanks for this; it was a good explanation. I have a question though: where is the part where the Employee_SID is created? Thanks, more power to you.


  • @terrificop5307
    @terrificop5307 2 years ago

    Nice session.

  • @tahamansoor7396
    @tahamansoor7396 2 years ago

    Best channel for learning ADF

  • @bandarurohithkumar439
    @bandarurohithkumar439 2 years ago

    Can you upload the doc file also? It would really help us.

  • @ankitsoni5286
    @ankitsoni5286 2 years ago

    I have 100 pipelines to run and can't run them all at once; I can do 5 at a time. If I wait a few minutes between batches, the expenses will go up. Any better options?

  • @aravindanbu2433
    @aravindanbu2433 2 years ago

    Well explained, thank you. And if possible, could you please cover ADX also...

  • @lucamel8766
    @lucamel8766 2 years ago

    Thanks for explaining a complex topic, but please check the audio :(

  • @vijay9842
    @vijay9842 2 years ago

    How do we write the copy activity from Kusto (Azure Data Explorer)? order_payload | where Last_modified_date > "@{activity('Lookup_max_watermark').output.FirstRow.maxdate}"

  • @shubhamkhatri6908
    @shubhamkhatri6908 2 years ago

    Awesome. Please keep up the good work ❤️

  • @kollikr
    @kollikr 2 years ago

    Do you have a recommended approach to deal with deleted rows? In your example, inserts and updates are handled well, but I am also looking for a solution to find any hard deletes in the source table.

    • @learnnrelearn7553
      @learnnrelearn7553 2 years ago

      To find the deleted delta records, you need to do a lookup against the source to check whether any record is missing... otherwise, the source can carry a delete flag to identify which records were deleted (soft delete).
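      A minimal T-SQL sketch of that lookup idea (the src_employee/dim_employee names are hypothetical): rows still active in the target that no longer exist in the source are the hard deletes.

          -- Keys active in the target that are missing from the source.
          SELECT d.employee_id
          FROM dim_employee d
          LEFT JOIN src_employee s ON s.employee_id = d.employee_id
          WHERE d.is_active = 1 AND s.employee_id IS NULL;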

  • @desparadoking8209
    @desparadoking8209 2 years ago

    Very good explanation! 👍🙂

  • @mr.prasadyadav
    @mr.prasadyadav 2 years ago

    Nice lecture, thank you.