How to Load Multiple CSV Files to a Table in Azure Data Factory - Azure Data Factory Tutorial 2021

  • Published 25 Aug 2024

COMMENTS • 27

  • @niteshsoni2282
    @niteshsoni2282 1 year ago +2

    BEST CHANNEL FOR LEARNING................

  • @zohebsyed3333
    @zohebsyed3333 1 year ago +2

    Thanks for the effort. These videos are amazing and very detailed.

  • @zakariiabadis7203
    @zakariiabadis7203 3 years ago +2

    Hello man!
    Please make this training like the SSIS one.
    I love you, man!

  • @mr.prasadyadav
    @mr.prasadyadav 2 years ago +1

    Nice effort, thank you Sir ❤️

  • @kumareskumares-ug2ys
    @kumareskumares-ug2ys 3 years ago +1

    Really helpful... Appreciate your efforts!

  • @narendrakishore8526
    @narendrakishore8526 2 years ago +1

    Great explanation ‼️

  • @rohitsingh8334
    @rohitsingh8334 3 years ago +1

    I like your style of teaching. 👍

  • @okkerb1
    @okkerb1 2 years ago +2

    I agree that this is a simple, but certainly not a safe, solution. The problem I have with this approach is that the processing, copy, and delete steps are completely disconnected, so there is no guarantee that the files written to the DB are in fact the ones moved to the archive folder. Say an error is swallowed during the processing stage: the files that get moved would still be the full set. Or, if there is an error in the moving step, we end up with duplicates in the SQL table. Or, if more files land in the incoming folder after the load to SQL but before the copy, we move files to Archive that were never processed. So, sorry to say, but this approach is not a good architecture for a fully production-ready solution.
    That said, thank you for your contributions, and know that I have put your suggestions from previous videos to good use.

    • @TechBrothersIT
      @TechBrothersIT 2 years ago +3

      Hi Okker, I totally agree with you. The intention of this video was to provide ways to practice the different options; for a fully functional solution I would not suggest this. I would put in a lot more checks if I had to fully design the ETL, and if I get time I will add a video on that. I did look into issues like the load failing halfway through for some files, which can end up with bad data, duplicate data, or even a failure on retry. I really appreciate that you took the time to analyze this and educate me and others. Have a great day.

    • @okkerb1
      @okkerb1 2 years ago +1

      @@TechBrothersIT Thanks for your response. The purpose of my message was certainly not to educate you :) but rather to highlight the danger with this approach and, as you mentioned, that there would need to be lots of checks and balances to make this work.

  • @clouddataodyssey
    @clouddataodyssey 2 years ago +1

    Thanks for the video. Can you please record a video on the "List of files" option (which we configure in the source, a blob storage account, when reading files from the source system)?

  • @srinubathina7191
    @srinubathina7191 1 year ago

    Thank you sir

  • @prashantsonawane7730
    @prashantsonawane7730 1 year ago

    Hello Sir, please make a video on how to load files with different extensions, such as CSV, Parquet, and TXT, into the database.

  • @kalaivanan.s5042
    @kalaivanan.s5042 9 months ago

    How does it read all the files without a ForEach?

  • @rohitsethi5696
    @rohitsethi5696 1 year ago

    Hi, I'm Rohit. Can we use all of these activities with one Copy command, like putting everything in a variable and using the ForEach loop to do this process?

  • @Mgiusto
    @Mgiusto 3 years ago +1

    I need something like this, but I need to load each file one at a time into a SQL DB, not everything that is found all at once, which is what the pipeline in this video does.
    How can I cycle through the files one at a time in the order they arrived in my storage folder? Any help would be greatly appreciated, thank you!

    • @TechBrothersIT
      @TechBrothersIT 3 years ago

      Can you please explain your requirement in a little more detail, step by step? Sorry, I could not understand what you are trying to do. I may be able to help you. Thanks for the feedback.

    • @brandonelliott2263
      @brandonelliott2263 2 years ago

      Based on what you are describing... you want to be able to ingest files into the target SQL DB based on the latest arrival of data in your storage folder. It sounds to me like you need to set up a message queue and a streaming service that feeds into the pipeline and transmits your latest data to your target SQL DB.

    • @Mgiusto
      @Mgiusto 2 years ago

      @@brandonelliott2263 I was able to solve my issue by using a Get Metadata activity to build a file list of the source blob storage files. Then I run a ForEach loop over that listing to process each file one at a time with a Copy Data activity.
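
      A minimal sketch of that pattern, using hypothetical names (a folder dataset SourceBlobFolder, a file dataset SourceBlobFile parameterized by fileName, and a sink dataset SqlTargetTable): the Get Metadata activity lists the folder's childItems, and the ForEach iterates that list sequentially, running one Copy per file. The activity "description" fields serve as inline comments, since pipeline JSON does not allow comment syntax.

```json
[
  {
    "name": "GetFileList",
    "type": "GetMetadata",
    "description": "List the source folder; childItems returns one entry per blob.",
    "typeProperties": {
      "dataset": { "referenceName": "SourceBlobFolder", "type": "DatasetReference" },
      "fieldList": [ "childItems" ]
    }
  },
  {
    "name": "ForEachFile",
    "type": "ForEach",
    "description": "Iterate the returned file list one file at a time (isSequential: true).",
    "dependsOn": [ { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] } ],
    "typeProperties": {
      "isSequential": true,
      "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
      "activities": [
        {
          "name": "CopyOneFile",
          "type": "Copy",
          "description": "The source dataset takes the current file name, so each iteration loads a single CSV into the SQL table.",
          "inputs": [
            {
              "referenceName": "SourceBlobFile",
              "type": "DatasetReference",
              "parameters": {
                "fileName": { "value": "@item().name", "type": "Expression" }
              }
            }
          ],
          "outputs": [ { "referenceName": "SqlTargetTable", "type": "DatasetReference" } ],
          "typeProperties": {
            "source": { "type": "DelimitedTextSource" },
            "sink": { "type": "AzureSqlSink" }
          }
        }
      ]
    }
  }
]
```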

  • @heliobarbosa2
    @heliobarbosa2 1 year ago

    How do I get the file name to insert into the table?

  • @Boss56641
    @Boss56641 2 years ago

    What if it fails after copying half of the files and we need to continue with the remaining files? How do we do that?

    • @TechBrothersIT
      @TechBrothersIT 2 years ago

      I would say save the completion status in some table and process only the files which have not completed.
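
      A minimal sketch of that idea, assuming a hypothetical control table dbo.FileLoadStatus (FileName, Status) pre-populated with the incoming file names, a hypothetical dataset ControlTableDataset, and an Azure SQL linked service: a Lookup drives the ForEach (items = @activity('GetPendingFiles').output.value) with only the not-yet-completed files, and a Script activity placed after the Copy inside the loop flags each file once it has loaded, so a retry picks up only the remainder.

```json
[
  {
    "name": "GetPendingFiles",
    "type": "Lookup",
    "description": "Return only the file names not yet marked Completed in the control table.",
    "typeProperties": {
      "source": {
        "type": "AzureSqlSource",
        "sqlReaderQuery": "SELECT FileName FROM dbo.FileLoadStatus WHERE Status <> 'Completed'"
      },
      "dataset": { "referenceName": "ControlTableDataset", "type": "DatasetReference" },
      "firstRowOnly": false
    }
  },
  {
    "name": "MarkFileCompleted",
    "type": "Script",
    "description": "Runs inside the ForEach after the Copy succeeds, so only files that actually loaded are flagged.",
    "linkedServiceName": { "referenceName": "AzureSqlDatabase", "type": "LinkedServiceReference" },
    "typeProperties": {
      "scripts": [
        {
          "type": "NonQuery",
          "text": {
            "value": "UPDATE dbo.FileLoadStatus SET Status = 'Completed' WHERE FileName = '@{item().FileName}'",
            "type": "Expression"
          }
        }
      ]
    }
  }
]
```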

  • @RameshPatel-fg9qk
    @RameshPatel-fg9qk 2 years ago

    Nice video.
    My source is a CSV file folder on an SFTP linked service, and my destination is a SQL DB.
    - A new file is uploaded to the SFTP source folder every day.
    - I want the latest modified file transferred from the SFTP folder to the SQL DB on a daily basis, as an automatic process.
    Please share a video on this type of scenario.
    I also have a problem with mapping: there are 172 columns in both source and destination.