2. Get File Names from Source Folder Dynamically in Azure Data Factory

  • Published 14 Oct 2024
  • In this video, I discuss how to get file names dynamically from the source folder in Azure Data Factory.
    Link for Azure Functions playlist:
    • 1. Introduction to Azu...
    Link for Azure Basics playlist:
    • 1. What is Azure and C...
    Link for Azure Data Factory playlist:
    • 1. Introduction to Azu...
    #Azure #ADF #AzureDataFactory

COMMENTS • 83

  • @debasisroy9625
    @debasisroy9625 4 years ago +4

    Excellent content! Thanks, mate, for taking the time. Could you please do a series on Data Factory DevOps integration, building a CI/CD pipeline using library variables?

  • @GunNut37086
    @GunNut37086 2 years ago +1

    Expertly done! You explained this perfectly for me. Thank you for sharing your expertise.

  • @siddheshamrutkar8684
    @siddheshamrutkar8684 2 years ago +1

    What content, boss! Really very impressive. May I know which videos I should refer to to get started with Azure Cloud, as I am relatively new to this? I know MSBI and want to upgrade myself to Azure Cloud. Kindly suggest. Your content is awesome. Hats off to you! 🤟👏

    • @WafaStudies
      @WafaStudies  2 years ago

      You can start with Azure Data Factory and Azure Synapse Analytics.

  • @rbor-xb5eg
    @rbor-xb5eg 11 months ago +1

    Thank you sir, you're a legend. Respect from Brazil!

  • @venukumar1094
    @venukumar1094 3 years ago +2

    Hi Maheer, thanks for the detailed explanation. For this topic the scenario should be "Read Files from Source Folder Dynamically in Azure Data Factory" instead of "Get File Names". We are not reading/getting "file names", right? The files are just being copied from source to sink.

    • @tallaravikumar4560
      @tallaravikumar4560 1 year ago

      First he is getting the file names, and then using those file names he is copying the files.
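
A minimal sketch of the pattern this thread describes, under assumed placeholder names (Get File Names, SourceFolderDataset, SourceFileDataset, SinkFolderDataset are illustrative, not necessarily what the video uses): a Get Metadata activity requests the childItems field, and a ForEach iterates over its output, passing each @item().name into a Copy activity through a parameterized dataset.

```json
{
  "activities": [
    {
      "name": "Get File Names",
      "type": "GetMetadata",
      "typeProperties": {
        "dataset": { "referenceName": "SourceFolderDataset", "type": "DatasetReference" },
        "fieldList": [ "childItems" ]
      }
    },
    {
      "name": "For Each File",
      "type": "ForEach",
      "dependsOn": [ { "activity": "Get File Names", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@activity('Get File Names').output.childItems", "type": "Expression" },
        "activities": [
          {
            "name": "Copy File",
            "type": "Copy",
            "inputs": [
              {
                "referenceName": "SourceFileDataset",
                "type": "DatasetReference",
                "parameters": { "fileName": "@item().name" }
              }
            ],
            "outputs": [ { "referenceName": "SinkFolderDataset", "type": "DatasetReference" } ],
            "typeProperties": {
              "source": { "type": "DelimitedTextSource" },
              "sink": { "type": "DelimitedTextSink" }
            }
          }
        ]
      }
    }
  ]
}
```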

  • @damayantibhuyan7147
    @damayantibhuyan7147 4 years ago +2

    Nice videos, clear steps. Please keep uploading. Thank you!

  • @deepaksahirwar
    @deepaksahirwar 2 years ago +1

    Thank you for the great explanation. Could we expect learning videos on Azure Synapse Analytics?

    • @WafaStudies
      @WafaStudies  2 years ago +1

      I created a playlist on Synapse Analytics. Kindly check it. Link below.
      ua-cam.com/play/PLMWaZteqtEaIZxPCw_0AO1GsqESq3hZc6.html

    • @deepaksahirwar
      @deepaksahirwar 2 years ago +1

      @@WafaStudies Tons of thanks, dear sir. Very helpful!

    • @WafaStudies
      @WafaStudies  2 years ago +1

      @@deepaksahirwar You're welcome 😁

  • @ArabaEfsanesi
    @ArabaEfsanesi 3 years ago

    Thanks for the video, first of all. My question is: what should I do if I want to copy only specific file types from my input folder (e.g. just CSV files) in this ForEach loop example?
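
One common answer (a sketch, not from the video; activity names are placeholders): put a Filter activity between Get Metadata and the ForEach so that only .csv entries are iterated.

```json
{
  "name": "Filter CSV Only",
  "type": "Filter",
  "dependsOn": [ { "activity": "Get File Names", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "items": { "value": "@activity('Get File Names').output.childItems", "type": "Expression" },
    "condition": { "value": "@endswith(item().name, '.csv')", "type": "Expression" }
  }
}
```

The ForEach would then iterate @activity('Filter CSV Only').output.value instead of the raw childItems.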

  • @pradeepert
    @pradeepert 2 years ago

    Can we pass the file path dynamically? I have a SQL table from which I can take the file path. This file path needs to be passed to Get Metadata to list the files.
    Looking for your help. Thank you so much!
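
A sketch of one way to wire this up, assuming a Lookup activity named 'Lookup File Path' that returns a single row with a FilePath column (both hypothetical names), and a dataset exposing a folderPath parameter:

```json
{
  "name": "Get Files In Path",
  "type": "GetMetadata",
  "dependsOn": [ { "activity": "Lookup File Path", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "dataset": {
      "referenceName": "ParameterizedFolderDataset",
      "type": "DatasetReference",
      "parameters": { "folderPath": "@activity('Lookup File Path').output.firstRow.FilePath" }
    },
    "fieldList": [ "childItems" ]
  }
}
```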

  • @subhashkomy
    @subhashkomy 3 years ago +2

    Can you please create a video on how to load multiple Excel files into SQL Server using Data Flows, and please also use data conversion? It doesn't seem to be as easy as it is in SSIS.

  • @rajeshgowd
    @rajeshgowd 1 year ago

    Hello Maheer,
    could you please make a video about copying Excel files? If we implement it like the above video, for Excel files it asks for the sheet name.
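
For reference, the Excel dataset in ADF does require a sheet name (or index). A sketch of parameterizing it so the sheet can be supplied at runtime (dataset, linked-service, container, and file names are all placeholders):

```json
{
  "name": "ExcelSourceDataset",
  "properties": {
    "type": "Excel",
    "linkedServiceName": { "referenceName": "BlobStorageLinkedService", "type": "LinkedServiceReference" },
    "parameters": { "sheetName": { "type": "string" } },
    "typeProperties": {
      "location": { "type": "AzureBlobStorageLocation", "container": "input", "fileName": "report.xlsx" },
      "sheetName": { "value": "@dataset().sheetName", "type": "Expression" },
      "firstRowAsHeader": true
    }
  }
}
```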

  • @rakeshupadhyay1
    @rakeshupadhyay1 2 years ago +1

    Very good content; the practical scenarios are helpful.

  • @mathangiananth6599
    @mathangiananth6599 3 years ago +1

    Hi, thanks for uploading amazing content. I have one question here: what if we have different file types in the source blob container, like .txt, .csv, Parquet, and ORC files, and I want to copy each of these to a different path? This was asked in one of my interviews. Can you suggest what can be done here? TIA

    • @sumitbarde3677
      @sumitbarde3677 2 years ago

      In this scenario you can use a Lookup activity. Create a config file that stores all the dynamic parameters you want to pass to the pipeline; after the Lookup, use a ForEach and a Copy activity. Sorted.
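
To illustrate that reply, a sketch of what such a config file might contain (all values hypothetical). A Lookup with "First row only" disabled reads it, the ForEach iterates @activity('Lookup Config').output.value, and the Copy activity uses @item().sourcePath and @item().sinkPath.

```json
[
  { "fileType": "txt",     "sourcePath": "input/txt",     "sinkPath": "output/txt" },
  { "fileType": "csv",     "sourcePath": "input/csv",     "sinkPath": "output/csv" },
  { "fileType": "parquet", "sourcePath": "input/parquet", "sinkPath": "output/parquet" }
]
```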

  • @esrasultan8963
    @esrasultan8963 1 year ago

    Hi Maheer, do you have a video where we copy a CSV file from a dynamic folder in ADLS to a new folder in ADLS and store it as Parquet?

  • @aruntejvadthya1309
    @aruntejvadthya1309 4 years ago +1

    Very good explanation 👌. It's very helpful.

  • @viveknimmagadda2397
    @viveknimmagadda2397 1 year ago

    At 4:27 in the video, we can see the recursive property; however, that's not the case for me. The software has been updated and the video might be outdated. Can you please help me with this, as I cannot find the recursive property?

  • @empowerpeoplebytech2347
    @empowerpeoplebytech2347 3 years ago

    Good one, a very helpful and practical scenario. You made it exactly as needed!

  • @anujgupta-lc1md
    @anujgupta-lc1md 4 years ago +1

    Amazing! Please add some videos on incremental data load handling, and on how to check whether files are present in blob storage or not using the Validation or Get Metadata activity.

    • @WafaStudies
      @WafaStudies  4 years ago

      Sure, will add videos on those scenarios. Thank you!
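
On the existence check asked about above, a minimal sketch using the Get Metadata "exists" field (activity and dataset names are placeholders); an If Condition can then branch on @activity('Check File Exists').output.exists.

```json
{
  "name": "Check File Exists",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": { "referenceName": "SourceFileDataset", "type": "DatasetReference" },
    "fieldList": [ "exists" ]
  }
}
```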

  • @gubscdatascience1805
    @gubscdatascience1805 2 years ago +1

    Very helpful, sir!

  • @sandeep5996
    @sandeep5996 4 years ago +1

    Nice work!
    Could you please make a video on how to check for 0 KB CSV files / zero-row records in the source, and how to trigger an email in Azure Data Factory if the source has a 0 KB file / zero records?
    Thanks in advance.

  • @sirisiri3797
    @sirisiri3797 2 years ago +1

    Very useful videos. Please make Databricks videos also.

    • @WafaStudies
      @WafaStudies  2 years ago

      Hi, I am already doing a Databricks playlist. Kindly check it. Currently that playlist is in progress.

    • @sirisiri3797
      @sirisiri3797 2 years ago

      If possible, can you make them in Telugu also?

  • @YanadireddyBonamukkala
    @YanadireddyBonamukkala 4 years ago

    If possible, try a video on creating global parameters and passing the values dynamically with different databases.

  • @mobinbagwan5747
    @mobinbagwan5747 3 years ago +1

    Can I use items().name inside an activity that is inside an If Condition activity, which is itself inside a ForEach activity? I hope my question makes sense.

    • @WafaStudies
      @WafaStudies  3 years ago +1

      You can use it if that expression is within the ForEach.
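
A sketch of that nesting (names are placeholders; note the function is item(), not items()): the If Condition's expression and the activities inside it can both reference the enclosing ForEach's current item.

```json
{
  "name": "If Is CSV",
  "type": "IfCondition",
  "typeProperties": {
    "expression": { "value": "@endswith(item().name, '.csv')", "type": "Expression" },
    "ifTrueActivities": [
      {
        "name": "Copy CSV File",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "SourceFileDataset",
            "type": "DatasetReference",
            "parameters": { "fileName": "@item().name" }
          }
        ],
        "outputs": [ { "referenceName": "SinkFolderDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```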

  • @tipums123
    @tipums123 4 years ago +1

    Very good explanation. Thank you for this video!

  • @manognadamacharla3346
    @manognadamacharla3346 3 years ago

    Hello, thanks for your wonderful videos. Can you please give an idea of how to push all folders, along with their files, to the data lake in one go?

  • @yusufuthman8571
    @yusufuthman8571 4 years ago +1

    Hi, this tutorial is really helpful, but I find it difficult to point the files to a database instead of the same directory as the source file, as is the case in most real-life scenarios. Please help with how to get these files into a SQL Server database.

  • @itech7313
    @itech7313 1 year ago

    Hi, please tell us how we import multiple files from different sources in ADF (this is an interview question).

  • @princyshinas8740
    @princyshinas8740 2 years ago

    Hi... I could not see the recursive option in Synapse Analytics. Can you suggest some ways to get subfolders?

  • @sid0000009
    @sid0000009 4 years ago

    The dataset which you use to get the file names is not the same as the one you select inside the Copy activity. Ideally both should be the same?

  • @kunalratnaparkhi828
    @kunalratnaparkhi828 4 years ago

    Would it be possible to automatically create a destination folder structure that is the same as the source folder structure while uploading files to a data lake using Data Factory? The destination folder structure should be created automatically.
    Example -
    I have 3 files in the "C:\Test\Upload" folder. This is on-premises.
    Now I want to upload those files to the data lake using Data Factory, and the destination folder structure should be "C:\Test\Upload", which should be created automatically.
    Please advise.

  • @mohangonnabathula2261
    @mohangonnabathula2261 2 years ago

    I can't find the "recursively" option anymore in the Get Metadata activity. Can you please let me know how to get all the files recursively in the Get Metadata activity? Thanks.

  • @lib133
    @lib133 2 months ago

    What if one wants to do a similar thing but with .txt or .sql files (stored in an ADLS Gen2 container)?

  • @MarceloSilva-us1gh
    @MarceloSilva-us1gh 3 years ago +1

    Amazing! Thank you my friend!

  • @rakeshmishra4650
    @rakeshmishra4650 3 years ago

    Nice video, but if I'd like to store/capture the file name in a SQL table column, how can we do that? Could you please cover that here? That means loading the three files into a SQL table with the respective file name in a SQL table column. Thanks.
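
One way to get the file name into a SQL column (a sketch; this relies on the Copy activity's additionalColumns setting and the reserved $$FILEPATH value, with a placeholder column name): the extra source column can then be mapped to a column in the SQL sink.

```json
"source": {
  "type": "DelimitedTextSource",
  "additionalColumns": [
    { "name": "SourceFileName", "value": "$$FILEPATH" }
  ]
}
```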

  • @abnavalagatti
    @abnavalagatti 3 years ago

    Very good explanation

  • @sabinsunny655
    @sabinsunny655 3 years ago

    Good content, really helpful, clear steps. Thanks for the video!

  • @SatyaPParida
    @SatyaPParida 4 years ago +1

    Wonderful content ✌️. Really helpful.

  • @sureshch8328
    @sureshch8328 3 years ago

    Hi,
    I am using a pkg parameter StartDate, but it was not picking up the parameter due to the UTC time zone. How can we change this dynamically?
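
If the problem is converting UTC to a local time zone, ADF's expression language has convertFromUtc (and convertTimeZone). A sketch as it might appear in the parameters block of an Execute Pipeline activity (the parameter name StartDate and the target time zone are assumptions):

```json
"StartDate": {
  "value": "@convertFromUtc(utcnow(), 'India Standard Time')",
  "type": "Expression"
}
```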

  • @vijaysagar5984
    @vijaysagar5984 2 years ago

    Hi bro,
    Is there any workaround for CSV files which have multiple headers, so we can merge them into one header? The source is FTP, and some files are good while some files have multiple headers.

  • @bhavnakamble975
    @bhavnakamble975 3 years ago +1

    Amazing and really helpful!

  • @vinayraghuwanshi9419
    @vinayraghuwanshi9419 4 years ago

    In your Get Metadata activity there is a "recursively" option, but in mine I did not see this option.

  • @anupgupta5781
    @anupgupta5781 3 years ago

    If I want to copy different files into different output subfolders, how could I do it?

  • @fannymaryjane
    @fannymaryjane 3 years ago

    Can I use this with a configured wildcard?
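
For reference, the Copy activity source does support wildcards as an alternative to the Get Metadata + ForEach pattern; a minimal sketch (container, folder, and pattern are placeholders):

```json
"source": {
  "type": "DelimitedTextSource",
  "storeSettings": {
    "type": "AzureBlobStorageReadSettings",
    "recursive": true,
    "wildcardFolderPath": "input",
    "wildcardFileName": "*.csv"
  }
}
```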

  • @krishnakuraku6853
    @krishnakuraku6853 4 years ago +1

    How can we execute Data Flow tasks in parallel in ADF? Any ideas, please?

    • @WafaStudies
      @WafaStudies  4 years ago +1

      To execute data flows in ADF you need to use the Data Flow activity. You may have your data flow in a ForEach and execute it over multiple iterations in parallel (see the sketch after this thread).

    • @krishnakuraku6853
      @krishnakuraku6853 4 years ago

      @@WafaStudies Thanks, bro!

    • @krishnakuraku6853
      @krishnakuraku6853 4 years ago

      If the schema varies, how does the ForEach detect the metadata of different tables or files and load the data into different tables or files?

    • @WafaStudies
      @WafaStudies  4 years ago

      @@krishnakuraku6853 You need to handle that. The "Allow schema drift" option lets you handle it. Please check my video on the schema drift concept in data flows.
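
As referenced above, a sketch of the ForEach settings that control parallel data flow execution (activity, data flow, and lookup names are placeholders, and the batchCount value is arbitrary): isSequential false lets iterations run concurrently, up to batchCount at a time.

```json
{
  "name": "Run Data Flows In Parallel",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "batchCount": 4,
    "items": { "value": "@activity('Lookup Config').output.value", "type": "Expression" },
    "activities": [
      {
        "name": "Run Data Flow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
          "dataFlow": { "referenceName": "MyDataFlow", "type": "DataFlowReference" }
        }
      }
    ]
  }
}
```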

  • @YanadireddyBonamukkala
    @YanadireddyBonamukkala 4 years ago +1

    Is it possible to change the linked service dynamically at trigger level? If yes, please share your video here.

    • @WafaStudies
      @WafaStudies  4 years ago +1

      ua-cam.com/video/M22Mj0rcBcs/v-deo.html

  • @itech7313
    @itech7313 1 year ago

    Hi, please tell us how we import data from different sources for multiple files (this is an interview question).

  • @ravisamal3533
    @ravisamal3533 4 years ago

    If I want to copy multiple files from Blob to a SQL DB, how can I do so?

  • @Kannan-lt1ud
    @Kannan-lt1ud 3 years ago

    Thank you, this helped me...

  • @bandarurohithkumar439
    @bandarurohithkumar439 2 years ago

    How to contact you?

  • @stangoodvibes
    @stangoodvibes 1 year ago

    Another video that shows the same thing: getting files from a single folder level. Too easy. How about getting all the files from a nested folder structure where the actual files may be n levels down (n is unknown)?

  • @srinivasu5984
    @srinivasu5984 3 years ago +1

    Nice.

  • @ayubshaik2415
    @ayubshaik2415 3 years ago

    Hi.......