How to Load Multiple CSV Files to Multiple Tables According to File Name in Azure Data Factory

  • Published 7 Jan 2025

COMMENTS • 40

  • @mairaacevedo8960
    @mairaacevedo8960 2 years ago +4

    Fantastic content. I combined some of your videos to do a bit of a complex task and I'm so happy it worked! Thanks heaps!

  • @jayong2370
    @jayong2370 2 years ago +1

    Thank you! This is exactly what I was trying to configure.

  • @bitips
    @bitips 2 years ago +1

    Thanks for sharing this knowledge. It is fantastic!

  • @piraviperumal2544
    @piraviperumal2544 2 years ago +4

    Hi Brother,
    Not sure whether the Azure team fixed it or not, but @replace(item().name,'.txt','') is working fine. I guess you missed the @ sign before the replace function in your attempt.
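
    A minimal sketch of where that expression goes, assuming the usual Get Metadata → ForEach → Copy pattern and a sink dataset with a table-name parameter (the parameter itself is an assumption): set the parameter's dynamic content to

        @replace(item().name, '.txt', '')

    or, to prepend a schema, use string interpolation such as

        @{concat('dbo.', replace(item().name, '.csv', ''))}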

  • @marcuslawson-composer2892
    @marcuslawson-composer2892 2 years ago +1

    Very helpful video. Thank you!

  • @neftalimich
    @neftalimich 3 years ago +1

    Thank you very much, was really helpful.

  • @maartencastsbroad
    @maartencastsbroad 2 years ago

    Great video, exactly what I needed!

  • @williamtenhoven8405
    @williamtenhoven8405 1 year ago

    Hi, thanks for this! One question: suppose I wanted to convert the CSV files to Parquet files, how would I proceed? I used the concat/replace approach, but looking at the target Parquet files they seem to be corrupted: "The file 'Emp1' may not render correctly as it contains an unrecognized extension." @concat(replace(item().name,'csv','parquet')) does not work either. Any suggestions? Thanks
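
    A hedged note on the Parquet question above: replace() only changes the output file name; the actual file format comes from the sink dataset type, so the sink needs to be a Parquet dataset rather than a renamed delimited-text one. A minimal sketch, assuming the Parquet sink dataset exposes a file-name parameter (the parameter itself is an assumption): set it to

        @replace(item().name, '.csv', '.parquet')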

  • @gebakjes1099
    @gebakjes1099 3 years ago +1

    Thanks! Really helpful!

  • @TheMadPiggy
    @TheMadPiggy 3 years ago +3

    Works like a charm; however, the auto-created tables are all nvarchar(MAX). Not the best for database size or usability. Any way around this?

    • @TechBrothersIT
      @TechBrothersIT 3 years ago

      I noticed that too; the data type is nvarchar(MAX). You might want to create final tables with the correct data types and a stored procedure that loads the data from these staging tables into your destination tables once the load completes. If you have already created the tables with the correct data types, then you will be fine too.
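
      A minimal T-SQL sketch of the staging-to-final pattern described in this reply; the names dbo.stg_Emp (the auto-created, all-nvarchar(MAX) staging table), dbo.Emp (the typed final table) and the columns are assumptions for illustration:

          CREATE PROCEDURE dbo.usp_Load_Emp
          AS
          BEGIN
              -- refresh the typed destination table from the staging copy
              TRUNCATE TABLE dbo.Emp;
              INSERT INTO dbo.Emp (EmpId, EmpName, HireDate)
              SELECT CAST(EmpId AS INT),
                     EmpName,
                     CAST(HireDate AS DATE)
              FROM dbo.stg_Emp;
          END;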

  • @purushothamnaidu5544
    @purushothamnaidu5544 3 years ago +4

    Sir, can you show how to load the files available in a blob container into multiple existing tables in an Azure SQL database? That would be really helpful to me.

    • @ambatiprasanth4292
      @ambatiprasanth4292 1 year ago

      Brother, I was looking for the same... Did you figure out how to do it?

  • @vishal-xf6ev
    @vishal-xf6ev 3 years ago +1

    Hi Brother,
    Great video & thanks for sharing :-)

  • @harshanakularatna
    @harshanakularatna 3 years ago +1

    You are awesome. Keep it up!

  • @tomasoon
    @tomasoon 1 year ago

    Great tutorial! I have a question: if I run the pipeline and there's a new CSV file in the container with the same schema as another, will this method append the data to the table with the same schema, or will it create another one?

  • @devops840
    @devops840 2 years ago +1

    Hi Sir,
    I am able to insert the data using dynamic CSV files. Could you please help me with upserting the data?

  • @kiranreddy9103
    @kiranreddy9103 2 years ago

    Hi, if the file names are like emp1, emp2, emp3, etc., how can we write an expression to remove the numbers in REPLACE? Could you help us?
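
    Pipeline expressions have no regex replace (Mapping Data Flows do, via regexReplace), so one hedged workaround, assuming the base table name itself contains no digits, is to chain plain replace() calls, one per digit, and strip the extension as well:

        @replace(replace(replace(replace(item().name, '1', ''), '2', ''), '3', ''), '.csv', '')

    extended with the remaining digits 0-9 as needed.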

  • @boatengappiah2116
    @boatengappiah2116 3 years ago +1

    Great videos. However, I don't see any video on SharePoint with ADF. Do you have one, or can you make one? Thank you

    • @TechBrothersIT
      @TechBrothersIT 3 years ago

      Hoping to have one soon. Working on many videos and scenarios. Thanks for the feedback.

  • @insane2093
    @insane2093 9 months ago

    Small query, sir: once the table is created, if new data or new files arrive with a changed suffix (like a date change), will it create a new table again or insert the data into the already created table, since you are using the auto-create option? Thank you in advance.

  • @Eraldi2323
    @Eraldi2323 2 years ago +1

    Hi TechBrothers, thanks for this very useful video.
    I had a question: I am trying to truncate the tables with the following:
    @{concat('truncate table',item().name)} but it is not working for me and gives an error.
    Please advise.
    Thank you

    • @niranjanchinnu8295
      @niranjanchinnu8295 2 years ago

      I tried this today as well. My implementation idea is to truncate and insert into the tables. For that I truncated the table with TRUNCATE TABLE [SCHEMA_NAME].@{item().name}. After this step, if the table already exists, it gets truncated. Otherwise, try pointing a fail-output line to the same block that your success output points to. That way, if the table doesn't exist, execution goes down the fail path and runs it; if it is present, it gets truncated and gives you the appropriate results.
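
      A hedged correction of both attempts above, assuming .csv files and the dbo schema: the truncate statement can be built with string interpolation such as

          @{concat('TRUNCATE TABLE dbo.', replace(item().name, '.csv', ''))}

      The original attempt likely fails because there is no space after 'truncate table' and because item().name still carries the file extension.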

  • @Deezubwun
    @Deezubwun 2 years ago

    Hi. This was a great help to me. One issue I am having is that the data is failing to load due to multiple data-type errors (such as String to DATETIME). As the data in the CSV is exported as strings, do you have a way of mapping the format of each problem field, bearing in mind the columns may be named differently?

  • @ayushikankane530
    @ayushikankane530 6 months ago

    If the CSV file has some columns with a JSON structure, then how do we proceed?

  • @SRINIVASP-fx5kz
    @SRINIVASP-fx5kz 2 years ago

    Excellent video, super!

  • @viswaanand4578
    @viswaanand4578 2 years ago

    Hi,
    I can see my CSV files in SSMS but cannot see them in table format in SSMS; the data is still in CSV format. Did I miss anything?

  • @uditbhargava8762
    @uditbhargava8762 2 years ago

    Sir, can we use the split() function to remove .txt?
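
    A minimal sketch suggesting it should work, assuming the file names contain only the one dot before the extension:

        @first(split(item().name, '.'))

    which returns the part before the first '.', e.g. 'Emp1' from 'Emp1.txt'.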

  • @thyagaraj1124
    @thyagaraj1124 2 years ago

    Is it possible to load different source files into existing tables in SQL Server, meaning the source file names do not match the existing table names?

    • @TechBrothersIT
      @TechBrothersIT 2 years ago +1

      Hi, yes, that is possible, but you have to provide some kind of source-to-destination mapping. If the file names are different, you can group them on the source side and the destination table can stay the same.
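
      A minimal sketch of such a grouping, assuming a pipeline array parameter named FileTableMap (the parameter name and values are assumptions):

          [ { "file": "hr_export.csv",  "table": "Employee" },
            { "file": "sales_dump.csv", "table": "Sales" } ]

      Point the ForEach items at @pipeline().parameters.FileTableMap, then use @item().file for the source file name and @item().table for the sink table name inside the Copy activity.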

  • @rohitsethi5696
    @rohitsethi5696 1 year ago

    Hi, I'm Rohit. Can we use the Copy Data activity for CSV files? If not, why not?

  • @vijaysagar5984
    @vijaysagar5984 2 years ago

    Hi Bro,
    Any workaround for CSV files which have multiple headers, so we can merge them into one header? The source is FTP; some files are good and some have multiple headers.

    • @TechBrothersIT
      @TechBrothersIT 2 years ago

      One way could be to load the data without header information into a staging table, then remove the bad header rows and only use the clean data.
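
      A minimal T-SQL sketch of the clean-up step described here; the staging table dbo.stg_File, the column Column1 and the header literal 'EmpId' are all assumptions for illustration:

          -- remove the extra header rows that were loaded as data
          DELETE FROM dbo.stg_File
          WHERE Column1 = 'EmpId';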

  • @niteshsoni2282
    @niteshsoni2282 2 years ago +1

    THANKS SIR

  • @kirubababu7127
    @kirubababu7127 3 years ago

    How do we do this with an HTTP server?