Azure Data Factory - Partition a large table and create files in ADLS using copy activity

  • Published 5 Oct 2024

COMMENTS • 9

  • @payalkalantri7525 · 1 year ago

    Just a small tip: for all kinds of I/O errors and connection timeout errors, you can set CONNECTIONTIMEOUT in your DB connection string. This works for ADF.
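
    A connection timeout can be set directly in the connection string of the Azure SQL linked service. A minimal sketch (the linked service name, server, and database are placeholders):

    ```json
    {
      "name": "AzureSqlLinkedService",
      "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
          "connectionString": "Server=tcp:<your-server>.database.windows.net,1433;Database=<your-db>;Connection Timeout=120;"
        }
      }
    }
    ```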

  • @Thegameplay2 · 1 year ago · +1

    When talking about MPP, it is not recommended to have more than 60 partitions.
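
    For context, the partitioned read shown in the video is configured on the copy activity's SQL source. A sketch of a dynamic-range setup, with a placeholder partition column and bounds:

    ```json
    "source": {
      "type": "AzureSqlSource",
      "partitionOption": "DynamicRange",
      "partitionSettings": {
        "partitionColumnName": "ProductID",
        "partitionLowerBound": "1",
        "partitionUpperBound": "1000"
      }
    }
    ```

    Note that the number of concurrent reads is capped by the copy activity's parallel copies setting, not by the partition count itself.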

  • @scarabic5937 · 1 year ago

    Thank you for your awesome tutorials. Could you please provide the sink parameters, as they are not shown in the video?

  • @Kishyist · 1 year ago · +1

    How do we get the filename prefix maintained in the sink? Like filename1_ProductID_NA_760_0.txt. Also, in your example, how did you achieve sql_SalesLT_Product_ProductID_NA_760_0.txt? Can you tell me about the sink details you provided?

  • @mohitarora792 · 1 year ago · +1

    What settings did you use in the sink to do multiple writes?
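
    Regarding the sink questions above: with a DelimitedText sink dataset that has a folder path but no file name, the copy activity writes one file per parallel read and generates the names itself (hence suffixes such as _0.txt); as far as I can tell, the exact prefix format is chosen by the service rather than by a sink setting. A sketch of such a dataset, with placeholder linked service, container, and folder names:

    ```json
    {
      "name": "AdlsSinkDataset",
      "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
          "referenceName": "AdlsGen2LinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "location": {
            "type": "AzureBlobFSLocation",
            "fileSystem": "output",
            "folderPath": "sales"
          }
        }
      }
    }
    ```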

  • @payalkalantri7525 · 1 year ago

    Very nice explanation; however, partitions are not working for the I/O exception errors in ADF.

    • @AllAboutBI · 1 year ago

      Sorry Payal, I'm not sure I understand what you mean 🫣 Could you please share some details about the I/O error you are referring to?

  • @nr3807 · 2 years ago

    Can you please make a video for a source system like DB2, where these options don't show up in the copy activity? If we have a very big table, what would be a better approach for such a scenario?
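
    One common workaround when a connector exposes no partition options is to build the ranges manually: a Lookup activity computes the key's min/max, the pipeline splits that into ranges, and a ForEach runs the copy activity with a parameterized range query. A sketch of such a source query inside the ForEach (the table, column, and the lo/hi item fields are placeholders):

    ```json
    "source": {
      "type": "Db2Source",
      "query": "SELECT * FROM SALES.ORDERS WHERE ORDER_ID BETWEEN @{item().lo} AND @{item().hi}"
    }
    ```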

  • @விரேவதி · 2 years ago

    Nice, ma'am.