12. How to perform Upsert for Incremental records using Azure Synapse Pipelines

  • Published 29 Dec 2024

COMMENTS • 37

  • @paul-xr4kj
    @paul-xr4kj 1 year ago +2

    Thank you very much for taking the time to share your knowledge and create these videos. I have learned a huge amount from them, and they have been essential in getting a tricky Synapse solution over the line for a customer.
    You should consider advertising and selling your professional services on the channel as well - you clearly know all of this inside out.

  • @DrumCodeLive
    @DrumCodeLive 1 year ago +1

    Excellent, Annu, very well presented. Thanks for investing your time and effort into this.

  • @Andydady-fp2zm
    @Andydady-fp2zm 2 years ago +1

    Superb! I was looking for the same scenario. This video helped me a lot. Thanks, keep posting. 👍

  • @chetak8377
    @chetak8377 1 year ago +1

    I've gone through all your videos; they are very well explained and detailed, and helped me answer scenario-based interview questions. Thanks, ma'am, for this content. Looking forward to more videos on this. 👏👏

    • @azurecontentannu6399
      @azurecontentannu6399  1 year ago

      Thank you so much for watching the videos and sharing feedback 😊 Glad to know it was helpful.

  • @AshishKumar-qk9rc
    @AshishKumar-qk9rc 2 years ago +1

    Excellent work 🙌

  • @jehok1007
    @jehok1007 2 years ago

    This topic is just what I needed. Thank you so much!

  • @gotimukulshriya1864
    @gotimukulshriya1864 3 months ago +2

    Hi, I have a scenario where a single table has multiple unique keys or composite keys. How do we match all the keys with the target table using the key columns setting?

    • @azurecontentannu6399
      @azurecontentannu6399  3 months ago

      @gotimukulshriya1864 Yes - store the keys as a comma-separated list (col1, col2, col3) in a SQL table, dynamically point to the column holding that key-column list, and use the createArray function: @createArray(keycol).
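
      For illustration, a minimal T-SQL sketch (with hypothetical table and column names) of what matching on a composite key means - the copy activity's upsert behaves much like a MERGE that matches on every key column:

          -- Hypothetical tables dbo.Target and dbo.Staging share the
          -- composite key (col1, col2, col3).
          MERGE dbo.Target AS t
          USING dbo.Staging AS s
              ON  t.col1 = s.col1   -- every key column must match
              AND t.col2 = s.col2
              AND t.col3 = s.col3
          WHEN MATCHED THEN
              UPDATE SET t.payload = s.payload
          WHEN NOT MATCHED BY TARGET THEN
              INSERT (col1, col2, col3, payload)
              VALUES (s.col1, s.col2, s.col3, s.payload);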

  • @gotimukulshriya1864
    @gotimukulshriya1864 3 months ago

    Hi, I also have a scenario where I have a few values of type decimal(38,18) in the source, which I am copying into ADLS as a Parquet file. In this case the column is converted to string. While copying this data to the dedicated SQL pool, I am facing an issue saying "cannot load string to decimal(38,18)". Do you have a workaround for this?

  • @Xavwar
    @Xavwar 1 year ago

    Great video, thanks. Do you happen to know if the upsert supports a uniqueidentifier column as a key column?

  • @moulshreesuhas504
    @moulshreesuhas504 7 months ago

    Can we do deletes as well?
    Also, can we use an on-premises SQL database as the source and Azure Synapse as the destination?

  • @eq2home
    @eq2home 2 years ago +1

    Well done. I have a question that you may have thought through. I have 800+ tables in one of my databases for a system dispersed geographically across various locations. The tables are organized around accounts, parties, events, etc. Of the 800+ tables, only several hundred have system-initiated create or update fields (timestampcreate and timestampchange). My Azure repository requires all tables to be present for reporting purposes. To derive virtual timestampcreate and timestampchange fields (rather than copying several million records each night), I've linked the tables in views to the nearest "tip of the spear" (for example, accounts) that does contain a timestampchange field. Do you think this is a strategy you would suggest? Long and short: when the tip changes, the rest of the tree also changes.
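
    For illustration, a minimal T-SQL sketch of that view strategy, with hypothetical table and column names - the child table borrows its "virtual" timestamps from the parent that actually carries them:

        -- dbo.accounts has timestampcreate/timestampchange; dbo.account_events does not.
        -- The view gives the child a virtual change timestamp borrowed from the
        -- parent, so it can participate in timestamp-based incremental loads.
        CREATE VIEW dbo.vw_account_events_incremental
        AS
        SELECT
            e.*,
            a.timestampcreate AS virtual_timestampcreate,
            a.timestampchange AS virtual_timestampchange
        FROM dbo.account_events AS e
        INNER JOIN dbo.accounts AS a
            ON a.account_id = e.account_id;   -- join to the "tip of the spear"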

  • @ashredsa
    @ashredsa 8 months ago

    How do I do an upsert in a data flow with 2 columns as the key?

  • @deepjyotimitra1340
    @deepjyotimitra1340 2 years ago +1

    Very good content, thank you.
    I have a scenario where my source data won't have any updates; only new records will arrive alongside the old data (with no changes to existing records) on every run. In this case, if I do an upsert based on the key column, all existing records will be updated (although there is nothing to update) and new records will be inserted (which is fine).
    How do I handle this scenario? I want to avoid the updates - only new data should be inserted and the rest should be ignored.

    • @monke3832
      @monke3832 2 years ago +1

      Go for an incremental load.
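
      For illustration, a minimal T-SQL sketch (hypothetical table and column names) of an insert-only incremental pattern - filter new rows with a watermark and insert only rows missing from the target, so existing records are never touched:

          -- Pull only rows newer than the last successful load.
          DECLARE @last_watermark datetime2 =
              (SELECT MAX(loaded_until)
               FROM dbo.etl_watermark
               WHERE table_name = 'src_orders');

          -- Insert only rows that are not already in the target.
          INSERT INTO dbo.tgt_orders (order_id, amount, created_at)
          SELECT s.order_id, s.amount, s.created_at
          FROM dbo.src_orders AS s
          WHERE s.created_at > @last_watermark
            AND NOT EXISTS (SELECT 1
                            FROM dbo.tgt_orders AS t
                            WHERE t.order_id = s.order_id);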

  • @avinashthakur4744
    @avinashthakur4744 2 years ago

    Great ✌️

  • @jhellier
    @jhellier 2 years ago

    I noticed the upsert still downloads all the rows, so there is no time saving. Can you confirm?

  • @lagendijkict
    @lagendijkict 7 months ago

    I had the exact same 'List`1' error: {
    "errorCode": "2200",
    "message": "ErrorCode=UserErrorUnexpectedObjectTypeInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of property 'keys' is in unexpected type 'List`1'.,Source=Microsoft.DataTransfer.DataContracts,'",
    "failureType": "UserError",
    "target": "Src to PreStg",
    "details": []
    }
    but when using @array the error stays the same. What could be the reason?
    I also have the key columns in a table that a lookup looks through.

  • @baronharkonnen7894
    @baronharkonnen7894 1 year ago +1

    Hi, ma'am. Is there a way we can integrate dbt incremental models with Azure Synapse?

  • @rathikavenkatesh2510
    @rathikavenkatesh2510 2 years ago +1

    Hi, could you please share the previous incremental-load video? I am not able to follow the initial steps.

    • @azurecontentannu6399
      @azurecontentannu6399  2 years ago

      Please watch the 11th video of the playlist. Thank you for watching.

    • @rathikavenkatesh2510
      @rathikavenkatesh2510 2 years ago +1

      Thanks. I have a few doubts - do you have an email I can contact? To perform an upsert, do we require a unique key column or a primary key column?

    • @azurecontentannu6399
      @azurecontentannu6399  1 year ago

      @rathikavenkatesh2510 Key column: it can be a single column or a combination of columns that makes the row unique.

  • @mohammedumar3684
    @mohammedumar3684 2 years ago

    I am getting the same error, which states: Azure Data Factory - SQL upsert key column '[column_name]' does not exist in the table '[dbo].[InterimTable_c...]'.
    In my framework I am using the TempDB feature; any assistance will be appreciated, thanks.

  • @sajinmp5955
    @sajinmp5955 2 years ago

    Hi, is there any way to update rows from an Azure SQL table into a PostgreSQL table using this concept?

  • @pamilad5473
    @pamilad5473 2 years ago +1

    Assume the source is a CSV file and the sink is Azure SQL, and the source data doesn't have any primary key column. In this case, how do we build the upsert logic?

    • @azurecontentannu6399
      @azurecontentannu6399  2 years ago

      Then you need to use the hashing concept: assign a hash value to each record and match it against the target dataset. If it matches, update; else, insert.
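
      One common reading of this, as a minimal T-SQL sketch with hypothetical table and column names: hash every staged row over all of its columns and use that hash as a synthetic key against the target - a matching hash means the row is already loaded unchanged, while new hashes are inserted:

          -- Stage the CSV into dbo.stg_sales first, then load by row hash.
          INSERT INTO dbo.tgt_sales (col1, col2, col3, row_hash)
          SELECT s.col1, s.col2, s.col3, s.row_hash
          FROM (
              SELECT col1, col2, col3,
                     HASHBYTES('SHA2_256',
                               CONCAT_WS('|', col1, col2, col3)) AS row_hash
              FROM dbo.stg_sales
          ) AS s
          WHERE NOT EXISTS (SELECT 1
                            FROM dbo.tgt_sales AS t
                            WHERE t.row_hash = s.row_hash);  -- hash match = already loaded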

  • @rathikavenkatesh2510
    @rathikavenkatesh2510 2 years ago

    I am trying to do an incremental insert and update. Please share the complete steps. Thanks in advance.