I have one Lakehouse and one Warehouse in the same workspace. I read tables from the Lakehouse and load them into the Warehouse with a pipeline, using the auto-create table option at the destination. But when I run that pipeline again, it adds duplicate data to the same table. Is there any way to solve this issue?
Thanks for your comment. Yes, this is a common issue people run into with the Fabric DWH: your data gets appended instead of replaced. When setting the options in the "Connect to data destination" window, choose Overwrite instead of Append. The option should be just below the Destination table name field (its exact placement may have changed with recent Fabric updates, but it should be somewhere in that vicinity).
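If you prefer to handle this on the SQL side instead of the copy-activity option, another common pattern is clearing the destination table before each load (for example via a pre-copy script, if your copy activity exposes one). This is just a sketch; the table name dbo.MyTable is a placeholder for whatever table your pipeline auto-created:

```sql
-- Hypothetical pre-copy cleanup: empty the destination table before each run
-- so re-running the pipeline does not duplicate rows.
-- dbo.MyTable is a placeholder; substitute your own destination table name.
TRUNCATE TABLE dbo.MyTable;
```

Either way, the effect is the same as Overwrite: each run replaces the table contents rather than appending to them.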
excellent
Thanks for the tutorial. You have provided the SQL scripts for creating the tables etc. How about the SQL Server set up for use in the data pipeline?
Thank you🙏
Welcome! Glad you liked the video.