Are we going to get lateral flatten JSON ability during Snowpipe creation?
@mastering_snowflake I have a specific use case where every time there is a change in the raw table, I run a MERGE statement that merges on multiple columns. If a record from the SELECT on the raw table matches the values of all those columns in the target table (each combination of column values in the merge condition is unique), I update the record; otherwise I insert a new record. How is this possible with a dynamic table? In other words, how can I specify multiple columns as the merge condition in a dynamic table?
Why would you want to update the record if all column values match?
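For context, a minimal sketch of the two approaches, using hypothetical names (raw_events, target_events, key_1, key_2, payload, my_wh): a dynamic table has no explicit merge condition, so the usual pattern is to express the desired end state as a query, e.g. keeping the latest row per key combination with QUALIFY.

```sql
-- The multi-column MERGE pattern described in the question (hypothetical names).
MERGE INTO target_events t
USING (SELECT key_1, key_2, payload, updated_at FROM raw_events) s
  ON t.key_1 = s.key_1
 AND t.key_2 = s.key_2
WHEN MATCHED THEN
  UPDATE SET t.payload = s.payload, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
  INSERT (key_1, key_2, payload, updated_at)
  VALUES (s.key_1, s.key_2, s.payload, s.updated_at);

-- A dynamic table expressing the same end state declaratively:
-- keep the latest row per (key_1, key_2) combination.
CREATE OR REPLACE DYNAMIC TABLE target_events_dt
  TARGET_LAG = '5 minutes'
  WAREHOUSE  = my_wh
AS
SELECT key_1, key_2, payload, updated_at
FROM raw_events
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY key_1, key_2
  ORDER BY updated_at DESC
) = 1;
```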
Would it be a good use case for Snowpipe Streaming if my source is an on-premise Oracle database but I want low-latency data ingestion into Snowflake tables? Or do you recommend going the traditional way of using Snowpipe and queues in Azure?
Traditional way for me
Would it be possible to share tables between different databases with a dynamic table? I usually create a copy of a table in a different database using a view, but it's quite painful when the schema changes.
Cloning?
@roopad8742 Cloning creates a copy of a table at that instant; if the original table changes, the clone doesn't update.
@cristinaperez5757 Yes, but adding a step to clone, rather than copying using a view, is much more efficient, isn't it?
@roopad8742 OK, thanks for replying!
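A minimal sketch of the cross-database idea, assuming hypothetical names (analytics_db, source_db, my_wh): the dynamic table lives in one database and selects from a fully qualified table in another, refreshing automatically on its target lag.

```sql
-- Dynamic table in analytics_db that mirrors a table in source_db (hypothetical names).
CREATE OR REPLACE DYNAMIC TABLE analytics_db.public.orders_copy
  TARGET_LAG = '1 hour'
  WAREHOUSE  = my_wh
AS
SELECT *
FROM source_db.public.orders;
```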
Thanks for sharing. When you create a dynamic table on top of a query, is it going to execute against all the records in the source tables every time, or does it ingest only new records into the dynamic table when it is refreshed?
This article should help clarify things: medium.com/snowflake/slowly-changing-dimensions-with-dynamic-tables-d0d76582ff31
Thank you for sharing, it is very useful
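A minimal sketch of requesting incremental refresh and then checking how each refresh actually ran; the table, warehouse, and history column names here are hypothetical or taken from the docs, so verify them against your account.

```sql
-- Hypothetical dynamic table that requests incremental refresh.
-- Snowflake may fall back to a full refresh if the query isn't eligible.
CREATE OR REPLACE DYNAMIC TABLE orders_dt
  TARGET_LAG   = '10 minutes'
  WAREHOUSE    = my_wh
  REFRESH_MODE = INCREMENTAL
AS
SELECT order_id, status, updated_at
FROM raw_orders;

-- Inspect refresh history to see whether refreshes were incremental or full.
SELECT name, state, refresh_action, refresh_start_time
FROM TABLE(INFORMATION_SCHEMA.DYNAMIC_TABLE_REFRESH_HISTORY())
WHERE name = 'ORDERS_DT';
```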
Thanks, very useful. Does Snowpipe Streaming work with Kafka only, via the Kafka connector, or can it similarly integrate with Google Pub/Sub using a comparable connector?
I believe it’s only Kafka at this stage
Hmmm, can the dynamic table query or pull only the data that hasn't been pulled since last time?
Yes, it uses change tracking for this.
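A minimal sketch of what that looks like, with a hypothetical source table raw_orders: change tracking is enabled on the base table (dynamic tables normally turn this on automatically when they have the privileges), and the same metadata can be queried directly with a CHANGES clause.

```sql
-- Enable change tracking explicitly on a hypothetical base table.
ALTER TABLE raw_orders SET CHANGE_TRACKING = TRUE;

-- The change-tracking metadata can also be queried directly with a CHANGES clause,
-- which returns only rows changed since the given point in time.
SELECT *
FROM raw_orders
  CHANGES (INFORMATION => DEFAULT)
  AT (OFFSET => -600);   -- changes over roughly the last 10 minutes
```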