Amazing demo Paul!!
Hi Paul, amazing demo, many thanks! Can we use Iceberg tables with Azure Data Lake Storage Gen2 instead of S3, i.e. only on the Azure platform?
Great video!!
Is it possible to have my Iceberg table created through Databricks and have a Snowflake Iceberg table read the Parquet files seamlessly? Will the Snowflake catalog get updated every time inserts happen on the Azure Iceberg table?
What is the cost (how is it billed) to read data from Iceberg managed tables using an external Spark job?
Nice demo. Can we access the Iceberg tables using query federation in Databricks instead of using libraries to connect to Snowflake?
Using a federated query is a different pattern and I have not explored it. That said, it should work to query Snowflake tables, but it would not go through the Snowflake SDK. The SDK is a great option because it doesn't require Snowflake compute, and my understanding is that the federated query model would.
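For anyone curious, the SDK path discussed here is the Snowflake-managed Iceberg Catalog SDK, which lets Spark resolve table metadata through Snowflake's catalog and then read the Parquet/Iceberg files directly from cloud storage. A rough sketch of the Spark catalog properties (catalog name, account URL, and versions below are placeholders, not from the video):

```
# Sketch: Spark catalog properties for reading Snowflake-managed Iceberg tables
# via the Snowflake Iceberg Catalog SDK (read-only; no Snowflake warehouse needed
# for the data scan itself). <account> is a placeholder for your account locator.
spark.sql.catalog.snowflake_catalog               org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.snowflake_catalog.catalog-impl  org.apache.iceberg.snowflake.SnowflakeCatalog
spark.sql.catalog.snowflake_catalog.uri           jdbc:snowflake://<account>.snowflakecomputing.com
```

With that in place, a query like `SELECT * FROM snowflake_catalog.my_db.my_schema.my_table` goes to Snowflake only for the catalog/metadata lookup; the actual data read happens directly against the Iceberg files in object storage.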
@PaulNeedleman I replicated all the steps from the demo but I was not able to list the table. I got an empty result on the Databricks end. Any clue?
@PaulNeedleman I have Unity Catalog enabled on the workspace. Is that the reason?
@nitinkansal it's very possible. I tested without Unity Catalog.
Is it possible to write data from Spark to a Snowflake managed iceberg table?
Unfortunately, no. There are two main patterns:
-Snowflake writes to managed Iceberg tables
-Spark writes to Snowflake externally managed Iceberg tables
As of today there is no way to write externally to a managed Iceberg table. Stay tuned though!