Snowflake BUILD | How To Use Apache Iceberg With Snowflake And AWS
- Published Feb 9, 2025
- This demo from Snowflake BUILD shows how to integrate Snowflake with a data lakehouse built on Apache Iceberg in AWS. You will learn how to create Iceberg tables, register them with the AWS Glue Data Catalog, and query the same table from both Amazon Athena and Snowflake. You will also see how to use Snowflake features such as data sharing and time travel, and how to convert catalogs without any data rewrite or upfront ingestion.
Subscribe for more! www.snowflake.c...
Explore sample code, download tools, and connect with peers: developers.sno...
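As a rough sketch of the Snowflake side of what the demo covers (not the demo's actual script), the following shows a Glue catalog integration, an external volume, and an externally managed Iceberg table queried from Snowflake. All names, ARNs, buckets, and regions below are placeholders.

```sql
-- Catalog integration pointing Snowflake at the AWS Glue Data Catalog.
-- Names, ARNs, and identifiers are placeholders, not values from the demo.
CREATE CATALOG INTEGRATION glue_catalog_int
  CATALOG_SOURCE = GLUE
  CATALOG_NAMESPACE = 'iceberg_db'
  TABLE_FORMAT = ICEBERG
  GLUE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_glue_role'
  GLUE_CATALOG_ID = '123456789012'
  GLUE_REGION = 'us-west-2'
  ENABLED = TRUE;

-- External volume for the S3 location that holds the Iceberg table files.
CREATE EXTERNAL VOLUME iceberg_ext_vol
  STORAGE_LOCATIONS = ((
    NAME = 'iceberg-s3'
    STORAGE_PROVIDER = 'S3'
    STORAGE_BASE_URL = 's3://my-iceberg-bucket/'
    STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_s3_role'
  ));

-- Externally managed Iceberg table registered in Glue, queryable from Snowflake
-- (Athena can read the same table because both use the Iceberg metadata in S3).
CREATE ICEBERG TABLE demo_table
  EXTERNAL_VOLUME = 'iceberg_ext_vol'
  CATALOG = 'glue_catalog_int'
  CATALOG_TABLE_NAME = 'demo_table';

SELECT * FROM demo_table LIMIT 10;
```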
Everything sounds great in a demo with full admin access, and it's very easy in my own Snowflake and AWS accounts.
The challenge is accomplishing this at a real workplace, where you have to talk to three different admins and figure out what to ask of whom.
Is it possible to create partitioned data in an S3 bucket using the Snowflake catalog from a Snowflake query? Say I want to partition a table by date based on its timestamp column, and have the files land in the corresponding date partition under the S3 location.
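To make the question concrete, here is a minimal sketch of the kind of Snowflake-managed Iceberg table being asked about. The volume, table, and column names are hypothetical, and the date-partitioned S3 layout is exactly the part in question, not documented behavior.

```sql
-- Hypothetical setup the question refers to (names are made up).
CREATE ICEBERG TABLE events (
  event_id NUMBER,
  event_ts TIMESTAMP_NTZ,
  payload  VARCHAR
)
  CATALOG = 'SNOWFLAKE'            -- Snowflake-managed catalog
  EXTERNAL_VOLUME = 'ext_vol_s3'   -- points at the S3 bucket
  BASE_LOCATION = 'events/';       -- prefix under the bucket

-- The open question: whether rows inserted like this can be laid out in S3
-- under date-based partitions derived from event_ts (e.g. .../2025-02-09/),
-- the way an Iceberg partition spec such as days(event_ts) would arrange them.
INSERT INTO events VALUES (1, '2025-02-09 10:00:00', 'example');
```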
The Iceberg table created through Glue had its files in Avro format, but Snowflake supports Iceberg tables with Parquet files only.
So how did this work?
She did not browse the data folder, which would have contained only Parquet files (the default format). Instead, she browsed the metadata folder, which contains JSON/Avro files. Also, during the demo she didn't create the table or insert the records; she only showed the table, so I guess you got confused like me. I went back and checked, and they may have created the table and inserted the records before the demo itself.
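If you want to confirm this yourself, Iceberg exposes a $files metadata table that Athena can query to list the data files and their formats. A minimal sketch, assuming a Glue database and table named iceberg_db.demo_table (hypothetical names):

```sql
-- Query Iceberg's $files metadata table from Athena to inspect the data files.
-- iceberg_db and demo_table are hypothetical names, not the demo's actual table.
SELECT file_path,
       file_format,      -- should report PARQUET for the data files
       record_count
FROM "iceberg_db"."demo_table$files";
```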