Lakehouse with Delta Lake Deep Dive Training
- Published 21 Jul 2021
- In this course, we will provide a brief overview of data architecture concepts, an introduction to the Lakehouse paradigm, and an in-depth look at Delta Lake features and functionality. You will learn about applying software engineering principles with Databricks as we demonstrate how to build end-to-end OLAP data pipelines using Delta Lake for batch and streaming data. The course also discusses serving data to end users through aggregate tables and Databricks SQL Analytics. Throughout the course, emphasis will be placed on using data engineering best practices with Databricks.
By the end of the course, students will understand how to apply data engineering best practices within Databricks.
Prerequisites:
Familiarity with data engineering concepts
Basic knowledge of Delta Lake core features and use cases
Get better faster with Databricks Academy: academy.databr...
Get the Delta Lake: Up & Running by O’Reilly ebook preview to learn the basics of Delta Lake, the open storage format at the heart of the lakehouse architecture. Download the ebook: dbricks.co/3kE...
Connect with us:
Website: databricks.com
Facebook: / databricksinc
Twitter: / databricks
LinkedIn: / databricks
Instagram: / databricksinc
Databricks is proud to announce that Gartner has named us a Leader in both the 2021 Magic Quadrant for Cloud Database Management Systems and the 2021 Magic Quadrant for Data Science and Machine Learning Platforms. Download the reports here. databricks.com...
Material is well articulated, but it looks like the trainer was asked to work on a weekend :P
True, great material, but the trainer is so low energy that the video is hard to listen to.
Where can i get the notebook?
What is the difference between a MANAGED Delta table and an EXTERNAL Delta table in Azure Databricks? Can we do insert, delete, and update in both types?
Yes, you can perform insert, delete, and update operations on both types of tables. Delta Lake's support for these operations lets you maintain your data with functionality similar to a traditional relational database management system, and it is one of the features that separates Delta Lake from simpler file-based storage formats.
These DML operations behave the same on managed and external tables: in both cases Delta Lake rewrites the underlying Parquet files and records the change in the transaction log. The difference between the two table types shows up when you DROP the table, not when you modify its data.
Although we create managed and external tables from files available in DBFS, under the hood both are stored as Parquet files, and the full history of a Delta table is kept in transaction log files in JSON format. For managed tables, dropping the table deletes both the data and the metadata. For external tables, dropping the table deletes only the metadata; the underlying files are not affected, since they live in external storage such as ADLS or S3.
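The distinction described above can be sketched in Databricks SQL. This is a minimal illustration, not material from the course; the table names, columns, and storage path are assumptions:

```sql
-- Managed table: Databricks controls the storage location.
-- DROP TABLE deletes both the metastore entry and the underlying
-- Parquet data files and transaction log.
CREATE TABLE health_tracker_managed (
  device_id INT,
  heartrate DOUBLE,
  time      TIMESTAMP
) USING DELTA;

-- External table: the data lives at a path you own (e.g. ADLS or S3).
-- DROP TABLE removes only the metastore entry; the files at the
-- LOCATION remain untouched.
CREATE TABLE health_tracker_external (
  device_id INT,
  heartrate DOUBLE,
  time      TIMESTAMP
) USING DELTA
LOCATION 'abfss://lake@myaccount.dfs.core.windows.net/bronze/health_tracker';

-- DML works identically on both table types: Delta rewrites the
-- affected Parquet files and appends a JSON entry to the transaction log.
INSERT INTO health_tracker_external VALUES (1, 52.8, current_timestamp());
UPDATE health_tracker_external SET heartrate = 53.0 WHERE device_id = 1;
DELETE FROM health_tracker_external WHERE device_id = 1;
```

The only syntactic difference is the `LOCATION` clause; everything else, including inserts, updates, deletes, and time travel over the transaction log, behaves the same for both table types.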
Where do you get the materials promoted in this video?
I'm wondering the same. I replayed a few times to be sure I didn't miss it.
It is in Databricks Academy. Course name: Lakehouse with Delta Lake Deep Dive.
I like the Delta Lake design pattern.
I know this video has been up for a while, but I'm curious which catalog/schema in the metastore the table is being created in during lesson 3, step 2 (1:25:00) in the video. Can the CREATE TABLE SQL specify bronze.health.health_tracker_processed, for example? Will these tables be visually depicted in the "Data" tab in the Databricks IDE?
Link to the databricks notebooks?
Well articulated.
Super articulate.
her speech is so unclear and slurred, the other speaker was so much better