A must-watch for Delta Live Tables. What a wonderful walkthrough of the capabilities of Delta Lake & Delta Lake tables.
Thanks, Emil, for such a wonderful session and for making it interesting with lakes!
I have not seen any content on Delta Lake architecture on YouTube in recent times with this level of explanation. I will go through all the frames without missing a thing.
Awesome crash course. You really dive into the mechanics of Delta with furious clarity!
ERA OF FAKE NEWS... I LOVE IT... REALLY GOOD LESSON... CAN'T THANK YOU ENOUGH FOR YOUR GENUINE EFFORT AND DEDICATION... BEYOND WORDS
Wonderful video, probably the best material on this topic on the whole internet!
You're truly brilliant for putting in such a great effort to help us understand. Thank you so much, mate, it's incredibly helpful! ❤❤
Once again, thanks for such a great video! It can give any person solid knowledge of all the elements of Delta Lake. Fantastic flow and orchestration all over the video!
#dataengineer
#datascientist
#solutionarchitect
#citizendeveloper
#databricks
#deltalake
#lakehouse
Thanks, pal!
In the last Spark architecture video I asked for this.
Keep up the great work!
Hope you will like it. Let me know if you have ideas for other videos.
This is just superb!!! Waiting for many more in-depth, quality videos like this in the coming days. Thank you so much 👍
Excellent!! Keep smiling 😀 and keep posting videos like this 😀
Thank you! This is interesting and content-rich!
Really great video on Delta Lake. Can you please make a video on Spark optimizations?
Let me think about it. There are many ways to approach it.
@DatabricksPro Thanks, looking forward to it!
Thank you! That is an awesome tutorial
Extremely great content 👌
Can you please share the link to the notebooks so that we can practice?
Hi there, where can I find the code resources for these video tutorials? Thanks in advance.
Again, nice content, well planned, with a detailed explanation.
I have a silly doubt: so Delta tables are actually managing & optimizing the Parquet file storage, along with deciding which files to read and which to skip, optimizing IO operations, etc.?
If possible, could you please make a video on Parquet file storage (how it stores the data and what kind of metadata it carries), or if you already have one, could you please share it?
Btw, I really like your content and it is helping me a lot to enrich my knowledge base. Thank you so much.
Hi. Thanks for the feedback. The main thing is the Delta log, which I tried to explain in detail. This, plus the optimization techniques from the second part of the video, are the main magic ingredients and allow effective data skipping.
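To illustrate the idea, here is a minimal sketch (assuming a local Spark session with the delta-spark package on the classpath and a hypothetical table path /tmp/delta/events, neither of which comes from the video). It writes a small Delta table and then peeks at the _delta_log JSON entries, whose per-file min/max column statistics are what the reader uses for data skipping:

```python
from pyspark.sql import SparkSession

# Hypothetical setup: Delta Lake configured on a plain Spark session.
spark = (
    SparkSession.builder.appName("delta-log-peek")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/tmp/delta/events"  # hypothetical path, for illustration only

# Write a tiny Delta table; each commit adds a JSON file under _delta_log/.
(spark.range(0, 1000)
      .withColumnRenamed("id", "event_id")
      .write.format("delta").mode("overwrite").save(path))

# Each "add" action in the log records the file path plus per-file stats
# (min/max values, null counts) for the leading columns.
log = spark.read.json(f"{path}/_delta_log/*.json")
log.where("add is not null").select("add.path", "add.stats").show(truncate=False)

# A selective filter can then prune files using those stats (data skipping).
spark.read.format("delta").load(path).where("event_id > 990").explain()
```

This is only a sketch of the mechanism described above, not the exact code from the video.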
Which book are you referring to for this tutorial?
Nice tutorial. Do you have the explained notebook handy?
Schema evolution?
Btw, could you add timecodes to this vid? I guess it would not only improve the watching experience but may also attract the YT algorithms.
I am super happy you found it useful, and thank you for your suggestion. I will definitely try it!