Clear explanation in the very simple language. Thanks
Thank you so much, you are one of the best explainers.
The way of explanation is quite good.
The best content on Spark on YouTube.👋👋
Data Queen. Thank god you exist.
Appreciate your efforts . Very good explanations 👏👏.
Great explanation 👌, got the idea💡
Don't hesitate to create more content. Thanks!
Wonderful effort👏👏👏
You are an amazing teacher. Can you please create some videos on Power BI and DAX? Honestly, this is some of the best content.
Great video. Thanks for all your efforts !
Please add the links to the other videos discussed in the tutorial.
It feels good when women are also creating technical YouTube videos. Otherwise, 95–99% are only men.
Wow really it’s very helpful Sister
Please do more videos on Delta Lake and on how we can read the data and apply business logic.
Hello, can you please share how to deploy Delta tables to other environments?
Your videos are informative. Just a small request: can you please use a microphone? The audio is too harsh in all of your videos.
Great video, Maam!
Nice explanation 👌 👍
Can we bulk insert into a Delta table via SSMS from Blob Storage?
At 10:32, are the .crc files checkpoint files, or do we get a checkpoint file created after every 10 transactions? I am a bit confused here.
Nope, those are .crc files; checkpoint files are created in Parquet format.
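To illustrate the reply above: a Delta table's `_delta_log` directory contains per-commit JSON files, .crc sidecar files, and (by default, every 10 commits) a Parquet checkpoint file. A minimal sketch, with illustrative file names that are assumptions, not taken from the video:

```python
# Sketch: distinguishing the three kinds of files found in a Delta table's
# _delta_log directory by their suffix. File names below are illustrative.

def classify_log_file(name: str) -> str:
    """Classify a _delta_log entry by its extension."""
    if name.endswith(".checkpoint.parquet"):
        # Checkpoint: a Parquet snapshot of the log, written every
        # 10 commits by default so readers need not replay every JSON file.
        return "checkpoint"
    if name.endswith(".json"):
        # One JSON commit file per transaction.
        return "commit"
    if name.endswith(".crc"):
        # Sidecar file with integrity/stats info -- NOT a checkpoint.
        return "crc"
    return "other"

# Example entries as they might appear in _delta_log:
print(classify_log_file("00000000000000000003.json"))                # commit
print(classify_log_file("00000000000000000003.crc"))                 # crc
print(classify_log_file("00000000000000000010.checkpoint.parquet"))  # checkpoint
```

So a .crc file appearing after every commit is expected; only the `.checkpoint.parquet` files are checkpoints.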
What is the difference between .delta files (the Databricks default dataset format) and the .parquet files we get when we write Delta tables using spark.write.format("delta")?
Super helpful - Thank you so much !! #StayBlessednHappy
Very informative, best content on spark on youtube, thank you
I have one doubt: if we have files in ADLS Gen2, we have to create a mount point and work on the files. If we have Delta Lake, is it required to create a mount point?
Yes, you need to mount.
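For context on the mounting answer: a minimal sketch of mounting an ADLS Gen2 container in a Databricks notebook with OAuth. The account name, container, tenant, and credentials below are placeholders, and `dbutils` exists only inside a Databricks notebook, so the actual mount call is shown but commented out:

```python
# Sketch: mounting an ADLS Gen2 container in Databricks (placeholders throughout).
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    # In practice the secret should come from dbutils.secrets.get(scope, key).
    "fs.azure.account.oauth2.client.secret": "<service-credential>",
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

source = "abfss://<container>@<storage-account>.dfs.core.windows.net/"
mount_point = "/mnt/<mount-name>"

# Runs only inside a Databricks notebook, where dbutils is predefined:
# dbutils.fs.mount(source=source, mount_point=mount_point, extra_configs=configs)
```

Once mounted, the Delta table can be read from the mount path, e.g. `spark.read.format("delta").load("/mnt/<mount-name>/<table-path>")`.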
Awesome Tutorial!! 👋
How do we read a specific Delta partition, or partitions between dates, for example?
Great content!! Just one request: could you please share the notebook commands used?
Great explanation. I learned many concepts related to Delta tables. Thank you.
Informative ❤️
Any chance of doing similar comparison with Apache Iceberg?
Very useful 👍
Hello Bhawana, Great explanation👌👍.
Just looking for some input on Delta tables vs Live Tables in Databricks. I see that both can process streaming and batch data. What is the difference, and which one is preferred? Please share if you have any information on this. Thanks!
Helped a lot..
Nice explanation. Will you please share the code used in this video?
❤❤
Maam, your voice is too sharp; please keep the mic a bit farther away.
I wonder how she got 433 likes
Really poor explanation. You are all over the place.
Mind turning off your video? I'm getting distracted.
Very useful 👍