Peter, it's always good to listen to you. It's a pity there weren't such tutorials when I started with this technology.
Thanks Tybul for this great content on how to load data into Databricks and the overview of how to use it!!
This was another piece of great content; I don't have any words left to thank you. The best part is I never feel bored.
Thanks!
Thank you. Fantastic content as always.
Great video.
Pinning the cluster is an important concept; I had a question on it in my exam. You have covered everything needed for it.
One question: which SQL variant does Databricks use internally, say, when we use the %sql magic or the spark.sql command? Is it MS SQL, Postgres, or something else?
It is Spark SQL: spark.apache.org/docs/latest/sql-ref.html
Thank you for the course, Tybul! Can you please cover Databricks integration with other Azure services and a real-time project? Also, if possible, please release the next video at least once a week so I don't waste time on other videos. You are perfect. Thanks!
Databricks integration with ADF will be covered soon.
Real-time project will be covered at the end of the whole course.
I'm trying to release videos on a weekly basis but I can't guarantee that.
Suppose that after using Databricks I remove it from the resource group and then try to create a new Databricks workspace. Am I allowed to use the 14-day trial for the newly created workspace within the same 14-day period?
I think so.
Hello Piotr!
I have been following your DP-203 episodes for a month. The use cases taught are really informative and help us learn from your vast experience in the data engineering field.
I have a request for you: since the 2024 Paris Olympics are right around the corner, I am thinking of a data streaming project in which we ingest Olympics data from APIs, transform it in the Azure cloud, and save it back to a data lake. This project would help us implement the topics learned in your episodes. Please guide me by providing a high-level architecture for how this project could be implemented.
Thanks in Advance :)
Hi, milestone 4 will have a challenge related to streaming.
This video is super helpful. Also, could you please suggest some resources for learning SQL, or maybe you could start an SQL series :) Thanks!
For learning SQL I can recommend "T-SQL Fundamentals" book by Itzik Ben-Gan: itziktsql.com/t-sql-fund-4th-edition-1
Thanks
Was deleting your cluster (due to not pinning it) a big problem in your project? Did it waste much money or time?
Nope, it was pretty easy to recreate, as it hadn't been used yet, so there wasn't much configuration to redo in the first place.