Great summary!!
Invaluable. Thank you 🙏
You can use Terraform to provision your Workflows, Tasks, Clusters, Notebooks, etc. programmatically. Then Terraform scripts (*.tf, *.hcl) can be uploaded to Git and used in CI/CD as well.
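As an illustrative sketch of what that looks like (assuming the Databricks Terraform provider; the resource name, job name, and paths below are made up, and a real job would also need a cluster or serverless compute configured):

```hcl
# Hypothetical example: a minimal Databricks job defined with the
# databricks Terraform provider. Names and paths are illustrative only.
resource "databricks_job" "nightly_etl" {
  name = "nightly_etl"

  task {
    task_key = "ingest"
    notebook_task {
      notebook_path = "/Repos/project/ingest"
    }
  }
}
```

Because the definition lives in a `.tf` file, it can be reviewed in pull requests and applied from CI/CD like any other infrastructure code.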
Thanks for your comment. Terraform is not open source anymore, which gives me pause about its future. OpenTofu is the new open source Terraform. You can also use Python with the Databricks Python SDK, or just Python with the Databricks REST API, or the new Databricks Asset Bundles.
I wish they would add the ability to make one workflow depend on another workflow. As a data engineer, you need this 100% of the time.
@@BryanCafferky I meant it would be immensely helpful if Databricks Workflows offered a trigger mode based on the completion or state of other workflows, given the limit of 100 tasks per workflow.
I have a workflow with task A, task B, and 10 more. I would like to have widgets or parameters like A: True, B: False... that decide whether a task should be skipped or not. Is it possible? How?
Can you help with how we can create a drop-down for task parameters in a workflow?
You use widgets. Docs here: learn.microsoft.com/en-us/azure/databricks/notebooks/widgets
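A minimal sketch of the widget approach, covering both questions above. It assumes a True/False dropdown per task (the widget name `run_task_a` is made up). In a Databricks notebook you would create and read the widget with `dbutils.widgets.dropdown(...)` and `dbutils.widgets.get(...)`; here the value is simulated as a plain string so the logic can run anywhere:

```python
# Hypothetical sketch: skip a task based on a True/False dropdown widget.
# In a Databricks notebook you would create and read the widget with:
#   dbutils.widgets.dropdown("run_task_a", "True", ["True", "False"], "Run task A")
#   run_task_a = dbutils.widgets.get("run_task_a")
# Locally we simulate the widget value as a plain string.

def should_run(widget_value: str) -> bool:
    """Interpret a True/False dropdown value as a run/skip decision."""
    return widget_value.strip().lower() == "true"

run_task_a = "True"  # would come from dbutils.widgets.get("run_task_a")
if should_run(run_task_a):
    print("running task A")
else:
    # In a real notebook you could call dbutils.notebook.exit("skipped")
    # so the task ends immediately and downstream tasks still run.
    print("skipping task A")
```

Widget values arrive as strings, so normalizing case and whitespace before comparing avoids surprises when the parameter is set from the Jobs UI or the API.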
Thank you for sharing your knowledge! One question: is there a way to create this workflow using some type of CI/CD? For example, creating a development branch and a pull request to merge into a master branch?
The main idea is to create the workflow in a development environment and promote it to the production environment.
Yes. There are several ways. I am using the Databricks Python SDK from an Azure DevOps pipeline to do this. However, workflows are not stored in the repo, so you'll need to use the UI, get the JSON, and paste it into a file in your repo. learn.microsoft.com/en-us/azure/databricks/dev-tools/sdk-python You can also use the new Databricks Asset Bundles learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/
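A hypothetical sketch of that pattern: keep the job JSON exported from the Jobs UI in the repo, then have the pipeline patch the environment-specific fields before handing the result to the Databricks Python SDK or REST API. The job definition, names, and `/Repos/dev/` vs `/Repos/prod/` substitutions below are made-up examples:

```python
# Hypothetical CI/CD sketch: the workflow JSON (exported from the Jobs UI)
# lives in the repo; the pipeline rewrites dev-specific names and paths
# for the target environment before deploying. All names are illustrative.
import json

job_json = """
{
  "name": "nightly_etl_dev",
  "tasks": [
    {"task_key": "ingest",
     "notebook_task": {"notebook_path": "/Repos/dev/project/ingest"}}
  ]
}
"""

def promote(job: dict, env: str) -> dict:
    """Return a copy of the job definition rewritten for the target env."""
    job = json.loads(json.dumps(job))  # cheap deep copy
    job["name"] = job["name"].replace("_dev", f"_{env}")
    for task in job["tasks"]:
        nb = task.get("notebook_task")
        if nb:
            nb["notebook_path"] = nb["notebook_path"].replace(
                "/Repos/dev/", f"/Repos/{env}/"
            )
    return job

prod_job = promote(json.loads(job_json), "prod")
print(prod_job["name"])  # nightly_etl_prod
# The pipeline would now send prod_job to the Jobs API
# (e.g. via the Databricks Python SDK) to create or update the workflow.
```

Keeping the JSON in the repo means workflow changes go through the same branch/pull-request review as the notebooks they orchestrate.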
Agree on the limitations. For some reason a Databricks Workflow cannot contain more than 100 steps.
Luckily there is now a new feature where a workflow can contain a new kind of step which triggers another job.
So now you can at least subdivide your job into multiple smaller ones and then have a master job that triggers all the sub-jobs.
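The master-job pattern above can be sketched as a job definition built programmatically (assuming the Jobs 2.1 API's `run_job_task` task type; the job IDs and names here are made up):

```python
# Hypothetical sketch of the "master job" pattern: one orchestrator job
# whose tasks trigger other jobs via the Run Job task type
# (run_job_task in the Jobs 2.1 API). Job IDs and names are made up.
sub_job_ids = {"ingest": 101, "transform": 102, "publish": 103}

orchestrator = {"name": "master_pipeline", "tasks": []}
previous_key = None
for key, job_id in sub_job_ids.items():
    task = {"task_key": key, "run_job_task": {"job_id": job_id}}
    if previous_key:
        # Chain the sub-jobs so each waits for the one before it;
        # each sub-job can hold up to 100 tasks of its own.
        task["depends_on"] = [{"task_key": previous_key}]
    orchestrator["tasks"].append(task)
    previous_key = key

print(len(orchestrator["tasks"]))  # 3
```

This keeps each sub-job under the 100-task limit while the orchestrator owns the overall ordering.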
But still, it would be way easier to just not have that limitation. It feels kinda artificial :/
How do you use if/else branch logic?
Hi @bryan, why are these videos still not in the playlist on your website? It's been 2 weeks since you posted them here. I'm looking under the Databricks section and can't find them. I think your website should be a first-class citizen for locating your videos as well. Cheers, and thanks for the helpful videos.
Hi @Baravindk, they are in the YT playlist, and the GitBook points you to the playlist rather than listing all the videos therein. To make new videos more easily found, I added a new-videos menu to the GitBook and added these. These videos are in the YouTube Master Data Lakehouse playlist. Thanks
Hi Bryan, wondering if you could do a video on Databricks and dbt? Would be interested in your thoughts :)
I have not used dbt but from what I have seen it is very powerful. Thanks
When creating a workflow, does it allow you to drag and drop tasks?
No. The UI is more select-a-task-and-set-its-properties. The diagram then updates to reflect properties like dependencies.
Love this video. The dashboard refresh is super cool.