Best session on connection with ADLS. Thank you
My pleasure
OMG! Now I can connect Azure Databricks with ADLS in various ways. You are a real teacher. If you open a membership, I think I will be the first one to join.
Thanks
Wow, thank you!
Thank you very much. It is really good to know all these methods. Just 2 months ago, I was in need of these access methods to connect ADLS Gen2 and ADB.
The way you explain is very simple and straight to the point.
Glad it was helpful!
Tybul, it doesn't make sense how well you explain things. Thank you very much for your videos!!
Thanks! I just remember what confused me when I was learning about it and I'm trying to make it easier for you.
Excellent video. Just what I was looking for!! I have gone through a lot of videos on this but none had this much clarity and ease of understanding!!
Glad it was helpful!
Thanks Piotr for this video on connecting to ADLS from Databricks!! Great content to learn!!
Thank you Piotr, with your help I passed my exam last week! Couldn't have done it without you.
Awesome Michael! Congratulations!
Great content as always! (Love the meow at the end😂)
Yup, according to my cat it was high time to end the episode and play with it.
Your assistant at the end 😂😼
If someone outside of our organization (using a different Databricks instance) creates a secret scope that connects to our Key Vault, and somehow has our service principal block of code, could they access our Data Lake!?
Fortunately you can't just create a secret scope to any Key Vault out there - you have to be granted the Key Vault Contributor, Contributor, or Owner role on the Azure Key Vault instance that you want to connect to. So someone outside our organization might try to create a secret scope but will fail.
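For context, the "service principal block of code" in question usually just reads the credentials from the Key Vault-backed scope. A minimal sketch (run in a Databricks notebook, where dbutils is predefined; the scope and key names below are hypothetical):

# Read service principal credentials from a Key Vault-backed secret scope.
# "kv-scope", "sp-client-id" and "sp-client-secret" are made-up names.
client_id = dbutils.secrets.get(scope="kv-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="kv-scope", key="sp-client-secret")

Creating the scope itself is the gated step: without one of the roles above on the Key Vault, the scope creation fails and these calls have nothing to read from.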
Tybul, thanks a lot for the videos! One quick question: basically, to create the scope you must be an AKV Owner user (or similar), right?
That's correct.
Can you cover how it works with Unity Catalog?
Yes, after core DP-203 part.
@TybulOnAzure Great, this will be an interesting topic!
Thank you so much for the video.
I tried connecting to the storage account using a service principal. I did everything: created the service principal, assigned the Storage Blob Contributor role, and created a secret scope in Databricks. But when I try to list the files in the storage account, I get an error saying "This request is not authorized to perform this operation" (403). Any idea why this is happening?
Thanks in advance
I would start troubleshooting in the following way:
1. Can you connect if you hardcode the service principal secret? If yes, then the issue is related to the secret scope.
2. If not, then check using a hardcoded storage account key. If that works, then the issue is related to the permissions granted to the service principal. If it doesn't, then the issue might be on the networking side.
Or the permissions simply haven't propagated yet and you might need to wait a few minutes (both hardcoded checks are sketched below).
Please try those things and let me know the outcome.
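A minimal sketch of those two hardcoded checks, assuming a hypothetical storage account name and placeholder credentials (runs in a Databricks notebook, where spark and dbutils are predefined; never keep real secrets hardcoded like this):

# Check 1: hardcoded service principal (OAuth) - all values below are placeholders.
account = "mystorageaccount"  # hypothetical storage account name
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net", "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net", "<client-secret>")
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Check 2: hardcoded storage account access key.
# Run this in a fresh session instead of check 1, so the OAuth settings above don't interfere:
# spark.conf.set(f"fs.azure.account.key.{account}.dfs.core.windows.net", "<account-key>")

# Either way, try listing the container afterwards:
dbutils.fs.ls(f"abfss://<container>@{account}.dfs.core.windows.net/")

If check 1 fails and check 2 succeeds, the service principal's role assignment (or its propagation) is the likely culprit.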
I’m getting the same issue while hardcoding the account key as well.
Can you paste the command you use to list the contents of your container?
@TybulOnAzure dbutils.fs.ls("abfss://init@apsaaclprod.dfs.core.windows.net")
@TybulOnAzure Also, why does it take so much time to start the cluster? Is there any way to speed that up?
Hi Piotr :)
Is git or github knowledge needed for the exam?
I don't think so. But you'll definitely need it for your daily work.
@TybulOnAzure Thanks. Any tips on how I should get familiar with it? I want to know at least the basics and I wonder if YouTube is going to be enough.
@chillvibe4745 try this page for learning git in a visual way: learngitbranching.js.org. It is not specific to Azure, just good old pure knowledge.