Thank you for your lucid explanations🙏
Welcome ☺️
For a combobox or textbox we can dynamically set any value, even one outside the given choices, but for a dropdown or multiselect we can't set a value outside the choices. So how can we set parameters for them from another notebook when the values are not in their choices?
E.g., sir, you are passing the parameters 'dropdown' and 'multiselect', but these values are not in their choices, so how is it working? Please correct me.
Is there a way to pass a DataFrame as a value to another notebook using run()?
How does it take a value for the dropdown via a parameter? You said in the last video that dropdown values have to be pre-defined and it will not accept any other values. In your example you haven't cleared the previous dropdown list, so how is it not throwing an error when you send the dropdown parameter value?
Is there any way to pass the arguments dynamically?
E.g., building something like {"argument": "data", "argument2": "data2", ...} with an f-string and then passing it as a variable?
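One approach that is sometimes used for this (a sketch, not from the video): build the arguments as a plain Python dict at runtime, since dbutils.notebook.run takes a dict for its arguments parameter. The notebook name and the keys below are hypothetical, and the dbutils call is commented out because it exists only inside a Databricks runtime.

```python
# Build the argument map dynamically; dbutils.notebook.run's third
# parameter is a plain dict of widget-name -> value strings.
args = {"argument": "data", "argument2": "data2"}
args["argument3"] = "data3"  # entries can be added at runtime

# Inside a Databricks notebook (hypothetical child notebook name):
# result = dbutils.notebook.run("ChildNotebook", 60, args)
print(args)
```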
dbutils.notebook.exit("returnValue") and the corresponding "returnValue" will be returned to the calling service.
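A minimal sketch of that round trip (the notebook name is hypothetical, and the dbutils calls are commented out because they exist only inside a Databricks runtime). A common pattern is to JSON-encode a structured result before exiting, since exit() only passes a string:

```python
import json

# Child notebook: serialize a result and hand it back to the caller.
payload = json.dumps({"status": "ok", "rows": 42})
# dbutils.notebook.exit(payload)           # runtime-only call

# Parent notebook: run() returns the string passed to exit().
# returned = dbutils.notebook.run("ChildNotebook", 60, {})
returned = payload                          # stand-in for the call above
result = json.loads(returned)
print(result["rows"])                       # -> 42
```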
Can we communicate with other notebooks from one notebook in Community Edition, without a paid subscription?
I am also facing the following issue: "requirement failed: To enable notebook workflows, please upgrade your Databricks subscription".
Nice video. In a real-time scenario, if I want to pass the date into a notebook every day:
dbutils.widgets.text("p_file_date"," ")
v_file_date = dbutils.widgets.get("p_file_date")
In the empty space I will supply the parameter... please correct me if I am wrong.
You already created a parameter here named p_file_date. Now you need to pass a value to it whenever you run this notebook.
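Assembled, the pattern from the comment above would look roughly like this (a sketch; the caller notebook name is hypothetical, and the dbutils calls are commented out since they exist only in a Databricks runtime):

```python
from datetime import date

# In the child notebook: declare the widget and read its value.
# dbutils.widgets.text("p_file_date", "")              # empty default
# v_file_date = dbutils.widgets.get("p_file_date")

# In the caller notebook, supply the date on each run, e.g.:
run_date = date(2024, 1, 1).isoformat()
# dbutils.notebook.run("IngestNotebook", 300, {"p_file_date": run_date})
print(run_date)
```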
Hi,
I have one question: how can I use a table from one cluster on another cluster (i.e., share one cluster's table with other clusters)? If that's possible, please let me know how it works.
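One way this usually works (an assumption about the setup, not covered in the video): a table written with saveAsTable is registered in the workspace metastore rather than on a specific cluster, so any cluster in the same workspace can query it by name. The database and table names below are hypothetical, and the Spark calls are commented out because they need a Databricks cluster:

```python
# Hypothetical names; the point is that the metastore, not the cluster,
# owns the table registration.
db, table = "sales_db", "orders"
qualified_name = f"{db}.{table}"

# On cluster A (inside Databricks):
# df.write.mode("overwrite").saveAsTable(qualified_name)

# On cluster B in the same workspace:
# spark.table(qualified_name).show()
print(qualified_name)
```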
But it is not working in Community Edition; it is asking for a Databricks subscription.
Yes. "To enable notebook workflows, please upgrade your Databricks subscription". @Wafastudies: please let us know if we can still use the free subscription to practice.
Great video. Please also make the code snippets that you use available for instant use.
Hi Wafa,
Thanks for making these videos they are very crisp and clear.
I'm getting error while executing run utility, will I need to pay to upgrade it?
IllegalArgumentException: requirement failed: To enable notebook workflows, please upgrade your Databricks subscription.
FYI, I signed in using Community Edition.
I got the same error message; I think passing values from one notebook to another isn't supported in Community Edition. I see @maheer is using an Azure subscription for Databricks.
Btw, great presentation, very crisp and to the point. This surely takes great dedication and consistency; I have been following all his playlists and he is just an awesome chap!
Thanks for all these explanations, it's really great content. Just wanted to check: can we get the notebooks used in all the practicals, if they are available in some GitHub repository? Thanks again!
Thank you Sir...
Welcome 😊
Thank you!
Hi, I am a big fan of yours. Thanks a lot for sharing very good videos. I have a question: have you created any video on how to pass a value from Databricks to an Azure pipeline and use that value in the pipeline for further processing? Using the command dbutils.notebook.exit('some value'), I can pass a value from Databricks to the pipeline.
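A sketch of that handoff (assuming the notebook is invoked from an Azure Data Factory Notebook activity; the field names below are hypothetical, and the dbutils call is commented out because it exists only in a Databricks runtime). The string passed to dbutils.notebook.exit() surfaces in ADF as the activity's output.runOutput, which later pipeline activities can reference:

```python
import json

# Serialize whatever the pipeline needs into a single string.
result = json.dumps({"processed_file": "sales.csv", "row_count": 120})

# In Databricks (runtime-only call):
# dbutils.notebook.exit(result)

# In ADF, a later activity would read it with an expression like:
# @activity('NotebookActivityName').output.runOutput
parsed = json.loads(result)
print(parsed["row_count"])                 # -> 120
```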