16. Pass values to notebook parameters from another notebook using run command in Azure Databricks

  • Published 11 Dec 2024

COMMENTS • 25

  • @Koustav7777 2 years ago +3

    Thank you for your lucid explanations 🙏

  • @rohitbadgujar4514 11 months ago +2

    For combobox and text widgets we can dynamically set any value, even one outside the given choices, but for dropdown and multiselect widgets we can't set a value that isn't among their choices. So how can we set parameters for them from another notebook when the values are not in their choices?
    E.g. Sir, you are passing the parameter values 'dropdown' and 'multiselect', but these values are not in their choices, so how is it working? Please correct me.
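
    A minimal sketch of the pattern asked about above, assuming a hypothetical child notebook path "./child_notebook" and a hypothetical widget named load_type; dropdown and multiselect widgets only offer the choices they are declared with, so the caller should pass a value that appears in that list:

      # child notebook: declare a dropdown whose choices cover every value the caller may send
      dbutils.widgets.dropdown("load_type", "full", ["full", "incremental"])
      load_type = dbutils.widgets.get("load_type")

      # caller notebook: pass one of the declared choices (60-second timeout)
      result = dbutils.notebook.run("./child_notebook", 60, {"load_type": "incremental"})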

  • @VIGNESHvignesh-jr4lo 2 years ago +3

    Is there a way to pass a dataframe as a value using run() to another notebook?
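
    A DataFrame itself can't be passed through run(); the arguments are plain strings. A common workaround, sketched here with hypothetical names ("staging_orders", "view_name", "./child_notebook"), is to register a global temp view and pass its name, since global temp views are visible to other notebooks running on the same cluster:

      # caller: 'df' stands for whatever DataFrame is being handed off (illustrative data here)
      df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
      df.createOrReplaceGlobalTempView("staging_orders")
      dbutils.notebook.run("./child_notebook", 60, {"view_name": "staging_orders"})

      # child: read the view name back and rebuild a DataFrame from the shared view
      view_name = dbutils.widgets.get("view_name")
      df = spark.table("global_temp." + view_name)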

  • @libithagohul8468 2 years ago +1

    How does the dropdown take a value from the parameter? You said in the last video that dropdown values have to be pre-defined and it will not accept any other values. In your example you haven't cleared the previous dropdown list, so how is it not throwing an error when you send the dropdown parameter value?

  • @gastondemundo9822 1 year ago +1

    Is there any way to pass the arguments dynamically?
    E.g. building something like {"argument": "data", "argument2": "data2", ...} and then passing it as a variable?

    • @anirbandatta1498 1 year ago

      dbutils.notebook.exit("returnValue"), and the corresponding "returnValue" will be returned to the caller.
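
    For the dynamic-arguments question above, a minimal sketch assuming a hypothetical child notebook path "./child_notebook": the third argument of dbutils.notebook.run() is an ordinary Python dict of string key/value pairs, so it can be built at runtime and passed as a variable, and the child can hand a value back with dbutils.notebook.exit():

      # caller: build the arguments dynamically and pass the dict as a variable
      params = {"argument": "data", "argument2": "data2"}
      returned = dbutils.notebook.run("./child_notebook", 60, params)

      # child: the string passed to exit() becomes the return value of run() in the caller
      dbutils.notebook.exit("returnValue")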

  • @vamsikrishnabapatla1061 2 years ago +1

    Can we communicate with another notebook from one notebook in the Community edition, without a paid subscription?

    • @srinidhim7978 1 year ago

      I am also facing the following issue: "requirement failed: To enable notebook workflows, please upgrade your Databricks subscription".

  • @saipraneeth1660 2 years ago +3

    Nice video. In a real-time scenario, if I want to pass the date to a notebook every day:
    dbutils.widgets.text("p_file_date", " ")
    v_file_date = dbutils.widgets.get("p_file_date")
    and in the empty default I will supply the parameter...
    Please correct me if I'm wrong.

    • @WafaStudies 2 years ago +1

      You already created the parameter here with the name p_file_date. Now you need to pass a value to it whenever you run this notebook.
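
    A minimal sketch of that daily-date pattern, assuming a hypothetical child notebook path "./ingest_notebook" and an illustrative date value:

      # child notebook: declare the parameter with an empty default and read it back
      dbutils.widgets.text("p_file_date", "")
      v_file_date = dbutils.widgets.get("p_file_date")

      # caller notebook (or a scheduled job): supply the date for each run
      dbutils.notebook.run("./ingest_notebook", 60, {"p_file_date": "2024-12-11"})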

  • @kcsvenkat 1 year ago

    Hi,
    I have one question.
    Can I use a table created on one cluster from another cluster (i.e. share one cluster's table with other clusters)? If yes, please let me know how it works.
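
    A minimal sketch of one way to do this, assuming hypothetical database and table names: a table saved to the workspace metastore (unlike a temp view) is visible to any cluster attached to that same metastore:

      # on cluster A: 'df' stands for whatever DataFrame holds the data (illustrative data here)
      df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
      spark.sql("CREATE DATABASE IF NOT EXISTS shared_db")
      df.write.mode("overwrite").saveAsTable("shared_db.orders")

      # on cluster B (same workspace / metastore): read it back
      df2 = spark.table("shared_db.orders")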

  • @ramyamarella1197 2 years ago +1

    But it is not working in the Community edition. It is asking for a Databricks subscription.

    • @shaikmussarath8301 2 years ago

      Yes. "To enable notebook workflows, please upgrade your Databricks subscription". @Wafastudies: please let us know if we can still practice this on the free subscription.

  • @sandeepsandy1237 2 years ago

    I'm getting an error while executing the run utility; will I need to pay to upgrade?
    IllegalArgumentException: requirement failed:

  • @saurabhshrivastava224 2 years ago +1

    Great video. Please also make the code snippets that you use available for instant use.

  • @kripaligandhi6478 2 years ago +1

    Hi Wafa,
    Thanks for making these videos; they are very crisp and clear.
    I'm getting an error while executing the run utility; will I need to pay to upgrade?
    IllegalArgumentException: requirement failed: To enable notebook workflows, please upgrade your Databricks subscription.
    FYI, I signed in using the Community edition.

    • @vnellutla125 1 year ago

      I got the same error message; I think passing values from one notebook to another isn't supported in the Community edition. I see @maheer is using an Azure subscription for Databricks.
      Btw, great presentation, very crisp and to the point. This surely takes great dedication and consistency; I have been following all his playlists and he is just an awesome chap!

  • @abhishekstatus_7 1 year ago

    Thanks for all these explanations, it's really great content. Just wanted to check: can we get the notebooks used in all the practicals, if they are available in some GitHub repository? Thanks again!

  • @sonamkori8169 2 years ago +2

    Thank you Sir...

  • @e2ndcomingsoon655 10 days ago

    Thank you!

  • @bookworld6405 1 year ago

    Hi, I am a big fan of yours. Thanks a lot for sharing very good videos. I have a question: have you created any video on how to pass a value from Databricks to an Azure pipeline and use that value in the pipeline for further processing? Using the command dbutils.notebook.exit('some value'), I can pass a value from Databricks to the pipeline.
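
    A minimal sketch of that handoff, with a hypothetical activity name and payload: the notebook returns a string via dbutils.notebook.exit(), and the ADF Databricks Notebook activity surfaces it in its output (commonly read as runOutput in later pipeline activities):

      # Databricks notebook called from an ADF pipeline: return a (string) value to the pipeline
      import json
      dbutils.notebook.exit(json.dumps({"status": "succeeded", "rows_processed": 1250}))

      # In ADF, a later activity can read it with an expression like:
      # @activity('RunDatabricksNotebook').output.runOutput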