Thank you, are you able to share the notebook's code?
How can we scale Cosmos DB from Databricks only for the time the notebook is executing?
The Spark SDK I use in this demo does not support this, but there are a few options available to you. The azure-cosmos SDK (pypi.org/project/azure-cosmos/) provides a mechanism to scale a collection directly in the notebook using Python. Another option is a newer built-in Cosmos DB feature called Auto-Pilot (tinyurl.com/wkb5pe3), which scales the Cosmos DB instance automatically based on demand. A third option, and the one we generally use, is a scheduled Azure Automation PowerShell job.
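As a rough illustration of the first option, the "scale up only while the notebook runs" pattern could be sketched with a context manager around the azure-cosmos v4 SDK. This is a sketch, not the demo's code: the `scaled_throughput` helper is a name I'm inventing here, and the account endpoint, key, database, and container names in the usage comment are placeholders.

```python
# Sketch: temporarily raise a Cosmos DB container's provisioned RU/s for the
# duration of a notebook run, restoring the baseline afterwards.
# Assumes the azure-cosmos v4 Python SDK (pip install azure-cosmos).
from contextlib import contextmanager

@contextmanager
def scaled_throughput(container, run_rus):
    """Raise the container's RU/s to run_rus, and restore the original value on exit."""
    baseline = container.get_throughput().offer_throughput  # current provisioned RU/s
    container.replace_throughput(run_rus)                   # scale up for the workload
    try:
        yield container
    finally:
        container.replace_throughput(baseline)              # scale back down, even on failure

# Usage at the top of the notebook (placeholder names):
# from azure.cosmos import CosmosClient
# client = CosmosClient("https://<account>.documents.azure.com:443/", "<key>")
# container = client.get_database_client("<db>").get_container_client("<collection>")
# with scaled_throughput(container, 10_000):
#     ...  # run the heavy Spark/Cosmos workload here
```

Wrapping the workload in `try`/`finally` matters: if the notebook cell fails mid-run, the container still drops back to its baseline throughput instead of staying scaled up and accruing cost.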
interesting