Sir, can you please show Synapse workspace deployment from dev to prod or UAT using the YAML approach? I am not finding enough resources on the internet.
Hello Sir. Thank you for your help and support.
Do you have any plans for an Azure Stream Analytics tutorial?
I am facing one challenge with the ADF Copy activity UPSERT. I have a generic pipeline where the copy happens based on config, and the column mapping is also dynamic (using JSON). I have an insert_date column in all tables, and I want to set its value only when I am inserting a new record into the table, but there is no way to identify which records are updates vs. inserts.
In Azure SQL DB this was possible, as I set the default value of the insert_date column using the getdate() function and skipped this column in the mapping. But as my target is Synapse, it does not support dynamic expressions in default values.
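The behavior being asked for — stamp insert_date only on new rows and leave it untouched on updates — is essentially merge logic. A minimal sketch of that logic in plain Python (the key and column names here are hypothetical; in a real pipeline the same branching would live in a MERGE or an INSERT/UPDATE pair against a staging table, not in Python):

```python
from datetime import datetime, timezone

def upsert(target: dict, rows: list, key: str) -> None:
    """Upsert rows into target (a dict keyed by the business key).

    New rows get insert_date stamped now; existing rows keep
    their original insert_date, mirroring a MERGE where the
    timestamp is assigned only in the NOT MATCHED branch.
    """
    for row in rows:
        k = row[key]
        if k in target:
            # Update path: preserve the original insert_date.
            preserved = target[k]["insert_date"]
            target[k] = {**row, "insert_date": preserved}
        else:
            # Insert path: stamp insert_date at load time.
            target[k] = {**row, "insert_date": datetime.now(timezone.utc)}
```

One common workaround when the target's default constraints can't carry a function is exactly this: land the data in a staging table, then run a merge statement where insert_date is set only in the insert branch.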
Hello bro I like your videos, can you provide a video link to which you explained 'How to extract data from blob storage to DataBricks' please
ua-cam.com/video/ul4Gqehas0w/v-deo.html — here is the link, and there is no need to extract data from blob storage to Databricks. Just mount the blob storage and you can do many file operations or transformations.
By mounting blob storage from Databricks, can't we access the data stored in blob storage?
@@vasanthasworld2948 By mounting the blob storage you can access the data in Databricks, and if you want to copy those files into DBFS, use the dbutils.fs.cp operation to copy data from blob storage to DBFS.
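Once a container is mounted, it is addressable like an ordinary path (e.g. /mnt/mycontainer/...), so the dbutils.fs.cp step above is conceptually just a file copy from the mount path to a DBFS path. A minimal local stand-in for that copy (the paths are hypothetical placeholders for the mount and DBFS locations; on Databricks you would call dbutils.fs.cp instead):

```python
import shutil
from pathlib import Path

def copy_from_mount(src: str, dst: str) -> None:
    """Stand-in for dbutils.fs.cp on a mounted container.

    A mounted blob container shows up as a normal filesystem path,
    so moving a file into another location is a plain file copy.
    """
    Path(dst).parent.mkdir(parents=True, exist_ok=True)  # ensure target dir
    shutil.copy(src, dst)
```

This only illustrates the shape of the operation; authentication and the mount itself (dbutils.fs.mount with the storage account key or a service principal) have to be set up first in the Databricks workspace.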
Well done! Nice solution for the non-existent trigger: "scan the source directory if new files are added" :D