Thanks, you helped me a lot in my first job as a data engineer! Greetings from Mendoza, Argentina
No worries 😉
Brazilian greetings!! Very good video, my friend. This is the type of content I've been looking for lately. Keep going!!
Thanks, Thiago 👍🏻
Excellent tutorial! Made my job much easier.
Glad it helped!
Thank you very much! It was very helpful!
Thanks 🙏🏻
This is very helpful. Thank you very much
Have you extended this exercise to create partition-based Parquet files inside the table folder?
Hi, I don't intend to go into that topic here. This is a way to bring data into the lake from transactional systems; from there I would ingest it into Delta tables (rough sketch below).
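For anyone who wants to try that next step, here is a minimal sketch of ingesting the landed Parquet files into a Delta table from a Databricks notebook. The storage paths and the CountryRegion partition column are placeholder assumptions, not something from the video:

```python
# Minimal sketch: ingest Parquet files landed in the lake (e.g. by an ADF
# copy activity) into a Delta table. Paths and the partition column are
# hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in Databricks notebooks

raw_path = "abfss://lake@<storage-account>.dfs.core.windows.net/raw/SalesLT.Customer"
delta_path = "abfss://lake@<storage-account>.dfs.core.windows.net/delta/customer"

raw = spark.read.parquet(raw_path)

# partitionBy produces the partition-based folder layout asked about above:
# one subfolder per distinct CountryRegion value inside the table folder.
(raw.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("CountryRegion")
    .save(delta_path))
```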
Thanks a lot for sharing
Welcome 🙏🏻
How do I continue to sync the data?
This example just copies the whole table
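This didn't get an answer in the thread, but a common pattern is a watermark-based incremental load instead of a full copy each run. A rough sketch, assuming the source table has a ModifiedDate column (hypothetical) and the Delta table from the sketch above already exists; the server, database, and credentials are placeholders:

```python
# Sketch of a watermark-based incremental sync: pull only rows changed
# since the last load instead of recopying the whole table.
# ModifiedDate, the paths, and the connection details are all hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

delta_path = "abfss://lake@<storage-account>.dfs.core.windows.net/delta/customer"

# High-water mark = latest ModifiedDate already ingested
last_wm = (spark.read.format("delta").load(delta_path)
                .agg(F.max("ModifiedDate"))
                .first()[0])

# Read only new/changed rows from SQL Server over JDBC
changes = (spark.read.format("jdbc")
                .option("url", "jdbc:sqlserver://<server>.database.windows.net;databaseName=<db>")
                .option("query", f"SELECT * FROM SalesLT.Customer WHERE ModifiedDate > '{last_wm}'")
                .option("user", "<user>")
                .option("password", "<password>")
                .load())

changes.write.format("delta").mode("append").save(delta_path)
```

Note that appending only covers new rows; applying updates in place would need a Delta MERGE keyed on the table's primary key.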
Great tutorial!
Thank you!
Hi,
Which resources did you create under your resource group?
Thanks
I don't remember exactly, but a resource group is just a logical container to organize resources in Azure. Cheers.
Can you give me a good reason why you would need to copy data from a SQL database to Azure Data Lake?
Cool video
A good one: a company uses Databricks as its analytics platform and needs operational data for all its analytics workloads, like data science, machine learning, business reporting, etc. SQL Server is a good OLTP database, but not good for analytics at scale.
@pytalista Alright, thank you.
Thank you so much brother
Welcome 🙏🏻