I hope you find this video useful. If you have learned something or found it helpful, please consider giving it a like.
What topic should I cover in the next end-to-end project video? Please let me know your thoughts in the comments.
Follow my LinkedIn profile to get regular updates:
www.linkedin.com/in/mrk-talkstech/
Thank you!😊😊😊
Please do a project on an Autoloader incremental pipeline with dimensions, facts, quality checks, and phases.
You could do a video on migrating from on-premises to Synapse, for example, loading facts and dims using serverless and Delta tables, using Databricks. Congrats on your videos; they're better than Microsoft's official courses.
Please do a project for a data warehouse.
Instead of all tables, how do we choose specific tables and push only those from the on-prem SQL Server to ADLS?
Hello sir, what would the cluster configuration be for real-time projects?
A senior colleague recommended this video to me and I must say I am beyond grateful. Thanks for compiling such amazing content. A game changer!!!
Thank you so much :)
Your presentation and explanation were excellent, and Microsoft owes you appreciation for it!!
Thank you so much :)
Excellent and well-described end-to-end data pipeline. Waiting for more videos on real-time and incremental batch processing using ADF and Databricks. Thanks
Crystal-clear explanation and amazing content, Mr. K. Looking forward to more hands-on projects.
Thank you for this contribution.
Thank you so much :)
One more end-to-end project, with different scenarios, would be great.
Clear explanation with ADF, BUT one needs to build more real-time projects, as well as ones with scenarios, bro.
Always welcome, brother...
I came here in search of copper, but I found gold. You are an amazing teacher. Thank you ♦🙌
This is the best end-to-end project I have ever seen. I was able to set up a whole data engineering solution and personally implement it in my company based on this video. Please add more end-to-end projects when you get time. Amazing tutor and amazing channel.
Thank you soo much :)
Hey, can you assist me with something? I am stuck badly on something
I have watched the entire video like a movie, without any distraction, and never slept in between. I have seen many videos to learn Azure but was never satisfied. But you have explained everything excellently, in depth, and highlighted the points wherever required without missing a single part. Your dedication in creating such a wonderful tutorial, even during late-night Sundays, is much appreciated. This end-to-end project gives full confidence to work as a Data Engineer in the Azure portal. Hats off to you, Kishore. Semma mass!
Subscribed, liked and shared. Thanks a lot brother!
Thank you so much for the huge compliment :)
Your explanation skills are amazing, man; that's so hard to find. You dumb it down and make things very accessible.
Thank you so much :)
It is a great effort you have made in supporting those who cannot afford to spend money on courses. This channel may become very successful in the coming days. The content you provide here is not taught even in paid courses. Please stay motivated to continue teaching this way. We are here to support you in whatever way we can.
Thank you so much for the support :) Sure, will continue making the videos :)
Wow, what an amazing demonstration of all the cloud products, with hands-on labs!!
Thanks
Thank you so much :)
This is great content! Thank you so much for sharing. Looking forward to watching the next end-to-end project videos.
Thank you so much :)
Watched the tutorial in one go. When I decided to become an Azure data engineer and tried to watch your video, it looked difficult to me because Azure has so many services, with all that authentication. Then I practiced ADF, Data Lake, Databricks, and Power BI for months, took the DP-203 and PL-300 exams, and made some small projects; then I understood this project. For a beginner this might be difficult.
I wouldn't be surprised if this channel hits a million subscribers soon!!! Excellent explanations and extraordinary presentations 👏👏. Will continue to watch every video on this channel... love the structured way of going through the data-related topics 👌
Thank you so much for the support and the biggest compliment :)
Excellent video with great explanations - thanks so much!
Thank you so much :)
Mr. K, this is top-notch content! But I have been avoiding Azure for 5 years and I think I should avoid it more. Why do they make the process so repetitive? I am a fan of AWS and I can accomplish the same pipeline in a few lines of code with some Python knowledge. But your tutoring is exceptional. I've subscribed!
Thank you for the wonderful explanation. Please make a video on an MLOps pipeline with Azure DevOps.
Thank you so much :) Sure, I'll do that :)
Although it's a good playlist, you should have started the video by creating the resources from scratch. As a beginner, I faced challenges building the resources with the specific config.
Greatness! Thanks very much, I have learnt a lot.
Very nice, very useful, thank you.
Do it incrementally: coming from the transient layer into bronze, do an append, then take only the updated data to silver, without duplicates.
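A rough sketch of that idea; this is plain Python standing in for the equivalent PySpark calls (`orderBy(...).dropDuplicates([key])`), and the column names are hypothetical, not from the video:

```python
# Keep only the newest row per business key after appending a batch,
# i.e. the "take only updated data to silver, without duplicates" step.
# Plain-Python stand-in for df.orderBy(ts, ascending=False)
#                             .dropDuplicates([key]) in PySpark.
# "CustomerID" and "ModifiedDate" are placeholder column names.
def dedupe_latest(rows, key="CustomerID", ts="ModifiedDate"):
    latest = {}
    for r in sorted(rows, key=lambda r: r[ts], reverse=True):
        latest.setdefault(r[key], r)  # first seen = newest per key
    return list(latest.values())
```

In a real pipeline you would do this on the Spark DataFrame itself (or use a Delta `MERGE`) rather than collecting rows to the driver.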
Hi, this is a very useful project. However, I would like to ask how you would do it for real-time streaming pipelines, when data updates occur.
Thank you Mr. K.. 😀
Thanks, you're amazing!
Thank you so much :)
What tools did you use to make these tutorials? Everything is excellent! Kindly share the tools and technologies you used. The tutorial is par excellence and easy to understand!
Thank you so much :)
Thank you, sir. We would like to see the Python pipeline too.
Sure :)
I am following this tutorial on an M1 MacBook Pro. Please provide any advice, as I can't seem to install the ADF integration runtime. I am currently using Azure Data Studio with Docker. I tried to install SSMS on UTM and Parallels, but they all failed.
Thanks!
Thank you so much :)
This channel is definitely my go-to place for end-to-end projects. Absolutely love your content, sir. Keep it coming. Thank you 😊
Thank you so much :)
This is the video I have been searching for a long time to kick off my Databricks journey, and now, Mr. K, you have offered it to the public for free. Thank you very much :) Please keep it up.
Folks, make sure your SQL Server allows SQL (standard) authentication, not only Windows authentication.
Yeah well said.
I was unable to connect, so I used an Azure SQL database instead.
How did you do that, @Armwrestling_Invasion?
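For anyone stuck on the authentication tip above, here is a sketch of what a SQL-authentication connection string looks like compared to Windows authentication. Every name below (driver version, server, database, login) is a placeholder, not something from the video:

```python
# Hypothetical ODBC connection string using SQL (standard) authentication.
# Substitute your own server, database, and login; the driver name assumes
# "ODBC Driver 18 for SQL Server" is installed.
def sql_auth_conn_str(server: str, database: str, user: str, password: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        f"UID={user};PWD={password};"           # SQL authentication
        "Encrypt=yes;TrustServerCertificate=yes;"
    )

# Windows authentication would instead use Trusted_Connection=yes
# in place of the UID/PWD pair.
```

If the linked service still fails with a string like this, double-check that the SQL Server instance itself is set to mixed-mode authentication.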
Please bring some more end-to-end projects on Azure.
I just want to say THANK YOU VERY MUCH, MR. K. I was able to complete the project end to end and learned so much from your video. I would really like you to make more projects with this clarity. Thank you, General! 🍻🍻
Crisp, clear, no-nonsense, dense, structured. Definitely subscribed.
Thank you so much :)
Hi, great tutorial and indeed good learning for starters like me. Could you also please make an end-to-end Azure data engineering real-time project with a continuous data stream and readily available big data (so that we can download it from your link)? It would be of great help to us.
Great video! Your explanation of End-to-end Azure Data Engineering was clear, simple and easy to follow. I appreciate your expertise in breaking it down for us. God bless!!
Thank you so much :)
This is the first time I've encountered a comprehensive and practical explanation from start to finish. Thank you very much for all your efforts.
Thank you so much :)
The BEST tutorial about data engineering that I have found. I'm a data analyst, but in my company I'm having to do a LOT of data engineering work, and I was totally lost till I found this tutorial. THANK you for helping me.
Bro, same situation for me as well.
But does the company give us data engineer experience, and is data analyst experience worth anything for a data engineer role?
What a powerful first 10 minutes... Looking forward to following along! I just do SQL DB to Power BI, but I want to push the company forward, and this is just the knowledge I need to begin :)
Thank you soo much :)
The BEST tutorial about data engineering that I have found. You have explained it perfectly; your video was the easiest way to follow along and learn the project. The best video on an end-to-end project.
Bro, you should have shown how to create those resources too. Can you create a separate video showing how to create these different resources?
I already have videos for Databricks and ADF; please watch them in my ADF and Databricks playlists :)
Mr. K, you are a gem, man. Your way of explaining things is superb. Thanks a lot for this free content.
Thank you so much :)
Can I become a data engineer in 3 hours? Is that bullshit, or serious?
Love your vids, man!
Is there any difference between this video and your Udemy video?
I'm about to purchase it from Udemy but don't know if they are the same thing.
I hope he replies to you. I have the exact same question.
It's the same one with some added information and resources; I have the course on Udemy for wider reach :)
How awesome is this video! I spent probably 20 hours watching, rewatching, and redoing certain parts on my own :D I learned so much, thank you!
Amazing video, @Kishore. You have explained it perfectly; your video was the easiest way to follow along and learn the project. The best video on an end-to-end project. Hoping for more videos to come; thanks again, man, keep shining.
Mr. K sirji, I went through the first part of your video: environment setup. You mentioned that after going through it end to end, one can have this architecture on one's resume.
I was surprised that basics like Linked Services and Datasets were just created on the fly. These are extremely important components of ADF, and no concept was provided. I am lucky that I had previously gone through certain ADF data movement tutorials.
All essential components were already created, and the creation of those services forms the basis of any tutorial.
You create a SHIR service for an on-premises SQL Server, but the most important thing is the configuration of the security lists/groups (virtual cloud network firewall).
Anyway, the presentation is good, the voice volume is optimal, and the speed is also okay.
Namaste 🙏
It's a very good skill to have as a Data Engineer :)
Brilliant. Clearly explained and well structured.
Thank you so much :)
Best end-to-end project I have ever seen, with real-time explanation. Nice explanation and great content; really appreciate your efforts.
Hi, thank you so much for the amazing project. I was wondering: would the process be the same if the source were JSON files instead of SQL Server? I need a connection from a JSON file API to the Azure cloud and finally to Power BI. Please let me know. Thank you so much. Regards
Great session 👍.
I just want to know: can we do the same activity with just Azure Synapse, since, as you said, it can do both ingestion and transformation?
Please make a Part 2 of this with Azure Synapse only, if possible.
For anyone on M1+ Macs: the database setup for AdventureWorks is not possible due to SHIR limitations (the self-hosted integration runtime is only available on Windows). You cannot install SSMS with SQL Server. As an alternative to SSMS, you can use a dockerized version of Azure SQL Edge and install Azure Data Studio, which connects to your localhost (the dockerized Edge container). This way you can query/see the tables on an M1 Mac. HOWEVER, you cannot connect to Azure Data Factory with SHIR in the step that gets the data from your 'on-premise' (laptop) database to ADF, since SHIR is not installable on ARM Macs. I simply created a SQL Server + database in Azure itself and used that as a reference to continue the tutorial. Good luck!
Thanks for this, appreciate it :)
I am also using an M1 Mac. If we go with this, we can directly use the default runtime instead of the self-hosted runtime, correct? As both resources are in the Azure cloud.
Hello Mr. K. As I follow this one, I have some problems connecting to my on-premises database. In your tutorial, you successfully connected to on-premises, but I'm having a hard time: on the IR installed on my local machine, under Diagnostics, I tried my SQL Server settings and it succeeded, but in the Azure portal > ADF linked service, it still can't connect.
Thank you so much for this video. I'm following it, but I cannot find the resource files that you previously created. Can you please tell me where I can find them?
Awesome teaching!! Magnificent!!
I have one doubt: if the source is a CSV or an Oracle DB (instead of the SQL Server source DB), can we do all of this using ADF?
Can this project be done in a Fabric environment in a much simpler way, without as many configuration setups?
Please let me know if you have a video for that.
At 32:28 I am not getting the same page; it says:
"Access policies not available. The access configuration for this key vault is set to role-based access control. To add or manage your access policies, go to the Access control (IAM) page."
What should I do? Please help.
Wonderful explanation. Best on YouTube.
Just a small doubt: you can do the entire ETL in Azure Synapse, so why did you create separate ADF and ADB resources when Azure Synapse alone contains all of it?
Thank you :) Yes, using Synapse we can do the complete ETL process, but I wanted to include as many resources as possible in this architecture, to help people understand how each resource is used in Azure, along with its integrations :)
For ETL, ADF and Synapse are code-free platforms, but you won't get everything for big data; it's good to work on the Databricks platform, since a Spark cluster is the best option.
Good job, Mr. K. I would like to ask: will it be possible to replicate this end-to-end project within the Azure 30-day free trial subscription? Thanks
A query regarding SHIR: after installation of the integration runtime on the local machine, how does it connect to the SQL Server? Does it detect it automatically, or is there any connection setting we need to do?
My ForEach activity does not succeed (it stays in progress), but the copy activity inside it succeeded... why did this happen 😢?
Thanks for this end-to-end project.
Can you also create a video on incremental data loading across all the systems?
What is the need for Synapse? Can't we connect Power BI directly to the gold container in the data lake?
Can someone advise where we can get the input datasets? I mean all the required input tables in SSMS.
Can you provide a code link/GitHub link for the same?
Hats off to your patience and the detailed explanation of the end-to-end pipeline; even paid courses will not give you this level of explanation. Grateful and thankful for this amazing end-to-end project video.
Hello sir, from which repository did you pick the sales data? Can you please share the link to the dataset?
My SQL Server connection failed at the source creation level (timestamp 31:44); can someone help?
Can I add this to my resume and present it as my project at a company, as I am moving my career from Python developer to Azure data engineer? If I can, then how many years of experience can a candidate claim when adding this?
If anyone is getting the "Access policies not available" problem: go to the left pane of the Azure Key Vault and search for "Access configuration". There you will see the permission model; tick "Vault access policy" and apply it. After you refresh the Access policies page, the problem will be solved.
In the real world, how many days/months/man-hours does this project take?
Very well crafted; the best possible way to create an end-to-end project for someone who is a beginner to Azure.
A very good job. Keep going, buddy 🎉
Thank you so much :)
Amazing work, appreciate the help. Just one question: can we use Azure Databricks instead of Synapse to avoid vendor lock-in?
How do you create this kind of video? Please share the tools used to build this type of video.
How much does it cost to do this on the pay-as-you-go model in Azure, for practice?
When you store data from bronze to silver, is there any way to apply a naming convention? I know how to do it for one file, but you saved using a loop; how do we achieve this?
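One common approach (just a sketch; the storage URL, schema, and table names below are hypothetical) is to build each output path from the schema and table name the loop iterates over, e.g. silver/&lt;schema&gt;/&lt;table&gt;:

```python
# Build one output folder per table so a ForEach-style loop writes to
# predictable, human-readable paths. All names below are placeholders.
def silver_path(base_url: str, schema: str, table: str) -> str:
    return f"{base_url}/silver/{schema}/{table}"

tables = [("SalesLT", "Customer"), ("SalesLT", "Product")]
paths = [silver_path("abfss://data@mystorage.dfs.core.windows.net", s, t)
         for s, t in tables]
# e.g. abfss://data@mystorage.dfs.core.windows.net/silver/SalesLT/Customer
```

The same idea works inside the notebook loop: derive the save path from the loop variable instead of hard-coding one file name.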
Can you add a video on how you created the resource groups?
Can someone tell me: is this the day-to-day routine of a data engineer?
Please come up with more videos. Thanks a lot for your effort.
May I know how you created the resource group used in this video?
Thank you, Mr. K. I'm a project manager with a not-so-technical background, and I look after these kinds of projects. Your explanation gave me a perfect functional understanding of the flow and made me aware of data engineering projects end to end. Thank you!
Thank you so much :)
Awesome project and really well explained; I learned a lot. I built the same as the video progressed, although I used a more comprehensive database with multiple schemas. That forced me to learn how to create the pipelines slightly differently. Really enjoyed every minute of the video. I like the end-to-end project approach: you learn so much more, and it's closer to a real-world data flow.
Thank you so much :)
At 14:27, where did you get that script? Please help me out.
Once data is loaded to the bronze layer, what about incremental load?
Great video! Your explanation of end-to-end Azure data engineering was clear and easy to follow. I appreciate your expertise in breaking it down for us.
Thank you so much :)
Hi @Mr. K, if possible please make a video on CI/CD.
Sure, will be uploaded soon :)
Hi, is this a merged version of the project playlist?
Appreciate it Mr. K ! Thank you ❤
Hi @Mr. K Talks Tech, could you please make a project on Delta Live Tables?
Sure :)
Amazing content. Thank you so much for your efforts!
I have a question about the tech stack that you chose. I see a lot of different combinations of Azure-native tools at companies: some companies only use ADF and Databricks (or even only Databricks); others, like this demo, use ADF, Data Lake Gen2, Synapse Analytics, and Databricks. Could you please explain why you chose those tools for the demo? Because, to me, just ADF for ingestion and Databricks for ETL would also work for this end-to-end, right? Or did you choose those just for training purposes? It would be very much appreciated! Thank you.
Please make more end-to-end real-time projects like this using Azure. The video is so informative, and I have learnt so many things.
Thank you so much, Sure :)
Hi bro, I have never seen a person explain like this. Though your explanation is fast, I was able to grasp the valuable information. A small request from my side: by any chance, can you please explain how to do the ETL validations here, like count and duplicate checks, column mapping, and the business logic of records in source and target?
Thank you so much :) Sure, I'll do that in the future :)
Does it copy all the data from SSMS each time, and will it overwrite in ADLS Gen2?
Yes, that's right; I haven't implemented an incremental load in this project :)
Please make a video on the same.
@@mr.ktalkstech You can implement an incremental load with the mode option in PySpark: df.write.format("delta").mode("append").save(path) (use "append" instead of "overwrite", so new rows are added rather than replacing the table each run).
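To make the difference between the two modes concrete, here is a minimal sketch. It assumes PySpark and Delta Lake are available at runtime; `write_table` and the path are hypothetical names, not from the video:

```python
# Pick the Delta write mode per run:
# "overwrite" replaces the whole table every run (what the video does);
# "append" only adds the new rows, which is the basis of an incremental load.
# Assumes pyspark + Delta Lake at runtime; df is a Spark DataFrame.
def write_table(df, path: str, incremental: bool = False) -> None:
    mode = "append" if incremental else "overwrite"
    df.write.format("delta").mode(mode).save(path)
```

A real incremental load would usually also filter the batch by a watermark column, or use a Delta MERGE, to avoid the duplicates mentioned in other comments here.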
Can anyone create a resume based on this project?
Can you please provide the notebooks?
This video is very helpful: good hands-on experience with Azure services. The way you explained the step-by-step process is very interesting. Great content.
Thank you so much :)
Thank you for the video, Mr. K. I just wonder: what made you use an ADF pipeline rather than a DLT one?
This is one way to do it. Yeah, of course, we can use DLT as well; I'll cover that in another end-to-end project :)
@@mr.ktalkstech Thank you so much for your reply. Your videos are amazing. Well done, sir! I am looking forward to watching the DLT pipeline end to end, and the rest of your videos.
Hello Mr. K, thanks for this informative video. However, I am facing some connectivity issues at the beginning, when I try to create a new linked service to connect the database. I have tried several options and couldn't succeed. Can you please help?