Azure Synapse | Azure Synapse Analytics [Full Course] ☁️
- Published Oct 6, 2021
- Azure Synapse Analytics is changing the way we work with data services in Azure. The Azure Synapse Analytics (ASA) workspace combines the core technologies required for data warehousing, big data analytics, and data science.
In this Learn with the Nerds event, Mitchell Pearson will teach you how you can use Synapse Analytics to solve the paradox of analytics with a unified analytical platform where your data analysts, data engineers and data scientists can all perform essential tasks in the same workspace. Mitchell will discuss Dedicated and Serverless SQL Pools, integration with Azure Data Lake, performing ETL with pipelines and data flows, Synapse notebooks and finally the built-in Power BI integration capabilities! So come and join Mitchell as you explore the many features which make Synapse Analytics the industry-leading analytics platform.
What You'll Learn:
✔️ SQL Pools
✔️ Integration with Azure Data Lake
✔️ Perform ETL with pipelines and data flows
✔️ Synapse notebooks
✔️ Built-in Power BI integration capabilities
After you finish this course:
📄 If you have completed this class and would like a certificate, fill out this form - share.hsforms.com/1Pb8VndxSTd...
👉 Download The Nerd's Guide to Azure Synapse for FREE!
tinyurl.com/ytpbkye6
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Next step on your journey:
👉 On-Demand Learning - Start With The FREE Community Plan: prag.works/lwtn-trial
🔗Pragmatic Works On-Demand Learning Packages: pragmaticworks.com/pricing/
🔗Pragmatic Works Boot Camps: pragmaticworks.com/boot-camps/
🔗Pragmatic Works Hackathons: pragmaticworks.com/private-tr...
🔗Pragmatic Works Virtual Mentoring: pragmaticworks.com/virtual-me...
🔗Pragmatic Works Enterprise Private Training: pragmaticworks.com/private-tr...
🔗Pragmatic Works Blog: blog.pragmaticworks.com/
Let's connect:
✔️Twitter: / pragmaticworks
✔️Facebook: / pragmaticworks
✔️Instagram: / pragmatic.works
✔️LinkedIn: / pragmaticworks
✔️YouTube: / pragmaticworks
Pragmatic Works
7175 Hwy 17, Suite 2 Fleming Island, FL 32003
Phone: (904) 638-5743
Email: training@pragmaticworks.com
#PragmaticWorks #MitchellPearson #Training #Microsoft #Azure #AzureSynapse #Cloud #LearnWithTheNerds
**Any sales mentioned in the video are no longer valid. Offers are subject to change with or without notice and are for a limited time only.
~-~~-~~~-~~-~
Please watch: "(275) Free Pragmatic Works Power Apps Component Library "
• Free Power Apps Compon...
~-~~-~~~-~~-~
30 minutes into the course and I am already happy with this presentation. Thank you
Glad you enjoyed! Thanks for watching!
This is absolutely a gem of a session and helps bring a lot of clarity about ASA. Huge thanks for sharing it! 🙌
Glad it was helpful!
2:03:33 I had the same issue, but instead of allowing all IPs in Synapse, I went to the SQL database and, under the "Set server firewall" option, made sure "Allow Azure services and resources to access this server" was set to true. I'm sure you already knew this, but I also know that in the midst of a live demo we are all quick to do what comes to mind first. ;)
Gifted, humble, and generous giver. Thanks!
Thanks for watching!
Amazing learning material for a data engineer / scientist. Highly recommended
What a great course - thank you, it was very instructive and covered a lot of ASA topics !
Glad you enjoyed it!
Huge thanks for this explanation; it really helped me understand the ASA concepts alongside ADF.
WoW! This video is heavily enriched with information. It took me 3 days to finish it up! :)
This was great, thank you so much Mitchell Pearson and Matt!
Glad you enjoyed it!
Mitch is an amazing instructor!
Another great video series, great work!
Brilliant workshop 👍 live debugging made it even more useful 👏
Glad it helped!
Brilliant tutorial, comprehensive and well explained thanks you!
Glad you enjoyed it!
Great stuff, thank you! One question: how can we do a real-time sync of external tables with operational application data, using event handlers like Azure Functions or webhooks? I want to push real-time data from my daily operational data stores to external tables.
What a great video!! Thanks for making it
Wonderful session, thanks for putting this together and presenting it so nicely!
Thank you! Glad you enjoyed it and thanks for watching! -Mitchell Pearson
Thank you so much for this video. This was very helpful for me to get an understanding of how to use ASA.
Glad you enjoyed! Thanks!
Mitch, you are the best!!! Another great video, thank you!!
This is one of the best training videos that I have seen on YouTube.
The complete restructure of Data Lake using only one line of code was crazy!
Glad you enjoyed!
2:03:00 Oh no! You practically disabled the firewall? That hurt my feelings very badly. As an Infrastructure Administrator Associate and an aspiring Solution Architect Expert, I am terribly broken-hearted now. I am crying now 😂
Love your camera smile at the very beginning 😚
It's awesome! Thank you!
Awesome sir !! Keep up the great work 👍
Amazing training session. Thank you
This was great. Thank you for sharing this.
Glad it was helpful!
Learned so much, thanks. Very well explained.
Fantastic communication ! Thank you for that . I will start to follow your videos. All the best .
Amazing. Thanks for this.
Great session, Thank you!
Glad you enjoyed it!
Thanks, sir. Your video gives a comprehensive overview of ASA.
You are most welcome!
Very concise presentation. Quick question: when you use Spark on Synapse, what metastore does Spark use to create tables?
Great presentation....really loved it
Smashed it out of the park again, Mitchell!
I don't comment usually but man, that was very helpful!
Great work !!!!
This was enjoyable to watch. Thanks!
Glad to hear it!
Best video on synapse
Mitchell is a great great trainer.🤗🤗
Thank you! Glad that you enjoyed! :)
Such valuable information online, free of cost. Thank you!
Glad you enjoyed it!
What a great video to watch. I feel like I need to pay.
Thanks for watching!
Great video. Do companies also use the Synapse serverless pool for data warehousing, with external tables and so on, or would you rather use a SQL database or a dedicated pool? I work mostly with small companies that have at most 1.5 TB of data, mostly structured. And how about incremental loads? In every example you did, it's a full load. Can you also do incremental loads to .parquet files, for example? If so, how? If you have the time, I'd love to hear your story.
Excellent event! Thank you for sharing.
Glad you enjoyed it!
I'm new to Azure. I find the video very helpful. I have a doubt regarding my project.
I want to do analysis on streaming data using PySpark and pandas (for SQL we use Stream Analytics). HDInsight is one of the options, but it costs per cluster. Is it possible to achieve this using either Databricks or Synapse Analytics?
Thanks so much
Great video.
Good video thanks
At 1:57:00 you use a pipeline to copy data. It might be my personal preference, but I hate using GUIs when I could just write the COPY command in Synapse and execute it directly, so why is this approach preferred?
Excellent Workshop
Glad you enjoyed it!
Great video! Where can I find the files used in the demo?
Excellent presentation. The guy is a born trainer.
Thank you!
Hey, thank you for this course. Could you tell me if the files are available for download?
We have two dedicated SQL instances: one that actually controls ADF and one for the SQL objects.
Is there any way to connect two dedicated SQL instances, like what we have in SQL Server: linked servers, replication, Always On, etc.?
When should Data Factory be used, since Synapse can do ETL too?
This was so good🙌
Thanks for watching!
Does the serverless pool equate to the main "auto-scaling" feature of Snowflake?
Excellent videos, sir. Please share some full Windows Server 2016/2019 videos.
Super helpful - Thank you so much !! #StayBlessednHappy
Thank you! Glad you enjoyed!
Thanks
Is it possible to live-sync Azure Synapse to a SQL DB for SSRS reporting, or is there any alternative? There are 200+ SSRS reports running using the SQL DB data export from Dynamics 365. Can anyone help with a solution?
Thanks for the video. A couple of humble suggestions. You kept saying "we are gonna talk about that…" and "I am gonna talk about that…" halfway through the video. Set the agenda briefly at the beginning and then talk about what you need to talk about. Another one: you assume all of your audience is skilled with the Azure product set; "you see, this is similar to that…" was said many times.
Anyway, good video.
It's very nice! Unfortunately the dataset isn't shared, so there's no way to actually practice. I checked on Microsoft's site and there's a similar dataset, but it is broken and doesn't contain a single Parquet file.
Did you end up finding the dataset? I just found this video; the explanation is pretty good, but I too wanted to work along while watching it.
I already have an On-Demand subscription, but I do not see any course on Synapse there.
Great session! Where can we get the deck?
Great
It almost seems easier (as you explained) to use Spark to transform and manipulate the DataFrames rather than creating a pipeline that would consist of several steps.
We are a Tableau shop. How does Power BI compare to Tableau? Can we use the combination of Synapse with Tableau?
Thanks
Hi @pragmatic Works,
The demo given here is great. I am new to Synapse; can you please help me import the schema from CDM manifest files and combine it with CSV to copy data? I am trying to import data from a data lake where the data was exported from D365FO, but it contains headerless CSV files and manifest.cdm files that contain the schema. I want to create a view or external table in a serverless SQL pool.
Very useful..Thanks.
Glad it was helpful!
Great video. Is it possible to share the sample files (Taxi data & Internet Sales) ?
Hello, thanks for this amazing video. I'd love to implement this on my own. Can I get a link to the files used for this? I'd appreciate your response, thanks.
Do you have a DevOps course for getting a Microsoft certification?
1:15:35 How much data has been read from the data lake? All of it?
Thanks Mitchell for this brilliant session.
Will one be able to run 1000+ reports/queries at the same time against dedicated SQL pools? I ask because I believe there is a limitation on concurrency.
And is there such a limitation on the serverless pool?
Also, you mentioned that one doesn't have to provision a SQL pool if concurrency is an issue or the requirements don't align with Synapse. I am a bit confused about MPP and concurrency; I would have thought massively parallel processing means one can run many queries at a time. Could you please help me understand this better?
is there any way to get the slides?
Great stuff. And I have the following question: around 1:15:00 you create an external table so that there is no need to use the OPENROWSET syntax. If I want to still benefit from partitioning, would I need to create an external table per partition?
I guess yes, you still benefit from partitioning, since the `Select Top 100 * From TaxiData` query is simply "syntax sugar" on top of the "real query" that uses OPENROWSET behind the scenes with partitioning applied. Remember he mentioned that the TaxiData external table does not store any "real data" besides metadata. So I can write `Select Top 100 * From TaxiData` and the engine will translate this "syntax sugar" into the real OPENROWSET query and only scan my month-1 partition. I hope I got it right? 😊
Subscribed...
So I want to work along with him while watching the video. Where can I get the training data that he already had loaded in his Synapse Studio?
The actual Synapse demo/walkthrough starts at 24:00
Hi, the link to the bootcamp seems to be invalid
Is Azure Synapse just a way to say Azure with all its services or is it some third party way of interacting with Azure services?
that opening 5s! xD
Hi @pragmatic works, the session is great. I have one doubt: when you are partitioning the data into separate months, does this mean you are also creating a copy of it? For example, if the data in training was 100 MB and you partitioned it by month into training_output, is the 100 MB being divided and copied, so that you now have 200 MB of data?
Thanks, got my answer later in the video.
Good
If you're using Azure Synapse, do you need Azure Data Factory?
Where can I find the "holiday.snappy.parquet" file, please?
Do you offer live trainings?
24:10 Azure Synapse Analytics demo
As a developer, I find that 'cost' is creeping in at all levels! It is such a shame that MS now has us by the nuts! We are developers, NOT accountants!
Oh God, you don't share your training scripts anywhere? How can a student really practice them?
Without that I am not going to watch further.
Where's the course on virtual networks mentioned at 31:02?
A [Full Course] without CI/CD ??? 😕