Hey Fabricators! Are you excited for this series?! Let me know what you think of the plan, and comment below if there is anything else you want me to cover!
Also join our free Fabric learning community with 70+ members and daily conversations about Fabric: skool.com/microsoft-fabric
Thanks for watching and for your ongoing support!
🙌😊
Excellent introduction. YES! EXCITED FOR THIS SERIES! The plan you've described sounds spot on. Thanks for all you do!
@@michaelfarhat3447 Thanks for following along!
Hi Will, I have everything on my Power BI service - semantic models, dataflows, reports - that I want to migrate to Fabric. Is there any video of yours that talks about this? Thank you!
I'm glad that I came across this video. Looking forward to going through each of the videos to learn how to use Fabric.
Thanks for watching!
This is some great content you're producing, Will. I am currently working to migrate multiple clients from Power BI / SQL Server to Fabric, and since Fabric is a huge platform, your hands-on videos are really helpful. Thanks a lot!
Ah very nice, I'm sure a lot of the stuff I'll be walking through in this series will be very relevant to you then. Good luck with the migration - would love to hear how that goes - feel free to ping me a message in our Skool community www.skool.com/microsoft-fabric 🙌
Amazing - the perfect series for what I’ve been looking for.
That's good to hear! Three videos so far, should be around 8 in total, next one coming soon :) great to have you here!
Looking forward to watching the full series!
Next video coming soon, thanks for watching!!
Thank you so much! You talk and explain so well it’s a delight watching. Really looking forward to the rest of the videos!
Thanks for your kind words! More videos coming very soon 🙌
Thank you Will, I am really looking forward to the series. I would like to highlight a fourth benefit (even though you are somewhat implying it) of the Fabric-centric approach, and that is data reusability (you highlighted this in your previous video, "Fundamentals of Microsoft Fabric in 38 minutes"). Even within a single department, it lets you reuse a common measure or aggregation that was previously redefined and repeated across many PBI reports, with everything now coming from (and maintained in) a single place.
Hey Ricky - thanks for your insightful comment - really good point about reusability of aggregations/ measures, I'll make sure to touch on this at the relevant point in the series. And thanks for your contributions to the community - your input helped shape this series - looking forward to posting the next vid soon 🙌
Greatest content for Fabric on YouTube. Thanks a lot for sharing!
Appreciate your comment - thanks!
Really interested in this series.
Here you go, only one video to go (to be released this week) 🙌 ua-cam.com/play/PLug2zSFKZmV3eee0W2PJU8XNJbu1dn3-P.html
Hi Will, this is a wonderful start.
So basically, Fabric has Data Engineer-friendly, code-oriented features like notebooks, the SemPy library, and the Great Expectations library for data quality, as well as Data Analyst-friendly features like Dataflows (unlike coder-only platforms like Databricks and dbt, and no-code or low-code tools like Talend, Azure Data Factory, Informatica, etc.).
It would be great if you could cover when to choose dataflows and when to choose notebooks, and similarly for data quality, when to choose a code-oriented integration with Great Expectations, etc.
Thanks! Yes, you've got it right :) One of the key 'benefits' of Fabric is that they are trying hard to make data analytics more accessible to people who can't code: Data Wrangler, visual query builders, Copilot integration throughout, Dataflows/Data Pipelines (mostly no or low code), Data Activator is fully no code, etc.
@@LearnMicrosoftFabric so when shall we use notebooks and when shall we use dataflows? Any rules of thumb for that?
@@chinmaykajalwa that's what I'll be answering in this series! In the meantime, this piece of documentation might help you: learn.microsoft.com/en-us/fabric/get-started/decision-guide-pipeline-dataflow-spark
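To make the notebook-vs-dataflow question a bit more concrete before that video lands, here is a minimal sketch of the kind of code-first data-quality check you might run in a Fabric notebook. It assumes the classic (pre-1.0) great_expectations pandas API, and the table and column names are made-up examples, not anything from the video:

```python
# Minimal sketch (hedged): a code-first data-quality check in a Fabric notebook,
# using pandas plus the classic great_expectations API. Table/column names are
# hypothetical examples.
import pandas as pd
import great_expectations as ge

# In a Fabric notebook you would typically read from an attached lakehouse,
# e.g. spark.read.table("sales_orders").toPandas(); a tiny stand-in DataFrame
# keeps this sketch self-contained.
df = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "order_amount": [250.0, 99.5, 410.0],
})

gdf = ge.from_pandas(df)  # wrap the DataFrame with expectation methods

# Declare the quality rules you want to enforce
checks = [
    gdf.expect_column_values_to_not_be_null("order_id"),
    gdf.expect_column_values_to_be_between("order_amount", min_value=0),
]

# Fail fast if any expectation is not met
if not all(c.success for c in checks):
    raise ValueError("Data quality checks failed - investigate before loading downstream")
```

Rules like these are easy to parameterise, test and version-control, which is usually when a notebook wins; if the transformation is simple and the people maintaining it prefer a visual editor, a Dataflow Gen2 is often the better fit.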
@@LearnMicrosoftFabric thank you very much. Looking forward to that session. The above link is also helpful.
Great video Will, as usual! Really looking forward to this series! Keep up the great work! I can see another great Microsoft MVP in the making :)
Next video coming soon - thanks for watching and commenting - really appreciate it!!
Thanks Will. Looking forward to the series.
Thanks for watching! Hopefully we can answer a lot of the big questions that people are asking about Fabric 💪 Next video coming v soon
Thanks for the great content 👍
Keep up the good work 👏
Thanks for watching!!
I really thank you and I'm excited for this series.
Thanks for watching!!
Very good video and thanks for your content! But there is one point where I would disagree. At minute 5:20 you talk about the problems of a Power BI-centered architecture (changing structure, poor data quality, etc.). For me, however, the actual problem would not be the Power BI-centered architecture itself, but the handling of the data source. If you pursue a Power BI-centered approach, you should still use a DWH (or similar) as the data source, where you can be sure that the structure or data types will not suddenly change. The data quality should also be guaranteed, because the data is cleansed and transformed before it is stored in the DWH.
Of course, if you were to retrieve the data directly from operational systems, the problems you mentioned could occur. But obtaining data directly from the operational systems would generally not be a good approach for a solid data analytics solution.
Hey thanks for the comment, I think we're in agreement! Always better to serve your Power BI reports from a structured DB (even better if validated!). It's not always the case though! Hence why there are hundreds of different Power Query connectors in Power BI desktop, lots of people use Power BI as their all-in-one tool for data ingestion, transformation and visualization, which can cause problems.
The benefit of Fabric is that that structured DB can be a Fabric Data Warehouse - part of the same SaaS solution, easily accessible for all - rather than sitting in Azure SQL or on an on-prem server.
@@LearnMicrosoftFabric Yes, we are 😊
Great job! 🙂
Thanks for watching!!
Amazing video. Thank you so much.
Thanks for watching!!
I'm checking whether you have started going through the series where you first start with how your UI should be set up and then explain the difference between a data lake etc. I see that you do have other videos out - do those relate to this?
Hi Allan, thanks for your comment! Yes you're right, I have done some videos in the past about lakehouse/ data warehouse etc, but what I want to do with this series is tailor the videos/messaging especially to people who are moving from Power BI-centric architectures to Fabric-centric architectures, documenting step-by-step all the main decision points you will face as you transition. This video is the first in the series, and I've released the second video also. The next video in the series (on Options for Getting Data Into Fabric) is coming soon 😊
All I have to say is thanks thanks thanks thanks thanks thanks .....
Hahaha thanks to you for watching!
Nice. Thanks
Hey, great series of videos. We currently host a lot of our data on IaaS SQL/SSAS servers and are considering a move to Fabric. I would be very interested in your thoughts on handling tabular models, especially row-level security: where should the data model be maintained, and where should elements like measures and calc groups be maintained?
Hey, difficult to comment on your specific use case, but the logical thing to do would be to use the Fabric Data Warehouse for most of your models and then build Power BI semantic models from that. Where you put your RLS (in the Data Warehouse or in the individual Power BI semantic models) depends on the structure of your org/ who needs access to what data. Good luck!
@@LearnMicrosoftFabric Hi, thanks for the reply. If I could trouble you with a dumb question: can I take a tabular model hosted in SSAS and move it to a PBI Premium workspace as is?
Well, I answered my own question. Here is a video that shows the migration of tabular models from SSAS or AAS to PBI, in case it is of use to others. ua-cam.com/video/dAn9P8ecktw/v-deo.html
Thank you 🙏
Thanks for watching 🙌
I would like to move from using a OneDrive folder source to using Fabric. It's also very difficult to get a straight answer on what the cost implications would be.
Hi Richard, cost estimates in Fabric depend on a few different variables and where you source data from (i.e. OneDrive) is not one of them.
These things affect cost: how much data you are planning on storing, and the intensity of your data processing workloads (i.e. are you ingesting TBs of new data every minute/hour/day/year, are you using the Spark engine a lot for data transformations or for machine learning training). These are resource-intensive activities that will use up more capacity units, so if you're doing a lot of that kind of stuff, you will likely need to buy a higher SKU of Fabric capacity (and pay more).
Hope that helps. The easiest way is to get a Fabric trial capacity and run some sample workloads to understand how many capacity units you will likely need.
You should make a video talking about Fabric cost...
Hey! I did make this one right when Fabric was released, but I could do with updating it as some of the details have now changed ua-cam.com/video/w481BSXk0Bw/v-deo.html
@@LearnMicrosoftFabric thanks man. You have a new subscriber !!
@@sebasantelmo Thanks! Good to have you onboard :)
Hi @Will, can I build a complete Power BI report inside Fabric, as I would do in Power BI Desktop?
Hi, thanks for watching! Technically you can build a Power BI report in Fabric, but it will be fairly basic, as not all the functionality available in Power BI Desktop exists in the web-based report editing app. Although they are adding more and more features to the web version fairly rapidly!
Just to be on the same page: Fabric was created to reduce DE workloads by allowing DAs to become some sort of DEs, by basically having an interface that accesses all Azure products in one place. Is that right?
You're right in that Fabric makes it easier for less technical people to accomplish more and take on more ownership of things like data pipelines and dataflows. But I don't think this negates the need for data engineers (in serious implementations), and I don't think that Fabric was built specifically for that purpose (it's more of a side benefit).
@@LearnMicrosoftFabric Do you still have to for data factory, etc.?
For me, as an advanced BI developer, Fabric is one huge question mark. After 6 years I still have no idea what it is - 10,000 options which have almost the same definitions, etc. Chaos, chaos. Whenever I want to check some option, there is always a huge payment behind it, with all these capacity options which are another huge question mark.