Suggestion: Please do more Fabric videos with a focus on Notebooks and analytics, like common enterprise needs around price prediction, demand forecasting, etc. This will help us Power BI users learn some data science and learn how to use notebooks as well.
Notebooks as in Apache Spark and Databricks? That's a monster to try and learn, especially Scala or Python.
Q: Why would we need Fabric when we can use Azure SQL and grab data from there? Any response is appreciated, thanks
Well, you could of course use SQL, but Fabric is much bigger than that. With SQL you would need to do a lot of ETL work and set up permissions, maybe copy data to different marts, etc. Fabric helps you with that, and you can create shortcuts to reuse your data easily. Not to mention you can create and save DAX queries so they are reusable by anyone connecting to that source.
Do you know how many videos I've found out there that show me how to set up a data warehouse like this? One. This one. Thank you for this!
Patrick, can you please do some videos on how to take weekly raw data files and save them each week with a date stamp to keep historical records? That would be awesome. Basically the same files each week, appended to create a large system of weekly records.
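Not Patrick, but until there's a video: the append half of that is just plain T-SQL once the weekly file has been loaded into a staging table (via a pipeline or Dataflow). A rough sketch, where the table and column names are placeholders I made up:

-- Target table that keeps every weekly snapshot, stamped with its load date
CREATE TABLE dbo.WeeklySnapshot
(
    SnapshotDate DATE           NOT NULL,
    ProductKey   INT            NOT NULL,
    Quantity     INT            NOT NULL,
    Amount       DECIMAL(18, 2) NOT NULL
);

-- Each week: append the freshly loaded raw rows with today's date stamp
INSERT INTO dbo.WeeklySnapshot (SnapshotDate, ProductKey, Quantity, Amount)
SELECT CAST(GETDATE() AS DATE), ProductKey, Quantity, Amount
FROM   stg.WeeklyRaw;

Over time dbo.WeeklySnapshot becomes the large table of weekly records, and you can filter on SnapshotDate to compare weeks.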
Thank you so much for the wonderful videos. They've really helped me a lot, and today I cleared my DP-600 exam🎉
This is actually great with a DWH. But after seeing this, there are a lot of options here, a lot of clicks, and you need to know what you are doing. I was considering getting Fabric for our company, but I think the good thing is to have someone doing the ETL (SQL programming) with Management Studio and preparing all the stuff. Working that way from Power BI (Fabric) is okay, but why complicate things when there is no need to?
I used SSIS for some years but this is next level 😀
Great, unfortunately identity columns are missing for now. Can you do a video on SCD Type 2 dimensions?
Can we also do everything you showed if we have a Microsoft Developer account?
This is great. What I would like to achieve with all these new MS tools is ETL that watches a folder inside OneLake (or SharePoint) and when I add a new flat file (csv) to the folder it parses it and appends parsed rows to OneLake db. Then this data is available in PBI.
This would be an amazing tutorial if you could provide one.
We used a tool called VisualCron to watch folders for files and run ETL into Azure SQL database. Not sure of the compatibility here though.
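The "watch a folder and append" part would normally be handled by a pipeline in Fabric (a trigger on file arrival plus a copy step), and the append itself can be a single COPY INTO statement in the warehouse. A minimal sketch, assuming the CSVs land in an Azure storage path; the account, container, and table names below are placeholders:

-- Append every new CSV in the incoming folder to a landing table
COPY INTO dbo.Landing
FROM 'https://yourstorageaccount.blob.core.windows.net/yourcontainer/incoming/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW  = 2   -- skip the header row
);

From there the landing table (or a view over it) is directly usable in Power BI.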
Q: When you went through loading new data, I only saw 'Enable staging', however, I have 'Staging account connection' with a dropdown. What is needed for this?
Hey Patrick! This is a great video showing an example. I would like to see more stored procedure examples, like building SCDs. Cheers!
I was able to do this without premium capacity, so what is actually needed for the "... so we can use some Fabric items" part at 00:57?
Exciting capabilities!
Q: How often do you update the Producers list credits at the end?
thanks! you are the best, nice summary!!
Hi, I don't see Data Activator in my trial version. How do I show it on my MS Fabric home screen?
Thanks.
Curious to know how to deal with encrypted data in Fabric, and how we can decrypt those values in Power Automate/Apps.
What is the price for this warehouse feature? If I don't need a full-fledged warehouse solution, will it be a cheaper option?
Q: When you add the new schema, it seems like the information_schema is removed. Is this something you do that we don't get to see, or is the information schema automatically removed when you create new schemas, or where can it be found?
Isn't it better to use TRUNCATE instead of DELETE in SQL?
It depends.
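To expand on "it depends" a little: TRUNCATE empties the whole table and is typically much faster, but it can't take a WHERE clause, so DELETE is what you use when only part of the data should go. A tiny illustration (table names invented):

-- Full reload of a staging table: TRUNCATE is usually the cheaper choice
TRUNCATE TABLE stg.Sales;

-- Removing only some rows: DELETE with a predicate is the way
DELETE FROM dbo.Sales
WHERE  SnapshotDate < '2023-01-01';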
Q: Is any of the configuration scriptable, or do you have to go through the GUI?
Hi, good morning! I have to convert existing SQL Server stored procedures to the Fabric environment. My stored procedures contain CURSOR commands, but Fabric doesn't support CURSOR commands. In this case, how do I proceed? Is there any alternative?
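Not an official answer, but the usual approach is to rewrite the row-by-row cursor logic as a set-based statement. A simplified sketch; the tables and the update rule are invented just to show the pattern:

-- Instead of a cursor looping over customers and updating each total one by one,
-- a single UPDATE with a correlated subquery does the same work in one pass
UPDATE dbo.Customer
SET    TotalAmount = ISNULL((SELECT SUM(s.Amount)
                             FROM   dbo.Sales AS s
                             WHERE  s.CustomerKey = dbo.Customer.CustomerKey), 0);

If some procedural steps really can't be expressed as set-based SQL, moving that piece into a notebook is another option.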
It was all fun and games until I needed premium capacity.
Great video, Patrick 🙃
If I look at Datamarts, which you demoed last year, I can see common ideas, such as having everything together, but a very different way of doing things and different audiences (Datamarts for citizen devs and data-mesh oriented, Data Warehouse for the Data Office and IT). Since Datamarts have been in preview since Feb 2022 (2 years ago, wow!) and are close to the Data Warehouse (get data, create measures, run SQL queries, etc.), does Microsoft intend to give up Datamarts in favor of the Fabric Data Warehouse?
Is the data warehouse only available with Premium capacity, or also as part of PPU or Pro? What would be the advantage of using this over and above a data warehouse deployed on my local server?
Isn't Fabric the next version of Synapse? Also, what are your thoughts on Databricks Delta Lake tables? As a seasoned SQL developer, Fabric is very familiar: it's T-SQL and a GUI!
I assume a Data Warehouse would be better than a lakehouse if you are heavy on T-SQL. I am curious, though, whether it would be worth migrating from something like a serverless SQL pool to a data warehouse. Serverless can already read CSV and Parquet files from the Azure storage account. What are the advantages?
I have the very same question! I hope someone can reply to you/me on this. Especially if this has been covered in another video or post. Thanks Daniel!
You are a hero man
Q: What about using an Active Directory setup to limit access to data for all possible users? Is this possible? (Not RLS on the dataset)
Hi, thanks for your video. It seems that Microsoft is preparing its platform so that in the future all DWHs will run in the cloud. So could you show us how to implement SCD2 historization here? Thanks
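While waiting for a video: a bare-bones SCD Type 2 (historization) pattern in T-SQL, assuming the dimension carries ValidFrom/ValidTo/IsCurrent columns. All the names here are placeholders:

-- 1) Close out current rows whose tracked attributes changed in the source
UPDATE dbo.DimCustomer
SET    ValidTo   = CAST(GETDATE() AS DATE),
       IsCurrent = 0
WHERE  IsCurrent = 1
  AND  EXISTS (SELECT 1
               FROM   stg.Customer AS s
               WHERE  s.CustomerId = dbo.DimCustomer.CustomerId
                 AND (s.City <> dbo.DimCustomer.City OR s.Segment <> dbo.DimCustomer.Segment));

-- 2) Insert a new current row for changed and brand-new customers
INSERT INTO dbo.DimCustomer (CustomerId, City, Segment, ValidFrom, ValidTo, IsCurrent)
SELECT s.CustomerId, s.City, s.Segment, CAST(GETDATE() AS DATE), NULL, 1
FROM   stg.Customer AS s
LEFT   JOIN dbo.DimCustomer AS d
       ON d.CustomerId = s.CustomerId AND d.IsCurrent = 1
WHERE  d.CustomerId IS NULL;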
Great video, thanks :)
thx for the fabric vid
Looks like a copy of Snowflake, the style of creating a warehouse.
Great video, this will help me with my thesis. I have a few questions to ask; I wish I had your email so I could ask which data warehouse model you are using or have chosen.