Excellent explanation! This is a very underrated channel!
Good video. Not clear to me what datamart adds beyond existing dataflow functionality.
Hey, really liked the citizen data analyst definition, I`m one of them! Love Microsoft and Power BI!
Glad you liked it!
I see your schema "What is Power BI Datamart" and I have some questions:
1- In this schema it looks like an architecture to follow. We are used to two concepts: the data mart (a subset of data for one or more specific business units) and the data warehouse (data covering all business units). So is this a different concept of a datamart: not a data structure (a DB, DWH, etc.) but a cycle of three components?
2- Who is responsible for the movement from step 1 (dataflow) to step 2 (data warehouse)? Do we use an ETL tool, for example Azure Data Factory?
3- Why do we talk about an Azure data warehouse? Can we not also use SQL Server on the desktop?
So you build the data model first, then the thin report. I like that refresh is automatic.
Great thank you... looking forward to the roadmap.
Does this also support storing on-premises data (through a gateway) in the SQL database that is the output of the datamart? Can it serve as an alternative to Azure Data Factory for basic ETL?
At the moment, access to the underlying Azure SQL Database is limited. Tables are not accessible directly; you can query them through views. So based on the current settings, no, you cannot import data into that database using other methods. However, I think things like this will become possible in the future.
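For anyone wanting to experiment once they have the datamart's SQL endpoint, here is a minimal sketch of what querying through a view looks like. The schema, view, and endpoint names below are hypothetical placeholders, not the real auto-generated names:

```python
# Sketch of querying a datamart's SQL endpoint: since tables are hidden,
# every query targets a view. Names here are hypothetical placeholders.

def build_view_query(schema, view, columns=None, top=None):
    """Compose a simple T-SQL SELECT against a datamart view."""
    cols = ", ".join("[{}]".format(c) for c in columns) if columns else "*"
    top_clause = "TOP ({}) ".format(top) if top else ""
    return "SELECT {}{} FROM [{}].[{}]".format(top_clause, cols, schema, view)

query = build_view_query("model", "Sales", columns=["OrderDate", "Amount"], top=100)
print(query)  # SELECT TOP (100) [OrderDate], [Amount] FROM [model].[Sales]

# Actually running it needs the endpoint shown on the datamart's settings
# page, for example with pyodbc (not executed here):
#
#   import pyodbc
#   conn = pyodbc.connect(
#       "Driver={ODBC Driver 18 for SQL Server};"
#       "Server=<datamart-sql-endpoint>;"   # copied from the service
#       "Authentication=ActiveDirectoryInteractive;"
#   )
#   rows = conn.cursor().execute(query).fetchall()
```

The point is simply that any SELECT you issue has to target a view, since the tables themselves are not exposed.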
Are datamarts import only, or is direct query available? Hybrid datamarts, maybe?
A datamart comes with a dataset. That dataset can be used in a live connection, in DirectQuery as part of a composite model combined with other data, or anything else you want, just like a normal Power BI dataset.
Reza - can you clarify the datamart refresh? Let's say I have 4 dataflows feeding the datamart, and they refresh 4 times per day. If the datamart refreshes automatically, why is there an option to schedule refreshes? Or, if the datamart is scheduled to refresh, does it automatically trigger the dataflows to refresh? I heard or read somewhere that the previous challenge of always having to time the dataset refresh after the dataflows is not an issue with datamarts, but I'm not clear on how or why.
Hi.
A datamart includes a dataflow in it, and the scheduled refresh of the datamart means the scheduled refresh of that dataflow, followed immediately by the dataset.
In your case, you are getting data from four other dataflows. The datamart imports the data from those dataflows using another dataflow (the datamart's own dataflow) into the Azure SQL DB. This refresh is not necessarily triggered when those four other dataflows refresh; if you want to sync them, you still need to do it another way.
The datamart's scheduled refresh only syncs its own dataflow refresh and its own dataset refresh.
In the future, though, we might get something like what you mentioned.
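If you do need to sync the upstream dataflows with the datamart yourself today, one common pattern is to trigger the refreshes in sequence (for example via the Power BI REST API or Power Automate). A rough sketch of just the ordering logic, with the actual refresh call stubbed out and all names illustrative:

```python
# Sketch: refresh upstream dataflows first, then the datamart itself.
# refresh_fn stands in for a real trigger (e.g. a Power BI REST API call
# or a Power Automate flow); here it is just a callable so the ordering
# can be shown on its own.

def refresh_in_order(upstream_dataflows, datamart, refresh_fn):
    """Refresh all upstream dataflows, then the datamart.

    The datamart's scheduled refresh only covers its internal dataflow
    and dataset, so upstream dependencies must be triggered explicitly."""
    completed = []
    for flow in upstream_dataflows:
        refresh_fn(flow)      # e.g. POST .../dataflows/{id}/refreshes
        completed.append(flow)
    refresh_fn(datamart)      # the datamart then re-imports fresh data
    completed.append(datamart)
    return completed

order = refresh_in_order(
    ["Sales DF", "Customers DF", "Products DF", "Dates DF"],
    "Sales Datamart",
    refresh_fn=lambda name: None,  # stub; replace with a real API call
)
print(order)
```

In a production version you would also wait for each upstream refresh to complete (the REST API reports refresh status) before moving on, rather than firing them blindly.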
But these users still need to understand basic relationships, PKs, FKs, one-to-one, one-to-many, etc. Otherwise they will be loading data in and hoping for the best, and further down the road they'll be asking why their report is wrong.
That is right. That is something everyone who is building a BI model should learn: dimensions, fact tables, relationship types, star schema, etc.
I am new to datamarts and dataflows! I created a datamart from a dataflow. I wanted to schedule an incremental refresh but couldn't, because the option is inactive. Any help will be appreciated.
Thank you for sharing about datamarts. Can a user with a Pro license access a workspace created by a user with a Premium license?
Great video and good explanation. Is it right to say the datamart and the dataset are a 1:1 copy from a storage point of view, since the dataset is auto-generated? When it comes to architecture, what are your thoughts: should we recommend the datamart as the way forward and discourage the dataflow?
A datamart is not a replacement for a dataflow. It uses a dataflow and adds some other things to it (such as the Azure SQL DB), so you are still using a dataflow underneath. I even suggest creating layers of dataflows, as I explained in one of my articles.
Since the datamart is still in preview, what are the risks of implementing it as a solution, from an enterprise perspective? Looking for your thoughts @RADACAD
WOW, great content. I have a question: when you mention that a data warehouse is better done in Synapse depending on its size, how can I know whether Power BI is the right tool to create a company data warehouse instead of Azure Synapse? Is there a gigabyte threshold, or a number of rows/columns, for selecting the right tool?
Thanks for sharing this, regards.
Great set of videos, Reza. Thanks for sharing. Do you know if the database will be hidden from Power Automate? For example, if I capture student questionnaires using MS Forms, could I deposit the answers directly into the Azure SQL database instead of filling Excel files for later syncing to the datamart?
The datamart's database comes with some limitations (as of today). You cannot access tables directly; you use views to get data from the tables. So I think not, you probably cannot do that, not today. But I hope this limitation will be lifted so that we can use other ETL tools to feed data into that database.
well explained, thanks Reza!
Thanks Reza. So, does it mean we can do an incremental refresh against a table connected to a datamart in Power BI Desktop?
You don't need to do that in Power BI Desktop. You can do it using the Datamart editor in the Power BI service.
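For context, incremental refresh here works on the same idea as elsewhere in Power BI: the engine filters rows into a rolling window using the RangeStart and RangeEnd parameters. A small sketch of how such a window is computed; the policy length and dates are made up for illustration:

```python
from datetime import date, timedelta

# Sketch: how a "refresh rows from the last N days" incremental-refresh
# policy defines its rolling window. Dates below are illustrative only.

def refresh_window(today, refresh_days):
    """Return the (RangeStart, RangeEnd) pair the engine would filter on."""
    range_end = today
    range_start = today - timedelta(days=refresh_days)
    return range_start, range_end

start, end = refresh_window(date(2022, 6, 15), refresh_days=7)
print(start, end)  # 2022-06-08 2022-06-15
```

Only rows falling inside that window get reloaded on each refresh; older partitions are left untouched, which is what makes incremental refresh cheaper than a full reload.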
Do you think this service will replace Microsoft Analysis Services? Thank you.
I don't think so. Analysis Services powers the dataset part of the datamart, and that will stay as it is.
Excellent.
Reza, I have another question: would we be able to create a datamart, save the tables we need, and then create and save a query for others to reuse? If so, this is pretty much a game changer, because I could keep queries saved for business users to run and get data from.
Yes, that is the New Query part of it. Every query also builds a view back in the Azure SQL DB. That is exactly its purpose, as you said.
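To picture what that means, each saved query roughly corresponds to a view in the underlying Azure SQL DB, something along these lines. This is a hypothetical sketch; the names and the actual SQL the service generates will differ:

```python
# Sketch: a saved datamart query roughly maps to a CREATE VIEW statement
# in the underlying database. Schema and view names are hypothetical.

def query_as_view(view_name, select_sql):
    """Wrap a saved query's SELECT statement as a CREATE VIEW statement."""
    return "CREATE VIEW [model].[{}] AS\n{}".format(view_name, select_sql)

ddl = query_as_view(
    "TopCustomers",
    "SELECT CustomerName, SUM(Amount) AS Total FROM [model].[Sales] "
    "GROUP BY CustomerName",
)
print(ddl)
```

Because the result lives in the database as a view, any tool that can read from the SQL endpoint can reuse the saved query, which is what makes it shareable beyond Power BI.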
Hi Reza, do you know if a datamart can retrieve data from different Power BI tenants? Or do we have to copy the data manually to a datamart on the main tenant?
By "retrieve data from a different tenant", what do you mean exactly? What is your data source in the datamart: dataflows, or other sources? A datamart supports all the data sources that a dataflow does, but if you are getting data from a tenant-specific object, you can have only one tenant per data source connection at a time.
A datamart can be up to 100 GB, and we need a PPU or Premium license to create it. My question is: can I share a report built on that datamart with Pro users?
Yes, you can. Pro users can consume the datamart's dataset content.
Is the Azure database it creates behind the scenes available for other applications to interact with?
Can we run a native query in Power BI against the Azure DB?
Yes
You can use the database in any other application you want, to read data, but not to write into it.
Excellent video!
Thank you very much!
Hi, I can’t see this functionality in my organization’s Premium workspace just yet. When is it expected to be publicly available?
There are a few reasons I can think of. Depending on which Azure region your tenant is based in, the datamart may not yet be available in that region, or its creation might be disabled in the tenant settings.
@RADACAD thanks, I’m on the US East region. I’ll also talk to my PBI admin today and check if the option is disabled.
Thank you!
@RADACAD it just showed up yesterday! Excited to try this.
Perfect