Hi. I really like your videos. You help me learn something new in Power BI every day. Can you answer this question: how can I get a list of all visualizations and the objects used in them? I use DAX Studio, but it can't do this. Can you?
So after I saw this video, I built a master dataset and built all of my reports off the one dataset. SO helpful, thanks for this. But within the last 4 weeks, every report that I have built off a dataset doesn't refresh when I have it open in Power BI Desktop. Is anyone else having this issue since about October 15th onwards?
Q: Is it possible to create a dataset from a dataset? I would like to create different layers of datasets so everyone can use the level of granularity desired.
Thank you Guy in a Cube for this amazing information! I have a question: why is the connection to the dataset slow? I noticed it is really slower than a live connection to the data source. How can we increase the performance of this dataset? Thank you in advance.
Relatively new to PowerBI, and trying to make sure I get how this should work. In order to create datasets, do you create new pbix "reports" that only have the datasets in them and then publish them (organized in a special workspace) so that other reports are able to access those datasets directly?
Hi, I tried to use the data model from another person's report to have one set of data, but it was not possible to add an additional dimension table I needed for my report. How could that be solved?
Thanks Patrick! I want to share with you one problem with splitting report and dataset. If we publish the dataset in one workspace and the report in another workspace, the users need to be at least a contributor in both workspaces to download reports created using the dataset. I reported this issue; they updated the list of limitations and proposed an idea to solve it. This is a limitation and it is quite confusing since, if you give users build access to the dataset, they can create, publish, view and edit reports with this dataset, but they can't download them. Please, would you vote for the idea to solve this problem?
Thanks for sharing Patrick. Honestly, I do not quite get it. Say we have an executive report (20 pages to deliver) which needs both Sales data and EHS data. If we already have a Sales dataset and EHS dataset from separate Workspaces, do we have to create a new dataset and duplicate some of the metrics / calculations?
Patrick - Is there a way to connect Excel data from the desktop to Power BI online? I have an Excel report that is updated every week. What I do now is update the data in Excel, open Power BI Desktop, refresh my report, and then publish the report to the workspace. Instead, I want to connect the Excel report directly to Power BI online. Please provide some tips.
Hi Patrick - great advice and I'm also an advocate for using Power BI hosted datasets! Question though... we have a Workspace and associated App called 'General Reports' and a 2nd Workspace and App called 'Management Reports'. We then have a Power BI hosted dataset which feeds many reports, some in 'General' and some in 'Management'. However, we currently need to deploy this dataset to both Workspaces and have to refresh both many times each day, which increases the load on the data source. Is there a way to share a Power BI dataset between Workspaces?
Hi Mark, yes, it's definitely possible. When using the Power BI Dataset connector as Patrick showed in the Desktop, you can connect to a hosted dataset in Workspace 1 and publish your report to Workspace 2. Power BI will create a link to the dataset in Workspace 2 without duplicating the hosted dataset. Hope it helps :).
I'm almost certain you need to have migrated from v1 workspaces to the new v2 ones, for both the dataset source and any workspace you want to link to. Luckily the workspace upgrade is now publicly available, so check that out if you can't see shared datasets, and ensure it's enabled on your tenant.
But is this still applicable with the advent of composite models? And what about chaining datasets? Seems to me there is also an argument to be made for having subject matter datasets that serve specific audiences within an organization.
1. Create a manual table with one dummy column
2. Create a measure in the table you just created
3. Remove the dummy column
4. Collapse/expand the fields pane and the icon will change to a measure one
On top of this, you can group measures into display folders if you go to the relationship view. It can include subfolders as well.
I have some posts on how to do that using DATATABLE, as well as using Direct Query and non-Direct Query models. First: sqlitybi.com/how-to-create-a-measure-group-table-in-power-bi/ Second: sqlitybi.com/creating-measure-groups-in-power-bi-and-tabular-using-calculated-tables/
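For anyone who wants the calculated-table variant spelled out before reading those posts, here is a minimal sketch in DAX (the table and column names are placeholders, not taken from the posts):

```
-- Calculated table that exists only to hold measures.
-- DATATABLE builds an inline one-row table.
Measure Group =
DATATABLE ( "Dummy", INTEGER, { { 1 } } )
```

Create your measures in this table and hide (or remove) the Dummy column, and the table picks up the measure-group icon described in the steps above.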
"I'm not lazy; I'm efficient!" ~ Thank you for that. It drives me crazy when people say they're lazy when clearly they are not.
I am lazy, but to be properly lazy I have to be really efficient; you can be both ;) - but in truth I'm not lazy, since I will pursue an idea that can save me 10 minutes for hours, because I know that idea can and will save me 10 minutes 100 times in the next year.
Another top reason: only having to maintain the Row Level Security on 1 dataset and applying RLS across workspaces with centralized control!
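For anyone new to RLS, "maintaining it once" boils down to a single role filter defined on the golden dataset. A minimal sketch, assuming a hypothetical Region table with an AccountEmail column:

```
-- DAX table-filter expression for an RLS role on the golden dataset.
-- Every thin report built on this dataset inherits the filter.
'Region'[AccountEmail] = USERPRINCIPALNAME ()
```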
I noticed that when you publish the report in an app (and the end user does not have view permissions on the original dataset), the end user is not able to view the app.
When the dataset is in the same workspace as the one the App is based on, it works fine.
@@Maartenravie I also ran into this problem. However, I found that you can go into Manage Permissions on the dataset and manually grant permission/access to report consumers, which then allows users to view the report in apps other than the one in the workspace where the dataset exists. The reason it works automatically when the report is in the same app as the workspace is that when you publish the app, it assigns those app users these permissions. This is not the case when publishing other apps, which is why you need to do it manually.
We recently ran into this issue but couldn't work it out. We have the golden dataset in a PPU workspace. The reports are built in a different PPU workspace pointing to the golden dataset and published to the App. Users have a Pro license, and we gave them viewer and build permission on the golden dataset and viewer permission on the workspace that hosts the reports. However, users still received the prompt that they don't have access to the underlying dataset. I'd appreciate it if you could answer this.
Fully agree with your reasons. I do it all the time and it makes life so much easier. With regard to measures: with shared data models you are still able to add your own measures to the reports. So if other people build reports based on the shared model, they can add their own measures. Then, if people create useful measures that might be of benefit to other users as well, I 'migrate' them to the source model.
4th reason: The reports built on a shared model are WAY smaller and publish MUCH faster.
Good tips. The flip side of this is finding a good way to organize your measures (display folders, governance). Before you know it your "central datasets" can get pretty crowded....which is a better problem to have than data anarchy.
This actually inspired me to consolidate all the company's reports onto one dataset per subject area: our product reports are all connected to one dataset, all sales reports to one dataset, and all mobile app reports to our online mobile app dataset.
Hi, can you please share your LinkedIn account?
I always use that practice to separate the data model and the visuals. But there is an aspect you must pay attention to. The visual files (I mean the ones connected to the Service) let you write "local" extra measures. If you have many measures in your "mother" dataset model, you can't make out which ones were made in the dataset and which were written "locally" in the visual .pbix. Tragically, (after three months) you must click every single measure: if the formula appears at the top of the screen it is local, otherwise it is from the dataset. Tip: before you publish your data model, create a pseudo-table with one DAX expression, let's say Patric = 1, and name it "DAX Table" for non-dataset DAX functions. Thank you for your accommodation... :)
Nice approach
Great
This is solid advice. Applying this now for an enterprise roster and headcount request project solution I’m working on with HR. I currently have like 6 pages developed against one dataset and each page/report has a site and dept filter. Could probably create a report for each site using this method and only have one central data model. I love it. Thanks kindly sir!!
Thanks Patrick. We have been following this design/architecture in our firm for a long time now and are a big fan/proponent of this design. We use the term 'core model' and 'thin report' for this design. Hopefully MS adopts some terminology for this design.
The pain points we see with this are:
1) Whenever a column or table is renamed in the core model or the golden dataset, all the linked thin reports using those tables/columns break. This is unlike other BI tools such as Webi. Hopefully MS can enhance this.
2) We know that a measure can be added to the linked thin report, but we often encounter a scenario where a calculated column is needed. It would be great if MS could provide a feature for this.
3) The last scenario we often get challenged on is being able to add multiple relationships between the same set of tables in the core data model and, at thin-report building time, specify the relationship context. This is supported in other BI tools such as Webi universes; in Power BI, we can only have one set of active relationships, and the other sets become inactive relationships. Does MS plan to have the concept of 'context' for relationships?
Thanks again!
Can't you just use DAX in your thin report to solve problems 2 and 3?
Try looking at the ADDCOLUMNS and USERELATIONSHIP DAX functions (see the sketch after the links below).
2. dax.guide/addcolumns/
3. dax.guide/userelationship/
John
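To make John's suggestion concrete, here is a minimal sketch; the table, column, and measure names are hypothetical, not from the video:

```
-- 3) Pick an inactive relationship per measure instead of per model:
Sales by Ship Date =
CALCULATE (
    [Total Sales],
    USERELATIONSHIP ( Sales[ShipDateKey], 'Date'[DateKey] )
)

-- 2) Emulate a calculated column inside a measure with ADDCOLUMNS,
-- since live-connected thin reports cannot add real calculated columns:
Avg Margin =
AVERAGEX (
    ADDCOLUMNS ( Product, "@Margin", Product[Price] - Product[Cost] ),
    [@Margin]
)
```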
From a simple video I can learn a lot about databases! I have an online server with SQL databases; on this server I installed the Power BI Gateway and created my database, and now I can work directly from my laptop with this data... oh, of course, with scheduled updates on this server... all data updated all the time!
BAM! You are amazing 👊
It really makes sense to create multiple reports from a single data model.
I would even say that to make better use of your datasets coming from different sources, and maybe even serving different purposes, it is easier to put them all in one data model for easier management.
Great, Patrick. Just starting with Power BI, but I was googling whether I could use one shared dataset for several reports. Your video explains it perfectly; now using golden data models!
Thank you Patrick for the video! Just had the conversation internally on why to maintain just one dataset. Excellent timing!
That's awesome! Thanks for watching! 👊
I'm not really a fan of connecting to datasets as you can't create calculated columns on tables.
However, my team recently onboarded to Premium, and we are taking advantage of connecting to the workspace SSAS API. We use our published datasets: open a new file, connect using the Analysis Services connector, and specify a DAX query with SUMMARIZE to get the data we need. It's been incredibly helpful since we don't own all reports and need to re-use and centralize many of our OKRs/KPIs. This also removes burden from the source SQL servers and preserves logic. Of course, the only downside is recreating measures, but with some cleanly processed values it's not so bad.
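For reference, the kind of query described here looks roughly like this (a sketch only; the Sales, Product, and 'Date' tables are hypothetical):

```
-- DAX query sent through the Analysis Services connector to a
-- published dataset; SUMMARIZE groups the fact table by year and category.
EVALUATE
SUMMARIZE (
    Sales,
    'Date'[Year],
    Product[Category],
    "Total Sales", SUM ( Sales[Amount] )
)
```

SUMMARIZECOLUMNS is the more modern equivalent if you prefer it.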
Great video, Patrick, thank you very much. I can really see why having one central dataset is very useful in many cases.
But why are we talking only about the simplest ideal-world scenarios, where someone just wants to add some tiny new calculation on top of the existing model (a new measure, for example)? In reality there will be more complex requests, where you end up adding new dimensions, increasing granularity, adding aggregations, creating new relationships, or even having to change existing dimensions to meet the new requirements of the end users. In the end, the central model would soon grow into a monster model, where each report uses only a tiny fraction of it, or some older reports even break because of changes made to the central model to solve new requirements.
So is there some rule of thumb (or set of rules) for when to use a centralized model (and inevitably keep growing it) and when it is better to create a new model (although slightly redundant to the existing one)? Or is this idea of using a central model perfect in all respects, so you should always use it and I am just not getting the idea right (sorry if this is the case) :)
Thank you!
Great question, would love if they could answer it
I think you're getting the idea perfectly. What I find in real-life scenarios is that they like the measures, but the definition is slightly different, and you end up writing many measures. Or they need extra dimensions or facts.
Yes. A great video would be one answering "When to use more than 1 dataset". In my case, as I do not have Power BI Premium, I can't have a dataset that is too heavy. So what do I do? I separate datasets by tables that are not related to each other, that is, data that won't be shown in the same report/page, or that doesn't have to be updated at the same frequency. But that is what works in my scenario.
You can also add a local measure for reports using a single dataset. We do this for 3 reasons:
1. Modeling efficiency. Because Power BI was built for use by power users, it has some built-in safety nets that can be a problem if you have really complex DAX. Essentially, every time you create a new measure it creates a blank measure in the background, then creates another one to overwrite the blank once you hit Enter or exit the formula pane. In both instances it recalculates ALL measures and calculated columns in the background. One inefficient measure can lock up the dataset.
2. Data preservation. Adding a measure to a PBIX file and uploading it overwrites the published model, including its up-to-date data. You end up having to refresh locally before pushing, which can be very time consuming.
3. Clutter... sometimes users need a VERY specific DAX measure that no one else will use. For these, it's fine to leave it in the visual file.
Basically, we develop all measures in the report files for testing and staging. After we are ready to publish, we make the determination if it should stay local or be pushed to the "Golden Dataset".
Interesting
AMAZING!!! I will start digging myself out of my 4-dataset-maintenance-nightmare first thing tomorrow!!! Thank you Patrick!!
I'm really looking forward to composite models enhancing this experience. Thanks for the video Patrick!
Another reason not to have a single source of truth is that you need to host the whole dataset in every workspace... even for tiny reports... because you can publish reports only to workspaces where the dataset the model is connected to lives... Thanks for the video anyway, you are great!
Hello @Patrick, the concept of a shared dataset is really useful. However, there are situations where these can drive us nuts. Below are a few situations, and I would like to hear from the community on how to handle them -
1. If there are multiple reports pointing to the same shared dataset, what is the best place to define the measures - in the dataset or in that particular report? I took the approach to define the common ones in the dataset & the report specific ones in the report but got stuck while preparing my dataset for Q&A, as I was not able to train the dataset for the measures which were defined at the report level.
2. Having a central/shared dataset sounds really exciting, but when we have really big ones, things can go crazy - as a small mistake will impact a lot more reports and every report user will get affected. How to handle this?
3. What to do when 2 reports pointing to the same shared dataset want a different relationship or filter direction?
I am practically facing these issues & am looking for some guidance/suggestions from you & the community.
Thanks in advance!
Yes yes yes!! So many reasons, but just do it! But... Patrick, not fond of the borders ;). You can create it (connected to the central dataset) and then give them the PBIX to publish too, and they can create measures if they are savvy.
Whooohooo. Guy in the Cube famous!
Thanks to both Patrick and Chris for these great tips!
Hope you found them helpful. Thanks for watching. 👊
2 videos in 2 days, is this heaven?
Thanks for the kind words. 👊
Very good suggestion! I have multiple identical reports, which are different only in language (eng, ita etc.). Setting up automatic refresh is always a nightmare!
Thanks Patrick for sharing this! have always tried to look for ways to increase efficiencies and this one helps a lot.
You are very welcome. Thanks for watching! 👊
Hello Patrick, great video as always!
I am with you on this one 100%, but the only reason why I can't do this is because I have different reports going to different audiences, and I share these reports as apps. If we were able to separate a workspace into multiple apps with their own access permissions, that would be awesome.
Why not have a shared dataset, with different workspaces for the different roles and an app for each? All reports hit the same dataset. The biggest blocker there is that you can't add anything to the shared dataset today, but once the updates to Composite Models arrive to allow that, it really makes for a great way to handle it.
This video is very timely, as I have been selling this "golden" model concept to the powers that be; now I can send them a link to this video to support my efforts. We have an extensive set of dashboards in Performance Point based on an SSAS multidimensional data mart. The executives like their dashboards, but we are scheduled to upgrade to SharePoint Online, which is not compatible with Performance Point. I'm in the process of creating a tabular data model in Power BI to duplicate the dashboards, because the multidimensional SSAS data model is built specifically for Performance Point, with named sets that are not compatible with Power BI. So this is a long-winded thank you for this video.
Amazing video. You have saved me hours of copying measures and code from one report to another.
Love this channel.
Thanks Patrick, it's very informative. Having a golden data model is good because we can maintain one single version of the truth. But this will have an impact on the dataset size: having all tables in one dataset will increase the size of the dataset.
Thanks Patrick. I have already implemented that scenario/architecture in my company and It's very helpful.
Awesome! 👊
Hello Patrick, thank you for this video !
Always interesting to learn good practices from professionals like Adam and yourself.
Thanks for sharing with your community :) We appreciate it
You are welcome. Thanks for watching.
I'm not lazy, just efficient.
I used to work with 'Eddie'; I worked with him for years. He never shared his last name, but he used to say this all the time (he also worked in data in the early days of computing).
Wow, for the 1st time in my life I have seen a video with 500+ likes and 0 dislikes. These videos are gold.
Thank you for the kind words. 👊
Good one Patrick, we are using the same approach while developing any new reports with the same model... even if a measure is not present in the model, we can create it for that particular report, in the report only...
That also creates a burden, as you could end up with the same measure calculated differently in two reports.
OMG, I made almost 30 reports by copying and pasting. This is a great tip, thank you.
Another awesome video! But hey, you forgot to mention the Common Data Model and Dataflow entities. It would also be worth extending this to the slightly older-school SSAS Tabular model migration, if in a larger enterprise environment.
When I was using Report Builder there was an option to create SMDL files that host common data models. Good to know Power BI also has this.
Superb and informative as always. Hope you and Adam are well 👍.
Thanks for watching! 👊
What's the most efficient way to convert over to a single dataset if you have already created a bunch of different reports in different files?
Exactly! I agree 100% with the single-dataset approach for all the good reasons mentioned here, hence my question: how do you fix all the many datasets you already have in place? Consider that I am the administrator and I have about 30 Pro-licensed developers who have filled up the Power BI service with several tens of reports, each of them with a dedicated dataset. But in reality they could be leveraged onto a few common datasets. Did you find a way to fix up this kind of mess?
@@filippogiustini2610 The technique I am using is to start a new PBIX, connect to the new data source, manually copy and paste visuals in from the old PBIX, then manually fix all of the broken data associations. Very time consuming and error-prone.
@@iamscottr I can imagine. I hope there will be some handier tool to replace an old data model with a link to an existing one. Btw, I'm trying this workaround. Thank you.
This is Raj; I appreciate that you brought up Christopher's blog post. He was my manager earlier.. hehe
How do you collaborate on a Power BI model if it is defined in one file? Could you make a video about that? Imagine a model with 50 tables and 5 people working on it, defining relationships and measures, making updates as people request features. How do you go about enabling that collaboration?
Thanks
Yeah, I have the same question!
Could you use the roles for the Workspace for this? So maybe the developers are Members and everyone who needs to use the model are viewers? Not sure how that plays out when using the model in a report in another workspace. Another thought I had was around creating/managing a 'working' copy of the model in the Workspace, and then on a scheduled basis, the 'official' copy is updated? Surely this is similar (in some way) to how databases are managed? Just brainstorming out loud here as your question caught my imagination.
I tried to follow along and I still don't know what he's talking about, but I watched the whole video because it's entertaining either way.
I'm sorry you didn't understand what Patrick was talking about. The main idea is to reuse data instead of duplicating data across multiple Power BI Desktop files. We do try to have fun with it. Glad that came across. 👊
@@GuyInACube I'm a newbie to Power BI and trying to catch on. I'm a slow learner and used to boring Excel sheets.
@@GuyInACube Hello @Guy in a Cube. If I understood the approach properly: essentially, we are making all the tables/measures defined in the original PBI file/data model available in this new report without having the data model itself. It is a kind of replication, right? This way, we can keep the integrity of the original data model.
The information I didn't know I needed! Thank you
Patrick, I'm looking for a solution that will show the previous year's result, and then a static line for the goal improvement for the new year. Appreciate your help.
Me and my team struggle with this a lot going into a new project. Yes, of course we want to build the golden model but then only one person can work in it at a time! How is that going to work? Our idea is that we create very lean data models for very specific reports, then merge them together when we are done.
Hi, I think you can put the model file in the cloud/OneDrive/SharePoint and access it simultaneously.
Funny you mention lean data models, I used this exact terminology with my boss the other day. I see your perspective here. I love the concept of shared datasets, but I think it's a balancing act. I try to make lean queries in specific reports and fetch common dimensions and such from the shared dataset. They can only get so big unless you dump them into a premium capacity.
Thanks! That's great! The only downside is that if you rename measures or columns, the linked reports crash... in the original report all graphs are updated automatically. Is there a way to fix this?
Hi,
I have 2 reports using a single dataset/data model; one report is a daily refresh and the other is a monthly consolidated refresh.
How can this be done using shared datasets? Any help on this, please?
Wow, what a great tip for maintaining the data model. Love it.
Thanks Patrick 👌🏻👌🏻👌🏻👌🏻👌🏻
Thanks for watching! 👊
Power BI is basically Analysis Services under the hood, right? Doesn't model size bloat quickly as you continue adding measures to one unified model because of how aggregations occur for every dimension and attribute? Premium pricing can be a challenge as you start getting to P2, P3, etc. to ensure adequate capacity and vCores for such a large model. Also, would you recommend one model for everything, or one model by business unit or subject area? Thanks for the great video!
Sean Werick, exactly the problem we have. As we centralized the model, the size went crazy. Then we needed the Premium version. Oh well, it isn't performing, so pay more for Premium? When most reports are static, I still believe one smaller data model specialized for the report performs much better than a central data model where the report only utilizes 5% of the data.
@@cafealpha82 That's exactly my concern and why the data mart approach seems to make more sense from a pricing and performance perspective, with customized models ALL based on an overlaying data model (only logical [Erwin, etc.]). All of the data marts are simply subsets of that model, just like Kimball methodology. It's more difficult from process, governance and change control perspectives though.
Thanks for this reason. But the general problem with datasets is that you can't add additional sources :-( That's why in my corp people use hundreds of dataflows; and now nobody wants to use a common data model, even though we introduced one for everyone, nobody uses it ;-)
Can't agree with you more.. perfect.. efficient not lazy.
Love it 👊
Summary:
Single Central Data Model:
1. Minimum number of reports: reuse with a shared dataset.
2. Maintain logic in one model: it's not good practice to maintain too many data models and the updates on them.
3. Reduce the number of refreshes on the data source.
How does this work on Report Server? How do I save my master data model for other reports to use? Thanks.
Yes, please comment on a good practice for report server users - there must be many more companies relying on the on-premise solution!
I would be very interested in the answer..
The only thing that seems possible is to use the Power BI API with the dataset... But I do not think it is possible to have a real model (with multiple tables..)
I am interested in this for report server as well. Thank you.
Unfortunately, in Power BI Report Server each report has its own data model and you cannot connect to other existing datasets. The only way around this is to host the data model on an SSAS server and connect reports to the server instead.
I wrote more about the challenges of PBI RS here: zachrenwick.website/2020/05/13/why-you-shouldnt-choose-power-bi-report-server-top-8-issues-and-missing-features/
Hi,
Thank you for the wonderful explanation... when we use a shared dataset we connect to the entire dataset; is it possible to choose only a few tables from the shared dataset?
I already thought the same last year, and so I developed a very large data model which combines many different sources (SharePoint, SQL, MySQL, etc.). The problem now is that when even one source is unavailable, the whole dataset won't get refreshed. It would be really great if I could have one master dataset per source and combine them as needed in my reports. The problem is that it's currently not possible to create a report from more than one Power BI dataset.
Hi Patrick, thanks for the video. Any chance you know about the limitations of using Power BI datasets across reports? I'm trying to build reports using a single PBI dataset, but it turns out that some visuals (like sparkline by OKViz) stop working when I use the PBI dataset in the embedded report, while they work properly with DB-imported data instead of a PBI dataset. Thanks!
Thank you Patrick, this is exactly what I needed!
Hi Patrick! Many thanks for this very useful video! 👏
Situation:
Let's say we have 1 Dataset Workspace where all datasets are published. And then we have 3 reports connecting to 1 dataset in that workspace, ok?
Questions:
When you have finished building those reports in your Power BI Desktop, you publish them and now you want to give access to them (to designated users in your Active Directory, for example), do you only need to give access to the report, or, do you also need to give access to the Dataset Workspace as well?
Thanks for your reply 🙏😊
Hi Patrick, I need the start date to be the lowest login date of any user with that domain and the end date to be the most recent login of any user with the domain.
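A minimal sketch of measures for that, assuming a hypothetical Logins table with Domain and LoginDate columns:

```
-- With Logins[Domain] on rows, these return the earliest and the most
-- recent login per domain.
Start Date = MIN ( Logins[LoginDate] )
End Date = MAX ( Logins[LoginDate] )
```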
Hey, that's what I was looking for. However, I have already created versions of PBIX files with the same data. Is there a way to link all of them to one data model and not recreate all the visuals?
Great video Patrick, but a quick question: how many people can use a given shared model at a time without creating issues?
Great, thank you. I'm wondering how you can reverse the path from 2 files (dataset & report for viz) back to a single master PBI file? To test the RLS, for example.
This is definitely the problem that I was facing. Clever solution.
Wooooow!! If I am not wrong, this topic is perfect for the new composite models in Power BI (Dec/2020), right?!
Thanks Patrick for your great work!
Is there an alternative solution for Power BI Report Server?
Thanks for this video. I was already doing this in my projects. I have a question: does this slow down the reports somewhat, as they are no longer in import mode but in live-connect mode to the published dataset?
Just what I needed. Thanks Patrick!
Nice touch with the LSU watch. GEAUX TIGAHHS!
Would taking an approach like this increase loading time when opening different reports?
I am curious if there would be performance issues opening a report containing more data than it is necessarily using, or if it would only impact that dataset refresh.
(Ex: Linking data from multiple departments on the off-chance you may want to cross-analyze in the future)
Thanks Patrick! You're great. What about security? Is it possible to share only one report and drill through to different pages in other reports?
When I start developing a new report on the golden shared Power BI dataset, what if I need to create new measures and add one more table to the existing data model? Will I be in a position to do these two things, and can I save this modified dataset to the Power BI service workspace? Can you please clarify, Patrick?
You will need to make the change to the model in the Power BI Desktop file, then publish it back to the service. After that, when you open the .pbix file that contains the reports and the live connection, all the changes will be reflected.
Thanks for your tip. But can you also name some of the reasons we should NOT use the same dataset?
Hi Patrick, thanks for this! So why can't we join data from multiple datasets? It would make cascaded datasets so attractive.
Pages 1, 3, 4 are for group A, pages 1, 2, 5 for group B, and pages 1, 3, 5 for group C. I get this scenario all the time and your workaround doesn't make this ideal. Any advice?
I assume that all the pages are sourced from the same data model? This scenario can be problematic. Do you maintain multiple reports repeating each page? What a nightmare, almost as bad as maintaining multiple models. You could get creative by hiding all pages besides a navigation page with a button corresponding to each group. Once a button is clicked, it would expose navigation specific to that group. Not sure there is a good way to solve this, besides security to show and hide pages based on the authenticated user. There is an item on ideas.powerbi.com (ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/35607487-add-security-roles-for-separate-pages-not-rls). Let's vote it up.
@@GuyInACube Thanks for replying. I voted. I personally find this such a common scenario that I'm surprised more people are not calling attention to it! I have one model, then either use "Save as" in the Power BI service (to get multiple reports from one model) or create new Power BI Desktop files from the same report and republish them to the service. I envisage a fix where a user gets a checkbox pane upon publishing, listing the report names, and ticks which pages go to which reports; it would look like the Sync slicers checkbox pane.
I nonetheless see some disadvantages. 1. When I connect to this central dataset, I can't add a new source to the data model in the new .pbix. Or can I?
2. When you make changes to the central dataset, you must be aware that it is used in several dashboards, and you can get errors in those dashboards.
This is where dataflows come into play. You publish all of your analytical dataflows (thus loading from the source only once) and then build out your datasets from these flows. I would highly recommend radically limiting the number of datasets. If properly modeled, even the largest companies can run off a handful of central models. Example: Microsoft has one model that's used for 90%+ of its enterprise reporting.
This is exactly what I needed! Thanks team
Q: If I have one semantic model/dataset, what if different parts of the data need to be refreshed at different frequencies?
From slowly changing dimensions that are REAL slow, and reference tables updated yearly, to fact tables that need updating every hour.
When I schedule a refresh of the one semantic model, do I have to refresh everything frequently, even if some tables/sources haven't changed?
Yòooo Patrick!
Great video, as usual :)
What if I have different remote coworkers who want me to "merge" their Excel files into a single dataset?
I mean, how would you efficiently manage this scenario?
How would you collect the files (maybe via a shared folder)?
How could the refresh process work?
Thank you in advance!
Hi Patrick, I like this. Is there a possibility to develop a single dataset and the different reports in Desktop, and then publish the single dataset and the different reports based on that dataset? That way, changes to the dataset during development can be made quickly in Desktop without having to publish it each time.
Hi. I really like your videos. You help me learn something new in Power BI every day. Can you answer this question: how can I get a list of all visualisations and the objects used in them? I use DAX Studio, but it cannot help me. Can you?
Patrick,
Awesome post.
Question: How can I view the "code" behind the measure without returning to the original dataset?
You can try using the Performance Analyzer; it generates the DAX query when you run it, which you can copy and inspect.
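Another option, if your tooling supports it (this assumes a recent build of DAX query view or DAX Studio, and read access to the model), is to run a DAX query against the live-connected dataset that lists every measure together with its definition:

    EVALUATE INFO.MEASURES()

The [Expression] column of the result shows the code behind each measure, without going back to the original dataset file.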
Do you have a video on how to set up a single dataset?
Great tips. Really enjoyed the conversation.
Thanks Muhammad! Glad you liked it. 👊
So after I saw this video, I built a master dataset and built all of my reports off the one dataset. SO helpful, thanks for this. But within the last 4 weeks, every report that I have built off a dataset doesn't refresh when I have it open in Power BI Desktop. Is anyone else having this issue since about October 15th onwards?
Q: Is it possible to create a dataset from a dataset? I would like to create different layers of datasets so everyone can use the level of granularity they need.
Thank you from Thailand!
Thank you for watching our videos! 👊
Thank you, Guy in a Cube, for all this amazing information! I have a question: why is the connection to the dataset slow? I noticed it is noticeably slower than a live connection to the data source. How can we increase the performance of this dataset? Thank you in advance.
I might have insights for you if you still need them; please reply here and I will answer your question.
Does it carry over your DAX formulas too? Or do you have to recreate those in the "live connected" version?
Relatively new to Power BI, and trying to make sure I understand how this should work.
To create datasets, do you create new .pbix "reports" that only contain the dataset, and then publish them (organized in a special workspace) so that other reports can access those datasets directly?
Wait, so doesn't saving it over a connected model create an over-tabulated dashboard, which is the first problem you stated?
Hi, I tried to use the data model from another person's report so we'd have one set of data, but it was not possible to add an additional dimension table that I needed for my report. How could that be solved?
Thanks Patrick! I want to share one problem with splitting the report and the dataset: if we publish the dataset in one workspace and the report in another workspace, users need to be at least a Contributor in both workspaces to download reports created from the dataset.
I reported this issue, and they updated the list of limitations and proposed an idea to solve it. This is a limitation, and it is quite confusing since, if you give users Build access to the dataset, they can create, publish, view, and edit reports with this dataset, but they can't download them.
Please, would you vote for the idea to solve this problem?
Thank you Patrick!
Glad you liked it. Thanks for watching!👊
With this approach my org can connect to the dataset from Excel: bring the users to my data instead of the other way around (as much as possible).
Thanks for sharing, Patrick. Honestly, I don't quite get it. Say we have an executive report (20 pages to deliver) which needs both Sales data and EHS data. If we already have a Sales dataset and an EHS dataset in separate workspaces, do we have to create a new dataset and duplicate some of the metrics/calculations?
Patrick - is there a way to connect Excel data from the desktop to Power BI online?
I have an Excel report which is updated every week. What I do now is update the data in Excel, open Power BI Desktop, refresh my report, and then publish the report to the workspace.
Instead, I want to connect the Excel report directly to my Power BI online. Please provide some tips.
I was just wondering how the performance is for this kind of report. Do visuals still load the same? Faster? Slower?
Hi Patrick - great advice and I'm also an advocate for using Power BI hosted datasets!
Question though... we have a workspace and associated app called 'General Reports', and a second workspace and app called 'Management Reports'. We then have a Power BI hosted dataset which feeds many reports, some in 'General' and some in 'Management'. However, we currently need to deploy this dataset to both workspaces and refresh both many times each day, which increases the load on the data source. Is there a way to share a Power BI dataset between workspaces?
Hi Mark,
Yes, it's definitely possible. When using the Power BI dataset connector in Desktop, as Patrick showed, you can connect to a hosted dataset in Workspace 1 and publish your report to Workspace 2.
Power BI will create a link to the dataset in Workspace 2 without duplicating the hosted dataset.
Hope it helps :).
I'm almost certain you need to have migrated from v1 workspaces to the new v2 ones, for both the dataset source and any workspace you want to link to. Luckily the workspace upgrade is now publicly available, so check that out if you can't see shared datasets, and ensure it's enabled on your tenant.
@@nickdoy7019 You are correct. Both workspaces have to be the NEW type (V2)
But is this still applicable with the advent of composite models? And what about chaining datasets? It seems to me there is also an argument for having subject-matter datasets that serve specific audiences within an organization.
Hi! Thanks for the video. Can you tell me how you did the grouping of measures?
1. Create a manual table with one dummy column
2. Create a measure in the table you just created (see the DAX sketch below)
3. Remove the dummy column
4. Collapse/expand the Fields pane and the table's icon will change to a measure-group one
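A minimal DAX sketch of steps 1 and 2 (table, column, and measure names here are just examples; 'Sales'[SalesAmount] is an assumed source column):

    // Step 1: a calculated table with a single dummy column (named [Value])
    Measure Table = { "dummy" }

    // Step 2: a measure hosted in that table
    Total Sales = SUM ( 'Sales'[SalesAmount] )

    // Steps 3-4: delete the [Value] column, then collapse/expand the
    // Fields pane so the icon switches to the measure-group icon.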
On top of this, you can group measures into display folders from the Model view. Subfolders work as well.
@@d3x0x thanks a lot
Here is a link on how to group your own DAX measures:
ua-cam.com/video/6IHNYDy_Ynw/v-deo.html
I have some posts on how to do that using DATATABLE, for both Direct Query and non-Direct Query models.
First:
sqlitybi.com/how-to-create-a-measure-group-table-in-power-bi/
Second:
sqlitybi.com/creating-measure-groups-in-power-bi-and-tabular-using-calculated-tables/
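For reference, the DATATABLE variant described in those posts looks roughly like this (a sketch; the table and column names are illustrative):

    // A one-column, one-row calculated table to host measures
    Measure Group =
    DATATABLE ( "Dummy", STRING, { { "dummy" } } )

    // Afterwards, hide or remove the [Dummy] column and add your
    // measures to this table, same as with the manual-table approach.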