To summarize: there is nothing wrong with the Card visual; it's your dashboard that's at fault for overusing them. The bigger question is why MS can't be bothered to fix the woeful performance of the simplest visual component.
Hmm 🤔 I don't like the advice of reducing the visuals. Power BI is a data visualization and analysis tool. Visuals are essential. This shouldn't be a problem.
I think the larger point Patrick is conveying is that beginners often use multiple visuals (e.g. 6 card visuals) when 1 visual (a Matrix) could display the same information from the 6 KPI measures. This cuts out the unnecessary rendering of 5 additional visuals on the page. It's the lowest-hanging fruit (the 1st Ugly Baby): a quick development change you can make to improve speed.
You can often convey the same amount of data with a reduced number of visuals; since each visual sends its own queries, this can speed up the report. That way you aren't limiting the visuals, just making back-end changes.
I agree: what's the point of advertising these visuals if using them the wrong way hurts the report? Surely Power BI itself should handle all the things recommended in the video automatically?
The problem is Power BI itself, not the advice given here. Having a good-looking report with custom visuals makes it almost unusable. Power BI is just too slow.
This video rocks!!! I just reduced the number of visuals on my report using a matrix. Thanks!!!
The SQLBI guys just did a great video on using the Cards with States visual to consolidate a lot of different card visuals using small multiples.
Great tip man, thank you!
Thanks for sharing. These consolidations are great for KPIs with the same metric; they should have been in a matrix/multi-row card/small multiples in the first place. The challenge comes with KPI cards for different measures/metrics, as you have to use different visuals for those in most cases (especially if there are custom icons for each KPI). Nevertheless, good to know more options. It's just that customizability in Power BI is becoming a limitation for good designs.
Guys... this type of content is a differentiator from competitors. SOLID.
One thing that really hits performance, from what I have found, is that some developers use NVARCHAR(MAX) as their data type; as far as I know, you cannot index a column declared with MAX, which makes Power BI populate KPIs much more slowly. On top of that, the automation mess with so many stored procedures doesn't help! You should always use JOINs instead of subqueries if at all possible, because SQL Server is optimized for JOINs, and use Common Table Expressions instead of temp tables to reduce cleanup.
Best practices for SQL Server that help Power BI from the back end for performance tuning!
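A small T-SQL sketch of these points (the table and column names here are made up for illustration): a MAX-typed column cannot be used as an index key, and a CTE avoids creating and cleaning up a temp table.

```sql
-- Hypothetical table: a bounded NVARCHAR can be indexed, NVARCHAR(MAX) cannot.
CREATE TABLE dbo.Sales (
    SaleID   INT IDENTITY PRIMARY KEY,
    Category NVARCHAR(100) NOT NULL,  -- bounded: usable as an index key
    Notes    NVARCHAR(MAX) NULL,      -- LOB type: not allowed as an index key
    Amount   DECIMAL(18, 2) NOT NULL
);

CREATE INDEX IX_Sales_Category ON dbo.Sales (Category);  -- OK
-- CREATE INDEX IX_Sales_Notes ON dbo.Sales (Notes);     -- error: invalid key column type

-- CTE instead of a temp table: nothing in tempdb to create, populate, or drop.
WITH CategoryTotals AS (
    SELECT Category, SUM(Amount) AS Total
    FROM dbo.Sales
    GROUP BY Category
)
SELECT Category, Total
FROM CategoryTotals
ORDER BY Total DESC;
```

(MAX-typed columns can still appear in an index's INCLUDE clause, just not as key columns.)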
Patrick, I agree with your suggestions in theory, but this is not always practical or even acceptable to our designers and end users.
Our designers really want customized KPI cards, chart components, and other UI elements to achieve a great look and feel, which does require a lot of visuals. I have gotten pushback that Power BI cannot achieve the expected look and feel when using tricks to reduce visuals, and that the only way to get there is to have a lot of components (visuals, images, buttons, etc.). You combined 4 KPIs into 1 matrix. To get a good-looking KPI card (with an icon), one sometimes needs 4 different components (KPI card, image/icon, bar chart for the trend, and a text box for custom header placement). So 4 KPIs would be 16 components!!!
This is one reason I limit the use of bookmarks too (they add a lot of visuals). So this example is great for show and tell, but in reality a great-looking baby (oops, report) often needs a high number of visuals/components.
Agree with the other points, but again, you cannot always expose your underlying data model as-is (especially if you are going to allow end users to use Q&A and/or personalize visuals).
It seems like not only is updating slow, but Power BI on my Windows machine also gets slower and slower... that's ridiculous. I mean, I only have a few visuals; isn't that the key functionality of BI?
Let's not even talk about the babies...opening the program itself initially without a report...is slow :(
@Maximus Pedro watch it for PowerBI info?
My desktop version... the first time I open it on my desktop for the day takes a long time to open... no file, just opening the desktop application. Subsequent file openings are not as slow. It's just the initial launch.
Yep it is slow
I've been watching loads of these videos; thanks for the tips, they helped me a lot. I had a really slow dashboard, and realised my silly mistake after hours of trying to optimise. I was plotting multiple measures over a time period in a line chart, and I hadn't noticed that I had set the axis to date rather than the hierarchy (Year Month). So the visual was calculating the multiple measures 365 times, across multiple visuals.
Lots of ugly babies! Changed to hierarchy and drilled to month, calculating 12 times instead. No problem now.
A little torn about this video. Reducing visuals for a product whose purpose is to provide visualization feels... off. Reports aren't just displayed on PBIS and user monitors; the reason for compact visual pages is overview meetings in conference rooms.
Thanks a lot for this video! It will make my teammates realise how important these aspects are for optimising reports and help them approach things the right way going forward (which they are not doing now).
If everything else fails, you might need a calculated table where all the heavy-lifting calculation has already been done. But use this only as a last resort.
I think this is always a good reminder for developers!
You are the MVP Patrick, great video
Don't create too many visuals in a visualisation tool? Why even call it a visualisation tool then?
It's a very bad tool; it's just overhyped. My computer has big specs, but I've still seen BI Desktop freeze with just a couple of 100-row tables. Meanwhile I can open 100 tabs in Chrome and every page still loads instantly, smh.
Hey, come on, 7 visuals on a single page is too much? What is this, single threading?
Cards really need development, such as a way to combine them into a single visual and other options for devs, as they really are a slow spot.
Good stuff as always Patrick :)
Hey Patrick, my Power BI DESKTOP is slow and I suspect it's my virtual machine. What's the recommended CPU clock speed for PBID so I don't feel lag when I click any buttons?
I can tell you why... because Power BI runs a language on a language on a language on a language.
DAX, on Power Query, on SQL, (on C++).
You could write a SQL query or view for your report and it would be fine, but Power BI decides: let's download ALL the rows that might be of interest, and filter them again later... then once more.
20 rows is actually 1000 -> 20 -> 20.
Babies are never ugly
Great video. I'd be interested in two additional "flip the script" style videos like this one: "Why is my Power BI file so large?" and "Why is Power BI Desktop so resource intensive?" I'm thinking of 350-400+ MB pbix files, and those times you have a report open in PBI Desktop and your machine sounds like it's about to take off. The solutions are likely related to the "triplets" you've outlined here, but even with a star schema, etc., these two can still happen, I think.
Hi, I believe there is already a video on why is my PBI file so large...
Hi Patrick. What can we do about report cosmetics? A matrix doesn't always look good. And what if the cards need to show the same measure but with different filters applied? A matrix doesn't work then, right? Especially if we have to use a PBIRS version prior to May 2020, which doesn't include calculation groups. Thank you as always.
Power BI is just fancier Excel, TBH; it doesn't scale very well on large datasets or more complex logic with dozens of variables. In our company we switched to preprocessed, denormalized tables in Greenplum, and from there we push to Power BI, where we have just a bunch of filters and no logic; otherwise it would be hellishly slow.
Hi Patrick, Can we set up a different colour for a visual when it is still buffering and not fully loaded?
Very interesting, Patrick, thanks for all your videos. Could you show us how to analyze social media data (Twitter, Facebook, Instagram, YouTube, etc.) in Power BI: how to connect, and examples of the KPIs we can show?
Patrick, how can I get you to take a look at my Power BI report and help me 1:1 ? I have one visual on the page which is REALLY slow, and all it is is a graph of 3 measures over time. Other visuals using the same measures are not slow. Only the graph showing history and future (+/- 10 years) is slow.
What do you use for the animations? They're nice and fast.
Fantastic, I think this is great. I love how to explain your videos
Ngl I'd rather keep my aesthetics and have a slower report :)
Interesting… As a newbie I come here to solve my issue with Azure Cost Management report running for more than 2 (TWO!) hours. 😕
Hi Patrick, the requirement is not always so simple and straightforward, so reducing the visuals will not always help much.
Hi Patrick, can you please help with the following issue: I am trying to increase the Matrix column size based on the quarter I choose. For example, if I choose three quarters, I want to utilize the screen real estate properly.
Great!
Great video! One thing we have noticed as a cause of our slow dashboards/visuals is sequential execution of the subqueries for an individual visualization when using DirectQuery against BigQuery. In those cases the final time for the visualization is the sum of all the individual subqueries. I was wondering if there is a way to make the engine issue those subqueries in parallel. Has anyone had the same experience?
Which is the better use case: using a single report as a dashboard, or creating multiple reports and using them for a dashboard? Your videos are super. Thanks.
As all the MVPs say: "it depends"
Generally, sharing a single report with users is all you need to do, unless the end users require a dashboard with a few key visuals from the report. If you have multiple reports to share, consider using an App to share all the reports instead of a dashboard. You could also create a dashboard with key visuals from your multiple reports and include that dashboard in the App alongside them.
Hi!
I have a different speed problem. My report grew quite big over time, and now every modification takes minutes. For example, if I add a new column to a table, I press New Column and get the message "Working on it" with a spinner for about 3-4 minutes; then I type in the code for the new column, press Enter, and again it's working on it for 3-4 minutes. The same thing happens if I press Enter Data to manually add a very small table, and with every modification, deletion, or addition to the data. The visuals are not affected: I can add or remove as much data as I want in the visuals without this delay. I checked with Performance Analyzer, and in total it's between 10 and 20 seconds, which is a lot but still pretty far from 2x4 minutes. Where is the problem and how can I fix it? Thanks!
Please, what are the specs of your machine, especially RAM?
very useful, thanks Patrick
Patrick, my data is huge: around 375M records. Even though I optimized the DAX and reduced the visuals, it is still slow. Is there any solution you can suggest?
That's just Power BI for you; it's a slow, barely usable tool for anything big enough.
Great help saving my office time. Love ya! Lol
Yes. In a star schema, how do we handle dimensions with hundreds of millions of rows? A full history refresh fails due to the concurrency limits of the DWH; no problem with incremental refresh, though.
I'm curious what dimension it is that has hundreds of millions of unique entries.
@@d3x0x It's a property dimension, SCD type 2, with 3 years of data.
Can we use an Azure app ID and client secret in Power BI to configure data sources?
Might need some help on Ugly DAX,
I have a fact table that has a lot of categories, with another column called total. I want DAX to calculate the total only where category_id = 100015.
My current DAX is
Total Revenue =
SUMX (
    'MyTable',
    IF (
        MyTable[category_id] = 100015,
        CALCULATE ( SUM ( MyTable[total] ) ),
        BLANK ()
    )
)
Any help on a better way to do this would be appreciated; I'm looking at it from the performance side. Note: category_id and total are in the same table.
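For performance, a single filter argument to CALCULATE is usually faster than iterating the table with SUMX and a row-by-row IF, since the filter and sum can be pushed down to the storage engine. A sketch of the measure, using the table and column names from the comment above:

```dax
Total Revenue =
CALCULATE (
    SUM ( MyTable[total] ),            -- simple aggregation, storage-engine friendly
    MyTable[category_id] = 100015      -- applied once as a filter, not per row
)
```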
how is he doing the filter bar on the left?
If you have to dive into data modeling it's not self-service bi.
No, I don't buy into this at all. The whole app and everything about it is hideously slow. Loading Power BI is slow, the report designer is slow as hell, dragging components is slow, changing data is slow, and the workflow is hideous. Everything about Power BI is horrifically slow.
Hi, I am facing slow performance when loading a matrix.
Patrick, why is Power BI SLOW?
These videos are getting a little spicy! 4:37
I'm talking about just running it. Just running. It's so slow.
Cool video! I like the whatsapp conversation! "You don't want to see my data model"; " You don't want to see my DAX". So funny. :)
Btw, I think that six cards plus one matrix is acceptable in most cases, when you have modified the data model and DAX. Wondering if you have tested the result with the first not-so-ugly baby kept?
Your table has more than 1 BILLION rows hahaha look at that in 6:15: at the bottom-left you can see 1,034,769,900 rows
Is it always possible to create a star schema?
If you have more than one fact table it may not be a simple star, but like several stars connected by conformed dimensions.
@@BernatAgulloRosello Hi, if there is one single table that has the columns used to filter it, is it necessary to create separate dimension tables derived from that table and connect them back to the main table? Filters on the table's own columns work the same as columns in dimensions derived from it.
@@saratchandra7388 if it's a large table and you have several columns that are actually a dimension, you'll be much better off creating a separate table and leaving only the key on the fact table to establish the relationship.
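As an illustration, the split can be sketched as a DAX calculated table (the table and column names here are hypothetical; in practice it is usually better to do this upstream in Power Query or at the source):

```dax
-- Build a dimension from the distinct attribute values on the fact table.
DimCategory =
DISTINCT (
    SELECTCOLUMNS (
        FactSales,
        "CategoryKey",  FactSales[CategoryKey],
        "CategoryName", FactSales[CategoryName]
    )
)
```

You would then relate DimCategory[CategoryKey] to FactSales[CategoryKey], hide or remove the descriptive columns from the fact table, and filter through the dimension.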
@@BernatAgulloRosello Don't you mean if you have columns that are attributes of a dimension?
@@saratchandra7388 A Star Schema is generally best practice when creating a PBI report, particularly if the Fact table has many rows and dimensional attributes. Your report will run quicker using a Star Schema and it helps future proof the report to add more Fact tables in the future and so ending up with a Snowflake Schema
My babies are beautiful in my eyes, but they might look different in others' eyes.
Patrick. I need you to rate my babies? How can I do that?? 👨🏽🍼
Another great video Patrick! However I was TOTALLY distracted and excited that there may be a LEGO theme for Teams?! Does such a thing exist?! My Google skills tell me not - BOO :(
👍
And you didn't even touch upon things like AvailableInMDX, making sure your keys are text and all the fun things you can do.
I assume you meant to say 'keys are numeric'
No, you are wrong. Power BI is total shit. I did the same queries with Python pandas; it took 5 seconds to generate my report right after importing the CSV, with the entire pipeline taking 15 seconds including cleaning etc. In Power BI it took 10 minutes, and the laptop was overheating!
If it is made by Microsoft, it is always shit. I can't believe how easily you can mess up something that is so easy.
Truth! :D
The 1st and 2nd babies are understandable, but the 3rd baby... it seems I need to study more and write better DAX to get rid of that ugly one.
No ugly babies for this guy, because of you 😀