Just used this as a step-by-step guide to analyze a model that I am working on. I knew I had to get rid of columns (and I had) but now I am pruning even more ruthlessly. And one thing I would add--don't be afraid to remove columns--you can always add what you need back. So what I do is select JUST the columns I know I want, and then use REMOVE OTHER COLUMNS. Then, if I find that there is a column I DO need, I go back to that REMOVE OTHER COLUMNS step and modify it by adding the name of the column I need back in. Super easy, super quick.
Love it! It is definitely something folks should be looking at.
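For anyone who wants to try that Remove Other Columns workflow, here is a rough Power Query (M) sketch. The workbook path and column names are made up, but the Table.SelectColumns step is what the Remove Other Columns button generates, and bringing a column back is just a matter of extending the list:

```
// Hypothetical query - file path and column names are for illustration only.
let
    Source     = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    SalesSheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    Promoted   = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars = true]),

    // "Remove Other Columns" keeps only the columns listed here.
    // If you later find you need another column, just add its name to this list.
    KeptColumns = Table.SelectColumns(
        Promoted,
        {"OrderDate", "ProductKey", "SalesAmount"}   // e.g. add "CustomerKey" back in if required
    )
in
    KeptColumns
```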
By unchecking the auto date/time option, my data model size came down by 22MB! I am so happy I tried this. You do a fab job, keep it coming!
I have recently started expanding my knowledge with PBI and your channel has amazing information, examples and tips. I appreciate your work very much! Thank you for your efforts!
Aaaahhhhh where would I be without Guy in a Cube? As always, fantastic info.
Excellent video! Just reduced my PBIX file from 136MB to 34MB. Goes to show how little I know about how data is stored. I had several tables with a couple of unique key columns.
WOW! That's amazing. So happy that this helped you out. That's pretty incredible. 👊
Great tips.. I always use Remove Other Columns to make sure I only keep the columns I need.. always get rid of the columns as a first step and not after you have done a bunch of transformations.. plus always, always reduce date-time to date only if you don't need the time. Time adds a lot of bulk to the size (I guess because of high cardinality). Hopefully the Power BI team will add a VertiPaq Analyzer-like tool to the Performance Analyzer.
Totally agree! Date-time to date is definitely something we recommend. If you don't need time, get rid of it. If you do need it, split it out.
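If it helps anyone, here is a minimal Power Query sketch of that date/time handling; the inline table and column names are just stand-ins for a real fact table:

```
let
    // Tiny inline table standing in for a real fact table (values are made up).
    Source = #table(
        type table [OrderDateTime = datetime, SalesAmount = number],
        { { #datetime(2024, 1, 15, 9, 30, 0), 120.5 },
          { #datetime(2024, 1, 15, 14, 5, 0),  80.0 } }
    ),
    // If the time portion is needed, split it into its own low-cardinality column first...
    AddTime  = Table.AddColumn(Source, "OrderTime", each DateTime.Time([OrderDateTime]), type time),
    // ...then reduce the original column to just the date (far fewer distinct values).
    DateOnly = Table.TransformColumns(AddTime, {{"OrderDateTime", DateTime.Date, type date}}),
    Renamed  = Table.RenameColumns(DateOnly, {{"OrderDateTime", "OrderDate"}})
in
    Renamed
```

If the time isn't needed at all, skip the AddTime step and just convert the column to a date.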
Because of this video, I was able to reduce the size of a Power BI report that includes a customized calendar dimension from 500 MB to 2 MB, just by turning off the Time Intelligence feature. This is so unreal that I got my coworker to reproduce the size reduction. In hindsight, it makes so much sense based on how Time Intelligence works. Thank you so much!
Hi Adam & Patrick, thank you guys so much for posting awesome content, as always :)
One thing I would like to point out is the shout-out for DAX Studio. I have to admit I was a little bit surprised that Darren Gosbell wasn't mentioned, as he's the creator and main contributor of DAX Studio. Yes, no doubt that Marco and Alberto (I have huge respect for them) have contributed some of the coding; Marco has also mentioned a few times that people have mistaken him for its creator and had to clarify that he contributed approx. 5 - 10% of it. So I'm not sure whether that's the case here.
Once again, thanks for the awesome content and keep being awesome!
Thanks for bringing this up. I had the same impression that the two DAX gurus from SQLBI were the creators of DAX Studio. Now I know the big man behind this very useful tool is Darren Gosbell. (mvp.microsoft.com/en-us/PublicProfile/35889?fullName=Darren%20Gosbell)
@ynwtint You're welcome. Cheers.
We have a lot of love for Darren! It is a SQLBI tool though, and that was the intent. Apologies for giving the wrong impression about the actual development time. That wasn't what we were going for.
@GuyInACube Don't be sorry, and totally understood :) I'm looking forward to your next video already. Cheers!
I completely agree with removing columns that you don't need. But I think we need to be careful that we don't remove so many columns that we can no longer guarantee uniqueness in the table. When a context transition occurs, the table is iterated, and if we have duplicate rows they will get double (or triple, etc.) processed, causing very hard-to-catch (and resolve) bugs.
Good techniques.
What I normally do is take out as many columns as possible with select columns, and you can always bring them back if at some stage you need a previously removed column.
Just disable data type detection and set data types as your last step.
Yup, not a bad approach to pull things in later when you need them. Can you explain more on the data type point?
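A quick sketch of what "types as the last step" can look like in Power Query. The CSV path and column names are made up, and the type detection toggle itself lives in the Power BI Desktop options:

```
let
    // Hypothetical CSV source - path and column names are for illustration only.
    Source  = Csv.Document(File.Contents("C:\Data\Orders.csv"), [Delimiter = ",", Encoding = 65001]),
    Headers = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),

    // ...do all the real shaping here while the columns are still text...
    Kept    = Table.SelectColumns(Headers, {"OrderDate", "ProductKey", "Quantity"}),

    // Set the data types once, as the final step, instead of letting a
    // "Changed Type" step get injected right after the source.
    Typed   = Table.TransformColumnTypes(
        Kept,
        {{"OrderDate", type date}, {"ProductKey", Int64.Type}, {"Quantity", Int64.Type}}
    )
in
    Typed
```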
Adam, this is an incredible video, thank you. It makes so much sense now that you have explained.
Thanks! 👊
Nice video. I like that it's not just repetitive basics; it's very real-life, scenario-based optimization.
Never really thought columns had an effect, thank you so much for this!
Brilliant! As someone used to working with tabular data, I inherently knew removing unwanted columns takes a huge load off the schema. I am a newbie to Power BI and was looking for ways to reduce the model size on my projects, and your video just proves how simple it is to cut down the size if you are really *clear* about your data. Thanks for highlighting that part so well, Adam!
Loads of love for this optimization technique. It felt like the PBIX file was suffocated with unrelated columns.
Thanks so much; As always, it has been super helpful! We greatly appreciate you guys giving back to the community this way. Keep up the good work!!
That means a lot Shabnam! Thank you so much 👊
@GuyInACube 👊 anytime!
My file size was relatively small (c. 4 MB) but a visual was failing to load in the Power BI service. This has helped me to optimise how the table loads and it is now working! It did not reduce my file size significantly (now c. 3.5 MB), but that's not the point anyway. Thanks Adam
Really amazing. I have reduced one of my PBIX files from 256 MB to 182 MB. I also discovered a lot of options to optimize my dataset.
Thank you
That's awesome! 👊
Excellent video! I will use COUNTROWS from now on and ditch unique IDs.
Awesome! If you have the time, always be sure to test things as well. Things may work differently with your data. Always good to validate.
Thanks Patrick! Amazing impact that losing a few redundant columns has! 🐱👤🐱👤
I am a recent subscriber to your channel and must say I love it! Thank you for putting in effort and time and sharing your knowledge.
You guys are the bomb! Thanks for the tips. That VertiPaq Analyzer thing? Holy crap! That's a gold mine! It shows all my measures! I've been looking for something like this for forever!
Ya, I have a model that takes over an hour to refresh every time. I'm using a dataflow as the source and it's still taking a long time. So, my next step will be to check if I need all the columns :) Thanks for the video.
First.. great video.. Second.. I love how you say to "jump over to Premium to give you some breathing room". Power BI Premium sits at a price point that only large corporations can afford. I would love to jump to it for the use of computed tables inside dataflows, but can't get it into the budget till next year.
Oh man... I really would love to show you what we're working on. I'm in healthcare data analysis. Healthcare data is legit big and we're doing everything we can think of to reduce our data size. Our latest project PBI file saves at 6GB!
Seeing that file size go down from 600MB to 74MB just made my jaw drop! Thanks for this!
Someone else already mentioned the datetime fields to watch out for and another one is calculated columns. Great job as always 👍
Yup. sooo many things. We have some other videos coming on data model optimizations. Great call outs though 👊
Hi Dan, why watch out for calculated columns? Could you clarify?
Thanks for what you guys do.
Seriously it's so practical and easy to absorb, your channel is very undersubscribed
Much appreciated Alfred! 👊
Nice video. I like that it's not just basic, repetitive skills, but a real-life scenario for optimization.
Amazing... just removing the auto date/time reduced a PBIX file from 20 MB to 2 MB. Loved it!!!
Great optimization tips, thank you
Most welcome! Thanks for watching 👊
15:30 you selected a few columns and then applied and loaded. When we publish this from Dev to Prod, do you think it will create a problem if the columns mismatch between dev, test, and prod?
Is the video still valid? I tried using VertiPaq Analyzer 2.01 and DAX Studio 2.14.1 and there is no SSAS connection in the data source as shown in the video at 4:40. Instead I have Query Data Source=$Workbook, and it cannot be changed to connect to the localhost port as described in the video...
The video is still valid. I think you were doing the step in Excel itself and not from within Power Pivot. Another option that became available after this video is to just do it directly within DAX Studio. When connecting with DAX Studio to your Power BI Desktop file, go to the Advanced ribbon and choose View Metrics.
My file size has reduced a lot. Thank you so much
If only I had seen this video in time, like for example last year... awesome video guys
Thanks for watching! 👊
Amazing! Turning off the date/time configuration you mention in the video reduced my report size by 8MB!
Good examples. I've got a team with an S1 AAS, with a ginormous composite transaction key that needs to die. Would save money to get it down to an S0.
Yeah it is amazing what exists in a model.
Very helpful Adam - thanks for doing the video
3:15 Yup I approve that! 😅 I worked with a table of around 12 mil records and not only did it take about 2+ hours to fetch on the desktop, it ate up all the RAM, making the PC almost unusable!!!
Very easy to get into that spot. crazy stuff. 👊
Hi Adam, great talk. However, what are your thoughts about the usual practice of creating huge / heavy / slow multipurpose "golden" datasets which intend to solve the "several sources of truth" problem by putting everything and the kitchen sink in a single dataset file serving dozens of reports?
You present very well!!
Appreciate that! 👊
You guys should cover the inforiver visual 🙏
Thanks for this information. Also, I think even if the organization says we may need this column later on, it is easy to get the column back again, so it's not a big deal.
Great ideas presented to reduce the dataset.
Appreciate that! We have some more videos coming on data model optimizations as well. 👊
@GuyInACube Super!!! Looking forward to those videos.
Hello fellow Devs *Please Note*:
The process for loading your model into VertiPaq Analyzer has changed.
✅ Export a VPAX file from DAX Studio first,
✅ Then load THAT into the Excel analyzer.
Instructions are on the first page of the new VertiPaq Analyzer ✅
Thank you! - This comment was a life saver!
Great video as always, you guys are great!
Appreciate that 🙏 Thanks for watching 👊
Is the Excel file auto-generated, or should we connect to it as a data source, or is it a SQL Server data source? The Excel sheet / data model part is not clear.
Hello Patrick - Thanks. This is a great tutorial on the usage of DAX Studio and the VertiPaq Analyzer. I have tried using it for my Power BI report, which is built on the SAP BW Application Server connector. However, I do not see the SSAS connection to update with the localhost address and analyze. Could you please help me understand how I can create it?
Thanks,
PS
I don't see the SSAS connection option when using VertiPaq Analyzer. I'm using a connection to a SQL DB with Azure Active Directory for Power BI.
Thanks Adam. I unchecked Auto date/time and my PBI file dropped from 80MB to 2.4MB !!!!!
Just stumbled across this video - some good tips. Couple of questions though:
1. Unchecking the auto-date/time setting stops me from being able to show a nice hierarchical date slicer (Year->Qtr->Month->Day) - How could I still have one or more of those with the setting disabled?
2. For reducing the number of columns in the dataset, wouldn't it be better to edit the initial source query to only get the columns you need from source?
Otherwise, you are telling Power BI to pull in all the columns (and have to handle them all), just to then say "now forget about half the columns I just told you to import"
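On question 1, the usual workaround (not shown in the video, just a sketch with a made-up range and names) is a dedicated date table with explicit Year/Quarter/Month columns that you can arrange into your own hierarchy:

```
let
    // Hypothetical date range - adjust to cover your data.
    StartDate = #date(2018, 1, 1),
    EndDate   = #date(2025, 12, 31),
    DayCount  = Duration.Days(EndDate - StartDate) + 1,

    // One row per day across the range.
    Dates   = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    AsTable = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}, null, ExtraValues.Error),
    Typed   = Table.TransformColumnTypes(AsTable, {{"Date", type date}}),

    // Columns to build a Year -> Qtr -> Month -> Day hierarchy from.
    WithYear    = Table.AddColumn(Typed, "Year", each Date.Year([Date]), Int64.Type),
    WithQuarter = Table.AddColumn(WithYear, "Quarter", each "Q" & Text.From(Date.QuarterOfYear([Date])), type text),
    WithMonth   = Table.AddColumn(WithQuarter, "Month", each Date.MonthName([Date]), type text),
    WithMonthNo = Table.AddColumn(WithMonth, "MonthNo", each Date.Month([Date]), Int64.Type)
in
    WithMonthNo
```

Mark it as a date table in the model, sort Month by MonthNo, and the Year/Quarter/Month/Date columns can be arranged into a hierarchy for the slicer.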
What does the "Evaluating" step do when refreshing the report? I am using a SharePoint folder with JSON as my data source. It is very slow when refreshing; most of the time is spent in evaluation.
So I have a question, since I am confused. Based on another video as well, I wonder: should I use SQL to pre-filter when loading data, OR should I just load it all and remove the data and columns I don't need using transformations? You pointed out in another video that transformations will be blocked if you use SQL to pre-select data. Or is that only the case if you have used select all and not selected specific columns?
Secondly, would it improve speed if I created a query which links all IDs to each other in one table, and can you incrementally refresh only that table and refresh the lookup tables whenever you need them? Basically, is the refresh of a model done on the whole model or specifically on certain tables?
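Not an official answer, but one middle ground is to skip hand-written SQL and do the row and column reduction with Power Query steps that usually fold back to the server. A sketch, with made-up server, table, and column names:

```
let
    // Hypothetical SQL Server source.
    Source    = Sql.Database("myserver", "SalesDW"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],

    // Filters and column selection like these typically fold into the query
    // sent to SQL Server, so the reduction happens at the source.
    Recent = Table.SelectRows(FactSales, each [OrderDate] >= #date(2023, 1, 1)),
    Kept   = Table.SelectColumns(Recent, {"OrderDate", "ProductKey", "SalesAmount"})
in
    Kept
```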
Do you normally create a backup of the PBI data before you remove columns? Is that as easy as just saving a copy of the PBIX file? Just in case we removed columns we should not have. If so, what is the proper way to back up before we start removing columns, to retain the original the client provided?
A quick question about dataflows and the PBI service: would it make sense to load large dimensions to dataflows and then only reference them (the dataflow) in reports, or could that approach cause issues in the long run?
I wonder the same thing. If the dataflow is being refreshed, how can I access that and not refresh it in my model? Maybe this is similar to a DirectQuery for the dataflow. I don't desire to refresh data, I just want whatever is stored in the dataflow.
SSAS doesn't show up in the "existing connections" inside Power Pivot?
Very helpful, Adam. Thanks a lot.
LOL on 'just like on a cooking show' liked that too!
Great stuff. Got rid of 500MB worth of LocalDateTable tables o/. Also found that in one report we have 22 million rows where one column contains numbers but is stored as a string. Wrong on so many levels :) It's not even used in the report! 700MB saved in a few seconds. This will come in handy when setting up guidelines for building Power BI reports in our organization. Thanks!
Thank you man! Just turning off the time intelligence reduced my file size from 52MB to 22MB :D
hi! Thank you very much for sharing such interesting videos! Could you share the excel file you use in the video and could you explain a little more about the use of DAX Studio?
Thanks!
I agree, it would be great if we could have some more details about how to use Dax Studio. Thanks Adam for your great explanation as ever!
The excel spreadsheet is VertiPaq Analyzer - which you can get from sqlbi.com - www.sqlbi.com/tools/vertipaq-analyzer/. Also Marco, from sqlbi.com, has a longer recording on what to do with a slow report. This goes into details on DAX Studio. www.sqlbi.com/tv/my-power-bi-report-is-slow-what-should-i-do-2/. Also, we will be looking at doing more videos on these topics as well.
Very good video. This needs an update though, as I cannot follow along; the options are not the same any more.
Hi Adam love your videos. What do you guys use to zoom in and out on the screen? I also saw it at the MS biz app summit. Thanks!
ZoomIt docs.microsoft.com/en-us/sysinternals/downloads/zoomit
I broke my rule in this video. I actually was surprised when I saw it in the editing. Was on auto pilot. I used ZoomIt in the video at one point, but that's honestly the first time - in a long time - I've done that in the videos. Normally all of the zoom and highlight stuff I do in post. But when presenting in person I absolutely use ZoomIt. Every presenter should have it! Or something similar to it.
Great video, I just saw the other one from Aug 2020 :) about disabling Auto Date/Time.
Appreciate this kind of video.
Love this. I was wondering: if you have a surrogate key and a business ID, which would be high cardinality, and you join the tables by the key, could you actually remove the business keys from the model or should you always leave those in? For example, Product Key 1, Product ID 35335? I'm thinking in terms of the fact table AND the dimension if you have gone for a star schema.
Debbie, you will need the surrogate keys for the relationships, but if you are not using the business ID I would definitely remove it from the model. The only time we suggest keeping any type of ID is if it is needed for reporting. Great point!
Thanks for the info in this video; impressed with how much this decreases the size of the Power BI file! Would a similar approach work if you are suffering from the "visual has exceeded the available resources" error in the service when linking to a Power BI dataset?
Great video. Thanks a lot. I just learned some new tips.
Hi Adam, very nice video! Thanks a lot. Just wondering, what video recording application do you use to record your work in Power BI? Would really appreciate it if you could reply!
OMG! Is that a Lone Star State on the Millennium Falcon? LEGIT!
Yes it is! Thanks! 👊
Guys you are amazing, keep it up
Thank you so much! Really appreciate that. 👊
Hello. Is there any difference between removing columns and not loading all the columns from the file, from a performance perspective?
After choosing the columns to keep, will the data refresh the same or will it throw an error?
How do I access the DAX Studio and VertiPaq Analyzer tools?
I am trying to run an already created report file from Microsoft, "Customer Service Analytics for Dynamics 365.pbix", for my Dynamics 365 instance, but it fails every time while loading with the error "The refresh operation failed because it took more than 120 minutes to complete. Consider reducing the size of your dataset or breaking it up into smaller datasets". Now I guess I can only do all this optimization once my data is loaded into the .pbix file. What do I do if the report doesn't load in the first place?
Hello, may I know why there is SSAS? Is it the data source of the PBIX?
I initially add just the fields I can filter on (market, customer type etc), together with one fact (e.g. order quantity), then I filter, then I add all the other columns required. The only annoying thing is that once you change a column data type, then you can't add any more from the data tables (at least on import).
Thank you so much for your tutorial. I am using an SSAS multidimensional live connection. I am trying to create a stacked column chart which shows month on the x-axis and value on the y-axis, and also shows the month-wise top-selling store. But when I try a top N filter by store, that filter shows the highest sales per year and does not filter month-wise sales. Please help me figure out how I can solve this.
This is really great! Thanks Adam! Would there be any negative impact(s) if we disable the time intelligence for an existing report that has datetime columns?
Absolutely not, unless you are using them in the report. We actually recommend disabling time intelligence if you have your own date table.
Can you do this in Power Query?
Yoooo ....
Sometimes a refresh process that typically lasts 10-15 min hits some problem and doesn't finish successfully. In the meantime, that scheduled refresh keeps the status as "in progress" for 2 hours and I cannot start a manual refresh while the scheduled refresh hasn't finished.
How can I manually stop a refresh that is currently running?
Thanks for the video. This message appears when I edit the connection and refresh the data for that connection: "We couldn't refresh the connection. Please go to existing connections and verify they connect to the file or server". What should I do?
Does it make any sense to group and summarise the remaining columns after deleting unnecessary IDs? Would it increase performance, given that Power BI has very intelligent "packing" abilities?
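It can help when the report only needs the aggregated grain. Here is a rough Power Query sketch (the inline data and column names are made up) of pre-aggregating once the IDs are gone; whether it pays off depends on how much the row count actually drops:

```
let
    // Hypothetical fact table after the unnecessary ID columns were removed.
    Source = #table(
        type table [OrderDate = date, ProductKey = number, SalesAmount = number],
        { { #date(2024, 1, 1), 10, 25.0 },
          { #date(2024, 1, 1), 10, 40.0 },
          { #date(2024, 1, 2), 12, 15.5 } }
    ),
    // Summarise to the grain the report actually needs - fewer rows to compress and scan.
    Grouped = Table.Group(
        Source,
        {"OrderDate", "ProductKey"},
        {{"SalesAmount", each List.Sum([SalesAmount]), type number},
         {"RowCount",    each Table.RowCount(_), Int64.Type}}
    )
in
    Grouped
```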
It was so great. Could I have the source file shown in the video?
Unfortunately no. It is based on the ContosoRetailDW sample database, but we increased the number of rows. I had 25 million rows in that file. It is pretty big.
Most Power BI devs use all the columns from the source and then don't understand why the project runs slow. The best solution is to first PREPARE your data source. In SQL Server, create queries with only the columns that you have to use in Power BI. This process is much faster than the others. And you also end up with only the columns in PBI that you will actually use.
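A minimal sketch of that idea from the Power BI side, assuming a SQL Server source. The server, database, and column names are all made up, and a view on the server with just these columns works the same way:

```
let
    // Hand the column selection to SQL Server instead of importing everything.
    Source = Sql.Database(
        "myserver",
        "SalesDW",
        [Query = "SELECT OrderDate, ProductKey, SalesAmount FROM dbo.FactSales"]
    )
    // Note: with a hand-written native query like this, later Power Query steps
    // generally won't fold back to the server, so do the heavy filtering in the SQL itself.
in
    Source
```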
Hi, I need to analyze multiple CSV files of 1 MB each. How many files can I connect to?
From where can I get this Excel file?
Excel is fun has an outstanding video on the same topic
Dead in the water after the VertiPaq Analyzer segment. I have no SSAS connection to edit. Could I please get a workaround?
Extremely helpful:)
Appreciate that Togeir! 👊
Hi Adam, I am a very big fan of your Power BI videos. I have a small question about how to validate the reports that are developed in Power BI Desktop. Thanks in advance!
If I disable auto date/time, should I create all the measures for the date formats that I need for those date fields?
You should have a Calendar/Date table that accommodates the values you need for your report.
Hey guys in a cube! I've run into a memory issue with my PBI reports. The dashboard visual is a table (🤦) and the stakeholder wants to add a new measure to the table. The file size is less than 100MB, and I went through these great tips, but when I try to pull in the new measure, I still get an error that there's not enough memory to complete the action. It pulls in fine when I create a new page with fewer columns in the table. Q: is there a limit to the number of columns PBI can handle in a table visual? I'm currently recreating the measures in a dedicated measure table, outside the data table, hoping that will relieve the memory pressure. Am I on the wrong track? Any advice for fixing memory issues?
I'm using a huge dataset, around 2B rows, and I'm using a Python query to pull the data from Mongo... does incremental refresh help in this scenario? Please do help!
One little thing though - if you disable Auto date/time, many of your quick measures won't work any longer, as only Power BI-provided date hierarchies are supported.
Is the Excel file you use available somewhere? I didn't find it in the links.
Sorry, just found it down in the comments.
Hello, here from Chile, and I am a fan of your YouTube channel. I would like to get the files that you show in the video to be able to practice and follow your steps. I would be grateful if you could share them. Regards!!
Unfortunately, they are pretty big. It is just the ContosoRetailDW database, modified with extra rows. I also added a custom column for that OrderID column to simulate one :) I did that at the SQL level using a view, which also flattens out the data.
A tool that detects unused Tables & Columns would be great!
Hi @Kazaki you can try Power BI Helper radacad.com/power-bi-helper
Agreed. that would be a great tool.
Although it doesn't show which aren't used, Power BI Helper (radacad.com/power-bi-helper) by RADACAD does show the columns which are used. I just put Power BI Helper on one screen and the Power BI file on the other, and remove the unused measures and columns.
Power BI Sentinel does it.
Hello everyone. I got an error message - "Not Enough Memory To Complete This Operation". This was happening while I was trying to refresh the data. Does anyone know what I should do to fix that issue?
Is there a better way to see all the columns not being used than to just look at the data model?
Wow awesome tips
Glad you liked it! 👊