2 ways to reduce your Power BI dataset size and speed up refresh

  • Published Jan 7, 2025

COMMENTS •

  • @alt-enter237
    @alt-enter237 4 years ago +12

    Just used this as a step by step to analyze a model that I am working on. I knew I had to get rid of columns (and I had) but now I am pruning even more ruthlessly. And one thing I would add--don't be afraid to remove columns--you can always add what you need back. So what I do is select JUST the columns I know I want, and then use REMOVE OTHER COLUMNS. Then, if I find that there is a column I DO need, I go back to that REMOVE OTHER COLUMNS step, and modify the command by adding the name of the column I need back in. Super easy, super quick.

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Love it! It is definitely something folks should be looking at.
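
      A minimal Power Query M sketch of that "keep only what you need" workflow (the file path and column names here are hypothetical):

          let
              Source = Csv.Document(File.Contents("C:\data\Sales.csv"), [Delimiter = ",", Encoding = 65001]),
              Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
              // "Remove Other Columns" generates Table.SelectColumns under the hood.
              // To restore a column later, just add its name back to this list.
              Kept = Table.SelectColumns(Promoted, {"OrderDate", "ProductKey", "SalesAmount"})
          in
              Kept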

  • @sandykashyap
    @sandykashyap 3 years ago +6

    Just by unchecking auto date/time, my data model size dropped by 22MB! I am so happy I tried this. You do a fab job, keep it coming!

  • @dangelo90
    @dangelo90 5 years ago +20

    I have recently started expanding my knowledge with PBI and your channel has amazing information, examples and tips. I appreciate your work very much! Thank you for your efforts!

  • @meg11c
    @meg11c 3 years ago +1

    Aaaahhhhh where would I be without Guy in a Cube? As always, fantastic info.

  • @Slyder9278
    @Slyder9278 5 years ago +8

    Excellent video! Just reduced my PBIX file from 136MB to 34MB. Goes to show how little I know about how data is stored. I had several tables with a couple of unique key columns.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      WOW! That's amazing. So happy that this helped you out. That's pretty incredible. 👊

  • @SandeepPawar1
    @SandeepPawar1 5 years ago +19

    Great tips. I always use Remove Other Columns to make sure I only keep the columns I need. Always get rid of columns as a first step, not after you have done a bunch of transformations. Plus, always reduce date-time to date if you don't need the time; time adds a lot of bulk to the size (I guess because of high cardinality). Hopefully the Power BI team will add a VertiPaq Analyzer-like tool to Performance Analyzer.

    • @GuyInACube
      @GuyInACube  5 years ago +3

      Totally agree! Date-time to date is definitely something we recommend. If you don't need time, get rid of it. If you do need it, split it out.
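
      A sketch of that split in Power Query M, assuming a hypothetical fact table with one datetime column. Storing date and time separately cuts cardinality from one distinct value per second down to at most 86,400 time-of-day values plus one value per day:

          let
              // Hypothetical one-row sample standing in for a real fact table.
              Sales = #table(
                  type table [OrderDateTime = datetime, SalesAmount = number],
                  {{#datetime(2025, 1, 7, 13, 45, 0), 99.95}}),
              AddedDate = Table.AddColumn(Sales, "OrderDate", each DateTime.Date([OrderDateTime]), type date),
              AddedTime = Table.AddColumn(AddedDate, "OrderTime", each DateTime.Time([OrderDateTime]), type time),
              Result = Table.RemoveColumns(AddedTime, {"OrderDateTime"})
          in
              Result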

  • @chiligarden
    @chiligarden 3 years ago +1

    Because of this video, I was able to reduce the size of a Power BI report that includes a customized calendar dimension from 500 MB to 2 MB, just by turning off the Time Intelligence feature. This was so unreal that I got my coworker to reproduce the size reduction. In hindsight, it makes so much sense based on how Time Intelligence works. Thank you so much!

  • @likhui
    @likhui 5 years ago +4

    Hi Adam & Patrick, thank you guys so much for posting awesome content, as always :)
    One thing I would like to point out is the shout-out for DAX Studio. I have to admit I was a little surprised that Darren Gosbell wasn't mentioned, as he's the creator and main contributor of DAX Studio. Yes, no doubt Marco and Alberto (I have huge respect for them) have contributed some of the coding; Marco has also mentioned a few times that people have mistaken him for its creator and had to clarify that he contributed approx. 5-10% of it. So I'm not sure whether that's the case here.
    Once again, thanks for the awesome content and keep being awesome!

    • @ynwtint
      @ynwtint 5 years ago +2

      Thanks for bringing this up. I had the same impression that the two DAX gurus from SQLBI were the creators of DAX Studio. Now I know the big man behind this very useful tool is Darren Gosbell. (mvp.microsoft.com/en-us/PublicProfile/35889?fullName=Darren%20Gosbell)

    • @likhui
      @likhui 5 years ago +1

      @@ynwtint You're welcome. Cheers.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      We have a lot of love for Darren! It is a SQLBI tool though, and that was the intent. Apologies for giving the wrong impression about actual development time. That wasn't what we were going for.

    • @likhui
      @likhui 5 years ago

      @@GuyInACube Don't be sorry and totally understood :) I'm looking forward to your next video already. Cheers!

  • @Randyminder
    @Randyminder 5 years ago +10

    I completely agree with removing columns that you don't need. But I think we need to be careful not to remove so many columns that we can no longer guarantee uniqueness in the table. When a context transition occurs, the table is iterated, and if we have duplicate rows they will get double- (or triple-) processed, causing bugs that are very hard to catch and resolve. One way to sanity-check this is sketched below.
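
    A way to run that sanity check in Power Query M (the table here is a hypothetical stand-in; after over-pruning, its two rows collide):

        let
            Pruned = #table(
                type table [CustomerKey = Int64.Type, SalesAmount = number],
                {{1, 10.0}, {1, 10.0}}),
            TotalRows = Table.RowCount(Pruned),
            DistinctRows = Table.RowCount(Table.Distinct(Pruned)),
            // false means duplicates exist, so iterators may double-count them.
            StillUnique = TotalRows = DistinctRows
        in
            StillUnique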

  • @bilalazim1901
    @bilalazim1901 5 years ago +4

    Good techniques.
    What I normally do is take out as many columns as possible with Select Columns; you can always bring back a previously removed column if you need it at some stage.
    Just disable data type detection and set data types as your last step.

    • @GuyInACube
      @GuyInACube  5 years ago +2

      Yup, not a bad approach to pull things in later when you need them. Can you explain more about the data type point?
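
      A sketch of that ordering in Power Query M (file path and column names are hypothetical): prune columns early, transform, and set data types once as the final step rather than letting automatic type detection insert an early typing step:

          let
              Raw = Table.PromoteHeaders(
                  Csv.Document(File.Contents("C:\data\Orders.csv"), [Delimiter = ","]),
                  [PromoteAllScalars = true]),
              Kept = Table.SelectColumns(Raw, {"OrderID", "OrderDate", "Quantity"}),
              // ...other transformations go here...
              // Typing once at the end avoids re-typing columns after every upstream change.
              Typed = Table.TransformColumnTypes(
                  Kept,
                  {{"OrderID", type text}, {"OrderDate", type date}, {"Quantity", Int64.Type}})
          in
              Typed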

  • @dylandelport6497
    @dylandelport6497 4 years ago +8

    Adam, this is an incredible video, thank you. It makes so much sense now that you have explained.

  • @navjeet41
    @navjeet41 4 years ago +1

    Nice video. I like that it's not just repetitive basics; it's a very real-life, scenario-based look at optimization.

  • @bernadettearaea6976
    @bernadettearaea6976 3 years ago +3

    Never really thought columns had an effect, thank you so much for this!

  • @meetdenis82
    @meetdenis82 4 years ago +3

    Brilliant! As someone used to working with tabular data, I inherently knew removing unwanted columns takes a huge load off a schema. I am a newbie to Power BI and was looking for ways to reduce the model size on my projects, and your video just proves how simple it is to cut down the size if you are really *clear* about your data. Thanks for highlighting that part so well, Adam!

  • @avecNava
    @avecNava 5 years ago +2

    Loads of love for this optimization technique. It felt like the PBIX file was suffocated with unrelated columns.

  • @ShabnamKhan-vk7fj
    @ShabnamKhan-vk7fj 5 years ago +6

    Thanks so much; As always, it has been super helpful! We greatly appreciate you guys giving back to the community this way. Keep up the good work!!

    • @GuyInACube
      @GuyInACube  5 years ago +1

      That means a lot Shabnam! Thank you so much 👊

    • @ShabnamKhan-vk7fj
      @ShabnamKhan-vk7fj 5 years ago

      @@GuyInACube 👊 anytime!

  • @curious_yang
    @curious_yang 3 years ago

    My file size was relatively small (c. 4 MB) but a visual failed to load in the Power BI service. This helped me optimise how the table loads, and it is now working! It did not reduce my file size significantly (now c. 3.5 MB), but that's not the point anyway. Thanks Adam

  • @pierresonkeng6827
    @pierresonkeng6827 4 years ago +4

    Really amazing. I reduced one of my pbix files from 256 MB to 182 MB. I also discovered a lot of options for optimizing my dataset.
    Thank you

  • @PowerProd
    @PowerProd 4 years ago +3

    Excellent video! I will use COUNTROWS from now on and ditch unique IDs

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Awesome! If you have the time, always be sure to test things as well. Things may work differently with your data. Always good to validate.

  • @scooterza
    @scooterza 2 years ago

    Thanks Patrick! Amazing impact that losing a few redundant columns has! 🐱‍👤🐱‍👤

  • @ishasakalley4254
    @ishasakalley4254 5 years ago +5

    I am a recent subscriber to your channel and must say I love it! Thank you for putting in effort and time and sharing your knowledge.

  • @roseventura1711
    @roseventura1711 4 years ago

    You guys are the bomb! Thanks for the tips. That VertiPaq Analyzer thing? Holy crap! That's a gold mine! It shows all my measures! I've been looking for something like this for forever!

  • @MartinKuzmicz
    @MartinKuzmicz 3 years ago

    Yeah, I have a model which takes over an hour to refresh every time. I'm using a dataflow as the source and it still takes a long time. So my next step will be to check whether I need all the columns :) Thanks for the video.

  • @jamesharvey1979
    @jamesharvey1979 4 years ago +1

    First: great video. Second: I love how you say to "jump over to Premium to give you some breathing room." Power BI Premium sits at a price point that only large corporations can afford. I would love to jump to it for the use of computed tables inside dataflows, but can't get it into the budget till next year.

  • @JasonRidenour
    @JasonRidenour 3 years ago

    Oh man... I really would love to show you what we're working on. I'm in healthcare data analysis. Healthcare data is legit big, and we're doing everything we can think of to reduce our data size. Our latest project's PBI file saves at 6GB!

  • @SolutionsAbroad
    @SolutionsAbroad 4 years ago

    Seeing that file size go down from 600MB to 74MB just made my jaw drop! Thanks for this!

  • @denglishbi
    @denglishbi 5 years ago +2

    Someone else already mentioned the datetime fields to watch out for and another one is calculated columns. Great job as always 👍

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Yup. sooo many things. We have some other videos coming on data model optimizations. Great call outs though 👊

    •  5 years ago +1

      Hi Dan, why watch out for calculated columns? Could you clarify?

  • @alfredlear4141
    @alfredlear4141 5 years ago +3

    Thanks for what you guys do.
    Seriously it's so practical and easy to absorb, your channel is very undersubscribed

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Much appreciated Alfred! 👊

  • @PabloBadenas
    @PabloBadenas 1 year ago

    Amazing... just turning off Auto date/time reduced a pbix file from 20MB to 2MB. Loved it!!!

  • @wizaphiri19
    @wizaphiri19 4 years ago +3

    Great optimization tips, thank you

    • @GuyInACube
      @GuyInACube  4 years ago

      Most welcome! Thanks for watching 👊

  • @subusahu69
    @subusahu69 1 year ago

    15:30 you selected a few columns and hit Apply and Load. When we publish this from Dev to Prod, do you think it will create a problem if the columns mismatch between dev, test, and prod?

  • @marianszetyinszki9963
    @marianszetyinszki9963 4 years ago

    Is the video still valid? I tried using VertiPaq Analyzer 2.01 and DAX Studio 2.14.1 and there is no SSAS connection in the data sources, as shown in the video at 4:40. Instead I have Query Data Source=$Workbook, and it cannot be changed to connect to the localhost port as described in the video...

    • @GuyInACube
      @GuyInACube  4 years ago

      The video is still valid. I think you were doing the step in Excel itself and not from within Power Pivot. Another option that became available after this video is to do it directly within DAX Studio: when connecting DAX Studio to your Power BI Desktop file, go to the Advanced ribbon and choose View Metrics.

  • @arunasubin8965
    @arunasubin8965 4 years ago +1

    My file size has reduced a lot. Thank you so much

  • @PedroCabraldaCamara
    @PedroCabraldaCamara 4 years ago +1

    If only I had seen this video in time, like, for example, last year... awesome video guys

  • @dianamgdata
    @dianamgdata 3 years ago

    Amazing! Turning off the date/time configuration you mention in the video reduced my report size by 8MB!

  • @Milhouse77BS
    @Milhouse77BS 5 years ago +2

    Good examples. I’ve got a team with an S1 AAS, with ginormous composite transaction key that needs to die. Would save money to get it down to an S0.

    • @GuyInACube
      @GuyInACube  5 years ago

      Yeah it is amazing what exists in a model.

  • @jhewitthunt
    @jhewitthunt 3 years ago

    Very helpful Adam - thanks for doing the video

  • @etherlords88
    @etherlords88 5 years ago +2

    3:15 Yup, I approve of that! 😅 I worked with a table of around 12 million records, and not only did it take about 2+ hours to fetch on the desktop, it ate up all the RAM, making the PC almost unusable!!!

    • @GuyInACube
      @GuyInACube  5 years ago

      Very easy to get into that spot. Crazy stuff. 👊

  • @juanlauroaguirre5646
    @juanlauroaguirre5646 2 years ago

    Hi Adam, great talk. However, what are your thoughts on the usual practice of creating huge/heavy/slow multipurpose "golden" datasets, which intend to solve the "several sources of truth" problem by putting everything and the kitchen sink into a single dataset file serving dozens of reports?

  • @RajanieshKaushikk
    @RajanieshKaushikk 4 years ago +1

    You present very well!!

  • @richardostrea7842
    @richardostrea7842 2 years ago

    You guys should cover the Inforiver visual 🙏

  • @shafialameri8363
    @shafialameri8363 4 years ago

    Thanks for this information. Also, I think even if the organization says "we may need this column later on," it is easy to get the column back again; not a big deal.

  • @chamilam
    @chamilam 5 years ago +1

    Great ideas presented to reduce the dataset.

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Appreciate that! We have some more videos coming on data model optimizations as well. 👊

    • @chamilam
      @chamilam 5 years ago

      @@GuyInACube Super !!! looking forward to those videos.

  • @gkool4655
    @gkool4655 3 years ago +2

    Hello fellow devs, *please note*:
    The process for loading your model into VertiPaq Analyzer has changed.
    ✅ Now export a VPAX file first from DAX Studio,
    ✅ then load THAT into the Excel analyzer.
    Instructions are on the first page of the new VertiPaq Analyzer ✅

  • @arklur3193
    @arklur3193 5 years ago +3

    Great video as always, you guys are great!

    • @GuyInACube
      @GuyInACube  5 years ago

      Appreciate that 🙏 Thanks for watching 👊

  • @C15-k4d
    @C15-k4d 6 months ago

    Is the Excel file auto-generated, or should we connect it as a data source, or is it a SQL Server data source? The Excel sheet / data model part is not clear.

  • @skumars78
    @skumars78 2 years ago

    Hello Patrick - thanks. This is a great tutorial on the usage of DAX Studio and VertiPaq Analyzer. I have tried using it for my Power BI report, which is built on the SAP BW Application Server connector. However, I do not see an SSAS connection to update to localhost and analyze. Could you please help me understand how I can create it?
    Thanks,
    PS

  • @DEMONTmx
    @DEMONTmx 3 years ago

    I don't see the SSAS connection option when using VertiPaq Analyzer. I'm using a connection to a SQL DB with Azure Active Directory for Power BI.

  • @donaldscott8782
    @donaldscott8782 4 years ago

    Thanks Adam. I unchecked Auto date/time and my PBI file dropped from 80MB to 2.4MB !!!!!

  • @brypie04
    @brypie04 2 years ago

    Just stumbled across this video - some good tips. Couple of questions though:
    1. Unchecking the auto date/time setting stops me from being able to show a nice hierarchical date slicer (Year->Qtr->Month->Day). How could I still have one or more of those with the setting disabled?
    2. For reducing the number of columns in the dataset, wouldn't it be better to edit the initial source query to only get the columns you need from the source?
    Otherwise, you are telling Power BI to pull in all the columns (and handle them all), just to then say "now forget about half the columns I just told you to import".
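
    On question 1, a dedicated date table restores the Year/Qtr/Month/Day hierarchy (see the date table sketch further down this thread). On question 2, for foldable sources such as SQL Server, a "Remove Other Columns" step typically folds back into the source query, so only the selected columns ever cross the wire anyway; for flat files the pruning happens locally but still shrinks the model. A sketch with hypothetical server, database, and column names:

        let
            Source = Sql.Database("myserver", "ContosoRetailDW"),
            FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
            // With query folding, this step is translated into roughly
            // SELECT SalesKey, DateKey, SalesAmount FROM dbo.FactSales,
            // so the unused columns are never imported at all.
            Kept = Table.SelectColumns(FactSales, {"SalesKey", "DateKey", "SalesAmount"})
        in
            Kept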

  • @sunilg7648
    @sunilg7648 3 years ago

    What does the Evaluating step do when refreshing the report? I am using a SharePoint folder with JSON as my data source. It is very slow when refreshing; most of the time is spent in evaluation.

  • @huuya
    @huuya 1 year ago

    So I have a question, since I am confused. Based on another video as well, I wonder: should I use SQL to pre-filter when loading data, or should I just load everything and remove the data and columns I don't need using transformations? You pointed out in another video that transformations will be blocked if you use SQL to pre-select data. Or is that only the case if you have used select-all and not selected specific columns?
    Secondly, would it improve speed to create a query which links all IDs to each other in one table, and can you increment only that table and refresh the lookup tables whenever you need them? Basically, is the refresh of a model done on the whole model or specifically on certain tables?

  • @juanlopez4033
    @juanlopez4033 4 years ago +1

    Do you normally create a backup of the PBI file before you remove columns? Is that as easy as just copying the PBIX file, in case we removed columns we should not have? If so, how do you back up the proper way before starting to remove columns, to retain the original the client provided?

  • @1BlackSwordsman1
    @1BlackSwordsman1 5 years ago +1

    A quick question about dataflows and the PBI service: would it make sense to load large dimensions into dataflows and then only reference the dataflow in reports, or could that approach cause issues in the long run?

    • @claytillman2227
      @claytillman2227 5 years ago

      I wonder the same thing. If the dataflow is being refreshed, how can I access it without refreshing it in my model? Maybe something similar to DirectQuery for the dataflow. I don't want to refresh the data; I just want whatever is stored in the dataflow.

  • @Drengen10
    @Drengen10 3 years ago

    SSAS doesn't show up in the "existing connections" inside Power Pivot?

  • @jennethtaja7458
    @jennethtaja7458 4 years ago

    Very helpful, Adam. Thanks a lot.
    LOL on 'just like on a cooking show' liked that too!

  • @karlnorberg7768
    @karlnorberg7768 3 years ago

    Great stuff. Got rid of 500MB worth of LocalDate tables \o/. Also found that in one report we have 22 million rows where one column contains numbers but is stored as a string. Wrong on so many levels :) It's not even used in the report! 700MB saved in a few seconds. This will come in handy when setting up guidelines for building Power BI reports in our organization. Thanks!

  • @rickuijlen4790
    @rickuijlen4790 2 years ago

    Thank you man! The time intelligence tip alone reduced my file size from 52MB to 22MB :D

  • @alejandrogonzalezbueno8044
    @alejandrogonzalezbueno8044 5 years ago +5

    Hi! Thank you very much for sharing such interesting videos! Could you share the Excel file you use in the video, and could you explain a little more about the use of DAX Studio?
    Thanks!

    • @Gustavo-Santana
      @Gustavo-Santana 5 years ago +2

      I agree; it would be great if we could have some more details about how to use DAX Studio. Thanks Adam for your great explanation, as ever!

    • @GuyInACube
      @GuyInACube  5 years ago +7

      The excel spreadsheet is VertiPaq Analyzer - which you can get from sqlbi.com - www.sqlbi.com/tools/vertipaq-analyzer/. Also Marco, from sqlbi.com, has a longer recording on what to do with a slow report. This goes into details on DAX Studio. www.sqlbi.com/tv/my-power-bi-report-is-slow-what-should-i-do-2/. Also, we will be looking at doing more videos on these topics as well.

  • @eljangoolak
    @eljangoolak 1 year ago

    Very good video. This needs an update though, as I cannot follow along; the options are not the same any more.

  • @arahuac0
    @arahuac0 5 years ago +1

    Hi Adam love your videos. What do you guys use to zoom in and out on the screen? I also saw it at the MS biz app summit. Thanks!

    • @denglishbi
      @denglishbi 5 years ago +1

      ZoomIt docs.microsoft.com/en-us/sysinternals/downloads/zoomit

    • @GuyInACube
      @GuyInACube  5 years ago

      I broke my rule in this video. I was actually surprised when I saw it in editing; I was on autopilot. I used ZoomIt in the video at one point, but that's honestly the first time in a long time I've done that in the videos. Normally I do all of the zoom and highlight stuff in post. But when presenting in person I absolutely use ZoomIt. Every presenter should have it! Or something similar.

  • @joaquinmaverick82
    @joaquinmaverick82 4 years ago

    Great video. I just saw the other one from Aug 2020 :) about disabling Auto date/time.

  • @gokukanishka
    @gokukanishka 3 years ago

    Appreciate this kind of video.

  • @debbieedwards7267
    @debbieedwards7267 4 years ago

    Love this. I was wondering: if you have a surrogate key and a business ID (which would be high cardinality) and you join the tables by key, could you actually remove the business keys from the model, or should you always leave those in? For example, ProductKey 1, ProductID 35335? I'm thinking in terms of the fact table AND the dimension, if you have gone for a star schema.

    • @GuyInACube
      @GuyInACube  4 years ago

      Debbie, you will need the surrogate keys for the relationships, but if you are not using the business ID I would definitely remove it from the model. The only time we suggest keeping any type of ID is if it is needed for reporting. Great point!

  • @markharris7325
    @markharris7325 4 years ago

    Thanks for the info in this video; I'm impressed with how much this decreases the size of the Power BI file! Would a similar approach work if you are suffering from the "visual has exceeded the available resources" error in the service when connecting to a Power BI dataset?

  • @evangelinekiku7380
    @evangelinekiku7380 4 years ago

    Great video. Thanks a lot. I just learned some new tips.

  • @neverGrowup1224
    @neverGrowup1224 3 years ago

    Hi Adam, very nice video! Thanks a lot. Just wondering, what screen recording application do you use to record your work in Power BI? I'd really appreciate a reply!

  • @bwaughevents
    @bwaughevents 4 years ago +4

    OMG! Is that a Lone Star State on the Millennium Falcon? LEGIT!

  • @osamaasif9601
    @osamaasif9601 5 years ago +2

    Guys you are amazing, keep it up

    • @GuyInACube
      @GuyInACube  5 years ago

      Thank you so much! Really appreciate that. 👊

  • @dreamofyou00
    @dreamofyou00 3 years ago

    Hello. From a performance perspective, is there any difference between removing columns and not loading all the columns from the file in the first place?

  • @HarishS12137
    @HarishS12137 5 years ago

    After choosing the columns to keep, will the data refresh the same or will it throw an error?

  • @SAHILSHARMA-xx1db
    @SAHILSHARMA-xx1db 3 years ago

    How do I access the DAX Studio and VertiPaq Analyzer tools?

  • @premprakash334
    @premprakash334 4 years ago

    I am trying to run an already-created Microsoft report file, "Customer Service Analytics for Dynamics 365.pbix", for my Dynamics 365 instance, but it fails every time while loading with the error "The refresh operation failed because it took more than 120 minutes to complete. Consider reducing the size of your dataset or breaking it up into smaller datasets". I guess I can only do all this optimization once my data is loaded into the .pbix file. What should I do if the report doesn't load in the first place?

  • @NC-un7tr
    @NC-un7tr 4 years ago

    Hello, may I know why there is SSAS? Is it the data source of the pbix?

  • @tanyacraig2672
    @tanyacraig2672 2 years ago

    I initially add just the fields I can filter on (market, customer type, etc.), together with one fact (e.g. order quantity); then I filter; then I add all the other columns required. The only annoying thing is that once you change a column's data type, you can't add any more columns from the data tables (at least on import).

  • @curiousmind9825
    @curiousmind9825 5 years ago

    Thank you so much for your tutorial. I am using an SSAS multidimensional live connection. I am trying to create a stacked column chart which shows month on the x-axis and value on the y-axis, and also shows the top-selling store per month. But when I try a top-N filter by store, the filter shows the highest sales per year, not per month. Please help me solve this.

  • @prakash4190
    @prakash4190 5 years ago

    This is really great! Thanks Adam! Would there be any negative impact if we disable time intelligence for an existing report that has datetime columns?

    • @GuyInACube
      @GuyInACube  5 years ago

      Absolutely not, unless you are using them in the report. We actually recommend disabling time intelligence if you have your own date table.

  • @yaaboii34
    @yaaboii34 3 years ago

    Can you do this in Power Query?

  • @DAngeloSilvestre
    @DAngeloSilvestre 4 years ago

    Yoooo...
    Sometimes a refresh process that typically lasts 10-15 min hits a problem and doesn't finish successfully. Meanwhile, the scheduled refresh keeps its status as "in progress" for 2 hours, and I cannot start a manual refresh while the scheduled refresh hasn't finished.
    How can I manually stop a refresh that is currently running?

  • @GeekSP1
    @GeekSP1 5 years ago

    Thanks for the video. This message appears when I edit the connection and refresh its data: "We couldn't refresh the connection. Please go to existing connections and verify they connect to the file or server". What should I do?

  • @4eyesleo
    @4eyesleo 4 years ago

    Does it make any sense to group and summarise the remaining columns after deleting unnecessary IDs? Would it increase performance, given that Power BI has very intelligent "packing" abilities?

  • @rezaafkhamnia9183
    @rezaafkhamnia9183 5 years ago +2

    It was so great. Could I have the source file shown in the video?

    • @GuyInACube
      @GuyInACube  5 years ago

      Unfortunately no. It is based on the ContosoRetailDW sample database, but we increased the number of rows. I had 25 million rows in that file. It is pretty big.

  • @RenatoHaddadMVP
    @RenatoHaddadMVP 3 years ago

    Most Power BI devs use all the columns from the source and then don't understand why the project runs slow. The best solution is to PREPARE your data source first: in SQL Server, create queries with only the columns you have to use in Power BI. This process is much faster than the alternatives, and in PBI you then have exactly the columns you will use.
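
    A sketch of that pattern from the Power BI side (server and database names are hypothetical); a trimmed view, or a native query as below, keeps the column pruning on the SQL Server side:

        let
            Source = Sql.Database(
                "myserver", "ContosoRetailDW",
                [Query = "SELECT SalesKey, DateKey, SalesAmount FROM dbo.FactSales"])
        in
            Source

    One caveat: a hand-written Query value generally prevents later steps from folding, so do as much of the trimming as possible in the SQL or the view itself.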

  • @annamalaithirumalraj3787
    @annamalaithirumalraj3787 5 years ago

    Hi, I need to analyze multiple CSV files of 1MB each. How many files can I connect?

  • @drac6426
    @drac6426 2 years ago

    Where can I get this Excel file?

  • @majdyazigi8185
    @majdyazigi8185 5 years ago +2

    ExcelIsFun has an outstanding video on the same topic

  • @DuncanFairweather
    @DuncanFairweather 2 years ago

    Dead in the water after the VertiPaq Analyser segment. I have no SSAS connection to edit. Could I please get a workaround?

  • @TorgeirLognvik
    @TorgeirLognvik 4 years ago +1

    Extremely helpful:)

    • @GuyInACube
      @GuyInACube  4 years ago +1

      Appreciate that Torgeir! 👊

  • @manikiran5902
    @manikiran5902 5 years ago

    Hi Adam, I am a very big fan of your Power BI videos.
    I have a small doubt about how to validate the reports that are developed in Power BI Desktop.
    Thanks in advance

  • @mjtejero
    @mjtejero 5 years ago

    If I disable auto date/time, should I create all the measures for the date formats that I need for those date fields?

    • @GuyInACube
      @GuyInACube  5 years ago

      You should have a Calendar/Date table that accommodates the values you need for your report.
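
      A minimal date table sketch in Power Query M (the date range is hypothetical). Mark the result as a date table and you can rebuild the Year/Quarter/Month/Day hierarchy that auto date/time used to provide:

          let
              StartDate = #date(2020, 1, 1),
              EndDate = #date(2025, 12, 31),
              DayCount = Duration.Days(EndDate - StartDate) + 1,
              Dates = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
              AsTable = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}),
              Typed = Table.TransformColumnTypes(AsTable, {{"Date", type date}}),
              AddedYear = Table.AddColumn(Typed, "Year", each Date.Year([Date]), Int64.Type),
              AddedQuarter = Table.AddColumn(AddedYear, "Quarter", each Date.QuarterOfYear([Date]), Int64.Type),
              AddedMonth = Table.AddColumn(AddedQuarter, "Month", each Date.MonthName([Date]), type text)
          in
              AddedMonth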

  • @zoeteets9353
    @zoeteets9353 2 years ago

    Hey Guy in a Cube! I've run into a memory issue with my PBI reports. The dashboard visual is a table (🤦) and the stakeholder wants to add a new measure to it. The file size is less than 100MB, and I went through these great tips, but when I try to pull in the new measure I still get an error that there's not enough memory to complete the action. It pulls in fine when I create a new page with fewer columns in the table. Q: is there a limit to the number of columns PBI can handle in a table visual? I'm currently recreating the measures in a dedicated measure table, outside the data table, hoping that will relieve the memory pressure. Am I on the wrong track? Any advice for fixing memory issues?

  • @Ritunjan
    @Ritunjan 4 years ago

    I'm using a huge dataset, around 2B rows, and I'm using a Python query to pull the data from Mongo... does incremental refresh help in this scenario? Please help!

  • @1yyymmmddd
    @1yyymmmddd 4 years ago

    One little thing though: if you disable auto date/time, many of your quick measures won't work any longer, as only the Power BI-provided date hierarchies are supported.

  • @BernatAgulloRosello
    @BernatAgulloRosello 5 years ago

    Is the Excel file you use available somewhere? I didn't find it in the links.

  • @mrrobert2008
    @mrrobert2008 5 years ago

    Hello from Chile; I am a fan of your YouTube channel. I would like to get the files that you show in the video, to practice and follow your steps. I would be grateful if you could share them. Regards!!!

    • @GuyInACube
      @GuyInACube  5 years ago

      Unfortunately, they are pretty big. It is just the ContosoRetailDW database, modified with extra rows. I also added a custom column for that OrderID column to simulate one :) I did that at the SQL level using a view, which also flattened out the data.

  • @0kazaki
    @0kazaki 5 years ago +8

    A tool that detects unused Tables & Columns would be great!

    • @yannickfranckum6589
      @yannickfranckum6589 5 years ago

      Hi @Kazaki, you can try Power BI Helper: radacad.com/power-bi-helper

    • @GuyInACube
      @GuyInACube  5 years ago +1

      Agreed, that would be a great tool.

    • @moizsherwani8651
      @moizsherwani8651 5 years ago +1

      Although it doesn't show which columns aren't used, Power BI Helper (radacad.com/power-bi-helper) by RADACAD does show which columns are used. I just put Power BI Helper on one screen and the Power BI file on the other, and remove the unused measures and columns.

    • @atomek1000
      @atomek1000 5 years ago

      Power BI Sentinel does it

  • @kaloyandimitrov6571
    @kaloyandimitrov6571 3 years ago

    Hello everyone. I got the error message "Not Enough Memory To Complete This Operation" while trying to refresh the data. Does anyone know what I should do to fix that issue?

  • @RobertHickey
    @RobertHickey 4 years ago

    Is there a better way to see all the columns not being used than just looking at the data model?

  • @MuhammadBerki
    @MuhammadBerki 5 years ago +2

    Wow awesome tips