Exploring new composite models in Power BI Desktop - Unplugged #1

  • Published Jan 9, 2025

COMMENTS • 85

  • @ЕвгенийСкляров-б3ч
    @ЕвгенийСкляров-б3ч 4 months ago +1

    Quite an interesting experience. The topic itself, and the "unplugged" concept generally. Thanks a lot.

  • @vida1719
    @vida1719 4 years ago +2

    It was great to see your thought process, working through multiple options and finally choosing the best performing solution

  • @vibhaskashyap8247
    @vibhaskashyap8247 4 years ago +2

    Liked the way you explained multiple options and their performance using DAX Studio. Great explanation. Thanks a lot for sharing with the folks!

  • @jamessr9347
    @jamessr9347 1 year ago

    Great video, well explained, thank you. Would be interesting to see what happens if a new product is added to one of the two datasets when using the local model brand table

  • @vanlessing
    @vanlessing 2 years ago +1

    Great. Tried two of the presented "linked table" scenarios and I got good results. However, creating measures across the composite model is another challenge (for me)

  • @catglabs
    @catglabs 4 years ago +2

    Thank you for this video. Great example of composite models!

  • @terryliu3635
    @terryliu3635 3 years ago +1

    Great video! I've been using an approach very similar to the 3rd option you introduced here. The only difference is that I was using the new product table to link to the fact tables directly, instead of connecting to the two existing product tables.

  • @datamandy8975
    @datamandy8975 2 years ago +1

    Great lecture!! Very informative.

  • @joaoluismartins
    @joaoluismartins 2 years ago +1

    This video is excellent. Very useful insights that I've been using for the last year.
    However, we're facing an interesting issue. We're importing both dimension tables (brand and product). The brand table, since it has much less data to send to the remote model, is used every time we want to aggregate values at or above that granularity. We use the product table only when we identify that we're at product granularity. PS: in our model, brands is also connected to the products table.
    Thus, having both tables, connecting the brand with an active relationship and the products with an inactive one, and switching the way the measure is calculated using USERELATIONSHIP() works just fine when all the tables are imported. However, when we live connect to a model and try the same approach, we get an error which looks like a "circular reference" (two different paths to filter the remote model, by local brand or by local product). It looks like Power BI isn't really recognizing that the limited relationship connecting product is indeed inactive (and that it is therefore impossible to have more than one path to filter the remote product table).
    Any thoughts about it?
    Thanks a lot,
    Cheers,
    João Martins

    • @marcorusso7472
      @marcorusso7472 2 years ago

      This is interesting. Please, can you share a sample file, or a picture of the diagram, so I can try to reproduce it? You can contact me through the contact form here, and we'll be in touch by email: www.sqlbi.com/about/ (go to the bottom of the page)
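
      A minimal sketch of the pattern described above (all table, column, and measure names here are hypothetical, assuming a local Brand dimension with the active relationship to the remote fact table and a local Product dimension with the inactive one):

      ```dax
      -- Hypothetical measure: filter through the active Brand relationship
      -- by default, and switch to the inactive Product relationship only
      -- when the report is at product granularity.
      Sales Amount :=
      IF (
          ISINSCOPE ( 'Product'[Product Name] ),
          CALCULATE (
              SUM ( Sales[Amount] ),
              USERELATIONSHIP ( Sales[ProductKey], 'Product'[ProductKey] )
          ),
          SUM ( Sales[Amount] )
      )
      ```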

  • @BernatAgulloRosello
    @BernatAgulloRosello 4 years ago +2

    Played at 1.5x speed it was a breeze :D good work!

  • @mattheosprotopapas3286
    @mattheosprotopapas3286 4 years ago

    Great video and overall format! Many insights emerge from this 'unplugged' way of presenting things!
    My opinion regarding composite models: I believe they will have a great impact! They could even enable using Power BI models as building blocks to evolve and build an enterprise-wide BI solution, rendering the traditional data warehouse obsolete - OK, not 100% obsolete, but anyway... You still of course need performance and history - and you definitely need enterprise-wide governance, data definitions and data management in general, but this does not mean you need to build a cumbersome data warehousing solution of the past. You combine (or will, as this is still in preview) work that's already done within Power BI with other data elements (within a data lake, for example) in an agile way, avoiding re-work and easily capturing changing data & measure definitions at the enterprise level. It will definitely be very cost effective, with greatly reduced time to market. So I really hope all this works out.

    • @marcorusso7472
      @marcorusso7472 4 years ago +3

      For critical company data (financial, balance sheet, sales, HR) the data warehouse was, is, and will be a good investment. Not implementing it for critical data is a technical debt you will continue to pay through the lack of data consistency across systems that do not have the same data quality level (assuming you build the data warehouse the right way).
      For many other datasets (weblogs, telemetry, competition comparison, market penetration, datasets used for individual projects limited in scope and time), this is a good tool.
      Thinking that Power BI and a mashup of models can remove the need for a data warehouse is a wrong idea, in my opinion.
      Unless the company does not need a data warehouse at all (small companies often don't).
      Power BI composite models solve another problem: removing the need to extend the corporate data model for individual reports, without having to duplicate data in many models just to get the desired reports.

  • @DavidWilsonNZ
    @DavidWilsonNZ 4 years ago +1

    Very awesome. Thank you. Totally love the format of the presentation. +1 for unplugged.

  • @jefffischer7734
    @jefffischer7734 3 years ago +1

    Amazing job Marco! Really appreciate your video

  • @gopichand5017
    @gopichand5017 4 years ago +1

    Thanks for sharing,
    Very useful feature.🔥🔥

  • @DIGITAL_COOKING
    @DIGITAL_COOKING 4 years ago +1

    Great video Marco..., and I support this series - I give five stars for a start 🙂

  • @VTFIJIGraphix
    @VTFIJIGraphix 4 years ago +1

    Most exciting feature to come out in the BI space I’ve seen in a long time. Wish it would work over on-prem Analysis Services, though. I don’t see why they only enabled it for cloud datasets.

    • @SQLBI
      @SQLBI  4 years ago

      It is not possible to use it with SSAS - we would need SSAS vNext (after 2019 - will it be 2021 or 2022?)

  • @phillipk9891
    @phillipk9891 2 years ago

    How do you hide those 3 options, including the three-dot ellipsis, that appear when hovering over some visuals on a dashboard, like cards, slicers, textboxes, tables, etc.? Is there any fancy or creative way of hiding those settings?

  • @Anthony_Lecoq
    @Anthony_Lecoq 3 years ago +1

    Thanks for these videos, very useful ;)

  • @ikar2k
    @ikar2k 1 year ago +1

    295395 at 35:54, but!) video is great anyway! 😊

  • @diegoj9056
    @diegoj9056 4 years ago +1

    Great job. Measuring performance for composite models must come with several challenges. Is it time for DAX Studio Premium?

    • @SQLBI
      @SQLBI  4 years ago +1

      DAX Studio already works. There could be additional features to automate certain processes, but it's too early... we have to practice composite models first.

  • @luciedelbarre4701
    @luciedelbarre4701 2 years ago +1

    Really awesome video!! It helped me on a project I need to do. However, I have a question about creating a table from 4 different tables. My 4 tables don't have the same number of columns, so how can I adapt the script to create my single combined table? please 🙂

    • @SQLBI
      @SQLBI  2 years ago

      Use SELECTCOLUMNS to create the same columns for the 4 tables (just use blank values for the missing columns).
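
      A minimal sketch of that suggestion, with four hypothetical tables T1..T4 aligned on the same three output columns (BLANK () fills any column a table is missing):

      ```dax
      Combined =
      UNION (
          SELECTCOLUMNS ( T1, "Key", T1[Key], "Name", T1[Name], "Extra", T1[Extra] ),
          SELECTCOLUMNS ( T2, "Key", T2[Key], "Name", T2[Name], "Extra", BLANK () ),
          SELECTCOLUMNS ( T3, "Key", T3[Key], "Name", BLANK (), "Extra", T3[Extra] ),
          SELECTCOLUMNS ( T4, "Key", T4[Key], "Name", T4[Name], "Extra", BLANK () )
      )
      ```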

  • @msksenthil
    @msksenthil 4 years ago +1

    Thanks for a great session. I have a couple of questions.
    1. Did you configure a gateway earlier for the published report? Because DQ requires a gateway, right?
    2. If we try the same approach for 'ProductKey' as for 'Brands', will we get the same results?

    • @SQLBI
      @SQLBI  4 years ago

      No, we are using DirectQuery over Power BI Dataset - no gateway required
      I don't understand the second question.

  • @martinargimon730
    @martinargimon730 2 years ago

    Hi Marco. Do you have a video/tutorial that explains the possibility of 'importing' an Excel Power Pivot model into Power BI, but with the possibility of updating my original Excel workbook and 'pushing through' to Power BI the periodic updates that I edit in the original Excel workbook?
    Thanks
    Martin (South Africa)

  • @ooooFISCHERoooo
    @ooooFISCHERoooo 3 years ago

    Cool... are there any limitations on the number of datasets that can be brought in as part of a composite model?

    • @SQLBI
      @SQLBI  3 years ago

      Microsoft does not provide a number, but it is probably not a good idea to have too many external datasets. Do you have use cases for more than 10 datasets?

  • @kadirkkkk
    @kadirkkkk 2 years ago

    Thanks for the video Marco.
    You said the report has a shadow copy of the other datasets, but what if I revise the original dataset? Does the mashup report have the ability to refresh and get the new tables/columns etc. from the origin?

    • @SQLBI
      @SQLBI  2 years ago +1

      Yes, but you have to refresh the local model (also called the proxy model).

  • @jey2884
    @jey2884 2 years ago

    Hi Marco, I just started exploring the new composite models and I am wondering if it is better to have one big model vs. a few small models linked with a master dimension (calculated table). Using composite models we may end up having a few other calculated tables if there were more dimensions related to Receipts and Sales. Thanks

  • @gsanchez7124
    @gsanchez7124 2 years ago

    Very helpful video, but what about combining two different SAP sources into one single dataset or model? I've read this is possible as long as they are not OLAP databases, but I didn't find much information about how to do it.

  • @santoshhiremath-kl9nu
    @santoshhiremath-kl9nu 10 months ago

    Hi, I have 4 different shared datasets: datasets 1, 2, 3 and 4. RLS is applied on dataset 3. I imported all of these and built a report. The report is embedded in an application. It works in the Power BI service, but it does not open in the application. Please can you help me with this problem? I am using SQL Server as the data source.

  • @jaco7433
    @jaco7433 4 years ago +1

    Amazing as always. Did you notice any decrease in performance when using DAX in composite models? (Imported CSV files and a live connection to a Power BI dataset)

    • @SQLBI
      @SQLBI  4 years ago +1

      It depends - we discuss performance in the video; an imported CSV is like any other "local" table.

  • @FrankPreusker
    @FrankPreusker 4 years ago +1

    Thanks Marco for the great video! With the small mistakes and side effects it feels indeed more "real". Does the order of the model you are connecting to matter (1. Sales, 2. Receipts or vice versa)?

    • @SQLBI
      @SQLBI  4 years ago

      Not much - the cardinality of tables/relationships is more important.

  • @PeterKontogeorgis
    @PeterKontogeorgis 4 years ago

    Do you expect the blank row handling behavior to change as the feature matures? Maybe my expectations are off, however dropping rows, and needing to build local limited versions of the common dimensions seems to restrict the usefulness of this feature.
    Awesome content as always.

    • @SQLBI
      @SQLBI  4 years ago +1

      It's a design problem more than a feature issue. Think about doing the same in SQL with different databases; you would face very similar problems. Because of the nature of the semantic model, you choose one column of one table in the report, and you get that. Relationships traversing different datasets in Power BI don't have the additional blank row that you have in relationships internal to the same data source. I don't think this will change; the cost of managing that (in performance) would be very high.

  • @campsb3186
    @campsb3186 3 years ago +1

    Great Video

  • @jamiea8755
    @jamiea8755 3 years ago

    Thanks for this video. Question: how do you handle scenarios where one table has RLS and the other doesn't? I have a new model like this, but I am unsure how to propagate it to the users so that it applies to the "unprotected" data the same way as the protected data. Do you create a new security rule in the new model, or make the security data filter out the unsecured data with bidirectional filtering set to filter the unsecured table? Or is there another option?
    Shared with the rest of my dev team, btw.

    • @SQLBI
      @SQLBI  3 years ago

      The RLS story is not implemented yet in the first preview. We have to wait for upcoming updates. In general, you will not be able to see data you are not authorized to see just because you use a composite model. However, we still don't have all the details about how it will work in the final release. Let's wait a few weeks.

  • @平凡-p1v
    @平凡-p1v 4 years ago

    If I have row security in the original models, does the composite model honor the row security implementation of the original models?

    • @SQLBI
      @SQLBI  4 years ago

      Yes - the RLS implementation for the local model is not complete yet, but existing RLS should already work fine.

  • @IN2FPA
    @IN2FPA 4 years ago

    Thanks Marco. Quick question hopefully. Can one use RELATED with these weak relationships?

    • @IN2FPA
      @IN2FPA 4 years ago

      also wondering about time intelligence :)

    • @SQLBI
      @SQLBI  4 years ago

      You can use RELATED through a many-to-one relationship. Performance can suffer over weak relationships. Same for time intelligence, with a big "it depends" because there are many variables to evaluate the impact on performance (which could be minimal, but... it depends!).
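
      For illustration, RELATED over a many-to-one relationship looks like this (hypothetical column, assuming a Sales[ProductKey] -> 'Product'[ProductKey] relationship):

      ```dax
      -- Calculated column on Sales; this also works across a weak
      -- relationship, but as noted above, performance can suffer there.
      Brand = RELATED ( 'Product'[Brand] )
      ```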

    • @IN2FPA
      @IN2FPA 4 years ago +1

      @@SQLBI Thanks for the reply SQLBI

  • @zeisslaimen1
    @zeisslaimen1 4 years ago +1

    admire you guys!

  • @adsf-tw4yy
    @adsf-tw4yy 2 years ago

    Thank you for this video. Now that it is more than a year old, I am interested to know if there has been any performance improvement in how these direct queries work, because the way it is described in the video rather looks like a gentle warning that with anything other than really small cardinalities, a composite model is not likely to perform well. I have a composite data model with table cardinality in the range of 10,000, some moderately complex calculations, and response times up to 30 seconds (or even more). I am about to test it by merging the data models, because from the video I see I was probably building the composite model the right way, so I cannot hope to find just a single error that I could simply correct to radically improve response times. (Until the end of the video I was hoping for some kind of surprise solution that would do much better.) So if there is any new guidance on this, it would be great to hear about it. Thanks!

    • @SQLBI
      @SQLBI  2 years ago

      While there have been some improvements, the architecture is what you have seen - when a relationship crosses data source boundaries, keys must be moved one way or another...

    • @adsf-tw4yy
      @adsf-tw4yy 2 years ago

      @@SQLBI Thank you very much!

  • @aeimei
    @aeimei 4 years ago

    Very informative video. Now, can you make a similar video about consolidating a dimension which already has very high cardinality, maybe in the millions? Is using a calculated table with unions still a good option there? I think this new table will not be compressed, right? What are the options then?

    • @SQLBI
      @SQLBI  4 years ago

      The table is compressed as any other table, but it has to be materialized in an uncompressed state when you refresh the model (before being compressed).
      I don't understand the remaining part of the question, though. Please, can you clarify?

  • @MrRakesh448
    @MrRakesh448 3 years ago

    Hi Marco,
    While creating composite models everything works fine, but if I change the source of the model from DEV to UAT then all the measures created within the Power BI files are missing. Is this expected, or a bug? I tried refreshing the Power BI file as well, but still no luck.
    Thanks

    • @SQLBI
      @SQLBI  3 years ago

      Not sure - you should contact Microsoft for that. Because the feature is in preview, write on the Power BI Desktop forum: community.powerbi.com/t5/Desktop/bd-p/power-bi-designer

  • @vkkpp
    @vkkpp 3 years ago

    Thank you so much for this wonderful video. I have a question along the same lines:
    Can I have a composite model where one data source is an on-premises Tabular model and Excel/any other RDBMS is the second source?
    Can you help me with some tips on how to do that?

    • @SQLBI
      @SQLBI  3 years ago

      You can create composite models connecting only Power BI datasets or Azure Analysis Services databases. On-premises Tabular models (SQL Server Analysis Services) are not supported.

  • @sylau1516
    @sylau1516 2 years ago

    Tried it, but it does not allow loading the 2nd dataset, even after enabling the new preview feature. Does this only work for Premium users? I am a Pro user and own both datasets.

    • @SQLBI
      @SQLBI  2 years ago

      No, it should work also for Pro users.

  • @pravinupadhyay2046
    @pravinupadhyay2046 4 years ago

    What is GROUPCROSSAPPLY()?
    Another question: will we see all the data of a table from the live connection if we create a calculated table, e.g. with SELECTCOLUMNS() over live tables?

    • @SQLBI
      @SQLBI  4 years ago

      GROUPCROSSAPPLY is a new function (still undocumented) used to manage queries in composite models.
      Yes, you can see the data of the remote table if you create a calculated table, but this copies the entire table locally. If you just want to take a look at the content of the remote table, you might want to query it using DAX Studio.
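
      For example, a quick way to peek at a remote table from DAX Studio without copying it into the local model (hypothetical table name):

      ```dax
      // Return only the first 100 rows of the remote Product table
      EVALUATE
      TOPN ( 100, 'Product' )
      ```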

  • @noahhadro8213
    @noahhadro8213 4 years ago

    How does RLS work with this? I connected to a Power BI dataset and an Excel import. Users can only see the visual using the Excel import; all the other visuals are broken.

    • @SQLBI
      @SQLBI  4 years ago

      RLS is not fully implemented yet in the preview; you have to wait for an upcoming update.

  • @ppillo3349
    @ppillo3349 3 years ago +1

    You're the greatest, man!!!

  • @dattrieuds
    @dattrieuds 4 years ago +1

    Thanks Marco! Really enjoyed your in-depth video about composite models. I would appreciate it if you could answer a follow-up question.
    I see in the video you used the UNION DAX function to calculate the Products and Brands tables. But do you know if Microsoft will allow us to union (append) those 2 tables in Power Query? The same goal would be achieved, but then we don't have to load the 2 tables from the Receipts and Sales sides, which would make the data model lighter. In this case the 2 dimension tables Products and Brands are not so big, so the DAX query is efficient, but I can imagine in some cases we want to load the final combined tables upstream and disable loading the "hidden" big tables.
    Or maybe a more general question: do you know if Microsoft allows editing composite models inside Power Query?
    Grazie mille!

    • @marcorusso7472
      @marcorusso7472 4 years ago +1

      If you can use Power Query to access the data source, you have fewer reasons to create a composite model - it would be more efficient to load all the data into the same model. Technically, you could access the two models in Power Query through the XMLA endpoint if you have Power BI Premium or if the two models are hosted on Azure Analysis Services - but as I said, it doesn't seem an efficient architecture to combine both technologies (composite models and Power Query accessing remote datasets).
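
      For comparison, the calculated-table route used in the video might look like this sketch (hypothetical table and column names, assuming two remote product tables that both expose a Brand column):

      ```dax
      -- Local calculated table built over the two remote dimensions
      Brands =
      DISTINCT (
          UNION (
              SELECTCOLUMNS ( 'Sales Product', "Brand", 'Sales Product'[Brand] ),
              SELECTCOLUMNS ( 'Receipts Product', "Brand", 'Receipts Product'[Brand] )
          )
      )
      ```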

  • @666Niusia
    @666Niusia 4 years ago

    Ciao, why can't I see the option in Preview to turn this feature on in the Modeling tab?

    • @SQLBI
      @SQLBI  4 years ago

      You need Power BI Desktop December 2020.

  • @rick_.
    @rick_. 4 years ago +1

    Thanks Marco, really well done and I learned a lot!

  • @natolira
    @natolira 4 years ago +1

    As always, great content, thank you very much!!!
    I have one question: can DAX Studio clear the cache on the source, or only on localhost?
    If it cannot clear the cache of the published dataset(s), running the query benchmark produces warm-cache queries from the SE perspective.

    • @SQLBI
      @SQLBI  4 years ago +1

      Yes it works!

    • @wta306
      @wta306 4 years ago

      @@SQLBI Just to confirm, are you saying that clearing the cache in DAX Studio will clear the caches of the local AND remote models? Or just the local model?

  • @SergiyVakshul
    @SergiyVakshul 4 years ago

    Marco, regarding the docs: "Some queries may return wrong results when there's a relationship between calculated tables and table(s) in a remote source. Creating calculated tables over a remote dataset isn't supported, although it isn't currently blocked in the interface."
    docs.microsoft.com/en-gb/power-bi/connect-data/desktop-directquery-datasets-azure-analysis-services
    You act as if there is no such limitation. Does it mean that the limitation will soon be lifted?

    • @SQLBI
      @SQLBI  4 years ago +1

      Very interesting point. We will investigate, because if calculated tables cannot be created, it is a big limitation to solve the scenario described in this video. Thanks for the comment!

    • @SergiyVakshul
      @SergiyVakshul 4 years ago

      @@SQLBI Matt Allington encountered the problem with calculated tables recently:
      exceleratorbi.com.au/the-most-important-update-to-power-bi-desktop-ever/#comment-85883

    • @SQLBI
      @SQLBI  4 years ago +2

      Good news: Microsoft confirmed that by the end of the preview, calculated tables over a remote dataset will be supported. There are possible bugs now in the preview; the message was intended to warn about that situation only.

    • @SergiyVakshul
      @SergiyVakshul 1 year ago

      @@SQLBI Today is 2023-04-17. This feature is now GA. But the model still can not be refreshed in Power BI service because of the presence of a calculated table. The refresh ends with “Refresh is not supported for datasets with a calculated table or calculated column that depends on a table which references Analysis Services using DirectQuery.”. I tested the feature on a newly created Pro & PPU workspaces.

    • @SergiyVakshul
      @SergiyVakshul 7 months ago

      Today is 2024-05-22. MS finally released an update that allows the refresh in the service. It took more than 3 years to lift the limitation.

  • @magnumbi8396
    @magnumbi8396 2 years ago

    Mitic

  • @amirulaizuddin5400
    @amirulaizuddin5400 4 years ago +1

    First