Power BI Direct Lake - What is it and Why it is Important When Working With Fabric

  • Published 15 Sep 2024

COMMENTS • 23

  • @gabrielmenezes3967
    @gabrielmenezes3967 1 year ago +2

    Hi Reza, great video once more. Have you tested performance between Import mode and Direct Lake? I tested a scenario with a very simple model (just one 40M-row table) and Import mode came out a little bit faster (I used Performance Analyzer in Power BI Desktop against a live-connected dataset, with both datasets on the same Premium capacity). Still very promising for the future, but a Direct Lake connection without any options for security is a no-go for me.

  • @nithyanandamraman3078
    @nithyanandamraman3078 1 year ago

    Thanks Reza for this video. Does Direct Lake support all the DAX functions that import mode does?

  • @thomasgremm6127
    @thomasgremm6127 10 months ago

    Can I use Direct Lake with a Delta table stored in an S3-compatible store on-premises?

  • @ranjanroy1623
    @ranjanroy1623 1 year ago

    Thanks Reza, amazing explanation!

    • @RADACAD
      @RADACAD 1 year ago

      Glad you liked it!

  • @ranjanroy1623
    @ranjanroy1623 1 year ago

    One question: can I say that Direct Lake from Power BI is a kind of DirectQuery connection (as the data resides in OneLake itself) which also has all the advantages of import mode (performance etc.)?

    • @RADACAD
      @RADACAD 1 year ago

      It is similar to DirectQuery in that the data resides in the source, which is OneLake.
      But it is much faster than DirectQuery because there is no process of querying the data from a relational database system.

  • @germanareta7267
    @germanareta7267 1 year ago

    Thanks, great video.

  • @DOSOLOCLUB
    @DOSOLOCLUB 1 year ago

    Hello Reza, thank you for the great video. I have a simple question for you and your viewers. I have multiple Power BI files using import mode, and each of them is quite extensive. Now I need to create a Power BI file that consolidates tables prepared with the M language from those Power BI files. Can you please tell me the best way to connect? Thank you.

    • @RADACAD
      @RADACAD 1 year ago +1

      Hi
      Thanks
      If you want a new Power BI file to be made with the tables from other Power BI datasets, then you can use DirectQuery to Power BI datasets and connect to those datasets. You won't have the ability to transform the data, however; for that, you will need to import from those models into the new one. Please check out the video on this channel about DirectQuery to Power BI datasets.

  • @bradj229
    @bradj229 1 year ago

    Aren't Direct Lake and a Live connection basically identical, except that Direct Lake allows you to have Live (VertiPaq)-type performance without requiring an SSAS (Tabular) model?

    • @RADACAD
      @RADACAD 1 year ago +1

      You still have the tabular model and VertiPaq; the difference is that instead of the files VertiPaq uses by default, the files will be Parquet files, with the same performance.

  • @behrad9712
    @behrad9712 1 year ago

    Thank you!👍

  • @midata787
    @midata787 1 year ago

    Great video Reza. You mentioned at the end that calculated columns are not yet available. Do you think this is something that will be on the roadmap?

    • @RADACAD
      @RADACAD 1 year ago

      Thanks. I believe it will come at some point, but I don't have any timelines.

    • @juanpablorvvv
      @juanpablorvvv 1 year ago +1

      Without calculated columns it's almost impossible to prefer this over the normal import mode.
      Also, you have to refresh the Delta tables with a scheduled refresh using a pipeline or dataflow, so the "advantage" of Direct Lake is not real unless your users or a system write directly to the Delta tables hosted in the lakehouse. Am I correct?

  • @nonamenoname3323
    @nonamenoname3323 1 year ago

    No calculated columns is a big disadvantage :(
    Datamarts with dataflows give more flexibility, plus a near-live connection (or even a live connection) with super-fast response.
    Do we have any comparison of money value (licence) and memory/CPU on the cluster?
    Thank you for your work!

    • @RADACAD
      @RADACAD 1 year ago

      It is of course still in preview. I'm sure you will be able to create calculated columns in it sometime in the future.

  • @kylec7973
    @kylec7973 1 year ago

    I'm still waiting to see how established large datasets can take advantage of Direct Lake. So far, the limitations I've found that are deal breakers:
    1. Calculated columns are not yet available.
    2. When creating measures, there is seemingly no way to organize them.
    3. Measures have very few formatting options (such as the number of decimals); I think you'd need to write it into the DAX, which is undesirable.
    Something I'm curious to test:
    Is it possible to swap an existing classic Import mode dataset for a Direct Lake dataset of the same name without users noticing? Same dataset name, same columns, same measures.

    • @RADACAD
      @RADACAD 1 year ago

      Limitations will be much fewer at GA, I assume.
      Regarding replacing your Import mode dataset with a Direct Lake one: even if it were possible to keep the same dataset, report, and app names, there would still be complications. In the Power BI service, object IDs are used to distinguish objects. If you create a new dataset with the same name, it will still have a different ID and a different URL.

    • @kylec7973
      @kylec7973 1 year ago

      @RADACAD If the dataset can't be replaced while retaining the object ID, that probably fully dashes my dream. It seems that existing datasets attached to hundreds of reports/users will not be able to smoothly leverage Direct Lake. You'd have to go down the path of converting all existing reports to a new dataset connection.
      Still looking forward to what's possible for future dataset projects, but what bugs me is how small-scale MSFT shows these scenarios. I want to see some big data models with lots of measures in the examples!
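On the "converting all existing reports to a new dataset connection" point: the Power BI REST API has a "Reports - Rebind Report" operation that repoints an existing report at a different dataset, which can be scripted across many reports. A minimal sketch of building that request with the standard library; the GUIDs and token below are hypothetical placeholders, and the actual call is left commented out:

```python
import json
import urllib.request

# Hypothetical IDs and token; real values come from your tenant and
# an Azure AD access token with the right Power BI scopes.
report_id = "00000000-0000-0000-0000-000000000001"
new_dataset_id = "00000000-0000-0000-0000-000000000002"
access_token = "<access-token>"

# "Reports - Rebind Report" endpoint of the Power BI REST API.
url = f"https://api.powerbi.com/v1.0/myorg/reports/{report_id}/Rebind"
req = urllib.request.Request(
    url,
    data=json.dumps({"datasetId": new_dataset_id}).encode(),
    method="POST",
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
)
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req)  # uncomment to actually rebind the report
```

Rebinding keeps each report's own ID and URL, so for the scenario discussed above, it may be a smoother migration path than asking users to open a brand-new report.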

  • @raz8657
    @raz8657 1 year ago

    You look good in the blue glasses!!