Dive into Microsoft Fabric's Power BI Direct Lake

  • Published 2 Jul 2024
  • Let's break down Power BI Direct Lake in Microsoft Fabric and explain how you can leverage one copy of the data from OneLake. Patrick explains!
    Direct Lake
    learn.microsoft.com/power-bi/...
    📢 Become a member: guyinacu.be/membership
    *******************
    Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
    🎓 Guy in a Cube courses: guyinacu.be/courses
    *******************
    LET'S CONNECT!
    *******************
    -- / guyinacube
    -- / awsaxton
    -- / patrickdba
    -- guyinacube.com
    **Gear**
    🛠 Check out my Tools page - guyinacube.com/tools/
    #PowerBI #DirectLake #GuyInACube
  • Science & Technology

COMMENTS • 21

  • @gvasvas · 3 months ago +3

    Awesome demo! Quick and right on the spot.

  • @archanasrivastava6531 · 2 months ago +2

    Thanks for this insightful video.
    Do you have a performance/capability matrix comparing Import, DirectQuery, and Direct Lake? If so, please share. Thanks in advance

  • @toma4528 · 3 months ago +2

    Great video, Patrick!

  • @christophehervouet3280 · 2 months ago +1

    Super post, Patrick, as usual

  • @googlogmob · 3 months ago

    Patrick, thanks 👍

  • @dilipinamdarpatil6301 · 3 months ago

    Awesome 🙏

  • @gnomesukno · 3 months ago

    Not using it currently but I can see some potential benefits to it. Will have to look into it

  • @shekharkumardas · 3 months ago +2

    How do you create a DAX column in a Direct Lake dataset?

  • @danrolfe7862 · 3 months ago

    THIS IS BANANAS!!!!!!!! WOOOHOOOO
    Is there still a row limit on the data you can actually bring into Power BI?
    I seem to remember hitting an upper limit on rows using the SQL endpoint / DirectQuery. I had this MONSTER dataset of about 14M rows, and the stakeholder insisted he needed all of the data.

  • @Mike-en1rd · 2 months ago

    Do you know when Direct Lake will be available to use in Power BI Desktop?

  • @UnbelievableOdyssey · 24 days ago

    If my Delta Lake is in Azure Data Lake Storage, can I still use Direct Lake?

  • @user-iv5tq4qk7m · 3 months ago +3

    Q: I love the ease of creating new semantic models, but I keep running into the problem where I have to give somebody access to the whole lakehouse in order to give them access to a segmented part of that data, when I only want them to see it via a semantic model. Is there any way I can create a gold lakehouse in one workspace, then create multiple semantic models in other workspaces and give users access only to those?

    • @npergand · 3 months ago

      You don't need to give users access to the lakehouse; that's just the default behavior. When you create a new semantic model, it uses a gateway connection to the lakehouse that uses SSO. You can see this in the semantic model settings screen. You can change it by creating a new connection to the lakehouse with a specific credential.

  • @NateHerring1 · 3 months ago +1

    I watch Patrick

  • @nishantkumar9570 · 3 months ago +5

    How will costing work for Direct Lake mode?

    • @toulasantha · 2 months ago

      Less to start with
      Will be rocketing up after that
      Just like everything else from MS 😂

  • @Milhouse77BS · 3 months ago

    I’m up

  • @NicolasPappasA · 1 month ago

    Is Direct Lake using Delta Live Tables? It seems like it's the same technology.

  • @EBAN4444 · 3 months ago +1

    Does this mean the massive 25GB model I have, which holds too many years of data because the "business" needs it (even though they only look at a few years), can be removed, so that only the partitions of data that are needed get held in memory? Lowering the memory used on the capacity, and the amount of data and CPU needed to crunch all the measures?
    Can I recreate the model using Direct Lake against our ADLS Gen2 Databricks parquet files, which are already the fact tables we pull in? Do you need to set up partitions in OneLake, or does it do that automatically for you?
    This does seem to remove the query folding performance gains, so it seems like the parquet files will need to be rewritten to be better optimized and only include the data that is needed in the model.
    Also, is that Python library to refresh a dataset available outside of OneLake? I.e., I would love an easy way to refresh a PBI model from an Azure Databricks notebook versus an ADF XMLA call.
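
    On that last question: whether the library shown in the video works outside Fabric isn't confirmed in this thread, but a dataset refresh can be triggered from any environment, including an Azure Databricks notebook, through the public Power BI REST API. A minimal Python sketch, assuming a service principal that has been granted access to the workspace (all IDs and secrets below are placeholders):

      import msal
      import requests

      TENANT_ID = "<tenant-id>"        # placeholder
      CLIENT_ID = "<app-client-id>"    # placeholder
      CLIENT_SECRET = "<app-secret>"   # placeholder
      WORKSPACE_ID = "<workspace-id>"  # placeholder
      DATASET_ID = "<dataset-id>"      # placeholder

      # Acquire an app-only token for the Power BI API
      app = msal.ConfidentialClientApplication(
          CLIENT_ID,
          authority=f"https://login.microsoftonline.com/{TENANT_ID}",
          client_credential=CLIENT_SECRET,
      )
      token = app.acquire_token_for_client(
          scopes=["https://analysis.windows.net/powerbi/api/.default"]
      )

      # Queue an asynchronous refresh of the semantic model
      resp = requests.post(
          f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
          f"/datasets/{DATASET_ID}/refreshes",
          headers={"Authorization": f"Bearer {token['access_token']}"},
          json={"notifyOption": "NoNotification"},
      )
      resp.raise_for_status()  # HTTP 202 means the refresh was queued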

  • @googlogmob · 3 months ago

    Is Fabric available for developers for free?

    • @srikanthm4504 · 3 months ago

      No, your admin must enable it, and can do so for a specific workspace.