Mirroring Snowflake in Fabric Supercharged Analytics at No Extra Costs

  • Published 9 Jan 2025

COMMENTS • 22

  • @Nalaka-Wanniarachchi · 9 months ago · +1

    A game changer. Nice explanation, Reza. With this approach, Snowflake can serve as the storage layer while customers use Fabric for compute and analytics.

    • @RADACAD · 8 months ago · +1

      Yes, you are right. Customers can keep their data in Snowflake and still leverage the powerful analytical features of Fabric.

  • @ginameronek8068 · 8 months ago

    A deceptively simple demo of something that is bound to fundamentally boost adoption of Fabric. I'm thinking of all the pain points this eliminates, from the data-plumbing side to being able to reassure users that the data is available. Excellent to see the querying info on the Snowflake side as proof too; it will be interesting to see what the admin side of this looks like over time within Fabric.

    • @RADACAD · 8 months ago

      Thanks Gina! Good to see you here. Indeed, this feature makes a difference.

  • @DavidWilsonNZ · 8 months ago · +1

    Hey there Reza...
    You made a comment in this video about DirectQuery being slow... and yes, that was my perception too.
    Last Friday I loaded 45M rows into a table in Snowflake and connected to it with a published Power BI report via DirectQuery.
    I've been completely blown away by the outstanding performance when producing a Matrix visual across 9 months of data.
    In my view, with this example the outstanding performance of DirectQuery completely and utterly outweighs the benefits of refreshing the data on a regular basis. The refresh alone takes 40+ minutes... and will only grow every month. Yes, I could use incremental refresh, but why bother when the DirectQuery performance is so outstandingly fantastic?

    • @RADACAD · 8 months ago

      Hi David,
      That is interesting indeed. I haven't seen DirectQuery perform fast unless good table optimization has been done on the data source. Maybe that is the case here? Some columnstore clustered indexing, etc., on the source?

    • @DavidWilsonNZ · 8 months ago

      @RADACAD No tuning has been done at all. I did create a view with a SELECT *, with a CAST(... to numeric) for one column, from db.schema.table.
      The data was loaded from CSV files in Azure Blob storage.
      Very stunning; happy to do a demo for you.
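
      A minimal sketch of the kind of pass-through view described above; the database, schema, table, and column names are placeholders, not taken from the video:

      ```sql
      -- Hypothetical Snowflake view: select everything, casting one column
      -- to numeric. All object names here are illustrative placeholders.
      CREATE OR REPLACE VIEW my_db.my_schema.sales_v AS
      SELECT
          *,                                             -- keep every source column
          CAST(amount AS NUMERIC(18, 2)) AS amount_num   -- cast one column to numeric
      FROM my_db.my_schema.sales_raw;
      ```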

    • @DavidWilsonNZ · 8 months ago

      @RADACAD No special treatment or optimisation anywhere. And the warehouse within Snowflake is a small one.

  • @bladerunnerisback · 9 months ago

    Thanks for the video! We use Snowflake and can't wait to start testing it. The next great thing will be Direct Lake in Power BI Desktop.

    • @RADACAD · 8 months ago

      You're welcome. Let me know your experience with it.

  • @fr1sket363 · 8 months ago

    They still have a few bugs to work out. I think larger Snowflake databases seem to have some issues with mirroring; I've raised these through a support ticket and they are currently with the product team. I'm a little disappointed, as I could really use this feature now. I hope it's fixed soon.

  • @gauravdevgan79 · 7 months ago

    Hi, where is the Snowflake environment for this demo hosted? Is it in AWS or Azure? Does it work across clouds?

    • @RADACAD · 7 months ago

      It doesn't matter which cloud the Snowflake environment is hosted in for Mirroring.
      It uses the APIs and CDC (change data capture) for it.

  • @johansantacruz6464 · 9 months ago

    Thanks for this update!

    • @RADACAD · 8 months ago

      You're welcome!👊👊

  • @eekshitchawla3637 · 7 months ago

    Hey Reza, can you please suggest whether there's a way to mirror data from Athena and BigQuery? I need an urgent solution for optimising cost.

    • @RADACAD · 6 months ago

      At the moment, Mirroring is only possible for three sources (Snowflake, Azure SQL DB, and Azure Cosmos DB). For other sources, you have to develop the data integration yourself using Data Pipelines and Dataflows.

  • @tahmidyusuf6674 · 9 months ago

    Hi Reza, thanks for the information. I've checked in my P1 capacity (in Australia) and the Mirroring options aren't available yet. Also, will it be possible to access it through Snowflake SSO login?

    • @RADACAD · 8 months ago

      That is strange, because Australia supports mirroring according to this link: learn.microsoft.com/en-us/fabric/database/mirrored-database/snowflake-limitations
      If by Snowflake SSO you mean users being able to access only the part of the data in Snowflake that they are authorized to see (the security flowing through from Snowflake into Fabric), that is not possible at the moment; the security rules have to be re-created in Fabric. But if you mean using a Snowflake user account as the stored credentials to set up the mirroring, that is possible, as I showed in this video.

    • @Scorpian22k · 4 months ago

      Have you found any solution for the SSO? Is it supported now, or do we still have to recreate the roles in Fabric?

  • @enocharthur4322 · 9 months ago

    I'm not sure your microphone works

    • @RADACAD · 8 months ago

      What do you mean by not working? I can hear the sound in the video.