Advancing Fabric - The Data Warehouse Experience

  • Published 5 Sep 2024

COMMENTS • 21

  • @tashraf7262
    @tashraf7262 1 year ago +2

    Thx guys. It's good to see the simple explanation of the interface and how stuff works. Can you explore performance and explain how scalability works in Fabric?

  • @axyz_music
    @axyz_music 1 year ago +2

    What is the use case for this warehouse? I get that it's smart to have it all in OneLake, but with the performance I see in this video, I would not be able to use this as an actual data warehouse. More than 5 seconds to select 542 rows from a table and 7 seconds to update 542 records in a table with 542 records. It's an honest question. I have a hard time seeing the use case when it performs like this, am I missing something?
    Edit: And almost 1 second to get an error that you're not allowed to update a lakehouse.

  • @Fernando_Calero
    @Fernando_Calero 1 year ago

    Great video to understand the basics of the Data Warehouse experience in Fabric, thanks Simon and Craig. Also, I am a big fan of the statement at 13:33. Finally, it'd help us, Craig, if you could use a pointer to highlight where your cursor is. Cheers!

  • @chinny4953
    @chinny4953 1 year ago +1

    Thanks guys. Great stuff

  • @bharathianjeneya2111
    @bharathianjeneya2111 10 months ago +1

    How do we easily find the sequence of videos on the Data Warehouse experience in Fabric?

  • @Reitseschaatser
    @Reitseschaatser 1 year ago

    Great video, thanks! It has some way to go for GA, I think, but the preview features are looking quite good. Especially the shortcuts: as a neat form of data virtualisation, they could really cut down on loading times and IO 'waste'.

  • @user-dh6sj7mv9n
    @user-dh6sj7mv9n 6 months ago

    I don't see a linked service to Filesystem! Am I right?

  • @pini22ki
    @pini22ki 1 year ago +1

    Thanks, how do I create a table as a shortcut?

  • @NickJe
    @NickJe 1 year ago

    I appreciate the free Fabric content, but let's talk about the real elephant in the room: actually doing some proper data warehousing rather than copying data from one location to another or providing snapshots! How are we working around identity columns and MERGE statements in the T-SQL world? Back to old-school methods?

    • @AdvancingAnalytics
      @AdvancingAnalytics  1 year ago +1

      Pretty much. I'm hoping we'll see those elements come in pretty quickly towards GA - they're already supported by Delta. Worst case, if you need those elements, sacrifice the "T" and use Spark SQL in a notebook, where you can do all of those things (see the sketch after this thread).

    • @danhorus
      @danhorus 1 year ago

      Judging by the docs, I believe identity columns are not yet supported by Delta OSS. Microsoft may have to reimplement them if they don't want to wait. At least MERGE has been in Delta OSS for a long time.
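
To make the Spark SQL workaround mentioned in this thread a little more concrete, here is a minimal sketch of an upsert run from a notebook rather than through the T-SQL endpoint: MERGE handles the updates, and an "old school" computed surrogate key stands in for the missing identity column. All table and column names (dim_customer, stg_customer, customer_sk) are illustrative, not taken from the video.

```python
# Minimal sketch: upsert a Delta dimension table from a Spark notebook,
# using MERGE for updates and a computed surrogate key for inserts.
# Table and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()  # already provided in a notebook session

# 1. Update rows that already exist, matched on the business key.
spark.sql("""
    MERGE INTO dim_customer AS tgt
    USING stg_customer AS src
        ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN
        UPDATE SET tgt.name = src.name, tgt.email = src.email
""")

# 2. Append the genuinely new rows, generating surrogate keys from the
#    current maximum. Assumes a single writer; not safe under concurrent loads.
max_sk = spark.table("dim_customer").agg(F.max("customer_sk")).first()[0] or 0
new_rows = (
    spark.table("stg_customer")
    .join(spark.table("dim_customer"), on="customer_id", how="left_anti")
    .withColumn(
        "customer_sk",
        F.lit(max_sk) + F.row_number().over(Window.orderBy("customer_id")),
    )
    .select("customer_sk", "customer_id", "name", "email")
)
new_rows.write.format("delta").mode("append").saveAsTable("dim_customer")
```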

  • @mohammedghouse9088
    @mohammedghouse9088 1 year ago +1

    Hey Simon,
    Can you please post the link to that Polaris paper?

    • @AdvancingAnalytics
      @AdvancingAnalytics  1 year ago

      Absolutely!
      Our vid about Polaris: ua-cam.com/video/IqjVZexHCcE/v-deo.html
      The original whitepaper: www.vldb.org/pvldb/vol13/p3204-saborit.pdf
      Enjoy! Simon

  • @user-dh6sj7mv9n
    @user-dh6sj7mv9n 6 months ago

    How do I load varchar data with more than 8,000 characters?

  • @Fonsmail
    @Fonsmail 1 year ago

    My most important question: “what do I need to avoid now in order to improve my migration experience that is coming?”

  • @robertsbd
    @robertsbd 1 year ago

    If I am implementing a data platform on Azure for a client now, as a temporary measure for the here and now, but the plan is a Fabric platform when that hits GA and we want a lakehouse- and Spark-centric solution: will it be easier to migrate from Synapse or from Databricks, or does it not really matter which we pick?

    • @AdvancingAnalytics
      @AdvancingAnalytics  1 year ago +1

      A Synapse Spark-based lakehouse moving to Fabric will need slightly less refactoring than a Databricks-based lake, although with Databricks you could also happily keep a lot of your code in Databricks and just switch it to using a OneLake destination (see the sketch after this thread). So it depends on what you want your future-state architecture to be!

    • @robertsbd
      @robertsbd 1 year ago

      @@AdvancingAnalytics Thanks for the helpful advice!
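
To make the "keep the code in Databricks, switch the destination to OneLake" option above a little more concrete, here is a minimal sketch of a Databricks job writing its output as a Delta table to a Fabric lakehouse via an OneLake ABFS path. The workspace, lakehouse, and table names are placeholders, and the authentication setup (for example, a service principal with access to the Fabric workspace) is assumed to be configured already; check the current Fabric documentation for the exact path format.

```python
# Minimal sketch: keep transformation code in Databricks but land the output
# in Fabric OneLake as a Delta table. Workspace/lakehouse/table names are
# placeholders; authentication against OneLake is assumed to be set up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # ambient session in a Databricks notebook

onelake_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Tables/dim_customer"
)

(
    spark.table("silver.dim_customer")   # existing Databricks table (placeholder)
    .write
    .format("delta")
    .mode("overwrite")
    .save(onelake_path)                  # lands in OneLake, visible to Fabric
)
```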

  • @sreekanth6180
    @sreekanth6180 1 year ago

    Will it be easy to do a lift and shift of existing data solutions to Fabric?

    • @AdvancingAnalytics
      @AdvancingAnalytics  1 year ago +1

      Of some data solutions? Yep! Really easy! Of... all data solutions? I can't promise that! :D