Getting Started with Dataflow in Microsoft Fabric Data Factory

  • Published Dec 16, 2024

COMMENTS • 33

  • @shafa7668
    @shafa7668 A year ago +1

    I wanted to get started with Fabric from day one of the announcement, literally. So thank you for starting this series. You have given us a head start!! Cheers

    • @RADACAD
      @RADACAD A year ago

      Always glad to help :)

  • @antonyliokaizer
    @antonyliokaizer A year ago

    I'm wondering why the public preview doesn't have the "Add data destination" button shown at 10:16 after I upload a CSV file as a table? Thank you.

    • @antonyliokaizer
      @antonyliokaizer A year ago

      Without the button, I cannot send data to the lakehouse or the warehouse...

    • @RADACAD
      @RADACAD A year ago +1

      Are you creating a Dataflow Gen2? Gen1 doesn't have this option.

    • @antonyliokaizer
      @antonyliokaizer A year ago

      @@RADACAD In the public preview, I don't see any entry for creating a Gen1 dataflow...
      Thanks, let me double-check again.

    • @antonyliokaizer
      @antonyliokaizer A year ago

      @@RADACAD I checked, and you're correct. Thank you.
      I guess Gen1 dataflows were created in a pipeline.
      From the Data Factory page, there's only "Dataflow Gen2", not "Gen1".
      Thank you again.

  • @Jojms123
    @Jojms123 9 months ago +1

    First of all, thanks for the video. Suggestion: it would be great to have links to your other videos appear as you speak, or in the description below.

  • @mounikajuttiga3936
    @mounikajuttiga3936 6 months ago

    Can we refresh the dataset every 15 minutes in Fabric (scheduled refresh)?

  • @RajeshGopu-x7m
    @RajeshGopu-x7m A year ago

    Very good video, and easy for beginners to understand before exploring further...

  • @raviv5109
    @raviv5109 A year ago

    Good video, thanks for creating and sharing. It would be interesting to know how it performs on real-world large datasets.

  • @adamsabourin9416
    @adamsabourin9416 A year ago

    Reza, if we choose append instead of replace, is it going to keep duplicates? If so, how can we save as "append and remove duplicates"?

  • @ruru1419
    @ruru1419 A year ago

    Thanks Reza, great video as usual!
    We're trying a PoC with Fabric Warehouse (not Lakehouse) for our SQL user community. Although I have no issues loading small files with Dataflow Gen2, when trying to load on-premises data through our gateway (which works fine for refreshing Power BI datasets) I always get this error:
    "An exception occurred: Microsoft SQL: This function doesn't support the query option 'EnableCrossDatabaseFolding' with value 'true'."
    I cannot find anything related to this... any clue? I wonder if many have tried to implement a "true" business scenario and not just some Excel samples... for this we need to pull data through the gateway. Thanks!

  • @Milhouse77BS
    @Milhouse77BS A year ago

    Thanks. Seems like there should be a "Publish & Refresh" option?

    • @RADACAD
      @RADACAD A year ago

      I agree :) would be helpful

  • @yoismelperez2744
    @yoismelperez2744 A year ago

    Thanks for sharing, Reza. I like how you are taking the lead in going over Microsoft Fabric products. One question I may have missed: will replace update existing records and insert new ones, or just replace the entire dataset? Being familiar with Power BI Dataflows, I think the answer is that it replaces everything, but I just want to confirm.

    • @yoismelperez2744
      @yoismelperez2744 A year ago

      Reza, confirmed: you mentioned it in this video ua-cam.com/video/qNoOQzMjrfk/v-deo.html, it will replace whatever exists 👍

    • @RADACAD
      @RADACAD A year ago

      Thanks :)
      Replace will wipe out the existing data and enter the new data, whereas append will add the new data to the existing data.
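The replace/append behavior discussed in this thread can be sketched with plain SQL semantics. This is an illustrative analogy only, not Fabric's actual implementation; the table and column names are made up for the example:

```python
# Illustrative sketch: Dataflow Gen2's "replace" and "append" update
# methods behave roughly like these two SQL patterns on a destination
# table. The `sales` table and its columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.execute("INSERT INTO sales VALUES (1, 100.0), (2, 200.0)")

def load(conn, rows, mode):
    """mode='replace' wipes the table first; mode='append' only adds rows."""
    if mode == "replace":
        conn.execute("DELETE FROM sales")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# The new batch overlaps with the existing id 2.
new_rows = [(2, 200.0), (3, 300.0)]
load(conn, new_rows, mode="append")

# Append keeps the overlap: id 2 now appears twice. Append does NOT
# deduplicate; duplicates would have to be removed in the query itself.
dup_count = conn.execute(
    "SELECT COUNT(*) FROM sales WHERE id = 2").fetchone()[0]
total = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(dup_count, total)  # 2 4
```

With `mode="replace"`, the same call would leave only the two new rows, matching the "wipe and reload" behavior described above; this also explains why append alone cannot act as "append and remove duplicates".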

  • @tea0819
    @tea0819 A year ago

    Excellent video, thank you for sharing. I am new to your channel but I'm enjoying all of the content. I recently started a YouTube channel as well, focused on Azure data, and I was curious what software you use for drawing red boxes around items and zooming in on your video?

    • @RADACAD
      @RADACAD A year ago

      Best of luck, and thanks!
      I use ZoomIt

  • @kapiljadaun7264
    @kapiljadaun7264 A year ago

    Hi,
    Your way of explaining is great.
    I would request a video going all the way from the start to building reports in Power BI, with a demo. It would be very helpful.
    Thank you

    • @RADACAD
      @RADACAD A year ago

      We are glad it is helpful

  • @barttrudeau9237
    @barttrudeau9237 A year ago

    Reza, your videos are amazing. You stay razor-focused and on subject. I'm really enthused about Fabric but concerned about licensing. I don't want to try a bunch of new things for a month only to find out I can't afford them once the trial period is over. We have E5 licensing and I'm not sure what that's going to cover when the trial period ends. Any chance you could update the licensing video you did a while back to help us understand the cost implications of using Fabric?

    • @RADACAD
      @RADACAD A year ago

      Thanks Bart
      I will have a new video on Microsoft Fabric licensing soon. It is slightly different from how Power BI licensing works, but follows similar principles.

  • @debasisrana6437
    @debasisrana6437 8 months ago

    Thanks for the video

  • @AbhishekYadav-rb4bi
    @AbhishekYadav-rb4bi A year ago

    Thank you🙌

    • @RADACAD
      @RADACAD A year ago

      You're welcome 😊

  • @mjbah
    @mjbah A year ago

    Hi Reza,
    Many thanks for the video. As always, your videos help a lot.
    I have a question about "adding data to destination". I was wondering whether you must add each table separately: if you have many tables and want to add them all to the same destination, can't you do it all at once?

    • @RADACAD
      @RADACAD A year ago

      Hi Mohamed
      That is exactly my question too; why shouldn't I be able to add one destination for multiple queries? Let's hope that by the time the preview is done and the product is generally available, we have a feature like that :)

  • @decentmendreams
    @decentmendreams A year ago

    Hi Reza, these are all good, but what has dawned on me is that with Premium Per User licensing, Fabric means squat. It feels like a rich man has moved into your neighborhood and you are watching all his fancy toys as the movers unload. I actually went ahead and turned off the trial version, as it seemed to overcrowd my Service page. Am I far off here?

    • @barttrudeau9237
      @barttrudeau9237 A year ago +1

      I share similar concerns

    • @RADACAD
      @RADACAD A year ago +1

      I understand your concerns.
      To be honest, if you want to use purely Power BI, you won't need Fabric.
      For example, a small business with a data analyst and a few users analyzing data from some Excel files using Power BI works best as a pure Power BI solution.
      However, for larger scenarios you get more done with the other components. In large organizations, you need storage for structured and unstructured data, a staging environment for the data, then a data warehouse, a fully automated ETL mechanism to load the data in, then modeling, visualization, etc. Power BI is only part of the picture. Fabric enables organizations to achieve more in the data analytics space.
      It might look like a very huge product (which it is), but remember how you eat an elephant? One bite at a time :D

    • @decentmendreams
      @decentmendreams A year ago

      @@RADACAD Hi Reza, you are right, Fabric would be overkill for most of my needs except for the Direct Lake connector which, if I understood it correctly, gives blazingly fast data refreshes. My files are large (>100 MB per day) and I need to keep as many of them as I can.
      One bright spot about the introduction of Fabric is that it has made me curious about file compression. For example, I learned that if I convert my CSV files to Parquet files (I never knew about them until this week) I can reduce their size by 75%, which is awesome.
      Thank you for everything.
      A person in Phoenix, Arizona.