Creating A Microsoft Fabric End-To-End Solution ⚡ [Full Course]

  • Published 11 Jul 2023
  • Download your certificate of completion after you finish this course:
    📄 prag.works/202307-lwtn-certif...
    Student files
    ✔️prag.works/lwtn-microsoft-fab...
    Prepare to revolutionize your data universe! Are you tired of battling against disruptive data silos that sabotage your organization's natural flow? Have you grown weary of futile attempts to construct a data warehouse or lakehouse, realizing the colossal waste of time and energy? And let's not forget those individuals lost in the labyrinth, desperately seeking access to the elusive data required for their crucial reports.
    Fear not, for salvation has arrived in the form of Microsoft Fabric!
    Microsoft Fabric is the veritable holy grail of analytics solutions, encompassing a breathtaking array of full-service capabilities. Behold the marvel of seamless data movement, awe-inspiring data lakes, and the sheer power of data engineering, integration, science, real-time analytics, and business intelligence all in one extraordinary package.
    Delve into the boundless possibilities of Power BI, Synapse, and Data Factory as you unlock the mysteries of this groundbreaking offering. Whether you're a daring data engineer, a visionary scientist, a sharp-witted analyst, or even a wordsmith weaving reports, Microsoft Fabric unveils a world of limitless potential.
    Prepare to be mesmerized as we unravel the enigmatic wonders of OneLake, the crown jewel of Microsoft Fabric. Within this lake-centric architectural masterpiece, data professionals and the business elite unite, forging an unbreakable bond of collaboration on daring data projects.
    Get ready to transcend the ordinary and embrace the extraordinary. Microsoft Fabric awaits, ready to propel your organization into an exhilarating new era of data-driven success!
    Next event:
    Mini Hackathon 2023 with Brian Knight
    ✔️ prag.works/lwtn-mini-hackatho...
    Next step on your journey:
    👉 On-Demand Learning - Start With The FREE Community Plan: prag.works/lwtn-trial
    🔗Pragmatic Works On-Demand Learning Packages: pragmaticworks.com/pricing/
    🔗Pragmatic Works Boot Camps: pragmaticworks.com/boot-camps/
    🔗Pragmatic Works Hackathons: pragmaticworks.com/private-tr...
    🔗Pragmatic Works Virtual Mentoring: pragmaticworks.com/virtual-me...
    🔗Pragmatic Works Enterprise Private Training: pragmaticworks.com/private-tr...
    🔗Pragmatic Works Blog: blog.pragmaticworks.com/

    Let's connect:
    ✔️Twitter: / pragmaticworks
    ✔️Facebook: / pragmaticworks
    ✔️Instagram: / pragmatic.w. .
    ✔️LinkedIn: / prag. .
    Pragmatic Works
    7175 Hwy 17, Suite 2 Fleming Island, FL 32003
    Phone: (904) 638-5743
    Email: training@pragmaticworks.com
    #Fabric #PragmaticWorks #AustinLibal #Training #Microsoft #Tech #FreeConference #LearnWithTheNerds
    **Any sales mentioned in the video may no longer be valid. Offers are subject to change with/without notice and are for a limited time only.
  • Entertainment

COMMENTS • 61

  • @PragmaticWorks
    @PragmaticWorks  8 months ago +1

    Download your certificate of completion after you finish this course:
    📄 prag.works/202307-lwtn-certificate
    Student files
    ✔prag.works/lwtn-microsoft-fabric-files

    • @theonlymattkone
      @theonlymattkone 7 months ago +1

      Great video. Thank you for all of the information. ♥

  • @MubarakAli-cb4cq
    @MubarakAli-cb4cq 4 months ago +7

    00:03 Microsoft Fabric is an end-to-end analytics solution with full-service capabilities.
    06:30 Fabric enables a wide range of tools in the Power BI service.
    17:53 Fabric is a one-stop shop for analytics, visualization, storage, sharing, and integration.
    22:50 Create a workspace with Fabric enabled and a lakehouse called Adventure Works.
    32:36 Creating a connection and loading data to a lakehouse using Power Query
    37:20 Create tables in the Adventure Works lakehouse and load data from an Excel workbook
    47:26 Different Microsoft products available. Live teachings provide in-depth learning.
    52:39 Use historical versions of a Delta table for data analysis and recovery
    1:02:46 You can easily upload and analyze delimited text files in a Power BI Lakehouse.
    1:07:36 Clicking the new Power BI dataset and selecting tables from the data lake
    1:18:23 There are two ways to update data in the lakehouse: dataflows and pipelines.
    1:23:06 Data Factory pipelines and activities can be used to move and process data
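The 52:39 chapter above demonstrates Delta table time travel. As a hedged sketch of what that looks like in a Fabric Spark notebook (the table name `sales` and the timestamp are hypothetical placeholders, and `spark` is the session the notebook runtime provides):

```python
# Sketch of Delta Lake time travel in a Fabric Spark notebook.
# Assumes the notebook runtime provides a `spark` session and that a
# Delta table named "sales" already exists in the attached lakehouse;
# the table name and timestamp below are hypothetical placeholders.

# One row per write to the table, with version numbers and timestamps.
spark.sql("DESCRIBE HISTORY sales").show()

# Read the table as of an earlier version number...
v0 = spark.sql("SELECT * FROM sales VERSION AS OF 0")

# ...or as of a point in time.
old = spark.sql("SELECT * FROM sales TIMESTAMP AS OF '2023-07-01'")

# Recover from a bad write by rolling the table back to a prior version.
spark.sql("RESTORE TABLE sales TO VERSION AS OF 0")
```

This is Delta Lake's standard SQL time-travel syntax, not anything Fabric-specific; the same queries work anywhere a recent Delta runtime is available.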

  • @juliestudy1475
    @juliestudy1475 7 months ago +2

    Hi Austin, your tutorial has been most helpful! Thank you so much for sharing!

    • @austinlibal
      @austinlibal 7 months ago

      Thanks for watching and stay tuned for another session on 12/7/23 on the channel as well for more Fabric!

  • @Babayagaom
    @Babayagaom 7 months ago +4

    Pragmatic is the most underrated channel out there

  • @user-xl3ru2tw9g
    @user-xl3ru2tw9g 9 months ago +3

    Hey, would love to see you do a video on governance. How do you secure (Delta) tables? Can you secure them in the Lakehouse and have the same security in the Warehouse and Power BI?

    • @austinlibal
      @austinlibal 9 months ago

      There's a pretty big conversation that would go into that and could be a good topic for a future LWTN session. The short answer is, it depends on where users are going to be accessing the underlying data of the delta tables. The Lakehouse accessed via spark, the SQL Endpoint accessed via SQL, and the Default Power BI dataset accessed with Power BI all have their own individual items in a Fabric Workspace and have different permissions for accessing each. You could give users the ability to access the Power BI dataset but never touch the SQL endpoint.
      From Microsoft:
      "Lakehouse sharing by default grants users Read permission on shared lakehouse, associated SQL endpoint, and default dataset. In addition to default permission, the users can receive:
      ReadData permission on SQL endpoint to access data without SQL policy.
      ReadAll permission on the lakehouse to access all data using Apache Spark.
      Build permission on the default dataset to allow building Power BI reports on top of the dataset."

  • @SudarshanThakurIRONPULLER
    @SudarshanThakurIRONPULLER 7 months ago +2

    Hi Austin, very informative. Just a question: can we use OneLake for open search and key-based search? And can we store PDFs, HTML, and images so that users can download them directly from the data lake based on a key?

  • @austinlibal
    @austinlibal 11 months ago +4

    Hey friends! Thanks for watching our Fabric LWTN session! I want to do some spin off videos from this session to continue discussing capabilities within Fabric and want to know what you are most interested in? Spark Notebooks? Data Warehouses? Datamarts? Let me know in the comments! Thanks!

    • @rosecraigie
      @rosecraigie 11 months ago +1

      Would love to learn more about how to use and leverage the Delta Lake files in Fabric. Delta Lake and the lakehouse are new concepts for me, so covering the basics would be great. I have been watching your previous videos on Delta Lake; do these all apply to Fabric?

    • @austinlibal
      @austinlibal 11 months ago +1

      @@rosecraigie Thanks for your response! I'll put together something on working with delta lake in Notebooks in Fabric!

    • @gonzalomedina6614
      @gonzalomedina6614 11 months ago

      Would love to learn more about Data Warehouse, Datamarts and Spark Notebook.

    • @sirozhiddinbaltabayev9692
      @sirozhiddinbaltabayev9692 7 months ago

      Hello. Thank you so much for sharing your experience and hands-on demo! I am interested in Spark Notebooks; at my workplace I am assigned to demo Fabric features, and your videos are tremendously helpful. Also, I wanted to mention that I have not received my cert after using the above link, and there is nothing in my spam folder either. Could you check? The name it should be under is Flora Dosmet.

    • @VeigaChannelUtube
      @VeigaChannelUtube 7 months ago

      spark notebooks and pipelines

  • @user-lj9fk8dg9h
    @user-lj9fk8dg9h 8 days ago +1

    Hello sir,
    Thank you so much for providing these productive videos.
    Today I faced a challenge whose solution I couldn't find elsewhere:
    how do I extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity)? Could you please help me here?

    • @austinlibal
      @austinlibal 3 days ago

      It seems the only way to do this, according to Microsoft documentation, is through Dataflows Gen2 at this time. There is a connector that allows you to bring in data from SAP HANA.
      You can also do this with Spark notebooks, but I don't have any specific connectors to point you to for that.
      - Austin
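For readers who want the Spark-notebook route mentioned above, a minimal sketch using Spark's generic JDBC reader. This assumes no Fabric-specific HANA connector; the host, credentials, and table names are placeholders, and the SAP HANA JDBC driver jar (ngdbc) would need to be available to the Spark session:

```python
# Sketch: read from SAP HANA Cloud in a Fabric Spark notebook via generic
# JDBC, then land the data as a Delta table in the lakehouse.
# `spark` is the session provided by the notebook runtime; the URL,
# credentials, and table names are hypothetical placeholders.

hana_url = "jdbc:sap://<your-hana-host>:443/?encrypt=true"

df = (
    spark.read.format("jdbc")
    .option("url", hana_url)
    .option("driver", "com.sap.db.jdbc.Driver")  # SAP HANA JDBC driver class
    .option("dbtable", "MYSCHEMA.SALES")         # placeholder source table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Write the extracted rows into the lakehouse as a Delta table.
df.write.format("delta").mode("overwrite").saveAsTable("sales_from_hana")
```

In practice, secrets should come from a key vault or workspace credential store rather than being hard-coded as above.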

  • @clodola1
    @clodola1 9 months ago +1

    Thanks, very informative. Question on the PBI dataset provisioned from a Lakehouse via DirectLake storage mode: the RBAC Security tab is blanked out. Is this not available yet? I cannot find anything online.

    • @austinlibal
      @austinlibal 9 months ago

      I had not actually noticed that; my best guess would be that the security is only controlled in the Power BI Service workspace that the dataset comes from. You're only going to be authoring reports in PBI Desktop, and anything else that can be controlled in the service will be.

  • @niranjanmakkuva4639
    @niranjanmakkuva4639 10 months ago +1

    Do we have a way to track the transformations we have done using a dataflow? I mean logging details such as: we applied a pivot and a group-by, and merged table 1 and table 2 to create table 3.

    • @austinlibal
      @austinlibal 10 months ago

      Thanks for the question!
      You can track the steps of the changes you make inside the data flow just like you can in the Power Query Editor in Power BI Desktop. You still have your applied steps on the right side of the screen. You also have the ability to look at the JSON on the Dataflow and can detect some changes to the data there as well. I'm not certain if there is a specific "Monitoring" section to see a higher level of all of the changes made across the entire workspace / OneLake. I'll keep looking into that though.

  • @nikzadmahbobi4820
    @nikzadmahbobi4820 11 months ago +1

    Austin my man! LY bro

  • @excelexperttraining6003
    @excelexperttraining6003 19 days ago

    Hi Austin, what should I do if I am getting this error message -- TypeError: Cannot read properties of undefined (reading 'toResource')
    -- when trying to connect to a lakehouse (workspace) through the data pipeline process you have in the video? I don't have the Lakehouse in the list. Thanks.

  • @niranjanmakkuva4639
    @niranjanmakkuva4639 11 months ago +1

    I have a question: can we dynamically pass a query to the dataflow? I need a flow that reads the query from a txt file and copies the result to a destination. For example, I have a file containing a table name, and the dataflow should read that table from the source and copy it to the destination.

    • @austinlibal
      @austinlibal 11 months ago

      There may be a way to do what you are describing by combining the elements of pipelines in Fabric and leveraging an activity that does the lookup for what needs to get copied and then passing that into a dataflow activity. I have not tested this but will look into it and see if I can figure out the limitations of such a workflow. Thanks for your question!

  • @samihnaien4840
    @samihnaien4840 3 months ago +1

    In T-SQL in the Data Warehouse there are many missing features, such as ALTER TABLE ADD COLUMN... are there any alternatives?

    • @austinlibal
      @austinlibal 2 months ago

      Those features are being added in the coming months according to Microsoft’s release path.

  • @brycedwhite1
    @brycedwhite1 5 months ago +1

    I followed along with your walkthrough, but Fabric did not create a default Power BI dataset, and I have no button to create a new Power BI dataset. I have a trial Fabric license and a PPU license for Power BI, if that matters.

  • @antonzhong9970
    @antonzhong9970 7 months ago +1

    Awesome video, Austin. May I ask: at 1:16:00, when we want to connect to and use a Power BI Dataset in Power BI Desktop, suppose a different person who did not create the OneLake and Lakehouse only wants to use the dataset given to them by their company admin. Will that person only need a Power BI Pro license? Is there any other license they could have just to connect to and use that "given" dataset?
    Thanks for your video, it really helps.

    • @austinlibal
      @austinlibal 7 months ago

      Great question!
      While an organization does need either a P SKU or F SKU to even leverage Fabric capabilities in a workspace, a user who does not have an assigned license but is given permission to build reports on the Lakehouse "Dataset" (recently renamed to "Semantic Model") is able to interact with and access the dataset in the OneLake Data Hub. I have tested this and confirmed that it works in a demo environment.
      Thanks for watching and have a great day!

    • @antonzhong9970
      @antonzhong9970 7 months ago

      @@austinlibal Thanks for your prompt reply. Last question: what is the benefit of using the "Dataset" (from the OneLake Data Hub -> Power BI Dataset) versus directly connecting to the Lakehouse (from the OneLake Data Hub -> Lakehouses)? Besides, per your example, using the Dataset to segregate which tables can be used (visible to the user or not). Many thanks!

    • @austinlibal
      @austinlibal 7 months ago

      @@antonzhong9970 In the dataset you can define relationships that can be leveraged by the user downstream. You can also create measures in the dataset that will be used in reports. This is an attempt to get to one single source of truth and centralizing the logic across an organization.

  • @user-lj9fk8dg9h
    @user-lj9fk8dg9h 25 days ago

    Sir,
    How can we build a JDBC/pyodbc connection between a Fabric Data Warehouse and a Fabric Notebook?
    I have been searching for a long time, but without success.

  • @amarachinwokeugagbe3323
    @amarachinwokeugagbe3323 1 year ago +2

    Waiting for the files. Thank you for the class.

    • @danilcsete
      @danilcsete 1 year ago

      @@austinlibal Sadly it is not possible to replay the live chat :/

    • @sorayaruiz8652
      @sorayaruiz8652 1 year ago

      me too

    • @austinlibal
      @austinlibal 1 year ago

      ​@@sorayaruiz8652 The class files should be at the top of the description now. Thanks for your patience!

    • @austinlibal
      @austinlibal 1 year ago

      @@danilcsete The class files should be at the top of the description now. Thanks for your patience!

    • @austinlibal
      @austinlibal 1 year ago

      The class files should be at the top of the description now. Thanks for your patience!

  • @user-vt9lc7ng2p
    @user-vt9lc7ng2p 8 months ago +1

    Hi Austin
    While I followed your steps, there was no SQL endpoint created in my space (44:00). How do I resolve it?

    • @austinlibal
      @austinlibal 8 months ago

      Just to be clear, you're saying that when you created the Dataflow, there was no SQL endpoint that was created?

  • @pwasoo
    @pwasoo 1 year ago +1

    Hi Austin, I am getting this error when trying to create a relationship between FactInternetSales and DimProduct in the Power BI Dataset (1:10:00):
    "Free User cannot apply model changes, upgrade to PowerBI Pro"
    How did you get it working?

    • @austinlibal
      @austinlibal 1 year ago +1

      Hello! There will be limitations when working in this free Fabric trial when it comes to accessing a Power BI workload instead of a Fabric workload. The ability to alter a dataset would fall under that case. So it seems that you won't be able to alter that part unless you have a Power BI Pro license at a minimum. Sorry for that but it seems like you were able to follow along with the rest OK! Thanks for watching!

  • @theonlymattkone
    @theonlymattkone 7 months ago +1

    Does this work for multiple files that have the same data in them? Say I have receipts by month from FY 2023, and I want to combine all of the files in a folder, similar to how I would in PQ in Excel or PBI Desktop. Could I do that if the files are being stored in the lakehouse?

    • @austinlibal
      @austinlibal 7 months ago

      Great question. You absolutely can. You are able to make a connection in Dataflows Gen 2 and combine all files in a folder into one query just like you could with Desktop Power Query. The "Files" folder in the Lakehouse is just a data lake container (folder) and you can have nested folders inside of it.
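For anyone doing the same folder-combine from a Spark notebook rather than Dataflows Gen2, a hedged sketch (the folder path `Files/receipts/fy2023` and the table name are hypothetical, and `spark` is the session the notebook runtime provides):

```python
# Sketch: combine every CSV in a Lakehouse "Files" subfolder into one
# table, similar to Power Query's combine-files experience.
# The folder path, schema options, and table name are placeholders.
from pyspark.sql.functions import input_file_name

receipts = (
    spark.read
    .option("header", True)       # first row of each file is a header
    .option("inferSchema", True)  # guess column types from the data
    .csv("Files/receipts/fy2023/*.csv")  # reads all matching files at once
)

# Optionally tag each row with its source file for auditing.
receipts = receipts.withColumn("source_file", input_file_name())

# Persist the combined result as a Delta table in the lakehouse.
receipts.write.format("delta").mode("overwrite").saveAsTable("receipts_fy2023")
```

Spark's glob support in the path plays the role of Power Query's folder connector here; nested folders can be matched with patterns like `Files/receipts/*/*.csv`.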

  • @tbe0116
    @tbe0116 4 months ago

    UA-cam: how many ads do you want to show?
    Pragmatic: Yes

    • @PragmaticWorks
      @PragmaticWorks  4 months ago

      All of our archived Learn with the Nerds webinars are available ad-free on our On Demand Learning platform. prag.works/odl-trial-yt

  • @user-yv2fw3cb8p
    @user-yv2fw3cb8p 7 months ago +1

    Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: 值无法更新 ("The value cannot be updated"). Why does this error happen on Dataflow Gen2?

    • @austinlibal
      @austinlibal 7 months ago

      Hard to tell from that error message, which I believe translates to "The value cannot be updated." Can you shoot me an email with a screenshot of the error and we can troubleshoot?
      Austin

  • @naktibaldauju
    @naktibaldauju 3 months ago +2

    Guys, good content but did you REALLY have to put ads like every 5 minutes? So monetization-hungry, eh

    • @austinlibal
      @austinlibal 3 months ago

      Sign up for our on-demand learning program for free! No ads! Thanks for watching!

  • @antonzhong9970
    @antonzhong9970 7 months ago +1

    Hi Austin, sorry to bother you again, but I'm stuck with the DimDate table: after using it in Power BI Desktop, I cannot "Mark as date table". Although I already added a new column in PBI Desktop using this DAX: Date = DATE(Dates[CalendarYear], Dates[MonthNumberOfYear], Dates[DayNumberOfMonth]), it turns out that when I try "Mark as date table", only this column appears in the column selection. It validates successfully, but then errors with "The calculated column "Dates[]" cannot be used as a primary key of the table". Do you have any idea what the problem is, and how to mark this as a date table? Thanks in advance.

    • @austinlibal
      @austinlibal 7 months ago

      I haven't tested this, but it could come back to a similar concept from our previous conversation about where you do your modeling, write DAX measures, and so on: for a Lakehouse, that means going to the Model view in the SQL Endpoint and marking it as a date table there. When in PBI Desktop with a Fabric Lakehouse connection, the Model view is more view-only, and the service is where you define those things.

  • @clodola1
    @clodola1 9 months ago +1

    I created a new Power BI dataset from the AdventureWorks lakehouse, connected to the PBI dataset via the OneLake Data Hub in PBI Desktop, created a measure in Desktop, and published the report to the same workspace; the Report.pbix is still linked to the PBI dataset in the AdventureWorks lakehouse. The measure is visible in the report but not in the PBI dataset. Bug? I was expecting to see the measure also within the PBI dataset in the AdventureWorks lakehouse. To test, I also created a measure with the same name via Fabric within the same PBI dataset; the PBI dataset shows the measure once, but the report now shows a duplicate measure (two measures, same name).

    • @austinlibal
      @austinlibal 9 months ago +1

      I believe that you should only be creating measures for the default dataset in the Fabric workspace. Creating it in the report will not reflect back to the service, so if the intention is to share this measure with others in the organization you would have to do it in the service.
      I'm not sure if the ability to create a measure with a Lakehouse Dataset connection should even be enabled in the Desktop, as it should all be done in the service; creating relationships in a report, for example, is not allowed.

  • @notoriousft
    @notoriousft 1 year ago +1

    Exercise files please.

    • @austinlibal
      @austinlibal 1 year ago +1

      The class files should be at the top of the description now. Thanks for your patience!

  • @aadilmuhamed3133
    @aadilmuhamed3133 10 months ago

    Poda Motta