Download your certificate of completion after you finish this course:
📄 prag.works/202307-lwtn-certificate
Student files
✔prag.works/lwtn-microsoft-fabric-files
Great video. Thank you for all of the information. ♥
00:03 Microsoft Fabric is an end-to-end analytics solution with full service capabilities.
06:30 Fabric enables a wide range of tools in the Power BI service.
17:53 Fabric is a one-stop shop for analytics, visualization, storage, sharing, and integration.
22:50 Create a workspace with Fabric enabled and a lakehouse called Adventure Works.
32:36 Creating a connection and loading data to a lakehouse using Power Query
37:20 Create tables in the Adventure Works lakehouse and load data from an Excel workbook
47:26 Different Microsoft products available. Live teachings provide in-depth learning.
52:39 Use historical versions of a Delta table for data analysis and recovery (see the notebook sketch after this list)
1:02:46 You can easily upload and analyze delimited text files in a Fabric lakehouse.
1:07:36 Clicking New Power BI dataset and selecting tables from the data lake
1:18:23 There are two ways to update data in the lakehouse: dataflows and pipelines.
1:23:06 Data Factory pipelines and activities can be used to move and process data
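The 52:39 chapter demonstrates Delta time travel. As an illustration only (not code from the video), here is a minimal PySpark sketch you could run in a Fabric notebook attached to the lakehouse; the table name dimProduct comes from the session, but the version number is hypothetical:

# List the versions (commits) recorded in the Delta log for the table.
spark.sql("DESCRIBE HISTORY dimProduct").select("version", "timestamp", "operation").show()

# Read the table as it looked at an earlier version (time travel).
# Version 3 is a made-up example; pick a real one from the history above.
old_snapshot = spark.sql("SELECT * FROM dimProduct VERSION AS OF 3")
old_snapshot.show(5)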
Thanks for this!
Pragmatic is most underrated channel out there
Thank you, we agree. :)
Hi Austin, your tutorial has been most helpful! Thank you so much for sharing!
Thanks for watching and stay tuned for another session on 12/7/23 on the channel as well for more Fabric!
Hey, would love to see you do a video on governance. How do you secure (Delta) tables? Can you secure them in the Lakehouse and have the same security in the Warehouse and Power BI?
There's a pretty big conversation that would go into that, and it could be a good topic for a future LWTN session. The short answer is: it depends on where users are going to access the underlying data of the Delta tables. The Lakehouse (accessed via Spark), the SQL endpoint (accessed via SQL), and the default Power BI dataset (accessed with Power BI) are each individual items in a Fabric workspace, and each has its own permissions. You could give users the ability to access the Power BI dataset but never touch the SQL endpoint.
From Microsoft:
"Lakehouse sharing by default grants users Read permission on shared lakehouse, associated SQL endpoint, and default dataset. In addition to default permission, the users can receive:
ReadData permission on SQL endpoint to access data without SQL policy.
ReadAll permission on the lakehouse to access all data using Apache Spark.
Build permission on the default dataset to allow building Power BI reports on top of the dataset."
Great tutorial... a simple yet thorough explanation.
Hey friends! Thanks for watching our Fabric LWTN session! I want to do some spin-off videos from this session to continue discussing capabilities within Fabric, and I want to know what you are most interested in. Spark Notebooks? Data Warehouses? Datamarts? Let me know in the comments! Thanks!
Would love to learn more about how to use and leverage the Delta Lake files in Fabric. Delta Lake and the lakehouse are new concepts for me, so covering the basics would be great. I have been watching your previous videos on Delta Lake; do those all apply to Fabric?
@@rosecraigie Thanks for your response! I'll put together something on working with delta lake in Notebooks in Fabric!
Would love to learn more about Data Warehouse, Datamarts and Spark Notebook.
Hello. Thank you so much for sharing your experience and hands-on demo! I am interested in Spark Notebooks; at my workplace I am assigned to demo Fabric features, and your videos are tremendously helpful. Also, I wanted to mention that I have not received my certificate from using the link above, and there is nothing in my spam folder either. Could you check? The name it should be under is Flora Dosmet.
spark notebooks and pipelines
Guys, good content but did you REALLY have to put ads like every 5 minutes? So monetization-hungry, eh
Sign up for our on-demand learning program for free! No ads! Thanks for watching!
Hello sir,
Thank you so much for providing these productive videos.
Today I faced a challenge whose solution I couldn't find elsewhere: how to extract data from SAP HANA Cloud to Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?
According to Microsoft documentation, the only way to do this at the moment is through Dataflows Gen2; there is a connector that allows you to bring in data from SAP HANA.
You can also do this with Spark notebooks, but I don't have any specific connectors to point you to for that.
- Austin
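For anyone who wants to try the notebook route, a rough sketch of a generic Spark JDBC read is below. This is an assumption-heavy illustration, not a supported connector: it assumes the SAP HANA JDBC driver (ngdbc.jar) has been made available to the Fabric Spark environment, and every host, credential, and table name is a placeholder.

# Generic Spark JDBC read against SAP HANA Cloud (TLS on port 443).
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sap://<your-hana-host>:443/?encrypt=true")
    .option("driver", "com.sap.db.jdbc.Driver")  # class from the ngdbc.jar driver
    .option("dbtable", "MYSCHEMA.MYTABLE")       # placeholder schema.table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Land the extract as a Delta table in the attached lakehouse.
df.write.mode("overwrite").format("delta").saveAsTable("sap_hana_extract")

Since there is no dedicated connector here, treat this purely as the standard Spark JDBC pattern applied to HANA.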
I followed along with your walkthrough, but Fabric did not create a default Power BI dataset, and I have no button to create a new Power BI dataset. I have a trial Fabric license and a PPU license for Power BI, if that matters.
Hi Austin
While I followed your steps, there was no SQL endpoint created in my workspace (44:00). How do I resolve this?
Just to be clear, you're saying that when you created the Dataflow, there was no SQL endpoint that was created?
Hi Austin, very informative. Just a question: can we use OneLake for open search and key-based search? And can we also store PDFs, HTML, and images so that users can download them directly from the data lake based on a key?
Hey, first of all, thank you so much for the course. I have an issue: the tables (dimProduct and factInternetSales) only appear in the lakehouse menu. When I go into the SQL endpoint and open the Tables subfolder, the tables don't show. What am I doing wrong? Greetings
If you expand the Schemas folder, then the Tables folder, and nothing appears, click the ellipsis (...) next to the Tables name and click Refresh. It should reload the available tables. Let me know whether that works.
@@austinlibal Hi Austin. I tried refreshing the Tables folder and nothing appeared. I noticed that the connection of my lakehouse (that lakehouse) is offline. Is that right? I tried to reconnect via the OAuth protocol, the only choice to reconnect the lakehouse connection, but I can't. The error when trying to change the credential (Windows authentication):
Unable to update connection credentials.
We could not access the data source. Please make sure you have permission to access the data source and that your credentials are correct.
Let me know if you know how to fix it, Austin. Thank you in advance.
Thanks, very informative. A question on the PBI dataset provisioned from a Lakehouse via Direct Lake storage mode: the RBAC Security tab is blanked out. Is this not available yet? I cannot find anything online.
I had not actually noticed that; my best guess would be that security can only be controlled in the Power BI Service workspace the dataset comes from. You're only going to be authoring reports in PBI Desktop, and anything else that can be controlled in the service will be.
Awesome video, Austin. May I ask: at 1:16:00, when we want to connect to and use a Power BI Dataset in Power BI Desktop, suppose it is a different person who did not create the OneLake and Lakehouse and only wants to use the dataset given to them by their company admin. Would that person only need a Power BI Pro license? Is there any other license he/she might need just to connect to and use that "given" dataset?
Thanks for your video, it really helps.
Great question!
While an organization does need either a P SKU or an F SKU to leverage Fabric capabilities in a workspace, a user without an assigned license who is given permission to build reports on the Lakehouse "Dataset" (recently renamed to "Semantic Model") is able to access and interact with the dataset in the OneLake Data Hub. I have tested this and confirmed that it works in a demo environment.
Thanks for watching and have a great day!
@@austinlibal Thanks for your prompt reply. Last question: what is the benefit of using the "Dataset" (OneLake Data Hub -> Power BI Dataset) versus connecting directly to the Lakehouse (OneLake Data Hub -> Lakehouses), besides, per your example, the dataset segregating which tables can be used (visible to the user or not)? Many thanks!
@@antonzhong9970 In the dataset you can define relationships that can be leveraged by users downstream. You can also create measures in the dataset that will be used in reports. This is an attempt to get to a single source of truth and centralize the logic across an organization.
Hey! Fantastic video, awesome, thanks!
I have some schemas in my warehouse; how can I point a dataflow at a specific one? I don't want to save everything in dbo.
Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: 值无法更新。 Why does this error happen on Dataflow Gen2?
Hard to tell from that error message which I believe translated for me to, "The value cannot be updated," can you shoot me an email with a screenshot of the error and we can troubleshoot?
Austin
Does this work for multiple files that have the same data in them? Say I have receipts by month for FY 2023 and I want to combine all of the files in a folder, similar to how I would in PQ in Excel or PBI Desktop. Could I do that if the files are stored in the lakehouse?
Great question. You absolutely can. You are able to make a connection in Dataflows Gen 2 and combine all files in a folder into one query just like you could with Desktop Power Query. The "Files" folder in the Lakehouse is just a data lake container (folder) and you can have nested folders inside of it.
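If you'd rather do the same combine in a Spark notebook instead of Dataflows Gen2, here is a minimal sketch under stated assumptions: the notebook is attached to the lakehouse holding the files, and the receipts/fy2023 folder name, header option, and table name are all hypothetical.

# Read every monthly CSV under one Files subfolder into a single
# DataFrame (the wildcard does the combining), then save it as Delta.
receipts = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/receipts/fy2023/*.csv")  # hypothetical folder of monthly files
)
receipts.write.mode("overwrite").format("delta").saveAsTable("receipts_fy2023")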
Hi Austin, I am getting this error when trying to create a relationship between FactInternetSales and DimProduct in the Power BI dataset (1:10:00):
"Free user cannot apply model changes, upgrade to Power BI Pro"
How did you get it working?
Hello! There are limitations when working in this free Fabric trial when it comes to accessing a Power BI workload instead of a Fabric workload, and the ability to alter a dataset falls under that case. So it seems you won't be able to do that part unless you have at least a Power BI Pro license. Sorry about that, but it seems like you were able to follow along with the rest OK! Thanks for watching!
Do we have a way to track the transformations we have done using a dataflow? I mean logging the details, e.g., that we applied a pivot and a group-by, and merged table 1 and table 2 to create table 3.
Thanks for the question!
You can track the steps of the changes you make inside the data flow just like you can in the Power Query Editor in Power BI Desktop. You still have your applied steps on the right side of the screen. You also have the ability to look at the JSON on the Dataflow and can detect some changes to the data there as well. I'm not certain if there is a specific "Monitoring" section to see a higher level of all of the changes made across the entire workspace / OneLake. I'll keep looking into that though.
In T-SQL in the Data Warehouse there are a lot of missing features, like ALTER TABLE ADD COLUMN... Are there any alternatives?
Those features are being added in the coming months according to Microsoft’s release path.
Hi Austin, what should I do if I am getting this error message -- TypeError: Cannot read properties of undefined (reading 'toResource') -- when trying to connect to a lakehouse (workspace) through the data pipeline process you show in the video? The Lakehouse doesn't appear in the list. Thanks
Hi Austin, sorry to bother you again, but I am stuck on the dimDate table: after bringing it into Power BI Desktop, I cannot "Mark as date table", even though I already added a new column in PBI Desktop using this DAX: Date = DATE(Dates[CalendarYear], Dates[MonthNumberOfYear], Dates[DayNumberOfMonth]). When I use "Mark as date table", only this column appears in the column selection; it validates successfully, but then an error says "The calculated column "Dates[]" cannot be used as a primary key of the table". Do you have any idea what the problem is and how to mark this as a date table? Thanks in advance.
I haven't tested this, but it could come back to a concept from our earlier conversation about where you model, write DAX measures, and so on. For a Lakehouse that is the Model view of the SQL endpoint, so you would mark the date table there. When you are in PBI Desktop with a Fabric Lakehouse connection, the model view is essentially view-only; the service is where you define those things.
I have a question: can we dynamically pass a query to the dataflow? I need a flow that reads the query from a txt file and copies the result to a destination. E.g., I have a file with a table name, and the dataflow should read that table from the source and copy it to the destination.
There may be a way to do what you are describing by combining the elements of pipelines in Fabric and leveraging an activity that does the lookup for what needs to get copied and then passing that into a dataflow activity. I have not tested this but will look into it and see if I can figure out the limitations of such a workflow. Thanks for your question!
Sir,
How can we build a JDBC/pyodbc connection between a Fabric Data Warehouse and a Fabric Notebook?
I have been searching for a long time, but without success.
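Not an official answer, but one pattern worth testing: the Warehouse exposes a SQL connection string (Settings > SQL connection string), and pyodbc can log in with an Entra ID access token. The sketch below is assumption-heavy: it presumes pyodbc and the Microsoft ODBC driver are present in the notebook environment, that mssparkutils will issue a token for this audience, and every bracketed value is a placeholder.

import struct
import pyodbc
from notebookutils import mssparkutils  # built into Fabric notebooks

# Acquire an Entra ID token for the SQL endpoint (audience string is an assumption).
token = mssparkutils.credentials.getToken("https://database.windows.net/.default")
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # pyodbc pre-connect attribute for AAD tokens

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<warehouse-sql-connection-string>,1433;"  # copy from Warehouse settings
    "Database=<warehouse-name>;Encrypt=yes;",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 * FROM dbo.dimProduct")  # hypothetical table
print(cursor.fetchall())

If that token audience doesn't work in your tenant, the same attrs_before pattern works with a token from azure-identity; the length-prefixed UTF-16-LE struct packing is the part pyodbc requires.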
Austin my man! LY bro
UA-cam: how many ads do you want to show?
Pragmatic: Yes
All of our archived Learn with the Nerds webinars are available ad-free on our On Demand Learning platform. prag.works/odl-trial-yt
Waiting for the files. Thank you for the class.
@@austinlibal Sadly it is not possible to replay the live chat :/
me too
@@sorayaruiz8652 The class files should be at the top of the description now. Thanks for your patience!
@@danilcsete The class files should be at the top of the description now. Thanks for your patience!
The class files should be at the top of the description now. Thanks for your patience!
I created a new Power BI dataset from the AdventureWorks lakehouse and connected to it via the OneLake Data Hub in PBI Desktop. I created a measure in Desktop and published the report to the same workspace; the Report.pbix is still linked to the PBI dataset in the AdventureWorks lakehouse. The measure is visible in the report but not in the PBI dataset. Bug? I was expecting to also see the measure within the PBI dataset in the AdventureWorks lakehouse. To test, I also created a measure with the same name via Fabric within the same PBI dataset; the PBI dataset shows the measure once, but the report now shows duplicates (two measures, same name).
I believe that you should only be creating measures for the default dataset in the Fabric workspace. Creating it in the report will not reflect back to the service, so if the intention is to share this measure with others in the organization you would have to do it in the service.
I'm not sure the ability to create a measure over a Lakehouse dataset connection should even be enabled in the Desktop, since it should all be done in the service; creating relationships in a report, for example, is not allowed.
Exercise files please.
The class files should be at the top of the description now. Thanks for your patience!
Get lost, baldy.