As a non-technical guy suddenly finding myself pushing PBI and trying to transition from individual spreadsheets loaded from the reporting server, SSAS seemed like a godsend, but also difficult to find info on. I learned so much and had multiple Aha! moments watching this video. Much appreciated!
Thank you Bryan! I just started my first job after graduating from university and was informed by my team that we are moving from SSAS to Azure Analysis Services. Your videos are a lifesaver. You provide an excellent explanation of the evolution of data models. I hope you continue making videos.
Just want to thank you for your work. I know that to present something complex and make it seem simple and logical is really hard and it requires some real talent. Great work!
Wow! I don't usually leave comments on YouTube, but this was amazing! The real-life examples you shared are very much appreciated! Looking forward to many more videos from you!
Wow, superb! I have never seen such a detailed level of explanation in any video so far; this requires a lot of hard work and in-depth knowledge. Thanks Bryan, and keep doing the great work. Big thumbs up for your effort
The world needs you Bryan! :-) As ever, thanks. The knowledge you share is incredibly helpful in building a meaningful level of understanding about the available tools, services, and technologies :)
Amazing the way you have explained it! I really liked it! Thank you for this video! I was thinking "what the hell is a tabular model?", and I got it! Thanks!
As well, if possible, please do a video on the management tasks for the tabular model: automating partitions; deploying to multiple environments (Dev, PAT, Prod); and scaling out to multiple users and regions (Azure). ☺️
Thank you Bryan! If possible, please do a video on the management tasks for the tabular model: how to create dynamic partitions, automating partitions, and how to automatically identify changes in the data warehouse and then determine which cube partitions need to be updated and processed. Appreciated. Thank you
A couple of comments: 1. The VertiPaq engine can fit into memory what is available to it. If I have a 400 GB compressed model and 8 GB of memory on my machine, Windows swaps to disk, moving required data from disk to memory. It's not entirely in memory. It's not an entirely pleasant experience. 2. Pre-aggregations now exist in Power BI. 3. Self-service from the ground up, with data sources, models, and DAX, is typically beyond the average user. You cannot casually learn DAX, imho.
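On point 1: VertiPaq is a columnar, compressed store, and the reason even large models can fit in memory is that low-cardinality columns compress extremely well. The sketch below is a toy illustration using run-length encoding only; VertiPaq's actual encodings (value, dictionary, and run-length combined) are proprietary and far more sophisticated.

```python
def run_length_encode(column):
    """Compress a column into (value, run_length) pairs."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1] = (value, runs[-1][1] + 1)
        else:
            runs.append((value, 1))
    return runs

def run_length_decode(runs):
    """Expand (value, run_length) pairs back into the full column."""
    return [value for value, count in runs for _ in range(count)]

# A sorted, low-cardinality column (think of a Region attribute in a
# dimension) collapses from thousands of values to a handful of runs.
region_column = ["East"] * 5000 + ["West"] * 5000
encoded = run_length_encode(region_column)
print(len(region_column), "values ->", len(encoded), "runs")  # 10000 values -> 2 runs
assert run_length_decode(encoded) == region_column
```

This is also why cardinality and sort order matter so much for model size: a high-cardinality column such as a transaction ID gains almost nothing from this kind of encoding.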
There are two ways to query data in Power BI: Import mode and Direct Query mode, which passes the queries to the back-end sources to get the results. A Tabular Model is imported, and according to the documentation, "When queried or refreshed, the data must be fully loaded into memory of the Power BI capacity. Once in memory, Import models can then achieve very fast query results. It's also important to understand that there's no concept of an Import model being partially loaded into memory." Link: docs.microsoft.com/en-us/power-bi/service-dataset-modes-understand I did see an old blog about modifying the VertiPaq configuration for on-prem SSAS, but that is not the default, and I did not find anything about it for the Azure-based services. If you have something about that, glad to see it.
Azure Analysis Services is a service only available on Azure and only supports creating Tabular Models. SSAS is SQL Server Analysis Services, part of SQL Server, on-prem only. It supports multidimensional cubes and the Tabular Model. There are other differences in feature compatibility, and Azure Analysis Services has built-in multi-server scale-out support.
Thank you for the useful explanation, but I am still confused about where Synapse fits in this pipeline. If I am using Visual Studio with Data Tools and deploying the model to AAS, what is the benefit of using Synapse? Put another way, is Azure Synapse an IDE like Visual Studio with more powerful features?
You don't need to use Synapse with AAS. Synapse is a scaled-out data platform like Snowflake, but most AAS models get data from SQL Server or even flat files.
I have worked for years with Multidimensional and Tabular Models, and I have a very critical question for many businesses: what if, between tables in Tabular, we need a relationship based on more than one column? For example, two columns, where the first is the origin of the data and the second is the original primary key, because we may have consolidated data into a staging database from multiple systems with the same structure. Multidimensional supports multi-column relationships, in contrast with Tabular. Of course, we can merge two or more columns into one in Tabular, but this is not a good idea, especially because the engine will not take the database indexes into consideration.
So yeah. The Tabular Model does not currently support concatenated keys, which is what I think you are referring to, and yes, they are quite common. Ideally, during the dimensional model load, you could create a new surrogate key for each table, which is a best practice anyway. Then use that to join your tables. It's more work but has benefits.
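A minimal sketch of that surrogate-key pattern, with hypothetical table and column names (SQLite stands in for the warehouse purely for illustration): the dimension load assigns a single integer key, so facts no longer need the two-part (source system + original primary key) natural key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Staging rows consolidated from multiple systems with the same structure.
    CREATE TABLE staging_customer (source_system TEXT, source_pk INTEGER, name TEXT);
    INSERT INTO staging_customer VALUES
        ('CRM', 101, 'Alice'),
        ('ERP', 101, 'Bob');  -- same original PK, different source system

    -- The dimension gets its own integer surrogate key; the natural key
    -- columns are kept only for lineage and lookups during the load.
    CREATE TABLE dim_customer (
        customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,
        source_system TEXT,
        source_pk INTEGER,
        name TEXT
    );
""")
conn.execute("""
    INSERT INTO dim_customer (source_system, source_pk, name)
    SELECT source_system, source_pk, name
    FROM staging_customer
    ORDER BY source_system
""")
# Fact rows can now join on the single-column customer_sk, which the
# tabular relationship designer supports directly.
rows = conn.execute(
    "SELECT customer_sk, name FROM dim_customer ORDER BY customer_sk"
).fetchall()
print(rows)  # [(1, 'Alice'), (2, 'Bob')]
```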
Then you should reconsider the following: right now, in Power BI you can also have a single source of truth with "Power BI datasets" or "Dataflows". Another thing is that in Power BI you have the possibility to use "group" in your data model to work with both Import and Direct Query mode at the same time, reducing the size of the .pbix.
Hi Bryan, would you mind clarifying the following questions for me: 1. "As soon as I load my DW into Power BI, can I say that I have my Tabular model implemented?" Did I get it right? 2. "You just use SSAS Tabular (SQL Server Data Tools) to centralize all the data, right?" Thank you.
Hi. Kind of. Power BI does create a Tabular Model. SSAS adds some additional features like perspectives and object security. It's a bit like using SQLite vs. Enterprise SQL Server. However, the Power BI Premium service in Azure is getting lots of enhancements that will provide many of the Analysis Services features right in the service. From a conceptual point of view, PBI Desktop, Azure Analysis Services, SSAS, and PBI Premium all use the Tabular model. Only SSAS supports the older Multidimensional model.
When you build your data model directly in Power BI Desktop, it does create a local tabular model within the Power BI .pbix file. However, with Analysis Services you build a Tabular Model that is more scalable and external to the Power BI project. This allows it to be used from many Power BI reports, Excel, SSRS, and even Tableau. There are also additional features available when you build the Tabular Model in Analysis Services.
Yes. Power BI is a reporting tool, one of many. The DW is the data source. Power BI can come and go but you always need the data. The Tabular Model is just a way to speed things up and support data munging.
Doh! I need to do that one. Do you use or plan to use Analysis Services? Microsoft seems to be moving away from it and building it into Power BI Premium.
Hi Bryan...I am trying to connect Azure SQL DW in Direct Query mode with SSDT and get this error - Failed to save modifications to the server. Datasource has authentication kind OAuth2. This is not supported in DirectQuery 1400 mode. I read in the MS docs that Direct Query mode is not supported with OAuth credentials. Need help.
Hi Ajit, Check this article: social.msdn.microsoft.com/Forums/en-US/a83e3cef-8073-4b78-9f65-23231ac13617/getting-quotnot-supported-in-directquery-1400-modequot-error-with-oauth2-authentication?forum=AzureAnalysisServices From this documentation, it looks like OAuth2 is supported by Azure DW: docs.microsoft.com/en-us/power-bi/connect-data/service-azure-sql-data-warehouse-with-direct-connect Look at the Single Sign-on section. Not sure what your requirements are, but maybe selecting a different authentication method would fix this, if possible.
@@BryanCafferky The 1st article does not suggest a foolproof solution. The workaround suggested is to use SQL Server authentication, but that won't be viable for the end users, as the requirement is to pass the user's credentials (impersonate the user) to the source DB because there is RLS implemented in the source. The 2nd article is for Power BI, and there I am able to connect to the source and build my model in Direct Query mode. Thanks for your reply
thanks Bryan, from Jim in Oz... it helps substantiate the information floating around in my head. Do you have a demonstration showing the use of Power BI consuming the SSAS service?
Awesome video Bryan! A couple of questions about the SSAS piece specifically: 1. You mention the model exists in text form so you can put it into Git. That makes sense, but can you elaborate on some more sophisticated development features? I'm interested specifically in collaboration, having multiple branches / developers editing the same model, merge conflict handling, and setting up multi-stage environments (i.e. dev --> QA --> Production). What are some best practices you've seen? 2. In terms of joining data in SSAS via that spider-web type diagram - does SSAS have support for join logic that might go beyond simple primary to foreign key relationships? i.e. an additional "and" clause, or joining the same table in a model multiple times. Thanks!
Hi Assan, 1. The model is stored in JSON text files, and Git can be used with these. However, the tabular model is basically one large object, and I would be careful trying to do a lot of parallel development on it as there are many dependencies. Automated testing of models is also a challenge. There is a model comparison tool called BISM Normalizer that compares the files and allows you to deploy selected parts. A great video on this is at ua-cam.com/video/44I48ufKhOs/v-deo.html; the BISM Normalizer part is about halfway through. Data-centric projects are coming along fast, but I don't think they are at the level of web/mobile app DevOps. They pose some different challenges. 2. For complex join logic, I would recommend using a view in the source database. This offers advantages in maintenance (you don't have to modify your model) and in performance optimization. Tabular models load the data into memory on refresh (except for direct query), so you get the performance improvement on the refresh as well as a simplified tabular model design process. Plus, other tabular models could take advantage of the view. I don't think the model designer is meant for complex joins. Thanks, Bryan
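A toy sketch of that source-side view approach (hypothetical schema, SQLite purely for illustration): the view absorbs the two-column join, including the extra AND clause, so the tabular model only ever sees one clean table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales     (territory_id INTEGER, source TEXT, amount REAL);
    CREATE TABLE territory (territory_id INTEGER, source TEXT, name TEXT);
    INSERT INTO sales     VALUES (1, 'NA', 100.0), (1, 'EU', 40.0);
    INSERT INTO territory VALUES (1, 'NA', 'North America'), (1, 'EU', 'Europe');

    -- The view handles the compound join condition that a single-column
    -- tabular relationship cannot express.
    CREATE VIEW v_sales AS
    SELECT t.name AS territory_name, s.amount
    FROM sales s
    JOIN territory t
      ON  s.territory_id = t.territory_id
      AND s.source       = t.source;
""")
rows = conn.execute(
    "SELECT territory_name, amount FROM v_sales ORDER BY amount"
).fetchall()
print(rows)  # [('Europe', 40.0), ('North America', 100.0)]
```

Importing v_sales into the model keeps the join logic in one place: the source database can optimize it, and other models can reuse it.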
@@BryanCafferky Great video! I always suspected there was a roadmap to merge the development code between the Power BI service and SSAS Tabular, but it is hard to find much information on it. The March 2020 version of Power BI has a preview of "Enhanced Dataset Metadata". Do you have some information (other than the Microsoft docs), opinion, etc. on this? Thx in advance
@@tomchelle1 I don't have anything on the code bases; I'm not sure Microsoft publicizes how the code is maintained. Slight correction: Azure Analysis Services and Power BI Premium are merging in terms of functionality (not SSAS, which is on-prem). However, Microsoft has been working on this for at least 2 years and it is still in progress. Scale is on the side of AAS. I still like the manageability of a separate AAS vs. merging them, but in time, Power BI Premium may convince me otherwise. This blog has a good explanation. Bear in mind that a preview can last for a long time and features can change. I would not recommend using any preview features in production. powerbi.microsoft.com/en-us/blog/power-bi-premium-and-azure-analysis-services/
@@BryanCafferky Thanks for sharing wonderful information. I know this video is dated 2018. With the Power BI dataset sharing option across reports, doesn't that solve enterprise self-service BI in Power BI Premium? Or is there a real need to go to AAS?
@@Vasanthv123 Thanks. Microsoft might like you to think that, but no. One, that feature is new, in preview, and undergoing changes. You cannot even assume it will be retained in the future. Second, reading the docs, it looks more like a workaround than a clean architecture. Third, AAS provides better scale and a centralized source. The new feature looks like it may be unwieldy to manage. Often new features sound good, but the devil is in the details. I would not rely on a sales pitch. You need to really test it out. See limitations at docs.microsoft.com/en-us/power-bi/connect-data/service-datasets-across-workspaces
Watching this one for a second time... this really makes me wonder: if most of our analysis and reports are on Power BI, why do we have to maintain the star schema models in our EDW in Azure? Thanks.
I look at it like this. Power BI is a data consumer, i.e., a set of reports. The data warehouse is the source of the reports, which needs consistency, accuracy, etc. It supports things like slowly changing dimensions and grain-level data. You could drop Power BI and still use the EDW. In fact, the EDW can and should be used by many reporting and analytics tools: Tableau, Cognos, Data Scientists (Python, Spark, R, Jupyter, etc.). I think Microsoft obscures this by adding so much data-side functionality into Power BI, but I would avoid being drawn into the all-in-Power-BI mindset. Make sense?
This is a clear and nice short demo. One quick question: I have a bunch of measures; should I create them at the Power BI report level or the SSAS Tabular Model level? My Power BI reports are very slow, and I have used a live connection to an SSAS Tabular model. Thank you! Vishwanath
If the measures are useful in different reports, it would probably be better to add them to the tabular model. It would probably perform better too. I assume you are using Power BI Report Server on SQL Server?
@@BryanCafferky Thank you so much for your reply.... My Power BI reports are published in Power BI service and those reports are connected to SSAS tabular model, reports are refreshing data through data gateway. need to check... Thanks again! - Vish
@@vishwanathr3507 If you don't have Express Route, that would probably be too slow. Could you replicate the SSAS model to Azure Analysis Services? It should perform much better there. Or you may be able to upload the model to Power BI Premium which will also perform better.
Hi Bryan, is it possible to do a business view of the data with Microsoft Tabular model. In Cognos, we have a business layer which shows all underlying data in a view that is easy to understand for the business and it mirrors the screens of our transaction system. One entity like a "Registration" in the business layer may have 10 tables behind the scenes, I don't want business to have all this information in 10 different tables in the tabular model.
Hi, Ideally, the data should be in a dimensional model data warehouse structure. Sounds like you are building off an OLTP data model, which results in many tables. You could build SQL views that consolidate the data for the Tabular model. You can also do some restructuring in the Tabular model import, but the SQL view would probably be a better approach. PBI Dataflows: learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-create Be careful of the cost of that. Also see powerbi.microsoft.com/en-my/blog/introducing-power-bi-data-prep-wtih-dataflows/
I learned a lot from your videos and would like to thank you for your efforts. Regarding Power BI dataflows, can we use them with Power BI instead of using Azure Analysis Services with Power BI?
Thanks. Power BI data flows are more focused on sharing data among applications with a Common Data Model (CDM). Analysis Services supports the Power BI Tabular Model, a highly optimized and scalable data format native to Power BI. I think they serve different purposes.
Thanks for asking. Pasted below. Also, at the end of each video several other videos of mine are displayed. Video: What is Analysis Services? ua-cam.com/video/Y0hWQh7EJME/v-deo.html Video: The Tabular Model and Why You Need It? ua-cam.com/video/R7k4qJ120dY/v-deo.html
Hi Bryan, thanks for the nice intro to the different models. I have a question concerning your demonstration in Power BI. You joined geography to territory and territory to the facts. Isn't the geography data completely useless with these facts, because there is no connection?
Thanks. Good catch. Yes. I missed that when I recorded this and saw it when I did a demo recently. The sales data can only be related to the sales territory, and since there can be multiple territories in a geography, it is not valid to use the related geography elements. This is also an example of snowflaking. Power BI can mask this type of thing because it gives you an answer without telling you the relationship is not really valid for reporting. Maybe a video on things to watch out for in Power BI is in order.
This is an excellent video with real experiences shared. Thanks a ton, Sir. Two questions if you can help: 1. Okay, I get that we really need Analysis Services. But notice that at times we have the same model replicated from the warehouse to SSAS, because of what you explained in the video. So the question is: why do we have a data warehouse at all? Can't we directly create the dimensional model in SSAS itself? 2. Do data scientists (not referring to BI folks) use the data warehouse? I get confused when I hear that data scientists work mainly with unstructured data. If data scientists use the data warehouse, then the relationships have already been discovered, right? Because a data scientist is supposed to uncover new relationships in data. So the precise question is: do data scientists use the data warehouse as their primary source?
Thanks for your comments. 1) No. The data warehouse is the foundation upon which reporting solutions are built, and a semantic model like a Tabular Model is part of a reporting solution and specific to a commercial technology, i.e., Power BI. Many different reporting and analytics solutions can be built using a DW. Also, SSAS does not support the constructs needed to create a DW, i.e., row-level data maintenance, surrogate keys, slowly changing dimensions. I made that mistake on my first SSAS MDX model, and it was horrible maintaining the solution. 2) Data Scientists take the data from where it lies and in whatever format, i.e., images, video, sound, structured, unstructured. You don't get to choose. You use what you need to solve the problem. Many times there are data warehouses to support solutions like sales forecasting, fraud detection, etc. But many times they don't exist or can't help solve the problem. Also, a DW explicitly enforces data relationships. This is not discovery; it's built into the design. Discovery is finding relationships like customers who drink coffee buy ten times more sugar than those that don't, or that when product A is presented next to product B on the screen, products A and B are likely to be ordered together. Hope that helps.
@@BryanCafferky Thanks a ton, these are real insights. One last question, which you may reply to or not as per your comfort. Sorry for another last question for now. :) From what point does the Business Intelligence pro's role (not referring to DS) start? Is it from the point of pulling data from the data warehouse, creating the cube, and building dashboards? To be clear with the question, in other words: is it the BI professional's responsibility to get the data from different sources, do ETL, and load it into the data warehouse, or does the BI pro's role start after the data is already in the data warehouse? The reason for my confusion is that I see job postings for data warehousing positions, so I am confused about whose job it is to get data into the data warehouse. Agreed, we may not have strict boundaries, but I am referring to the wide majority of cases. Agreed, no one is stopping BI folks from doing ETL and data warehousing, but in reality, given time constraints, it might not be possible for one person to focus on so many areas. Please guide me on the above.
@@techproductowner Yeah. Great question, and part of the confusion is the evolution of the roles. When I started as a BI Engineer, the role tended to focus on the ETL through to building the DW and possibly a semantic model like the Power BI Tabular Model using SSAS, but did not include reporting or visualizations. This has changed because Big Data and Data Science emerged, but not everyone has caught up. Generally now, someone who extracts the data, develops the ETL, and builds the data architecture is a data engineer (big data or small; could be either or both). BI Developers and similar roles tend to focus on the BI tool, like Tableau or Power BI, and the reports/visuals. This may require building semantic models, but most of the heavy lifting is already done and the emphasis is on the reporting solution. There is much more business analysis in this role. My advice is to largely disregard the titles and analyze what the employer expects from the role. Often they require x, y, and z but consider a and b nice to have (which usually means you'll never do those, but employers want everything). Beware of employers who say the role currently is X but will change to Y. It is usually just to get you in, and it will not transition as promised. That's been my experience. Make sense?
@@BryanCafferky Yes, that is why I am asking you. From your lecture I can sense that you have been there; that's the only reason I thought I should at least try asking you, and thankfully you replied. I am clear now.
Hi, Bryan. First off, great video. Thanks. When you started with Power BI and created your model, I assumed the next step would be that you could pull that same model into Visual Studio (or into Azure Analysis Services) and use it there, enabling various tools to connect with that model. Instead it seems you re-created the model in VS as though you hadn't already made one. Was there no way to import the work you had already done in PBI?
Azure Analysis Services has a new feature in preview that allows you to import a Tabular Model from a Power BI project file. See this link for information on it. docs.microsoft.com/en-us/azure/analysis-services/analysis-services-import-pbix From there you could download and maintain it from Visual Studio.
Hi Joe, Generally, I think Power BI is considered the successor to SSRS. Power BI supports rich data visualizations far beyond the ability of SSRS. Microsoft is all-in on Azure and Power BI is the Azure-based reporting service. Yes, SSRS reports can be migrated to Power BI but this seems to be more for lift and shift than anything else. The bottom line is that if you want to keep your skills relevant, I recommend you learn Power BI. Make sense?
Thanks and yes. I think the next topic might be Slowly Changing Dimensions (Type 1 - 5) as this is often an interview question. What are you most interested in?
@@BryanCafferky Thank you in advance for the answer and the videos. First of all, I want to learn this: let's say we have created our data model. Does this model present the most up-to-date data to the BI tools on each call? Or do we have to deploy every time? Or is there a feature like automatic deploy to keep the data always up to date?
@@kenanunkesti Hi Kenan, There are two ways to define how data is refreshed. One is direct query, which always goes back to the source data to get the latest. This can severely slow things down and should only be done if really necessary. The other is to set a model refresh schedule in Power BI, i.e., tell it how often to refresh the data. This will perform better, and you can set it to refresh as often as you like, but there may be some additional cost, i.e., based on service level. See docs.microsoft.com/en-us/power-bi/report-server/configure-scheduled-refresh Thanks, Bryan
@@BryanCafferky As I understand it, the server will need a Report Server service and cost. You asked me what I wanted to learn in the first question. I will be very happy if you make videos about DAX and the Multidimensional Model. Thank you very much for your effort and answers :)
@@kenanunkesti Just to confirm: are you deploying Power BI reports on-premises or to the Power BI service? Also, may I ask why you plan to use Multidimensional models vs. Tabular? The Tabular model is easier to create and generally performs better.
Hey Bryan! I am facing issues deploying an SSAS tabular model. It takes a very long time to deploy even with calculation mode set to manual and processing turned off. The size of my cube is ~5000 MB. It would be great if you could help with diagnosis.
Hi Jatin, Often the issue is not with SSAS but with the query used to refresh the data. Confirm you can run that query completely in a short time from the same type of connection as SSAS is using, i.e., the same connection speed. Otherwise, this blog may help: stackoverflow.com/questions/39018032/ssas-tabular-model-slow
Maybe I missed it, but how do you define how often the data in the tabular model is refreshed? At 40:45 it seems like you published and it did a one time population of data. I understand it's stored in memory and pulls from views you defined against the source data source, but how/when does it decide to refresh the in-memory model from the data source?
Hi Aaron, You are correct, I did not cover refreshing the data. This is done once the model has been deployed to Analysis Services. It is scheduled, and there are a few ways the model can be refreshed. Using SQL Server Management Studio, you can manually do a refresh. See docs.microsoft.com/en-us/analysis-services/tabular-models/process-database-table-or-partition-analysis-services?view=asallproducts-allversions There are several ways to automate processing the model (aka refresh), with my favorite being Azure Automation. If you have SQL Agent, that's even better. Automated processing on SQL Server: www.sqlshack.com/how-to-automate-ssas-tabular-model-processing-in-sql-server-2016/ On Azure using Azure Automation, see docs.microsoft.com/en-us/azure/analysis-services/analysis-services-refresh-azure-automation You can also use Azure Functions or the Analysis Services REST API. Gee! This might be a good video to do. Thanks!
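For reference, the command behind all of those automation options is a small TMSL (Tabular Model Scripting Language) JSON document. The sketch below builds one in Python; "SalesModel" is a hypothetical database name, and the resulting JSON could be run from an XMLA query window in SSMS, scheduled via SQL Agent or Azure Automation, or sent through the REST API.

```python
import json

# TMSL "refresh" command for a full process of a model. Type "full" reloads
# data and recalculates; other documented types include "dataOnly",
# "calculate", and "automatic".
tmsl_refresh = {
    "refresh": {
        "type": "full",
        "objects": [
            {"database": "SalesModel"}  # hypothetical model name
        ],
    }
}

command = json.dumps(tmsl_refresh, indent=2)
print(command)
```

Scoping the `objects` list to a specific table or partition instead of the whole database is how the partition-level automation mentioned above is typically done.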
@@BryanCafferky Thanks for the really thorough response. I have tried to keep an open mind about tabular models and view them as just a different approach, especially since the Azure SSAS service option is tabular only. The more I learn about how tabular is managed, though, the less it feels like an enterprise solution and more like an ad hoc self-service solution. I can see it certainly can be built out to be an enterprise solution, but it just seems like the implementation and deployment workflow is geared toward someone trying to get something going quickly. I mean, data currency is a given for anything that has viability as an enterprise solution, and it not being a first-class feature, instead requiring external plumbing, gives me a funny feeling.
@@AaronShumaker Yeah. Microsoft is definitely pushing to make Power BI Premium an all-inclusive service but, to your point, I think keeping the model separate by using Azure Analysis Services is more scalable and supportable. Using Visual Studio/Data Tools, you have a BI developer process to create and maintain the models and a separate side to develop and maintain reports. If you go too far with self-service, you end up with questionable reports. Thanks.
Great video, and hi from the Netherlands. I have two questions. Firstly, what's the difference between the tabular model and the snowflake model, given that the model used here looks like a snowflake as well? Also, is it right to say that all imported data in Power BI runs on a tabular model? And if I later create a star schema from the imported data, is it safe to say the model now changes to a star schema? Thanks a lot for your reply in advance, and keep delivering great content.
Hi Kinsley, I think you are confusing a methodology (Dimensional Modeling and the Star Schema) with a technology (the tabular model). The tabular model is a compressed, in-memory table used by Power BI that can be loaded from many sources. It works well with dimensionally modeled data, but that is not a requirement. Dimensional modeling is to data warehousing what the laws of normalization are to a good OLTP design. Make sense?
Great question Andreas. When possible, I think reformatting the data to be consumed by reporting is better; it can be dimensionally modeled, or at least restructured with end-user reporting in mind. Source system (OLTP) data is designed for data maintenance optimization, not reporting, and causes many issues when used as-is. Also, avoid pointing at any production system's data, as this can impact that system's performance. Reporting data sources should be separate from data used by production applications as a best practice. Caveat: there can always be exceptions to any rule, but these are guidelines I use. Perhaps this could be a video. :-) Thanks
@@andreasratz3894 This blog post may help, gist is "Power BI is simply most performant when the data it’s using comes from a dimensional design." www.blue-granite.com/blog/the-importance-of-structure-in-analytics-part-2-the-dimensional-model-and-power-bi
At 16:12 , you mentioned that the snowflake schema doesn’t pose a problem for the tabular model because it converts it into a single large table. However, I’m confused because most sources recommend using the star schema for modeling in Power BI. They claim that SQL queries against the model perform better with fewer joins. How can the chosen data modeling approach not impact performance? Also, Is dimensional modeling used for both tabular models and multidimensional models, or is it specific to one of them? Thank you.
Good catch and thanks! Yeah. This video was a long time ago, and I have since learned the tabular model is a living, dynamic, and not always easy-to-control entity. I tried to delete that comment I made, but if that does not take, I'll take down the video to avoid confusion. On your question, Dimensional Modeling is ideal for both Multidimensional and Tabular models.
@BryanCafferky Thank you, this has clarified many doubts I had. Also, please do not remove the video as it provided a lot of clarification (although pointing it out in the description section may be good). I would kindly ask if you could take the time to answer some of my questions. I am eager to know your opinion on the following: 1. How will the data model impact the performance of tabular models if the queries are run against the compressed version in memory (not the data model itself)? As far as I understand, the structure of the compressed version does not differ between one specific data model and another. 2. I have started reading the book you recommended (Tabular Modeling in Microsoft Analysis Services, Second Edition). Do you think this book is still relevant? (P.S.: I'm a beginner, so any recommendations for resources would be greatly appreciated.) Thank you.
@@alaamohammed3491 This blog by the authors of the book explains why the Star Schema should be used as the source of a Power BI model. www.sqlbi.com/articles/power-bi-star-schema-or-single-table/ They have a web site at sqlbi.org and a lot of excellent training. Some is free and valuable. Their UA-cam channel is at www.youtube.com/@SQLBI I am not a deep-dive expert on Power BI, and these guys helped me realize that what happens under the covers is complex. So my recommendation is to leverage them as a resource.
Yes. It is just a text file. See this for more social.msdn.microsoft.com/Forums/en-US/399824b4-c0e4-457b-9d88-1a30394ecdaa/tabular-model-version-control-visual-studio-2019?forum=sqlanalysisservices
Absolutely yes. In fact, the Tabular Model is much better suited to that than the Multidimensional model. To make your life easier for support and performance tuning, I would try to use SQL views to pull the data together when possible. This makes the model simpler, pushes the query to the backend database (saving Power BI work), provides your model with a layer of insulation from source data changes, and lets you optimize the view's performance independent of the Tabular Model.
Thanks Bryan, it's a good explanation. I have a request: can SSAS be used as a data source in Power BI to build a report and publish it to the Power BI service? Is there any video like this? I would also like to know how to integrate SSAS -> Power BI -> any ERP. Can you please explain this? It would be greatly appreciated.
Hello, how can we connect an on-premises HANA calculation view inside SSDT for modeling? The web version of AAS is discontinued starting March. Thank you. Can you list the steps?
@@sid0000009 See these links: docs.microsoft.com/en-us/azure/analysis-services/analysis-services-datasource and visualbi.com/blogs/microsoft/azure/importing-different-data-tables-sap-microsoft-azure-analysis-services/ Note: Tabular 1400 and higher models only. HANA has a source connector; SAP HANA should show up as a source when you do a data import.
Thanks for the link; yes, I read that before. I want to know whether, in such cases, the computation would happen on the HANA server (being Direct Query) and the result would be cached on the AAS server. Is that how it's expected to work? Is my understanding correct on the topic? It would mean I am using the compute power of my existing on-prem system while utilizing the caching options from Azure to make things faster.
TIP: The authors of the recommended book offer two free courses on their site, www.sqlbi.com/training/ Introduction to Data Modeling for PowerBI, and Introduction to DAX
From the business side, I feel data warehouses and cubes are becoming irrelevant. Connecting directly to OLTP using Power BI or Tableau solves 90% of all requirements.
Hi Joe, OLTP databases are not designed to return query results at click speed. That may work for small datasets but not larger ones. The need for the Tabular Model is growing, and Power BI uses it as its native format, so people are using it whether they know it or not. Also, data warehouses are making a big comeback: look at Apache Spark's Delta Lake and you see data warehousing on Spark. Thanks
I love how you explain the concepts in a bigger vision. I am a fresh grad starting a BI job and this is extremely helpful!
As a non-technical guy suddenly finding myself pushing PBI and trying to transition from individual spreadsheets being loaded from a reporting server, SSAS seemed like a godsend, but also difficult to find info on. I learned so much and had multiple aha moments watching this video. Much appreciated.
Glad it helped. Thanks.
Thank you Bryan! I just started my first job after graduating from university and was informed by my team that we are moving from SSAS to Azure Analytics Services. Your videos are a lifesaver. You provide an excellent explanation of the evolution of data models. I hope you continue making videos.
Thanks. Really glad they are helpful. Note: I think you mean Azure Analysis Services which is just SSAS on Azure but only supports the Tabular Model.
Just want to thank you for your work. I know that to present something complex and make it seem simple and logical is really hard and it requires some real talent. Great work!
Thanks!
Wow! I don't usually leave comments on YouTube, but this was amazing! The real-life examples you shared are very much appreciated! Looking forward to many more videos from you!
Thanks Karan!
Thanks Bryan. For a person who didn't know anything about SSAS, you made me understand it.
I am new to BI and have some familiarity with SQL. I am using your videos to build up the concepts and loving it. Thanks for this.
Do you have any videos to learn about DAX and MDX?
Great to hear!
I have paused the video in the middle to comment... thanks for explaining everything in detail. It really helps me understand.
Thanks. Glad it is helpful.
Wow, superb! I have never seen such a detailed level of explanation in any video so far; this requires a lot of hard work and in-depth knowledge. Thanks Bryan, and keep doing the great work. Big thumbs up for your effort.
Brilliant. Great presentation of Tabular Model and its capabilities. Loved it.
Thank you very much, really cleared the concepts. Looking forward to the next video in the series, please post !!, really appreciate.
Excellent, well-matured explanation. Thanks.
You're Welcome. Thanks for watching and please subscribe.
This was awesome. So well explained. Thank you for your time and effort.
Excellent work my man Bryan, really loved your Analysis Services videos! Thanks for sharing this awesome work experience and knowledge of yours!
Thanks so much Bryan. You explain so clearly, and use simple examples. Like it very much.
Thanks. Glad it was helpful!
The world needs you Brian! :-)
As ever, thanks. The knowledge you share is incredibly helpful in building a meaningful level of understanding of the available tools, services and technologies :)
Excellent video! It's called "M" for the Mashup Engine, the engine behind Power Query. ☺️
So well explained! Thank you sir!
ABSOLUTELY BRILLIANT. thanks a million Bryan
Thank you Bryan, I got Power Query, Power BI and SSAS in Visual Studio all going right now. Your explanations are simple and easy to follow.
Thanks
Googled a lot and found this video. Well explained and easy to follow. Thank you Bryan, this really helps.
Amazing, the way you explained it! I really liked it! Thank you for this video! I was thinking "what the hell is a tabular model?", and I got it! Thanks!
So well explained ! Thank you for doing what you’re doing and sharing knowledge
YW. Thanks for watching.
Thanks a lot, sir. Your lucid explanation taught me a lot.
YW. Thanks for watching.
Great explanation! Very useful. Thank you Bryan!
Where is Part 2? Loved this one :)
Doh! I need to fix this title but this would be the 2nd video.
As well, if possible, please do a video on the management tasks for the tabular model: automating partitions; deploying to multiple environments (Dev, PAT, Prod); and scaling out to multiple users and regions (Azure). ☺️
Great video Bryan.. Explained the details in a simple way.. Can you please share the links for part 2 and 3
Yeah. Been distracted by Big Data tools lately. :-)
This is a great video and gets straight to the point whilst providing a very good overview. Thank you very much!
YW
Thank you very much. Well explained and easy to follow.
Thanks.
Great content and superb delivery. Thank you.
Learned so much from this, thank you!
Glad to hear. Thanks for watching.
My first comment on YouTube goes to this awesome video, Bryan. Loved the way you explained it in detail. Looking forward to your future sessions.
Thanks!
Thank you Bryan! If possible, please do a video on the management tasks for the tabular model: how to create dynamic partitions, automating partitions, and how to automatically identify changes in the data warehouse and then identify which cube partitions need to be updated and processed? Appreciated. Thank you
Great suggestions!
Great video, full of wisdom. Thank you.
Great intro and demo, thanks!
A couple of comments:
1. The Vertipaq engine fits into memory whatever is available to it. If I have a 400 GB compressed model and 8 GB of memory on my machine, Windows swaps to disk, moving required data from disk to memory. It's not entirely in memory, and it's not an entirely pleasant experience.
2. Pre-aggregations now exist in Power BI.
3. Self-service from the ground up, with data sources, models, and DAX, is typically beyond the average user. You cannot casually learn DAX, imho.
There are two ways to query data in Power BI: Import mode and Direct Query, which passes the queries to the back-end sources to get the results. A Tabular Model is imported, and according to the documentation, "When queried or refreshed, the data must be fully loaded into memory of the Power BI capacity. Once in memory, Import models can then achieve very fast query results. It's also important to understand that there's no concept of an Import model being partially loaded into memory." Link: docs.microsoft.com/en-us/power-bi/service-dataset-modes-understand
I did see an old blog about modifying the VertiPaq configuration for on-prem SSAS, but that is not the default, and I did not find anything about it for the Azure-based services. If you have something about that, I'd be glad to see it.
Great video :) but I have some confusion: what is the difference between Azure Analysis Services and SSAS? Is there a video that explains it?
Azure Analysis Services is a service only available on Azure and only supports creating Tabular Models. SSAS is SQL Server Analysis Services, part of SQL Server, on-prem only. It supports multi-dimensional cubes and the Tabular Model. There are other differences in feature compatibility, and Azure Analysis Services has built-in multi-server scale support.
Hey man, thanks for the lesson! Just found your channel and what a find! Keep up the good work.
Glad it helped! Thanks for watching and please tell your friends about me.
Thank you for the useful explanation, but I still have some confusion about Synapse: where can I put it in this pipeline?
If I am using Visual Studio with Data Tools and deploying the model to AAS, what is the benefit of using Synapse?
Put another way, is Azure Synapse an IDE like Visual Studio with more powerful features?
You don't need to use Synapse with AAS. Synapse is a scaled out data platform like Snowflake but most AAS models get data from SQL Server or even flat files.
I have worked for years with Multidimensional and Tabular Models, and I have a very critical question for many businesses: what if, between tables in Tabular, we need to make a relationship on more than one column, for example two columns, where the first is the origin of the data and the second the original primary key? This happens because we may have consolidated data into a staging database from multiple systems with the same structure. In Multidimensional, relationships support multiple columns, in contrast with Tabular. Of course we can merge two or more columns into one in Tabular, but this is not a good idea, especially because the engine is not going to take the database indexes into consideration.
So yeah, the Tabular Model does not currently support concatenated keys, which is what I think you are referring to, and yes, they are quite common. Ideally, during the dimensional model load, you could create a new surrogate key for each table, which is a best practice anyway, then use that to join your tables. It's more work but has benefits.
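That surrogate-key approach can be sketched in a few lines of Python. This is an illustration only; the column names `source_system` and `source_id` are invented, and a real dimensional load would typically do this in the ETL layer or with an IDENTITY column in SQL:

```python
# Sketch: replace a two-column natural key (source system + original PK)
# with a single integer surrogate key during the dimensional load.
def assign_surrogate_keys(rows, key_cols=("source_system", "source_id")):
    """Return rows with an added 'sk' column; identical natural keys share one sk."""
    key_map = {}
    out = []
    for row in rows:
        natural_key = tuple(row[c] for c in key_cols)
        # Reuse the existing sk for a known natural key, else mint a new one.
        sk = key_map.setdefault(natural_key, len(key_map) + 1)
        out.append({**row, "sk": sk})
    return out

dim = assign_surrogate_keys([
    {"source_system": "ERP", "source_id": 101, "name": "Acme"},
    {"source_system": "CRM", "source_id": 101, "name": "Acme Corp"},
    {"source_system": "ERP", "source_id": 101, "name": "Acme"},  # same natural key
])
print([r["sk"] for r in dim])  # -> [1, 2, 1]
```

The point is that the two-column natural key collapses to a single integer column, which is all a Tabular Model relationship needs.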
Great video definitely deserves a like! Thanks Bryan!
Great! Thanks for the feedback.
Thank you so much Brian.
You're Welcome! Hope it was helpful.
This is what I am looking for: why use Power BI vs. Analysis Services.
Then you should consider the following: right now, in Power BI you can also have a single source of truth with "Power BI datasets" or "dataflows". Another thing is that Power BI gives you the possibility to use "group" in your data model to work with both Import and Direct Query mode at the same time, reducing the size of the .pbix.
so good explanation ! thanks!
Hi Bryan,
Would you mind clarifying the following questions for me:
1. As soon as I load my DW into Power BI, can I say that I have my Tabular model implemented? Did I get it right?
2. Do you just use SSAS Tabular (SQL Server Data Tools) to centralize all the data?
Thank you.
Hi. Kind of. Power BI does create a Tabular Model. SSAS adds some additional features like perspectives and object security. It's a bit like using SQLite vs. enterprise SQL Server. However, the Power BI Premium service in Azure is getting lots of enhancements that will provide many of the Analysis Services features right in the service. From a conceptual point of view, PBI Desktop, Azure Analysis Services, SSAS, and PBI Premium all use the Tabular Model. Only SSAS supports the older Multi-Dimensional model.
When you build your data model directly in Power BI Desktop, it does create a local tabular model within the Power BI .pbix file. However, with Analysis Services you build a Tabular Model that is more scalable and external to the Power BI project. This allows it to be used from many Power BI reports, Excel, SSRS, and even Tableau. There are also additional features available when you build the Tabular Model in Analysis Services.
Great video! Thanks for sharing.
YW! Thanks
Thanks for the great video! I have a question, and I hope you'll answer it. With Power BI (the Tabular model), are data warehouses still necessary?
Yes. Power BI is a reporting tool, one of many. The DW is the data source. Power BI can come and go but you always need the data. The Tabular Model is just a way to speed things up and support data munging.
Not to mention you can do source-control comparison for SSAS in clear text, rather than the binary files in Power BI.
Great explanation! Where can I find part 2?
Doh! I need to do that one. Do you use, or plan to use, Analysis Services? Microsoft seems to be moving away from it and building it into Power BI Premium.
Hi Bryan...I am trying to connect Azure SQL DW in Direct Query mode with SSDT and get this error - Failed to save modifications to the server. Datasource has authentication kind OAuth2. This is not supported in DirectQuery 1400 mode.
I read in the MS docs that Direct Query mode is not supported with OAuth credentials.
Need help.
Hi Ajit,
Check this article: social.msdn.microsoft.com/Forums/en-US/a83e3cef-8073-4b78-9f65-23231ac13617/getting-quotnot-supported-in-directquery-1400-modequot-error-with-oauth2-authentication?forum=AzureAnalysisServices
From this documentation, it looks like OAuth2 is supported by Azure DW: docs.microsoft.com/en-us/power-bi/connect-data/service-azure-sql-data-warehouse-with-direct-connect
Look at the Single Signon section.
Not sure what your requirements are but maybe selecting a different authentication method would fix this, if possible.
@@BryanCafferky The 1st article does not suggest a foolproof solution. The workaround suggested is to use SQL Server authentication, but that won't be viable for the end users, as the requirement is to pass the users' credentials (impersonate the user) to the source DB, because RLS is implemented in the source.
The 2nd article is for Power BI, and there I am able to connect to the source and build my model in Direct Query mode. Thanks for your reply.
excellent.
Nice video Man, keep up the good work. Regards from Brazil.
Thanks Bryan, from Jim in Oz. It helps substantiate the information floating around in my head. Do you have a demonstration showing the use of Power BI consuming the SSAS service?
By SSAS, do you mean the multidimensional model?
Awesome Video Bryan! A couple of questions about the SSAS piece specifically:
1. You mention the model exists in text form so you can put it into Git. That makes sense, but can you elaborate more on some more sophisticated development features. I'm interested specifically in collaboration, having multiple branches / developers editing the same model, merge conflict handling, and setting up multi-stage environments (i.e. dev --> QA --> Production). What are some best practices you've seen?
2. In terms of joining data in the SSAS via that spider-web type diagram - does SSAS have support for join logic that might go beyond simple primary to foreign key relationships? i.e. additional "and" clause, or joining the same table in a model multiple times.
Thanks!
Hi Assan,
1. The model is stored in JSON text files, and Git can be used with these. However, the tabular model is basically one large object, and I would be careful trying to do a lot of parallel development on it, as there are many dependencies. Automated testing of models is also a challenge. There is a model comparison tool called BISM Normalizer that compares the files and allows you to deploy selected parts. A great video on this is at ua-cam.com/video/44I48ufKhOs/v-deo.html; the BISM Normalizer part is about halfway through. Data-centric projects are coming along fast, but I don't think they are at the level of web/mobile app DevOps. They pose some different challenges.
2. For complex join logic, I would recommend using a view in the source database. This offers advantages in maintenance (you don't have to modify your model) and in performance optimization. Tabular models load the data into memory on refresh (except for Direct Query), so you get the performance improvement on the refresh and also a simplified tabular model design process. Plus, other tabular models could take advantage of the view. I don't think the model designer is meant for complex joins.
Thanks,
Bryan
@@BryanCafferky Great video! I always suspected there was a roadmap to merge the development code between the Power BI service and SSAS Tabular, but it is hard to find much information on it. The March 2020 version of Power BI has a preview of "Enhanced Dataset Metadata". Do you have some information (other than the Microsoft docs), an opinion, etc. on this? Thx in advance
@@tomchelle1 I don't have anything on the code bases; I'm not sure Microsoft publicizes how the code is maintained. Slight correction: Azure Analysis Services and Power BI Premium are merging in terms of functionality (not SSAS, which is on-prem). However, Microsoft has been working on this for at least 2 years, and it is still in progress. Scale is on the side of AAS. I still like the manageability of a separate AAS vs. merging them, but in time, Power BI Premium may convince me otherwise. This blog has a good explanation. Bear in mind that a preview can last for a long time and features can change; I would not recommend using any preview features in production. powerbi.microsoft.com/en-us/blog/power-bi-premium-and-azure-analysis-services/
@@BryanCafferky Thanks for sharing this wonderful information. I know this video is dated 2018. With the Power BI dataset-sharing option across reports, doesn't that solve enterprise self-service BI in Power BI Premium? Or is there a real need to go to AAS?
@@Vasanthv123 Thanks. Microsoft might like you to think that, but no. One, that feature is new, in preview, and undergoing changes; you cannot even assume it will be retained in the future. Second, reading the docs, it looks more like a workaround than a clean architecture. Third, AAS provides better scale and a centralized source. The new feature looks like it may be unwieldy to manage. Often new features sound good, but the devil is in the details. I would not rely on a sales pitch; you need to really test it out. See the limitations at docs.microsoft.com/en-us/power-bi/connect-data/service-datasets-across-workspaces
Watching this one for a second time... this really makes me wonder: if most of our analysis and reports are in Power BI, why do we have to maintain the star schema models in our EDW in Azure? Thanks.
I look at it like this. Power BI is a data consumer, i.e. a set of reports. The data warehouse is the source of the reports which needs consistency, accuracy, etc. It supports things like slowly changing dimensions and grain level data. You could drop Power BI and still use the EDW. In fact, the EDW can and should be used by many reporting and analytics tools; Tableau, Cognos, Data Scientists (Python, Spark, R, Jupyter, etc.). I think Microsoft obscures this by adding so much data side functionality into Power BI but I would avoid being drawn into the all in Power BI mindset. Make sense?
This is a clear and nice short demo.
One quick question: I have a bunch of measures; shall I create them at the Power BI report level or the SSAS Tabular Model level?
My Power BI reports are very slow, and I have used a live connection to the SSAS Tabular model.
Thank You!
Vishwanath
If the measures are useful in different reports, it would probably be better to add them to the tabular model. It would probably perform better too. I assume you are using Power BI Report Server on SQL Server?
@@BryanCafferky
Thank you so much for your reply....
My Power BI reports are published to the Power BI service, and those reports are connected to the SSAS tabular model; the reports refresh data through a data gateway. I need to check...
Thanks again! - Vish
@@vishwanathr3507 If you don't have Express Route, that would probably be too slow. Could you replicate the SSAS model to Azure Analysis Services? It should perform much better there. Or you may be able to upload the model to Power BI Premium which will also perform better.
This was so so amazing, Thank you so much!!!
You're welcome!
Hi Bryan, is it possible to create a business view of the data with the Microsoft Tabular model? In Cognos, we have a business layer that shows all underlying data in a view that is easy for the business to understand, and it mirrors the screens of our transaction system. One entity like a "Registration" in the business layer may have 10 tables behind the scenes; I don't want the business to have all this information in 10 different tables in the tabular model.
Hi. Ideally, the data should be in a dimensional-model data warehouse structure. It sounds like you are building off an OLTP data model, which results in many tables. You could build SQL views that consolidate the data for the Tabular model. You can also do some restructuring in the Tabular model import, but the SQL view would probably be a better approach. PBI Dataflows: learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-create
Be careful of the cost of that. Also see powerbi.microsoft.com/en-my/blog/introducing-power-bi-data-prep-wtih-dataflows/
I learned a lot from your videos and would like to thank you for your efforts. Regarding Power BI dataflows: can we use them with Power BI instead of using Azure Analysis Services with Power BI?
Thanks. Power BI data flows are more focused on sharing data among applications with a Common Data Model (CDM). Analysis Services supports the Power BI Tabular Model, a highly optimized and scalable data format native to Power BI. I think they serve different purposes.
At minute 1:40 you mention that you are going to share a link to another video. Can you share that link?
Thanks for asking. Pasted below. Also, at the end of each video several other videos of mine are displayed.
Video: What is Analysis Services?
ua-cam.com/video/Y0hWQh7EJME/v-deo.html
Video: The Tabular Model and Why You Need It?
ua-cam.com/video/R7k4qJ120dY/v-deo.html
Nevermind... ua-cam.com/video/Y0hWQh7EJME/v-deo.html
Thanks a million. Great content.
Fantastic, well explained.
Hi Bryan, thanks for the nice intro to the different models. I have a question concerning your demonstration in Power BI. You joined geography to territory, and territory to the facts. Isn't the geography data completely useless with these facts, because there is no connection?
Thanks. Good catch. Yes, I missed that when I recorded this and saw it when I did a demo recently. The sales data can only be related to the sales territory, and since there can be multiple territories in a geography, it is not valid to use the related geography elements. This is also an example of snowflaking. Power BI can mask this type of thing because it gives you an answer without telling you the relationship is not really valid for reporting. Maybe a video on things to watch out for in Power BI is in order.
This is an excellent video, with real experiences shared here. Thanks a ton, Sir.
Two questions, if you can help:
1. Okay, I get that we really need Analysis Services. Now, notice that at times we have the same model replicated from the warehouse to SSAS because of what you explained in the video. But then the question is: why do we have a data warehouse at all? Can't we directly create the dimensional model in SSAS itself?
2. Do data scientists (not referring to BI folks) use a data warehouse? I get confused when I hear that data scientists work mainly with unstructured data.
Now, if data scientists do use a data warehouse, then the relationships are already discovered, right? Because a data scientist is supposed to uncover new relationships in the data.
So the precise question is: do data scientists use a data warehouse as their primary source?
Thanks for your comments.
1) No. The data warehouse is the foundation upon which reporting solutions are built, and a semantic model like a Tabular Model is part of a reporting solution, specific to a commercial technology, i.e. Power BI. Many different reporting and analytics solutions can be built using a DW. Also, SSAS does not support the constructs needed to create a DW, i.e. row-level data maintenance, surrogate keys, slowly changing dimensions. I made that mistake on my first SSAS MDX model, and it was horrible maintaining the solution.
2) Data Scientists take the data from wherever it lies and in whatever format, i.e. images, video, sound, structured, unstructured. You don't get to choose; you use what you need to solve the problem. Many times there are data warehouses to support solutions like sales forecasting, fraud detection, etc., but many times they don't exist or can't help solve the problem. Also, a DW explicitly enforces data relationships. This is not discovery; it is built into the design. Discovery is finding relationships like "customers who drink coffee buy ten times more sugar than those that don't" or "when product A is presented next to product B on the screen, products A and B are likely to be ordered together."
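To make that discovery vs. enforced-relationship distinction concrete, here is a small sketch in Python: computing the lift of two products being bought together, using invented basket data. No schema declares this relationship; it is found in the data:

```python
# Sketch: "discovering" a relationship (lift of buying B given A) that no
# schema enforces. The basket data below is invented for illustration.
baskets = [
    {"coffee", "sugar"},
    {"coffee", "sugar", "milk"},
    {"tea", "milk"},
    {"coffee", "sugar"},
    {"tea"},
]

def lift(a, b, baskets):
    """Lift > 1 means a and b co-occur more often than chance would predict."""
    n = len(baskets)
    p_a = sum(a in bk for bk in baskets) / n
    p_b = sum(b in bk for bk in baskets) / n
    p_ab = sum(a in bk and b in bk for bk in baskets) / n
    return p_ab / (p_a * p_b)

print(round(lift("coffee", "sugar", baskets), 2))  # -> 1.67
```

In a DW, by contrast, a fact-to-dimension relationship is declared up front in the design; nothing has to be inferred.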
Hope that helps.
@@BryanCafferky Thanks a ton; these are real insights.
One last question, which you may answer or not as you are comfortable.
(Sorry, another "last" question for now. :))
At what point does a Business Intelligence pro's role start (not referring to DS)? Is it from the point of pulling data from the data warehouse, creating the cube, and building dashboards?
To be clear with the question:
In other words, I am trying to understand whether it is a BI professional's responsibility to get the data from different sources, do ETL, and load the data warehouse, or whether the BI pro's role starts after the data is already in the data warehouse.
The reason for the confusion is that I see job postings for data warehousing positions, so I am confused about whose job it is to get data into the data warehouse.
Agreed, we may not have a strict boundary, but I am referring to the wide majority of cases as you see them.
Agreed, no one is stopping BI folks from doing ETL and data warehousing, but in reality, given time constraints, it might not be possible for one person to focus on so many areas. Please guide me on the above.
@@techproductowner Yeah. Great question, and part of the confusion is the evolution of the roles. When I started as a BI Engineer, the role tended to focus on the ETL through to building the DW, and possibly a semantic model like the Power BI Tabular Model using SSAS, but did not include reporting or visualizations. This has changed because Big Data and Data Science emerged, but not everyone has caught up. Generally now, someone who extracts the data, develops the ETL, and builds the data architecture is a data engineer (big data or small; could be either or both). BI Developers and similar roles tend to focus on the BI tool, like Tableau or Power BI, and the reports/visuals. This may require building semantic models, but most of the heavy lifting is already done and the emphasis is on the reporting solution. There is much more business analysis in this role.
My advice is to largely disregard the titles and analyze what the employer expects from the role. Often they require x, y, and z but consider a and b nice to have (which usually means you'll never do those, but employers want everything). Beware of employers who say the role currently is X but will change to Y; it is usually just to get you in, and it will not transition as promised. That's been my experience.
Make sense?
@@BryanCafferky Yes, that is why I am asking you. From your lecture I can sense that you have been there; that's the only reason I thought I'd at least try asking you, and thankfully you replied. I am clear now.
Hi, Bryan. First off, great video. Thanks. When you started with Power BI and created your model, I assumed the next step would be that you could pull that same model into Visual Studio (or into Azure Analysis Services) and use it there, enabling various tools to connect with that model. Instead it seems you re-created the model in VS as though you hadn't already made one. Was there no way to import the work you had already done in PBI?
Azure Analysis Services has a new feature in preview that allows you to import a Tabular Model from a Power BI project file. See this link for information on it. docs.microsoft.com/en-us/azure/analysis-services/analysis-services-import-pbix
From there you could download and maintain it from Visual Studio.
Thank you Bryan
You're Welcome! Thanks for watching.
Is Power BI superior to SSRS? Are there better tools than SSRS for business reporting?
Hi Joe, Generally, I think Power BI is considered the successor to SSRS. Power BI supports rich data visualizations far beyond the ability of SSRS. Microsoft is all-in on Azure and Power BI is the Azure-based reporting service. Yes, SSRS reports can be migrated to Power BI but this seems to be more for lift and shift than anything else. The bottom line is that if you want to keep your skills relevant, I recommend you learn Power BI. Make sense?
Direct Query is for when you always need the latest actual data.
Will there be a continuation of this series? I very much hope so...
Thanks and yes. I think the next topic might be Slowly Changing Dimensions (Type 1 - 5) as this is often an interview question. What are you most interested in?
@@BryanCafferky Thank you in advance for the answer and the videos. First of all, I want to learn this: let's say we have created our data model; does this model present the most up-to-date data to the BI tools that call it, on each call? Or do we have to deploy every time? Or is there a feature like automatic deploy to keep the data always up to date?
@@kenanunkesti Hi Kenan, There are two ways to define how data is refreshed. One is Direct Query, which always goes back to the source data to get the latest; this can severely slow things down and should only be done if really necessary. The other is to set a model refresh schedule in Power BI, i.e., tell it how often to refresh the data. This will perform better, and you can set it to refresh as often as you like, though there may be some additional cost based on the service level. See docs.microsoft.com/en-us/power-bi/report-server/configure-scheduled-refresh
Thanks,
Bryan
@@BryanCafferky As I understand it, the server will need a Report Server service, with its cost. You asked me what I wanted to learn in the first question: I would be very happy if you made videos about DAX and the Multi-Dimensional Model. Thank you very much for your effort and answers :)
@@kenanunkesti Just to confirm: are you deploying Power BI reports on-premises or to the Power BI service? Also, may I ask why you plan to use Multi-Dimensional models vs. Tabular? The Tabular model is easier to create and generally performs better.
great video
Hey Bryan! I am facing issues deploying an SSAS tabular model. It takes a very long time to deploy, even with calculation mode set to manual and processing turned off. The size of my cube is ~5000 MB. It would be great if you could help with diagnosis.
Hi Jatin, Often the issue is not with SSAS but with the query used to refresh the data. Confirm you can run that query completely in a short time from the same type of connection as SSAS is using, i.e. same connection speed.
Otherwise, this blog may help stackoverflow.com/questions/39018032/ssas-tabular-model-slow
Excellent 👌
Maybe I missed it, but how do you define how often the data in the tabular model is refreshed? At 40:45 it seems like you published and it did a one time population of data. I understand it's stored in memory and pulls from views you defined against the source data source, but how/when does it decide to refresh the in-memory model from the data source?
Hi Aaron,
You are correct, I did not cover refreshing the data. This is done once the model has been deployed to Analysis Services. It is scheduled, and there are a few ways the model can be refreshed. Using SQL Server Management Studio, you can manually do a refresh. See docs.microsoft.com/en-us/analysis-services/tabular-models/process-database-table-or-partition-analysis-services?view=asallproducts-allversions
There are several ways to automate processing the model (aka refreshing it), my favorite being Azure Automation. If you have SQL Agent, that's even better.
Automated Processing on SQL Server info www.sqlshack.com/how-to-automate-ssas-tabular-model-processing-in-sql-server-2016/
On Azure using Azure Automation see docs.microsoft.com/en-us/azure/analysis-services/analysis-services-refresh-azure-automation
You can also use Azure Functions or the Analysis Services REST API.
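To make the automation options above concrete, here is a minimal sketch of the TMSL "refresh" command those tools (SSMS, SQL Agent, Azure Automation, the REST API) all ultimately send to the server. The database name is a made-up placeholder; substitute your own deployed model.

```python
import json

def build_refresh_command(database, refresh_type="full"):
    """Build a TMSL 'refresh' command that reprocesses a tabular database.

    The resulting JSON can be pasted into an XMLA query window in SSMS,
    scheduled as a SQL Agent job step, or posted by an automation runbook.
    """
    return {
        "refresh": {
            "type": refresh_type,  # e.g. "full", "automatic", "dataOnly", "calculate"
            "objects": [{"database": database}],
        }
    }

# "AdventureWorksTabular" is a hypothetical model name for illustration.
cmd = build_refresh_command("AdventureWorksTabular")
print(json.dumps(cmd, indent=2))
```

A scheduler then just submits this document on whatever cadence you need; the command itself is the same regardless of which tool fires it.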
Gee! This might be a good video to do. Thanks!
@@BryanCafferky Thanks for the really thorough response. I have tried to keep an open mind about tabular models and view them as just a different approach, especially since the Azure SSAS service option is tabular only. The more I learn about how tabular is managed, though, the less it feels like an enterprise solution and more like an ad hoc self-service solution. I can see it certainly can be built out into an enterprise solution, but the implementation and deployment workflow seems lined up for someone trying to get something going quickly. I mean, data currency is a given for anything that has viability as an enterprise solution, and the fact that it isn't a first-class feature, requiring external plumbing instead, gives me a funny feeling.
@@AaronShumaker Yeah. Microsoft is definitely pushing to make Power BI Premium an all-inclusive service but, to your point, I think keeping the model separate by using Azure Analysis Services is more scalable and supportable. Using Visual Studio/Data Tools, you have a BI developer process to create and maintain the models and a separate side to develop and maintain reports. If you go too far with self-service, you end up with questionable reports. Thanks.
Great video and hi from the Netherlands. I have two questions. Firstly, what's the difference between the tabular model and the snowflake model, given that the model used here looks like a snowflake as well? Also, is it right to say every dataset imported into Power BI runs on a tabular model? And if I later create a star schema from the imported data, is it safe to say the model now changes to a star schema? Thanks a lot for your reply in advance and keep delivering great content
Hi Kinsley, I think you are confusing a methodology (dimensional modeling and the star schema) with a technology (the tabular model). The tabular model is a compressed, in-memory table used by Power BI that can be loaded from many sources. It works well with dimensionally modeled data, but that is not a requirement. Dimensional modeling is to data warehousing what the laws of normalization are to a good OLTP design. Make sense?
@@BryanCafferky Makes absolute sense, Bryan, and I appreciate your clarification.
Awesome video - thank you!
How do you connect to the tabular model in Visual Studio 2022?
Hi Bryan.
Do you still need to develop a DWH, or do you connect the tabular model to the original data source (e.g. an ERP system)?
Great question, Andreas. When possible, I think it is better to reformat the data to be consumed by reporting, which can be dimensionally modeled or at least restructured with end-user reporting in mind. Source system (OLTP) data is designed for data maintenance optimization, not reporting, and causes many issues when used 'as is'. Also, avoid pointing at any production system's data, as this can impact that system's performance. Reporting data sources should be separate from data used by production applications as a best practice. Caveat: there can always be exceptions to any rule, but these are guidelines I use. Perhaps this could be a video. :-) Thanks
@@BryanCafferky Thx for your quick response. That was also my suggestion to the customer. Just wanted to make sure, because tabular is new to me ;-)
@@andreasratz3894 This blog post may help, gist is "Power BI is simply most performant when the data it’s using comes from a dimensional design." www.blue-granite.com/blog/the-importance-of-structure-in-analytics-part-2-the-dimensional-model-and-power-bi
At 16:12, you mentioned that the snowflake schema doesn't pose a problem for the tabular model because it converts it into a single large table. However, I'm confused because most sources recommend using the star schema for modeling in Power BI. They claim that SQL queries against the model perform better with fewer joins. How can the chosen data modeling approach not impact performance?
Also, is dimensional modeling used for both tabular models and multidimensional models, or is it specific to one of them?
Thank you.
Good catch and thanks! Yeah. This video was a long time ago, and I have since learned the tabular model is a living, dynamic entity that is not always easy to control. I tried to delete that comment I made, but if that does not take, I'll take down the video to avoid confusion. On your question, dimensional modeling is ideal for both Multidimensional and Tabular models.
@BryanCafferky Thank you, this has clarified many doubts I had. Also, please do not remove the video as it provided a lot of clarification (although pointing it out in the description section may be good).
I would kindly ask if you could take the time to answer some of my questions. I am eager to know your opinion on the following:
1. How will the data model impact the performance of tabular models if the queries are run against the compressed version in memory (not the data model itself)? As far as I understand, the structure of the compressed version does not differ between one specific data model and another.
2. I have started reading the book you recommended (Tabular Modeling in Microsoft Analysis Services, Second Edition). Do you think this book is still relevant?
(P.S.: I'm a beginner, so any recommendations for resources would be greatly appreciated.)
Thank you.
@@alaamohammed3491 This blog by the authors of the book, explain why the Star Schema should be used as the source of a Power BI model. www.sqlbi.com/articles/power-bi-star-schema-or-single-table/
They have a web site at sqlbi.org and a lot of excellent training. Some is free and valuable.
Their UA-cam channel is at www.youtube.com/@SQLBI
I am not a deep dive expert on Power BI and these guys helped me realize that what happens under the covers is complex.
So my recommendation is to leverage them as a resource.
@@BryanCafferky thank you very much.
Can I source control the model file?
Yes. It is just a text file. See this for more social.msdn.microsoft.com/Forums/en-US/399824b4-c0e4-457b-9d88-1a30394ecdaa/tabular-model-version-control-visual-studio-2019?forum=sqlanalysisservices
Can I build a tabular model from a SQL OLTP database, i.e. without the need for Facts and dimensions?
Absolutely yes. In fact, the Tabular model is much better suited to that than the Multidimensional model. To make your life easier for support and performance tuning, I would try to use SQL views to pull the data together when possible. This makes the model simpler, pushes the query to the backend database (saving Power BI work), provides your model with a layer of insulation from source data changes, and lets you optimize the view's performance independent of the Tabular model.
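The view idea above can be sketched end to end. This uses Python's built-in sqlite3 so it is self-contained; in practice the view would live in SQL Server and feed the tabular model the same way. All table, column, and view names here are invented for illustration.

```python
import sqlite3

# In-memory stand-in for the OLTP source database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Orders    (OrderID INTEGER, CustomerID INTEGER, Amount REAL);
    CREATE TABLE Customers (CustomerID INTEGER, CustomerName TEXT, Region TEXT);
    INSERT INTO Orders    VALUES (1, 10, 250.0), (2, 11, 100.0);
    INSERT INTO Customers VALUES (10, 'Contoso', 'East'), (11, 'Fabrikam', 'West');

    -- The view joins and shapes the OLTP tables into one report-friendly
    -- result set; the tabular model imports vSales instead of the raw tables,
    -- so the join work happens in the database, not in the model.
    CREATE VIEW vSales AS
    SELECT o.OrderID, c.CustomerName, c.Region, o.Amount
    FROM Orders o
    JOIN Customers c ON c.CustomerID = o.CustomerID;
""")

rows = con.execute(
    "SELECT CustomerName, Amount FROM vSales ORDER BY OrderID"
).fetchall()
print(rows)  # [('Contoso', 250.0), ('Fabrikam', 100.0)]
```

If the source schema later changes, only the view definition needs updating; the model keeps importing the same shape.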
I want to duplicate a table in the Power Query editor and import the duplicated table, but it doesn't let me do it. Can you help me? Thanks
Sorry. I have not used Power Query much. As you probably found too, most web searches come back with information about Power BI Desktop.
@@BryanCafferky Thank you for the answer.
Thanks Bryan, it's a good explanation. I have a request: can SSAS be used as a data source in Power BI to build a report and publish it to Power BI Report Server? Is there any video like this? I would also like to know how to integrate SSAS -> Power BI -> any ERP. Can you please explain this? I would be grateful.
Sure. SSAS is designed to be used on-premises with Power BI Report Server.
Not sure what the best videos are for this. MS Docs is probably a good place to start.
Which playlist is this part of?
This playlist: ua-cam.com/video/Y0hWQh7EJME/v-deo.html
Hello, how can we connect an on-premises HANA calculation view inside SSDT for modeling? The web version of AAS is discontinued starting March. Thank you. Can you list the steps?
Not sure how that relates to this topic?
@@BryanCafferky - How can we use SAP Hana views in AAS for setting up tabular models.
@@sid0000009 See these links: docs.microsoft.com/en-us/azure/analysis-services/analysis-services-datasource
and this
visualbi.com/blogs/microsoft/azure/importing-different-data-tables-sap-microsoft-azure-analysis-services/
Note: Tabular 1400 and higher models only. HANA has a source connector
SAP HANA should show up as a source when you do a data import.
Thanks for the link. Yes, I read that before. I want to know whether, in such cases, the computation would happen on the HANA server (being DirectQuery) and the result would be cached on the AAS server. Is that how it's expected to work? Is my understanding correct? It would mean I am using the compute power of my existing on-prem system while utilizing Azure's caching options to make things faster.
@@sid0000009 Sorry but I cannot help you there.
Hi Bryan .
How do I implement dynamic row-level security using UPN? I need a tutorial.
Thanks
This post explains it in detail. docs.microsoft.com/en-us/power-bi/desktop-tutorial-row-level-security-onprem-ssas-tabular
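For a rough intuition of what the tutorial above sets up: dynamic row-level security in a tabular model is a DAX filter on a role, typically comparing a user column to USERPRINCIPALNAME(). The sketch below is a plain-Python analogy of that filtering, with made-up column and user names, just to show the mechanic.

```python
# In the tabular model, the role's DAX filter would look something like:
#   [UserEmail] = USERPRINCIPALNAME()
# The engine then restricts every query to rows the signed-in user may see.
# Below, we emulate that restriction over an in-memory table (names invented).

sales = [
    {"region": "East", "owner": "alice@contoso.com", "amount": 100},
    {"region": "West", "owner": "bob@contoso.com",   "amount": 200},
]

def rows_visible_to(user_principal_name, rows):
    """Return only the rows whose owner matches the signed-in user's UPN."""
    return [r for r in rows if r["owner"] == user_principal_name]

visible = rows_visible_to("alice@contoso.com", sales)
print(visible)  # [{'region': 'East', 'owner': 'alice@contoso.com', 'amount': 100}]
```

The real work in SSAS is declarative (the role's filter expression plus the user-to-data mapping table); no application code filters rows.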
Awesome, thank you Bryan!
Thank you Bryan! Your explanation is amazing!
Hi
Can you please let me know, if you are aware, what the annotations in an XMLA script are used for? You will find these in the XMLA script of a tabular model.
Hi, See docs.microsoft.com/en-us/analysis-services/xmla/xml-for-analysis-xmla-reference?view=asallproducts-allversions
Thanks for doing this
Amazing job!
Great video!
Thank you, you made my day....
Nice
TIP: The authors of the recommended book offer two free courses on their site, www.sqlbi.com/training/ Introduction to Data Modeling for PowerBI, and Introduction to DAX
Organized!!!
Very organized and answered a lot of my questions!
You need to zoom in. Can't see.
Yeah, sorry. This was one of my earlier videos and I did not know any good ways to zoom in yet. Thanks for the feedback.
From the business side, I feel data warehouses and cubes are becoming irrelevant. Connecting directly to OLTP using Power BI or Tableau solves 90% of all requirements.
Hi Joe, OLTP databases are not designed to return query results at click speed. That may work for small datasets but not larger ones. The need for the Tabular model is growing, and Power BI uses it as its native format, so people are using it whether they know it or not. Also, data warehouses are making a big comeback; look at Apache Spark's Delta Lake and you see data warehousing on Spark. Thanks