Quite an interesting experience, both the topic itself and the "unplugged" concept in general. Great, thanks!
It was great to see your thought process across multiple options, finally choosing the best-performing solution.
Liked the way you explained the multiple options and their performance using DAX Studio. Great explanation. Thanks a lot for sharing this with the folks!
Great video, good explanation and well explained, thank you. Would be interesting to see what happens if a new product is added to one of the two datasets when using the local model brand table
Great. I tried two of the presented "linked table" scenarios and got good results. However, creating measures across the composite model is another challenge (for me).
Thank you for this video. Great example of composite models!
Great video! I've been using an approach very similar to the 3rd option you introduced here. The only difference is that I was linking the new product table directly to the fact tables instead of connecting it to the two existing product tables.
Great lecture!! Very informative.
This video is excellent. Very useful insights that I've been using for the last year.
However, we're facing an interesting issue. We're importing both dimension tables (brand and product). The brand table, since it has much less data to send to the remote model, is used every time we want to aggregate values at or above that granularity. We use the product table only when we identify that we're at product granularity. PS: in our model, the brand table is also connected to the product table.
Thus, having both tables, connecting brand with an active relationship and product with an inactive one, and switching how the measure is calculated using USERELATIONSHIP() works just fine when all the tables are imported. However, when we live connect to a model and try the same approach, we get an error that looks like a "circular reference" (two different paths to filter the remote model, by local brand or by local product). It looks like Power BI isn't really recognizing that the limited relationship connecting product is indeed inactive (which should make it impossible to have more than one path to filter the remote product table).
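For reference, a minimal sketch of the measure-switching pattern described above. The table and measure names (Brand, Product, Sales, Sales Amount) are purely illustrative, and ISCROSSFILTERED is used here only as one possible way to detect product granularity; the original comment does not say how the switch is detected:

```dax
-- Brand filters Sales through the active relationship;
-- Product is connected through an inactive relationship.
Sales Amount (switch) =
IF (
    ISCROSSFILTERED ( 'Product'[ProductKey] ),
    -- Product granularity: activate the inactive relationship
    CALCULATE (
        [Sales Amount],
        USERELATIONSHIP ( 'Product'[ProductKey], Sales[ProductKey] )
    ),
    -- Brand granularity or above: rely on the active relationship through Brand
    [Sales Amount]
)
```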
Any thoughts about it?
Thanks a lot,
Cheers,
João Martins
This is interesting. Please, can you share a sample file or a picture of the diagram, so I can try to reproduce it? You can contact me through the contact form here, and we'll be in touch by email: www.sqlbi.com/about/ (go to the bottom of the page).
Played at 1.5x speed it was a breeze :D Good work!
Love this trick!
Great video and overall format! Many insights emerge from this 'unplugged' way of presenting things!
My opinion regarding composite models: I believe they will have a great impact! They could even enable using Power BI models as building blocks to evolve and build an enterprise-wide BI solution, rendering the traditional data warehouse obsolete. OK, not 100% obsolete, but anyway... You still of course need performance and history, and you definitely need enterprise-wide governance, data definitions, and data management in general, but this does not mean you need to build a cumbersome data warehousing solution of the past. You combine (or will, as this is still in preview) work that's already done within Power BI with other data elements (within a data lake, for example) in an agile way, avoiding rework and easily capturing changing data and measure definitions at the enterprise level. It will definitely be very cost effective, with greatly reduced time to market. So I really hope all this works out.
For critical company data (financial, balance sheet, sales, HR) the data warehouse was, is, and will be a good investment. Not implementing it for critical data is a technical debt you will keep paying through the lack of data consistency across systems that do not have the same data quality level (assuming you build the data warehouse the right way).
For many other datasets (weblogs, telemetry, competition comparison, market penetration, datasets used for individual projects limited in scope and time), this is a good tool.
Thinking that Power BI and mashup of models can remove the need for a data warehouse is a wrong idea, in my opinion.
Unless the company does not need a data warehouse (small companies often don't need it).
Power BI composite models solve another problem: removing the need to extend the corporate data model for individual reports, without having to duplicate data in many models just to get the desired reports.
Very awesome. Thank you. Totally love the format of the presentation. +1 for unplugged.
Amazing job Marco! Really appreciate your video
Thanks for sharing,
Very useful feature.🔥🔥
Great video, Marco... I support this series and give it five stars for a start 🙂
Most exciting feature to come out in the BI space I’ve seen in a long time. Wish it would work over on-prem Analysis Services, though. I don’t see why they only enabled it for cloud datasets.
It is not possible to use it with SSAS - we would need SSAS vNext (after 2019 - will it be 2021 or 2022?).
How do you hide those 3 options, including the three-dot ellipsis, that appear when hovering over a dashboard on some visuals like cards, slicers, text boxes, tables, etc.? Is there any fancy or creative way of hiding those settings?
Thanks for the video, very useful ;)
295395 at 35:54, but!) video is great anyway! 😊
Great job. Measuring performance for composite models must come with several challenges. Is it time for DAX Studio Premium?
DAX Studio already works. There could be additional features to automate certain processes, but it's too early... we have to practice with composite models first.
Really awesome video!! It helped me with a project I need to do. However, I have a question about creating a table from 4 different tables. My 4 tables don't have the same number of columns, so how can I adapt the script to create my combined table? Please 🙂
Use SELECTCOLUMNS to create the same columns for the 4 tables (just use blank values for the missing columns).
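A minimal sketch of that idea, assuming four hypothetical source tables (TableA..TableD) that share a key but do not all have the same columns; each SELECTCOLUMNS projects the same column list and fills the missing ones with BLANK() so UNION can combine them:

```dax
Combined Table =
UNION (
    SELECTCOLUMNS ( TableA, "Key", TableA[Key], "Amount", TableA[Amount], "Category", TableA[Category] ),
    SELECTCOLUMNS ( TableB, "Key", TableB[Key], "Amount", TableB[Amount], "Category", BLANK () ),
    SELECTCOLUMNS ( TableC, "Key", TableC[Key], "Amount", BLANK (), "Category", TableC[Category] ),
    SELECTCOLUMNS ( TableD, "Key", TableD[Key], "Amount", BLANK (), "Category", BLANK () )
)
```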
Thanks for a great session, I have a couple of questions.
1. Did you configure a gateway earlier for the published report? DirectQuery requires a gateway, right?
2. If we try 'ProductKey' in the same way as 'Brands', will we get the same results?
No, we are using DirectQuery over a Power BI dataset - no gateway is required.
I don't understand the second question.
Hi Marco. Do you have a video/tutorial that explains the possibility of 'importing' an Excel Power Pivot model into Power BI, but with the ability to keep updating my original Excel workbook and 'push through' to Power BI the periodic updates that I make in the original Excel workbook?
Thanks
Martin ( South Africa)
Cool... are there any limitations on the number of datasets that can be brought in as part of the composite model?
Microsoft does not provide a number, but it is probably not a good idea to have too many external datasets. Do you have use cases for more than 10 datasets?
Thanks for the video, Marco.
You said that the report has a shadow copy of the other datasets, but what if I revise the original dataset? Does the mashup report have the ability to refresh and get the new tables/columns etc. from the origin?
Yes, but you have to refresh the local model (also called the proxy model).
Hi Marco, I just started exploring the new composite models and I am wondering whether it is better to have one big model vs. a few small models linked with a master dimension (calculated table). Using composite models, we may end up having a few other calculated tables if there are more dimensions related to Receipts and Sales. Thanks
Very helpful video, but what about combining two different SAP sources into one single dataset or model? I've read this is possible as long as they are not OLAP databases, but I didn't find much information about how to do it.
Hi, I have 4 different shared datasets: datasets 1, 2, 3, and 4. RLS is applied on dataset 3. I imported all of these and built a report. The report is embedded in an application. It works in the Power BI service, but it does not open in the application. Please can you help me with this problem? I am using SQL Server as the data source.
Amazing as always... Did you notice any decrease in performance when using DAX in composite models? (Imported CSV files and a live connection to a Power BI dataset.)
It depends - we discuss performance in the video; an imported CSV is like any other "local" table.
Thanks Marco for the great video! With the small mistakes and side effects it indeed feels more "real". Does the order of the models you connect to matter (1. Sales, 2. Receipts, or vice versa)?
Not much, the cardinality of tables/relationships is more important.
Do you expect the blank row handling behavior to change as the feature matures? Maybe my expectations are off; however, dropping rows and needing to build local, limited versions of the common dimensions seems to restrict the usefulness of this feature.
Awesome content as always.
It's a design problem more than a feature issue. Think about doing the same in SQL with different databases: you would face very similar problems. Because of the nature of the semantic model, you choose one column of one table in the report, and you get that. The relationships traversing different datasets in Power BI don't have the additional blank row you have in relationships that are internal to the same data source. I don't think this will change; the cost of managing that (in performance) would be very high.
Great Video
Thanks for this video. Question: how do you handle scenarios where one table has RLS and the other doesn't? I have a new model like this, but I am unsure how to propagate security to the users so that it applies to the "unprotected" data the same way as to the protected data. Do you create a new security role in the new model, or make the security table filter out the unsecured data with bidirectional filtering set to filter the unsecured table? Or is there another option?
Shared with the rest of my dev team, btw.
The RLS story is not implemented yet in the first preview. We have to wait for upcoming updates. In general, you will not be able to see data you are not authorized to see just because you use a composite model. However, we still don't have all the details about how it will work in the final release. Let's wait a few weeks.
If I have row security in the original models, does the composite model honor the row security implementation of the original models?
Yes - the RLS implementation for the local model is not complete yet, but existing RLS should already work fine.
Thanks Marco. Quick question hopefully. Can one use RELATED with these weak relationships?
also wondering about time intelligence :)
You can use RELATED through a many-to-one relationship. Performance can suffer over weak relationships. Same for time intelligence, with a big "it depends" because there are many variables to evaluate the impact on performance (which could be minimal, but... it depends!).
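A minimal sketch of RELATED used through a many-to-one relationship, inside a row context on the many side; the Sales/Product table and column names are purely illustrative:

```dax
-- For each Sales row, RELATED reads the list price from the related Product row.
Sales at List Price =
SUMX (
    Sales,
    Sales[Quantity] * RELATED ( 'Product'[List Price] )
)
```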
@@SQLBI Thanks for the reply SQLBI
admire you guys!
Thank you for this video. Now that it is more than a year old, I am interested to know whether there has been any performance improvement in how these direct queries work, because the way it is described in the video rather looks like a gentle warning that with anything other than really small cardinalities the composite model is not likely to perform well. I have a composite data model with table cardinality in the range of 10,000, some moderately complex calculations, and response times of up to 30 seconds (or even more). I am about to test it by merging the data models, because from the video I see I was probably building the composite model the right way, so I cannot hope to find just a single error that I could simply correct to radically improve response times. (Until the end of the video I was hoping for some kind of surprise solution that would do much better.) So if there is any new guidance on this, it would be great to hear about it. Thanks!
While there have been some improvements, the architecture is what you have seen - when a relationship crosses data source boundaries, keys must be moved one way or another...
@@SQLBI Thank you very much!
Very informative video. Now, can you make a similar video about consolidating a dimension that already has very high cardinality, maybe in the millions? Is using a calculated table with unions still a good option? I think this new table will not be compressed, right? What are the options then?
The table is compressed as any other table, but it has to be materialized in an uncompressed state when you refresh the model (before being compressed).
I don't understand the remaining part of the question, though. Please, can you clarify?
Hi Marco,
While creating composite models, everything works fine, but if I change the source of the model from DEV to UAT, then all the measures created within the Power BI file are missing. Is this expected or a bug? I tried refreshing the Power BI file as well, but still no luck.
Thanks
Not sure - you should contact Microsoft for that. Because the feature is in preview, write on the Power BI Desktop forum: community.powerbi.com/t5/Desktop/bd-p/power-bi-designer
Thank you so much for this wonderful video. I have a question along the same lines.
Hi,
Can I have a composite model where one data source is a Tabular model (on-premises) and Excel or any other RDBMS is the second source?
Can you help me with some tips on how to do that?
You can create composite models connecting only Power BI datasets or Azure Analysis Services databases. On-premises Tabular models (SQL Server Analysis Services) are not supported.
I tried, but it does not allow me to load the 2nd dataset even after adding the new preview feature. Does this only work for Premium users? I am a Pro user and own both datasets.
No, it should also work for Pro users.
What is GROUPCROSSAPPLY()?
Another question: will we see all the data of a table from the live connection if we create a calculated table, e.g., with SELECTCOLUMNS() over the live tables?
GROUPCROSSAPPLY is a new function (still undocumented) used to manage queries in composite models.
Yes, you can see the data of the remote table if you create a calculated table, but this copies the entire table locally. If you just want to take a look at the content of the remote table, you might want to query it using DAX Studio.
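Two minimal sketches of what the reply describes, with an illustrative remote table name ('Product'): the first is a calculated table that copies a projection of the remote table into the local model, the second is a DAX Studio query that only inspects the remote data without copying it.

```dax
-- Calculated table: materializes the remote data locally.
Local Product =
SELECTCOLUMNS (
    'Product',
    "ProductKey", 'Product'[ProductKey],
    "Brand", 'Product'[Brand]
)

-- DAX Studio query: just browse the first 100 rows of the remote table.
EVALUATE
TOPN ( 100, 'Product' )
```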
How does RLS work with this? I connected to a Power BI dataset and an Excel import. Users can only see the visual based on the Excel import; all the other visuals are broken.
RLS is not fully implemented yet in the preview; you have to wait for an upcoming update.
You are amazing, man!!!
Thanks Marco! Really enjoyed your in-depth video about composite models. I would appreciate if you could answer the follow-up question.
I see in the video you used the UNION DAX function to calculate the Products and Brands tables. But do you know if Microsoft will allow us to union (append) those 2 tables in Power Query? The same goal would be achieved, but then we wouldn't have to load the 2 tables from the Receipts and Sales sides, which would make the data model lighter. In this case the 2 dimension tables, Products and Brands, are not that big, so the DAX query is efficient, but I can imagine that in some cases we want to load the final combined tables upstream and disable loading of the "hidden" big tables.
Or maybe a more general question, do you know if Microsoft allows to edit the composite models inside PowerQuery?
Grazie mille!
If you can use Power Query to access the data source, you have fewer reasons to create a composite model - it would be more efficient to load all the data in the same model. Technically, you could access the two models in Power Query through the XMLA endpoint if you have Power BI Premium or if the two models are hosted on Azure Analysis Services - but as I said, it doesn't seem an efficient architecture to combine both technologies (composite models and Power Query accessing remote datasets).
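For reference, a minimal sketch of the kind of DISTINCT/UNION calculated table the question refers to, combining the brand values coming from two remote models; the remote table names ('Sales Product', 'Receipts Product') are illustrative, not taken from the video:

```dax
Brands =
DISTINCT (
    UNION (
        SELECTCOLUMNS ( 'Sales Product', "Brand", 'Sales Product'[Brand] ),
        SELECTCOLUMNS ( 'Receipts Product', "Brand", 'Receipts Product'[Brand] )
    )
)
```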
Ciao, why can't I see the option in Preview to turn this feature on in the Modeling tab?
You need Power BI Desktop December 2020.
Thanks Marco, really well done and I learned a lot!
As always, great content, thank you very much!!!
I have one question: can DAX Studio clear the cache from the source or only the localhost?
If it cannot clear the cache of the published dataset(s), running the query benchmark produces warm-cache queries from the SE perspective.
Yes it works!
@@SQLBI just to confirm, are you saying that clearing the cache in DAX studio will clear the caches of the local AND remote models? Or just the local model?
Marco, regarding the docs: "Some queries may return wrong results when there's a relationship between calculated tables and table(s) in a remote source. Creating calculated tables over a remote dataset isn't supported, although it isn't currently blocked in the interface."
docs.microsoft.com/en-gb/power-bi/connect-data/desktop-directquery-datasets-azure-analysis-services
You act as if there is no such limitation. Does it mean that the limitation will soon be lifted?
Very interesting point. We will investigate, because if calculated tables cannot be created, it is a big limitation to solve the scenario described in this video. Thanks for the comment!
@@SQLBI Matt Allington encountered the problem with calculated tables recently:
exceleratorbi.com.au/the-most-important-update-to-power-bi-desktop-ever/#comment-85883
Good news: Microsoft confirmed that by the end of the preview, calculated tables over a remote dataset will be supported. There are possible bugs now in preview; the message was intended to warn only about that situation.
@@SQLBI Today is 2023-04-17. This feature is now GA, but the model still cannot be refreshed in the Power BI service because of the presence of a calculated table. The refresh ends with "Refresh is not supported for datasets with a calculated table or calculated column that depends on a table which references Analysis Services using DirectQuery." I tested the feature on newly created Pro & PPU workspaces.
Today is 2024-05-22. MS finally released an update that allows the refresh in the service. It took more than 3 years to lift the limitation.
Mitic
First