The best explanation so far regarding this awesome feature! I always like Alberto's and Marco's style of digging deeper into new stuff and analyzing its strengths and weaknesses! Thanks a lot Alberto!
"Hurry up, don't change everything today but start learning because this is really going to change the way you look at your business models" :) Thanks for the video!
Thank you, this has really helped me understand the potential and how to start exploring best practices in using the new features. This is game-changing and what so many businesses have been asking for.
Amazing! Really interested in implementing this: having a full enterprise model managed by global IT teams while allowing local teams to extend the model as they need. Thanks a lot for this video, with a very good explanation and demonstration of this feature!
This comes just as we got the go-ahead for attempting full coverage of our BI with SSAS. I'll be looking forward to the new best practices for tabular models to take advantage of it!
This is the killer feature. Sorry to bring up OBIEE again, but the inability to standardize your organization's semantic model was a HUGE drawback of Power BI in comparison with industrial BI solutions such as OBIEE and Cognos. THIS solves that problem. Now we can finally hand metadata stewardship to more qualified analysts and developers, while keeping Power BI's unmatched front-end flexibility in the hands of advanced users. Separation of duties is a big thing in large companies.
I would normally connect to dataflows instead of datasets precisely because that seemed the only way to add other tables and relationships (with the drawback of having to start with no base model at all each time). This new feature does sound revolutionary! Then the question is: are there any use cases left for dataflows?
Whenever you want to separate the transformation from the data model. For example, you can easily have a Product table shared across many datasets. If the transformation is complex, you can keep it in a single place using a dataflow.
Thank you Alberto for this amazing demonstration! Off-topic: what is the model of the microphone you're using? I find it really convenient for recording screencast videos.
Hi Kay, I am still testing several ones. Because I have seven monitors in front of me, there is a lot of echo coming back and I need a very close mic. In this video I am using a PROEL condenser mic (HCM23SE), even though I am not really satisfied with the audio results. I plan to experiment with different headsets in the coming weeks... I am in the process of learning the secrets of audio tuning! :)
Great video!!! I have a few questions... Where is the data from Excel stored? Is it stored in a new Power BI dataset, or do you have to upload the new composite model as a new dataset by publishing it?
Only Azure Analysis Services. Analysis Services on-premises is not supported. They could support it in the future, but that would mean deploying a big update to the on-premises version (perhaps in SQL vNext?).
Great video Alberto. Any thoughts on how calculation groups would work when combining two datasets? E.g. a main dataset with sales and calculations for time intelligence, combined with website data that should use the same time-intelligence calculations.
Thanks for this, I had been looking for that way of doing running totals with percentages for a long time. Can you tell me how you can filter this? If I add a filter on a different column, the filter does nothing; it only filters when I add a column to the filter that also contains the calculated columns added for the running-total calculation.
Amazing!! I have been waiting a long time for my end users to stop hearing "We can't include that shitty local Excel file in our robust enterprise model, even though it is important for you."
Hi Alberto, thanks for the excellent presentation on the new feature. What happens when business users need their freshly updated SQL data (the storage model) reflected in a Power BI report? Since you recommended refreshing the semantic layer: a Power BI shared dataset can only be refreshed once every 30 minutes unless done externally. Or would you prefer to connect to Analysis Services, have it reach SQL via DirectQuery, and have Power BI connect to it? Please let me know your thoughts.
It really depends. You can refresh more often with Power BI Premium; using DirectQuery has a big impact on performance, and it is usually acceptable to have some latency because of the refresh, obtaining better scalability and response time in exchange. If you really need real-time data you have to use DirectQuery, but you should be aware of the data consistency issues in the report (if you run a report during a refresh, you can get different numbers in visuals that should show the same measure in the same report, for example a line chart and a table). Real-time update is a tough business.
Thanks for this overview of this amazing new feature. I would like to use only one table from my cloud model. It is a calculated table that aggregates the "result" of my cloud model at just the right granularity. Until now I have exported this table as an Excel table using DAX Studio and imported it into my local model that works with this result. At first, composite models seemed a much more elegant solution for this scenario. Testing it, though, I always have to import the entire cloud model with all the tables and all the complexity. This also clutters my local model with a lot of tables that I do not need. Any advice on best practices here?
Try the latest version of Power BI Desktop: you can select the tables to include. Hint: create a local table first, then connect to the external model. At this point you can select the tables to import.
This is a cool feature. Does it create a brand new model on top of the existing analysis model in Power BI? That would mean it also needs to be refreshed frequently for any dataset (Excel, SQL Server, etc.) added by the user.
This is really huge; I have been trying it since yesterday and I am really amazed. After going through the documentation, I learned that a chain of three models is the limit in the preview. Now, if we think of an organisation with multiple analysts creating their own models on top of the enterprise BISM, will it deviate from the golden dataset? And will the admin be able to control the publishing of these local models? Thank you for taking this up. This brings up lots of questions related to data islands, weak and strong relationships, and other existing modeling and DAX concepts (will they continue to behave the same as today?). Curious to learn about this... I will look forward to things around DirectQuery against the semantic model.
We are just at the beginning and there are bugs and lessons to learn. Moreover, the feature is in preview; expect changes before its release. In general, we think that easy tasks like importing an Excel file, creating an additional calculated column, and renaming a measure will not have particular issues. Creating mashups with two or more models and adding complex relationships between models, interacting with calculation groups, managing security (not supported in this first preview)... well, that's another story. We need to try, study, and learn.
@SQLBI Yes, agreed, this is just the beginning and a lot more will be coming out of the box. Many things to learn on the way. Appreciate it and welcome the new beginning.
@SQLBI Thanks a lot Alberto! Can we imagine creating different models that interact with each other in a modular way? What do you think about providing users with simple models that they can compose in a scalable way? Does that overlap with the dataflow concept?
Love this new feature, but it leaves me wondering a bit. Doesn't this pretty much make dataflows obsolete already? Why should I use a dataflow instead of this solution?
A dataflow splits the business logic of a transformation from the dataset. It can be useful to reuse the same transformation logic in different models. A dataset is tables + relationships + measures; it's a higher-level concept. If you used dataflows as a way to share tables between datasets, then yes, you used a temporary solution and this is better. However, you might still end up needing to reuse the same table (a dimension?) in different datasets (connecting a dimension in one dataset to a fact table in another dataset is intuitively a bad idea, even though we need practice to evaluate the pros and cons). Thus, you might still want to use dataflows. Each tool is just a tool: it doesn't have to be used just because it exists, but when it's useful!
You can find free content (including videos) from this list: www.sqlbi.com/topics/composite-models/ And more content on SQLBI+: www.sqlbi.com/p/plus/ www.sqlbi.com/whitepapers/composite-models/
Thanks so much for the video. I watched it when it first came out... and now again for the third time! Yes, indeed a game changer. I wonder what your opinion is now that we are 10 months in from the first launch; it would be great to understand your learnings, pros, and cons.
There should be a limit on the number of indirections; it seems to be two or three based on the first tests we made, but it could change and we haven't run extensive tests yet.
@SQLBI Thanks. I like the idea of being able to distribute load (and hence costs). If you have three servers making up a composite model, you could arrange it so the most accessed parts of the model reside on the highest-tier server, while data that you are happy to wait for can sit on the lowest, cheaper tier. From the user-experience point of view there is still only one connection point, and it presents as a massive integrated model that still performs where needed.
Amazing video Alberto! I learnt a lot from your videos, as always. A question though: I couldn't figure out how the DAX of the ABC Amt RT calculated column works. I did a lot of research on the function but still couldn't figure it out. Could you please explain a bit, or suggest a document I can read? Thank you!!!
See these patterns: www.daxpatterns.com/cumulative-total/ www.daxpatterns.com/abc-classification/
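For a rough idea of what a column like ABC Amt RT computes, here is a minimal sketch in the spirit of the linked patterns (the 'Product' table and the [Sales Amount] measure names are assumptions; the full, tested versions are in the patterns above):

```dax
-- Hypothetical calculated column on the Product table.
-- For each product, it sums the sales of all products whose
-- sales amount is greater than or equal to the current one,
-- producing a running total ordered by sales amount.
ABC Amt RT =
VAR CurrentAmount = [Sales Amount]  -- context transition: current product's sales
RETURN
    SUMX (
        FILTER (
            ALL ( 'Product' ),
            [Sales Amount] >= CurrentAmount
        ),
        [Sales Amount]
    )
```

Dividing this running total by the grand total of sales gives the cumulative percentage used to assign the A/B/C class.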
Excellent video on an excellent new feature of Power BI, indeed! I am also grateful for all the tutorials you are creating and sharing. However, my question refers to the ABC classification example. I have a model with a product table (16k rows) and another table with locations (cities) belonging to different countries, and countries belonging to different business units (1 BU can be 1 country, but there are cases where 1 BU means 4 countries together). What is your advice for an ABC classification that respects the filtering by country and/or by BU? If I add calculated columns to the product table as you suggested, I would not get a correct classification. Many thanks for the feedback!
I suggest taking a look at the ABC Analysis pattern at daxpatterns.com. There you'll find plenty of options. In your case, I think you will like the dynamic ABC or the snapshot ABC. Both should work well in your scenario.
@SQLBI Many thanks for the feedback and the tip. It looks like the snapshot ABC would be the best option for me, so I will spend some time reading it carefully and implementing it. Mille grazie!
Great! What happens if I connect two datasets, each with its own date table (to simplify, with the same structure)? Remove one table and recreate the relationship, or create a bidirectional relationship between the two?
It's too early to say; we are experimenting. There are several pros and cons in each scenario, and there are many combinations:
- a local copy of Date connected to the two Date tables in the remote datasets
- use one Date table and hide the other, using a single-direction filter
- show both Date tables and create a one-to-one (bidirectional) relationship
On top of these solutions, you have options for the granularity of the relationship between datasets: Date, Month, Year? It depends; you have to balance granularity of analysis with performance. At the beginning, start with simple things, like adding a local table or a calculated column. Mashing up two or more remote datasets is unexplored territory; nobody has the experience to define best practices yet.
We use "new composite models" to remove the ambiguity with "DirectQuery for relational data sources". When you connect to a published dataset with a new composite model (officially "DirectQuery for Power BI dataset and Analysis Services") you can expect the same performance of Live connection, which is orders of magnitude better than "DirectQuery for relational data sources" (like DirectQuery for SQL Server).
Very informative. With this new feature (assuming it works as expected), does it mean we don't need SSAS or even Azure Analysis Services? We could take raw data (from an ODS, for example), build a virtual operational data mart, then load that data via import mode into a corporate Power BI dataset (which might combine other data marts) and make it available to business users (especially with Power BI Premium)? Or would it make more sense to have a physical operational data mart and load it through DirectQuery into the corporate Power BI dataset for further business usage? The last option doesn't make sense to me, but I might be missing things :) I am just trying to picture the enterprise data architecture with Power BI and ODS/data marts on MS SQL Server. We don't plan to use AAS since we plan to have PBI Premium.
It's still too early to say, but this tool is not a way to avoid ETL/data cleansing and a good architecture. You can certainly create reports integrating additional data, but creating complex mashups with multiple data sources is not a good idea. You would always pay a technical debt in data quality and performance.
@SQLBI I mean, I know it is not possible to use time intelligence functions with DQ. My question is: is it possible to use time intelligence functions with a model that has both DQ and import mode?
You CAN use time intelligence functions with DirectQuery (docs.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about). You just have to add a Date table to the model, which is a best practice for any model. See ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Bzruqrj-wZg.html and ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xu3uDEHtCrg.html
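To make this concrete, a minimal sketch (Sales[Amount] and the 'Date' table names are assumptions): once the model contains a proper Date table, marked as a date table and related to the fact table, standard time-intelligence functions work the same way whether the tables use DirectQuery or import storage mode:

```dax
-- Base measure over the fact table
Sales Amount := SUM ( Sales[Amount] )

-- Year-to-date using the Date table instead of auto date/time
Sales YTD :=
CALCULATE (
    [Sales Amount],
    DATESYTD ( 'Date'[Date] )
)
```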
It depends on the cardinality of the relationships between different data sources. If you use a remote dataset (not a DirectQuery connection to SQL) the performance is relatively good in that case. DirectQuery over a relational database was, is, and will continue to be much slower.
Now, in April 2022, I can appreciate how powerful this is, because we seem to be denied it with Live Connect to SSAS, and that is positively disabling. I'm stuck with the model's design, even for adding simple slicer columns such as Price Range. Am I missing something? Is there some option I need to enable? (HELLLLPPP!)
HOLY COW, I just built a report on TWO cloud datasets! It's slow, but it's working. A very promising feature! The bad things that will happen: all the display folders will be gone, naming conventions need to be in place (I still don't know what happens on a naming conflict), and I couldn't find a way to disconnect from a dataset, as they don't appear in Power Query.
After you pull in your own data and it is local, do you then publish it back up to your tenant? I have seen a bunch of videos like this, and it's great that you can do this, but they never explain what to do after you merge the data. From what I understand, you just keep it local?
The local data stays local. You can publish the second model to your tenant, but also in that case you still have two separate databases (they could even be in different tenants, assuming you have authorization to connect).
@SQLBI Is it common to do that? Would you need to publish the model to a different tenant? What happens if you publish back into the main tenant the connected dataset came from? Would that add the Excel file to that dataset? Thanks!
Thank you so much Alberto, this is amazing. A side question: SQLBI normally advises against using many-to-many. When you added the budget to the model and added the relationship, is there an alternative best practice, or is this a time when it's acceptable to use many-to-many?
In this case it is acceptable. However, we strongly advise avoiding the bidirectional filter. A many-to-many cardinality relationship with a single-direction filter is usually fine, as long as you are aware of the side effects on measures used below the cardinality of the relationship.
@SQLBI Thank you. When you state "below the cardinality of the relationship", I presume you mean further away from those tables. So if you have fact tables for your product and product budget on one side of the model, might the issue start to appear when you want to look at the budget from another, completely separate dimension table that holds no direct relationship to the product/product budget tables?
By different granularity I mean that with a many-to-many cardinality relationship at the product category level with the budget, you cannot split the budget by product name (but you can split the sales by product name).
@SQLBI I think I'm beginning to understand this point now. Is there any visual I can look at to drive this home more simply? I feel this is a very important point and want to make sure I fully appreciate your comments.
You can start here: www.sqlbi.com/articles/budget-and-other-data-at-different-granularities-in-powerpivot/ www.sqlbi.com/articles/working-below-a-dax-formulas-granularity/
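To illustrate the granularity point with a sketch (the Budget and Product table and column names are assumptions): when the budget is stored by product category and reaches the model through a category-level many-to-many relationship, slicing by product name cannot split the number further, so every product of a category shows the same category-level figure. One common mitigation is to blank the measure below its granularity:

```dax
Budget Amount := SUM ( Budget[Amount] )

-- Hide the value when the report filters below the
-- granularity of the Budget table (product category).
Budget Amount Safe :=
IF (
    ISFILTERED ( 'Product'[Product Name] ),
    BLANK (),
    [Budget Amount]
)
```

The linked articles cover more elaborate options, such as allocating the budget down to the product level.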
The composite model is great, but I cannot refresh the report because it has calculated tables and columns, which seems to be a limitation when refreshing from the Power BI service. Any help is appreciated.
@SQLBI When you copied the line in the SWITCH function, I noticed your cursor was on the last letter; then you highlighted the whole line and pasted. Is it Shift+Down Arrow?
What version of Power BI Desktop are you using? It doesn't work with mine. I am using an Analysis Services live connection and I can't add anything else, like importing Excel.
Excellent video... I just tried this with my model (using dataflows). As soon as I enabled DirectQuery, I lost the date hierarchy in all of the date fields in the table. Does anyone know why this happened, and can I fix it?
It could be a bug or not: the feature is new and in preview, there are known limitations published in the release doc, and there are probably several bugs. However, if you are using auto date/time (as it seems from your description), you should stop now! :) See ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Bzruqrj-wZg.html
In your beautiful and important videos, I have not yet found something that finds the maximum within a range of values. For example, given a date field and a value field, how to find the maximum of the value field within the previous 10 days, referencing the date field. Thank you for your attention; greetings from Turin.
Can we create a composite model with one dataset as a live connection and another one as an imported table, when both queries fetch data from a single data source? For example, an Oracle database.
Hello, there may be a big catch! Let me explain: today I always encounter a very annoying issue with PBI Desktop when I use a table from a query (let's say Table1) to create other tables (a crossjoin, or simply Table2 = Table1). Whenever you create a new column on Table2, you find yourself with a "not enough memory" error when updating Table1 with more rows. I'm pretty sure that when the model in the cloud increases its number of rows you will encounter the same problem, obliging you to remove all the columns before updating and put them back again after each update. Please let me know. Christian
As usual, you have to be aware of what is going on when you create composite models. Take a look at this other video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9lkVk4t2qL0.html
You need the latest version of Power BI Desktop and you have to click on "Make changes to this model" button in Modeling ribbon. If you don't see that button and you have Power BI Desktop December 2020, you are probably connected to a database that is not supported (only Power BI datasets and Azure Analysis Services databases are supported. Power BI Report Server is not supported).
My concern is that a calculated column is quite heavy if your data table is over 1 million rows; the feature is cool but not so workable for real big-data scenarios.
A True/False column is not that big; its cost in memory and processing time is moderate. I would be careful with 1 billion rows (or at least 100 million rows), but definitely not with 1 million rows.
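As a sketch of why such a column is cheap (the table and column names are assumptions): a True/False calculated column has only two distinct values, so the engine compresses it into a tiny dictionary regardless of the row count:

```dax
-- Only two distinct values end up in the column dictionary,
-- no matter how many rows the table has.
Is Class A = 'Product'[ABC Class] = "A"
```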
@SQLBI I meant: create the local model connected in DirectQuery to Analysis Services as you did in the video, and then add a new DirectQuery connection to an Azure SQL database to join some tables. Is that possible? Thanks!!
Thank you for this video. I'm unable to use this feature, though. I just upgraded my Power BI Desktop version to December 2020. I connected to a model in Live mode, but the "Transform data" button remains disabled. Could you please advise? Am I missing anything?
Did you connect to Analysis Services on-premises? It only works with connection to Power BI Datasets and to Azure Analysis Services. You should also see a button "Make changes to this model" and a link in the bottom right corner (we prepared the video with a preview). Watch MS announcement, too: powerbi.microsoft.com/en-us/blog/power-bi-december-2020-feature-summary/#_Toc58831296
That's great news. But do you know if we can do the same thing with a Multidimensional cube? I tried, and it seems it is still not working; maybe I missed something? Well, I hope so. Thanks!
You need the latest version of Power BI Desktop and you have to click on "Make changes to this model" button in Modeling ribbon. If you don't see that button and you have Power BI Desktop December 2020, you are probably connected to a database that is not supported (only Power BI datasets and Azure Analysis Services databases are supported. Power BI Report Server is not supported).
A live connection allows you to connect to a single data source. With the new composite models, you can have multiple live connections within the same model (and report).