3 REASONS to use a single dataset for your Power BI reports 

Guy in a Cube
454K subscribers
157K views

Published: Sep 26, 2024

Comments: 253
@ntm0709 4 years ago
Another top reason: only having to maintain row-level security on one dataset, and applying RLS across workspaces with centralized control!
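A minimal sketch of such a centralized role, assuming a hypothetical Users table with an Email column in the shared dataset:

```dax
-- DAX filter expression for a role defined once on the golden dataset;
-- each signed-in user sees only the rows matching their own sign-in email.
[Email] = USERPRINCIPALNAME()
```

Every thin report that live-connects to the dataset inherits this rule, so RLS is maintained in exactly one place.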
@Maartenravie 4 years ago
I noticed that when you publish the report in an app and the end user does not have view permission on the original dataset, the end user is not able to view the app. When the dataset is in the same workspace the app is based on, it works fine.
@ntm0709 4 years ago
@Maartenravie I also ran into this problem. However, I found that you can go into Manage permissions on the dataset and manually grant access to report consumers, which then lets users view the report in apps other than the one from the workspace where the dataset lives. It works automatically when the report is in the same app as the workspace because publishing the app assigns those app users the permissions; that doesn't happen when publishing other apps, which is why you need to do it manually.
@ceciliawang3538 1 year ago
We recently ran into this issue but couldn't work it out. We have the golden dataset in a PPU workspace. The reports are built in a different PPU workspace, pointing to the golden dataset, and published to the app. Users have Pro licenses, and we gave them Viewer and Build permission on the golden dataset and Viewer permission on the workspace that hosts the reports. However, users still get the prompt that they don't have access to the underlying dataset. It would be appreciated if you could answer this.
@NeumsFor9 3 years ago
Good tips. The flip side of this is finding a good way to organize your measures (display folders, governance). Before you know it your "central datasets" can get pretty crowded... which is a better problem to have than data anarchy.
@tonycovarrubias5931 4 years ago
"I'm not lazy; I'm efficient!" ~ Thank you for that. It drives me crazy when people say they're lazy when clearly they are not.
@MichaelRygaard 4 years ago
I am lazy, but to be properly lazy I have to be really efficient; you can be both ;) But in truth I'm not lazy, since I will pursue an idea that can save me 10 minutes for hours, because I know that idea can and will save me 10 minutes 100 times in the next year.
@hosamalanazi5289 3 years ago
This actually inspired me to consolidate all the company's reports onto one dataset per kind of data: our product reports are all connected to one dataset, all sales reports to one dataset, and all mobile app reports to our online mobile app dataset.
@shimaaal_harbi4155 7 months ago
Hi, can you please share your LinkedIn account?
@manosmaniatakis4067 4 years ago
Thank you Patrick for the video! Just had the conversation internally about why to maintain just one dataset. Excellent timing!
@GuyInACube 4 years ago
That's awesome! Thanks for watching! 👊
@yornav 4 years ago
Fully agree with your reasons. I do it all the time and it makes life so much easier. With regard to measures: with shared data models you are still able to add your own measures to the reports, so if other people build reports based on the shared model, they can add their own. Then, if people create useful measures that might benefit other users as well, I 'migrate' them to the source model. A 4th reason: reports built on a shared model are WAY smaller and publish MUCH faster.
@wardsteensels4598 2 years ago
Great, Patrick. Just starting with Power BI, but I was googling whether I could use one shared dataset for several reports. Your video explains it perfectly; now using golden data models!
@jb360360 4 years ago
This is solid advice. Applying this now for an enterprise roster and headcount request project I'm working on with HR. I currently have about 6 pages developed against one dataset, and each page/report has a site and department filter. I could probably create a report for each site using this method and keep just one central data model. I love it. Thanks kindly, sir!!
@hrishiw1989 2 years ago
Creating multiple reports from a single data model really makes sense. I would even say that, to make better use of datasets coming from different sources, and maybe even for different purposes, it is easier to put them all in one data model for easier management.
@mohamedaboobackersiddique7322 4 years ago
Thanks to both Patrick and Chris for these great tips!
@GuyInACube 4 years ago
Hope you found them helpful. Thanks for watching. 👊
@JoanBoadas-Boadas 1 year ago
Amazing video. You have saved me hours of copying measures and code from one report to another. Love this channel.
@JackOfTrades12 4 years ago
I'm not really a fan of connecting to datasets, as you can't create calculated columns on tables. However, my team recently onboarded to Premium and is taking advantage of connecting to the workspace's SSAS (XMLA) endpoint. We take our published datasets, open a new file, connect using the Analysis Services connector, and specify a DAX query with SUMMARIZE to get the data we need. It's been incredibly helpful, since we don't own all the reports and need to reuse and centralize many of our OKRs/KPIs. This also removes burden from the source SQL servers and preserves logic. Of course, the only downside is recreating measures, but with some cleanly processed values it's not so bad.
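As a sketch, the kind of DAX query fed to the Analysis Services connector might look like this (the table, column, and measure names here are hypothetical):

```dax
-- Run against the published dataset's XMLA endpoint;
-- SUMMARIZE groups by the chosen columns and adds a measure column.
EVALUATE
SUMMARIZE (
    Sales,
    'Date'[Year],
    'Product'[Category],
    "Total Sales", [Total Sales]
)
```

The result comes back as a plain table, so the new file imports only the aggregated rows rather than the whole model.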
@CCHSmathematics 4 years ago
I'm really looking forward to composite models enhancing this experience. Thanks for the video, Patrick!
@chayakiraneng 4 years ago
Thanks Patrick. We have been following this design/architecture in our firm for a long time now and are big fans/proponents of it. We use the terms 'core model' and 'thin report' for this design; hopefully MS adopts some terminology for it. The pain points we see are: 1) Whenever a column or table is renamed in the core model (the golden dataset), all linked thin reports using those tables/columns break. This is unlike other BI tools such as Webi; hopefully MS can enhance this. 2) We know that a measure can be added to a linked thin report, but we often encounter scenarios where a calculated column is needed. It would be great if MS could provide a feature for this. 3) The last scenario we often get challenged on is being able to add multiple relationships between the same set of tables in the core data model and then, at thin-report building time, specify the relationship context. This is supported in other BI tools such as Webi universes; in Power BI we can only have one set of active relationships, and the other sets become inactive. Does MS plan to have a concept of 'context' for relationships? Thanks again!
@youjohnny16 4 years ago
Can't you just use DAX in your thin report to solve problems 2 and 3? Try looking at the ADDCOLUMNS and USERELATIONSHIP DAX functions. 2. dax.guide/addcolumns/ 3. dax.guide/userelationship/ John
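These two patterns can be sketched roughly as follows (the measure, table, and column names are made up for illustration):

```dax
-- (3) Activate an alternate, inactive relationship for one calculation:
Sales by Ship Date :=
CALCULATE ( [Total Sales], USERELATIONSHIP ( Sales[ShipDate], 'Date'[Date] ) )

-- (2) Add a computed column on the fly inside an iterator instead of
-- creating a physical calculated column:
Avg Line Margin :=
AVERAGEX (
    ADDCOLUMNS ( Sales, "@Margin", Sales[Amount] - Sales[Cost] ),
    [@Margin]
)
```

Both run in the thin report against the live-connected model, so the core dataset itself stays untouched.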
@lindarn6--269 4 years ago
AMAZING!!! I will start digging myself out of my 4-dataset maintenance nightmare first thing tomorrow!!! Thank you Patrick!!
@janquieldapper 4 years ago
From one simple video I can learn a lot about databases! I have an online server with SQL databases; on this server I installed the Power BI gateway and created my database, and now I can work directly from my laptop with this data... and of course, with scheduled refreshes on this server, all the data is updated all the time!
@GuyInACube 4 years ago
BAM! You are amazing 👊
@vinaypugalia6090 4 years ago
Hello @patrick, the concept of a shared dataset is really useful. However, there are situations where it can drive us nuts. Below are a few, and I would like to hear from the community on how to handle them. 1. If there are multiple reports pointing to the same shared dataset, what is the best place to define measures: in the dataset or in the particular report? I took the approach of defining the common ones in the dataset and the report-specific ones in the report, but got stuck while preparing my dataset for Q&A, as I was not able to train the dataset on the measures defined at the report level. 2. Having a central/shared dataset sounds really exciting, but when we have really big ones, things can go crazy, as a small mistake will impact many more reports and affect every report user. How do we handle this? 3. What do we do when 2 reports pointing to the same shared dataset want a different relationship or filter direction? I am facing these issues in practice and am looking for guidance/suggestions from you and the community. Thanks in advance!
@jaymehta3320 4 years ago
Thanks Patrick for sharing this! I have always tried to look for ways to increase efficiency, and this one helps a lot.
@GuyInACube 4 years ago
You are very welcome. Thanks for watching! 👊
@simonestrizzolo5357 4 years ago
Very good suggestion! I have multiple identical reports that differ only in language (English, Italian, etc.). Setting up automatic refresh for all of them is always a nightmare!
@Silverlythia 4 years ago
Yes yes yes!! So many reasons, but just do it! But... Patrick, I'm not fond of the borders ;). You can create the report (connected to the central dataset) and then give them the PBIX to publish too, and they can create measures if they are savvy.
@driouchemountasir7894 4 years ago
Hello Patrick, thank you for this video! Always interesting to learn good practices from professionals like Adam and yourself. Thanks for sharing with your community :) We appreciate it.
@GuyInACube 4 years ago
You are welcome. Thanks for watching.
@mudyasaad 4 years ago
Hello Patrick, great video as always! I am with you on this one 100%, but the only reason I can't do it is that I have different reports going to different audiences, and I share those reports as apps. If we were able to separate a workspace into multiple apps, each with its own access permissions, that would be awesome.
@GuyInACube 4 years ago
Why not have a shared dataset, with different workspaces for the different roles and an app for each? All reports hit the same dataset. The biggest blocker there is that you can't add anything to the shared dataset today, but once the updates to composite models arrive to allow that, it really makes for a great way to handle it.
@stevenfoster5799 4 years ago
2 videos in 2 days, is this heaven?
@GuyInACube 4 years ago
Thanks for the kind words. 👊
@yannickfranckum6589 4 years ago
Thanks Patrick. I have already implemented that scenario/architecture in my company, and it's very helpful.
@GuyInACube 4 years ago
Awesome! 👊
@earlnoli 2 years ago
When I was using Report Builder there was an option to create SMDL files that host common data models. Good to know Power BI also has this.
@read89simo 3 years ago
OMG, I made almost 30 reports by copy-pasting. This is a great tip, thank you.
@jonbaylis2203 4 years ago
Superb and informative as always. Hope you and Adam are well 👍.
@GuyInACube 4 years ago
Thanks for watching! 👊
@malcorub 4 years ago
I'm not lazy, just efficient.
@TheLeotLion 3 years ago
I used to work with 'Eddie'. I worked with him for years; he never shared his last name, but he used to say this all the time (he also worked in data in the early days of computing).
@donaldkidd7427 4 years ago
This video is very timely, as I have been selling this "golden" model concept to the powers that be; now I can send them a link to this video to support my efforts. We have an extensive set of dashboards in PerformancePoint based on an SSAS multidimensional data mart. The executives like their dashboards, but we are scheduled to upgrade to SharePoint Online, which is not compatible with PerformancePoint. To duplicate the dashboards in Power BI, I'm in the process of creating a tabular data model, because the multidimensional SSAS model was built specifically for PerformancePoint with named sets, which are not compatible with Power BI. So this is a long-winded thank you for this video.
@Noxictyz 4 years ago
My team and I struggle with this a lot going into a new project. Yes, of course we want to build the golden model, but then only one person can work on it at a time! How is that going to work? Our idea is to create very lean data models for very specific reports, then merge them together when we are done.
@mohamedaboobackersiddique7322 4 years ago
Hi, I think you can put the model file in the cloud (OneDrive/SharePoint) and access it simultaneously.
@dmiradakis 3 years ago
Funny you mention lean data models; I used this exact terminology with my boss the other day. I see your perspective here. I love the concept of shared datasets, but I think it's a balancing act. I try to make lean queries in specific reports and fetch common dimensions and such from the shared dataset. They can only get so big unless you put them on a Premium capacity.
@jackypacky13 4 years ago
The information I didn't know I needed! Thank you.
@intuitivelearning4179 4 years ago
Hi, I have 2 reports using a single dataset/data model; one report is a daily refresh and the other is a monthly consolidated refresh. How can this be done using shared datasets? Any help on this, please?
@abhijeetdesai3350 4 years ago
Good one Patrick. We are using the same approach when developing any new reports with the same model... even if a measure is not present in the model, we can create it for that particular report, in the report only...
@atomek1000 4 years ago
That also creates a burden, as you could end up with the same measure calculated differently in two reports.
@nareshrocks9532 3 years ago
Hi Patrick, where the start date is the earliest login date of any user with that domain and the end date is the most recent login of any user with the domain.
@Ruudje1878 1 year ago
Just what I needed. Thanks Patrick!
@abigi4me 4 years ago
Wow, what a great tip for maintaining a data model. Love it. Thanks Patrick 👌🏻👌🏻👌🏻👌🏻👌🏻
@GuyInACube 4 years ago
Thanks for watching! 👊
@roaming_bob8591 4 years ago
Patrick, I'm looking for a solution that will show the previous year's result, and then a static line for the improvement goal for the new year. Appreciate your help.
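One way to sketch that pair of measures in DAX, assuming a standard date table and a hypothetical [Total Sales] base measure (the goal value is an arbitrary example):

```dax
-- Previous year's result, evaluated in the current filter context:
Sales PY := CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- A flat target for the new year (hard-coded here; in practice it
-- could also come from a small goals table in the model):
Sales Goal := 1200000
```

Plotting [Sales PY] and [Sales Goal] alongside the current-year measure gives the prior-year curve and a constant target line.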
@leonidiakovlev 3 years ago
Another reason not to have a single source of truth is that you need to host the whole dataset in every workspace... even for tiny reports... because you can only publish reports to the workspace that holds the dataset the model is connected to... Thanks for the video anyway, you are great!
@inchristalone2594 4 years ago
Whooohooo. Guy in a Cube famous!
@JEffigy 4 years ago
Another awesome video! But hey, you forgot to mention the Common Data Model and dataflow entities. It would also be worth covering migration from the slightly older-school SSAS Tabular model, for larger enterprise environments.
@njbkilcoyne 4 years ago
Couldn't agree with you more... perfect... efficient, not lazy.
@GuyInACube 4 years ago
Love it 👊
@TechCoach 4 years ago
Wow, for the first time in my life I have seen a video with 500+ likes and 0 dislikes. These videos are gold.
@GuyInACube 4 years ago
Thank you for the kind words. 👊
@yogeshmahadnac9426 2 years ago
Hi Patrick! Many thanks for this very useful video! 👏 Situation: let's say we have 1 dataset workspace where all datasets are published, and then we have 3 reports connecting to 1 dataset in that workspace, OK? Question: when you have finished building those reports in Power BI Desktop, you publish them, and now you want to give access to them (to designated users in your Active Directory, for example). Do you only need to give access to the reports, or do you also need to give access to the dataset workspace as well? Thanks for your reply 🙏😊
@vincenzonosso7400 2 years ago
Yòooo Patrick! Great video, as usual :) What if I have several remote coworkers who want me to "merge" their Excel files into a single dataset? I mean, how would you efficiently manage this scenario? How would you collect the files (maybe via a shared folder)? How could the refresh process work? Thank you in advance!
@gabrielfuma4890 4 years ago
Thank you Patrick, this is exactly what I needed!
@flyhigh1491 4 years ago
Thanks for these reasons. But the general problem with shared datasets is that you can't add additional sources :-( That's why in my corporation people use hundreds of dataflows, and now nobody wants to use a common data model. Even though we introduced one for everyone, nobody uses it ;-)
@unigirl123 3 years ago
Mind blown!!! Thank you, thank you, thank you!
@MuhammadMansoorAhmed 4 years ago
Great tips. Really enjoyed your conversation.
@GuyInACube 4 years ago
Thanks Muhammad! Glad you liked it. 👊
@miragliag 4 years ago
Thanks! That's great! The only downside is that if you rename measures or columns, the linked reports break... while in the original report all visuals are updated automatically. Is there a way to fix this?
@swerick 4 years ago
Power BI is basically Analysis Services under the hood, right? Doesn't model size bloat quickly as you keep adding measures to one unified model, because of how aggregations occur for every dimension and attribute? Premium pricing can be a challenge as you start getting to P2, P3, etc. to ensure adequate capacity and vCores for such a large model. Also, would you recommend one model for everything, or one model per business unit or subject area? Thanks for the great video!
@cafealpha82 4 years ago
Sean Werick, exactly the problem we have. As we centralized the model, the size went crazy. Then we needed the Premium version. Oh well, it isn't performing, so pay more for Premium? When most reports are static, I still believe one smaller data model specialized for the report performs much better than a central data model of which the report only uses 5% of the data.
@swerick 4 years ago
@cafealpha82 That's exactly my concern, and why the data mart approach seems to make more sense from a pricing and performance perspective, with customized models ALL based on an overarching data model (logical only [Erwin, etc.]). All of the data marts are simply subsets of that model, just like Kimball methodology. It's more difficult from process, governance, and change control perspectives though.
@emmanuelmandica2733 4 years ago
Thanks Patrick, as usual your videos rock and are very helpful.
@sarahacton7192 4 years ago
This is exactly what I needed! Thanks, team.
@juancarlosfigueroafigueroa943 2 years ago
Great, thank you. I'm wondering: how can you reverse the path from 2 files (dataset & report for viz) back to a single master PBI file? To test the RLS, for example.
@husnabanu4370 3 years ago
Hi, thank you for the wonderful explanation... when we use a shared dataset we connect to the entire dataset; is it possible to choose only a few tables from the shared dataset?
@raz8657 4 years ago
This is Raj. I appreciate that you brought up Christopher's blog post; he was my manager earlier... hehe
@dheitsc7871 4 years ago
I had the same thought last year, so I developed a very large data model that combines many different sources (SharePoint, SQL Server, MySQL, etc.). The problem now is that when even one source is unavailable, the whole dataset won't refresh. It would be really great if I could have one master dataset per source and combine them as needed in my reports. The problem is that it's currently not possible to create a report from more than one Power BI dataset.
@namaa1000 3 years ago
Thanks Patrick for your great work! Is there an alternative solution for Power BI Report Server?
@timewithsopy 10 months ago
With this approach my org can connect to the dataset from Excel. Bring them to my data instead of the other way around (as much as possible).
@oladaposorinola6216 4 years ago
Thank you Patrick!
@GuyInACube 4 years ago
Glad you liked it. Thanks for watching! 👊
@oseliocandido1807 3 years ago
This is definitely the problem I was experiencing. Clever solution.
@stephanies870 4 years ago
I tried to follow along and I still don't know what he's talking about, but I watched the whole video because it's entertaining either way.
@GuyInACube 4 years ago
I'm sorry you didn't understand what Patrick was talking about. The main idea is to reuse data instead of duplicating it across multiple Power BI Desktop files. We do try to have fun with it. Glad that came across. 👊
@stephanies870 4 years ago
@GuyInACube I'm a newbie to Power BI and trying to catch on. I'm a slow learner and used to boring Excel sheets.
@andrevioti 4 years ago
@GuyInACube Hello Guy in a Cube. If I understood the approach properly: essentially, we are making all the tables/measures defined in the original PBIX file/data model available in the new report without having the data model itself. It is a kind of replication, right? This way, we keep the integrity of the original data model.
@iamscottr 4 years ago
What's the most efficient way to convert over to a single dataset if you have already created a bunch of different reports in different files?
@filippogiustini2610 3 years ago
Exactly! I agree 100% with the single-dataset approach for all the good reasons mentioned here, hence my question: how do you fix all the many datasets you may already have in place? Consider that I am the administrator and I have about 30 Pro-licensed developers who have filled up the Power BI service with several tens of reports, each with a dedicated dataset. In reality they could be consolidated onto a few common datasets. Did you find a way to fix this kind of mess?
@iamscottr 3 years ago
@filippogiustini2610 The technique I am using is to start a new PBIX, connect to the new data source, manually copy and paste visuals in from the old PBIX, then manually fix all of the broken data associations. Very time-consuming and error-prone.
@filippogiustini2610 3 years ago
@iamscottr I can imagine. Hopefully there will be a handier tool to replace an old data model with a link to an existing one. By the way, I'm trying this workaround. Thank you.
@adamb4950 4 years ago
Nice touch with the LSU watch. GEAUX TIGAHHS!
@johanaalarse6870 4 years ago
Thanks Patrick! You're great. What about security? Is it possible to share only one report and drill through to different pages in other reports?
@zme888 6 months ago
Q: If I have one semantic model/dataset, what if the data needs to be refreshed at multiple different frequencies? From slowly changing dimensions that are REALLY slow, and reference tables updated yearly, to fact tables that need updating every hour. When I schedule a refresh of the one semantic model, do I have to refresh everything frequently, even if some tables/sources haven't changed?
@MrNurbolb 4 years ago
Hi. I really like your videos; you help me learn something new in Power BI every day. Can you answer this question: how can I get a list of all visualizations and the objects used in them? I use DAX Studio, but it can't help me. Can you?
@nguyenlamtong8503 4 years ago
Thanks for your tip. But can you also name some of the reasons we should NOT use the same dataset?
@rudisoft 4 years ago
Hi Patrick, thanks for this! So why can't we join data from multiple datasets? It would make cascaded datasets so attractive.
@monilgandhi822 3 years ago
Hey, that's what I was looking for. However, I have already created versions of PBIX files with the same data. Is there a way to link all of them to one data model without recreating all the visuals?
@masonwhitehouse7587 4 years ago
Great video Patrick, but a quick question: how many people can use a given shared model at a time without creating issues?
@andresrojas6381 1 year ago
Q: Is it possible to create a dataset from a dataset? I would like to create different layers of datasets so everyone can use the level of granularity they need.
@robsonnascimento5935 4 years ago
Thanks Patrick for these tips. Awesome!!!
@GuyInACube 4 years ago
Thank you! 👊
@DanielTavares29 3 years ago
Wooooow!! If I'm not wrong, this topic is perfect for the new composite models in Power BI (Dec 2020), right?!
@regulardev 2 years ago
Thanks for this video. I was already doing this in my projects. One question: does this relatively slow down the reports, since they are no longer in import mode but in live connect mode to the published dataset?
@davidhollingsworth5608 3 years ago
After I saw this video, I built a master dataset and built all of my reports off that one dataset. SO helpful, thanks for this. But within the last 4 weeks, every report I have built off a dataset doesn't refresh when I have it open in Power BI Desktop. Is anyone else having this issue since about October 15th onwards?
@TheWrencher 9 months ago
Wait, so doesn't saving it over a connected model create an over-tabulated dashboard, the first problem you stated?
@NaNuTumblue 4 years ago
Thank you from Thailand.
@GuyInACube 4 years ago
Thank you for watching our videos! 👊
@marinakhanukaeva5666 4 years ago
Hi Patrick, thanks for the video. Any chance you know about the limitations of using Power BI datasets across reports? I'm trying to build reports using a single Power BI dataset, but it turns out that some visuals (like Sparkline by OKViz) stop working when I use the Power BI dataset in the embedded report, while they work properly with DB-imported data instead of the Power BI dataset. Thanks!
@teamofsteve 3 years ago
Very well explained.
@CrazySw3de 3 years ago
Would taking an approach like this increase loading time when opening different reports? I am curious whether there would be performance issues opening a report containing more data than it necessarily uses, or whether it would only impact the dataset refresh. (E.g., linking data from multiple departments on the off chance you may want to cross-analyze in the future.)
@gerardcanals1526 2 years ago
Thanks Patrick! I want to share one problem with splitting report and dataset: if we publish the dataset in one workspace and the report in another, users need to be at least Contributor in both workspaces to download reports created from the dataset. I reported this issue, and they updated the list of limitations and proposed an idea to solve it. This limitation is quite confusing since, if you give users Build access to the dataset, they can create, publish, view, and edit reports with it, but they can't download them. Please, would you vote for the idea to solve this problem?
@rajkumarrajan8059 3 years ago
Patrick, is there a way to connect Excel data from the desktop to Power BI online? I have an Excel report that is updated every week. What I do now is update the data in Excel, open Power BI Desktop, refresh my report, and then publish it to the workspace. Instead, I want to connect the Excel report directly to Power BI online. Please provide some tips.
@garyrowe58 1 month ago
Hi, I'm trying to get my head around how to control the sources of the reports in a small company with only Pro licenses. I've inherited reports off a separate model and need to make changes to both the model and the reports. I can download a copy of the model, but the report is only in the service and I can't download a PBIX (it says the PBIX is ready for download, but I have no idea where to get it from!!!), and I don't want to touch the only source of the live reports... How can I work on copies of the existing model and reports, and get them tested, before somehow replacing the current production reports? I got excited when I learned of deployment pipelines, but then found we don't have Premium...
@arnohoedelmans 4 years ago
Hi Patrick, like this. Is there a way to develop, in Desktop, a single dataset plus the different reports based on it, and then publish the single dataset and those reports? That way, changes to the dataset during development could be made quickly in Desktop without having to publish.
@jonaskarlsson477 3 years ago
Do you have a video on how to set up a single dataset?
@learnspreadsheets 4 years ago
Pages 1, 3, 4 are for group A, pages 1, 2, 5 for group B, and pages 1, 3, 5 for group C. I get this scenario all the time, and your workaround doesn't make this ideal. Any advice?
@GuyInACube 4 years ago
I assume all the pages are sourced from the same data model? This scenario can be problematic. Do you maintain multiple reports repeating each page? What a nightmare, almost as bad as maintaining multiple models. You could get creative by hiding all pages except a navigation page, with a button corresponding to each group; once a button is clicked, it would expose navigation specific to that group. Not sure there is a good way to solve this short of security to show and hide pages based on the authenticated user. There is an item on ideas.powerbi.com (ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/35607487-add-security-roles-for-separate-pages-not-rls). Let's vote it up.
@learnspreadsheets 4 years ago
@GuyInACube Thanks for replying. I voted. I personally find this such a common scenario that I'm surprised more people are not calling attention to it! I have one model, then either use "Save as" in the Power BI service (to get multiple reports from one model) or create new Power BI Desktop files from the same report and republish them to the service. I envisage a fix where, upon publishing, a user gets a checkbox list of the report names and ticks which pages go to which reports; it would look like the sync-slicers tickbox pane.
@advent7324
@advent7324 2 года назад
Not easy to learn anything when you surround all the real learning with self-praise and over-the-top dialogue!
@na02venkatesh
@na02venkatesh 4 года назад
when I start developing new report on the Golden shared Power BI Dataset, and if I need to create new measures and add one more table to the existing data model. Will I be in a position to do these two things and can I save this modified dataset on Power bi service workspace. Can you please clarify it Patrick?
@GuyInACube
@GuyInACube 4 года назад
You will need to make the change to the model in the Power BI Desktop file then publish it back to the service. After that, when you open the .pbix file that contains the reports and the live connection all the changes will be reflected.
@chandramoulinaidu657
@chandramoulinaidu657 3 года назад
When you are using a golden dataset, why should one use a dataflow in this case? All ETL steps can be maintained in that single dataset itself.
@texmexdragon2160
@texmexdragon2160 2 года назад
But is this still applicable with the advent of composite models? And what about chaining datasets? It seems to me there is also an argument for having subject-matter datasets that serve specific audiences within an organization.
@leox9974
@leox9974 4 года назад
Using Power BI Embedded with dynamic binding, I'm using a single report to display various datasets, which I've successfully completed. Now I'd like to automate copying/creating datasets so I can present the relevant data depending on the user. How can I automate creating duplicates of the same dataset (internal SQL Server) that reference the same tables but filter the rows based on a parameter? Ideally this would use DirectQuery instead of import. Is there a better way to do this? Thank you.
@silvanoschuch8706
@silvanoschuch8706 Год назад
What happens when you already have 3 existing reports and 3 duplicated datasets, and want them pointing to one single dataset without having to recreate the reports? After some online searching, I used the REST API "Rebind Report In Group", which indeed pointed the 3 reports to 1 single dataset, but I noticed those reports are no longer downloadable ("File" -> "Download this file"), which means they can only be edited in the Power BI service now. Is there a way to accomplish this and still be able to download the reports from the service, without having to recreate them? I am not sure if there is another option besides the rebind API.
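For readers hunting for the API mentioned above: a minimal, stdlib-only Python sketch of calling "Rebind Report In Group" might look like the following. The workspace, report, and dataset GUIDs and the Azure AD access token are placeholders you would supply yourself; this is an illustration of the request shape, not a full client.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"


def build_rebind_request(group_id, report_id, dataset_id):
    """Build the URL and JSON body for a Rebind Report In Group call."""
    url = f"{API_BASE}/groups/{group_id}/reports/{report_id}/Rebind"
    body = {"datasetId": dataset_id}
    return url, body


def rebind_report(token, group_id, report_id, dataset_id):
    """POST the rebind request; returns the HTTP status (200 on success)."""
    url, body = build_rebind_request(group_id, report_id, dataset_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note the commenter's caveat still applies: reports rebound this way may lose the "Download this file" option in the service, so keep the original .pbix files under source control.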
@agnesmarco
@agnesmarco 4 года назад
I nonetheless see some disadvantages. 1. When I connect to this central dataset, I can't add a new source to the data model in the new .pbix - or can I? 2. When you make changes to the central dataset, you must be aware that it is used in several dashboards, and you can introduce errors in those dashboards.
@inchristalone2594
@inchristalone2594 4 года назад
This is where dataflows come into play. You publish all of your analytical dataflows (thus getting a one-time load from the source) and then build your datasets from these flows. I would highly recommend radically limiting the number of datasets. If properly modeled, even the largest companies can run off of a handful of central models. For example, Microsoft has one model that's used for 90%+ of its enterprise reporting.
@Adeelkhalid17
@Adeelkhalid17 Год назад
I have a dataset that is consumed by multiple reports. Now I am adding the last 20 years of data to the same dataset, but the existing reports should only use the last n years. Is there a way to filter to the last n years when connecting to the shared dataset, or is my only option to apply a report filter on all the reports?
@dannydavis1573
@dannydavis1573 2 года назад
This is such a great video! Thank you Patrick! I do have an important question though - how can you handle the deployment process to higher environments with this approach and deployment pipelines in a Premium capacity? Can you? Ideally, I would have my DEVELOPMENT thin reports connected to my DEVELOPMENT golden dataset, and my PRODUCTION thin reports connected to my PRODUCTION golden dataset. Normally with deployment pipelines I would handle this environment promotion and dataset connection using parameters or data source rules, but it doesn't look like this option is available with a live connection to datasets. How would you handle this?
@daleholter6821
@daleholter6821 3 года назад
Patrick, awesome post. Question: how can I view the "code" behind a measure without returning to the original dataset?
@VishalJaiswal-jj3ke
@VishalJaiswal-jj3ke 3 года назад
You can try using the Performance Analyzer; it generates the DAX query when you run it, which you can copy and inspect.
@sharavananp5570
@sharavananp5570 4 года назад
Hi, can you make a video about shared dimensions? For example, a model with 6 facts and 5 shared dimensions where you have no option to create a view in the backend, so the only way is to model with joins in Power BI. But then the fear is: will it slow down the report?
@terryliu3635
@terryliu3635 4 года назад
Thanks for sharing, Patrick. Honestly, I don't quite get it. Say we have an executive report (20 pages to deliver) which needs both Sales data and EHS data. If we already have a Sales dataset and an EHS dataset in separate workspaces, do we have to create a new dataset and duplicate some of the metrics/calculations?
@leonidiakovlev
@leonidiakovlev 3 года назад
So this one and only single source of truth should contain all the measures in all my reports, including ad-hoc ones? What if one of the reports needs another source - should I include it in this central dataset? That is the biggest obstacle to why I don't build a centralised Power BI model. It can be more efficient to create a centralised local database in Access, for example...
@michaelmaloy6378
@michaelmaloy6378 2 года назад
Relatively new to Power BI, and trying to make sure I understand how this should work. To create shared datasets, do you create new .pbix "reports" that contain only the dataset and then publish them (organized in a special workspace) so that other reports can connect to those datasets directly?