Headquartered in Kirkland, Washington and serving clients all over the world, Havens Consulting provides Business Intelligence solutions to help companies use their data to operate efficiently and profitably. We utilize the Power Platform stack, which includes Power BI, PowerApps, and Microsoft Flow. We also provide consulting for Tabular Modeling, SQL Server Analysis Services, Azure Analysis Services, PowerPivot, and other analytics software to help companies realize their goals. A variety of on-site and remote technology trainings are also among the many services we provide.
Whether you’re an analyst, report developer, manager, or small business owner, we will not only add value to your organization, but we’ll also revolutionize the way you use data. We bring extensive experience and professionalism to every project and customize our services to your individual needs and concerns.
I’m a dark mode person in general, but man, that data model is dark! 😅 Can barely see the model. I also dislike the build pane; normally I could quickly see the measures used in a visual, but with dark mode it's not so visible. Still work to be done. Also, making something transparent made my visuals crash.
Yah, the dark mode relationship map is impossible to decipher. Dark mode DAX view is meh. Dark is good for script, but hard on a GUI, and PBI Desktop is a GUI for designing GUIs.
So there was already a plan to do a refresh on the light mode, and reduce shadows since that was an accessibility thing. So that was the primary effort. They added dark mode as a bonus. ALSO, please give them feedback via votes and posts on Fabric Ideas. This is where they can learn. Also she showed you at the end how to provide direct feedback, they welcome it!
Under "Why do we need it?" you must cross a critical threshold: because the platform does not deliver this benefit out of the box. In order to give an honest answer, developers must be humble. Did you explore all options, or did you give up early and go for the workaround?
@HavensConsulting To display dynamic 'measure names' I am using a calculated table 'Filter T', the result of a cross join between my parameter field table and a date table. This 'Filter T' is related to the parameter table. Doing it this way, the custom order does not work. Have you come across this? Is there any workaround? P.S.: For context, the user is selecting a month, and I have done this so that instead of showing a measure as 'EoM Sales' or 'Week-1 Sales', it displays the date based on the filter selected (e.g. the 'EoM Sales' measure is displayed as '2024-03 Sales').
Hmmm, what's dynamic about the measure names? You can have a field parameter table with the same measure more than once in it, under different names. That's one way I've added logic for the "name" to change :)
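A minimal DAX sketch of that idea, assuming a hypothetical [Sales] measure (the measure and display names here are illustrative, not from the video):

```
-- Field parameter table defined in DAX: the same [Sales] measure
-- is listed twice under two different display names.
Sales Names Parameter =
{
    ( "EoM Sales",      NAMEOF ( [Sales] ), 0 ),
    ( "2024-03 Sales",  NAMEOF ( [Sales] ), 1 )
}
```

Whichever row the slicer selects, the visual evaluates the same underlying measure; only the displayed name changes.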
@HavensConsulting I am trying to get the name to show the date of reference. So for example, I have a measure that shows sales for the last Monday of the selected month, one measure for sales for the previous Monday, and one more for two Mondays before that. Then for the last Monday of the month, I am also looking at other KPIs like margin and margin %. But for sales we want to have the last 3 Mondays of data; that's why I am not using date as a column in the matrix (because I don't need that shown for previous weeks in the other measures).
Example of expected output:
Currently: Sales CurrentMonday-2weeks | Sales CurrentMonday-1week | Sales CurrentMonday | Margin CurrentMonday | Margin % CurrentMonday
Needed, user selects September 2024: Sales 09/09/2024 | Sales 16/09/2024 | Sales 23/09/2024 | Margin 23/09/2024 | Margin % 23/09/2024
Needed, user selects August 2024: Sales 12/08/2024 | Sales 19/08/2024 | Sales 26/08/2024 | Margin 26/08/2024 | Margin % 26/08/2024
This might be slightly too complex to answer via just YouTube comments 😅 I’d either recommend posting this to the Fabric help forums, where you can upload an example PBIX; otherwise we do offer hourly consulting as well, through our Havens Consulting website.
Really great video, thank you very much for it. I think we all really value your posts on LinkedIn, Injae, and thanks Reid for having these discussions. Best regards
Leila Gharani’s big success is due to her training on Excel. There are far more Excel users than PBI users. She has very good knowledge, tips, tricks, and courses. My first dashboard design course was hers, and it was in Excel.
My vague definition of a workaround is something that Microsoft will give us in an easy way eventually. Like, do I spend days on this thing, or park it and wait for Microsoft to evolve the product?
@paulrockliffe2378 Sadly you can't always wait! Sometimes it can be years before a feature you want is added. But easier is always better (if you can wait)
Thank you for sharing this. I would like to ask a question I can't find an answer to after googling a lot: I converted a filtering measure to a calculation group. It works as expected, but it doesn't appear in the report view's data pane as a calculation item... I can see and work with it in the model pane, but in the data pane I can only expand the calculation group and the calculation group column... why is that?
Any way to upload a screenshot somewhere publicly for me to look at? Not sure I've seen this issue before. If it's a report connected to a live model, the symbols for calc groups and field parameters do go away.
I'm trying to create a custom PostgreSQL connector that uses Microsoft Entra ID as the authentication method with the Power Query SDK. As a test, I'm trying to just recreate the PostgreSQL connector that is already in Power BI, with UsernamePassword authentication. This is my .pq file:

// Main function to connect to the PostgreSQL server
[DataSource.Kind="PostgreSQLConnector", Publish="PostgreSQLConnector.Publish"]
shared PostgreSQLConnector.Contents = (server as text, database as text) =>
    let
        // Connect to PostgreSQL without hardcoding credentials
        result = PostgreSQL.Database(server, database)
    in
        result;

// Publish metadata
PostgreSQLConnector.Publish = [
    Beta = true,
    Category = "Database",
    ButtonText = { Extension.LoadString("DataSourceButtonText"), Extension.LoadString("DataSourceButtonHelp") },
    SourceImage = PostgreSQLConnector.Icons,
    SourceTypeImage = PostgreSQLConnector.Icons,
    SupportsDirectQuery = true
];

// Data source definition and authentication
PostgreSQLConnector = [
    Authentication = [
        UsernamePassword = []
    ],
    Label = Extension.LoadString("DataSourceLabel")
];

But I'm getting an error that I need to specify how to connect. I tried to do this with Odbc.DataSource instead of PostgreSQL.Database and that works, but I don't want to have to install a PostgreSQL driver locally.
Just be careful with this calc group, as it will replace any measure, and this may bring unexpected consequences. It's safer to scope it to a certain base measure (e.g. [dummy]) and leave other measures as SELECTEDMEASURE() (i.e. untouched).
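As a rough DAX sketch of that scoping pattern (the measure names here are hypothetical), the calculation item expression could look like:

```
-- Calculation item body: only override the [dummy] base measure;
-- every other measure passes through untouched.
IF (
    ISSELECTEDMEASURE ( [dummy] ),
    [Replacement Logic],      -- hypothetical override expression
    SELECTEDMEASURE ()        -- leave all other measures as-is
)
```

That way the calc group can never silently rewrite a measure it wasn't meant to touch.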
I was one of those people who always tried to avoid calculated columns with DAX. Thanks for showing me that I was wrong. There are some scenarios where the result is much better with DAX calculated columns.
Really glad someone with the necessary chops has finally stepped up and thoroughly debunked that silly "no DAX columns, EVER" position once and for all. Thanks, Tristan and Reid!
Kudos to you, man, for finding an extremely convoluted way to perform the simple task of hiding a column in a matrix. Not only is it difficult to understand and manage, I will probably forget all about it in a few months' time and completely lose track of it. Moreover, your method kind of abuses the concept of calculation groups, as it doesn't provide for the reduction of the overall number of measures. It is mind-boggling to me that one has to go to all this trouble just to put some numbers in a matrix. But that's Microsoft for you. I'm wondering where they find all these great, great coders, and where they find the managers who approve such great work. I think this is MS's true specialty: creating software which you have to work for, rather than the other way around. It is hard to understand how MS could have neglected to provide such a simple feature even 3 years after you posted this video.
I’d love to hear your solution for this requirement then. Since mine is convoluted and abusive. What’s the simple solution for this, when it’s a hard requirement for the client? 🙂
@zvikabar-kochva3641: Lodging your complaints about Power BI in the comments on a @HavensConsulting YouTube video won't have any effect, because Microsoft will never see them. Microsoft doesn't even monitor the comments on their own YouTube channels, let alone all of the independent Power BI content creators like @HavensConsulting. You're basically just screaming into the void here, so if you're really that upset about what you perceive to be a problem with Power BI, why don't you contact Power BI Support or create an idea on the Power BI/Fabric community ideas website?
@HavensConsulting That's probably my bad English, but the criticism was entirely toward MS, not your solution. The thing is, if you stick with the stock visuals, I don't think there is another solution. However, Zebra has visuals in which one can suppress a column if it contains no data. That said, I still think your solution, though it produces the required functionality, is convoluted. I apologize for using "abusive". It was distasteful.
@@zvikabar-kochva3641 appreciate the context, yeah I wasn’t sure if you were criticizing the technique or Microsoft, ha ha. Yeah, I definitely wish this was easier, but at least it is some solution for people, and hopefully it does improve in the future!
@zvikabar-kochva3641 One newer solution as well is to use field parameters with measures; the column will automatically disappear if the measure has no data in it.
I have 10 levels of hierarchy. When I add them into the field well one by one, the first 3 levels work fine, but when I start adding the following levels, from level 4 onward, the values displayed in my level 2 gradually disappear.
If I want to make new measures: if I create them in the MyMeasures table that was in the dataset file I published, and I add a new measure in the report file under that web-connected MyMeasures table, will the measure be pushed into the cloud service? Or am I supposed to always add new measures to the separated dataset version, and always have to publish and refresh every time?
You can add them to either, actually! Measures that would be used by multiple reports connected to the semantic model are recommended to be added to the model. Local measures for things like titles, conditional formatting, etc. can be added to just the report PBIXs. You can add local (report) measures to the same measures table where the core semantic model ones are. I usually put them in a folder called (Local Measures) myself.
Sandeep, is there a way to use notebooks or semantic link to back up and restore Power BI semantic models? If yes, can you point me to some documentation or show us how? Today I have to use PowerShell or SSMS. Thanks.
Understanding the overall benefit, I still have one question. What performance difference is there, if any, between doing this and simply having the originating report and dataset in place, then attaching to that originating dataset as needed for other reports? All of these dependencies look the same in the Power BI Service, whether split or not.
Great question. Another big benefit is that you don't have to update the "model" each time you need a report update. Having them split means you can cleanly update your report whenever you need, without republishing your model. This matters especially if you have incremental refresh turned on, because a model republish would trigger a full model refresh, including a full refresh of any incremental tables in there too. So having a model file entirely by itself prevents any issues of having to update both together. :)
Once a report is created using PBIR format, can it be reverted back to the old format? Saving it as .PBIX doesn't seem to work for me, as I still get the same warning and inability to publish.
Once it's saved in PBIR format it needs to stay in that format, no way to revert back (today). Outside of just going to your OneDrive version history to the previous version of the file.
When we generate dates in Power BI DAX, such as DATE(2024,8,30), it says these will return the date in datetime format. Do you know which of the three datetime formats these are loaded in?
Are you referring to the actual datetime format? Not the data type? If so I think it defaults to the locale of your machine, since places like US, Europe, Asia, etc. all have different standard formats
@HavensConsulting In Power Query, dates can be either date, datetime, or datetimezone. Then once the values are in the model, you can only change their visible format. So my question is: which data type from Power Query would match dates generated entirely within the model (such as a calendar table or the DATE() DAX function)? I'm having issues comparing dates from Power Query with dates generated within the model as above. Sorry for any confusion about format/data type. Thanks for the quick reply!
@oscarelworthy The model itself only has datetime; the rest is "formatting". So I'd recommend converting in PQ to date only first, for your fact dates to then correctly key to DIM - Date on your calendar table :)
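A hedged Power Query sketch of that conversion (the step and column names are made up for illustration):

```
// Assuming a previous step "Source" with a datetime column "OrderDate":
// change it to type date so it keys correctly to a date-typed
// calendar column.
ChangedToDate = Table.TransformColumnTypes(
    Source,
    {{"OrderDate", type date}}
)
```

Doing the truncation in Power Query, rather than comparing datetimes in DAX, keeps the relationship to the calendar table a clean one-to-many key.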
Hi folks, thanks for this! This is fantastic. With the new PBIR format, deleting everything after objects wasn't working properly. So, to get it working, I removed everything between the brackets. The result was: "objects": { "merge": { "outspacePane": [] } }
This works if no dates can be selected on the page, i.e. the solution doesn't have to be dynamic. At the beginning of a new year, I imagine they would still want to look at data from the perspective of the last year for a while. Otherwise, a disconnected table with the CY and PY columns would have to be created and the measure adapted to the new context.
Hi Reid. Thank you very much for the tutorial. I have a question about the slicers: how did you change the circle in the slider into a vertical bar? Thanks in advance.
This was kinda interesting but way beyond my current level. Just trying to work out how the process of coding in VS Code would work, and how I would then get the code back into the PBI report? Is it a manual process? I think it was alluded to in the SDK discussion, but not clearly enough for a simpleton like me...
If you're editing the model file in VS Code, any changes would be there after saving the file and then reopening the report in Power BI Desktop. If you're not using PBIP files yet today, Power BI Desktop is still a fine place to develop. VS Code is definitely in the advanced realm, for people who like to code a lot more; nice, but not needed :)
@walterstevens8676 PBIP creates a whole folder structure with model files, report files, page files, etc. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-sCMgfZQU4Aw.html Here's a livestream that should help explain that better :)