Welcome to Kratos BI! 🌟 Your premier destination for mastering Microsoft Fabric and Power BI. With over two decades of data expertise, we bring you the latest insights, tips, and tricks to transform your data into compelling stories.
📊 What We Offer:
In-depth tutorials on Power BI and Microsoft Fabric
Real-world applications and case studies
Expert advice on data visualization and storytelling
Live Q&A sessions to answer your burning questions
Community-driven content tailored to your needs
🔍 Why Subscribe?
Stay ahead of the curve with our cutting-edge content, designed for data professionals by a data professional. Whether you're a beginner or an expert, you'll find valuable resources to enhance your skills and make an impact with your data.
🔗 Connect with Us:
Join our community of data enthusiasts
Share your experiences and learn from peers
Get exclusive access to resources and downloads
✅ Subscribe now and unlock the full potential of your data with Kratos BI!
I have a service principal with Tenant.ReadWrite.All (with admin consent), Item.ReadWrite.All, and Workspace.ReadWrite.All. I added the service principal to a security group and added that security group as an admin on the workspaces, and I also enabled the tenant settings in Fabric that allow service principals to use the Fabric APIs and to create and read user profiles. With all this, when I call the REST API I can only list items (for example, data pipelines and notebooks), but I cannot get an item's definition or create an item in a workspace. I can do this if I use my own user ID and manually copy getPowerBIAccessToken from the browser's developer tools, but if I use a token generated from the service principal to create an item, it says "principal type not supported". Kindly help.
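For reference, here is a minimal sketch (all IDs and secrets below are placeholders) of how a service principal would acquire a token scoped to the Fabric API via the client-credentials flow, rather than reusing a browser token. Note that some Fabric item APIs still reject service principals entirely, which is one known cause of the "principal type not supported" error, so the per-endpoint docs need to be checked for each item type:

```python
def fabric_item_url(workspace_id: str) -> str:
    """Build the Fabric REST endpoint for listing/creating items in a workspace."""
    return f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"


def create_item_payload(name: str, item_type: str) -> dict:
    """Minimal request body for POST .../items (e.g. item_type='Notebook')."""
    return {"displayName": name, "type": item_type}


def get_sp_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Client-credentials token for the Fabric scope (requires `pip install msal`)."""
    import msal  # third-party library

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
        client_credential=client_secret,
    )
    # .default requests whatever app permissions were granted to the principal
    result = app.acquire_token_for_client(
        scopes=["https://api.fabric.microsoft.com/.default"]
    )
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "token acquisition failed"))
    return result["access_token"]
```

The key point is the scope: a token copied from the Power BI browser session is a delegated user token, while the service principal needs its own app-only token for the Fabric resource.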
This is a long-awaited feature, thanks for sharing Chris! It's nice that only admins can create/rename/delete. This gives the enterprise full control to follow the Pro Tip to align with data domains or data products. In the video, perhaps the example should have created just "Restricted", then we can combine it with "Sales" for objects that require both. That allows future flexibility to combine with other domains like "IT", where sensitive software or cloud vendor cost information exists.
First off, I love all of your videos! They are so helpful!! Secondly, it seems they've gotten rid of per capacity licensing. How would you accomplish this at this point in time?
The embed SKUs are still available for purchase in Azure, just not the premium Power BI SKUs. Power BI Premium capacities have been replaced with Fabric capacities: P1 = F64, P2 = F128, P3 = F256. Just remember to reserve the capacity so you get the 41% discount.
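The mapping and the reservation discount from the comment above can be jotted down like this (the price passed in is a made-up example, not a real list price):

```python
# Premium-to-Fabric SKU equivalences stated above
P_TO_F = {"P1": "F64", "P2": "F128", "P3": "F256"}


def reserved_price(pay_as_you_go_monthly: float, discount: float = 0.41) -> float:
    """Apply the ~41% capacity-reservation discount mentioned above."""
    return round(pay_as_you_go_monthly * (1 - discount), 2)
```

So a hypothetical $100/month pay-as-you-go capacity would come to $59/month reserved.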
Great video Chris! I have not checked recently, but I know Microsoft kept increasing the duration of the Fabric free trial (F2 SKU). Any idea whether this will keep going?
Thanks for the deployment explanation. How can we change the underlying DB connection if I am deploying a semantic model/Lakehouse from Dev to Test/Production?
Great video! I really appreciate the strategic breakdown. Is there any chance you would have a link to that Marquette study? I couldn't find it via a cursory Google search and it sounds like something I'd really like to dig into.
I really wish I did. It was a semester-long study done by 7-9 groups, and all of the results were supposed to be published publicly. Unfortunately, the study was never published.
@@anshikakhandelwal_ unfortunately, no lead so far. I managed to use the R script visual, which is not optimal since the interaction from the R visual to other visuals is lost.
@@ChrisWagnerDatagod Somehow I managed to get around it. I created variables in the pipeline, which I then convert in the notebook to create something like a hub monitor.
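As an aside, the pattern of handing pipeline variables to a notebook usually looks something like this in a Fabric notebook (the variable names here are hypothetical; the first cell would be marked as a "Parameters" cell in the notebook UI so the pipeline's Notebook activity can override the defaults):

```python
# --- Parameters cell: the pipeline's base parameters override these defaults ---
pipeline_name = "unknown"   # hypothetical parameter set by the pipeline
run_id = "manual"           # hypothetical parameter set by the pipeline


def hub_monitor_row(pipeline_name: str, run_id: str) -> dict:
    """Turn the injected pipeline variables into one monitoring record."""
    from datetime import datetime, timezone

    return {
        "pipeline": pipeline_name,
        "run_id": run_id,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
```

The resulting dict could then be appended to a Delta table to build up the "hub monitor" history.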
Can we use a Datamart as a data warehouse? If yes, the data will be stored in an Azure SQL Database. Are those storage charges included in the Power BI license, or do we need to pay an additional amount?
Datamarts require Premium Per User ($20 per user per month, including report viewers) or a Premium capacity (F64 or higher). At this stage of Fabric development, I would buy an F2 at $200-300 a month and use a warehouse. If you reserve the capacity, it's less than $200 a month.
You have to solve the Elizondo paradox... Everyone has to understand the importance of the prepublication review. If you have a clearance and you are going to publish anything, even sending out resumes, you have to submit it for prepub review. The government then reviews it looking for anything that may be classified or sensitive, and you have to remove whatever they find. If you don't, and publish it anyway, you will lose your clearance or go to jail. Whatever gets released after the review does not mean the government condones it or is saying that it is true. Their only concern is that it's not classified. So when Elizondo said in his Glenn Beck interview that he held the Roswell material in his hands, it can only mean three possible things. 1) The Roswell crash did not happen. In this case he could say he had tea on the ship with the aliens and the government wouldn't care; he is just making it all up and mixing it with publicly known lore. 2) Roswell is not classified. In this case you could contact JPL, where he said the material went for testing, and they would be glad to send you a report on their findings, and of course the government has come out and told everyone Roswell was true. 3) Roswell is classified and the government is allowing him to say certain classified things because it serves their own agenda. That means he is still working for them. There really aren't any other options. This paradox applies to all the other "whistleblowers" who have done a prepublication review. If you think these people are telling you anything the government doesn't WANT you to know, you are fooling yourself!
While you can do this, PowerPoint and other graphic editors offer a great deal of additional features for making the background really sharp. Plus, Power BI only loads one object at a time, and each shape is an object that has to load, so a background image improves performance too.
I have created the Delta tables and wanted to create a data model for refreshing a dashboard. But there are no dbo tables reflected in the SQL endpoint. I went through some docs and saw it's a limitation. Is that correct?
Hmmm, I just did this over the weekend and I am seeing them in the dbo schema. What specifically did you do, and where do the docs say the dbo schema is not visible?
I have a dataflow that is not under my ownership, but I want to use that dataflow to get a filter in my dashboard. When I load the dataflow into the Datamart and make a few changes to it inside the Datamart, it never loads, whereas if I load it without any changes it works smoothly. (PS: it happens even when I load datasets that are under my ownership.) Do you have any solution for this?
The last time I solved this, I used the original dataflow as the source for my own dataflow. I just had to get the refresh timing right where my dataflow would refresh after the other dataflow. It was a little tricky, but I was able to get it working.
With Power BI Pro, you can refresh up to 8 times a day. With PPU or Fabric, you can refresh as many times as you want. The UI allows up to 48 scheduled refreshes (on the hour or half hour), but you could use Power Automate or other tools to refresh more often.
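Triggering refreshes outside the schedule UI comes down to one REST call. A minimal stdlib-only sketch (the group/dataset IDs and the token are placeholders you would supply yourself):

```python
import json
import urllib.request


def refresh_url(group_id: str, dataset_id: str) -> str:
    """Power BI REST endpoint that triggers a dataset refresh (POST with a bearer token)."""
    return (
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
        f"/datasets/{dataset_id}/refreshes"
    )


def trigger_refresh(token: str, group_id: str, dataset_id: str) -> int:
    """Fire one on-demand refresh; returns the HTTP status (202 = accepted)."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Power Automate's "Refresh a dataset" action wraps this same endpoint, so either route hits the same per-capacity limits.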
Thank you for this great video. But I have a problem: when I refresh, it does not pick up a new table from my Excel file, it only updates the data in the old table. How can we pick up a new table or a new column in Power BI from Excel Online?
For any model change, like a new column or a changed column definition (a string becomes an integer), you need to make the change in Power BI Desktop and then republish the report.