
Dive into Microsoft Fabric's Power BI Direct Lake 

Guy in a Cube
Subscribe · 445K
27K views

Let's break down Power BI Direct Lake in Microsoft Fabric and explain how you can leverage one copy of the data from OneLake. Patrick explains!
Direct Lake
learn.microsof...
📢 Become a member: guyinacu.be/me...
*******************
Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
🎓 Guy in a Cube courses: guyinacu.be/co...
*******************
LET'S CONNECT!
*******************
-- / guyinacube
-- / awsaxton
-- / patrickdba
-- / guyinacube
-- / guyinacube
-- guyinacube.com
**Gear**
🛠 Check out my Tools page - guyinacube.com...
#PowerBI #DirectLake #GuyInACube

Published: 15 Aug 2024

Comments: 23
@gvasvas · 5 months ago
Awesome demo! Quick and right on the spot.
@archanasrivastava6531 · 3 months ago
Thanks for this insightful video. Do you have any performance/capability metrics comparing Import, DirectQuery, and Direct Lake? Please share. Thanks in advance.
@robcarrol · 1 month ago
Great demo. I've been using direct lake in a current project and absolutely love it
@toma4528 · 5 months ago
Great video, Patrick!
@brunomagalhaes9349 · 1 month ago
I have several semantic models that are alike. Do I need to have a Fabric capacity to merge them and treat the data like I do for SQL? Thanks a lot.
@christophehervouet3280 · 4 months ago
Super post Patrick, as usual
@shekharkumardas · 5 months ago
How do you create a DAX column in a Direct Lake dataset?
@gnomesukno · 5 months ago
Not using it currently but I can see some potential benefits to it. Will have to look into it
@nishantkumar9570 · 5 months ago
How will costing work for Direct Lake mode?
@toulasantha · 3 months ago
Less to start with. Will be rocketing up after that, just like everything else MS 😂
@UnbelievableOdyssey · 2 months ago
If my Delta Lake is in Azure Data Lake Storage, can I still use Direct Lake?
@user-iv5tq4qk7m · 5 months ago
Q: I love the ease of creating new semantic models, but I keep running into the problem where I have to give somebody access to the whole lakehouse in order to give them access to a segmented part of that data, when I only want them to see it via a semantic model. Is there any way I can create a gold lakehouse in one workspace, then create multiple semantic models in other workspaces and only give users access to those?
@npergand · 5 months ago
You don't need to give users access to the lakehouse, that's just the default behavior. What happens is when you create a new semantic model, it uses a gateway connection to the lakehouse with SSO. You can see this in the semantic model settings screen. You can change that by creating a new connection to the lakehouse using a specific credential.
@Mike-en1rd · 3 months ago
Do you know when Direct Lake will be available to use in Power BI Desktop?
@danrolfe7862 · 5 months ago
THIS IS BANANAS!!!!!!!! WOOOHOOOO! Is there still a row limit (on data that you can actually bring into Power BI)? I seem to remember hitting an upper limit on rows using the SQL endpoint / DirectQuery. I had this MONSTER dataset of about 14M rows where the stakeholder insisted he needed all of the data.
@googlogmob · 5 months ago
Patrick, thanks 👍
@NicolasPappasA · 2 months ago
Is direct lake using Delta Live Tables? It seems like it's the same technology.
@dilipinamdarpatil6301 · 5 months ago
Awesome 🙏
@NateHerring1 · 5 months ago
I watch Patrick
@Milhouse77BS · 5 months ago
I’m up
@EBAN4444 · 5 months ago
Does this mean the massive 25GB model I have, which holds too many years of data because the "business" needs it (even though they only look at a few years), can be removed, and then only the partitions of data that are needed will be held in memory? Lowering the memory used on the capacity and the amount of data and CPU needed to crunch all the measures? Can I recreate the model using Direct Lake against our ADLS Gen2 Databricks parquet files, which are already the fact tables we pull in? Do you need to set up partitions in OneLake, or does it do it for you automatically? This does seem to remove the query-folding performance gains, so it seems like the parquet files will need to be rewritten to be better optimized and only include the data that is needed in the model. Also, is that Python library to refresh a dataset available outside of OneLake? I.e., I would love an easy way to refresh a PBI model from an Azure Databricks notebook versus an ADF XMLA call.
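[Editor's note] On the last question above: one option that works from any Python environment, including an Azure Databricks notebook, is the Power BI REST API's dataset-refresh endpoint. The sketch below assumes you have already acquired an Azure AD access token (e.g. via a service principal); the workspace and dataset IDs are placeholders.

```python
import json
import urllib.request


def build_refresh_url(workspace_id: str, dataset_id: str) -> str:
    """Power BI REST endpoint that triggers an on-demand dataset refresh."""
    return (
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
        f"/datasets/{dataset_id}/refreshes"
    )


def refresh_dataset(workspace_id: str, dataset_id: str, token: str) -> int:
    """POST a refresh request; HTTP 202 means the refresh was accepted."""
    req = urllib.request.Request(
        build_refresh_url(workspace_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The calling identity needs permission on the dataset, and scheduled-refresh limits on your capacity still apply; for XMLA-based refreshes, the existing ADF route remains an alternative.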
@googlogmob · 5 months ago
Is Fabric available for developers for free?
@srikanthm4504 · 5 months ago
No, your admin must enable it, and can do so for a specific workspace.