
Advancing Spark - Understanding the Unity Catalog Permission Model 

Advancing Analytics
33K subscribers
11K views

Since the initial announcement of Unity Catalog, data security and permissions have been at the center of the story. You should use Unity Catalog to control which users can see which elements of data, to control access across multiple workspaces, and to act as the entry point for BI tools into your Lakehouse model! But then... how does security actually work?
In this video Simon walks through the Unity Catalog permissions model, looking at how security can be managed using SQL commands (just like the good old days) but also with the new Data Explorer within Databricks SQL!
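For a sense of what those SQL commands look like, here is a minimal sketch of the three-level grant pattern (catalog → schema → table). The catalog, schema, table, and group names are illustrative only, and note that the privilege keywords have shifted across Unity Catalog releases (early versions used `USAGE` where current ones use `USE CATALOG` / `USE SCHEMA`):

```sql
-- Illustrative object and principal names; run in a Unity Catalog-enabled workspace.
GRANT USE CATALOG ON CATALOG lakehouse TO `analysts`;
GRANT USE SCHEMA  ON SCHEMA  lakehouse.gold TO `analysts`;
GRANT SELECT      ON TABLE   lakehouse.gold.sales TO `analysts`;

-- Inspect what has been granted on the table
SHOW GRANTS ON TABLE lakehouse.gold.sales;
```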
There's a wealth of information over on the Unity Catalog docs found here: docs.microsoft...
As always, come to Advancing Analytics if you need help rolling out an Enterprise-ready Delta Lakehouse!

Published: 7 Sep 2024

Comments: 15
1 year ago
It was worth waiting until the end. Ah, those buttons ... ;D
@AdvancingAnalytics 1 year ago
Shhhhh, no one saw that 🤣
@aqlanable 2 years ago
I think these important parts are worth mentioning, e.g. credentials, external locations, and how to migrate from the current Hive metastore to Unity Catalog. I have a blog post in draft on my WordPress; if it's okay I can post it here.
@arunr2265 2 years ago
Please post it, Omar
@gopinathrajee 1 year ago
Please do!!
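Picking up the external locations point from @aqlanable above, here is a minimal sketch of registering one in SQL, assuming a storage credential named `adls_cred` has already been created; the location name, storage path, and group name are illustrative only:

```sql
-- Assumes an existing storage credential called adls_cred; all names are illustrative.
CREATE EXTERNAL LOCATION IF NOT EXISTS landing_zone
  URL 'abfss://landing@mystorageaccount.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL adls_cred);

-- Let a group read files under that path
GRANT READ FILES ON EXTERNAL LOCATION landing_zone TO `data_engineers`;
```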
@allthingsdata 1 year ago
We can't really use Unity effectively, as we aim for a client-agnostic data access model and Unity assumes you always go through it; it centralizes the authorization layer, which goes against the open, client-agnostic lakehouse approach, IMO. Of course, you could have Databricks permissions managed via Unity service principals plus other permissions on the storage layer managed via RBAC + ACLs, but that's double the effort. We currently prefer a single auth layer that works for all tools that can do AD passthrough or obtain an Azure AD token, and that is enforced at the storage layer. Of course this has drawbacks too, e.g. it's not cloud-agnostic, but for us it's the better model currently. Also not a fan of having to onboard lakehouse assets to make them Unity-ready.
@jordanfox470 2 years ago
@Simon with the release of Unity Catalog, do you have any insight into whether they're going to update Delta Live Tables to allow us to put objects in a single catalog but across multiple different schemas/databases? At the moment you define a target, and that target is the schema/database for every object in the Delta Live Tables pipeline. It seems like that will need to be updated for Unity.
@prasad8195 11 months ago
Hello @Simon, I need your assistance with a specific use case. Suppose I create a view in a `%sql` cell with the `CREATE OR REPLACE VIEW` statement and grant Databricks group 'X' usage access on the schema and catalog, along with select access on the view. A user who is a member of group 'X' then gains visibility of the object and the ability to retrieve data from the view. However, a challenge arises when I execute the `CREATE OR REPLACE VIEW` statement again: the previously granted permissions for group 'X' vanish, which prevents users in that group from accessing the object. Could you please provide guidance/feedback on this? Your assistance is greatly appreciated.
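For context, a minimal sketch of the pattern described above, with illustrative catalog, schema, view, and group names; the behaviour reported is that the grant has to be re-applied every time the view is replaced:

```sql
-- Illustrative names; replacing the view appears to reset its grants.
CREATE OR REPLACE VIEW main.reporting.v_sales AS
SELECT * FROM main.reporting.sales WHERE region = 'EMEA';

-- Re-grant select to the group after each CREATE OR REPLACE VIEW
GRANT SELECT ON TABLE main.reporting.v_sales TO `group_x`;

-- Confirm the grants are back in place
SHOW GRANTS ON TABLE main.reporting.v_sales;
```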
@nikhilsahu4159 1 year ago
I can't find "Create Catalog" and "Create Metastore" on Azure Databricks, even though I have a premium Azure Databricks account. Does anyone know why?
@AdvancingAnalytics 1 year ago
Have you enabled Unity Catalog and associated the workspace with a metastore? There is some setup to do before workspaces will work with the new commands!
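For anyone hitting the same thing, a few quick checks once the workspace has been attached to a metastore (the catalog name below is illustrative; these commands fail or return nothing on a workspace that isn't Unity Catalog-enabled yet):

```sql
-- Which metastore (if any) is this workspace attached to?
SELECT current_metastore();

-- Which catalogs can I see?
SHOW CATALOGS;

-- Requires the CREATE CATALOG privilege on the metastore
CREATE CATALOG IF NOT EXISTS dev_catalog;
```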
@user-bs8ku6cg9f 2 years ago
Why do you have no videos about Palantir? They have the best software.
@gordonegar7717 2 years ago
Thoughts on using a single storage account container and metastore across environments?
@aqlanable 2 years ago
In Unity Catalog it's possible through the Databricks account portal: you can create a metastore and share it across multiple workspaces.
@user-bs8ku6cg9f 2 years ago
Gordon Egar, maybe you should check out Palantir Foundry.
@sankarazad7574 1 year ago
How do we provide security between workspaces? How can we keep dev, UAT, and prod workspaces separate?