
How to configure Automatic DLP 

Google Cloud Tech
1.2M subscribers · 5K views

Need to search through millions of database entries? Automatic DLP has new features to configure scans for all your business needs. In this video, we show how you can create an Automatic DLP scan configuration that generates data profiles of your data. Watch to see how you can set the correct permissions, configure scans, and much more with Automatic DLP!
Chapters:
0:00 - Intro
0:53 - What is a Scan Configuration?
1:19 - Google Cloud resource hierarchy
2:20 - Scanning organizations and folders
3:31 - Scan Configuration example
4:20 - Roles and configuration
5:33 - Creating a configuration in Google Cloud Platform
5:55 - Manage Schedules
7:35 - Select inspection template
8:25 - Manage scan outcome (data profiles)
8:44 - Manage service agent container and billing
9:32 - Set location to store configuration
10:02 - Review and create
10:12 - Project level configuration
10:38 - Recap and wrap up
Automatic DLP: Data Visibility playlist → goo.gle/AutomaticDLPDataVisib...
Subscribe to Google Cloud Tech → goo.gle/GoogleCloudTech
#AutomaticDLP
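
The video walks through the Cloud Console flow; a commenter below asks whether the same setup can be scripted. Here is a minimal sketch, assuming the google-cloud-dlp Python client library, where Automatic DLP scan configurations are exposed as "discovery configs". The project ID, location, and inspection template path are placeholders, not values from the video.

```python
# A minimal sketch, assuming the google-cloud-dlp client library
# (pip install google-cloud-dlp). Project ID, location, and the
# inspection template path below are placeholders.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()

# In the DLP API, an Automatic DLP scan configuration is a "discovery
# config". The parent can be an organization, folder, or project;
# this example assumes a project-level configuration.
parent = "projects/your-project-id/locations/us-central1"

discovery_config = dlp_v2.DiscoveryConfig(
    display_name="profile-all-bigquery-tables",
    status=dlp_v2.DiscoveryConfig.Status.RUNNING,
    # Optionally reuse an existing inspection template (see 7:35).
    inspect_templates=[
        "projects/your-project-id/locations/us-central1/inspectTemplates/your-template"
    ],
    targets=[
        dlp_v2.DiscoveryTarget(
            big_query_target=dlp_v2.BigQueryDiscoveryTarget(
                # A catch-all filter profiles every BigQuery table
                # visible under the parent resource.
                filter=dlp_v2.DiscoveryBigQueryFilter(
                    other_tables=dlp_v2.DiscoveryBigQueryFilter.AllOtherBigQueryTables()
                )
            )
        )
    ],
)

response = client.create_discovery_config(
    parent=parent, discovery_config=discovery_config
)
print(f"Created scan configuration: {response.name}")
```

An organization- or folder-scoped configuration, as covered at 2:20, would use an organizations/... parent and the roles discussed at 4:20 instead of project-level permissions.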

Science

Published: Aug 7, 2024

Comments: 9
@googlecloudtech · 2 years ago
Why are you configuring Automatic DLP? Let us know in the comments! Be sure to subscribe to get alerts when more episodes are released 😊 → goo.gle/GoogleCloudTech
@cloudlover9186 · 1 year ago
Can I do the same stuff through a Python script? How do I enable DLP through Python code? Please advise.
@thejuanes90 · 1 year ago
Hello, thanks for the video, it was very clear. However, is there any way to configure and define the number of rows we want to scan from the tables in Automatic DLP, so we don't scan the entire table? Thank you.
@jsalsman · 2 years ago
Can you explain how finding, e.g., names, emails, and dates of birth in BigQuery tables older than 48 hours and ending with "_temp" can help prevent data loss, please? I'm not saying it can't, just that the example shown is pretty opaque as to how it would address data loss at all.
@saimabehan217 · 2 years ago
I think this video is about the data discovery stage. What to do with the sensitive data may be covered in the following videos.
@jsalsman · 2 years ago
@saimabehan217 sure, but it's just not clear to me how this kind of discovery is likely to help prevent loss.
@debicabreragoog · 2 years ago
@jsalsman Hi James! I appreciate your feedback. In this case we had some tables that ended in _temp to demonstrate that they were supposed to be 'temporary'. Instead of "_temp" you can imagine a table that ends with "_socialsecurity" or "_sensitive". The 48-hour time range is so that we make sure empty or partially filled tables aren't being profiled, as it can sometimes take a day for them to fill up completely. It all depends on the data your organization stores - names, emails and dates of birth can all be considered PII, but it could also be credit card numbers, social security numbers or home addresses. Check out the next video in the series when it is posted soon to get some specific scenario examples. Hope that helps!
@jsalsman · 2 years ago
@@debicabreragoog thanks! I'm watching the other videos in this series with great interest.
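
The "_temp" suffix and 48-hour condition described in the reply above map to filter and condition fields on a discovery config's BigQuery target. A hedged sketch, assuming the google-cloud-dlp Python types; the regex and duration mirror the example discussed, not the video's exact configuration:

```python
# A hedged sketch of the "_temp" / 48-hour example above, assuming the
# google-cloud-dlp Python types; values are illustrative only.
from google.cloud import dlp_v2
from google.protobuf import duration_pb2

big_query_target = dlp_v2.BigQueryDiscoveryTarget(
    # Only consider tables whose ID ends in "_temp".
    filter=dlp_v2.DiscoveryBigQueryFilter(
        tables=dlp_v2.BigQueryTableCollection(
            include_regexes=dlp_v2.BigQueryRegexes(
                patterns=[dlp_v2.BigQueryRegex(table_id_regex=".*_temp$")]
            )
        )
    ),
    # Hold off profiling until a table is at least 48 hours old, so
    # empty or still-loading tables aren't profiled prematurely.
    conditions=dlp_v2.DiscoveryBigQueryConditions(
        or_conditions=dlp_v2.DiscoveryBigQueryConditions.OrConditions(
            min_age=duration_pb2.Duration(seconds=48 * 60 * 60)
        )
    ),
)
```

A target like this would slot into the targets list of the discovery config sketched under the video description above.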
@JohnDoe-rk7ex · 2 years ago
😍😍