
Databricks Asset Bundles: A Standard, Unified Approach to Deploying Data Products on Databricks 

Databricks
In this session, we will introduce Databricks Asset Bundles, demonstrate how they work for a variety of data products, and show how they fit into an overall CI/CD strategy for the well-architected Lakehouse.
Data teams produce a variety of assets: datasets, reports and dashboards, ML models, and business applications. These assets depend upon code (notebooks, repos, queries, pipelines), infrastructure (clusters, SQL warehouses, serverless endpoints), and supporting services/resources like Unity Catalog, Databricks Workflows, and DBSQL dashboards. Today, each organization must work out its own deployment strategy for the data products it builds on Databricks, because there is no consistent way to describe the infrastructure and services associated with project code.
Databricks Asset Bundles is a new capability on Databricks that standardizes and unifies the deployment strategy for all data products developed on the platform. It allows developers to describe the infrastructure and resources of their project through a YAML configuration file, regardless of whether they are producing a report, dashboard, online ML model, or Delta Live Tables pipeline. Behind the scenes, these configuration files use Terraform to manage resources in a Databricks workspace, but knowledge of Terraform is not required to use Databricks Asset Bundles.
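To make the idea concrete, here is a minimal sketch of what such a YAML configuration file (`databricks.yml`) can look like. The bundle name, workspace host, job, and notebook path below are illustrative placeholders, not values from the talk:

```yaml
# databricks.yml — a minimal bundle configuration sketch.
# All names and paths here are hypothetical examples.
bundle:
  name: my_data_product

targets:
  dev:
    mode: development
    workspace:
      host: https://my-workspace.cloud.databricks.com

resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/etl_notebook.py
```

A bundle like this is typically deployed with the Databricks CLI (`databricks bundle deploy -t dev`); as the description notes, Terraform manages the resources behind the scenes, but the author never touches it directly.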
Talk by: Rafi Kurlansik and Pieter Noordhuis
Connect with us: Website: databricks.com
Twitter: / databricks
LinkedIn: / databricks
Instagram: / databricksinc
Facebook: / databricksinc

Science

Published: 24 Jul 2023

Comments: 8
@aviadshimoni175 · 1 year ago
Can we get a link to the repo that was used for the demo? TIA
@tingxie2292 · 1 year ago
14:00 demo start
@PrebenOlsen90 · 5 months ago
Is there a way to extract variables set in the bundle file from within the notebooks? So that we can set 'environment' = 'prod' and do something like spark.conf.get("bundle.variables.environment")?
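[Editor's note] One plausible pattern for this, sketched below. The variable name `env`, the job, and the paths are hypothetical; bundle variables are typically interpolated into the job definition and passed to the notebook as task parameters, rather than read from `spark.conf`:

```yaml
# databricks.yml (fragment) — hypothetical names throughout.
variables:
  env:
    description: Deployment environment
    default: dev

resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/main.py
            base_parameters:
              environment: ${var.env}   # interpolated at deploy time
```

Inside the notebook, the value can then be read as a widget parameter, e.g. `dbutils.widgets.get("environment")`.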
@user-nu3vc1be5p · 10 months ago
Can we get a link to the repo that was used for the demo?
@lostfrequency89 · 1 month ago
For notebooks, should we still integrate GitHub, or can we use DABs for that? I'm kinda confused
@JulPinkie-ns5rs · 3 months ago
cheers
@namanbhayani1016 · 7 months ago
So now what will happen to DBX? This looks like a replacement for it.
@villetakoo · 5 months ago
DBX is not officially supported by Databricks; it is a Labs project. If Asset Bundles end up covering all the use cases DBX has been used for, I predict developers will migrate to them, contributions to DBX will fade over time, and it may eventually be deprecated.