
Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport | Alex Tong 

Valence Labs

Valence Labs is a research engine within Recursion committed to advancing the frontier of AI in drug discovery. Learn more about our open roles: www.valencelabs.com/careers
Join the Learning on Graphs and Geometry Reading Group on Slack: join.slack.com/t/logag/shared...
Abstract: Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have been held back by limitations in their simulation-based maximum likelihood training. We introduce the generalized conditional flow matching (CFM) technique, a family of simulation-free training objectives for CNFs. CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models. In contrast to both diffusion models and prior CNF training algorithms, CFM does not require the source distribution to be Gaussian or require evaluation of its density. A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference, as evaluated in our experiments. Furthermore, OT-CFM is the first method to compute dynamic OT in a simulation-free way. Training CNFs with CFM improves results on a variety of conditional and unconditional generation tasks, such as inferring single cell dynamics, unsupervised image translation, and Schrödinger bridge inference.
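To make the abstract's objective concrete, here is a minimal PyTorch sketch of OT-CFM, assuming the straight-line conditional path x_t = (1 - t) x0 + t x1 with target velocity u_t = x1 - x0 and an exact minibatch OT pairing; the names (ot_cfm_loss, v_theta, minibatch_ot_pairing) and the toy setup are illustrative, not the authors' released code.

```python
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment


def minibatch_ot_pairing(x0, x1):
    # Re-pair source/target samples with an exact OT assignment on the
    # squared Euclidean cost, computed within the minibatch only.
    cost = torch.cdist(x0, x1) ** 2
    row, col = linear_sum_assignment(cost.cpu().numpy())
    return x0[torch.as_tensor(row)], x1[torch.as_tensor(col)]


def ot_cfm_loss(v_theta, x0, x1):
    # Simulation-free regression: push v_theta(x_t, t) toward the
    # conditional target velocity u_t = x1 - x0 along straight-line paths.
    x0, x1 = minibatch_ot_pairing(x0, x1)
    t = torch.rand(x0.shape[0], 1)            # t ~ Uniform[0, 1]
    xt = (1 - t) * x0 + t * x1                # point on the conditional path
    ut = x1 - x0                              # target conditional velocity
    vt = v_theta(torch.cat([xt, t], dim=-1))  # predicted velocity
    return ((vt - ut) ** 2).mean()


# Toy usage: a 2-D non-Gaussian source flowing to a shifted Gaussian target.
v_theta = nn.Sequential(nn.Linear(3, 64), nn.SiLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(v_theta.parameters(), lr=1e-3)
for _ in range(100):
    x0 = torch.rand(256, 2)          # source need not be Gaussian
    x1 = torch.randn(256, 2) + 4.0   # target samples
    loss = ot_cfm_loss(v_theta, x0, x1)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Dropping the pairing step recovers the plain independent-coupling CFM objective; the minibatch OT coupling is what straightens the learned flow, which the abstract credits for more stable training and faster inference.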
Speaker: Alexander Tong - www.alextong.net/
Twitter Hannes: @hannesstaerk
Twitter Dominique: @dom_beaini
~
Chapters
00:00 - Intro
02:03 - Background on diffusion + flow models
10:31 - Why do diffusion models beat CNFs?
11:42 - Main idea: how can we train a CNF like a diffusion model?
18:25 - Flow matching
32:38 - Conditional flow matching
38:18 - Properties of flow depend on the choice of the probability path
47:40 - Score and flow matching
57:34 - Main takeaways
58:32 - Q+A

Science

Published: 4 Jul 2024

Comments: 5
@RichTong1 · 3 months ago
This is not easy to understand, but worth thinking hard about!
@stathius · 8 months ago
Awesome, thank you for sharing!
@caiodaumann6728 · 1 month ago
One question I have: are these flows monotonically increasing? The usual "block" flows have this nice property, but do these continuous flows trained with flow matching also have it in the transformation from base to data?
@adrienbufort795 · 2 months ago
Amazing work! I wonder how to extend this kind of generative model to categorical variables.
@chengc03 · 4 months ago
Hard to understand