
EI Seminar - Matthew Johnson - JAX: accelerated ML research via composable function transformations 

MIT Embodied Intelligence

Speaker: Matthew Johnson
Title: JAX: accelerated machine learning research via composable function transformations in Python
Abstract: This talk is about JAX, a system for high-performance machine learning research and numerical computing. It offers the familiarity of Python+NumPy together with hardware acceleration. JAX combines these features with user-wielded function transformations, including automatic differentiation, automatic vectorized batching, end-to-end compilation (via XLA), parallelizing over multiple accelerators, and more. Composing these transformations is the key to JAX's power and simplicity.
Bio: Matt Johnson is a research scientist at Google Brain interested in software systems powering machine learning research. He's the tech lead for JAX. When moonlighting as a machine learning researcher, he works on making neural ODEs faster to solve, automatically exploiting conjugacy in probabilistic programs, and composing graphical models with neural networks. Matt was a postdoc with Ryan Adams at the Harvard Intelligent Probabilistic Systems Group and Bob Datta in the Datta Lab at the Harvard Medical School. His Ph.D. is from MIT in EECS, where he worked with Alan Willsky at LIDS on Bayesian time series models and scalable inference.
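The abstract's central claim is that composing user-wielded function transformations is the key to JAX's power. A minimal sketch of that composition (illustrative only; the loss function, shapes, and values are made up, not from the talk):

```python
import jax
import jax.numpy as jnp

# A scalar loss for a single example: least squares on one data point.
def loss(w, x, y):
    return (jnp.dot(w, x) - y) ** 2

# Compose the transformations the abstract lists:
#   grad -> differentiate w.r.t. the weights (first argument)
#   vmap -> automatically vectorize over a batch of (x, y) pairs
#   jit  -> compile the whole pipeline end-to-end via XLA
per_example_grads = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0, 0)))

w = jnp.array([1.0, 2.0])
xs = jnp.ones((4, 2))   # batch of 4 inputs
ys = jnp.zeros((4,))    # batch of 4 targets
g = per_example_grads(w, xs, ys)
print(g.shape)          # one gradient per example: (4, 2)
```

Because each transformation takes a function and returns a function, they nest in any order; here the result is compiled per-example gradients, which would be awkward to express in a framework without composable transforms.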

Published: Aug 29, 2024

Comments: 4

@keeperofthelight9681 · 2 years ago
How do you do convolutional LSTMs and other things in JAX? More tutorials please, sir Matthew Johnson.
@kshitijshekhar1144 · 2 years ago
Flax is a high-level neural-network library built on top of JAX; check out its documentation. It's a very new library, built for flexibility, and you can make a mark on it by submitting PRs.
@AngeloKrs878 · 1 year ago
1:07 subtitles "my experience with drugs couldn't be better"
@dbp_patel_1994 · 1 year ago
😂
Up next:

Simon Pressler: Getting started with JAX (29:49)
NeurIPS 2020: JAX Ecosystem Meetup (1:02:15)
JAX Talk: Generating Extremely Long Sequences with S4 (1:19:37)
Magical NumPy with JAX - 01 Loopless Loops (6:11)