
JAX: accelerated machine learning research via composable function transformations in Python 

ACM SIGPLAN

JAX is a system for high-performance machine learning research and numerical computing. It offers the familiarity of Python+NumPy together with hardware acceleration, and it enables the definition and composition of user-wielded function transformations useful for machine learning programs. These transformations include automatic differentiation, automatic batching, end-to-end compilation (via XLA), parallelizing over multiple accelerators, and more. Composing these transformations is the key to JAX’s power and simplicity.
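The composition described above can be sketched in a few lines. This is a minimal illustrative example (not code from the talk): `grad` provides automatic differentiation, `vmap` provides automatic batching, and `jit` compiles the result end-to-end via XLA; the loss function and array shapes here are assumptions chosen for the demo.

```python
# Minimal sketch of composing JAX transformations (not from the talk).
import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy scalar loss for a single example: squared linear response.
    return jnp.sum((x @ w) ** 2)

# Compose transformations: per-example gradient (grad), batched over
# the data axis (vmap), compiled end-to-end via XLA (jit).
grad_fn = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))

w = jnp.ones((3,))
xs = jnp.arange(6.0).reshape(2, 3)   # batch of 2 examples
grads = grad_fn(w, xs)               # one gradient per example, shape (2, 3)
```

Because each transformation returns an ordinary Python function, they nest in any order, which is the composability the talk emphasizes.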
JAX had its initial open-source release in December 2018 (github.com/goo...). It’s used by researchers for a wide range of advanced applications, from studying training dynamics of neural networks, to probabilistic programming, to scientific applications in physics and biology.
Presented by Matthew Johnson

Published: Nov 18, 2020
