
Day 1 Talks: JAX, Flax & Transformers 🤗 

HuggingFace
61K subscribers
14K views

0:00:00 Skye Wanderman-Milne (Google Brain): Intro to JAX on Cloud TPUs
0:42:49 Marc van Zee (Google Brain): Introduction to Flax
1:28:26 Pablo Castro (Google Brain): Using Jax & Flax for RL with the Dopamine library
Find more information about the speakers and the talks here: github.com/huggingface/transf...

Science

Published: 17 Jul 2024

Comments: 10
@TheAIEpiphany
@TheAIEpiphany 2 years ago
Great talk! It'd be nice to highlight the pros/cons of Flax versus Haiku, since both were built with the same goal in mind (to the best of my knowledge).
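For readers weighing the two, here is a minimal side-by-side sketch (not from the talk; the layer sizes are arbitrary) of the same tiny MLP in Flax Linen and in Haiku. Both end up as a pure init/apply pair over an explicit parameter pytree; the main surface difference is class-based modules versus hk.transform over a plain function.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import haiku as hk

# Flax Linen: modules are dataclasses; parameters live in an explicit pytree.
class FlaxMLP(nn.Module):
    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(64)(x))
        return nn.Dense(1)(x)

x = jnp.ones((4, 8))
flax_model = FlaxMLP()
flax_params = flax_model.init(jax.random.PRNGKey(0), x)  # parameter pytree
flax_out = flax_model.apply(flax_params, x)

# Haiku: an ordinary function using hk.Module layers, wrapped by hk.transform
# into a pure (init, apply) pair.
def haiku_mlp(x):
    x = jax.nn.relu(hk.Linear(64)(x))
    return hk.Linear(1)(x)

haiku_model = hk.without_apply_rng(hk.transform(haiku_mlp))
haiku_params = haiku_model.init(jax.random.PRNGKey(0), x)
haiku_out = haiku_model.apply(haiku_params, x)
```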
@miladaghajohari2308
@miladaghajohari2308 2 years ago
Great talk. I enjoyed Marc's explanation of the Flax philosophy; as a Flax user, I now understand the intuition better. For anyone trying to understand Flax, it's also worth checking the slides that are made available somewhere on the website. I would still prefer a comprehensive intro to Flax somewhere, though; it feels like I had to gather bits of information from multiple sources to build an understanding. Also, the Dopamine library seems pretty cool. I'm trying to code up some RL agents and wondering where to start, and I guess it may help me there. I'm only experimenting with easy games, though, and I don't know whether I need the advanced tricks to handle them.
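As a rough illustration of the philosophy discussed in the talk, the sketch below shows the explicit-state pattern Flax leans on: parameters and optimizer state are plain values threaded through pure functions. The toy linear model and Optax optimizer here are placeholders, not code from the talk.

```python
import jax
import jax.numpy as jnp
import optax

# Toy "model": params are just a pytree; the forward pass is a pure function.
def predict(params, x):
    return x @ params["w"] + params["b"]

def loss_fn(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

optimizer = optax.adam(1e-3)

@jax.jit
def train_step(params, opt_state, x, y):
    # No hidden state: gradients, updates, and new params are all explicit values.
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

params = {"w": jnp.zeros((8, 1)), "b": jnp.zeros((1,))}
opt_state = optimizer.init(params)
x, y = jnp.ones((4, 8)), jnp.zeros((4, 1))
params, opt_state, loss = train_step(params, opt_state, x, y)
```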
@medabottpro9124
@medabottpro9124 3 years ago
The functional and composition API is what really pulls me towards the JAX/Flax ecosystem.
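A small sketch of what that composition looks like in practice: jit, grad, and vmap are ordinary higher-order functions that stack. The loss function and shapes below are made up for illustration.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

# Transformations compose like ordinary functions:
# vmap over a batch of weight vectors, grad w.r.t. w, jit-compile the result.
batched_grad = jax.jit(jax.vmap(jax.grad(loss), in_axes=(0, None, None)))

ws = jnp.zeros((10, 8))         # 10 candidate weight vectors
x = jnp.ones((32, 8))
y = jnp.zeros((32,))
grads = batched_grad(ws, x, y)  # shape (10, 8): one gradient per candidate
```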
@shadowkiller0071
@shadowkiller0071 3 years ago
When I started with JAX coming from PyTorch, it just seemed really confusing, but the more I learn about it, the more interesting and clean it seems.
@0730pleomax
@0730pleomax 3 years ago
Any tutorials recommended?
@shadowkiller0071
@shadowkiller0071 3 years ago
@0730pleomax I've been going off the docs, this series, and the Linen examples the second speaker mentions.
@geekye
@geekye 2 years ago
Interested to hear how it's been and whether you've been using it at work or for research projects. How easy is it compared to PyTorch? I'm currently having a very hard time parallelizing a Hugging Face model in PyTorch.
@shadowkiller0071
@shadowkiller0071 2 years ago
@geekye I'll be dead honest: I eventually gave up and just went back to PyTorch, LOL. We're using DeepSpeed and training on a GPU cluster now. JAX is only worth it if you can get a TRC trial and train on the v3-8s, or generally speaking if you have access to TPUs. FWIW, if you email them and say you want to use TRC to do something with JAX, they will be VERY lenient about giving you extra days and things like that. Also, what kind of parallelization are you trying to do? DeepSpeed takes care of multi-GPU for most things.
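For context on what multi-device training looks like on the JAX side (independent of DeepSpeed), here is a hedged sketch of the classic pmap data-parallel pattern: replicate parameters, shard the batch across devices, and average gradients with pmean. The toy loss is a placeholder; real Hugging Face Flax examples wrap this pattern in full training scripts.

```python
import jax
import jax.numpy as jnp

n_dev = jax.local_device_count()  # e.g. 8 cores on a TPU v3-8

def loss_fn(params, x, y):
    return jnp.mean((x @ params - y) ** 2)

def grad_step(params, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    # Average gradients across devices so every replica applies the same update.
    return jax.lax.pmean(grads, axis_name="devices")

p_grad_step = jax.pmap(grad_step, axis_name="devices")

# Replicate params and shard the batch across devices (leading axis = device axis).
params = jnp.zeros((16,))
r_params = jnp.broadcast_to(params, (n_dev,) + params.shape)
x = jnp.ones((n_dev, 4, 16))   # per-device batch of 4
y = jnp.zeros((n_dev, 4))
grads = p_grad_step(r_params, x, y)  # one (identical) gradient per device
```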
@mortezajanatdoust
@mortezajanatdoust 2 years ago
Hi, there are currently two libraries for RL using JAX, Dopamine and RLax. Might I ask what the difference is? Thanks!
@syedashfaq9429
@syedashfaq9429 1 year ago
RLax is a collection of building blocks you can use to construct agents, while Dopamine provides a monolithic framework with tunable parameters that let you modify the agents' behaviour.
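To make the "building blocks" distinction concrete, the sketch below computes a per-transition Q-learning TD error in plain JAX, the kind of primitive RLax packages as a library function (e.g. its q_learning loss), whereas Dopamine would wire such a loss into a complete, configurable agent. The shapes and names here are illustrative; check the rlax docs for the exact signatures.

```python
import jax
import jax.numpy as jnp

def q_learning_td_error(q_tm1, a_tm1, r_t, discount_t, q_t):
    """TD error for one transition: r_t + gamma * max_a Q(s_t, a) - Q(s_tm1, a_tm1)."""
    target = r_t + discount_t * jnp.max(q_t)
    target = jax.lax.stop_gradient(target)  # don't backprop through the target
    return target - q_tm1[a_tm1]

# Vectorize over a batch of transitions, as an agent's update would.
batched_td = jax.vmap(q_learning_td_error)

q_tm1 = jnp.zeros((32, 4))           # Q-values for 4 actions at s_{t-1}
a_tm1 = jnp.zeros((32,), jnp.int32)  # actions taken
r_t = jnp.ones((32,))
discount_t = jnp.full((32,), 0.99)
q_t = jnp.zeros((32, 4))             # Q-values at s_t
td_errors = batched_td(q_tm1, a_tm1, r_t, discount_t, q_t)  # shape (32,)
```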