
Intro to JAX: Accelerating Machine Learning research 

TensorFlow
Subscribe · 602K subscribers
1.8M views

Published: Aug 29, 2024

Comments: 62
@domenicovalles2498 · 2 years ago
This guy is so epic. He looks like he's enjoying every second of life.
@iva1389 · 2 years ago
NumPy on steroids
@EnricoRos · 2 years ago
This video maximizes dInsights/dtime, is well written and easy to understand! I want to see more videos from Jake!
@SinDarSoup · 2 years ago
JAke X
@oncedidactic · 2 years ago
EduTube needs a like button for specifically this metric 🤜🤛
@nrrgrdn · 2 years ago
It maximizes Insights/time, not the derivative
@ilyboc · 2 years ago
@@nrrgrdn Yeah, maybe that's better, but I think he means you gain continuously more insights as you advance through the video.
@emiljanQ3 · 2 years ago
Looks great! I tend to default to NumPy when I want to do something that is not fully supported in Keras or PyTorch, and if I can get parallelization on GPU this easily, that is perfect!
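A minimal sketch of the workflow this comment describes: `jax.numpy` mirrors the familiar NumPy API, and the same code runs unchanged on CPU, GPU, or TPU; the function below is purely illustrative.

```python
import jax.numpy as jnp
from jax import jit

@jit  # compile the whole function with XLA for the available accelerator
def normalize(x):
    # standardize to zero mean and unit variance
    return (x - x.mean()) / x.std()

x = jnp.arange(6.0)
print(normalize(x))  # mean is approximately 0 after normalization
```

The only change from plain NumPy code is the import and the `@jit` decorator; the array math is written exactly as it would be with `numpy`.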
@OtRatsaphong · 2 years ago
Thank you for this good intro to JAX. Very easy to follow and understand, Jake. Definitely going to add this to my toolkit. 👍🙏
@pablo_brianese · 2 years ago
I burst out laughing at the ExpressoMaker that overloads the + operator.
@karansarkar1710 · 2 years ago
This sounds very good, especially the grad and vmap functionality. I think more libraries would have to be released for it to compete with PyTorch.
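For readers who haven't seen the two transformations mentioned above, here is a small self-contained sketch; the function `f` is just an arbitrary example.

```python
import jax.numpy as jnp
from jax import grad, vmap

def f(x):
    return x ** 2 + 3.0 * x

df = grad(f)           # derivative of f: 2x + 3
batched_df = vmap(df)  # vectorize df over a batch of inputs

print(df(2.0))                                  # 7.0
print(batched_df(jnp.array([0.0, 1.0, 2.0])))   # [3. 5. 7.]
```

Both `grad` and `vmap` are composable function transformations, so `vmap(grad(f))` is an ordinary JAX function that can itself be jit-compiled.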
@valshaev1145 · 1 year ago
Thanks! This helps a lot! Being a C/C++/Python developer, I had somehow left behind such an important framework/library.
@lacasadeacero · 2 years ago
I have a question: what's the purpose of making so many frameworks? Time? Efficiency? Because I don't see it.
@subipan4593 · 2 years ago
JAX seems to be more similar to PyTorch, i.e., a dynamic graph instead of a static graph as in TensorFlow.
@bender2752 · 2 years ago
There's something called AutoGraph in TensorFlow actually
@geekye · 2 years ago
That's Flax. Jax is more like the backbone of that
@Shikalegend · 2 years ago
This looks like a problem that could be easily solved by a language that supports multi-stage programming, with meta-programming as a first-class citizen, which is not really the case with Python. For example Rust, or Elixir via the Nx library, which is directly inspired by JAX.
@TohaBgood2 · 2 years ago
Ok, this is seriously cool. Is this brand new? I haven't seen it before. Also, in the first code sample, did you mean to import vmap and pmap instead of map, or is that some kind of namespace black magic I don't understand?
@enricoshippole2409 · 2 years ago
It has been around for over 2 years now, I believe.
@linminhtoo · 2 years ago
Yeah, it's a typo; there's no magic.
@joshuasmith2450 · 2 years ago
How can you compare Torch to TF/JAX when they run on different GPUs? There is no way to argue the two GPUs are comparable; they will be faster or slower at different types of computation regardless of the software used. They should have compared all three on a common GPU if for some reason Torch couldn't be run on the TPU v3.
@gaborenyedi637 · 2 years ago
Why do you need a new library? TensorFlow can do 90+% of this, can't it? Is it a good idea to make a completely new thing instead of extending the old one? One more question: do/will you have Keras support?
@CharlesMacKay88 · 7 months ago
2:14 Why is inputs reassigned in the predict function but never used? Should it be outputs = np.tanh(outputs)?
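A hedged reconstruction of the pattern being asked about (the exact code from the video is not reproduced here): in the usual JAX multilayer-perceptron example, the reassigned `inputs` is consumed by the *next* loop iteration, so it is not dead code; only the final layer's `outputs` is returned without the activation.

```python
import jax.numpy as jnp

def predict(params, inputs):
    # params: list of (W, b) pairs, one per layer (names are illustrative)
    for W, b in params:
        outputs = jnp.dot(inputs, W) + b
        inputs = jnp.tanh(outputs)  # feeds the next layer, so it IS used
    return outputs                  # last layer is returned pre-activation
```

On the last iteration the `inputs = jnp.tanh(outputs)` assignment is indeed discarded, which is likely what prompted the question; for every earlier layer it carries the activation forward.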
@sitrakaforler8696 · 1 year ago
Great content! BRAVO and THANKS!
@L4rsTrysToMakeTut · 2 years ago
Why not use the Julia language?
@AlphaMoury · 2 years ago
I thought JAX was running as the default in TensorFlow; am I missing something here?
@brandomiranda6703 · 2 years ago
What is the difference between numerical and automatic differentiation?
@amitxi-y5q · 2 years ago
Numerical differentiation computes f’(x) by evaluating the function around x: (f(x+h)-f(x-h))/2h with a small h. Automatic differentiation represents the function expression or code as a computational graph. It looks at the actual code of the function. The final derivative is obtained by propagating the value of local derivatives of simple expressions through the graph via the chain rule. The simple expressions are functions like +, -, cos(x), exp(x) for which we know the derivatives at a given x.
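The distinction above can be made concrete with a short comparison; `f` here is an arbitrary example function, not one from the video.

```python
import jax.numpy as jnp
from jax import grad

def f(x):
    return jnp.exp(x) * jnp.cos(x)

def numerical_grad(f, x, h=1e-3):
    # central difference: samples f near x, accurate only to O(h^2)
    return (f(x + h) - f(x - h)) / (2.0 * h)

auto = grad(f)(1.0)           # automatic: chain rule through the graph
num = numerical_grad(f, 1.0)  # numerical: finite differences
print(auto, num)              # close, but only autodiff is exact
```

Analytically f'(x) = e^x (cos x - sin x), so both values should be near -0.8187 at x = 1; the numerical one carries truncation and round-off error, while autodiff is exact up to floating-point precision.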
@srinivastadepalli9431 · 2 years ago
Awesome intro!
@HibeePin · 2 years ago
Active: Jax enters Evasion, a defensive stance, for up to 2 seconds, causing all basic attacks against him to miss.
@dl8083 · 2 years ago
I knew this was going to come up lol
@markoshivapavlovic4976 · 2 years ago
Nice talk; that will be interesting.
@nightwingphd8580 · 2 years ago
This is wild!
@AJ-et3vf · 2 years ago
Awesome video! Thank you!
@bicarrio · 2 years ago
It says "from jax import map", but it seems it should be vmap?
@boffo25 · 2 years ago
from jax import map as vmap
@RH-mk3rp · 1 year ago
Something's wrong with the audio. His voice gets so soft it's hard to hear at the ends of some sentences.
@hfkssadfrew · 2 years ago
It seems TensorFlow is fast enough?
@marcosanguineti2710 · 2 years ago
Really interesting!
@captainlennyjapan27 · 2 years ago
Top Jax OP
@brandomiranda6703 · 2 years ago
Does this support Apple's GPUs in the M1 Max?
@toastrecon · 2 years ago
I also wonder if they utilize the neural processors?
@sashanktalakola · 2 months ago
1:14 lol they compared TPU runtimes with GPU runtimes
@rickhackro · 2 years ago
Amazing!
@eddisonlewis8099 · 9 months ago
Interesting Stuff
@matthewpublikum3114 · 2 years ago
Is this much better than SIMD?
@markoshivapavlovic4976 · 2 years ago
Nice framework.
@satwikram2479 · 2 years ago
Amazing👏
@brandomiranda6703 · 2 years ago
I don't get it. Why do we need this if PyTorch and Keras/TF already exist?
@simonb.979 · 2 years ago
I mean, it is kinda niche, but suppose you solve a problem that heavily relies on many custom functions, e.g., a very specific algebra like quaternion operations. Then you can write super-fast basic operations and compose them into a complicated loss function that, as a whole, you can jit-compile and have optimized. Or differentiate it, or vectorize it, all with a tiny decorator.
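A sketch of that workflow, using a quaternion Hamilton product as the custom op; everything below is illustrative, not code from the video.

```python
import jax.numpy as jnp
from jax import jit, grad

def qmul(q, r):
    # Hamilton product of quaternions stored as (w, x, y, z)
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return jnp.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

@jit  # the composed loss is compiled and optimized as a single unit
def loss(q, target):
    return jnp.sum((qmul(q, q) - target) ** 2)

dloss = grad(loss)  # gradient w.r.t. q, straight through the custom op
```

Because `jit` and `grad` apply to the whole composition, the custom algebra never needs hand-written kernels or hand-derived gradients.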
@tclf90 · 2 years ago
Torch and Keras are "slow" and only meant for the development phase; not sure by how much JAX can outperform them. Edit: "slow" as in computation/inference time.
@MrAmgadHasan · 1 year ago
@@tclf90 So which frameworks are "fast"?
@kuretaxyz · 2 years ago
Seeing JAX on the TensorFlow channel, now I'm scared they'll mess up this codebase too. Please don't, k thx.
@mominabbas125 · 2 years ago
Wow! 🏋️
@harryali4601 · 2 years ago
Is it just me, or does the backend technology of JAX sound very similar to the one in TensorFlow?
@RoyRogersMusicShop · 1 year ago
Google's Bard sent me here. Anyone know why?
@chrisioannidis2295 · 2 years ago
Imagine if it had a real weapon
@AnimeshSharma1977 · 2 years ago
My Call Jax Son #AI ?
@HealthZo · 6 months ago
😮😮😮😮 0:28
@jakewong6305 · 2 years ago
JAX came out because of Torch.
@example.com. · 2 years ago
I'm quitting numpy
@millco-.- · 2 years ago
It's tiresome.