
What is a Jacobian-Vector product (jvp) in JAX? 

Machine Learning & Simulation
23K subscribers
8K views

Published: 13 Sep 2024

Comments: 6
@raayandhar6195 · 1 month ago
Man, your videos are so good and clear 👍
@MachineLearningSimulation · 18 days ago
Appreciate it, thanks 🙏
@porschepanamera92 · 2 years ago
Would you be able to show some of the concepts from the last videos (weak-form derivations, adjoint sensitivity analysis, automatic differentiation, etc.) in a practical application using FEM? It's super interesting, but from my personal structural-optimization perspective/bubble I still have to figure out how I could implement this. (I'm living in the Matlab world for now.) At this moment I'm using a simplified, self-coded FEA solver without the need for the Jacobian in that context (square elements, linear analysis). Neither am I explicitly using the residual/loss terminology. So it would be nice to see how it all fits together in one simulation/optimization to see some parallels. I wish to move to Python or C++ in the future for scalability. Thanks!
@MachineLearningSimulation · 2 years ago
That's a great point. It's also the long-term goal of the channel to show how one can use machine learning in the context of scientific computing. So far the two topics have been treated more or less detached from each other, since I first wanted to create a solid intro to some techniques. So I will definitely make such videos in the mid to far future. Stay tuned 😉
@TimurIshuov · 2 months ago
@MachineLearningSimulation Thank you!
@Nerdimo · 9 months ago
Would you mind sharing a cohesive explanation of what the tangent vector represents? I understand that in reverse-mode AD, the vector we use to compute the VJP is usually the derivative of a loss function w.r.t. the outputs of a model (with that model being represented as one function). But in forward-mode AD, what are the tangents? Are they just a nudge in the inputs? I'm confused here.
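For the question above: yes, in forward mode the tangent is exactly a "nudge" direction in the input space, and `jax.jvp` pushes that nudge through the function to give the matching first-order change in the outputs. A minimal sketch (the function `f` and the chosen point/direction are just illustrative assumptions, not from the video):

```python
import jax
import jax.numpy as jnp

# An arbitrary example function f: R^2 -> R^2
def f(x):
    return jnp.array([x[0] ** 2 * x[1], 5.0 * jnp.sin(x[1])])

x = jnp.array([1.0, 2.0])  # primal point where we evaluate f
v = jnp.array([1.0, 0.0])  # tangent: the direction we "nudge" the input in

# jax.jvp evaluates f(x) and the Jacobian-vector product J_f(x) @ v
# in a single forward pass, without ever building the full Jacobian.
primal_out, tangent_out = jax.jvp(f, (x,), (v,))

# Sanity check against the explicitly materialized Jacobian:
J = jax.jacfwd(f)(x)
print(tangent_out)  # first-order change of f along v
print(J @ v)        # same numbers, computed the expensive way
```

So `tangent_out` answers: "if I nudge the input by an infinitesimal step in direction `v`, in which direction (and how fast) does the output move?" This is the dual of reverse mode, where the cotangent instead weights the outputs.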