
Neural Differential Equations 

Siraj Raval
769K subscribers
136K views

This won the best paper award at NeurIPS (the biggest AI conference of the year) out of over 4800 other research papers! Neural Ordinary Differential Equations is the official name of the paper, and in it the authors introduce a new type of neural network. This new network doesn't have any layers! It's framed as a differential equation, which allows us to use differential equation solvers to approximate the underlying function of time series data. It's very cool and will ultimately allow us to learn from irregular time series datasets more efficiently, which applies to many different industries. I'll cover all the prerequisites in this video and point to helpful resources down below. Enjoy!
Code for this video:
github.com/llSourcell/Neural_...
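As a quick, self-contained illustration of the core idea (a toy sketch, not the code linked above; the tiny network, sizes, and step counts here are made up), the snippet below shows that a ResNet update h ← h + f(h) is exactly one Euler step of the ODE dh/dt = f(h, t), which is why a generic ODE solver can stand in for a stack of layers:

```python
import numpy as np

# Tiny "dynamics" network f(h, t): one hidden layer with fixed random weights.
# In a real Neural ODE this f is what gets trained; here it is only illustrative.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 4)), np.zeros(4)

def f(h, t):
    """dh/dt = f(h, t): the learned 'layer' of a Neural ODE."""
    return np.tanh(h @ W1 + b1) @ W2 + b2

def resnet_forward(h, n_blocks=10):
    """Residual network: each block adds its output to its input."""
    for _ in range(n_blocks):
        h = h + f(h, t=None)           # h_{k+1} = h_k + f(h_k)
    return h

def odenet_forward(h, t0=0.0, t1=10.0, n_steps=10):
    """The same computation read as Euler integration of dh/dt = f(h, t)."""
    dt = (t1 - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        h = h + dt * f(h, t)           # one Euler step; a better solver could go here
        t = t + dt
    return h

h0 = rng.normal(size=4)                # "input features"
print(resnet_forward(h0.copy()))       # with dt = 1 these two prints are identical
print(odenet_forward(h0.copy()))
```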
Please Subscribe! And Like. And comment. That's what keeps me going.
Want more education? Connect with me here:
Twitter: / sirajraval
Instagram: / sirajraval
Facebook: / sirajology
More learning resources:
• Backpropagation in 5 M...
• Build a Neural Net in ...
• The essence of calculus
towardsdatascience.com/paper-...
arxiv.org/abs/1806.07366
blog.acolyer.org/2019/01/09/n...
rkevingibson.github.io/blog/n...
Join us at the School of AI:
theschool.ai/
Join us in the Wizards Slack channel:
wizards.herokuapp.com/
Please support me on Patreon:
www.patreon.com/user?u=3191693
Signup for my newsletter for exciting updates in the field of AI:
goo.gl/FZzJ5w
#NeuralDifferentialEquations #SchoolOfAI #SirajRaval
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Join my AI community: chatgptschool.io/
Sign up for my AI Sports betting Bot, WagerGPT! (500 spots available):
www.wagergpt.co

Published: 16 Jul 2024

Comments: 391
@flyingzipper 5 years ago
1 - Basic neural network theory | 8:30
2 - "Residual" neural network theory | 12:40
3 - Ordinary Differential Equations (ODEs) | 17:00
4 - ODE Networks | 22:20
5 - Euler's Method to Optimize an ODENet | 27:45
6 - Adjoint Method for ODENet Optimization | 29:15
7 - ODENets Applied to Time Series Data | 30:50
8 - Future Applications of ODENets | 33:41
@buenaventuralosgrandes9266 5 years ago
Thanks broo
@flyingzipper 5 years ago
Np !
@valken666 5 years ago
Only PyTorch implementation as of now? rtqichen's torchdiffeq GitHub.
@FaizalSyed 5 years ago
I have faith in humanity because of people like you 👏🙏
@praveenb9048 5 years ago
Is there a popular term that YouTube people use for a list of video bookmarks like this one?
@thoyo 5 years ago
I love how you're always excited about what you're talking about. It's infectious.
@Rednas34 5 years ago
Your videos are a continuous stream of super high quality learnings about new computing mechanisms! Thank you!
@trycryptos1243 5 years ago
Siraj... please tell me that you have travelled back in time to help us catch up with the future. I am just flabbergasted by the volume & intensity you handle! I have no words to comment, just a dropped jaw in pure awe!!!😘
@theaichannel242 5 years ago
Really interesting research, AI is moving so fast right now. There are so many doors about to be opened: modelling more complicated functions while still keeping the memory tied in. Amazing stuff, your videos are first class!
@vman049 5 years ago
I regularly watch Siraj’s videos and this is one of the best I’ve seen... got my adrenaline pumping when I saw that list of topics to be covered at 8:30!
@ozzn.t.8050 5 years ago
Please keep posting such videos for new interesting papers. It feels like something is hiding right under our noses in the math, and we just need to notice it to completely solve AI in an unexpectedly simpler way. Delicious thing to watch. WTG.
@saratbhargavachinni5544 5 years ago
About a week back I started working as a teaching assistant for an undergrad differential equations course. While reading the text I realized I had learnt all this theory myself in my freshman year but very rarely used differential equations after the course, and I wondered if I could use them in machine learning (my area of interest). I am really excited after watching your video.
@GReddy567 5 years ago
Awesome, Thanks Siraj! The physics community is going to love this! Looking forward to you making more videos on this when this research expands!
@sashas5390 2 years ago
The "input times weight, add a bias, activate" song is brilliant and should be used in elementary schools.
@36squared 5 years ago
Excellent video. It may be self-evident, but it's important to conceptualize these improvements from both a mathematical and a programming understanding. You tackled a tough concept beautifully!!! Good job, mate.
@John-bb5ty 5 years ago
Last night when I was going to sleep I had a great idea for a self-evolving non-parametric neural-network. I was wondering for the longest time how I can get the integral of a function of the learning rate with multiple variables. Today I saw this, thank you.
@carlsilverberg8700 4 months ago
You've gotten way better than the last time I checked you out. That was 4 years ago, lol, so I guess that's just normal. But great, man! Loved it! Absolutely amazing content.
@motog9464 5 years ago
I am feeling more happy and proud now for having learnt Mathematics as my favourite subject. Another interesting reason to explore AI more and more... Thanks, Siraj :)
@notjustwarwick4432 5 years ago
I agree, I'm studying maths at university and it is awesome to see differential equations pop up in AI.
@65343739 5 years ago
Ramesh is that you?
@einemailadressenbesitzerei8816
Wow 🤯. Thx for the introduction to this fresh new and ultra interesting topic. I love analogies that can be used to get useful ideas or methods in another area. All the more I am fascinated to see this done in such an impressive way. I didn't know about ResNets before, and who would have thought that this would lead to finding the analogy that the Euler method can be used to approximate the hidden layers numerically.
@irisgu8890 5 years ago
Thank you! I watched many videos on ODE with ResNet and yours is the best!!!
@mlguy8376 5 years ago
This could be interesting for me as someone who spent many years during his PhD looking at nonlinear ODEs. Now, as an ML guy, it would be great to relate this back to my original work. One caveat I was not clear on: there are stability conditions for ODEs, and it was not clear in the paper how they treat this.
@elciohumphreys2596 5 years ago
Thank you for the video. One thing I find a bit frustrating is when you try to solve a differential equation and you don't have any initial value for the function, because it actually results in a family of functions, not just one. Watching this video I just realized you already have those initial values: they are simply the data you use to train the network!
@CrimsonTheOriginal 5 years ago
Thank you Siraj, I've been reading over this paper for the last two weeks, seeing how I can use it for my Forex predictions.
@arnau7915 5 years ago
I'm only half way through the video and I can already tell this is my favorite one of 2019, and possibly my favorite research paper ever! Thanks, Siraj!
@malolan98 5 years ago
Hey, Siraj! Please make a video on Spiking Neural Networks!
@Bbb78651 1 year ago
This made me fall in love with AI and ML again. Thank you so much. I was going through a slump, but when watching this I couldn't stop smiling throughout the entire video.
@carosare6700 5 years ago
Ohh come on! I needed this for my differential equations project last semester :/ Such an interesting topic!
@MarkPederson 5 years ago
+Siraj Raval I tried (and failed) to implement ODE nets on a GNN just before the end of the year. It was difficult not only because of the data source structure (ML in graph DBs is still in its infancy) but also due to the relative dearth of info on this technique. Your explanations were helpful and (maybe even more important) your enthusiasm inspired me to go back and tackle it again; I'd forgotten why ODEnets are so appealing in the first place. Thank you!
@yasinilulea 5 years ago
This is awesome, you're killing it mate!
@hyunsunggo855 5 years ago
I just wanna keep staring at the evolving convolutional layer output with this one. Must be fun! :)
@Cleon7177 5 years ago
Awesome breakdown of very involved topics, Siraj. Keep it up!
@OnlyGoodJawn 5 years ago
Siraj dropped the most fire freestyle of 2019 in this video.
@SirajRaval 5 years ago
@@marketsmoto3180 wait 10 hours for my next video
@OnlyGoodJawn 5 years ago
Siraj Raval I can't eat or sleep until I get these new bars, Siraj!
@1wisestein 5 years ago
Thanks Siraj, you're doing a great job!
@earthbjornnahkaimurrao9542 5 years ago
This looks more and more to me like consciousness is simply a sophisticated set of mathematical operations. This neural network architecture is able to optimize its own structure, like how many layers it has, in order to best solve a given problem. The set of equations looks a lot like the equations used in optimal control theory, where an observed state is compared to a desired state to give an error state, which is then scaled by a multiplier and fed back into the system so as to move the system one order of magnitude closer to the desired state.
@pranavsreedhar1402 5 years ago
Thank you Siraj for putting in the effort to include a much larger, broader audience. Everyone benefits from this.
@kfique 5 years ago
Great video Siraj! Thanks and keep up the great work!!
@asharkhan6714 5 years ago
I like this style of video where you talk freely, just like your livestreams.
@Avivalious 5 years ago
Very interesting~ The way you illustrate the maths (derivative, integral, partial derivative) is intuitive. I will spend time on Euler's method, which I'm still not very clear on. Thank you for uploading such a great introduction which is both profound and intuitive.
@mingc3698 5 years ago
Very interesting! Looking forward to seeing this applied in action with time series data. I still don't understand how this design would help irregular time series data prediction.
@akashthoriya 5 years ago
I'm an artificial intelligence enthusiast, please bring some more videos like this. It'll help a lot!
@waeljaber9284 5 years ago
Thank you for making these videos!
@Madferreiro 5 years ago
Can't thank you enough! Thank you very much man, your channel is the best!
@hamid7011 5 years ago
Thank you so much Siraj, I think you just opened my eyes to my next paper title.
@junkseed 5 years ago
Thanks for this good intro into this topic!
@taranveersinghanttal 5 years ago
The movement of your hands always inspires me ;p
@zzziltoid 5 years ago
Thank you for the attempt; my suggestion is that you should use the time in the video more efficiently. This is a pretty advanced paper, and no one who doesn't know the basics of neural networks or what a differential is will attempt/succeed to understand it.
@jackkensik7002 5 years ago
I have been exploring differential equations and am so happy I found this video, it puts the calculus in a context that is really interesting and applicable!!
@chiminglee8325 5 years ago
I would like to learn more about the code starting at 30:50 though. But I love this video! Thanks for sharing.
@setlonnert 5 years ago
Interesting that more and more abstract concepts are being added to the deep learning mix, which was once more of a bottom-up idea. Besides GANs, which I see as adding the higher-level concept of minimax on top of lower-level neural networks, there are also developments in structuring networks from the point of view of abstract algebra, and now through these ODEs. It's good to get an overview of the developing flow...
@samcoding 6 months ago
When we're predicting timestep t+h, do we just forecast this in one step, or do we subdivide the gap (between t and t+h) into lots of sub-timesteps where the output is evaluated and passed into the algorithm again (almost like autoregression)?
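For what it's worth: with solvers like the ones the paper uses, the times you request only determine which states are returned; the adaptive solver internally subdivides the interval into however many sub-steps its error tolerance requires. A small sketch assuming the torchdiffeq package (the toy dynamics function below is made up and only stands in for the learned network):

```python
import torch
from torchdiffeq import odeint  # pip install torchdiffeq

def f(t, h):
    return -h                   # toy dynamics; in the paper a learned network plays this role

h0 = torch.tensor([1.0])

# Ask the solver only for the endpoint of the jump from t = 0 to t = 2 ...
end_only = odeint(f, h0, torch.tensor([0.0, 2.0]))

# ... or ask it to also report intermediate states along the way. Either way the
# adaptive solver subdivides the interval internally into as many sub-steps as its
# error tolerance requires; the times passed in only control what gets returned.
dense = odeint(f, h0, torch.linspace(0.0, 2.0, 21))

print(end_only[-1], dense[-1])  # same endpoint, up to solver tolerance
```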
@weishenmejames 5 years ago
Your vids are always of super high quality, often the topic is completely new to me yet you explain it in simple and easy to understand terms with clear examples. Well done!
@amanasci2481 5 years ago
Only channel on YouTube that motivates me to study maths.
@macmos1 5 years ago
Really glad I studied math and CS in college.
@DrAhdol 5 years ago
So to summarize, the ODEblock essentially takes all those (ODEfunc) layers and represents them as one large layer? Also, what is the need for the initial resblocks at the start of the model? It's definitely an interesting approach to NNs and I'm curious about its applications in time-series (or anything that has a sequential relationship) data.
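For anyone wondering the same thing, here is a rough sketch of how the pieces fit together (illustrative only: the module names, channel widths, and layer counts below are made up, and this is not the authors' exact code). The ODE function has to map a state to a derivative of the same shape, so the ordinary layers at the front handle the resolution and channel changes; the ODE block then behaves like one big layer whose effective "depth" is chosen by the solver:

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint

class ConvODEFunc(nn.Module):
    """dh/dt for image features: what would otherwise be a stack of residual blocks."""
    def __init__(self, ch=16):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)

    def forward(self, t, h):
        # Must return a tensor with the same shape as h, so channels/resolution stay fixed here.
        return self.conv2(torch.relu(self.conv1(torch.relu(h))))

class ODEBlock(nn.Module):
    """Acts like one big layer: integrates the state from t = 0 to t = 1."""
    def __init__(self, func):
        super().__init__()
        self.func = func
        self.t = torch.tensor([0.0, 1.0])

    def forward(self, h):
        return odeint(self.func, h, self.t)[-1]   # keep only the state at t = 1

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, stride=2, padding=1),     # ordinary layers do the downsampling and
    nn.ReLU(),                                    # channel changes the ODE block cannot do
    ODEBlock(ConvODEFunc(16)),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)

x = torch.randn(4, 1, 28, 28)
print(model(x).shape)                             # torch.Size([4, 10])
```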
@ayushsingh562 5 years ago
Thank you so much for putting this together
@ma271 5 years ago
Thank you Siraj!! Your videos are awesome
@engineeringwithmehran 5 years ago
Sir, you are a great teacher, math simplified. 👌👌
@tonynguyen8166 5 years ago
Correct me if I'm wrong, but the main part of ODE theory they used was Euler's method for approximation. I was wondering if you can apply any of the other tools taught for solving ODEs to neural networks.
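On the solver question: Euler is just the simplest single-step method, and in principle any standard ODE solver can take its place (the paper itself relies on an adaptive Runge-Kutta solver). A tiny pure-Python comparison on the toy equation dh/dt = h, which is only here for illustration, shows why a higher-order method can be attractive:

```python
import numpy as np

def f(t, h):
    return h                                   # toy ODE dh/dt = h, exact solution h0 * e**t

def euler_step(h, t, dt):
    return h + dt * f(t, h)

def rk4_step(h, t, dt):                        # classical 4th-order Runge-Kutta
    k1 = f(t, h)
    k2 = f(t + dt / 2, h + dt * k1 / 2)
    k3 = f(t + dt / 2, h + dt * k2 / 2)
    k4 = f(t + dt, h + dt * k3)
    return h + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

def integrate(step, h0, t0=0.0, t1=1.0, n=10):
    h, t, dt = h0, t0, (t1 - t0) / n
    for _ in range(n):
        h, t = step(h, t, dt), t + dt
    return h

exact = np.e                                   # h(1) for h0 = 1
for name, step in [("euler", euler_step), ("rk4", rk4_step)]:
    h1 = integrate(step, 1.0)
    print(f"{name}: {h1:.6f}  error: {abs(h1 - exact):.2e}")
```

Libraries such as torchdiffeq expose the same choice through a `method` argument (for example 'euler', 'rk4', or the adaptive 'dopri5').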
@loaywael 5 years ago
Awesome video. Hoping you cover more new research papers in this simple way. I really enjoyed it even though I'm not a mathematician.
@iszotic 5 years ago
Thanks for explaining all these concepts
@jackingmeoff5594 5 years ago
You are such a clever brain. Great work man thanks.
@kick-tech4691 5 years ago
Hey mota bhay.....I think in this video you really tried to make things simpler , oh ...yeah . Thanks for considering my suggestion . Keep rocking bro , keep educating the people.
@phil.4688 5 years ago
At ~11:00 "That, in essence, is how deep learning research goes. Let's be real, everybody." You just won LeInternet for today ;-)
@vuppumadhuri9546 3 years ago
The code shown at the end of the video doesn't include the ODE definition block. I mean, where is the ODE actually specified, apart from the solver? Without defining the ODE, how is it possible to solve dx/dt or d²x/dt²?
@AndriyDrozdyuk 3 years ago
"ODE block" is not really a block. Shameless plug, here is my explanation of this paper: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-uPd0B0WhH5w.html
@pixel7038 5 years ago
Can we apply neural ODEs to the AlphaFold problem? Using a neural ODE to encode the sequential amino acid data.
@naxyytt 5 years ago
Even though I'm good at math, I would have never imagined myself using differential equations again after high school... and here I am.
@MostafaElhoushi 5 years ago
Thank you for the great effort you put in
@albertlee5312 4 years ago
Thank you! I am trying to understand the implementation in Python, but I am confused about why we still need 2-3 Conv2D layers with activation functions if we consider the hidden layers as a continuous function that can be solved by ODE solvers. Could you please help me with this?
@david0aloha 5 years ago
This is amazing. You are amazing. Thank you.
@franciscovannini5002 5 years ago
Hi Siraj! Thanks a lot for the video, it was very motivating. I was wondering... do you think you could make a video applying these methods to a commonplace time series? Would be awesome raised to infinity!
@mapleandsteel 5 years ago
Nice job, bruv. Keep making the diaspora proud!
@nerdtek 5 years ago
Could you post a video on using the adjoint method to solve ODEs? I would really appreciate a concise presentation. All of the material I have found on it is hard to digest.
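Not a full presentation, but as a pointer: the adjoint method obtains gradients by solving a second, "adjoint" ODE backwards in time instead of backpropagating through every solver step, so memory cost stays roughly constant in the number of steps. The torchdiffeq package exposes this as a drop-in replacement for the plain solver; the snippet below is a minimal illustrative training step (the network, sizes, and data are made up), not the paper's actual training script:

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint   # drop-in for odeint; func must be an nn.Module

class ODEFunc(nn.Module):
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, dim))

    def forward(self, t, h):
        return self.net(h)

func = ODEFunc()
opt = torch.optim.Adam(func.parameters(), lr=1e-3)

h0 = torch.randn(16, 2)                 # made-up batch of initial states
target = torch.randn(16, 2)             # made-up regression targets
t = torch.tensor([0.0, 1.0])

opt.zero_grad()
pred = odeint(func, h0, t)[-1]          # forward pass runs the ODE solver
loss = ((pred - target) ** 2).mean()
loss.backward()                         # gradients come from solving the adjoint ODE backwards
opt.step()
print(loss.item())
```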
@macmos1 5 years ago
This is an incredible research paper.
@robertweekes5783 5 years ago
Breakthroughs like this are why AGI is closer than we think !
@nano7586 5 years ago
30:29 You know shit is about to get serious when Siraj takes on a ninja posture
@kid-vf4lu 5 years ago
I'm excited to see how this will merge with orthogonal polynomials
@tissuebox1229 5 years ago
Hi Siraj, could you make a video about implementing an ODE block? After rewatching the video twice it is still a mystery to me what concretely is happening in them. Thanks!
@SirajRaval 5 years ago
Absolutely, it's not your fault. Even the researchers are still fully defining this.
@deterministicnonperiodic9369 5 years ago
I've been waiting for this a long time...
@aion2177 5 years ago
Freaking fucking awesome!! Stretched my brain quite a lot 😂 Thanks.
@aewfan4360 5 years ago
Siraj bhai, Happy Uttarayan.
@jenniferkwentoh 5 years ago
Thanks for explaining this. Genius
@Stan_144 5 years ago
Next Einstein ?
@vonderasche2963 5 years ago
Really badass presentation.
@liuxue8574 2 years ago
Thanks for your great work!
@prasanth_m7 5 years ago
Wow... at some parts I wondered whether I had accidentally enabled 1.5x mode. Slow down at the essential parts, Siraj. Anyway... I will try this out right now. I always come to your channel for inspiration and I get energised by the end of your videos.
@frede1k 5 years ago
Feeding the next layer the output plus the input reminds me of Mandelbrot's fractal f(z) = z^2 + c. There the input and output are complex numbers, though.
@avatar098 5 years ago
13:37 when Siraj is about to drop some hardcore ML knowledge
@morainaxel8499 3 years ago
HAHAHAHA, shit is getting serious
@vvin4u 5 years ago
Thanks Siraj for your work
@saitaro 5 years ago
I'm so glad you don't stop rapping from time to time, man
@Stelios.Posantzis 5 years ago
That was cool. I had never heard of what a ResNet was or what an ODEnet was until I watched your video. Great educational value! The ODEnet presentation, however, did not cover the adjoint method sufficiently to form some basic understanding of it, unlike the other parts. I'd like to find out more about it.
@SirajRaval 5 years ago
great points
@Funcijej 5 years ago
I was waiting for a video on this
@osama82405 2 years ago
Thank you, excellent explanation.
@ethiesm1 5 years ago
This is huge---Thanks
@thexhitij 6 months ago
That was really great, thanks a lot!
@SuvradipDasPhotographyOfficial
Awesome Siraj. You made my day.
@celsomiranda6293 5 years ago
Can this be explained as the calculation of eigenvalues for its corresponding eigenvectors?
@offchan 5 years ago
I like that you said "I know that sounds complicated but don't go anywhere."
@tamerkaratekin9074 5 years ago
I haven't finished watching yet, but this type of video is what makes Siraj shine in the world of AI teaching: the latest AI paper explained in a very exciting and motivational way. He is very right when he says that you cannot find this type of lecture anywhere else.
@priyabratdash2629 5 years ago
Can the Navier-Stokes equations be solved using these?
@supersearch 5 years ago
So, does this method only allow faster training, or does it also find the optimal number of layers?
@veronmath3264 5 years ago
Math is awesome, I like that bru, and it's my first time ever hearing about reinforcement learning.
@softwareovercoffee 5 years ago
I love this! Thank you!! Why did it take so long to figure this out? None of the concepts presented here are out of reach of a Bachelor's grad. Not a mocking question, but I'm genuinely curious why something like this seems so obvious in hindsight, and yet we continue to spend so much time focused on biomimicry. Regardless, really excited! What were the other 3 papers that got first place?
@yipperjones254 5 years ago
Siraj - is there an open-source implementation of this yet? Can we access it with auto-keras? PyTorch? Something else?
@ILikeWeatherGuy 5 years ago
So this paper essentially makes vertical wormholes for marbles to skip specific air current layers, then digs valleys so the marble has more time to fall into the appropriate grouping.
@tvaditya111 5 years ago
As always great explanations