03L - Parameter sharing: recurrent and convolutional nets 

Alfredo Canziani
20K views

Course website: bit.ly/DLSP21-web
Playlist: bit.ly/DLSP21-RU-vid
Speaker: Yann LeCun
Chapters
00:00:00 - Welcome to class
00:00:49 - Hypernetworks
00:02:24 - Shared weights
00:06:10 - Parameter sharing ⇒ adding the gradients
00:09:33 - Max and sum reductions
00:11:46 - Recurrent nets
00:14:20 - Unrolling in time
00:16:17 - Vanishing and exploding gradients
00:19:48 - Math on the whiteboard
00:23:18 - RNN tricks
00:24:29 - RNN for differential equations
00:27:18 - GRU
00:28:23 - What is a memory
00:41:26 - LSTM - Long Short-Term Memory net
00:43:11 - Multilayer LSTM
00:46:01 - Attention for sequence to sequence mapping
00:48:41 - Convolutional nets
00:50:50 - Detecting motifs in images
00:56:57 - Convolution definition(s)
00:59:43 - Backprop through convolutions
01:03:42 - Stride and skip: subsampling and convolution “à trous”
01:06:56 - Convolutional net architecture
01:19:08 - Multiple convolutions
01:20:37 - Vintage ConvNets
01:32:32 - How does the brain interpret images?
01:37:18 - Hubel & Wiesel's model of the visual cortex
01:42:51 - Invariance and equivariance of ConvNets
01:49:23 - In the next episode…
01:52:54 - Training time, iteration cycle, and historical remarks

Published: 17 Jul 2024
Comments: 48
@5Gazto · 3 years ago
The luxury of receiving lectures from the man himself.
@alfcnz · 3 years ago
🤩🤩🤩
@tankstocks · 1 year ago
Thank you so much for putting this on YouTube, such a brilliant teaching style! Loved it.
@alfcnz · 1 year ago
❤️❤️❤️
@oguzhanercan4701 · 2 years ago
Listening to CNNs from their inventor is amazing. Thanks, Alfredo!
@alfcnz · 2 years ago
You're welcome 😊😊😊
@haodongzheng7045 · 2 years ago
Great video as always, and I love the storytelling session at the end. 😊
@alfcnz · 2 years ago
😊😊😊
@xXxBladeStormxXx · 2 years ago
16:42 Haha, well done Alfredo, giving Yann some VFX.
@alfcnz · 2 years ago
😁😁😁
@mahdiamrollahi8456 · 3 years ago
1:32:00 was a fantastic feat of memory by the Professor... so cool!
@vaibhavsingh8715 · 3 years ago
Great way to start the day!
@alfcnz · 3 years ago
☀️☀️☀️
@kevindoran9031 · 2 years ago
Love these videos.
@alfcnz · 2 years ago
😀😀😀
@cambridgebreaths3581 · 3 years ago
Yay! Thank you kindly, Alf...
@alfcnz · 3 years ago
🤗🤗🤗
@tjoh4605 · 2 years ago
Such a great lecture. Thanks for all the effort put into making this available on YouTube.
@alfcnz · 2 years ago
🤟🏻🤟🏻🤟🏻
@tiajennifer7691 · 2 years ago
Best lecture so far! We have: young Yann at 1:32:20; vertebrates having evolved worse than the octopus at 1:33:55; and Alfredo laughing at 1980s technological problems at 1:54:44. And at the end we get the realization that Yann had to code everything in C like a madman 🙃 Jokes aside, the RNNs were explained so well, and the CNN explanation is of course a masterpiece.
@alfcnz · 2 years ago
😅😅😅
@woddenhorse · 2 years ago
💯💯
@user-co6pu8zv3v · 3 years ago
Thank you, Alfredo. Today I'm watching your videos all day. :)
@alfcnz · 3 years ago
😍😍😍
@StanleySalvatierra · 3 years ago
Nice!!!
@alfcnz · 3 years ago
🥳🥳🥳
@doyourealise · 2 years ago
Loved the animation :) hehe
@alfcnz · 2 years ago
🥰🥰🥰
@mahdiamrollahi8456 · 3 years ago
I was thinking about how it was even possible to do this work more than 30 years ago, and then the Professor mentioned exactly that. I can hardly believe it: nowadays if you just install Python and PyTorch, you've completed 90% of the job 😉. And how was it possible to do experiments on animals over 50 years ago? 😕 Back then there was nothing but Unix and Emacs; as you said, no mouse 🐭 and no PowerPoint... my mind is blown 🤯 Thanks Alfredo, thanks Professor!
@mitrus4 · 1 year ago
Thank you so much for sharing this with the public! This duo is insane: Alfredo + the AI godfather (yin and Yann :D). It's the first time I'm writing down all the notes and completing the homework taking into account all the suggestions you give, even though no one will ever look at it and check whether it's correct.
@alfcnz · 1 year ago
You're welcome! Haha 🤣🤣🤣 I'm writing down the lecture notes too and compiling them into a book.
@davidlearnforus · 2 years ago
Thanks! I'm just a lawyer who decided to learn ML, and the explanations are so good that I can still follow with quick terminology searches. A possibly naive question: what value exactly do neural ODEs add over plain RNNs? Is it just putting all the Ws into a particular suitcase? In both cases the Ws seem to remain a "black box" anyway. Or do the Ws become some kind of solution to the ODE after training? 🙏
@fredxu9826 · 3 years ago
First! Just the right time to open up YouTube.
@alfcnz · 3 years ago
😄😄😄
@medwards1086 · 2 years ago
FYI: the z_t and (1 - z_t) in the equation for h_t are switched relative to the image displayed.
@alfcnz · 2 years ago
At what time stamp?
@medwards1086 · 2 years ago
27:19. Also, really enjoying the lecture series. Thank you for making this available. 👍
@alfcnz · 2 years ago
Yes, indeed. And I'm glad you're enjoying the course ☺️☺️🙂
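For reference, here is the GRU update in one common convention (note the hedge: sources genuinely disagree on whether z_t gates the previous state or the candidate, e.g. PyTorch's GRU swaps the roles of z_t and 1 - z_t, which is exactly the kind of discrepancy flagged in this thread):

```latex
z_t &= \sigma(W_z x_t + U_z h_{t-1}) \\
r_t &= \sigma(W_r x_t + U_r h_{t-1}) \\
\tilde{h}_t &= \tanh\!\left(W_h x_t + U_h (r_t \odot h_{t-1})\right) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
```

Both conventions define the same model family; only the interpretation of "update gate" flips.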
@kirtankanani6669 · 2 years ago
At 1:03:24, why is the backprop to the input just summation(w_k * del_c / del_y_(j-k))? Shouldn't it be summation(w_k * del_c / del_y_(j-k)) + summation(w_k * del_c / del_y_(j+k)), since x at index j influences the range of y's from y[j-k] to y[j+k], where k is the window size and assuming stride 1? Correct me if I'm wrong.
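A quick numerical sketch (my own, not from the lecture) of why a single sum suffices: each x[i] appears in each output y[j] at most once, so the chain rule yields one sum over kernel taps, which amounts to a "full" convolution of the upstream gradient with the kernel. A finite-difference check confirms it:

```python
import numpy as np

# For a 1-D "valid" correlation y[j] = sum_k w[k] * x[j+k], the gradient of a
# cost c w.r.t. the input is dc/dx[i] = sum_k w[k] * dc/dy[i-k]: a "full"
# convolution of the upstream gradient dy with the kernel w (np.convolve
# flips the kernel, which is exactly what the chain rule requires here).

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
w = rng.standard_normal(3)

y = np.correlate(x, w, mode="valid")   # y[j] = sum_k w[k] x[j+k]
dy = rng.standard_normal(y.size)       # some upstream gradient dc/dy

# Analytic gradient w.r.t. the input
dx = np.convolve(dy, w, mode="full")   # length == x.size

# Finite-difference check of dc/dx, with c = y . dy
eps = 1e-6
dx_num = np.zeros_like(x)
for i in range(x.size):
    xp = x.copy(); xp[i] += eps
    xm = x.copy(); xm[i] -= eps
    dx_num[i] = (np.correlate(xp, w, "valid") @ dy
                 - np.correlate(xm, w, "valid") @ dy) / (2 * eps)

assert np.allclose(dx, dx_num, atol=1e-4)
```

So the second sum in the question double-counts: the set of outputs y[j-k] for all valid k already covers every output that x[j] touches.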
@ramyapriya7257 · 3 years ago
At 17:13 Yann says that if we multiply the z vector by the W matrix, the result is simply a rotation of the z vector. Can someone please explain this? Does this mean that W acts as a rotation matrix? If so, how?
@alfcnz · 3 years ago
«Let's imagine W is a rotation matrix […]»
@ramyapriya7257 · 3 years ago
@@alfcnz Thanks for the clarification! Also, the videos are of top-notch quality, both content- and graphics-wise. Loving the laser sabres!
@alfcnz · 3 years ago
Hahaha, bzzz, bzzz! 😄😄😄
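A small illustration of that thought experiment (my own sketch, not the lecture's code): repeatedly applying the same W, as an unrolled RNN does, scales the state by W's singular values at every step. A pure rotation (an orthogonal matrix) preserves the norm exactly; scaling it up or down makes the norm explode or vanish over time steps, which is the vanishing/exploding-gradient picture:

```python
import numpy as np

theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal: a pure rotation

z = np.array([1.0, 0.0])
state = {"rotation": z.copy(), "explode": z.copy(), "vanish": z.copy()}
for _ in range(50):
    state["rotation"] = R @ state["rotation"]          # norm stays 1
    state["explode"]  = (1.1 * R) @ state["explode"]   # norm grows as 1.1**t
    state["vanish"]   = (0.9 * R) @ state["vanish"]    # norm shrinks as 0.9**t

print(np.linalg.norm(state["rotation"]))  # ~1.0
print(np.linalg.norm(state["explode"]))   # ~1.1**50 ≈ 117
print(np.linalg.norm(state["vanish"]))    # ~0.9**50 ≈ 0.005
```

A general W is a rotation composed with per-axis scaling (its SVD), so "imagine W is a rotation" isolates the direction change and lets the scalar gain carry the explode/vanish behaviour.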