
Residual Networks and Skip Connections (DL 15) 

Professor Bryce
4.7K subscribers
39K views

Davidson CSC 381: Deep Learning, Fall 2022

Published: Aug 2, 2024

Comments: 85
@alexei.domorev · 1 year ago
ResNets are tricky to conceptualise as there are many nuances to consider. Dr Bryce, you have done a great job here offering such a brilliant explanation that is both logical and easy to follow. You definitely have a gift of explaining complex ideas. Thank you!
@vernonmascarenhas1801 · 2 months ago
I am writing a thesis on content-based image retrieval and had to understand the ResNet architecture in depth, and this is by far the most transparent explanation ever!!
@thelife5628 · 2 months ago
Another example of a random YouTuber with very few subscribers explaining a complex topic so brilliantly... Thank you so much, sir
@anirudhsarma937 · 1 year ago
Very, very good explanation. Almost all explanations of this forget about the influence of random weights on the forward propagation and focus solely on the backward gradient multiplication, which is why I never understood why you needed to feed the input forward. Thanks a lot
@ashishbhong5901 · 8 months ago
I have seen a lot of online lectures, but you are the best for two reasons: the way you speak is not monotonous, which gives me time to comprehend and process what you are explaining; and second, the effort put into video editing to speed up when you write things on the board, which doesn't break the flow of the lecture. Liked your video. Thanks 🙂!
@AdityaSingh-qk4qe · 5 months ago
This is the clearest video I've ever seen that explains ResNet for a layman while still conveying all the important and relevant information. I couldn't understand the paper, but with this video I finally understood it. Thanks a lot, Professor Bryce; I hope you create more such videos on deep learning
@Engrbrain · 1 year ago
I am going to complete the entire playlist. Thanks, Bryce, you are a lifesaver
@zhen_zhong · 3 months ago
This tutorial is so clear that I can follow along as a non-native English speaker. Thanks a lot!
@nguyentranconghuy6965 · 2 months ago
nice explanation, thank you very much Professor Bryce
@luisaruquipac.381 · 1 month ago
Awesome explanation! Thanks a lot.
@lallama202 · 6 months ago
Love your explanation, very easy to understand the concept and the flow of the ResNet in 17 mins! Really appreciate it
@alissabrave424 · 2 months ago
Brilliant explanation! Thank you so much, Professor Bryce!
@shobhitsrivastava9112 · 11 months ago
This is the best Residual Network tutorial I have found so far. As constructive feedback, I would like you to dive more deeply into how shape mismatches are handled, because that part is not on par with the rest of the highly intuitive explanations of the various things happening in a ResNet.
@raulpena9865 · 1 year ago
Thank you, Professor Bryce, ResNets were brilliantly explained by you. I am looking forward to new videos on more recent deep learning architectures!
@user-ux2gz7sm6z · 1 year ago
your explanation is clear and concise! Thank you so much
@abdulsaboorkhan8337 · 6 months ago
Thank you so much Mr Bryce.
@rabindhakal · 5 months ago
You have my respect, Professor.
@jonathanzkoch · 1 year ago
Great video on this, super informative.
@garydalley2349 · 4 months ago
Awesome explanation. Got me through a learning hurdle that several others could not.
@user-ol1dx3nb3d · 6 months ago
Brilliant explanation. Thank you!
@giordano_vitale · 6 months ago
Every single second of this video conveys an invaluable amount of information to properly understand these topics. Thanks a lot!
@kindness_mushroom · 7 months ago
Thank you for the clear, concise, yet comprehensive explanation!
@jiaqint961 · 2 months ago
Thanks for your video.
@subramanianiyer3300 · 7 months ago
Thank you, Prof. Bryce, for explaining this thing with minimal complicated technicality
@MrMiguelDonate · 3 months ago
Brilliant explanation!!!
@beatbustersindia3641 · 8 months ago
Brilliant explanation.
@sanjeevjangra84 · 3 months ago
So clear and well explained. Thank you!
@vaibhavnakrani2983 · 8 months ago
Awesome. Loved it, clear and concise!
@schmiede1998 · 9 months ago
Thank you so much for this video!
@nikhilthapa9300 · 10 months ago
Your explanations are very clear and well structured. Please never stop teaching.
@bakhoinguyen5156 · 8 months ago
Thank you!!!
@minkijung3 · 11 months ago
Amazing. Thanks a lot. Your explanation is so clear. Please keep making videos professor!🙏
@AymanFakri-ou8ro · 6 months ago
very nice! thank you!
@strictly-ai · 4 months ago
Best explanation of ResNet on the internet
@lalop4258 · 1 year ago
Excellent class! I watched many videos before I came to this video and none explained the concept of residual networks as clearly as you did. Greetings from México!
@nilishamp245 · 1 year ago
you are brilliant!! Thank you for explaining this so well!!!!❤❤❤
@rhysm8167 · 7 months ago
this was fantastic - thank you
@user-uq7kc2eb1i · 7 months ago
Very nice video!
@lhdtomlee · 25 days ago
Thank you Professor! This introduction is really helpful and detailed!
@sam-vv6gl · 4 months ago
thank you for the great explanation
@rohithr2071 · 3 months ago
Best explanation of ResNet I have come across so far.
@rishabhagarwal4702 · 2 months ago
Brilliant explanation, the 3D diagrams were excellent and I could understand some tricky concepts, thank you so much!
@business_central · 1 year ago
Omg this is so helpful! Thank you so much !!!
@genericchannel8589 · 1 year ago
Awesome explanation!! Thank you for your effort :)
@Bachelorarbeit-op4he · 7 months ago
great explanation, thank you!
@1991liuyangyang · 3 months ago
great explanation, simple and straightforward.
@ArtJug · 1 year ago
Wow, this explanation is amazing. So clear! I saw some videos about ResNets, but none of them describes what skip connections mean inside, what their internal structure and working logic are. But your explanation gives me much more. You explained the way of thinking, the internal structure, and the advantages. Wow!
@user-bg2vs5kh6n · 6 months ago
Great explanation, congrats.
@user-rb7vn3lt8t · 1 year ago
Really great explanation. Thanks, Prof. ♥
@user-hd3uv9ym7f · 8 months ago
Thanks so much! Very informative, brief explanation
@puyushgupta1768 · 6 months ago
16 golden minutes.❤
@adityabhatt4173 · 6 months ago
Great Explanation !!!!
@ali57555 · 4 months ago
Thank you very much for putting in the time and effort. This is one of the best explanations I've seen (including US uni. professors)
@sharmashikhashikha3 · 1 year ago
You are a star!
@charlesd4572 · 1 year ago
Superb!
@efeburako.9670 · 21 days ago
Thx dude u are awesome !
@sajedehtalebi902 · 1 year ago
It was clear and useful. Tnx a lot
@happyvioloniste08 · 10 months ago
Thank you 👏👏
@swethanandyala · 2 months ago
Amazing explanation. Thank you, sir
@SatyamAnand-ow4ub · 1 year ago
Awesome explanation
@amitabhachakraborty497 · 1 year ago
Best Explanation
@DarkGoatLord · 1 month ago
you saved my life
@trivendrareddy8236 · 1 month ago
Thank you, sir, great explanation
@lovenyajain6026 · 6 months ago
Wow. Thank you
@user-yv3ib9so5d · 3 months ago
What an explanation
@AsilKhalifa · 1 month ago
Thanks
@paulocezarcunha · 2 months ago
great!
@wouladjecabrelwen1006 · 9 months ago
Who is this teacher? Damn he is good. Thank you
@axe863 · 8 months ago
Loss landscape looking super smooth .....
@zanzmeraankit4820 · 10 months ago
Got meaningful insights from this video
@kkjun7157 · 1 year ago
This is such a clean and helpful video! Thank you very much! The only thing I still don't know: during backpropagation, do we now have two sets of gradients for each block, one for going through the layers and one for going around the layers? Then how do we know which one to use to update the weights and biases?
@csprof · 1 year ago
Good question. For any given weight (or bias), its partial derivative expresses how it affects the loss along *all* paths. That means we have to use both the around- and through-paths to calculate the gradient. Luckily, this is easy to compute because the way to combine those paths is just to add up their contributions!
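A tiny autograd sketch (an illustration of the reply above, not code from the video) makes this concrete: for a residual output y = f(x) + x with a single scalar "layer" f(x) = w*x, the gradient that reaches x is the sum of the through-path and around-path contributions.

```python
import torch

# One scalar residual "block": y = w*x + x.
x = torch.tensor(2.0, requires_grad=True)
w = torch.tensor(3.0, requires_grad=True)

y = w * x + x   # through-path (w*x) plus around-path (x)
y.backward()

# dy/dx = w (through the layer) + 1 (around the layer) = 3 + 1
print(x.grad)   # tensor(4.)
```

The skip connection's constant 1 in that sum is exactly why the gradient can never be entirely squashed by the layers it goes around.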
@sashimiPv · 7 months ago
Prof. Bryce is the GOAT!
@kranthikumar9998 · 11 months ago
@csprof, By consistently including the original information alongside the features obtained from each residual block, are we inadvertently constraining our ResNet model to closely adhere to the input data, possibly leading to a form of over-memorization?
@anirudhsarma937 · 1 year ago
Can you please talk about GANs and, if possible, Stable Diffusion?
@newbie8051 · 1 year ago
Couldn't understand how we can treat the shape mismatch at 13:40. Great lecture nonetheless, thank you sir!! Understood what Residual Networks are 🙏
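For anyone else stuck at that timestamp, here is a sketch of the standard fix from the ResNet paper (a projection shortcut), not necessarily the video's exact treatment: when a block changes the spatial size and channel count, a strided 1x1 convolution reshapes the skipped input so the two paths can still be added.

```python
import torch
import torch.nn as nn

# A residual block that halves the spatial size and doubles the channels.
block = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(128, 128, kernel_size=3, stride=1, padding=1),
)

# Projection shortcut: a strided 1x1 conv makes the skipped input match.
shortcut = nn.Conv2d(64, 128, kernel_size=1, stride=2)

x = torch.randn(1, 64, 32, 32)
y = block(x) + shortcut(x)   # both paths are (1, 128, 16, 16), so adding works
print(y.shape)
```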
@praveshbudhathoki736 · 24 days ago
Thanks for the nice explanation. But I have one query: at 16:00, where you said "each output neuron gets input from every neuron across the depth of the previous layer," doesn't that make each output depth neuron the same??
@user-bw3bv1nz9l · 1 year ago
👍
@mohammadyahya78 · 1 year ago
Thank you very much. I am not sure yet how a residual block leads to faster gradient passing when the gradient has to go through both paths; as I understand it, this adds more overhead to computing the gradient. Please correct me if I am wrong. Also, can you please explain more about how 1x1 convolution reduces the depth, or make a video if possible? For example, I am not sure how an entire depth of, say, 255 gives output to one neuron.
@csprof · 1 year ago
You're right that the residual connections mean more-complicated gradient calculations, which are therefore slower to compute for one pass. The sense in which it's faster is that it takes fewer training iterations for the network to learn something useful, because each update is more informative. Another way to think about it is that the function you're trying to learn with a residual architecture is simpler, so your random starting point is a lot more likely to be in a place where gradient descent can make rapid downhill progress.

For the second part of your question, whenever we have 2D convolutions applied to a 3D tensor (whether the third dimension is color channels in the initial image, or different outputs from a preceding convolutional layer) we generally have a connection from *every* input along that third dimension to each of the neurons. If you do 1x1 convolution, each neuron gets input from a 1x1 patch in the first two dimensions, so the *only* thing it's doing is computing some function over all the third-dimension inputs. And then by choosing how many output channels you want, you can change the size on that dimension. For example, say that you have a 20x20x3 image. If you use 1x1 convolution with 8 output channels, then each neuron will get input from a 1x1x3 sub-image, but you'll have 8 different functions computed on that same patch, resulting in a 20x20x8 output.
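A minimal PyTorch sketch of the professor's 20x20x3 example (my own illustration, not from the video; note PyTorch puts channels first) shows the shape change a 1x1 convolution produces:

```python
import torch
import torch.nn as nn

# 1x1 convolution mapping 3 input channels to 8 output channels.
# Each output neuron sees only a 1x1 spatial patch, i.e. the 3 values
# stacked along the channel dimension at that position.
conv1x1 = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=1)

x = torch.randn(1, 3, 20, 20)   # one 20x20 image with 3 channels
y = conv1x1(x)

print(y.shape)                  # torch.Size([1, 8, 20, 20]): same grid, depth 8
```

The spatial grid is untouched; only the depth changes, which is why 1x1 convolutions are the standard tool for matching channel counts across a skip connection.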
@wege8409 · 4 months ago
10:10 Concerns: shape mis-match nervous sweating
@spotlessmind9263 · 1 month ago
Isn't this similar to RNNs, where subsets of data are used for each epoch? And in a residual network, a block of layers is injected with a fresh signal, much like boosting.
@rayananwar8106 · 1 month ago
Do you mean that ResNet is just a skip connection, not an individual network?
@EeniyahShelmon · 21 days ago
Free books
@davar5029 · 7 months ago
Brilliant explanation. Thank you!