What is backpropagation really doing? | Chapter 3, Deep learning 

3Blue1Brown
6M subscribers
4.4M views

What's actually happening to a neural network as it learns?
Help fund future projects: / 3blue1brown
An equally valuable form of support is to simply share some of the videos.
Special thanks to these supporters: 3b1b.co/nn3-thanks
Written/interactive form of this series: www.3blue1brown.com/topics/ne...
And by CrowdFlower: 3b1b.co/crowdflower
Home page: www.3blue1brown.com/
The following video is sort of an appendix to this one. The main goal with the follow-on video is to show the connection between the visual walkthrough here and the representation of these "nudges" in terms of partial derivatives that you will find when reading about backpropagation in other resources, like Michael Nielsen's book or Chris Olah's blog.
Video timeline:
0:00 - Introduction
0:23 - Recap
3:07 - Intuitive walkthrough example
9:33 - Stochastic gradient descent
12:28 - Final words
Thanks to these viewers for their contributions to translations
Italian: @teobucci
Vietnamese: CyanGuy111

Published: 17 May 2024

Comments: 1.4K
@rbm0307
@rbm0307 6 лет назад
You might think that your videos are fodder for university students boning up on a subject, or mathematicians/engineers in the early stages of their careers - basically that you cater to a younger audience. I'll have to prove you wrong. I'm in my early 60s and have been involved with information technology in some form or fashion my entire career. I enjoy learning; always have. I've viewed many of your videos only because they interest me and have subscribed to you on my YouTube account so as to get notifications of updates. I find the topics about which you speak fascinating and am a bit jealous of those university grads today who now have access to this material at their fingertips. I wish with all my heart that I was able to access these videos back when I was in university. It would have made life SOOO much easier for me back then. Your pedagogic skills are astounding, demonstrated by your ability to communicate difficult subjects precisely, concisely and simply. The animation format is integral with the presentation, adding to the delivery of the material. I salute you!! Please keep these videos coming.
@zottasi
@zottasi 6 лет назад
I could not have said it better.
@johncoleman1930
@johncoleman1930 6 лет назад
That is very inspiring to hear you say. I am a student right now, not in university though, but I love learning and hope to continue to foster a love of learning.
@jinalin9351
@jinalin9351 6 лет назад
Well said.
@richardsleep2045
@richardsleep2045 5 лет назад
This 65 yr old ex s/w developer completely concurs!
@landwarrior348
@landwarrior348 5 лет назад
And the great thing about democratizing this learning is that kids in Africa or Europe or Asia have access to the same content that kids in America have. This makes me wonder: if learning becomes easier and more easily accessible, what will the kids of the future be learning?
@Niki_0001
@Niki_0001 6 лет назад
I can't claim to have understood everything from the first watch-through of this series, and I will watch these videos again with pen and paper in hand, but even this first viewing has made neural networks go from pure witchcraft and wizardry to something that actually makes sense in my head. I can't possibly thank you enough for posting these videos.
@wajdanali1354
@wajdanali1354 6 лет назад
and I thought everybody else here except me is a genius
@joshdominguez2765
@joshdominguez2765 5 лет назад
"...has made neural networks go from pure witchcraft and wizardry to something that actually makes sense in my head" I died coz I've never related to something so much xD D:
@Dr_Neo_Cortex.uka_uka
@Dr_Neo_Cortex.uka_uka 5 лет назад
I thought that I was the only one who did it.
@GabrielCarvv
@GabrielCarvv 4 года назад
I did not understand a single word. Please help me.
@fillthedao
@fillthedao 3 года назад
@@GabrielCarvv If this video did not explain this to you, then nothing will, I'm afraid... Don't get me wrong - it's not about you - it's a tough subject, but this is by far the most approachable presentation I have seen out there. I can't even imagine how one would present it to a fellow human being in a more approachable way... ;-) PS. Have you watched the previous 3 vids from his series?
@ThePRASANTHof1994
@ThePRASANTHof1994 4 года назад
Anyone else smiling through all of his videos because you're understanding so much so well like never before?
@makingmediocremachines2133
@makingmediocremachines2133 3 года назад
yup
@jehanbhathena6270
@jehanbhathena6270 2 года назад
Yes!!!!!
@angelbythewings
@angelbythewings 2 года назад
yeah man!
@vladimirbosinceanu5778
@vladimirbosinceanu5778 2 года назад
Every single time.
@pranavprameshgandhi9392
@pranavprameshgandhi9392 2 года назад
Hell yeah
@kevinconnolly6450
@kevinconnolly6450 3 года назад
This series is totally brilliant. I am 73 years old and used to teach mathematics. I am still learning stuff, and with the help of sites like yours it becomes so much easier. Have you thought of doing any videos on the really complex subject of real analysis? Keep up the good work. Kevin Connolly
@damian_smith
@damian_smith 8 месяцев назад
Cool! I'm mid forties and about to do a CompSci MSc. Gotta keep my own neurons going!
@yeimarsoto3196
@yeimarsoto3196 Месяц назад
@@damian_smith Same! I'm 31, getting my PhD in CS! Hopefully I don't need to be in a retirement home at 35.
@TheSimonvdp
@TheSimonvdp 6 лет назад
I disable adblock for this
@3blue1brown
@3blue1brown 6 лет назад
+Simon van der Poel Funny, because I disabled ads for this :)
@BrianFaure1
@BrianFaure1 6 лет назад
What's the first rule of adblock?
@BrianFaure1
@BrianFaure1 6 лет назад
TIGuardian - Apologies, I was talking to Simon, the first rule is you never talk about adblock.
@ntwede
@ntwede 5 лет назад
I tried, but my adblock uses neural networks. It decided it would rather re-enable itself.
@krakenmetzger
@krakenmetzger 4 года назад
Bro there's ads in the video
@srirams0071
@srirams0071 5 лет назад
Great explanation, your team is awesome. "A drunk man stumbling aimlessly downhill, but taking quick steps" is the best analogy ever for stochastic gradient descent. :-)
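A minimal sketch of the idea behind that analogy (my own illustration in plain Python/numpy, not code from the video; grad and the data here are placeholders): ordinary gradient descent would compute the gradient over the whole training set before each careful step, while stochastic / mini-batch gradient descent takes quicker, noisier steps using one small random batch at a time.

import numpy as np

def grad(params, batch_x, batch_y):
    # Placeholder: in a real network this would be the average gradient of the
    # cost over the mini-batch, computed by backpropagation.
    return np.zeros_like(params)

def sgd(params, X, Y, lr=0.1, batch_size=32, epochs=10):
    n = len(X)
    for _ in range(epochs):
        order = np.random.permutation(n)          # shuffle the training data
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            g = grad(params, X[idx], Y[idx])      # noisy estimate of the true gradient
            params = params - lr * g              # quick "drunken" step downhill
    return params

Each step only approximates the true downhill direction, which is exactly the stumbling-but-fast behaviour the analogy describes.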
@youtubeviolatedme7123
@youtubeviolatedme7123 Год назад
what the deuce?
@PeterGriffinLovesLois
@PeterGriffinLovesLois Год назад
stewie
@ivoryas1696
@ivoryas1696 Месяц назад
​@@youtubeviolatedme7123 Nah, it's actually not that uncommon. My material science professor even used it when discussing some Quantum processes! 😁
@xanthoptica
@xanthoptica Месяц назад
"A drunken sailor walking a plank" is often used to describe fixation of an allele by genetic drift in a small population. So next time you've had one too many, explain that you're feeling a little too stochastic to drive.
@TopGunMan
@TopGunMan 6 лет назад
These visualizations are spot-on. Only a few people in the entire world need to make great explanations backed by powerful visualizations about a topic - the rest of the world just needs to discover these. So much time is wasted by learners trying to locate easily-digestible information among all the inferior presentation methods out there. Glad to have found one of the best for this topic.
@user-qj6hl5xb8q
@user-qj6hl5xb8q 2 года назад
It is challenging to go through books and lectures from authors and professors trying to sound smart.
@adityashukzy
@adityashukzy 4 года назад
Note to myself: Aditya, if you're having trouble understanding, read this. Scroll to 06:17. Listen to what he's saying: "In a sense, the neurons that are firing while seeing a 2 get more strongly linked to those firing when thinking about a two." Now, pause the video and listen. All this is just a fancy way of saying: when we show our model a picture of a handwritten 2 and we tell it, "hey, listen up, this thing is a 2" (like in our training set we have labels for our input pictures), we then find the activation units that have a say in influencing the hypothesis value of the 2 label, i.e., the activation units which can heavily increase or decrease the value of the output unit for the label 2, and we tell those activation units, "hey guys, when you see something that resembles this thing, fire up the 2 label, i.e., increase the hypothesis value of the label for 2." Basically, we assign those activation units greater weights (or parameters) which influence the hypothesis value of the 2 label more (so that they can actually give us the right answer and say "oh look, this is probably a two"). I hope this helped and didn't complicate it further. If you don't get it, go over the video a few more times and review Andrew Ng's notes on this in the ML course, in the Backpropagation lecture in week 5. Cheers, bro. Love you.
@hzmuhabbet
@hzmuhabbet 4 года назад
Thanks dude, it really helped. I am trying to follow Ng, but after a point it sounds to me like an Indian language. Sometimes it might be a bit difficult to visualize it in your mind, and Ng is really a bit of a 'bad choice' for that part. I am not saying that he is a bad teacher, but sometimes I feel like I am surfing through the universe...
@adityashukzy
@adityashukzy 4 года назад
@@hzmuhabbet I understand. Ng gets caught up in notation quite a lot so it's difficult to follow his intuition. I actually just noted this down for myself for future reference but I'm glad that you and other people are finding it helpful in understanding! I wish you well on your ML journey!
@HARIHaran-ks7wp
@HARIHaran-ks7wp 3 года назад
I finished the Andrew Ng course except for the programming assignment part. I felt his explanation wasn't quite reaching my head, and now watching this series has blown me away at how easy it is to understand things when they are visualized. Thanks for sharing your note snippet, pretty cool!
@nicolasstencel7775
@nicolasstencel7775 3 года назад
nice
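A compact way to state the idea in the note above (this is the standard backpropagation expression that the next chapter derives; written here as a summary in LaTeX notation, not a quote from the video): for one training example, the influence of a single last-layer weight on the cost factors as

\frac{\partial C}{\partial w_{jk}^{(L)}} = a_k^{(L-1)} \, \sigma'\!\left(z_j^{(L)}\right) \, 2\left(a_j^{(L)} - y_j\right)

so the nudge to a weight is biggest when the neuron feeding into it is strongly active (large a_k^{(L-1)}) and the output neuron is far from its target - the "neurons that fire together get more strongly linked" effect described in the note.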
@brianevans4
@brianevans4 6 лет назад
Your animations are amazing!
@ShanmeiLiuCS
@ShanmeiLiuCS 6 лет назад
How are the animations made? It is great!
@florianro.9185
@florianro.9185 6 лет назад
He has built an animation "engine" in python, which you can find on his GitHub
@ShanmeiLiuCS
@ShanmeiLiuCS 6 лет назад
Thanks!
@thescientist7753
@thescientist7753 5 лет назад
Brian Evans a
@chinmaydutta3783
@chinmaydutta3783 5 лет назад
github.com/3b1b/manim
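For anyone curious what using it looks like, here is a tiny illustrative sketch based on the community-maintained fork of manim (treat the exact class and animation names as an assumption - they differ a bit in Grant's original repo):

from manim import Scene, Circle, Create, BLUE

class HelloManim(Scene):
    def construct(self):
        circle = Circle(color=BLUE)   # build a mobject
        self.play(Create(circle))     # animate drawing it on screen

It would be rendered from the command line with something like "manim -pql hello.py HelloManim".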
@chaohongyang
@chaohongyang 2 года назад
3blue1brown > Indian guy on youtube >>> my CS teacher
@AishwaryaAR0013
@AishwaryaAR0013 3 года назад
I've watched these videos 3 times and every time I watch them, I feel a bit smarter. It starts with understanding little, to progressively understanding more and more, and finally seeing the big picture. Can I say I've understood everything? No. I am getting there and I'll be coming back for these again. For anyone who's feeling discouraged, I can assure you that you'll get there. Thank you so much for creating quality content that brings the driest, most theoretical concepts to life! You're a hero.
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@TheVilivan
@TheVilivan 4 месяца назад
Thank you for saying this, I was pretty confused by some things in this series and I'm glad to hear that it gets better the more times you watch it. Learning some fundamental stuff like linear algebra and calculus would probably help as well, I really should get onto that
@thismakesnosense
@thismakesnosense 3 месяца назад
​@@vgdevi5167 Hey, this might be a little late but there are some links to learning about Neural networks/Backpropagation in the description if you haven't looked at them already.
@kemsekov6331
@kemsekov6331 2 года назад
I must say it. I 100% seriously learnt English just to be able to watch your videos. This is the content that will help everyone wandering by grow stronger in their favourite subjects. I thank you for your work from the bottom of my heart ❤️❤️
@dinub8414
@dinub8414 2 года назад
I find it astonishing how well you convey some of the intricacies here, way better than most of ML practitioners who teach the public, whether as youtubers, online instructors or public speakers who end up on video on the internet. I very much resonate with the way you frame things, the metaphors you choose, your visualizations and of course your evident love for understanding and sharing thereof. Your work is a great gift to all of us - students, engineers, researchers, philosophers, random viewers from all walks of life. Thank you.
@saitaro
@saitaro 6 лет назад
Grant, you're a diamond.
@General12th
@General12th 6 лет назад
no hes a human
@-long-
@-long- 5 лет назад
you must be fun at the party
@timangar9771
@timangar9771 2 года назад
@@-long- I actually found it funny. Also, THE PARTY sounds dramatic, am I invited?
@duykhanh7746
@duykhanh7746 2 года назад
@@timangar9771 you gotta admit that we have weird sense of humor for watching this video for fun
@user-vt4bz2vl6j
@user-vt4bz2vl6j 2 месяца назад
no he's a pi creature
@MySkittlesRainbow
@MySkittlesRainbow 6 месяцев назад
First time I caught myself having a moment of awe while watching educational content. The production value is incredibly high... the way the connections twinkle and move to represent adjusting the weights, the small animations, the descent into various shapes, how the little arrows indicating the desired change move and change size. Beautifully put together! Thanks a lot!
@AyurvedaGyan
@AyurvedaGyan Месяц назад
Your videos have made it so much easier to grasp fundamental concepts of ML. Great work!!
@UtsavMunendra
@UtsavMunendra 6 лет назад
Kurzgesagt and 3Blue1Brown in a single hour! I am in heaven.
@grainfrizz
@grainfrizz 6 лет назад
Utsav Munendra ikr?!?!?!
@zbzb-ic1sr
@zbzb-ic1sr 6 лет назад
Utsav Munendra I was about to comment this.
@shreyashervatte5495
@shreyashervatte5495 6 лет назад
So true!
@Martinmarshallmargella
@Martinmarshallmargella 6 лет назад
this is fav youtuber so farrrrr
@AlexiLaiho227
@AlexiLaiho227 6 лет назад
haha me too
@makebreakrepeat
@makebreakrepeat 6 лет назад
I just want to say that I love that you're moving into the mathematics of ML. The visualizations convey the concepts so well!
@shiladitya7739
@shiladitya7739 2 года назад
No form of words can express enough what magic you're creating! I don't know how much you actually think you impact us... but let me tell you, Grant, your effect on my life is immeasurable! And the fact that you learnt it the hard way, and made it so simple for us, so that we don't have to go through the same, makes me respect you even more and more every single day. Thank you so much. 3B1B is undoubtedly the best channel on YouTube.
@donniegoodman8679
@donniegoodman8679 3 года назад
All of the videos shown on this channel have been written so well. Even I can understand them, and mathematics was a complete mystery for me in school. I have a sixth grade education and I feel really smart after watching one of these programs. Thanks so much. I'm really grateful that you have taken the time to educate the people that have a hard time understanding but still want to learn. What you're doing is just as important as any other volunteer or charity work. I'm excited I found this.
@CyberCookieMonster
@CyberCookieMonster Год назад
I am a “seasoned” systems engineer, inventor, and IT professional. IOW, I’m old (40+), lol. Like @Robert MacKinnon has said, you may think that your wonderful videos are only for those new in the field or younger folks in general. This couldn’t be further from the true. Your ability to explain things with clear diction, amazing graphics, and compelling story telling is phenomenal. They help me to not only refresh to get spun up quickly, but also serve as a point of reference for me to direct those newer in my teams; as your ability to run through this at just the right level and pacing far exceeds mine. I would go so far as to suggest that you freelance as a guest speaker or instructor for hire for larger firms having folks wanting to get spun up quickly. Apologies if this is old news and you’re already pursuing this. I simply feel compelled to tell you as we often don’t understand how awesome we are and need to be reminded by others. Consider this your reminder ;) Thanks again for your wonderful videos where you ask for nothing. I would pay for a Patron page if you get one to be able to ask for more content or explanation on topics such as the vanishing gradient problem prone to recurrent neural networks and how LSTM addresses it. I’ve been looking for a good one that does this graphically with intuitions. I haven’t found a good one yet, but I did come here first to check ;) All the best, CCM.
@cameronadams4366
@cameronadams4366 6 лет назад
You are a wizard when it comes to animations and understanding
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@brendawilliams8062
@brendawilliams8062 Год назад
You would like Quaternions.
@bikrammajhi3020
@bikrammajhi3020 Год назад
Thank you for what you are doing. I have just started learning ML and I find your video really helpful. Thank you again from the bottom of my heart for everything you are doing. I just wanted to let you know that you are really making changes and inspiring youngsters like me.
@SamuelJFord
@SamuelJFord 6 лет назад
I have just found your videos. I have enjoyed watching science educators on youtube for many years, but these videos are the best examples of complex ideas being explained clearly whilst still being entertaining. I can't believe I had never heard of this channel before.
@gmish27
@gmish27 6 лет назад
I've been reading the book for quite some time but your explanations using animations have pushed my understanding to new levels. So many thanks to you. PS: Keep the background music. Holds our concentration for a long time.
@mark.fedorov
@mark.fedorov 6 лет назад
Man, the clarity and the animations make your videos masterpieces
@vincevasvari9818
@vincevasvari9818 5 месяцев назад
I love that Fermat is labeled as "Tease"
@jacqueskirstein9833
@jacqueskirstein9833 6 лет назад
Thank you so much for all the time and effort that you put into making a visual understanding of Neural Networks. I've been trying (admittedly on and off) to get a good intuition on Neural Networks for at least a year now. This is by far the best fundamental video I have come across to get an understanding of what the theory behind the scenes is for. I know that this will increase my (and many others') learning curve by a large amount. Thanks again!
@lilysu2529
@lilysu2529 3 года назад
I love how you enunciate the words as if you are truly interested and into the topic. The art direction and the animations are also on point. Kudos to the animator!
@trevinsmall9069
@trevinsmall9069 Год назад
All I can say is thank you! Every 3b1b video that I have watched is simply incredible. The chronology of ideas introduced throughout a video makes complex topics easy to follow, and the animations are BEAUTIFUL. I have never been able to visualize math so well before. You are an amazing teacher and have contributed invaluable knowledge to society! I'm still in shock that educational resources this profound are available for free. I truly appreciate the work you are doing!
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@paskymail
@paskymail 5 лет назад
I had to stop watching your video for a moment because I needed to write down my admiration for your work. The amount of intelligence, communication skill, careful graphic design and passion in every single frame of your videos is outstanding. Congratulations, and thank you for your contribution to maths education. See you on Patreon!
@soheilsepahyar1437
@soheilsepahyar1437 3 года назад
Thank you so much for the simplicity of your explanations. You explain everything not by definition, but in an understandable, simple way. Thank you!
@Michael-vs1mw
@Michael-vs1mw 6 лет назад
* eagerly waiting for a video about convolutional neural networks *
@3blue1brown
@3blue1brown 6 лет назад
+Michael Incog It'll be a little while. The next few videos won't be for this series, so I'll probably return back to this in a few months.
@UtsavMunendra
@UtsavMunendra 6 лет назад
Computerphile did some videos on convolutional neural network
@wanderingrogue3039
@wanderingrogue3039 5 лет назад
3Blue1Brown Please please make one soon
@alberttamazyan
@alberttamazyan 4 года назад
@@3blue1brown Haven't the few months passed? The whole community is waiting for a video on convolutional neural networks.
@ciherrera
@ciherrera 4 года назад
@@alberttamazyan Ikr :(
@CedricChee
@CedricChee 6 лет назад
So far, I think this is the best intuitive intro to backprop I've seen. This channel means a lot to me. Because of its teaching style, I managed to get back to learning and grok maths while I was studying machine learning last year, 13 years since doing my undergrad in CS. It's almost the end of 2017 and I still keep hearing from the people I talked to that they fear/hate maths because they think it's a tough subject to tame. Maybe this example tells us why our education system (internationally) is still broken? As an aside, David Perkins, in his book "Making Learning Whole", also touches on this widespread disease of the educational system, namely "elementitis". I think we can do better. Grant is doing great work to lower the barrier and make math more accessible to everyone. This is not an easy feat. I think we need some sort of concerted effort to encourage more people to teach maths or any subject through intuition. Visualization is one way to improve the teaching methodology. We can also distill intuition from stories, feelings, situations, etc. More examples, see: distill.pub/ colah.github.io/
@ollerich32
@ollerich32 5 лет назад
Cedric Chee spot on! And thanks a lot for those links.
@davidswygart7472
@davidswygart7472 5 лет назад
I am a neuroscientist, and all your comparisons of actual neural networks to artificial neural networks seem pretty spot on to me. This is a great video series.
@PunmasterSTP
@PunmasterSTP Год назад
All of the other commenters beat me to it, but I can't express enough how insanely high the quality of these videos is, and how much they help teach people new things. Thank you so much for making all of them and sharing some of your vast amount of knowledge!
@Kaixo
@Kaixo 6 лет назад
You are amazing!! Best explanation I've seen on the internet yet!!
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@lianmccc
@lianmccc 3 года назад
This series is better than the 6 hours of lectures, in total, that I got from school.
@jimmieboboyo
@jimmieboboyo 4 года назад
Normally, I never comment on youtube videos, but I'm making an exception for this one. Even though you might not see this, I want to thank you for your contribution to everyone's education. And for FREE access to it too. Amazing videos with amazing presentation, structure, and explanations. I'm 3 years late, but please do keep up the great work!
@davidjurelius5967
@davidjurelius5967 6 лет назад
This series is the best I've seen on neural networks. Thank you for making it. ❤❤❤
@kushh7550
@kushh7550 Год назад
I never understood neural nets like this before... Thanks!
@chriswangux
@chriswangux 5 лет назад
Came for the knowledge, stayed for the animations. Here I was thinking about how it could be done in After Effects, and it turned out it's a custom engine!
@erikaambrioso8966
@erikaambrioso8966 3 года назад
I'm taking a neural networks course at my university right now and I'm finally starting to understand this stuff after watching this video series! Thank you so much!
@Kishimita7204
@Kishimita7204 Год назад
Ngl man, one of my goals, once I get a good enough job, will be to contribute to your Patreon. Your videos have been so entertaining, inspiring, and helpful that I hope the money I give allows you to continue with this channel.
@manjunathsastry7540
@manjunathsastry7540 6 лет назад
Oh dear! Absolutely brilliant. This is one heck of a video. Would you consider creating an exhaustive ML and neural network video series? I'm sure there are a lot of curious folk out there waiting for it!
@marcotroster8247
@marcotroster8247 2 года назад
I love the statement "the most confusing part is the notation". This is so true 😂 It would be really, really nice if you did one last video on how to program this network in a comprehensive way 😄 IMO the inventors of DNNs chose all those activation functions and matrix multiplications really carefully to make computation feasible. But all this beautiful simplicity is gone once these unnecessarily complicated math formulas come into play. Honestly, you did an amazing job with your videos on DNNs and backpropagation. This is the first time I've seemed to understand it.
@castrojosua
@castrojosua 18 дней назад
I changed my major from cybersecurity to philosophy and mathematics because of these videos. This is much better than any college course I’ve taken.
@juliangonzalez28jg
@juliangonzalez28jg 4 года назад
Implementing a neural network from scratch for a class project and this series helped me so much to get started. Thanks!!!
@JemimaGoodall
@JemimaGoodall 4 года назад
This should be art. I don't normally comment on videos but I feel I must express my gratitude and appreciation for your amazing explanations and animations! Visualisation is so important and helpful and you have nailed it beautifully!
@shreyanshdarshan3199
@shreyanshdarshan3199 6 лет назад
You have made life so much easier.
@cvm7549
@cvm7549 5 лет назад
This is the best video on machine learning I have seen so far. Thank you for making this complexity as simple as possible.
@krikkiteer
@krikkiteer 6 лет назад
your videos are simply amazing.. awesome visualisation of what actually happens...i imagine a huge effort went into making those.... kudos - i almost never comment in that manner... greatest thanks and respect for that!
@ObitoSigma
@ObitoSigma 6 лет назад
Now to watch all three parts at once!! Great video 3b1b, keep up the great work man! δ.δ
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@sohamharnale2878
@sohamharnale2878 6 лет назад
This man is the eternal fountain of knowledge straight from the heavens
@prithvidhelia3516
@prithvidhelia3516 26 дней назад
This is definitely one of the best educational video series I have seen. Explanation is concise, abstract when needed and visualizations are phenomenal
@angelanikolova7354
@angelanikolova7354 5 лет назад
I am writing my internal assessment for school about neural networks, and thank god your channel exists, because everything about this subject is now clear to me. Your explanations are perfect!
@SkyTitans
@SkyTitans 6 лет назад
Thank you soooo much! This channel is pure *GOLD*!
@grainfrizz
@grainfrizz 6 лет назад
9:00 Eureka moment. I cried.
@ChopinFlutist
@ChopinFlutist 6 лет назад
These videos actually touched me, for real. A great example of the beauty of the math.
@adiadiadi333
@adiadiadi333 6 лет назад
Felt obvious to me 😕
@Kiros37100
@Kiros37100 6 лет назад
aditya sai You are so intelligent! Wow!
@sohamharnale2878
@sohamharnale2878 6 лет назад
Hahaha I feel you mate
@milanstevic8424
@milanstevic8424 6 лет назад
@Kiros37100 You're so bitter and sarcastic! Wow!
@abhinavgarg5611
@abhinavgarg5611 2 года назад
I have watched the first 3 videos of this series and I have gained a lot of insight into what basically happens within the black box. Thanks a lot, 3Blue1Brown, for these wonderful video illustrations. It took me more than 3.5 hours along with note-making. I have still not completely understood NNs, but that's ok. I am looking forward to video number 4 of this wonderful series. Thanks a ton once again :))))))
@ammararif7524
@ammararif7524 2 года назад
Half the time I'm in awe of the quality of the content. The other half I'm learning. Thank you so much man.
@markdashark3443
@markdashark3443 3 года назад
I wish I had the option to pay my college tuition to you instead of my university. Well done mate!
@TheLouisandKyleShow
@TheLouisandKyleShow 3 года назад
This video is literal ART. Grant is my favorite artist.
@yuewang2186
@yuewang2186 6 лет назад
Am constantly in awe of how intuitive this is.
@akashbansal4842
@akashbansal4842 8 месяцев назад
Thank you so much for such a visually descriptive and intuitive video regarding backpropagation in deep learning.
@thevoodooninja
@thevoodooninja 6 лет назад
I've never clicked on a video this fast
@assaissa
@assaissa 6 лет назад
What did you click on the video for?
@veratsien2014
@veratsien2014 4 года назад
This is a lifesaver for completing week 5 of Andrew Ng's Machine Learning course!! Been stuck at backpropagation for hours and finally found some clarity in this video. I love your channel! ❤️
@Endothermia
@Endothermia 4 года назад
Haha, I'm going through exactly the same thing!
@veratsien2014
@veratsien2014 4 года назад
@@Endothermia lol, high five and hang in there mate. :)
@Hgkbukk
@Hgkbukk 5 лет назад
You are great at teaching! So many get lost in the mathematics of it, not caring whether their audience understands it or not. You make it incredibly simple! Great job!!
@Spektrob
@Spektrob 6 лет назад
This is by far the best educational content I've ever seen on YouTube. Thank you, and keep it going.
@bricechivu8573
@bricechivu8573 5 лет назад
I might misunderstand but maybe not. At 7:38, we want to reduce the activation of the neuron responsible for 3 right? So we should decrease the activation of the neurons that have a positive weight to 3, no? In the video, it's actually the opposite. For example, the first neuron has a positive weight with the neuron responsible for 3, so we should decrease its activation right? Can someone help me on that please?
@nickgardner5641
@nickgardner5641 5 лет назад
I had the same question. It seems like the color-coding of the video at this point is inconsistent? (it applies the same procedure for adjusting the preceding layer of neurons to all the other output neurons as it does to the output neuron for 2, when really the signs in the procedure should be reversed for all the neurons besides 2, since we want their activations to decrease?)
@TheBukkitArea
@TheBukkitArea 6 лет назад
AAAAAAAAA YEEEESSSS I'VE BEEN WAITING FOR THIS VID
@minerscale
@minerscale 6 лет назад
Really well done. I'll probably have to go through the entire series once again, but that's to be expected. There is so much content in so little time. I'll probably actually get into designing a neural network from the ground up.
@TheOnlyKeksSuchti
@TheOnlyKeksSuchti 4 года назад
You're saving my presentation! I didn't know how to explain neural networks to my fellow students but now I know how to do it! Thanks mate!
@toolegittoquit_001
@toolegittoquit_001 6 лет назад
You guys just gotta stop blowing my mind ....💥
@m0rjc
@m0rjc 4 года назад
"This is where we were in the 80s/90s" - I did my degree in the 90s, so this is the level I learned to. The more up to date stuff would be interesting.
@LakTheShadow
@LakTheShadow 4 года назад
It is stunning how much better and clearer you can explain all this than my professor at university, and you are even a lot faster! Thank you so much for producing all those brilliant videos. Some of them help me understand what my professors are trying to tell me, and some of them are just fascinating. Keep on going!
@vgdevi5167
@vgdevi5167 Год назад
Hello, I'm impressed by the way he explained this topic too, but I'm looking for more such great-quality resources, YouTube channels, and books on deep learning, and also on math and comp science in general. What do you recommend?
@shukkkursabzaliev1730
@shukkkursabzaliev1730 Год назад
Wow, I am surprised anyone knows these sophisticated concepts well enough to explain them clearly, and yet you also accompany them with amazing visuals, which I believe is not easy to do. Thank you!
@samanrahbar8088
@samanrahbar8088 6 лет назад
Hey! Just want to shout you are my HERO! I'm an AI and ML researcher and I LITERALLY cry watching those awesome explanations! YOU ARE MY HERO!
@TheFadime123
@TheFadime123 2 года назад
I think it'd be extremely beneficial if you made a complete lecture series for deep learning. Your way of teaching is a billion times better than the somewhat popular Andrew Ng lecture series, and also a lot more intuitive. Please consider this :) Best wishes
@atul6147
@atul6147 Год назад
Sanderson>>> Ng
@Archonch
@Archonch 4 года назад
I cannot overstate how much I wish I had these videos during my machine learning course a few years back
@joshisushant
@joshisushant 5 лет назад
Your videos are insanely amazing, and perhaps THE best source to learn any mathematical concept. We are extremely grateful that you take so much time and effort to create such fantastic content.
@toostoned42069
@toostoned42069 6 лет назад
I would expect the training data to also include a bunch of random images that the network could classify as "not a number" (not hotdog 😉) That way, instead of just taking its wildest guess at which number it's detecting, it could also intelligently say "this doesn't look like a number to me"
@kalebbruwer
@kalebbruwer 6 лет назад
Nick V Yes, this would probably force it into more logical methods than the ones it turned out to use in the last video.
@erilgaz
@erilgaz 6 лет назад
it could also intelligently say "This doesn't look like anything to me." FTFY
@skyacaniadev2229
@skyacaniadev2229 6 лет назад
That's a good idea. Also I wanna see what will happen if alphabet and numbers are trained together.
@allenwalker4703
@allenwalker4703 6 лет назад
From what I understood, I think you could just program it to say that whenever all the values in the last layer are either negative or really low, without changing the backpropagation process. Not sure though.
@TiagoTiagoT
@TiagoTiagoT 6 лет назад
You need to be sure your "random non-number" examples actually cover a lot of different things; otherwise the "not a number" result would have a similar chance of being triggered as the number results when, for example, instead of alphabetical characters you present the NN with white noise it didn't train on.
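For anyone who wants to experiment with the idea in this thread, one way to sketch it (my own illustration, not something from the video) is to widen the output layer to 11 classes, with index 10 meaning "not a digit", and to mix varied negative examples (letters, noise, blank images) into the training set so the extra class doesn't just learn one narrow kind of junk:

import numpy as np

NUM_CLASSES = 11          # digits 0-9 plus class 10 = "not a digit"

def one_hot(label):
    v = np.zeros(NUM_CLASSES)
    v[label] = 1.0
    return v

def softmax(logits):
    e = np.exp(logits - logits.max())   # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical labelling of a mixed training set: digit images keep their usual
# 0-9 labels, while letters, white noise, blank images, etc. all get label 10.
example_labels = [3, 7, 10, 10, 0, 10]
targets = np.array([one_hot(y) for y in example_labels])

The backpropagation procedure itself is unchanged; only the size of the output layer and the labelling of the training data differ.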
@vitalijsfescenko5055
@vitalijsfescenko5055 4 года назад
Not drunk, just stochastic
@anoriginalnick
@anoriginalnick 6 лет назад
Amazing use of visuals to explain concepts. Well done man ...
@maxsmith1733
@maxsmith1733 6 лет назад
Hello, I'd like to start by saying your videos are amazing and beyond interesting. One thing I'm still confused about is how to update the biases through backpropagation. Thanks! And keep it up!
@ybahman
@ybahman 6 лет назад
This!
@just_a_duck3371
@just_a_duck3371 3 года назад
I also don't get what these "nudges" actually mean. Do I simply add the value from the computed gradient to the weight?
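In case it helps anyone with the same question: in standard gradient descent (the usual formulation, stated here as a summary rather than a quote from the video), each weight and each bias is nudged by stepping against its partial derivative of the cost, scaled by a small learning rate \eta:

w \leftarrow w - \eta \, \frac{\partial C}{\partial w} \qquad\qquad b \leftarrow b - \eta \, \frac{\partial C}{\partial b}

So you don't add the raw gradient value to the weight; you subtract a small multiple of it, and the biases are updated by exactly the same rule using \partial C / \partial b.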
@twiggy_witch
@twiggy_witch 6 лет назад
At 12:25, he labels Fermat as a tease 😂
@alectoperez1383
@alectoperez1383 6 лет назад
#FermatsLastTheorem
@masonhunter2748
@masonhunter2748 3 года назад
And Albert Einstein as a genius
@plingvivin3032
@plingvivin3032 Год назад
@@masonhunter2748 what
@terencechengde
@terencechengde Год назад
You have no idea how useful this video is to me. I learnt deep learning at uni but honestly this has just helped me through some hard time!
@neilmcfarlane
@neilmcfarlane 6 лет назад
Amazing series; the best I've been able to find on explaining such a complex topic. Thanks!
@conoroneill8067
@conoroneill8067 6 лет назад
I feel like I'm missing something with the functionality of backpropagation. So, I get that you adjust the weights between the last two rows to adjust how the second-to-last row feeds into the last row in the optimum way, and then you change the second-to-last row, and then you backpropagate the second row of weights away from the end, etc. But what I don't understand is that if you change the second-to-last weights, that's going to change the second-to-last row of inputs, which means that the tweaked weights between the last hidden layer and the output will be wrong again. I'm trying to work out what I'm missing. Is it that all the layers of weights are tweaked simultaneously in order to be accurate? If so, I'm still not sure I understand how that works. (I'll probably try and watch the video again in the morning when my brain's properly awake.)
@3blue1brown
@3blue1brown 6 лет назад
+Conor O'Neill Good question! Backpropagation is just giving the list of adjustments you will make, but you don't make them until all are computed. And of course, since this is part of (stochastic) gradient descent, that's a process you repeat multiple times.
@alectoperez1383
@alectoperez1383 6 лет назад
You bring up a really good point. The next video, which will cover the actual mathematics of how backpropagation works, does take that into account. If you've taken calculus before, you've heard of the chain rule, which describes how to differentiate an expression like f(g(x)). The chain rule states that d/dx f(g(x)) = f'(g(x)) * g'(x). The chain rule is general enough that it's applicable to pretty much all situations where you can express stuff in terms of an f and g, even when f and g take multiple inputs. The important thing is that it allows you to only analyze one input at a time, making it easy to compute stuff. What happens in the case of backpropagation is that you're taking the derivative in terms of each of the weights of the cost function, and the chain rule accounts for how changes affect other changes. The math is too complex to put in a youtube comment (mainly because youtube doesn't have mathjax, so I can't write a lot of the symbols I'd need, like summations), but the next video should explain all of that!
@conoroneill8067
@conoroneill8067 6 лет назад
Thanks! That helps! Thanks for a great video - I love your work and the effort and care you put into these videos. Truly incredible. Thanks again.
@ashirizly
@ashirizly 6 лет назад
Consider that the changes merited by each example are quite small (because each example alone doesn't carry enough weight to merit making big changes), so the changes of weights based on these values are still accurate. It doesn't matter if you do a tiny step after each example, or a larger one after a thousand examples; the overall effect of each example is still like a single tiny step in the direction it points to.
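A runnable toy version of the point made in this thread (my own minimal numpy sketch with a structure similar to, but smaller than, the video's network; not code from the series): all the gradients are computed first via the chain rule, and the weights and biases are only nudged afterwards.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Toy sizes: 784-pixel input, 16 hidden neurons, 10 output neurons.
W1, b1 = rng.normal(size=(16, 784)) * 0.01, np.zeros((16, 1))
W2, b2 = rng.normal(size=(10, 16)) * 0.01, np.zeros((10, 1))

def training_step(x, y, lr=0.5):
    """x: (784, 1) input column, y: (10, 1) one-hot target."""
    global W1, b1, W2, b2
    # Forward pass, keeping intermediate values for the backward pass.
    z1 = W1 @ x + b1;  a1 = sigmoid(z1)
    z2 = W2 @ a1 + b2; a2 = sigmoid(z2)

    # Backward pass: compute ALL the gradients first...
    delta2 = 2 * (a2 - y) * a2 * (1 - a2)     # dC/dz2 for a squared-error cost
    dW2, db2 = delta2 @ a1.T, delta2
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)  # chain rule back through layer 2
    dW1, db1 = delta1 @ x.T, delta1

    # ...and only then apply every nudge at once.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

Calling training_step repeatedly over (mini-batches of) labelled examples is the gradient-descent loop the video describes.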
@lucianodebenedictis6014
@lucianodebenedictis6014 6 лет назад
For me the most difficult part about maths is ALWAYS notation, as they expect me to know exactly what that Greek letter stands for in every situation. Or maybe that's what it's all about.
@Skironxd
@Skironxd 6 лет назад
I find it really confusing when there's notation cross-over. Especially in the world of physics and engineering, there's only so many defined symbols that can be used for so many more concepts and topics.
@screwhalunderhill885
@screwhalunderhill885 6 лет назад
It's pretty natural for a mathematician. That's a problem when subjects, like this one, overlap with other research areas.
@lucianodebenedictis6014
@lucianodebenedictis6014 6 лет назад
I think we are all meaning the same thing
@grossersalat578
@grossersalat578 6 лет назад
This series is really well done and so important - thank you so much!
@omarcusmafait7202
@omarcusmafait7202 6 лет назад
This channel is absolutely my favourite!
@shengchuangfeng227
@shengchuangfeng227 5 лет назад
At 7:37, shouldn't the activations associated with the digit-0 neuron (whose activation should be decreased) change in the opposite direction? For example, if the neuron at the very top of the second-to-last layer increases its activation, with a positive weight, the digit-0 neuron's activation would increase. Is that correct?
@ludd8489
@ludd8489 3 месяца назад
Same question here... Does someone have an answer for that?
@GKS225
@GKS225 6 лет назад
OMG just noticed that Markus Persson (Notch - Creator of Minecraft) sponsored this video.
@fossilfighters101
@fossilfighters101 6 лет назад
Oh honey. The 3b1b comment section has been going crazy about this for years.
@GKS225
@GKS225 6 лет назад
fossilfighters101 It's the first time I noticed it. It's nice to see famous people supporting content creators that educate people
@sallerc
@sallerc 5 лет назад
It's quite a common Swedish name, so I wouldn't be too sure it's Notch.
@squibble311
@squibble311 4 года назад
NOTCH SPONSORED 3B1B OMFG
@mattiasli
@mattiasli 4 года назад
Extremely well-made, high-quality material; every animation, every transition perfectly timed with the explanation. Priceless.
@raver8056
@raver8056 4 года назад
I love these moments in maths or other sciences, where you have this one singular moment of pure enlightenment. You got me at 7:37 ....I got goose bumps when the first column of "+" appeared. This is just brilliant.
@MrDivyanshu33
@MrDivyanshu33 4 года назад
I'm just starting out with Deep learning this quarantine and man...it's a bit difficult to get my head around these concepts.
@JonesDTaylor
@JonesDTaylor 4 года назад
Me too man! I am doing the course by Andrew Ng and things are getting much clearer.
@MrDivyanshu33
@MrDivyanshu33 4 года назад
@@JonesDTaylor Me too 😁!
@parthvasoya3562
@parthvasoya3562 4 года назад
@Winston Mcgee hey..why??
@plasmasheep4098
@plasmasheep4098 4 года назад
@@JonesDTaylor same!
@RitikSharma-pc5yj
@RitikSharma-pc5yj 4 года назад
Include me too in your category...
@thewingdings1324
@thewingdings1324 6 лет назад
Why don't you have ads on these? I have no money for Patreon and I want to help you out.
@3blue1brown
@3blue1brown 6 лет назад
+TheWingDings1 I just think it's a nicer experience. If you have no money, don't worry about it, just watching is thanks enough.
@johncoleman1930
@johncoleman1930 6 лет назад
I know what you mean. The best thing you can do to help the channel is subscribing, liking, and sharing videos; as cliché as it sounds, it does help creators. I'm not a creator myself, just someone who wants to see these kinds of educational channels grow!
@codinghub3759
@codinghub3759 3 года назад
@@johncoleman1930 I have subscribed, and I like every 3blue1brown video I see. But I don't know anyone who is interested in this kind of stuff, so I can't really share.
@riddhimanna8437
@riddhimanna8437 3 года назад
This is such a wholesome exchange!🥺 Thank you so much Grant and I wish the best for you and the others who commented here!
@hdee5615
@hdee5615 6 лет назад
Your animations are so good!! And when you do recaps and refer back to what you have shown us... this is too awesome!
@SuperReminou
@SuperReminou 6 лет назад
Amazing work !! Such a pleasure to understand those underlying notions ! Thank you so much
@tejas8211
@tejas8211 6 лет назад
Dude how are you making these animations
@screwhalunderhill885
@screwhalunderhill885 6 лет назад
He makes them in Python.
@shaileshupadhyay3920
@shaileshupadhyay3920 4 года назад
Dr tenma how are u