
3.5: Mathematics of Gradient Descent - Intelligence and Learning 

The Coding Train
Subscribers: 1.7M
Views: 241K

In this video, I explain the mathematics behind Linear Regression with Gradient Descent, which was the topic of my previous machine learning video ( • 3.4: Linear Regression... )
This video is part of session 3 of my Spring 2017 ITP "Intelligence and Learning" course (github.com/shi...)
3Blue1Brown's Essence of Calculus: • Essence of calculus
My videos on calculus:
Power Rule: • 3.5a: Calculus: Power ...
Chain Rule: • 3.5b: Calculus: Chain ...
Partial Derivative: • 3.5c: Calculus: Partia...
Support this channel on Patreon: / codingtrain
To buy Coding Train merchandise: www.designbyhu...
Donate to the Processing Foundation: processingfoun...
Send me your questions and coding challenges!: github.com/Cod...
Contact:
Twitter: / shiffman
The Coding Train website: thecodingtrain....
Links discussed in this video:
Session 3 of Intelligence and Learning: github.com/shi...
3Blue1Brown's Essence of Calculus: • Essence of calculus
Source Code for all the Video Lessons: github.com/Cod...
p5.js: p5js.org/
Processing: processing.org
For More Coding Challenges: • Coding Challenges
For More Intelligence and Learning: • Intelligence and Learning
📄 Code of Conduct: github.com/Cod...

Published: 28 Aug 2024

Comments: 486
@iamgrinhausgases · 3 months ago
Just wanted to say that this is easily the best and clearest explanation of gradient descent I've come across, on the web and in the books I've read. Thank you, sir.
@stevenrhodes2818 · 6 years ago
OMG! Thank you! My power went out and I figured I would try to learn gradient descent on my phone. This is the first time it's made sense. All those experienced mathematicians suck at being teachers, making it sound all crazy complicated and shit. You sir are amazing.
@fernandoerazo8135 · 1 month ago
My wifi just went out and instead of using it as an excuse, I am using my laggy phone to try and learn.
@vcrmartinez · 5 years ago
Hey, I'm a Brazilian student/software engineer studying Rec Systems and ML. Tons of articles, papers and videos did not do what you've just done. Now everything is crystal clear, thanks for the explanation.
@JulietNovember9 · 5 years ago
Going through Andrew Ng's Coursera... got stuck on how the Cost Function derivatives/partial derivatives are obtained.... 11:00 and on... Oh... MY... GOSH... this is GOLD!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Thank you so much!
@calluma8472 · 5 years ago
Same. Haven't done calculus since 1999, this helps a lot.
@seamusforever7081 · 5 years ago
Same here. I was so confused.
@tharukabalasooriya3269 · 4 years ago
@@calluma8472 gosh, how old are you?
@abirchakraborty624 · 4 years ago
Spot on ... This is a small bridge for the Andrew Ng's Coursera course . Specially the explanation how chain rule and power rule are coming into picture here, really helps.
@pemadechen9901 · 4 years ago
found my coursera classmate
@nikitachoudhary3878 · 4 years ago
After reading 20-30 articles and watching 20+ videos, I have watched the best video ever on Gradient Descent. I'm falling in love with you. Best explanation.
@SaurabhSinghAus · 4 years ago
Coming here from Andrew Ng's ML course. Got confused with Gradient Descent. This is Gold. You explained Linear regression so well.
@alishaaluu · 5 years ago
Usually math makes me cry, but while watching this I am learning and laughing at the same time. How cool is that? Lol. All thanks to you, bro! Keep up the good work. Cheers!!
@ajitkumar15 · 5 years ago
Thank you so much. After failing my exam on Machine Learning I was searching for videos on the Gradient Descent topic. After watching so many videos I landed on this page. By far this is the best video. You are simply a great teacher, because you have understood the topic so well that you are able to explain it in a really simple way. Thanks a million!!!!!
@TheSocialDrone · 4 years ago
I had searched on this subject and watched several other videos before I could find this amazing video on the topic; I am more than happy that I am now able to explain this concept to anyone; now it is so much clearer, thank you sir!
@umessi10 · 6 years ago
Thank you so much for this. One of the best explanations of gradient descent on YouTube. So far I'm loving your Intelligence and Learning series. Think I'm gonna binge watch the entire series now.
@CamdenBloke · 6 years ago
I'm studying up for an interview to transfer to the Machine Learning department Wednesday. This is enormously helpful in providing an actual mathematical (not just conceptual) understanding of gradient descent. Thanks!
@shivamkaushik6637 · 6 years ago
It was teacher's day yesterday, here in India. And today I have got this amazing teacher. Thank You
@badcode3976 · 3 years ago
I have seen some of your videos to get some concepts that I didn't get the first time in my ML class, and I'm truly convinced that these are the best tutorials on YouTube about ML; you make every concept so simple to understand and funny at the same time. Thanks a lot!!! Keep doing this great content!!!
@TimHulse · 4 years ago
You nailed it sir! I was confused when partial dJ/dm = 2*Error at one moment then suddenly had partial dError/dm glued onto it, but your clarification at point 19:15 clarified it. Please keep making videos!
@watchnarutoshippuden3228 · 6 years ago
Your videos are the only calculus and ML videos I can understand, you are the best! I just subbed to you; 'minimum' is singular and 'minima' is plural.
@r.d.machinery3749 · 5 years ago
This is a clearer explanation than Professor Ng's explanation in his machine learning video series. Ng denotes m and b as theta0 and theta1. He also reverses the terms in his line equation which confuses the Hell out of everybody. In addition, he doesn't take you through how the partial derivative is worked out and he doesn't show the code. A great explanation in only 22 minutes.
@matheuscorrea3277 · 4 years ago
I searched for a video like this for a long time, and the only one I could clearly understand was yours. Thank you so much, and congrats on the explanation.
@simaosoares165 · 6 years ago
You did it fantastically! These are concepts that I know well already but find difficult to explain, so I'll recommend your videos when pure IT guys (and not so educated audiences) ask me about the internals of the ML algos that I use.
@8eck · 4 years ago
3Blue1Brown's was hard for me, your explanations are waaay better.
@BinuVADAT · 6 years ago
You went into detail and explained the concept. Loved your fun-filled way of teaching. Thank you!
@via_domus · 6 years ago
you're a very good teacher, a bit crazy though lol
@Dennis4Videos · 3 years ago
The great kind of crazy tho!
@anusha8085 · 3 years ago
@@Dennis4Videos yes!
@Best_Real_Experience · 9 months ago
Those are the types of people who should teach us ❤❤
@paedrufernando2351 · 6 years ago
you deserve 200 million subscribers.. more than that your personality is really great!!!
@smrazaabidi1495 · 7 years ago
Really awesome, what an elegant style of delivering the concepts, mind-boggling. I wish and dream I could work and get an education under his supervision. Moreover, the gestures, tone, and humor were outstanding; I'm speechless. :) I must say that it is the best explanation of gradient descent I've ever seen. Thanks a lot.
@cuteworld9310 · 3 years ago
Amazing explanation, easy enough for a high school student to learn. Amazing how simple you made this complex concept. You sir are a genius!
@azr_sd · 5 years ago
Daniel, I have been following you since you had 2000 subs. I always enjoyed your videos, man. I started learning Deep Learning on my own and got stuck at understanding Gradient Descent, and since it is the backbone of ML and DL I want to know it deeply. I watched around 3 videos before this, and your video just explains it beautifully. Thanks for this video, it helped me a lot. Please keep making these kinds of videos explaining the math behind ML and DL algorithms, and again thank you for your videos. :) I am gonna follow you more and more from now on. If possible, try to make an awesome course on Udemy with the math and programming of ML and DL. Thank you again.
@rupambanerjee3 · 5 years ago
This is the best gradient descent video I have ever seen! Great work!
@lggood8375 · 4 years ago
Thanks for doing this; within 20 minutes you made clear what many hours of watching and reading totally confusing articles by others did not. Keep doing this... you made me unafraid of all the math.
@-long- · 6 years ago
I never saw any tutor who put so much emotion in the video like you lol Excellent channel! Thanks so much
@alexmisto423 · 5 years ago
After watching this i finally figured out the calculus behind back propagation. Thank you! BIG LIKE
@timt.4040 · 6 years ago
Was looking for an accessible explanation of gradient descent, and this was by far the best one I found--thanks!
@anannyapal9939 · 5 years ago
I was literally so frustrated with these things messing up my head... thank you sir for helping me survive... you are just fantastic 🙏🙏
@LudwigvanBeethoven2 · 6 years ago
You are so engaging that you turn this boring math into something actually interesting. Thank you so much.
@feliciafryer3271 · 6 years ago
Thank you!!! I’m a computer science graduate student and trying to understand gradient descent. This video is awesome, can’t wait to watch more of your videos.
@boemioofworld · 7 years ago
Amazing explanation! One of the best explanations on all of YouTube IMO.
@TheCodingTrain · 7 years ago
Glad to hear, thank you!
@idrisalhajiadamu7590 · 2 years ago
@@TheCodingTrain thanks for the explanation. I couldn't find the next video where you explained batch gradient descent.
@TheBestcommentor · 4 years ago
I've been studying this subject for a couple months in my final semester of college, and for some reason the connection between the loss function and the parabola just made it all click.
7 years ago
I cannot believe this video aired just when I needed it, thank you so much!
@JotaFaD · 7 years ago
A great youtuber recommending another one! Although I know Calculus from college, I think you did a great job explaining some of the rules. Keep it up Daniel.
@shantanu991 · 6 years ago
I think, you made it so simple. I was looking for a proper explanation of this formula. Liked. Subscribed.
@praveenharris6170 · 6 years ago
I cannot describe how useful this was to me! Thank you!
@prachikhandelwal2259 · 4 years ago
I wish I had a teacher like you. You are amazing, sir. I think it doesn't matter what you are studying; your teacher has the power to make the concept easy or difficult. And you are the one who makes everything extremely easy! And yeah, you are damn funny.
@spacedustpi · 6 years ago
This was the best tutorial on this subject that I've found, thank you for this too!
@TheWeepingCorpse · 7 years ago
thank you for everything you do. I'm a c++ guy but your videos are very interesting.
@thecodersexperience · 6 years ago
Hey, can you tell me the best place to learn C++?
@syedbaryalay5849 · 6 years ago
Plus, I would not jump into C++ as my first language; try learning an easy language and then start with C++. Also, you need to be sure what you are going to use C++ for.
@CamdenBloke · 6 years ago
Deitel and Deitel's book.
@douloureuxcrouton5780 · 6 years ago
gibson If his goal is to only learn C++, then learning C first is unnecessary. I would even argue it is a big mistake.
@michelaka6836 · 6 years ago
Superb breakdown of this often mis-explained concept!!!! A+
@samyakjain7300 · 5 years ago
A beautiful mathematical explanation of Gradient Descent! Way to go man...
@Ryeback101 · 4 years ago
Excellent videos. I just went through the playlist and they explain the concepts really well. You sir are a hero!!
@swatigautam9802 · 6 years ago
Your videos are entertaining and informative at the same time. Love it!
@rupalagrawal3534 · 4 years ago
It's a great video. Simple and easy language is used to explain every concept. Great work!!
@hamadaparis3556 · 3 years ago
this video demystified everything of the previous one, thank you so much
@adacristina3763 · 5 years ago
Amazing! The best explanation so far
@binilg · 6 years ago
Man, you are super!! I had a hard time understanding the mathematics of gradient descent and you made it very easy. Thank u
@ryanmccauley211 · 6 years ago
After watching a ton of videos I finally understand it thanks to you. Thank you so much!
@user-xn4yu5rn9q · 5 years ago
At first I thought this is BS, now I’m so thankful
@life_outdoor9349 · 6 years ago
One of the best video tutorials I have come across!!
@AdityaSingh-lf7oe · 4 years ago
Thank u so much!!! This is just what I needed... U rock!!
@jys365 · 6 years ago
Thank you so much for this explanation! I spent a few days on this concept not getting how the gradient formula was set as shown here!
@debasishhazra3222 · 4 years ago
wonderful way of teaching and just fantastic video. Just Loved it Man...!!!
@mathhack8647 · 2 years ago
I love this. "I tried again." Love this.
@satyaNeelamraju7 · 7 years ago
An excellent video, the best on the internet for the Gradient Descent algorithm. Thank you so much :) ... Keep posting like this.
@Kidkromechan · 6 years ago
This is EXACTLY what i had been searching for the past week pfft.. Thank you Sir ^_^
@99chintu · 5 years ago
That is one of the best explanations I have seen on YouTube. Thanks a lot.
@snackbob100 · 4 years ago
this video is fantastic, you are a very talented teacher
@danielkashkett7040 · 4 years ago
This really helped me understand the MSE derivative. Great job!
@crehenge2386 · 7 years ago
It's interesting how different youtube channels become different classes^^ Khan gives you all the calculus you could ever want if you're a beginner.
@jebastinkoilraj1951 · 5 years ago
OHH man.. you're one hell of a teacher... Loved it
@annperera6352 · 3 years ago
Thank you, Sir. This teaching gained you a subscriber.
@mikelosei382 · 3 years ago
very, very, very helpful!!! I'm in grade 12 and was researching how exactly calculus could be applied to com sci, and this was a life saver! I had no idea what I was doing before this XD Thankss
@TheCodingTrain · 3 years ago
So glad to hear thanks for the nice comment!
@michaelstopa1774 · 3 years ago
Dude, you are completely mad. But in the most noble sense of this word;)! Fantastic way of explaining an actually quite complex piece of math. And it's very funny too;). Congrats and hats off. You're an excellent educator.
@chandimaindatissa6562 · 5 years ago
Simple and easy to understand. Thanks for sharing other important links. Well done!!
@pramodkhatiwada6189 · 6 years ago
It's awesome; I understood the concept of error minimizing and jumped over here to comment.
@ho9462 · 5 years ago
Best explanation ever in Gradient Descent.
@olicmoon · 6 years ago
it's crystal clear even for a person like me lacking understanding of calculus
@pradiptahafid · 5 years ago
Dan, I have been a math tutor for 1.5 years. I know what a derivative is, yet gradient descent was still confusing for me; if someone asked me, I would have no idea how to explain it in fewer than 3 sentences. Your explanation is mind-blowing. I guess you will make a lot of universities go bankrupt. People will just open your channel instead.
@TheCodingTrain · 5 years ago
Thank you so much!
@rashariyad1821 · 3 years ago
Thank you, it's a very simple yet amazing explanation.
@kenhaley4 · 7 years ago
Well, since I gave you a negative review on the calculus video, I feel I owe you one here. I thought that was great! The only thing you glossed over was the fact that the cost function is actually a summation of all the errors of each x value. But, since the derivative of a sum is simply the sum of the derivatives, putting the computation of m and b inside the for loop works fine. (Seeing that in your code at first actually bothered me, but now I see that it's no problem--it's exactly what's needed.) I found it fascinating how simple everything turns out after going through all the calculus. And I think that was the important point. Nice job.
@TheCodingTrain · 7 years ago
Thank you, I really appreciate it. I think there is still more room for improvement and, in a way, I'm just making these videos to help give myself the background for future videos. But I'm glad this one seems to be better received!
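The point raised in this thread (the cost is a sum over all data points, and the derivative of a sum is the sum of the derivatives, so updating inside the loop works) can be sketched in JavaScript. This is a hedged illustration with made-up names, not the actual p5.js source from the video:

```javascript
// Stochastic gradient descent for y = m*x + b, minimizing squared error.
// Updating m and b inside the loop (per point) descends the same cost
// surface as accumulating all the per-point gradients first, because
// the derivative of a sum is the sum of the derivatives.
function trainEpoch(data, m, b, lr) {
  for (const { x, y } of data) {
    const guess = m * x + b;
    const error = guess - y;
    m -= lr * 2 * error * x; // chain rule: d(error^2)/dm = 2 * error * x
    b -= lr * 2 * error;     // d(error^2)/db = 2 * error
  }
  return { m, b };
}

// Usage on points that lie exactly on y = 2x + 1:
const data = [{ x: 0, y: 1 }, { x: 1, y: 3 }, { x: 2, y: 5 }, { x: 3, y: 7 }];
let params = { m: 0, b: 0 };
for (let i = 0; i < 2000; i++) {
  params = trainEpoch(data, params.m, params.b, 0.01);
}
console.log(params.m.toFixed(2), params.b.toFixed(2)); // approaches 2.00 1.00
```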
@AliAzG · 6 years ago
The best explanation of gradient descent! Thank you very much.
@tanwirzaman · 6 years ago
One of the best explanations of GD
@philips5130 · 3 years ago
I love your energy! Nice explanation btw
@Raghad-mz8el · 5 years ago
Thank you for explaining this, and even better than my professor did in multiple lectures.
@Suigeneris44 · 6 years ago
You're an amazing teacher! I wish I had a Math teacher like you!
@georgyandreev7469 · 4 years ago
Man that’s awesome. Being Russian I’ve understood every single thing. Keep up!
@lightningblade9347 · 6 years ago
Instant subscription, I adore the passion you have for what you do :) .
@madhumitamenon7757 · 5 years ago
Thank you for making this look so simple!! You are an amazing teacher.!!
@dipakgaikwad3664 · 1 year ago
You are awesome; at least you tried not to skip the mathematics, unlike most who just run away from it.
@cimmik · 5 years ago
Thank you so much. Finally I found an explanation that I could understand. Good job, Daniel :D
@divyagulakavarapu9614 · 5 years ago
Nice one. I actually had a doubt about GD, but after watching your video I think I'm a bit clearer.
@dmill989 · 4 years ago
Great explanation thank you! I kept seeing the chain rule, which I understand, but no one was explaining explicitly which chain of functions we are using it on and that the loss function is at the end of the chain.
@bosepukur · 7 years ago
Very energetic presentation... loved it.
@TheCodingTrain · 7 years ago
Thank you!
@RazingForx · 2 months ago
WOW, this is such a great video, love it.
@crvnse · 5 years ago
Awesome explanation of the cost function, derivatives, and their usage in ML.
@joesan3597 · 4 years ago
That was indeed very well explained! Thanks for the video! Could you please explain why you treat J(m, b) = ERROR^2 and then substitute the ERROR with mx + b - y when doing the partial derivative?
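For anyone with the same question: J depends on m only through the error, so you substitute error = mx + b - y and apply the chain rule, giving dJ/dm = 2 * error * x. A quick finite-difference check of that claim, with my own made-up numbers (not from the video):

```javascript
// J(m, b) = error^2 with error = m*x + b - y. By the chain rule,
// dJ/dm = 2 * error * d(error)/dm = 2 * error * x.
// Compare the analytic derivative against a numerical one.
const x = 3, y = 5;
const J = (m, b) => (m * x + b - y) ** 2;

const m0 = 0.7, b0 = -0.2, h = 1e-6;
const numeric = (J(m0 + h, b0) - J(m0 - h, b0)) / (2 * h); // central difference
const analytic = 2 * (m0 * x + b0 - y) * x;                // chain rule result

console.log(numeric.toFixed(4), analytic.toFixed(4)); // both print -18.6000
```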
@aGh-rw7dd · 7 years ago
No worries I understood this lecture and appreciate it. I have studied Calculus 1 &2 in my high school
@Omar-kw5ui · 4 years ago
Excellent explanation. Although I must point out that we travel in the direction of the negative of the gradient. So we multiply by -(learning rate)
@luthfiramadhan7591 · 4 years ago
Sir, can you explain please? I don't get why he adds the gradient.
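On the question of why the gradient sometimes appears to be added: the update always moves against the gradient, but if the error is defined as y - guess instead of guess - y, the minus sign gets absorbed and the update looks like a plus. A small sketch with made-up numbers:

```javascript
// With error = guess - y, the descent step is m -= lr * 2*error*x.
// With error = y - guess, d(error^2)/dm = -2*error*x, so the descent
// step m -= lr * (-2*error*x) becomes m += lr * 2*error*x: an apparent
// "+" that is still moving against the gradient.
const lr = 0.1, x = 2, y = 7;
const m = 1, b = 0;

const errA = (m * x + b) - y;     // "guess - y" convention
const stepA = -lr * 2 * errA * x; // m -= lr * dJ/dm

const errB = y - (m * x + b);     // "y - guess" convention
const stepB = lr * 2 * errB * x;  // "+" update, same direction

console.log(stepA === stepB); // true
```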
@realcygnus · 6 years ago
superb ! as always. Knowledge of the more advanced concepts/techniques, especially higher level maths/abstractions etc. is primarily, precisely what separates the boys from the men / the script kiddies from software engineers etc. fear it NOT ! he plays the part so well mostly as a great teaching strategy/angle so as to reach as many as possible & to make them feel they aren't alone. Even if he really doesn't like it. I'd like to think its mostly an act anyway. & the Oscar goes to: Dan Shiffman ! He do knows his shiz though ! these vids have been priceless to me. thanks ! can't wait for the rest of Neural Networks
@111rave · 6 years ago
Your tutorials are so helpful! Thank you. For some reason, once, your video got muted, and it was so hilarious to see your waving arms and your fuzzy energy! :D
@adityarprasanna · 5 years ago
It doesn't matter whether it's (guess - y) or (y - guess) in the cost function because it's being squared anyway right?
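Regarding the question above: yes for the cost itself, since squaring kills the sign, but the partial derivatives flip sign, so the update formula has to match whichever convention was chosen. A small check with made-up numbers:

```javascript
// (guess - y)^2 and (y - guess)^2 give the same cost, but their
// partial derivatives with respect to m differ by a factor of -1
// in form, and only agree once the inner derivative is included.
const x = 2, y = 7, m = 1, b = 0;
const guess = m * x + b;

const costA = (guess - y) ** 2;
const costB = (y - guess) ** 2;  // identical cost either way

const dA = 2 * (guess - y) * x;  // d(costA)/dm
const dB = -2 * (y - guess) * x; // d(costB)/dm, inner derivative is -x

console.log(costA === costB, dA === dB); // true true
```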
@shubhamshah6645 · 3 years ago
Beautiful explanation, Thanks for this tutorial
@adnanahmed316 · 5 years ago
Wow! Thanks! It would be so fun for people who work with you. May Allah bless you.
@quanghong3922 · 5 years ago
So lucky; when I was in high school I did a lot of math exercises about derivatives, and I loved it.
@shadmankudchikar6978 · 7 years ago
Awesome, you really gave a different way to look at GD; would love to see more ML videos by you. Awesome work bro!