
The spelled-out intro to neural networks and backpropagation: building micrograd 

Andrej Karpathy
550K subscribers · 1.9M views

Published: 28 Sep 2024

Comments: 1.8K
@ThetaPhiPsi 2 years ago
This is the single best explanation of backprop in code that I've seen so far. I once implemented a neural network from scratch, except for autograd, so micrograd is a good fit, and it's so clear and accessible. Thanks Andrej!
@leslietetteh7292 1 year ago
Actually true. And exactly the same for me: I once implemented a neural network from scratch and broadly understood it, but this is the best explanation of backpropagation I've seen. Excellent work.
@harshmalik3470 3 months ago
I can't even comprehend the level of mastery it must take to be able to distill such a complex topic into such a simple format, and the humility to give it out for free so that others may learn. Thank you so much, Andrej, for doing this; you're truly amazing.
@aieverythingsfine 8 days ago
yeah it was really impressive tbf
@bycloudAI 2 years ago
This is literally gold, you have explained everything so intuitively and made it so much easier to understand! Thank you so much Andrej for sharing this in-depth knowledge for free!
@ophello 1 year ago
You literally don’t know what “literally” means.
@flflflflflfl 1 year ago
@@ophello Not necessarily. One can use a word incorrectly while still knowing its true meaning.
@SOMEONE-jg6jg 1 year ago
love your videos bro
@writethatdown100 6 months ago
@@ophello I know this is a year-old comment, and my reply is pointless, but _technically_ 🤓 Merriam-Webster lists "used in an exaggerated way to emphasize a statement or description that is not literally true or possible" as one of the definitions. People define the dictionary. Not the other way around. And yes, it *literally* doesn't matter at all, but it annoyed me that you were wrong when trying to _correct_ somebody else's well-meaning compliment.
@__amkhrjee__ 1 month ago
I am mind-blown by the sheer simplicity & clarity of your explanation. You are an inspiration.
@imtexaspete 2 years ago
"remember back in your calculus class?...." nope. I'm subscribing anyway whenever I need a humble reminder that I don't know anything and there are people way way smarter than I am.
@omkarajagunde4175 2 years ago
W O W Same realisation 🙌🙌😔😔😔
@Forrest_dev 2 years ago
It's never too late to learn.
@vidbina 1 year ago
The beautiful part of tech is the feeling of constantly being mind-blown when realizing how little one knows and how much there is to learn. Studying micrograd has been on my list for a while thanks to George Hotz, and this series is making owning this context so much easier. Loving it. ❤️
@ycombine1053 1 year ago
Not smarter, more experienced. You are capable of understanding all of this given enough time and dedication.
@pastuh 1 year ago
If someone can explain it, that means it's simple.
@sanjay-89 10 months ago
This was an exceptional lecture. Just wanted to say thank you for taking the time to make this. I have spent time in university courses, reading books, doing assignments and yet, I truly understood more out of this single lecture than from anything else prior.
@mohit9920 2 years ago
That was incredible. Never has anyone been able to simplify neural networks in this manner for me. Please keep making such videos; you're doing god's work. By god, I mean the imminent AGI :)
@liamroche1473 1 year ago
Prescient. ;)
@notderek7408 1 year ago
Hey Andrej, idk if you'll read this but I wanted to echo others' appreciation for this fantastic introduction. I've been a SWE for many years but always ML-adjacent despite a maths background. This simple video has instilled a lot of intuition and confidence that I actually grasp what these NNs are doing, and it's a lot of fuel in my engine to keep diving in. Thank you!
@DrKnowitallKnows 2 years ago
Andrej, the fact that you're making videos like this is AMAZING! Thank you so much for doing this. I will be spending some quality time with this one tonight (and probably tomorrow lol) and can't wait for the next one. Thank you, thank you, thank you!
@2ndfloorsongs 2 years ago
And thank you for your videos, Dr Know It All. Always appreciate them.
@mattphorwich 2 years ago
I was stoked to discover Andrej sharing the knowledge on these videos as well!
@lonnybulldozer8426 2 years ago
You made love to the video?
@0GRANATE0 2 years ago
And what happened? Do you now understand DNNs?
@swathichadalavada9244 29 days ago
Thank you, Andrej! Your implementation of neural networks from scratch is impressive. The clarity and simplicity of your code make complex concepts like backpropagation much easier to grasp.
@keikaku9298 2 years ago
CS231 was life-changing for me. You are a fantastic educator. I hope this new endeavor works out for you!!
@kerwinmarkgordo3458 1 year ago
Thank you so much for doing a step by step simulation on how gradient descent works. I am grateful for the passion and effort you make in order to teach. These lessons are very essential as we continue to dive deep into learning.
@heliosobsidian 1 month ago
Wanted to say thanks for that awesome backpropagation video. I've been scratching my head over this stuff for a while now - had all these bits and pieces floating around in my brain but couldn't quite connect the dots. Your explanation was like a lightbulb moment for me! Everything finally clicked into place. Really appreciate you putting this out there for us to learn from.🙌🙌🙌
@ajmeryexperiences4186 1 year ago
Every morning I just visit this channel to check whether any video has been uploaded, waiting for the next lectures.
@lu0142 9 days ago
This is the first time I've finished a video as long as two and a half hours, though I split it over three sittings. I benefited a great deal, and I also now understand why andrej went off to start an education-related company. The video's breakdown of NN topics and the explanation of each concept are excellent! I'm also really looking forward to andrej's upcoming startup products.
@zachli3070 6 months ago
It's the clearest and most straightforward explanation of backpropagation and the training of neural networks I have ever encountered - effortless to understand even with a minor background in CS and math!
@deveshbhatt4063 7 months ago
Man, this is an absolute masterpiece. I can finish it at my own pace, and the intricate details and possible bugs are explained clearly. Feels like Morgan Freeman narrating. I can listen to Andrej all day long.
@TheAIEpiphany 2 years ago
58:41 "As long as you know how to create the local derivative - then that's all you need". Ok Karpathy. Next paper title: "Local derivatives are all you need". Nice to see you on YouTube! :))
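For reference, the line quoted at 58:41 is the chain rule as backprop uses it: each node only needs its local derivative, and the global gradient is that local derivative times whatever flows in from the node above. For a node z = f(x) feeding into the loss L:

\[
\frac{\partial L}{\partial x} \;=\; \frac{\partial L}{\partial z} \cdot \frac{\partial z}{\partial x}
\]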
@wahibkapdi5493 1 year ago
This is the first time that I learned about neural networks and actually understood them. Thank you.
@nadeembaig5943 2 months ago
The fact that I am able to understand all this so clearly without having to rewind, speaks volumes about this amazing person
@cmcq33 1 year ago
This is a combination of topic mastery and communication expertise. I thought I fully understood gradient descent/backprop, and have used it for years. However, I've never dove into manual calculation of gradients because it felt...gratuitous. I'm glad I set aside the 2 hours for this video, however. Now I understand it at the level where I can explain it to an intern at a conceptual level without leaning on formulae and hand-waving, which is a great feeling. Thanks Andrej!
@emmanueladebiyi2109 6 months ago
Amazing how you broke this down into first principles. I understood a lot of these concepts before now, but I'm pleasantly surprised at how much clarity I gained by watching this video. Thanks.
@KobeeFinsac 3 months ago
Thank you, Andrej, for this incredible and detailed video. The clarity with which you explain backpropagation and the construction of micrograd is exceptional. Bravo, and thank you for sharing your knowledge with us. You are an immeasurable source of inspiration.
@ivanburduk3586 3 months ago
Thank you for making this! As someone trying to understand from the ground up how neural nets are trained and how GPT works, for the purposes of skilling up in the AI Safety field, this was really educational and informative, while being super easy to follow! While I'm still a bit confused about some of the Python syntax (having not worked with it for a while, and not in incredible depth), this was still super helpful in understanding conceptually what backpropagation and gradient descent look like, step by step, at the code level. Looking forward to working through more of your videos!
@AmanBansil 1 year ago
I'm pausing frequently each time I encounter something I don't understand and using GPT-4 as an assistant to dive deeper. Thank you for this and other amazing instructional videos. I (we) truly appreciate your efforts.
@jaymn5318 7 months ago
This guy is a legend. Truly honored to listen to his lectures and finally understand all the operations happening under the hood. Lots of respect to Andrej Karpathy for devoting his time to educating the masses. No words to express the gratitude for his effort.
@unnisbees_1920 3 months ago
Absolute gold. Watched this after 3B1B's series on neural nets, and I must say these videos have shifted my view of DL from "dauntingly complex, evolving, and fast-paced" to "maybe I can learn this". Really grateful for the content you post @AndrejKarpathy!
@sidjnsn 1 year ago
Lol, the patience to put this together AND the grace to let us see your bugs - all in one human. Thank you for spelling out the detail of what goes on in that simple diagram with the boilerplate description that we've all seen a million times. I finally feel like I really understand it. Now if I can just remember!
@DarokCx 1 year ago
Wooow, what an introduction! It is by far the best and the easiest to understand. The way you break down and simplify things without us losing the main focus on WHY we are doing this is absolutely impressive! Thanks for sharing your knowledge.
@MUBEENAHAMEDKABIRRIBAYEE 9 months ago
With a strong background in calculus, it was pretty easy for me to understand backprop (I was even like 'HECK yeah, I still got it' when I was basically answering your questions throughout the video), but I have zero coding knowledge. I just started Python a few months back and now I'm getting used to it, but WOW, am I ever equipped for neural nets after just one video. Thanks, Andrej!
@SydneyPanda2016 6 months ago
These are amazing, Andrej. Beautifully explained, logical, easy to follow. Thank you so much for the generous knowledge sharing and the time you put into creating this content.
@sergiman94 8 months ago
I am a beginner on the AI path, and this video helps a LOT with how to implement and understand the core components of a neural net. Thank you for this video, and god bless you 🙏
@snejati86 8 months ago
You know you're going to heaven, right? Thank you!
@sumit-p4p 1 month ago
This is my first ever comment on any YouTube video, and I wanted to write it because of just how grateful I am for your video series. Your intuitive explanations have been a tremendous help for me in understanding lots of concepts within NNs. Thank you for making world-class knowledge accessible, Andrej. Onto the next video I go. :)
@sagarthacker5114 1 year ago
I usually don't comment on YouTube, but dude, seriously, this has been hands down the best explanation of backpropagation I have come across!! Thank you so much!
@juanherr19 7 months ago
This is my first time ever commenting on a YouTube video, and I just wanted to say thank you so much, Andrej!
@santiagocalvo 1 year ago
Almost done with the video, and this is the best video ever on neural networks. Thank you so much, from the bottom of my heart. You are a GOD.
@AiEquation 1 year ago
I just watched it end to end. Now I will watch it again, but this time I will write out the code and take notes. Best ML video I have seen this year. Can't believe I haven't stumbled onto it earlier.
@yashingle9460 1 year ago
Wow the clarity of concepts is just overwhelming
@yiqunnian1443 8 months ago
This is simply one of the best videos I have watched.
@JungHeeyun-t3x 1 year ago
It is literally "Zero to Hero". For some reason, it took me a few days to fully understand this video, but I know I will cover all the videos in depth. I'm so fortunate to have noticed your videos.
@bramar1278 6 months ago
One of the best videos I've come across on YouTube. Thanks a lot for this video. I hope you continue to share your knowledge and wisdom like this.
@ittaig 1 year ago
Great lecture - simply and thoroughly explained, with neat and clear code. One of the best lectures I have ever heard. Happy to have found this treasure. Thank you, Andrej!
@sujaggu1 10 months ago
What a beautiful lecture! Absolutely clear, powerful and systematic exposition of a complex topic. Thank you for your work!
@yyxx9309 1 month ago
I thought there wouldn't be any videos more helpful than Khan Academy anymore, but then I found this >< Thank you so so much!!
@bluestarwars 1 month ago
Amazing. Mr. Karpathy is probably an excellent engineer - but he is 100% an excellent teacher. Thank you.
@maxilorent89 1 year ago
This is definitely the best video on backpropagation I've ever seen. Thank you very much for the time you spent creating this amazing video.
@cezarmocanu5043 9 months ago
Honestly, I have no words. This is an amazing presentation, in terms of code, math, logic. Can't wait to continue with the other videos. Just amazing. Thank you so much for taking the time, and sharing your knowledge
@SaraHekmat-xr8rt 5 months ago
Thank you for sharing your valuable knowledge and making this complex topic so clear and understandable. Your trainings are much appreciated.
@za_daleko 1 year ago
Andrej, ur work is golden. I'm a data scientist, and it's giving me a nice revision of deep learning and expanding my intuition.
@rogermenezes 1 year ago
Just filled with gratitude today! Whatever led to this video... Whoever created diodes and integrated chips, whoever created the computer, made the personal computer accessible, video formats, to Al Gore for creating the internet, to folks making streaming video possible, to the creators of YouTube, Jupyter, matplotlib, Python... thank you all! Andrej, this video was magical 🙏
@skoppisetti 1 year ago
What a way to pay it forward. Kudos! Thank you. Thank you. Thank you, for the time you are taking to make these videos.
@ebateru 1 year ago
Thanks a lot for this elaborate explanation Andrej, very helpful.
@InglesConConfianza 9 months ago
That was amazing. I finally get how these neural networks work.
@dariovaquilema7763 7 months ago
I really appreciate your effort on this class. It was amazing, and I can finally say I've gotten a better understanding of this topic. You are a great and wonderful person. Be happy and take care...!!
@cw9249 1 year ago
2:07:00 I just get a big stupid smile on my face, seeing this magic and understanding how gradient descent is done now 😁 Beautiful work and explanation!
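The step at 2:07:00 is the plain gradient-descent update the video applies to every parameter p, with a small step size (learning rate) η:

\[
p \;\leftarrow\; p - \eta \, \frac{\partial L}{\partial p}
\]

The minus sign is the whole trick: moving against the gradient decreases the loss.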
@evansdoe 6 months ago
Stunning, exceptional, simple, and clear
@djsocialanxiety1664 6 months ago
This is the single best video on this topic. Truly gold. Thank you, Andrej.
@khushnoodnaqvi3793 6 months ago
Too good! So many 'aha' moments in one single lecture. I coded alongside, sometimes before the explanation (e.g. the zero_grad() before backward()). Very fulfilling experience. I will recommend it to my sons (one working and the other in college) and other people. Also can't wait to follow along coding on more such teaching videos. PS: The bloopers after the curtains are funny 😀
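For anyone coding alongside, here is a minimal sketch of the loop the zero_grad() remark refers to, using micrograd's public names (MLP, parameters(), backward()); the data and hyperparameters below are made up:

from micrograd.nn import MLP

model = MLP(3, [4, 4, 1])                  # 3 inputs -> two hidden layers of 4 -> 1 output
xs = [[2.0, 3.0, -1.0], [3.0, -1.0, 0.5]]  # toy inputs
ys = [1.0, -1.0]                           # toy targets

for step in range(20):
    ypred = [model(x) for x in xs]                         # forward pass
    loss = sum((yp - yt)**2 for yp, yt in zip(ypred, ys))  # squared-error loss
    for p in model.parameters():
        p.grad = 0.0                                       # zero_grad: grads accumulate via +=
    loss.backward()                                        # backprop
    for p in model.parameters():
        p.data += -0.05 * p.grad                           # gradient-descent step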
@MilanGatyas 1 year ago
Andrej, thanks a lot for this video - it took me a whole day to code along with you, and I refreshed my memory and learned a lot. I love your teaching attitude!
@9tongagi 1 year ago
Amazing!! Best NN lecture videos I've ever seen
@JaviRamirezG 8 months ago
This video is just pure gold! Thank you so much for making it. The way you explain things is off the charts!!! Thank you so much, man.
@stri8ted 1 year ago
Your explanation of the chain rule was brilliant.
@shaurya-dobhal 6 months ago
What a masterclass! PS: the outtakes were hilarious.
@abarkadun 11 months ago
Really enjoyed this exercise Andrej. Thanks a lot!
@mohitsrivastava5880 1 year ago
Now that every other comment can validate the indispensability of this gem of a video, I would like to mention that the bloopers at the end really cracked me up 😂. Thank you for this humility! Really appreciate what you are doing!
@turoniy 7 months ago
Thank you for sharing your knowledge in that brilliantly simple way.
@codyr2318 5 months ago
Thank you so much for releasing this. I'm so appreciative.
@getravi2k 5 months ago
This is an amazing session on ML and Neural networks.
@styssine 10 months ago
Was really worried about zeroing the gradient. What a relief.
@akhileshpandey123 1 year ago
Thanks for putting all this knowledge into such a basic example - now everything makes sense. 🙏
@santiagocalvo 3 months ago
And I'm back watching this series. My god, you are a genius.
@Omery-od6vu 7 months ago
The highest-quality content I've ever learned from.
@jarhrodriguez646 1 year ago
This is so amazing, insightful, and enjoyable; all at the same time. Thanks so much Andrej!
@sciab3674 5 months ago
Thanks. Wise people always express complex things in simple language or examples.
@aayushjoglekarpersonal7392 1 year ago
Thank you for making this tutorial! I have always been on the lookout for something like this. Most videos either go into super-deep details or give a brief overview. This was a perfect balance between depth and showing the actual usage of what we built. Bingeing your playlist now! :D
@MakeKasprzak 8 months ago
Yes I just sat through 2+ hours of calculus and enjoyed every minute of it.
@somdubey5436 6 months ago
This guy is so humble, almost a child-like innocence oozes out of him :)
@siarheirusak1874 1 year ago
This is a piece of art. The explanation I have always wanted. Thanks, Andrej.
@recepkucek3016 1 year ago
@Andrej About implementing the backward function recursively: I think it is actually BFS (breadth-first search), not topological sort. Topological sort starts from outer nodes (here, the data nodes and the output node) and ends with inner nodes.
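For what it's worth, the ordering built in the video is a post-order depth-first traversal, which does yield a valid topological order once reversed: a node is appended only after all of its children, so the reversed list visits the output first. A sketch of the pattern from the lecture, with 'loss' standing in for the final output Value:

topo = []
visited = set()

def build_topo(v):
    if v not in visited:
        visited.add(v)
        for child in v._prev:
            build_topo(child)
        topo.append(v)       # post-order: appended only after all children

build_topo(loss)
for node in reversed(topo):  # output first, back toward the leaves
    node._backward()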
@pravachanpatra4012 11 months ago
2:13:00 Micrograd implements backprop. You can create Value objects and do operations with them; in the background it builds a computational graph and keeps track of everything. You can call backward() on a Value object, which applies the chain rule to do backprop.
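That summary in runnable form - a made-up example in the style of the micrograd README:

from micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b            # each op records its operands, growing the graph
d = c * a + b**2     # algebraically, d = a^2 + a*b + b^2
d.backward()         # applies the chain rule through the recorded graph
print(a.grad)        # -6.0, since dd/da = 2a + b
print(b.grad)        #  0.0, since dd/db = a + 2b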
@Gatekept524 1 month ago
Thankful to be alive during this time
@yenomdab3162 28 days ago
Thanks from INDIA ❤ Such an informative video, and it was a great intro for me. Thank you so much, Andrej!!
@p20ph37 1 year ago
I went into this video knowing nothing about calculus, machine learning, or much python. After watching the video, I feel like a genius.
@koushik7604 1 year ago
It’s great to learn with you Andrej, thanks for this precious information ❤
@creativeuser9086 1 year ago
You're a good guy Andrej. Sincerely speaking.
@PrakashBisht 2 months ago
Thank you Andrej. You are an Amazing Teacher. :)
@patrickfarrell7507 7 months ago
If you're following along: at 1:54:44, you need to have implemented an __radd__ function in the Value object to allow you to subtract a Value object from an int.
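A sketch of the fix being described, stripped down to plain floats (no autograd) just to show Python's reflected-operator mechanism; micrograd's real methods also record the graph:

class Value:
    def __init__(self, data):
        self.data = data
    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data)
    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data)
    def __radd__(self, other):   # called for: 2 + Value(...), after int.__add__ fails
        return self + other
    def __neg__(self):
        return self * -1
    def __sub__(self, other):    # Value - int/Value
        return self + (-other)
    def __rsub__(self, other):   # called for: 2 - Value(...), after int.__sub__ fails
        return other + (-self)

print((2 - Value(3.0)).data)     # -1.0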
@nicholasjohnson8692 1 year ago
The reason you use a set at 23:11 is so that if you do a + a, you don't end up with a in the _prev tuple twice, which would presumably screw with the backprop later.
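Concretely (micrograd builds _prev as set(_children)):

from micrograd.engine import Value

a = Value(3.0)
b = a + a
print(len(b._prev))   # 1: the duplicate child collapses into one entry
b.backward()
print(a.grad)         # 2.0: the gradient still accumulates twice via +=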
@OnkarNora 4 days ago
The first 15 mins and the guy literally told me why derivatives were taught in my school.
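Those first 15 minutes boil down to a numerical check like this, using the same example function the video opens with:

def f(x):
    return 3*x**2 - 4*x + 5

h = 0.0001
x = 3.0
print((f(x + h) - f(x)) / h)   # ~14.0; analytically f'(x) = 6x - 4 = 14 at x = 3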
@JaazFelicio 19 days ago
I've been re-watching this class for a while, and so far I still haven't mastered it.
@AlexCage19 11 months ago
God grant this man health. This is simply wonderful!
@DirectCherry 1 year ago
Absolutely amazing lecture and repo. This really helped me grasp the concept of backpropagation, both mathematically and programmatically! My only critique is that your implementation of the MSE loss function in your lecture was missing the 1/n, making it more of a sum squared error than a mean squared error.
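The distinction drawn here, in code (ypred and ys as in the lecture's loop); note that dividing by n only rescales the gradients uniformly, which the learning rate can absorb:

n = len(ys)
sse = sum((yp - yt)**2 for yp, yt in zip(ypred, ys))  # what the lecture computes
mse = sse * (1.0 / n)                                 # a true mean squared error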
@yangautumn43 10 months ago
Thanks Andrej, super helpful, super easy to understand!
@thirdreplicator 1 year ago
You're the best teacher in the world!
@charlesgormley9075 1 year ago
Taking a deep learning course in undergrad and this is helping so much! You should make a textbook with videos and projects!
@lobintsevvladislav5982 8 months ago
What a nice explanation! Thank you for the video
@arsnakehert 1 year ago
Damn, this dude can teach. Thank you for making this, man.
@dontwannabefound 2 months ago
Why the inputs to the operations are considered 'children' becomes clear when you go to actually run the backprop. Basically you topologically sort the computational graph and start with the final output and then go backward from there. So if the final output is the root, then the inputs to that output are its children.
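A tiny concrete case of that orientation, using micrograd's Value (the variable names are made up):

from micrograd.engine import Value

x, y = Value(2.0), Value(-3.0)
L = x * y
print(L._prev == {x, y})   # True: the root's operands are its children
L.backward()               # seeds dL/dL = 1 at the root, then walks toward the leaves
print(x.grad, y.grad)      # -3.0 2.0, since dL/dx = y and dL/dy = x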
@nbme-answers 1 year ago
34:41 multiplicative derivative
46:21 bookmark
1:30:01 object-oriented language is a cluster