
10.4: Neural Networks: Multilayer Perceptron Part 1 - The Nature of Code 

The Coding Train
1.7M subscribers
311K views

Published: 21 Aug 2024

Comments: 300
@bilourkhan3345 5 years ago
A happy face always helps to learn with ease and fun. Keep it up, man!
@saramariacl 4 years ago
yes, it's so true
@bilalazeemshamsi7895 3 years ago
Sir, the question is how can a person who is in this field be this happy? lol :P
@somecho 4 years ago
One year ago I was watching your tutorials on how to draw squares on a canvas with code. One year later I'm trying to build machine learning models, also with the help of your tutorials. I'm not even a CS student, I'm a pianist!
@rakib17874 3 years ago
great! Do u play in concert??
@seemarai5310 6 years ago
Well, I have to say you could be elected for the best teacher award. You are simply a perfect teacher.
@jonkleiman8018 6 years ago
It's because of your teaching that I've decided to pursue a career in this field. A brilliant balance of fun and seriousness.
@TheCodingTrain 6 years ago
Best of luck to you!
@optymystyc 1 year ago
I'm here because the coursera course instructor in the class I'm taking just can't explain it with as much joy and happiness as you. I feel like the information I'm getting here is paired with enthusiasm and that's the way it should first be introduced to my brain.
@stopaskingmetousemyrealnam3810 4 years ago
Taking it back to Boolean Algebra makes it very clear why MLPs are a natural solution to the XOR problem, thank you. Nobody's done that yet in anything I've seen, even though it's obvious in hindsight and maybe should have been obvious in advance.
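To spell out the Boolean-algebra step this comment refers to (a quick check of my own, not code from the video): XOR(a, b) equals AND(OR(a, b), NAND(a, b)), and each of OR, AND, and NAND is linearly separable, so each can be a single perceptron even though XOR on its own cannot.

```python
# Sketch: verify XOR(a, b) == AND(OR(a, b), NAND(a, b)) over the truth table.
# Each gate on the right-hand side is linearly separable, so a single
# perceptron can represent it; composing them is what the hidden layer does.
for a in (0, 1):
    for b in (0, 1):
        xor_direct = a ^ b
        xor_composed = (a | b) & (1 - (a & b))  # OR(a, b) AND NAND(a, b)
        assert xor_direct == xor_composed
        print(a, b, xor_composed)  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```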
@LeandroBarbksa 6 years ago
I love how excited you are explaining this.
@chandranshsharma1685 5 years ago
Amazing teacher. I have my semester exam tomorrow and was searching a lot about the multilayer perceptron on the internet and wasn't able to find a good explanation. Thank god I found your video. 💙
@shedytaieb1083 3 years ago
Man, you have no idea how useful and interesting the content you're creating is. GOOD JOB
@vanshitagupta4183 2 years ago
You teach this subject with such passion. It is kinda getting me excited about learning it too
@8eck 4 years ago
The linearly separable vs. not linearly separable explanation is the best explanation! It's now logical why multiple layers are required! GREAT! Thank you!
@joachimsaindon3658 3 years ago
This video is a great example of why your channel is one of my favorites.
@morphman86 5 years ago
I have been trying for quite some time to figure out what the "hidden layer" is, how it works and what the purpose is. So many others either get right up to that subject and then stop posting, or talk about it as if I should already know. So for some time, I have only been able to do simple perceptrons. Now I finally understand that hidden layers are just layers of multiple perceptrons being pushed into other perceptrons, where each perceptron has been trained to complete a different task. Thank you!
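To make that picture concrete, here is a minimal sketch (my own illustration with hand-picked rather than trained weights, not code from the video): two perceptron-style units form a hidden layer computing OR and NAND, and a third unit takes their outputs and computes AND, which together gives XOR.

```python
# Sketch with hand-picked (not trained) weights: a tiny multilayer perceptron
# built from single perceptron units of the form
#   output = 1 if w1*a + w2*b + bias > 0 else 0

def unit(a, b, w1, w2, bias):
    return 1 if w1 * a + w2 * b + bias > 0 else 0

def xor(a, b):
    h1 = unit(a, b,  1,  1, -0.5)    # hidden unit 1: OR
    h2 = unit(a, b, -1, -1,  1.5)    # hidden unit 2: NAND
    return unit(h1, h2, 1, 1, -1.5)  # output unit: AND of the hidden outputs

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```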
@paulorugal 6 years ago
You're the BEST CS TEACHER THAT I NEVER HAD
@marcocastellano2451 4 years ago
I found you years ago when I needed to learn steering algorithms. You made the math and algorithm simple(r) to understand and your videos are a lot like a drop of sunshine in my day. It reminds me of Reading Rainbow when I was a youngster. Now I am back to continue my work on CNNs. And there you are again in my suggested videos :D
@muskansaxena5708 3 months ago
The way you teach is fun, it's like you're enjoying teaching it yourself, which we students love... one could fall in love with the knowledge presented here!!
@anshrastogi9430 2 years ago
I literally want this sort of sense of humour in my college professor. Thanks for saving my semester. Love from India.
@TheTimeforwar 4 years ago
If every black kid in the hood had a teacher like this they'd all succeed at understanding this easily; why? Because this guy's likability makes you want to learn. When you enjoy the person teaching you, you will usually enjoy 'what' they're teaching you. The 'capacity' to 'understand' has very little to do with 'achievability' in human affairs & 'thinking' certainly pertains to human affairs. I'm understanding concepts I've never encountered before, not because I'm 'smart', but because the instructor in this video is interesting, funny, has a charm individually his own and is not intimidating or threatening in anyway, least of all, neither is he boring. Every young person deserves a teacher like this.
@Matt23488 5 years ago
"Maybe you just watch my previous videos on the Perceptron" Yes. Yes I did.
@FredoCorleone 2 years ago
What a master. We are really fortunate to have Daniel as an instructor here on YouTube!
@henriqueb287 3 years ago
Keep going man, I wish you had been my teacher in college. Fun, smiles, and learning together. Such a great experience to learn with you; 15 minutes passed like nothing but were full of knowledge. Love from Brazil! Keep going!
@scipsyche5596 7 years ago
Good job! The topic is very interesting; what's more interesting is the way he teaches ☺
@ericmrozinski6143 4 months ago
An excellent and exciting explanation! This is exactly what I was looking for in trying to understand the motive behind the multi-layer perceptron. Not to be taken for granted!
@najibsaad5765 7 years ago
You are outstandingly interesting. Keep going!
@TheCodingTrain 7 years ago
Thanks for the nice feedback!
@SidVanam 4 years ago
Cool to see how you linked the "Linearly Separable" terminology to the Boolean truth tables! Learned something applicable and new!
@justincollinns 5 years ago
Your answer to "But what is XOR really?" at 10:46 was just what I needed! Thank you!
@anaibrahim4361 1 year ago
I was extremely happy when I discovered that you had posted a video on a topic that I was searching for.
@critstixdarkspear5375 6 years ago
You should be given your own show on the science network. More educational, fun, engaging and entertaining than 99% of the crap we pay for. Better than most courses I have seen on programming. Bill Nye + Bob Ross + Mr. Rogers. 11/10
@redIroncool 5 years ago
I actually love your enthusiasm!!!!
@danishshaikh2994 1 year ago
Man, I'm speechless, god level explanation 🔥🔥🔥
@Sripooja.Mahavadi 4 years ago
How can someone dislike his video? He seems to be a genuinely happy man, exuding joy... let him be :) The kind of excitement he has towards his code is what I need towards my life ;)
@mkthakral 2 years ago
Teachers like you are so rare. Gem.
@rogerhom1512 1 year ago
I've seen a lot of videos about neural networks, both advanced ones (which go over my head) and beginner ones (which are too general). That XOR example in this video was an epiphany for me! Now I have an intuitive sense of what makes neural networks so special (vs., say, linear classifiers). Now I feel like I'm finally ready to go deeper into this subject.
@TheCodingTrain 1 year ago
I'm so happy to hear this!
@rogerhom1512 1 year ago
​@@TheCodingTrain Yah, that bit about how a single layer network can only solve linearly-separable problems, and how hidden layers fix this limitation, finally makes intuitive sense to me thanks to the XOR example. Thanks! Not sure if you cover this in subsequent videos, but I'd be interested to hear your take about why having multiple hidden layers can be useful, vs. just one hidden layer.
@CloverSerena 3 years ago
I like you. You are the ideal teacher. The genuine sincere pleasure of teaching what you love to others. I can feel that love.
@anonymousvevo8697 1 year ago
The only channel with no haters! Amazing, sir! Good luck, love you
@parths.1903 3 years ago
This dude is so awesome, I can watch him teach all day. Love you, pal.
@kineticsquared 6 years ago
Outstanding explanation of linearly separable. You make it very easy to understand why multiple perceptrons are required. Plus I love Boolean logic. Thank you.
@joshvanstaden7615 3 years ago
Give this man some Concerta! Lol, in all honesty, I love being taught by people who are passionate about what they do. Keep it up!
@carlosdebourbondeparme6021 4 years ago
You can only have lunch if you are hungry AND thirsty. Love the videos :)
@battatia 7 years ago
third! Really appreciating these tutorials, much friendlier than others!
@TheCodingTrain 7 years ago
Thanks, that's nice to hear!
@waisyousofi9139 2 years ago
It is a unique talent to teach and bring a smile at the same time. Wow...
@backtashmohammadi3824 2 years ago
Holy juice. That was an amazing explanation. My professor at uni confused me a lot, but this video made my day.
@samwakieltojar8154 4 years ago
this man has ENERGY
@Sworn973 7 years ago
Interesting, so basically the same analogy as building logic gates from transistors in electronics. You kind of add them together to get more complex operations. Very good material. Keep going, I'm really into this.
@grainfrizz 6 years ago
6:57 genius. Very effective teacher
@usmanmehmood7614 6 years ago
This video just made me simply happy. Great thanks from Pakistan. NUST needs to hire such professors.
@fernandolasheras6068 4 years ago
OMG. Best video on NN basic concepts by far. And craziest too. Very fun to watch. Congrats!!!
@ahmarhussain8720 3 years ago
I got that click where you suddenly understand a concept by watching this video, thanks so much
@webberwang6520 6 years ago
I haven't heard this great an explanation before on YouTube, great stuff!
@kumudtripathi4054 5 years ago
Loved the way you are teaching... I already knew MLP, but your way of teaching makes me watch it again.
@kdpoint4221 5 years ago
U made me understand better than any simplified notes.......
@Bo_om2590 4 months ago
This guy has a golden heart
@waisyousofi9139 2 years ago
What a nice teacher. Truly enjoying the way you teach and convey your knowledge... plz keep going...
@nicholask9251 7 years ago
Great videos and tutorials, big fan here. Cool that you don't just write the code but also explain the concept at the beginning.
@graju2000 4 years ago
Man, I wish they'd give you a Nobel Prize for teaching!
@d.g.7417 1 year ago
I'm speechless. What a beautiful explanation!
@AM-jx3zf 3 years ago
wow this guy is so animated. instantly likeable.
@TheAsimjan 4 years ago
Amazing explanation... magically delivers a complex topic.
@morphman86 5 years ago
Another way to see linearly separable problems: If it has a binary output, as in it either is or it isn't. With the dots on the canvas, they are either below the line, or they aren't. We just picked "aren't" to mean "above", but that's how we humans chose to read the output. We read it as "below" or "above", the computer reads it as "is" or "isn't". If you draw a line across your data and define a relationship between the data point and the line, the point either falls into that relationship, or it doesn't.
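A minimal sketch of that "is / isn't" reading (my own illustration, not code from the video): a single perceptron with weights w1, w2 and a bias only reports which side of the line w1*x + w2*y + bias = 0 a point falls on, which is exactly why one unit can only handle linearly separable problems.

```python
# Sketch: a single perceptron as a binary "is / isn't" test.
# The weights define the line w1*x + w2*y + bias = 0; the output only says
# which side of that line the point (x, y) is on.

def perceptron(x, y, w1=1.0, w2=-1.0, bias=0.0):
    return 1 if w1 * x + w2 * y + bias > 0 else 0  # 1 = "is", 0 = "isn't"

# With w1=1, w2=-1, bias=0 the boundary is the line y = x, so the unit
# answers "is this point below the line y = x?"
print(perceptron(3.0, 1.0))  # 1: (3, 1) lies below y = x
print(perceptron(1.0, 3.0))  # 0: (1, 3) lies above y = x
```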
@doug8171 5 years ago
Great example of the need for more than one perceptron layer for the XOR.
@drakshayanibakka11 4 years ago
More excited to watch your videos. Keep rocking with your enthusiasm.
@my_dixie_rect8865 6 years ago
Love this video. Explained it really well. I have an exam on Wednesday which covers MLP and the functions of layers and neurones. This should help form my answer.
@HeduAI 5 years ago
Awesome explanation! You are so gifted!
@leylasuleymanli725 6 years ago
Today I should study MLP but because of some problems I could not concentrate. But after watching your tutorial you made me smile, forget about the problems, and understand the topic. Thanks a lot :)
@TheCodingTrain 6 years ago
glad to hear!
@likeyou3317 6 years ago
Damn Dan, you seem to be such a lovely person, and I say it as a man! Keep doing these tutorials because I don't know if there is any other channel on yt explaining neural networks in code as well as you do.
@missiongrandmastercurvefev8726
Awesome. Your way of teaching is perfect.
@raitomaru 5 years ago
Really enjoyable class!
@venkatdinesh4469 3 years ago
ur teaching style is really awesome....
@NightRyder 5 years ago
Thanks, got my exam in 8 days!
@furrane 7 years ago
Great video as usual Dan, I'm looking forward to the sequel =) On a side note, I think everyone here understands !AND but the usual way is to call this gate NAND (for Not AND).
@TheCodingTrain 7 years ago
oh, hah, yes, good point!
@tecnoplayer 7 years ago
Thanks for teaching us assembly, sensei.
@4Y0P 7 years ago
I love the way you explain things, energetic but informative, loving these videos!
@yisenliang8114 1 year ago
Fantastic explanation! This is just what I need.
@Cipherislive 5 years ago
What a genius teacher you are. Appreciate you, sir.
@montserratcano2389 6 years ago
Great video! Thank you very much! You just saved my academic life :)
@TheCodingTrain 6 years ago
Glad to hear!
@mohamedchawila9734 5 years ago
Ting!!! I've learned something, 'xor' => 8:04
@baog4937 5 years ago
Sir, your method is Excellent
@jt-kv3mn 5 years ago
This is more than just a Neural Networks tutorial! thx
@sarveshrajan1624 6 years ago
awesome and easy explanation. thanks!
@TheCodingTrain 6 years ago
Thank you!
@PoojaYadav-hr2ub 4 years ago
Woweeeeee ... Another level of explanation
@gururajahegdev2086 2 years ago
Very Nicely Explained. Great Tutorial
@N00byEdge 7 years ago
Hey there. I work as an artificial intelligence expert. I write state-of-the-art neural network libraries in C++ for a living. If you would like to talk about NNs with me in person, or just ask me any questions, feel free to do so. I like your teaching style and I think knowledge about these kinds of things should be more universal!
@N00byEdge 7 years ago
And with that, of course I can also help with training algorithms and how to work with your data when using neural networks.
@wawied7881 7 years ago
Good job! Quite an interesting topic.
@Smile-to2ii 2 years ago
I love your energy and smiling face.
@kashan-hussain3948 5 years ago
Thank you Sir for making concepts easier.
@jan_harald 7 years ago
9:25 YOU FORGOT TO MENTION *STRAIGHT* LINES. It's REALLY easy to draw a curved-line solution...
@TheCodingTrain 7 years ago
Yikes, thanks for this important clarification!
@furrane 7 years ago
A geometric "line" is what you would call a straight line.
@sandarabian 5 years ago
Only straight lines are linear!
@RafaelBritodeOliveira 7 years ago
I'm really enjoying those videos. Thank you very much for all your hard work.
@elizabethmathewst 5 years ago
Beautiful presentation
@sachinsharma-kw4zd 6 years ago
You are amazing bro. Keep it up. I'm learning a lot from you.
@TheCodingTrain 6 years ago
Thank you!
@edvaned8207 7 months ago
Great class, great professor. Thanks for sharing it with us.
@nageshbs8945 4 years ago
11:40 very well explained, thank you!!
@KishanKa 6 years ago
Nice way you have explained the basics, thanks 😊
@augustoclaro 6 years ago
Thank you so much man! Your videos are the best I found on the subject. You are a genius!
@xavmanisdabestest 5 years ago
Wait so perceptrons are these crazy learning logic gates that work on linear systems. That's rad!
@dubonzi 6 years ago
I'm loving your channel
@60pluscrazy 2 years ago
Very well explained and expressed 👌🙏
@learnapplybuild 5 years ago
So much excitement you have to share knowledge... I liked that gesture... keep it up dude... Thank you
@srinivasadineshparupalli5139 4 years ago
Awesomeness at its best.
@gajuahmed4426 3 years ago
Couldn't appreciate it more, brother.
@algeria7527 7 years ago
I really love the way you teach. Good work, keep it up.