
Neural Networks from Scratch - P.5 Hidden Layer Activation Functions 

sentdex
1.3M subscribers
289K views

Neural Networks from Scratch book, access the draft now: nnfs.io
NNFSiX Github: github.com/Sentdex/NNfSiX
Playlist for this series: • Neural Networks from S...
Spiral data function: gist.github.com/Sentdex/454cb...
Python 3 basics: pythonprogramming.net/introdu...
Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
Mug link for fellow mug aficionados: amzn.to/3bvkZ6B
Channel membership: / @sentdex
Discord: / discord
Support the content: pythonprogramming.net/support...
Twitter: / sentdex
Instagram: / sentdex
Facebook: / pythonprogramming.net
Twitch: / sentdex
#nnfs #python #neuralnetworks

Published: 13 May 2020
Comments: 978
@tuhinmukherjee8141 · 4 years ago
Bro the effort he puts in to make us understand this stuff is highly admirable. Thanks for doing this man. Will be waiting for pt. 6
@mayurpanpaliya · 4 years ago
When pt 6 will be released ?
@trashtop1810 · 3 years ago
@@mayurpanpaliya He is waiting for you to buy the book haha
@robenromero4947 · 3 years ago
I am excited for part 6
@subhammishra5445 · 3 years ago
@@bossragegamer4081 5 months now :(
@Mohamm-ed · 3 years ago
6 months
@ConorFenlon · 3 years ago
Dude, you're a legend. Bought the ebook Pre-Order yesterday, absolutely CANNOT WAIT for full release. My favourite thing about your videos, is your enthusiasm. For example, at 8:38, "What's so cool about ReLU is it's ALMOST linear, it's sooooo close to being linear, but yet that little itty-bitty bit of that rectified clipping at 0, is exactly what makes it powerful; as powerful as a sigmoid activation function, super fast, but this is what makes it work, and it's so cool! So WHY does it work??" Dude, I've never been so PUMPED to learn from someone with such enthusiasm in my LIFE. You take all the time you need to do this man, do it your way, and take your time, and you'll change the world. Thank you so much. Much love from Ireland. edit: spellings
@josephmejia9520 · 3 years ago
Seriously! PUMPED encompasses all my feels as I follow along.
@nishantsvnit · 4 years ago
18:17 Seeing the neurons fire when activated and die when deactivated really helped to see what really goes under the hood of a neural network. Thanks for this really helpful animation and the whole nnfs initiative as a whole.
@Orchishman · 4 years ago
Can you please explain how the activation point is getting changed by changing the bias? Doesn't that violate the activation function, which says y = x only when x > 0?
@tuhinmukherjee8141 · 4 years ago
@@Orchishman The bias here is essentially setting the activation point, because y = max(0, max(0, -x + 0.5) + 0.48) will give you 0.48 for x greater than or equal to 0.5, which serves as the lower bound for the function.
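To make the comment's formula concrete, here is a tiny sketch of two chained ReLU neurons (the weights and biases are read off the formula above; everything else is my own framing, not code from the video):

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def pair(x):
    # Two chained single-input neurons implementing
    # y = max(0, max(0, -x + 0.5) + 0.48) from the comment above.
    hidden = relu(-1.0 * x + 0.5)      # activation point at x = 0.5
    return relu(1.0 * hidden + 0.48)   # second bias sets the 0.48 floor

print(pair(1.0))  # inner neuron is "dead" for x >= 0.5, so the output is 0.48
print(pair(0.0))  # inner neuron fires: 0.5 + 0.48 ≈ 0.98
```

So the first neuron's bias moves the activation point, while the second neuron's bias shifts the whole output up, exactly as the animation shows.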
@nishantsvnit · 3 years ago
@@Orchishman I created a graph so that you can play with the parameters and see for yourself how this is actually happening. I considered only the first two rows and the last row of neurons, to keep it simple (so 6 neurons in total in the hidden layer). I have numbered the neurons such that the first neuron of the first row has subscript 11, the second neuron of the first row has subscript 12, and so on (so, for example, the second neuron of the 8th row has subscript 82). Now, to simulate the movement that happens in the video, adjust the slider for the variable w_22 and watch the plot: you will see where the area of effect of the second neuron comes into play. You can also adjust the other sliders for weights and biases to see their influence on the output. Here is the link: www.desmos.com/calculator/gruatlyner When you open the plot, the values of the weights and biases are the same as seen in the video up to 17:01. I hope this helps.
@rakeshkottu · 4 years ago
its been a month, still waiting for part 6.
@yaron3479 · 3 years ago
Expect something huge
@josephastrahan6403 · 3 years ago
Waiting also :), it will be worth the wait though for his quality.
@disappointedsquid · 3 years ago
Yes, please
@Nightmare-or2yd · 3 years ago
I emailed Harrison about it, he says that he is finishing the draft (which is nearly complete) before continuing the series.
@aidankemp-harper2559 · 3 years ago
@@Nightmare-or2yd yayyyyy
@jonathanmorgan4480 · 1 year ago
This is by far the best explanation of how neural nets work that I have ever found. This should be its own standalone teaching. The sine wave example with visuals - perfect! Thanks so much.
@mariyanzarev6423 · 3 years ago
Hey sentdex, since the other parts are still in the works, I'd like to give some feedback. Thanks for doing all this; the graphics help a ton in seeing how everything works. The only suggestion is to explain why the different concepts even exist, with some real-life examples. This looks like it would be great for someone experienced who has used activation functions and everything else you discuss, and now wants to see closely how they work. For a noob like me, it is not clear why they even exist, and it feels a bit like we are just listing different concepts without a clear picture of why, and of what we are trying to achieve with this network. For example, when you were showing how well the ReLU fits the data, it's not clear whether that is actually desirable, since it seems to overfit the data.
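On the "why do activation functions even exist" question: one concrete reason is that without a non-linearity, stacking dense layers buys nothing, because a chain of linear layers collapses algebraically into a single linear layer. A quick numeric check (layer sizes picked arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two dense layers with no activation function in between.
W1, b1 = rng.normal(size=(2, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 3)), rng.normal(size=3)

X = rng.normal(size=(5, 2))            # a batch of 5 two-feature samples
two_layers = (X @ W1 + b1) @ W2 + b2

# The exact same mapping, folded into a single linear layer.
W, b = W1 @ W2, b1 @ W2 + b2
one_layer = X @ W + b

print(np.allclose(two_layers, one_layer))  # True: no extra expressive power
```

That collapse is why the video spends so long on non-linear activations: only the kink in ReLU lets depth add anything.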
@mdimransarkar1103 · 2 years ago
It is all the result of years of experiments; scientists just look for patterns.
@karthikkashyap4557 · 1 year ago
Hi, here's a link to my video where I've explained how ReLU helps in fitting lines to the data: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9t4TD5mcWSI.html
@rkidy · 4 months ago
For anyone that sees this in the future and agrees: this series generally balances practicality with understanding. I would heavily recommend also giving 3Blue1Brown's series on neural networks a look, as it focuses far more on understanding and doesn't really go into code that much.
@syedalamdar100 · 4 years ago
This is the first time I understand how to build a neural network. I love the work. My impatient side is wishing that all the videos be made available for this series but this will keep me hooked and awaiting your next post. Amazing job!
@mizupof · 3 years ago
This video just blew my mind. I still haven't bought the NNfS book yet, but that doesn't reflect how much I love to watch and re-watch your videos. This series will probably stay state-of-the-art for a long time. Thank you!
@patrickjdarrow · 4 years ago
Dude. Having been here for the last 5ish years, it's awesome to see how far your production level has come. Always good content, now shinier. Would love to see a video on how the videos themselves are made.
@crohno · 4 years ago
I just wanted to thank you for all this stuff, I am in the process of getting a PhD in neuroscience and artificial neural networks seems like a great tool to help with research. You make it really clear, and unlike other tutorials that tend to just show how to use certain libraries you really get down to how they actually works. As soon as the book is out I am getting a physical copy!!
@emado.7834 · 3 years ago
Bro, I just felt obligated to leave a comment for the perfect video you have made. This was literally the best visualization I have ever seen on youtube. This video deserves an oscar.
@NikhilSinghNeil · 4 years ago
finally a video giving a clear insight of an activation function. This is by far the best explanation of activation function I've come across. Really appreciate your work behind this series and getting into the crux of these topics.
@2010karatekid · 4 years ago
I took my first machine learning course last semester and unfortunately all of the activities we did looked like those from the CS231 class you mentioned--no explanation, just code snippets and output. They were doable but considering it was most students first foray into python, it was quite a rough time to say the least. However, I am extraordinarily pleased to have found your channel and this series in particular--your instruction has helped more in the last 5 videos than my entire semester at university. Thanks for doing what you do.
@RoughlyAverage · 4 years ago
I really struggled with the explanation on feature sets / features / samples / classes, I definitely don't think I fully get it (first time that has happened in this series so far!) The animation you mentioned would for sure help!
@nishantsvnit · 4 years ago
For the spiral dataset:
- Features are the x-coordinates (x) and y-coordinates (y) of the points. In the code, there are 300 x and 300 y values associated with the 300 points.
- Feature sets are the pairs (x, y) that fully define one point in the dataset. In the code, there are 300 feature sets.
- Classes are the labels associated with the points. In the code, there are 3 classes defined by the colors (red, blue, green), and each feature set (x, y) corresponds to one of these 3 classes (with 100 points each).
- Samples are the combination of feature sets and classes that form the dataset. For example: (x = 0.2, y = -0.5, color = red) and (x = -0.5, y = -0.2, color = blue) are samples from the dataset.
Edit: Calling the function X, y = spiral_data(100, 3) creates samples belonging to 3 classes with 100 feature sets each. X (feature sets) is an array of shape (300, 2) and y (classes) is a vector of size 300.
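If it helps to see those shapes in code, here is a minimal stand-in for the dataset generator (the real one is linked in the description under "Spiral data function"; this version only mirrors its shapes and conventions, so the spiral constants here are assumptions):

```python
import numpy as np

def spiral_data(points, classes):
    # Minimal stand-in for the gist's spiral generator:
    # `points` feature sets per class, 2 features (x, y) each.
    X = np.zeros((points * classes, 2))
    y = np.zeros(points * classes, dtype="uint8")
    for class_number in range(classes):
        ix = range(points * class_number, points * (class_number + 1))
        r = np.linspace(0.0, 1.0, points)  # radius grows along the arm
        t = np.linspace(class_number * 4, (class_number + 1) * 4, points)
        t = t + np.random.randn(points) * 0.2  # angular jitter
        X[ix] = np.c_[r * np.sin(t * 2.5), r * np.cos(t * 2.5)]
        y[ix] = class_number  # class label for this spiral arm
    return X, y

X, y = spiral_data(100, 3)
print(X.shape, y.shape)  # (300, 2) (300,)
```

So each row of X is one feature set, and the matching entry of y is its class.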
@iAmTheSquidThing · 4 years ago
Same here. That's the only thing so far in this series which confused me.
@iAmTheSquidThing · 4 years ago
@@nishantsvnit Ahh so a "feature set" is essentially "the set of features which a sample has" but unlabelled?
@nishantsvnit · 4 years ago
@@iAmTheSquidThing You are right. But it is better to not call it "unlabeled" because that is a term used for feature sets that have no labels assigned to them (which was not discussed in the video). In the example in the video, all the feature sets have corresponding labels (i.e., the 300 x,y coordinates belong to one of the 3 colors). So to rephrase your sentence, you can say that feature set and label are the two components that make up a sample. If there is no label, the sample (or feature set) is called unlabeled. For more information on these terminologies, I would encourage you to see this: developers.google.com/machine-learning/crash-course/framing/ml-terminology#examples
@userre85 · 4 years ago
@@nishantsvnit thanks
@vedangpingle1914 · 4 years ago
40 mins ! Oh boy this is gonna be good
@satwikram2479 · 4 years ago
Yes😍
@shauryapatel8372 · 4 years ago
it IS good
@muna4840 · 1 year ago
I'm going to be very good someday at building/training neural nets. It's all because of my curiosity that made me stumble on this fantastic playlist..... now I'm reading your book and practicing (coding after reading between the lines and understanding the theory) and consulting this playlist and several other resources in order to gain a deeper understanding. Thank you so much for being really amazing.
@TonyTheTrain · 3 years ago
I know I'm late to the party, but the animations are amazing. I watched the double neuron part probably 20 times with the sound off to figure out what was going on. I had a recommendation for the animation and as I was typing it, I realized that I STILL didn't fully understand what was going on. I've got it now - thank you for the animations! This would be MUCH more difficult without them. Specifically - the input of the second neuron going "backwards" was bending my brain.
@bamitsmanas · 4 years ago
It's wonderful what you're doing! I'm just loving the in-depth knowledge of this course. Although I'm in high school, I'm not finding it difficult to catch on!!👍👍
@AliAbbas367 · 3 years ago
I cannot explain how amazing your way of explaining is. I just watched all of your videos in one go and now I am waiting for another one. Thank you.
@VinayAggarwal · 4 years ago
Huge Respect!!!! You are really going far and beyond to make people understand this stuff. You are setting new milestones for educators around the world. ❣️Appreciate your efforts.
@themetalcommand · 4 years ago
Man loved the video! So helpful and easy to learn. Need pt. 6 sooner, too eager to learn about back-propagation and weight/bias adjustments!
@palashchanda9308 · 4 years ago
Waiting for P.6 eagerly..
@sandeshtulsani1517 · 3 years ago
same
@maxwellcrafter · 3 years ago
@@sandeshtulsani1517 Same
@user-yg2xq1jq7t · 5 months ago
This entire series has been amazing. I really appreciate your effort to simplify and get things to a granular level. Kudos to you.
@josephmejia9520 · 3 years ago
I love how passionate he is throughout all these videos it brings me joy while learning this subject.
@zendr0 · 2 years ago
This is the best explanation video of how activation function works in the WWW 🚀. And thank you the one who put his time and effort in creating such beautiful animations for us. Thank you very much ❤
@TheRelul · 3 years ago
Man, this is just beautiful. Thank you and Daniel and the whole team responsible for this. You are bringing beauty into the world.
@Extorc · 3 years ago
Hey, can u explain to me what an activation function is at all?
@michaeljburt · 2 years ago
Absolutely brilliant explanation as to why a non-linear activation function can lead to good mapping of desired non-linear outputs. This is actually an extremely pertinent topic in my field of study (electrical engineering, power systems, which for three phase AC circuits have non-linear power flow solutions). Seeing "how" these ReLU neurons can model non-linear functions is absolutely mind blowing. Bravo!
@ifmondayhadaface9490 · 3 years ago
You’re the first person I’ve seen who actually explains how something like ReLu is so helpful and powerful. Looking forward to part 6!
@parasjain3211 · 4 years ago
This week was the hardest of this quarantine to get through! Please don't make us wait so long 🙏🏻🙏🏻🥺🥺
@Hacker097 · 4 years ago
idk mate, probably not a good idea to put pressure on him to upload more
@kris10an64 · 4 years ago
@@Hacker097 He is showing appreciation
4 years ago
Is there any paper for this optimizer? I've never heard of one before. How does it work?
@HT79 · 4 years ago
Perhaps the author could help us out.... Hey @Daniel Optimizer Kukiela, please tell us about your optimizer!
@dhruvdwivedy4192 · 4 years ago
Hello Daniel nice to see you here😂😂❤️
@user-ns8dl3vm5z · 4 years ago
You are a legend man
@whoisabishag3433 · 4 years ago
How Does This Help Irene To SLAP ME? 👠😋😎
@josephastrahan6403 · 4 years ago
When he was saying optimizer, he was saying that the guy literally did it 'by hand'. So there is no optimizer, it was done by a human :P, if i understood correctly.
@JackSimpsonJBS · 3 years ago
This series (and the book) are incredible! Such an amazing teacher - I can't wait for part 6 :)
@Sionlockett · 4 years ago
Those 3blue1brown api animations are amazing. MASSIVE production value. That really helped me understand this video to another level, thanks.
@horticultural_industries · 3 years ago
My guy cannot decide where to put his camera
@srikarraoayilneni7074 · 4 years ago
Well, that's a long wait. Honestly I'll wait forever. 😂😂 But here it is, finally.♥️
@HomeBologn · 3 years ago
This series is the best one you've ever done, hands down. Easiest to follow, helpfully illuminated by the manim animation (manimations?). 11/10
@alicemystery5520 · 4 years ago
Xomg, that was impressive as always sentdex ! The visuals are a big help. Thanks for all of your tutorials. Naturally, I will buy the book to show my appreciation and continue on as this channel has become the edge of what you can do with python.
@PaderRiders · 4 years ago
Hey man! Thanks for your awesome videos! I'm interested in this topic and you're explaining it pretty well! I'm waiting for your next video 😉 Greetings from Germany and keep on producing 🚀
@benicamera1577 · 3 years ago
Lol, PaderRiders is into nerd stuff XD
@nivrak5411 · 4 years ago
Me: Sees new video by sentdex about neural networks
Hand: Invents FTL travel to click the video
@alrineusaldore6764 · 1 year ago
Ever since I heard of ReLU, I've always questioned why it is better than sigmoid and the others, even though it looks like 2 linear functions put together. Now I finally understand how it works and why it's so efficient! I also understand linearity and non-linearity much better than before, and my thirst for knowing why and how it all happens is satiated. Thank you for these amazing videos!
@sidchakravarty · 3 months ago
This is one of the BEST explanations of why ReLU works. I took 24 screenshots of this video alone because of the amount of detail it has. Eagerly waiting for the book to arrive today!!!
@gamalaburdene5243 · 4 years ago
I never comment on videos, but I've been following sentdex for the last couple of years and this is amazing. Please keep up the good work, thank you for teaching me so so many things.
@sentdex · 4 years ago
Thank you! Will do!
@erichartz594 · 4 years ago
I think an animation would be immensely helpful for absorbing the section about features and classes. Got lost for a while between data set and feature set and feature class
@Gingnose · 3 months ago
There are very few resources online for learning, in an intuitive way, what neurons and synapses in silico (nodes and activation functions) do. This video is already 3 years old but still holds the crown, sir.
@AlizerLeHaxor · 2 years ago
This is amazing, I'm coding along with C# and this is the first time I actually understand how Neural Networks work.
@satwikram2479 · 4 years ago
Why has NNFS stopped?
@reyboyvideogames · 4 years ago
Not stopped; sentdex is doing the draft first before uploading the new video, I think.
@shrideepgaddad8721 · 4 years ago
Hi, I noticed that you did not paste the code for generating a dataset in the description. Also thanks for the new video!
@nishantsvnit · 4 years ago
The link to the code is there in the description under "Spiral data function"
@michaelparker6868 · 3 years ago
At 29:27 if you freeze the screen you can see it. I copied the text into Jupyter notebook and it worked.
@thecathode · 3 years ago
The detailed explanation and animations of fitting the sine wave are awesome!
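For anyone curious how that fit works mechanically: each ReLU neuron contributes one "kink", so a single hidden layer with hand-picked weights can trace sin(x) as a piecewise-linear curve. A small sketch (the knot placement and neuron count are my own choices, not the video's exact weights):

```python
import numpy as np

# 16 line segments between 17 knots across one period of sin(x).
knots = np.linspace(0.0, 2.0 * np.pi, 17)
slopes = np.diff(np.sin(knots)) / np.diff(knots)
# Each ReLU neuron turns on at a knot and bends the slope by coeffs[i].
coeffs = np.diff(slopes, prepend=0.0)

def fit(x):
    # Output neuron: a bias of sin(knots[0]) plus a weighted sum of ReLUs.
    return np.sin(knots[0]) + (np.maximum(0.0, x[:, None] - knots[:-1]) @ coeffs)

xs = np.linspace(0.0, 2.0 * np.pi, 200)
print(np.max(np.abs(fit(xs) - np.sin(xs))) < 0.05)  # True: a tight fit
```

More neurons means more kinks, which is why the fit in the animation keeps improving as units are added.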
@lewisdrakeley9631 · 2 months ago
Fantastic video! I never understood the need for activation functions, now I get it completely. Incredible work thank you!
@unixtreme · 3 years ago
I want to get the book, but tbh I'm on the fence; my brain doesn't allow me to sit and go through paper. If this series resumes, I will get it, because it will be good as a complement, but not as my main means of studying in 2020.
@splch · 4 years ago
What optimizer do u use?
Noob: Adam
Intellectual: Daniel
@witek_smitek · 4 years ago
FINALLY someone explained to me, in an easy, visual way, what impact the layers have. I was always wondering why the heck we need 2 layers, or why 8 neurons in each, and why not 100? Thanks a lot! You are doing great work. I can't wait for the next part!!
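To connect that intuition back to the code built so far in the series, a forward pass through two of those hidden layers looks roughly like this (the class names follow the style of the earlier parts; the exact init constants here are my assumptions):

```python
import numpy as np

np.random.seed(0)

class LayerDense:
    # Dense layer in the style of the earlier parts of the series.
    def __init__(self, n_inputs, n_neurons):
        self.weights = 0.10 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))
    def forward(self, inputs):
        self.output = inputs @ self.weights + self.biases

class ActivationReLU:
    def forward(self, inputs):
        self.output = np.maximum(0, inputs)

X = np.random.randn(5, 2)     # stand-in batch: 5 samples, 2 features
dense1, act1 = LayerDense(2, 8), ActivationReLU()
dense2, act2 = LayerDense(8, 8), ActivationReLU()

dense1.forward(X); act1.forward(dense1.output)
dense2.forward(act1.output); act2.forward(dense2.output)
print(act2.output.shape)      # (5, 8): 8 neurons, every value clipped to >= 0
```

Swapping 8 for 100 only changes the layer widths; the video's point is that more ReLU neurons give the network more kinks to bend its output with.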
@d3c0deFPV · 4 years ago
Amazing videos, I'm learning a lot that I missed the first time around. I was running into problems with my models not working well with my data and knew it was time to get back to basics. Thanks for making these!
@Hunar1997 · 4 years ago
Next course, deep q-learning from scratch XD
@judedavis92 · 4 years ago
When is the next one coming?
@arjunp3574 · 2 years ago
Thank you so much for really making me understand the working of the activation function. After seeing this video, my motivation to learn neural networks skyrocketed. And the effort you put into this video is overwhelming and I really appreciate you from the bottom of my heart. Once again , Thank You ❤️❤️❤️❤️
@shammabeth · 3 years ago
Bought your book, but still eagerly waiting for the next video. These are very well produced deep dives that are easy to understand.
@heyrmi · 4 years ago
There has to be a place like heaven inside heaven for you.
@shauryapatel8372 · 4 years ago
aren't you the guy that was faster than light?
@kelpdock8913 · 3 years ago
@@shauryapatel8372 yeah he was
@subratkishoredutta4132 · 4 years ago
When is part 6 going to be released?
@OfficialYouTube3 · 3 years ago
He said expect it sometime between June 2022 and December 2038
@subratkishoredutta4132 · 3 years ago
@@OfficialYouTube3 that's a long wait dude🙂
@OfficialYouTube3 · 3 years ago
@@subratkishoredutta4132 Yes, but Sentdex is a busy man... writing books, running a YouTube channel, maintaining a website. Did you know he is raising three different families on two different continents? (that last one is a secret)
@subratkishoredutta4132 · 3 years ago
@@OfficialYouTube3 yes he is..
@kelpdock8913 · 3 years ago
@@OfficialYouTube3 that last one is wrong, he got a neural network to do the other two
@tasnimnishatislam7607 · 4 years ago
Hey, I am a beginner in machine learning and you genuinely have been an inspiration. Thanks for existing!
@joeyfoursheds · 3 years ago
Having watched or read dozens of explanations of activation functions, this is the first one that enabled visualisation of what's going on behind the numbers. Congratulations; it's pretty hard to come up with something unique on this topic, especially on YouTube, and this knocked it out of the park. What's being used to create the animated net diagrams?
@HT79 · 4 years ago
Finally! First view and first comment 😍
@danielwit5708 · 4 years ago
Live success! Yay!
@haztec. · 4 years ago
Neu - ral - net. It's in the brain.
@peterkanini867 · 11 months ago
Awesome explanation. This should have been one of the first things you explain. "A Neural Network Fits Complex Functions to the Data Given"
@serix_16 · 4 years ago
These videos are gold. Seriously I can't thank you enough! I really want to buy the e-book version of this amazing book and also join your channel but unfortunately, I live in a sanctioned country and every international transaction is blocked... Thank you again and keep up the good work!
@davidgomez79 · 4 years ago
At 25:10: when I code in Python and I'm under 80 characters for my line of code, I rename my variables extra long just to end up at 82 characters, to trigger the pep8 lovers. I hate pep8.
@kris10an64 · 4 years ago
Why do you hate pep8?
@davidgomez79 · 4 years ago
@@kris10an64 It has unreasonable rules that shouldn't always apply, and people are too strict with them. Raymond Hettinger himself agrees. It's supposed to make code more readable, but it's very flawed, especially since Python is already based on indentation. The 80-characters-per-line rule is the worst one: if you have nested if blocks, or if you like to work with lambda functions and iterators, lines can easily become long, and wrapping them makes the code blocky and hard to read, which is the very thing the rule was meant to avoid. In many cases following pep8 isn't the best option. Are you one of those pep8 absolutists? It also feels very restricting.
@davidgomez79 · 4 years ago
@@kris10an64 Search for a video on youtube titled "Beyond Pep8", where Raymond Hettinger talks about his dislikes of pep8 too, and how some of its aspects are silly at best.
@davidgomez79 · 4 years ago
@@kris10an64 Here is how I like to code. Maybe it comes from me preferring C++, but these 2 functions can take 2 strings with hex values and XOR/AND them together:
def stringXOR(a,b): return ('0' * len(a if a > b else b) + '%02X' % (int(a,16) ^ int(b,16)))[-len(a if a > b else b):]
def stringAND(a,b): return ('0' * len(a if a > b else b) + '%02X' % (int(a,16) & int(b,16)))[-len(a if a > b else b):]
Make that pep8 friendly and it looks like hell.
@davidgomez79 · 4 years ago
@@kris10an64 Here's another example. This is how I like to reverse a hex string:
def byteFlop(hexstr): return ''.join(reversed([hexstr[y:y+2] for y in range(0, len(hexstr), 2)]))
Show me a pep8 version that is better.
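For what it's worth, one possible PEP 8-style layout of the same two helpers follows. Behavior is preserved, except that I pad to the longer input by length rather than by string comparison, which I'd label an assumption about the original intent:

```python
def string_xor(a, b):
    # XOR two hex strings; result zero-padded/truncated to the longer input.
    width = max(len(a), len(b))
    return f"{int(a, 16) ^ int(b, 16):0{width}X}"[-width:]

def byte_flop(hexstr):
    # Reverse a hex string two digits (one byte) at a time.
    pairs = [hexstr[i:i + 2] for i in range(0, len(hexstr), 2)]
    return "".join(reversed(pairs))

print(string_xor("ff", "0f"))  # F0
print(byte_flop("deadbeef"))   # efbeadde
```

Whether this reads better than the one-liners is exactly the taste question the thread is arguing about.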
2 years ago
wow, the example of linear vs non-linear activation function is amazing! This series is pure gold
@DSinghsLAB · 3 years ago
Excellent work!! Please don't abandon so many of us; continue with part 6 and beyond... Loads of respect for the time you put into making these videos. Thank you, from the most hidden layer of my heart, for such explanations!!
@sentdex · 3 years ago
Working on the book atm. Videos after
@DSinghsLAB · 3 years ago
@@sentdex sure No Problem!! I promise I wont learn Neural Nets from any other source Till you are back in business !! Warm wishes for the book !!
@mmrsagar · 4 years ago
The best tutorial visually, verbally, programmatically and conceptually. Thanks for enlightening us, sentdex.
@jacobjohanson2503 · 2 years ago
The most important thing to do when teaching anything is to answer the questions "why?", "what?", and "how?". This was done beautifully.
@dolomikal · 4 years ago
These keep getting better and better. I'll say again, this is exactly what I was looking for to get into AI as a (currently) non-AI dev. You go deep enough to understand what's going on behind the scenes but stay at high enough of a level that it doesn't feel like an advanced math course. Truly an art. Great work!
@acidtears · 4 years ago
That's because he hasn't talked about backpropagation yet lol You should watch 1-2 videos of 3blue1brown on linear algebra just to understand how derivatives work, as that will increase your understanding immensely. Also, the math isn't that complicated as you usually just need to understand it once and then you can apply it globally to other network architectures as well, as they tend to operate on the same underlying principles.
@sonoVR · 4 years ago
This series is great, man! I'm coding along in PHP myself; I recreated the dot and ReLU functions so I too can have a deep understanding of the subject. I made a neural network with 3 of those dense layers and it flows amazingly well! I'm excited for the next lesson, and interested to see whether the ReLU activation function can solve the XOR problem like the sigmoid does. Perhaps you could explain that sometime :) Anyway, you've got a new subscriber!
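On the XOR question: yes, ReLU can handle it, and you don't even need training to see why. Two hidden ReLU units with hand-picked weights already separate the cases (the weights below are my own illustrative choice, not anything from the video):

```python
def relu(x):
    # Rectified linear unit: max(0, x)
    return max(0.0, x)

def xor_net(x1, x2):
    # Hidden unit 1 fires when at least one input is on; hidden unit 2
    # only when both are on. Subtracting the overlap twice leaves XOR.
    h1 = relu(x1 + x2)
    h2 = relu(x1 + x2 - 1.0)
    return h1 - 2.0 * h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0.0, 1.0, 1.0, 0.0
```

The kink at zero is what makes this possible; with a purely linear activation no choice of weights can produce XOR.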
@yogiisdaman · 1 year ago
you have remarkable skill in explaining things concisely yet understandably, thank you for your videos!
@shanka8518 · 4 years ago
Very good content. Really shows the intuition in how a neural network works. Hopefully pt. 6 comes out soon
@masterfloort · 4 years ago
The animations are so smooth I could just sit there watching a graph all day!
@user-ns8dl3vm5z · 4 years ago
It's like my favorite TV series uploaded a new episode; this is gonna be wild as always. Thank u & Daniel :)
@HarshithMK · 4 years ago
Loving the series so far! All I do is wait for the next video to come out...
@oubaidachouchane8654 · 3 years ago
Really, thank youuuu for this amazing series of videos! I really enjoy the way you explain complex concepts, and the visual demonstrations are extremely helpful! Many thanks, keep it up!
@tanishqvyas8387 · 4 years ago
Really looking forward to the next video. Please keep making videos for this series, covering each and every topic in the field of neural networks. I wish there were a certification course from sentdex's side which we could take, learn from, write an exam for, and get certified.
@user-pl2cx2vm4l · 11 months ago
Man, this is the best neural network tutorial that I've ever seen. Thank you and keep going!
@luciorossi75 · 3 years ago
The sine wave fitting is so informative! Kudos on the animations
@BB-sd6sm · 4 years ago
This is the best video explanation I have ever seen on activation functions. Bravo
@AnilDhulappanavar · 1 year ago
Hats off, Sentdex!!! It has been very helpful for me in learning a little bit of neural networks. Really appreciate all the effort that went into this.
@allanmcelroy · 3 years ago
Great series of videos, with clear, thorough explanations. Can't wait for future parts (so much so that I've ordered the e-book :) )
@hadiqakhan3288 · 3 years ago
I just switched my field to computer science, and now I have to make a project on neural networks involving space data, like creating a model which predicts space parameters. So far I have been following your tutorials; now I'm stuck on what to do next, and I'm running out of time! But you made me come this far with such proper understanding, so much love and respect your way.
@kyand920 · 3 years ago
The first time I watched the video, I didn't really understand it 100%. Then I read the book and understood a LOT more; I re-watched the video and I realize how awesome neural networks are!! Thank you Sentdex!!
@yarutgruter4925 · 4 years ago
Please tell me this series didn't die out; I'm loving this so much!
@elmitte · 4 years ago
Thumbs up for the content, and also for the stylistic/aesthetic change! At first I thought you were left-handed!!! Then I realized that you've flipped your video horizontally to be more engaging with the viewers. Well done!
@shobhitbishop · 4 years ago
Hi Harrison, just wanted to thank you for this awesome series on NN; it really helps me a lot in understanding things clearly from scratch. You have built confidence in me that yes, I can also learn this complex topic! Thank you 😊
@sentdex · 4 years ago
You're very welcome!
@sharkk2979 · 2 years ago
U r a saviour of humanity; ppl will withstand Skynet, bcz u showed us how Skynet gets trained. THANKS FOR SAVING MY FUTURE GENERATIONS!!
@heavymetalqueenxtc · 4 years ago
That was great. Especially the explanation for why you need a non linear activation function.
@saturnine. · 2 years ago
Alright I'm glad I decided to actually watch this far, because I never quite understood the point of activation functions before now. That was a really nice explanation.
@sidharthgiri1610 · 4 years ago
Okay, period. The animations and the explanation are MINDBLOWING.
@1OJosh · 4 years ago
This is amazing, I love your channel. I watch this everyday, I'm going to show my dad this whole playlist. He's going to love it and then we're going to have some fun trying to do something similar but more simple I guess xP
@khaben6986 · 2 years ago
I'm literally over the moon 😍 As a beginner who didn't find many good tutorials, cuz they involve stuff I don't know well, such as statistics, working with tensorflow... etc., I was desperate to find the best videos for a start... BUT MIRACULOUSLY I FOUND YOURS 👌👏👏 THE BEST BEGINNER GUIDE SOOOOO FAR ❤
@KylePapili · 4 years ago
Incredible explanation of a high level topic. Should be used in universities / institutions across the globe
@divyanshusahu6413 · 4 years ago
This is the most valuable channel to me and hopefully many others on youtube. PLEASE UPLOAD THE NEXT VIDEO SOON
@AlfrihPetruFeras · 3 years ago
I'm desperately waiting for the next episode... please don't make us wait longer! You're doing a great job and your effort is really appreciated.
@KonstantinPrydnikov1 · 4 years ago
It's an absolutely amazing graphical illustration of the subject; you are making a revolution in video teaching, and that approach makes you like a god of the tech teaching environment :)
@ianik · 1 year ago
Quality content for free. Second time going through the series. You are a good man
@pnptea173 · 3 years ago
Part 6 can't come quickly enough; loving this series.