Learn how to code a simple Convolutional Neural Network with this fully annotated Jupyter Notebook: lightning.ai/lightning-ai/studios/build-train-and-use-a-convolutional-neural-network The full Neural Networks playlist, from the basics to deep learning, is here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-CqOfi41LfDw.html Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
Hey Josh, is it possible to make videos about R-CNN, Fast R-CNN, Faster R-CNN & YOLO? I watched some videos and read some papers, but they didn't clearly explain the math (I only understand the basic concepts). Especially how selective search is calculated, and how to train (when one image contains many classes, and when we have many, many images).
As a Cambridge-qualified PhD mathematician, I cannot begin to describe how fantastic your series is. The way you simplify the concepts, yet keep true to the underlying mathematics, is quite amazing. Not to mention the great animations, dynamic graphs and equations, etc. Well done Josh, for making principled data science accessible to the general audience.
I can't imagine how much time and effort you put into: 1. Creating the content and simplifying it for us 2. Creating the animated PPTs 3. Explaining every step with great detail and simplicity. I just wanna give a huge hug to you sir! You are an asset. ❤❤
Amazing work. I've started learning DS and I can't imagine how I could handle all of this information without your videos. Big thanks for everything you've done, do and will do.
This is the best explanation of CNNs I have ever come across. I am very sure it is the best I will ever see. I cannot thank you enough. I have had explanations from instructors who are PhDs, MTechs and what not!! Even they could not explain why filters are able to extract features and why we use global pooling. The answer I got was to reduce the number of input nodes to the NN (which is partly true), but the way you explained the importance of pooling amazed me and made me equally happy. Thank you Josh sir. I think you should be knighted for your efforts 😃👏🏻👏🏻👏🏻👏🏻👏🏻
I can't stop thanking you for your content! I am a master's in data science student, and usually before engaging with the commonly unfathomable statistical learning books I come to your channel to grasp the topics.
I don't know how much time Artificial Neural Networks take to train and learn the input data, but you put in even more effort and time to train us. Thanks for your efforts, sir. Your videos really explain things very well and help us visualize them easily.
I've never seen such a simple yet very good explanation of a CNN. Thanks a lot! As a non-native English speaker I really love the simplicity and the written text in your videos.
I literally binged your neural network videos in a day like a Netflix show, and now I realize that I am at the end of the series to date and need to wait for a new episode!
I was really confused about this concept before I came across this video; now I feel I understand it way better. You really helped a lot! Thank you so much!
Came for this video, ended up watching half of the series. I just learned this last week in deep learning and wanted to review everything neatly, thank you very much!
This series is the best thing that happened to me before my Deep Learning exam lol. Everything is explained in such a simple and fun manner that it made me actually enjoy learning these concepts, and it makes me want to learn even more about the subject.
I am speechless at your work and how you achieved your teaching intention, at least in my case. I would say that this explanation is PERFECT. I haven't watched a lot of friendly explanatory YouTube videos on CNNs, but surely this one is perfect; I don't need to see another one 🎉
Hi Josh, you know you are awesome. You know you and I are both in this domain, and I have also started learning to play guitar. I hope this channel will help me in my journey.
I love your videos; I have binge-watched your entire machine learning series. One suggestion I might add is the following: it can be confusing to use 1s to represent black pixels and 0s to represent white pixels, because in computer vision a black pixel has a value of 0 and a white pixel has a value of 255. So when normalized, Black = 0 and White = 1. Thank you so much for these videos btw, I love them.
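The pixel convention described in the comment above can be shown in a few lines; the array values here are just illustrative, not from the video.

```python
import numpy as np

# Standard 8-bit grayscale convention: 0 = black, 255 = white
pixels = np.array([0, 128, 255], dtype=np.uint8)  # black, mid-gray, white

# Dividing by 255 maps the range onto [0, 1]: black -> 0.0, white -> 1.0,
# which is the normalized form typically fed into a neural network
normalized = pixels / 255.0
```

Note that the division promotes the `uint8` array to floats, so no explicit cast is needed.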
Thank you Josh!!! You truly are the best at explaining these concepts. I would love to see future videos on how to train the kernels, and more on image recognition/computer vision (clearly explained of course). I also got your book and it's really nice, maybe there can be a part 2 in the future 👀
All of your videos are amazing. You have a real talent for explaining complicated things in a simple way. I am looking forward to seeing embedding, attention and transformer videos from your point of view.
This is my favorite video so far! I'm not so familiar with math, but I want to learn all this stuff because I love science and I need this background, and your videos have made my journey not just easier but possible! Thank you so much!
I've watched all your videos on NNs. It would be really great if you could make a video on backpropagation for CNNs. It would complement the series. Your hard work is highly appreciated.
I've been intimidated by the name CNN for so long, only to find out it's simple after all. Thanks for simplifying things in this lecture so that I could understand them. I can't imagine the work behind it. Could you please add more neural networks, such as RNNs and LSTMs, to this series?
The best CNN explanation I've ever seen. However, I have one question about the classification of 0 or 1. Since this is a classification problem, why is there no sigmoid or softmax function in the last layer? Are we just using the raw output to make predictions?
Typically you would probably want to use softmax paired with the cross entropy loss function for this sort of problem. However, to keep the network as simple as possible (i.e. in order to fit it on the screen) and because the math still worked out, I just used the sum of the squared residuals of the raw output to train this CNN. I was surprised that it worked, but it did! BAM!
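The two options in that reply can be sketched side by side; the raw scores and one-hot label below are made-up values, and this is only an illustration of the two loss functions, not the exact network from the video.

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability before exponentiating
    e = np.exp(z - z.max())
    return e / e.sum()

raw_output = np.array([2.0, -1.0])  # hypothetical raw scores from the final layer
target = np.array([1.0, 0.0])       # one-hot label: this example belongs to class 0

# Option 1 (typical): softmax turns the raw scores into probabilities,
# then cross-entropy penalizes low probability on the true class
probs = softmax(raw_output)
cross_entropy = -np.sum(target * np.log(probs))

# Option 2 (used in the video to keep the network small): compare the
# raw output directly to the label with the sum of squared residuals
ssr = np.sum((target - raw_output) ** 2)  # (1 - 2)^2 + (0 - (-1))^2 = 2.0

# Either loss can drive training here; with or without softmax, the
# predicted class is simply the largest output
prediction = np.argmax(raw_output)
```

Because softmax is monotonic, applying it doesn't change which class has the largest score, which is part of why training on the raw output still worked.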
I really enjoyed it; although I'm somewhat familiar with CNNs, the part where pooling basically rewards the match between the filter & the image clicked :] best,
Take my word, this is the very best explanation of the mechanism of neural networks I have ever seen. You turned everything we have to imagine into visualizations. Please see if you can help us understand LSTMs.
Josh, I want your NLP videos. Your videos are soooo good that I can't express it. But also I don't want others to get such valuable resources for free. Hehe!
Hey Josh, I watched your entire playlist on neural nets and appreciate all of the work you have done. Just a suggestion: you missed fairly important concepts like strides and padding. Looking forward to more videos 👍🏻