
Neural Networks [Machine Learning] #4: Python Implementation 

Hello World HD
6K subscribers
3.8K views

Published: 29 Oct 2024

Comments: 16
@rolandray3106 6 years ago
Great video. A question, though: how do we avoid overfitting?
@hwhd 6 years ago
Great question! That is something I should have addressed in the video. Before you do anything, you must first know whether or not you are overfitting. You can check this by comparing the cost on the training data with the cost on the validation data. If they are very similar, leave your neural network the way it is. If the cost on the training data is significantly better than on the validation data, you are overfitting, and you can do several things to change this.

The easiest approach would be to use an extremely large training set. This gives an accurate representation of the overall population (all the data), so you can find a good relationship between inputs and outputs. You could also slightly augment the data in order to, essentially, get more of it. For example, if you were trying to identify which number a handwritten greyscale (black-and-white) digit represented, you could rotate or scale the digits by a small random amount and add the result as additional data points. However, if you do this too much, the neural network may decide that the primary relationship between the pixels that compose the digit is rotation or scaling.

Still, we can do more than that. You can also do something called "dropout": you randomly don't use certain neurons and their synapses; they are "dropped out". Here is a good illustration of it on a neural network model: cdn-images-1.medium.com/max/1800/1*yIGb-kfxCAK0xiXipo6utA.png Dropout serves the purpose of making sure that very complicated co-adaptations don't develop. Co-adaptations occur when many neurons are used to find a single relationship; oftentimes that relationship is an overly complex one that is only present in the training data.

A simpler way of achieving the same thing would be to reduce the number of parameters (number of neurons, hidden layers, etc.). However, that doesn't allow for a large neural network, which, despite its susceptibility to unneeded co-adaptations, offers practical benefits for certain datasets.
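[Editor's note: a minimal NumPy sketch of the dropout idea described in the reply above. The layer sizes, the sigmoid helper, and the dropout_rate value are assumptions for illustration, not code from the video.]

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_hidden(X, W, dropout_rate=0.2, training=True):
    """Forward pass through one hidden layer with (inverted) dropout."""
    a = sigmoid(X @ W)  # hidden activations
    if training and dropout_rate > 0:
        # Randomly "drop out" neurons: zero them and rescale the rest
        # so the expected activation stays the same at test time.
        mask = (np.random.rand(*a.shape) > dropout_rate) / (1.0 - dropout_rate)
        a = a * mask
    return a

# Tiny illustrative example: 4 samples, 3 inputs, 5 hidden neurons.
X = np.random.rand(4, 3)
W = np.random.randn(3, 5)
print(forward_hidden(X, W, dropout_rate=0.5, training=True))
```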
@rolandray3106 6 years ago
This info is very useful. Thanks.
@fatemeh2560 5 years ago
@@hwhd Great video, thanks a lot. Is there a way to use the early-stopping technique? I mean keeping a subsample of the data as a validation set, training on the rest of the data while checking whether the error over the validation set starts increasing at some point, and then stopping the training at that point. Would the code be very complicated if we do this? I personally have no idea how to incorporate early stopping in Python. Thanks
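[Editor's note: not from the video, and not a reply in the original thread. Below is a minimal self-contained sketch of the early-stopping idea described in the comment above, run on toy data. The one-layer sigmoid network, the patience counter, and all the numbers are assumptions for illustration.]

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, W):
    """Mean squared error of a one-layer sigmoid network."""
    return float(np.mean((sigmoid(X @ W) - y) ** 2))

# Toy data, split into a training set and a held-out validation set.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = (X.sum(axis=1, keepdims=True) > 1.5).astype(float)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

W = rng.standard_normal((3, 1))
best_W, best_val = W.copy(), np.inf
patience, bad_epochs, lr = 10, 0, 0.5

for epoch in range(1000):
    # One plain gradient-descent step on the training data.
    a = sigmoid(X_train @ W)
    grad = X_train.T @ ((a - y_train) * a * (1 - a)) / len(X_train)
    W -= lr * grad

    val = cost(X_val, y_val, W)
    if val < best_val:            # validation error still improving
        best_val, best_W, bad_epochs = val, W.copy(), 0
    else:                         # validation error went up: count strikes
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"Stopped early at epoch {epoch}, val cost {best_val:.4f}")
            break

W = best_W  # keep the weights from the best validation epoch
```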
@davidludvigsen4667 4 years ago
Your videos are AWESOME. I have not been able to find anyone anywhere that both explains the mathematics behind NNs and shows, from the bottom up, how to build a real NN using Python. Hats off man! I had a bit of a rough time seeing how you turned the maths into that chunk of code you used. In my opinion that could have been explained a bit more, but that might just be because I don't know the NumPy lib well enough to understand which functions you are using xD Thanks again for the great video!
@hwhd 4 years ago
Thanks!
@JamesSmith-qq9dd 6 years ago
Great job! Very informative and helpful.
@hwhd 6 years ago
Thanks!
@FeKelvin 5 years ago
Very well explained! Congratulations boy!
@hwhd 5 years ago
Thanks!
@johnjohn8273 6 years ago
Great video! Thanks for making this!
@marktwain7091 6 years ago
Great video! Just enough detail for me to get the gist.
@SKIllEXable 6 years ago
love you my dude ur the best
@conradclasen7901 5 years ago
Thanks for making this. Super helpful
@charliemccoy7731 6 years ago
Good job