
CS 152 NN-8: Multi-label classification 

Neil Rhodes
3.6K subscribers
6K views

Day 8 of Harvey Mudd College Neural Networks class

Published: Feb 10, 2021

Comments: 5
@736939
@736939 2 years ago
Interesting: why do we use binary cross-entropy rather than plain cross-entropy for multi-label classification?
@NeilRhodesHMC
@NeilRhodesHMC 2 years ago
Because it's as if we have k separate binary classification problems. So, for each, we use binary cross-entropy.
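To make that concrete, here is a minimal sketch in PyTorch (an assumption; the thread doesn't say which framework the class uses) contrasting the two losses: `BCEWithLogitsLoss` treats each of the k labels as its own independent yes/no decision, while `CrossEntropyLoss` assumes exactly one class per example. The shapes and target values are made up for illustration.

```python
import torch
import torch.nn as nn

k = 4                                   # number of labels (arbitrary for this sketch)
logits = torch.randn(2, k)              # raw model outputs for a batch of 2 images

# Multi-label: several labels can be 1 at once, so each logit gets its own
# sigmoid and its own binary cross-entropy term.
targets = torch.tensor([[1., 0., 1., 0.],
                        [0., 1., 1., 1.]])
bce = nn.BCEWithLogitsLoss()            # sigmoid + binary cross-entropy, per label
loss_multi = bce(logits, targets)

# Contrast with single-label (multi-category) classification, where softmax
# cross-entropy forces the k outputs to compete for a single class:
ce = nn.CrossEntropyLoss()
single_label_targets = torch.tensor([2, 1])   # one class index per example
loss_single = ce(logits, single_label_targets)
```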
@gregthemagican
@gregthemagican 1 year ago
Great tutorial! A lot of useful information! However, when applying some pretrained models to my problem I'm left with a very strange situation. After running a prediction on test data, the model returns probabilities per label for the test images, but those probabilities are all very similar to each other. For example, I get a minimum probability of 0.24992 and a maximum of 0.25801 for label X, a minimum of 0.396754 and a maximum of 0.399788 for label Y, etc. So basically the result is almost the same for every image. After training for more epochs I even end up with identical probabilities for every image for each class label, and I don't know what to do now, as I can't find any answer as to why this happens. I tried manipulating parameters like the learning rate and adding dropout, but it didn't help. Could you, as an expert, tell me what might be the cause?
@NeilRhodesHMC
@NeilRhodesHMC 1 year ago
Try training with a very small number of images, and verify that you can get the training loss low and the training accuracy high on that small set. This should work because the NN will overfit on those images (think of it as memorizing them). Only if that works should you try a larger number of images.
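For what it's worth, a minimal sketch of that sanity check in PyTorch might look like the following. `model`, `full_dataset`, and `criterion` are hypothetical placeholders for whatever is already in the pipeline (e.g. `criterion = nn.BCEWithLogitsLoss()` for the multi-label case), and the subset size, epoch count, and learning rate are arbitrary.

```python
import torch
from torch.utils.data import DataLoader, Subset

# Sanity check: train on a handful of images and confirm the network can
# drive the training loss near zero by memorizing them.
tiny = Subset(full_dataset, range(8))   # restrict the dataset itself, not the batch size
loader = DataLoader(tiny, batch_size=8, shuffle=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):                # many passes over the same 8 images
    for images, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
    # Expect the loss to approach 0. If it plateaus instead, the problem is
    # likely not the amount of data but something upstream: the labels, the
    # loss wiring, or an architecture whose output has collapsed to a constant.
```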
@gregthemagican
@gregthemagican 1 year ago
@@NeilRhodesHMC Thanks a lot for the quick answer! By a very small number of images, do you mean restricting the number of images per training epoch, or a smaller batch size? As for the loss, it is also pretty stable, hardly moving per epoch, sitting at 0.7 ± 0.1. It's very strange; I've tested multiple architectures, but after x epochs I still get the same results :(