
[DL] Categorial cross-entropy loss (softmax loss) for multi-class classification 

Badri Adhikari
4.7K subscribers
11K views

This video is about [DL] Categorial cross-entropy loss (softmax loss) for multi-class classification

Science

Published: Aug 2, 2024

Comments: 11
@djeros666 · 2 years ago
This was unbelievably helpful!! thank you!! very clear explanation!!
@robertthapa5038 · 3 years ago
Nice explanation. A few words to highlight some key points regarding the topic:
- We take the exponential when calculating the probabilities f(s)_i because any raw score returned by the network could be negative, and the exponential guarantees a positive value. Dividing by the sum of exponentials makes the probabilities sum to 1 (normalization).
- The logarithm is used to calculate cross-entropy from the given probabilities because products are expensive and the log converts a product into a sum. The negative sign before the sum is needed because the log of values between 0 and 1 is negative.
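To make these two points concrete, here is a minimal sketch in NumPy (the function names are illustrative, not from the video): softmax turns raw scores into positive, normalized probabilities, and categorical cross-entropy compares them against a one-hot target.

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability; exponentials make
    # every value positive, and dividing by their sum normalizes to 1.
    exp_s = np.exp(scores - np.max(scores))
    return exp_s / exp_s.sum()

def categorical_cross_entropy(one_hot_target, probs):
    # Only the term for the true class survives the multiplication by the
    # one-hot target; the leading minus flips the negative log into a
    # positive loss.
    eps = 1e-12  # avoid log(0)
    return -np.sum(one_hot_target * np.log(probs + eps))

scores = np.array([2.0, 1.0, 0.1])   # raw network outputs (logits)
target = np.array([1.0, 0.0, 0.0])   # one-hot label for class 0

probs = softmax(scores)
print(probs)                                      # approx. [0.659 0.242 0.099]
print(categorical_cross_entropy(target, probs))   # -log(0.659) ≈ 0.417
```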
@100deep1001 · 2 years ago
Spot on!
@100deep1001 · 2 years ago
The log also acts as a strong penalizer: when the predicted probability of the true class is small, the loss -log(p) becomes very large. That's the main intuition behind using the log.
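A quick numeric illustration of that penalizing effect (a hypothetical snippet, not from the video): -log(p) stays small while the true-class probability p is near 1, but grows sharply as p approaches 0.

```python
import numpy as np

# -log(p) for increasingly wrong predictions of the true class
for p in [0.9, 0.5, 0.1, 0.01]:
    print(p, -np.log(p))
# 0.9  -> 0.105
# 0.5  -> 0.693
# 0.1  -> 2.303
# 0.01 -> 4.605
```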
@nehakhawer7895 · 2 years ago
Excellent and concise mathematical explanation backed up by a numerical example, thank you
@_skeptik · 2 years ago
Thank you so much for this good explanation
@Dz-yz8rt · 3 years ago
Thank you, this helps me a lot!
@mekdesmekonnen2242 · 2 years ago
well explained!
@dhoomketu731 · 3 years ago
well explained.
@fujiwatiirawanayusaputri6954 · 2 years ago
The video is great, I like it a lot
@subhodeepmondal7937 · 9 months ago
I have only one question: if the network wrongly classifies a (0, 1, 0) one-hot target as (0, 0.56, 0.44), how does the cross-entropy loss penalize the 0 that is predicted as 0.44? Is it neglected?
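For reference, here is what the formula does with exactly these numbers (a small NumPy sketch, not from the video): the zero entries of the one-hot target cancel the other terms, so only -log(0.56) contributes directly; the 0.44 is penalized indirectly, because any probability mass assigned to a wrong class necessarily lowers the probability of the true class.

```python
import numpy as np

target = np.array([0.0, 1.0, 0.0])    # one-hot label from the question
probs  = np.array([0.0, 0.56, 0.44])  # predicted probabilities

eps = 1e-12  # guard against log(0) for the zero entry
loss = -np.sum(target * np.log(probs + eps))
print(loss)  # -log(0.56) ≈ 0.580 — only the true-class term is non-zero
```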