
Categorical Cross-Entropy Loss Softmax

Matt Yedlin
2.1K subscribers · 17K views

This video covers the categorical cross-entropy loss with softmax.
License: CC BY-NC-SA (Attribution-NonCommercial-ShareAlike)
Authors: Matthew Yedlin, Mohammad Jafari
Department of Electrical and Computer Engineering, University of British Columbia.

Science

Published: 5 Feb 2020

Comments: 24
@rickragv · 3 years ago
This is a hidden gem! Thank you!
@abiolaadeye2961 · 3 years ago
Excellent video series. I love the question-and-answer format! Thanks!
@veganath · 1 year ago
Wow! Hats off to you guys, perfect in demystifying categorical cross-entropy... thank you!
3 years ago
Thank you guys! Such a great explanation!
@bengonoobiang6633 · 2 years ago
Good video. The course format makes it look so easy to understand.
@keiran110 · 3 years ago
Great video, thank you.
@himanshuvajaria6426 · 3 years ago
Thanks guys!
@cedricmanouan2333 · 3 years ago
Great!
@igomaur4175 · 2 years ago
Wow!
@g.jignacio · 4 years ago
Very good explanation! It's been so hard to find a numerical example. Thank you guys!
@keshavkumar7769 · 3 years ago
Wonderful
@rizalalfarizi9196 · 4 years ago
Thank you very much for the clear explanation, love it sir.
@hassanmahmood7284 · 3 years ago
Awesome
@trajanobertrandlleraromero6579 · 5 months ago
I came looking for copper and found gold!!!!
@RitikDua · 4 years ago
Very good explanation
@MultiTsunamiX · 4 years ago
At 6:44 there is a mistake in the equation: .715 should be in the last log parenthesis instead of .357.
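[Editor's note] For readers checking that timestamp, here is a minimal sketch of the categorical cross-entropy computation under discussion. It assumes a one-hot target and reuses the softmax row quoted later in this thread (.088, .118, .715, .079) as an illustrative example (values taken from the comments, not re-verified against the video). With base-2 logs, only the true-class probability (.715 here) contributes to the loss, which is the point of the correction above.

```python
import numpy as np

# Categorical cross-entropy with base-2 logs:
#   L = -sum_i y_i * log2(y_hat_i)
# With a one-hot target y, only the true-class term survives.

y_true = np.array([0.0, 0.0, 1.0, 0.0])          # assumed one-hot target
y_hat = np.array([0.088, 0.118, 0.715, 0.079])   # softmax output quoted in the comments

loss = -np.sum(y_true * np.log2(y_hat))
print(round(loss, 3))  # 0.484, i.e. -log2(0.715), not -log2(0.357)
```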
@KuldeepSingh-cm3oe · 3 years ago
Brilliant
@wesleymelencion3618 · 3 years ago
Why were you using logarithm base 2?
@raymondchang9481 · 10 months ago
How much is an intercontinental ballistic missile?
@shivamgupta187 · 3 years ago
If I am not wrong, you used the softmax function to normalize, i.e. to make the probabilities sum to 1, but in your examples: .147 + .540 + .133 + .180 = 1; .160 + .323 + .357 + .160 = 1; .188 + .118 + .715 + .079 = 1.1. Can you please help me understand this discrepancy?
@horvathbenedek3596 · 3 years ago
You can see that they messed up and wrote .188 instead of .088 when transferring from the softmax output to the y-hat vector. I guess they added y-hat manually, resulting in the mistake.
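[Editor's note] A quick numerical check of the thread above; this is a sketch using the three rows quoted in the question plus the .088 correction suggested in the reply (assumed values, not re-checked against the video), and it also shows that a softmax output sums to 1 by construction.

```python
import numpy as np

rows = np.array([
    [0.147, 0.540, 0.133, 0.180],  # sums to 1.000
    [0.160, 0.323, 0.357, 0.160],  # sums to 1.000
    [0.188, 0.118, 0.715, 0.079],  # sums to 1.100 -- the transcription slip
    [0.088, 0.118, 0.715, 0.079],  # with the suggested .088 it sums to 1.000
])
print(rows.sum(axis=1))  # approximately [1.0, 1.0, 1.1, 1.0]

# Softmax itself always normalizes to 1 by construction:
def softmax(z):
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

print(softmax(np.array([2.0, 1.0, 0.1])).sum())  # 1.0
```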
@BrandonSLockey · 3 years ago
Batman and Robin
@lucyfrye6723 · 1 year ago
It's a good video, but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier, to spell it all out. Professor Strang's course is probably still on YouTube if you are interested. You will gain back that week by being twice as productive in the week after, not to mention the rest of your life.
@mattyedlin7292 · 1 year ago
Hello Lucy, thank you for your input! I am always interested in comments that improve the videos. Would you suggest any additional material to address the summation issue? I learned it in high school as a prelim to proof by induction, a long time ago.