
How to remember (instead of catastrophically forget) 

Thinkstr · 1.5K subscribers
1.1K views

Published: 5 Oct 2024

Comments: 16
@ritvikmath · 3 years ago
Love this series on catastrophic forgetting, I'd never thought about it before!
@Thinkstr · 3 years ago
I hadn't heard of it until a few weeks ago!
@microwavecoffee · 3 months ago
Stupid anime reference from me, but in the anime Psycho-Pass they averaged out the brains of the most societally outcast people to create a superintelligence to manage society. Kinda feels similar to remembering the most memorable examples 😂 Great video 👍
@Thinkstr · 3 months ago
I watched an episode or two of Psycho-Pass but didn't get that far! I gotta keep going... Pop-culture references aren't stupid; they can be great ways to communicate! One of my favorite animes is Gurren Lagann — folks have serious spiritual relationships with that one, haha
@microwavecoffee · 3 months ago
@Thinkstr That's a classic for sure!
@Omnicypher001 · 6 months ago
So ideally you have a dataset of not just poor handwriting, but examples where 2s look like 1s and 1s look like 2s, so the model gets good at solving the edge cases. You want there to be a fine line between concepts, and weird examples allow it to classify clearly. Labeling ambiguous examples is worth more than labeling obvious examples, because they better define the boundary of the vector space holding these answers. Extremely ambiguous examples are, by definition, at the boundary of their classification, and if they cancel each other out, you don't really need normal examples, because those are just the average of the extreme examples. Weird examples don't just make the model better at remembering; they make it better at understanding the actual boundaries of the concepts it's classifying.
@Thinkstr · 6 months ago
I think that's a good way to put it. If you remember the really unusual cases, maybe they're describing the most notable features.
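The idea discussed here — keeping the hardest, most ambiguous examples around so the old decision boundary survives training on new data — can be sketched as a tiny replay-buffer selection rule. This is a minimal illustration in plain numpy, not the video's exact method; the scoring rule (per-example loss) and buffer size are assumptions for the sketch:

```python
import numpy as np

def select_replay_examples(features, labels, losses, k):
    """Keep the k examples the model found hardest (highest loss).

    High-loss examples tend to sit near class boundaries, so replaying
    them alongside a new task helps preserve the old boundary with
    relatively few stored examples.
    """
    hardest = np.argsort(losses)[-k:]  # indices of the k largest losses
    return features[hardest], labels[hardest]

# Toy usage: 6 examples, keep the 2 most ambiguous ones.
X = np.arange(12, dtype=float).reshape(6, 2)
y = np.array([0, 0, 1, 1, 2, 2])
per_example_loss = np.array([0.1, 0.9, 0.2, 1.5, 0.05, 0.3])
Xr, yr = select_replay_examples(X, y, per_example_loss, k=2)
```

Here the two retained examples are the ones with losses 0.9 and 1.5; the easy, near-average examples are dropped, matching the intuition that normal examples are roughly "the average of the extreme examples."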
@abdulrazique4779 · 12 days ago
Is it not possible to increase the number of classes dynamically? Like, at the start we only know there are 5 classes (0-4), but we don't know how many more will come, so we build the model with 5 outputs. Then 3 more classes arrive and we add 3 more neurons to the last layer (say 5, 8, and 9); at this stage we still don't know how many more will come. Finally 2 more classes arrive, so we add 2 new output neurons for 6 and 7.
@Thinkstr · 12 days ago
Maybe that would work, but if we stop training with examples of the earlier classes, I think forgetting could still be a problem.
@AbdulRazique-z2j · 12 days ago
@Thinkstr Yes, but what if we update the Fisher matrix before every new task? Will that work? I mean, is it possible to add new weights every time we add new classes to the model?
@Thinkstr · 11 days ago
@AbdulRazique-z2j Huh, I really don't know... You're reminding me of GANs that generate progressively larger images, first learning small images and then expanding them.
@AbdulRazique-z2j · 11 days ago
@Thinkstr No, not GANs — just a neural network in which we can increase the number of output neurons in the last layer when a new class is introduced.
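Growing the output layer the way this thread describes is mechanically simple: append new rows to the final weight matrix while keeping the old rows untouched, so predictions for the old classes are unchanged immediately after expansion. A minimal numpy sketch (the shapes and small-random initialization are illustrative assumptions; as noted above, expansion alone does not prevent forgetting unless something like a Fisher/EWC penalty or replay protects the old weights during further training):

```python
import numpy as np

def expand_output_layer(W, b, n_new, rng=None):
    """Add n_new output neurons to a final linear layer.

    W has shape (n_classes, n_features); b has shape (n_classes,).
    Old rows are kept verbatim, so logits for the old classes are
    identical right after expansion.
    """
    rng = rng or np.random.default_rng(0)
    n_features = W.shape[1]
    W_new = rng.normal(scale=0.01, size=(n_new, n_features))  # small init for new classes
    return np.vstack([W, W_new]), np.concatenate([b, np.zeros(n_new)])

# Start with 5 outputs (digits 0-4), then add 3 more classes.
W = np.zeros((5, 10))
b = np.zeros(5)
W2, b2 = expand_output_layer(W, b, n_new=3)
```

The same trick works in any framework by copying the old layer's parameters into a larger layer; the open question raised in the thread — whether re-estimating the Fisher matrix before each task keeps the old rows stable — is about how those copied weights are then regularized, not about the expansion itself.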
@JanosLagos · 9 months ago
Where can I see the code you made?
@Thinkstr · 9 months ago
I'm afraid I can't find the code for this video, but the next video did the same thing for reinforcement learning, and I've got that code here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-fxyttf6T5cA.html github.com/TedTinker/Tinker_FROMP_RL
@vi5hnupradeep · 2 years ago
Thank you so much, man! Good work 💯
@Thinkstr · 2 years ago
Thanks for watching! I gotta make more of these.