
MoCo (+ v2): Unsupervised learning in computer vision 

Soroush Mehraban
3K subscribers
2.2K views

In this video, I present a detailed explanation of the MoCo (Momentum Contrast) framework, covering both MoCo and MoCo v2. This approach is a method for pretraining computer vision models in an unsupervised manner: by leveraging large collections of unlabeled images, it improves the generalizability of the resulting models.
Paper link: arxiv.org/abs/1911.05722
Table of Contents:
00:00 Introduction
00:27 Unsupervised Learning in NLP
01:28 Analogy in Computer Vision
02:51 Contrastive Learning
09:35 Larger dictionary
10:44 MoCo
14:31 PyTorch pseudocode of MoCo
21:17 Additional notes
24:57 Results
27:16 MoCo V2
Icon made by Freepik from flaticon.com
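For readers who want the gist without watching: MoCo's key idea (from the paper) is that the key encoder is not updated by backpropagation but as an exponential moving average of the query encoder. A minimal pure-Python sketch of that update rule, with parameters represented as plain lists of floats for illustration:

```python
def momentum_update(theta_k, theta_q, m=0.999):
    """EMA update from the MoCo paper: theta_k <- m * theta_k + (1 - m) * theta_q.

    theta_k, theta_q: lists of floats standing in for the key and query
    encoder parameters. m: momentum coefficient (0.999 in the paper).
    """
    return [m * pk + (1.0 - m) * pq for pk, pq in zip(theta_k, theta_q)]

# Toy example: the key encoder drifts slowly toward the query encoder,
# which keeps the keys in the queue consistent across iterations.
theta_q = [1.0, 2.0]
theta_k = [0.0, 0.0]
theta_k = momentum_update(theta_k, theta_q)
```

The large momentum makes the key encoder evolve smoothly, which the paper argues is what keeps the dictionary of queued keys consistent.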

Published: 28 Jun 2024

Comments: 14
@shilashm5691
@shilashm5691 6 months ago
Reason for having labels as zeros: labels is the ground-truth 'index' into the 1+len(queue)-wide logits tensor. As shown in the code snippet in KaimingHe's answer, this is always index 0 (l_pos is the first column of the resulting logits tensor). logits is later fed into the CrossEntropy criterion, i.e. the contrasting happens through the entanglement of the logit scores by the softmax function.
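To make this comment concrete, here is a pure-Python sketch (not the paper's PyTorch pseudocode) of how the logits row is laid out and why target index 0 selects the positive: the positive similarity l_pos goes in column 0, the queue similarities follow, and cross-entropy with target 0 is just -log softmax(logits)[0].

```python
import math

def info_nce_loss(l_pos, l_neg, t=0.07):
    """Cross-entropy over [l_pos] + l_neg with target index 0 (the positive).

    l_pos: similarity of the query with its positive key (a float).
    l_neg: similarities of the query with the queued negatives (list of floats).
    t: temperature (0.07 in MoCo v1).
    """
    logits = [l_pos / t] + [x / t for x in l_neg]  # positive sits at index 0
    z = max(logits)                                # stabilise the softmax
    denom = sum(math.exp(x - z) for x in logits)
    return -(logits[0] - z - math.log(denom))      # -log softmax(logits)[0]

# The loss shrinks as the positive similarity grows relative to the negatives,
# which is exactly the contrast the commenter describes.
print(info_nce_loss(0.9, [0.1, 0.2, 0.0]) < info_nce_loss(0.3, [0.1, 0.2, 0.0]))  # True
```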
@starshipx1282
@starshipx1282 10 months ago
Please keep up the great work. You will eventually have a much larger audience. Love your explanations and choice of papers. Thanks a lot Soroush 😊
@soroushmehraban
@soroushmehraban 10 months ago
Thanks for the feedback! Appreciate it🙂
@ramonp88
@ramonp88 3 months ago
Thank you! Awesome.
@akbarmehraban5007
@akbarmehraban5007 1 year ago
It is perfect for me
@alihadimoghadam8931
@alihadimoghadam8931 1 year ago
👏👏
@syedabdul7684
@syedabdul7684 8 months ago
0 means the 1st class. Generally, classes in classification are indexed 0 to N-1, so label 0 means the positive class.
@soroushmehraban
@soroushmehraban 8 months ago
But when we look at the cross-entropy formula, the label value is multiplied by the log of the predicted value. How can it mark the positive class when the label is zero, so that the result of the multiplication ends up being zero?
@syedabdul7684
@syedabdul7684 8 months ago
@@soroushmehraban Internally, I think it converts the batch of N class indices into a one-hot encoded NxC tensor.
@soroushmehraban
@soroushmehraban 8 months ago
@@syedabdul7684 I checked the code on GitHub though. Couldn’t see that.
@syedabdul7684
@syedabdul7684 8 months ago
@@soroushmehraban From the PyTorch docs: "The target that this criterion expects should contain class indices in the range [0, C)", where C is the number of classes. As you can see, it expects class indices 0 to C-1 (C not included).
@soroushmehraban
@soroushmehraban 8 months ago
@@syedabdul7684 I see. Now it makes sense! Thank you for the clarification.
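The thread above resolves as follows: with a class-index target c, cross-entropy reduces to -log softmax(logits)[c], which is exactly what the one-hot formula gives, since every term with a zero label vanishes. A pure-Python check of that equivalence (mirroring what PyTorch's CrossEntropyLoss computes; the helper names here are my own):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    z = max(xs)
    exps = [math.exp(x - z) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def ce_index(logits, c):
    """Cross-entropy with a class-index target: -log p[c]."""
    return -math.log(softmax(logits)[c])

def ce_onehot(logits, onehot):
    """Cross-entropy with a one-hot target: -sum_i y_i * log p_i."""
    p = softmax(logits)
    return -sum(y * math.log(pi) for y, pi in zip(onehot, p))

# Index target 0 and one-hot target [1, 0, 0] give the same loss:
logits = [2.0, 0.5, -1.0]
print(abs(ce_index(logits, 0) - ce_onehot(logits, [1.0, 0.0, 0.0])) < 1e-12)  # True
```

So in MoCo, labels = zeros simply says "for every row, the correct class is column 0", i.e. the l_pos column.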