
Understand Cosine Similarity | 2 Minute Tutorial 

Daniel Krei
1.4K subscribers
9K views

This is a quick introduction to cosine similarity - one of the most important similarity measures in machine learning!
Cosine similarity meaning, formula and example!
If you like this video, hit a like and subscribe!
Paper icon used in a video:
www.flaticon.com/free-icons/p... - Paper icons created by monkik - Flaticon
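The formula the video covers, cos(θ) = (A·B) / (|A|·|B|), can be sketched in plain Python (the example vectors here are illustrative, not the ones from the video):

```python
import math

def cosine_similarity(a, b):
    """cos(theta) = dot(a, b) / (|a| * |b|)"""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    return dot / (mag_a * mag_b)

# Same direction -> ~1.0, orthogonal -> 0.0, opposite -> -1.0
print(cosine_similarity([1, 1, 1], [2, 2, 2]))  # ~1.0
print(cosine_similarity([1, 0], [0, 1]))        # 0.0
```

Note that the score depends only on the angle between the vectors, not their lengths, which is why [1, 1, 1] and [2, 2, 2] come out maximally similar.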

Published: 15 Jul 2023

Comments: 35
@muna4840 8 months ago
Such a simple yet on-point explanation... cheers mate!
@desecrator718 20 days ago
Thanks for the quick and crisp explanation
@maxreed9666 2 months ago
Brilliantly concise explanation. Have a like. :)
@simo_woman 1 year ago
Great having it explained in short! Thanks and waiting for the next one!
@992_cup 2 months ago
Thank you for the straightforward examples. On point!
@danielkrei 2 months ago
Thank you!
@minlingg91 11 months ago
useful video and clear explanation! thank you and keep up the good work!
@danielkrei 10 months ago
Glad it was helpful!
@amisharadinkhai 1 month ago
simple and well explained! thank you
@omkarrajmane9408 19 days ago
Wow, wasn't expecting much as it was such a short video. But this was the most valuable video: so concise, and it made me understand the concept much better! Thanks a lot!
@danielkrei 19 days ago
Thank you!
@soberian 10 months ago
Great and efficient explanations, you deserve a million views; this helped me a lot. Can you explain K-Nearest Neighbors? I would love to watch your explanation.
@danielkrei 9 months ago
Great suggestion!
@galacticimaginarium 3 months ago
Indeed a pretty amazing explanation, helped me a lot! Thanks.
@exoticcoder5365 10 months ago
Very useful to have a quick recall on the calculation part 👍
@EjazAhmed-pf5tz 8 months ago
Unbelievable, I have an exam tomorrow and am still writing this comment, which I don't usually do. Thank you so much for such a simple explanation, and for covering a 2-hour lecture in just 2 minutes. Thank you once again!
@danielkrei 8 months ago
Thank you!
@benny9794 6 months ago
Super well explained - thanks man!
@danielkrei 6 months ago
Glad it was helpful!
@muna4840 7 months ago
Is the magnitude of B ≈ 1.732 or ≈ 2.236?
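For reference, a vector's magnitude is √(Σxᵢ²); 1.732 ≈ √3 and 2.236 ≈ √5, so the answer depends on B's components, which are not visible in this transcript. A quick check with illustrative vectors:

```python
import math

def magnitude(v):
    """Euclidean length: sqrt of the sum of squared components."""
    return math.sqrt(sum(x * x for x in v))

print(magnitude([1, 1, 1]))  # ~1.732 (sqrt of 3)
print(magnitude([1, 2]))     # ~2.236 (sqrt of 5)
```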
@n.waitforit.z7182 10 months ago
Great stuff, thanks!
@danielkrei 10 months ago
Glad you liked it!
@aleefbilal6211 14 days ago
bro looks 18 and 30 at the same time. anyways, great and quick explanation. Thanks.
@mrbacal7 8 months ago
Thank you, good work!
@danielkrei 8 months ago
Glad it was helpful!
@sriharinair227 9 months ago
thanks!
@toastrecon 10 months ago
I was listening to a talk the other day, and someone mentioned that cosine similarity might be replaced eventually by something called "learned representation"? I may have it wrong, but have been struggling to find any info on it. Have you heard of that?
@danielkrei 10 months ago
Thanks for the question! The term "learned representation" is usually used when talking about embeddings. Those learned representations (or embeddings) are vectors produced by an algorithm and contain features extracted from the input. Similarity or distance measures are then used to quantify how similar or different those embeddings are: for example, how similar two faces or two images of a car are.
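To illustrate the reply above: the learned representation produces the vectors, and cosine similarity compares them afterwards. The "embeddings" below are made-up toy numbers, not the output of any real model:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    return dot / (mag_a * mag_b)

# Hypothetical 3-d embeddings (a real model would output hundreds of dimensions)
face_a = [0.9, 0.1, 0.3]
face_b = [0.8, 0.2, 0.4]
car    = [0.1, 0.9, 0.7]

print(cosine_similarity(face_a, face_b))  # high: the two faces are alike
print(cosine_similarity(face_a, car))     # lower: face vs car differ
```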
@toastrecon 10 months ago
@@danielkrei Oh, interesting! I was thinking that the vector store held all of the embedding vectors (high-dimensional representations of what the model found in terms of similarity or relatedness), and that cosine similarity was the algorithm used to quantify relatedness between two ideas, words, or phrases? I didn't know if there was some new way of representing those connections, or whether adding to the vector store after the initial embeddings were generated would create issues. Maybe you'd be stuck recalculating the entire "matrix" if you added more info? Thanks for the video!
@nikola4628 10 months ago
Can you explain why 3 vectors? Because there are three sentences? Then you have v1, v2, v3 and compute (v1, v2) and (v1, v3); why is there no (v2, v3)? Is it always the first one against all the others?
@soberian 10 months ago
Isn't the task to calculate similarity to the first one? That is, which vector is most similar to the first.
@danielkrei 9 months ago
In this example I was looking for similarity between the first vector and the other two. Depending on the task you may compute the score for other pairs too.
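The two comparison styles from this thread can be sketched side by side; the sentence vectors here are hypothetical, since the video's actual numbers are not in this transcript:

```python
import itertools
import math

def cos_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical vectors standing in for the three sentences
vectors = {"v1": [2, 1, 0], "v2": [1, 2, 0], "v3": [0, 0, 3]}

# Query style, as in the video: compare v1 against the others
for name in ("v2", "v3"):
    print("v1 vs", name, "->", round(cos_sim(vectors["v1"], vectors[name]), 3))

# Full pairwise comparison also covers (v2, v3)
pairs = list(itertools.combinations(vectors, 2))
print(pairs)  # [('v1', 'v2'), ('v1', 'v3'), ('v2', 'v3')]
```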
@Wilhuf1 9 months ago
Oh, I thought cosine similarity ranged only between 0.0 and 1.0.
@danielkrei 9 months ago
Good point! It depends on the data you are using, but the theoretical range is the same as the range of cosine: [-1, 1].
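The full [-1, 1] range is easy to demonstrate with unit vectors; the last comment in the snippet notes why many practical datasets never reach the negative half:

```python
import math

def cos_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

print(cos_sim([1, 0], [1, 0]))   # 1.0: same direction
print(cos_sim([1, 0], [0, 1]))   # 0.0: orthogonal
print(cos_sim([1, 0], [-1, 0]))  # -1.0: opposite direction

# With non-negative features (e.g. word counts) no component is negative,
# so no angle exceeds 90 degrees and scores stay in [0, 1] in practice.
```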