Hierarchical Cluster Analysis [Simply explained] 

DATAtab

Published: 5 Oct 2024

Comments: 61
@4chanFootballMemes
@4chanFootballMemes 7 months ago
I loved learning about "Heyrakikal" clustering
@amobindubuisi2631
@amobindubuisi2631 4 months ago
This is extremely good material. Top-notch. I've never seen something explained so easily as in this content.
@odiakaolika5715
@odiakaolika5715 6 months ago
You just made my evening with your simple explanation.
@datatab
@datatab 6 months ago
Glad it was helpful and many thanks for your feedback! Regards Hannah
@kartikeyamishra9995
@kartikeyamishra9995 a month ago
Thanks for the wonderful video, Hannah! This is great material. I am preparing for a risk certification, and this really helped me revise my concepts in a much better way. Have a great day!
@prernaprasad7665
@prernaprasad7665 a month ago
I love your channel. Your explanations are so good and so clear.
@datatab
@datatab a month ago
Thank you so much!
@osmancetinkaya8930
@osmancetinkaya8930 a year ago
How can the square root of 17 (16+1) be equal to 3.162? It should be 4.123, shouldn't it?
@manuelruelas3496
@manuelruelas3496 a year ago
The error is that the x distance is 3 (from 1 to 4), not 4, so it's the square root of 10.
@ozgurogur1297
@ozgurogur1297 a year ago
I found it very understandable and simple. Thanks a lot!
@nakirambau7632
@nakirambau7632 11 months ago
thank you so much, you have explained it so well
@datatab
@datatab 11 months ago
Glad it was helpful!
@jff711
@jff711 a month ago
Well-explained. Thanks!
@ibrahimabubakarzango9803
@ibrahimabubakarzango9803 6 months ago
Please endeavour to avoid making mistakes. Thanks to the comment section, otherwise I would have found it very difficult to comprehend. That sqrt-of-17 part is terrible. But you did well and this video is good too.
@datatab
@datatab 6 months ago
Hi, thanks for your feedback! We try to avoid mistakes; sorry for that and for the resulting trouble! Regards Hannah
@Oladayo1
@Oladayo1 6 months ago
Well, that's because it's the sqrt of 10, not sqrt of 17. The mistake was using 4 instead of 3.
@alexfrancois
@alexfrancois a year ago
Beautifully explained, thanks! 🙏 Incredibly clear.
@asmaaadel-z7h
@asmaaadel-z7h 18 days ago
Great video, thanks ❤
@Motivasi.Quotes
@Motivasi.Quotes 3 months ago
Such a good video. Thank you so much for your explanation.
@rileyharper7679
@rileyharper7679 10 months ago
The Euclidean distance horizontal component at 2:17 should be 3, not 4, since 4 - 1 = 3. Also, the Manhattan distance should be 4 and the maximum distance should be 3 for the same reason.
@playbros332
@playbros332 9 months ago
I agree they are wrong, but shouldn't it be square root of 17, which is 4.12?
@fabianr9394
@fabianr9394 9 months ago
@@playbros332 Because you go 3 steps to the right and 1 up, so it's sqrt(3^2 + 1^2) = sqrt(10).
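To make the corrected numbers concrete, here is a small sketch. The exact coordinates from the video are assumptions; only the differences (3 in x, 1 in y) matter:

```python
# Hypothetical points with the differences discussed above (dx = 3, dy = 1).
import math

a = (1, 2)  # point A (assumed coordinates)
b = (4, 3)  # point B (assumed coordinates)

dx = abs(b[0] - a[0])  # 4 - 1 = 3, the corrected horizontal component
dy = abs(b[1] - a[1])  # 1

euclidean = math.sqrt(dx**2 + dy**2)  # sqrt(9 + 1) = sqrt(10) ≈ 3.162
manhattan = dx + dy                   # 3 + 1 = 4
maximum = max(dx, dy)                 # 3 (the maximum/Chebyshev distance)

print(euclidean, manhattan, maximum)
```

This reproduces the corrected values from the thread: sqrt(10) ≈ 3.162 for Euclidean, 4 for Manhattan, and 3 for the maximum distance.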
@manuelleitner1996
@manuelleitner1996 a year ago
Great video, thank you!!!
@datatab
@datatab a year ago
My pleasure!
@lazartrifunovic3831
@lazartrifunovic3831 3 days ago
Is this Agglomerative Clustering?
@H4ck3er01
@H4ck3er01 a year ago
Well explained, thank you so much
@rodidoesburg4061
@rodidoesburg4061 8 months ago
How do you name the clusters? Just from left to right, so cluster 1, cluster 2, cluster 3? Or are there more methods to name a cluster?
@ibethdiaztapia1033
@ibethdiaztapia1033 3 months ago
Hi, it should be 3 - 1 for the Euclidean distance, as the formula is the square of XB1 - XA1.
@shawnkim6287
@shawnkim6287 a year ago
Thank you so much. You clarified a lot!!!! 😀
@nazhifmuh.kasyfan2148
@nazhifmuh.kasyfan2148 5 months ago
I would like to ask, is Hierarchical Cluster Analysis always associated with the Euclidean Distance? Thank you
@datatab
@datatab 5 months ago
Hi, many thanks for your question. Hierarchical Cluster Analysis (HCA) is not always associated with the Euclidean distance. While Euclidean distance is commonly used, HCA can work with various distance metrics depending on the nature of the data and the analysis goals. Here are some common distance metrics used in HCA:
- Euclidean Distance: the straight-line distance between two points in a multi-dimensional space. It's one of the simplest and most widely used distance metrics.
- Manhattan Distance (also known as City Block or L1 distance): the sum of absolute differences between coordinates. It can be suitable when diagonal movement isn't meaningful.
- Cosine Similarity: measures the cosine of the angle between two vectors; commonly used in text analysis and other contexts where vector magnitude might vary.
- Mahalanobis Distance: accounts for correlations in the data by incorporating the covariance matrix, making it suitable for data with different scales and correlated variables.
- Minkowski Distance: a generalization of Euclidean and Manhattan distances, with a parameter 'p' to control the degree of the norm.
- Correlation-based Distance: uses the correlation between data points rather than absolute differences. It's common in gene expression analysis and other contexts where relationships between variables matter more than absolute values.
I hope this was helpful : ) Regards Hannah
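As a rough illustration of swapping metrics in practice, here is a minimal sketch using SciPy; the toy data, the average-linkage choice, and the two-cluster cut are assumptions, not DATAtab's example:

```python
# Minimal sketch: hierarchical clustering under different distance metrics.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

X = np.array([[1.0, 2.0],
              [4.0, 3.0],
              [5.0, 8.0],
              [6.0, 9.0]])

for metric in ("euclidean", "cityblock", "cosine", "minkowski"):
    d = pdist(X, metric=metric)          # pairwise distances under this metric
    Z = linkage(d, method="average")     # build the hierarchy (average linkage)
    labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two clusters
    print(metric, labels)
```

The point is that only the pairwise-distance step changes; the merging procedure itself is the same regardless of the metric.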
@samuraixyz22
@samuraixyz22 a year ago
I would like to make YouTube tutorials like this. Do you have recommendations on what software to use?
@datatab
@datatab a year ago
DATAtab : )
@samuraixyz22
@samuraixyz22 a year ago
@DATAtab where can you learn more about it?
@saurabhjoshi3010
@saurabhjoshi3010 10 months ago
nicely explained
@matheusdelima1743
@matheusdelima1743 a year ago
Great content. I'm a fan :)
@datatab
@datatab a year ago
Glad it was helpful and many thanks for your nice feedback! Regards Hannah
@iqraahmad130
@iqraahmad130 a year ago
you're kinda cute
@luisamar8214
@luisamar8214 5 months ago
How do you calculate the distances between Lisa, Joe and the others? You have a group of positions, not just one... how do you do that? Thanks!
@datatab
@datatab 5 months ago
Hi, in this case you would first calculate the center between Lisa and Joe and then the distance from this center to another person. Regards Hannah
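A minimal sketch of that idea with NumPy, using made-up coordinates for illustration:

```python
# Assumed coordinates: once Lisa and Joe are merged, represent the pair by
# their centroid and measure the distance from that centroid to someone else.
import numpy as np

lisa = np.array([1.0, 2.0])   # hypothetical position
joe = np.array([4.0, 3.0])    # hypothetical position
anna = np.array([6.0, 9.0])   # hypothetical third person

centroid = (lisa + joe) / 2             # center of the merged cluster
dist = np.linalg.norm(anna - centroid)  # Euclidean distance to Anna
print(centroid, dist)
```

This centroid approach is one of several linkage rules; SciPy exposes it as method="centroid" in scipy.cluster.hierarchy.linkage, alongside alternatives such as single, complete, and average linkage.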
@ricardorivashernandez4023
@ricardorivashernandez4023 a year ago
Real good!
@maxwellspyk494
@maxwellspyk494 a year ago
Hi, where can I find the elbow method?
@datatab
@datatab a year ago
Oh sorry, it will be there soon!!!
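Until that video is up, here is a minimal sketch of the elbow method (the toy data and the range of k are assumptions, not DATAtab material): fit k-means for increasing k and look for the "elbow" where the within-cluster sum of squares (inertia) stops dropping sharply.

```python
# Minimal elbow-method sketch on made-up data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy data: three loose blobs in 2D.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(30, 2)) for c in (0, 3, 6)])

for k in range(1, 9):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(k, round(km.inertia_, 1))  # inertia drops sharply up to k = 3
```

Plotting inertia against k would show the bend (the "elbow") near the true number of clusters.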
@fredh3152
@fredh3152 5 months ago
I love your accent
@datatab
@datatab 5 months ago
: )
@s.h.a6472
@s.h.a6472 3 months ago
God bless you, lady
@ahmad3823
@ahmad3823 5 months ago
4-1=3 though!
@datatab
@datatab 5 months ago
: )
@python4ncert202
@python4ncert202 a year ago
Nice video! I want to know the name of the algorithm that you have used here to explain hierarchical clustering.
@Nothingimportant1
@Nothingimportant1 a year ago
I want to know too, but it is highly probable that she won't tell us. Statistically speaking.
@muhammadwaseem_
@muhammadwaseem_ a year ago
@@Nothingimportant1 AGNES
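For context: AGNES (AGglomerative NESting) is the bottom-up approach, where every point starts as its own cluster and the two closest clusters are merged step by step. A minimal sketch with scikit-learn; the toy data and the average-linkage choice are assumptions:

```python
# Minimal agglomerative (AGNES-style) clustering sketch with assumed data.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

X = np.array([[1.0, 2.0], [4.0, 3.0], [5.0, 8.0], [6.0, 9.0]])

model = AgglomerativeClustering(n_clusters=2, linkage="average")
labels = model.fit_predict(X)  # cluster label for each point
print(labels)
```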
@user-vo4ew1gx
@user-vo4ew1gx a year ago
Excellent explanation. Why does it take so long to create a new video?
@datatab
@datatab a year ago
Good question! : ) We need almost two weeks to prepare the topic and to create the slides! Regards Hannah
@user-vo4ew1gx
@user-vo4ew1gx a year ago
@@datatab I hope it will be fast :)
@mahidahmed7
@mahidahmed7 a year ago
klaaastarrrrss
@PaulKam1997
@PaulKam1997 a year ago
It's "and", not "und", at 3:15.
@datatab
@datatab a year ago
Thanks : )
@asrarbw
@asrarbw 6 months ago
Claaaastars 😂
@abdulaziznazarov9661
@abdulaziznazarov9661 7 months ago
I think you have a mistake in the calculation.