
Introduction to Machine Learning - 06 - Linear discriminant analysis 

Tübingen Machine Learning
40K subscribers
28K views

Published: 20 Oct 2024

Comments: 21
@googlesong8679 · 16 days ago
this is the best LDA video I have seen. thank you so much.
@saketdeshmukh6881 · 3 years ago
I wish I had found this before my master's. Intuitive, with the right amount of mathematical rigor.
@YuchengLin · 2 years ago
So wonderfully presented! Whenever I started to feel there was too much math, some cute drawings appeared to give me simple and visceral intuition.
@jiajieli5138 · 3 years ago
Highly recommended Machine Learning Instruction!
@TheCrmagic · 2 years ago
Sir, you are a great teacher.
@AD-ox4ng · 1 year ago
This is my guess for the number of parameters (in the covariance matrix alone) at 38:16:
Full: p^2 (there are p*p distinct elements)
Diagonal: p (there are only p distinct elements along the diagonal, all else is 0)
Spherical: 1 (same as diagonal but equal variance in all dimensions, so only one number to compute)
If the model is separate, multiply the number above by 2, otherwise by 1. Add 2p to account for the mean vectors as well (there are p distinct means to calculate for each of the two classes).
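The counting in the comment above can be sketched in a few lines of Python. This follows the commenter's guess exactly (note that a symmetric full covariance matrix strictly has p(p+1)/2 free entries, but the comment counts p^2); the function name and interface are illustrative, not from the lecture.

```python
# Parameter counts for a two-class Gaussian classifier in p dimensions,
# following the counting sketched in the comment above.
def n_params(p, cov="full", shared=True):
    """Covariance parameters plus the two p-dimensional class means."""
    cov_terms = {"full": p * p, "diagonal": p, "spherical": 1}[cov]
    if not shared:           # separate covariance per class (QDA-style)
        cov_terms *= 2
    return cov_terms + 2 * p  # add 2p for the two mean vectors

print(n_params(5, "full", shared=True))  # 35
```

For p = 5 this gives 35 parameters for a shared full covariance, 20 for separate diagonal covariances, and so on.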
@woodworkingaspirations1720 · 1 year ago
This solved my problem. Thank you sir. Needed a summarized view of the math. Perfect.
@IamMoreno · 2 years ago
Simply beautifully explained; sir, you have all my gratitude.
@micahdelaurentis6551 · 3 years ago
These have been excellent videos so far
@xiaochelsey880 · 2 years ago
Great video. Thank you so much for showing all the math!
@severian6879 · 1 year ago
Excellent explanation! Thank you very much!
@nauraizsubhan01 · 3 years ago
Sir, can you please tell me whether this program offers any course related to robotics and autonomous systems?
@vincentole · 3 years ago
Great videos! Thank you for this.
@calcifer7776 · 3 years ago
this is gold, thank you
@Jeremy-zs3nn · 3 years ago
Thanks for posting, very helpful video. I did get a bit confused by some of the notation. Looking at the slide titled "Estimating Gaussian parameters" (25:49): the covariance matrix we're estimating is indexed over C_k, which is the subset of the design matrix for which Y = k? Are X and mu_k both matrices, or is mu_k a vector?
3 years ago
Thanks. Let me see... x_i is a vector (sample number i). mu_k is a vector (average over all samples belonging to class k, so with Y=k). Sigma_k is a matrix (covariance matrix over all samples belonging to class k). I usually use lowercase bold for vectors and uppercase bold for matrices.
@Jeremy-zs3nn · 3 years ago
@ Great, thank you for the quick reply!
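The estimation described in the reply above can be sketched with NumPy: for each class k, mu_k is the mean vector over the samples with Y = k, and Sigma_k is their covariance matrix. This is an illustrative sketch with synthetic data, not the lecturer's code.

```python
import numpy as np

# Per-class Gaussian parameter estimates, as in the reply above:
# mu_k is a vector, Sigma_k is a matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))      # 100 samples, p = 3 features
y = rng.integers(0, 2, size=100)   # binary class labels

params = {}
for k in (0, 1):
    X_k = X[y == k]                      # subset C_k of the design matrix
    mu_k = X_k.mean(axis=0)              # class mean, shape (3,)
    Sigma_k = np.cov(X_k, rowvar=False)  # class covariance, shape (3, 3)
    params[k] = (mu_k, Sigma_k)

print(params[0][0].shape, params[0][1].shape)  # (3,) (3, 3)
```

For LDA proper, the class covariances are pooled into a single shared Sigma; estimating a separate Sigma_k per class gives QDA.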
@CootiePruitt · 3 years ago
👍 Great video - thank you!
@indigod3323 · 3 years ago
Very great teacher, I wish I could study in Tübingen.
@hfz.arslan · 3 years ago
Sir, can you please share the slides or notes? Thanks.
@sunshinebabe6203 · 3 years ago
Thank you! :)