
Eigenvalues and Eigenvectors (PCA): Dimensionality Reduction, Lecture 15 @ Applied AI Course

Applied AI Course
88K subscribers
46K views

Published: 7 Sep 2024

Comments: 14
@MuctaruKabba, 3 years ago
Thank you sir. Brilliant methods and explanations to help a near beginner trying to grasp the nuance of eigenvalues and eigenvectors as applied in PCA and unsupervised machine learning. I'm not there yet, but I sure feel like I can and will after listening to you. Who needs machines ... lol?
@rajmaheshwarreddy2859, 6 years ago
Nice explanation of eigenvectors and their corresponding eigenvalues; also of the geometric picture of the spread of the information when going from 2D to 1D!
@AJ-fo3hp, 3 years ago
Thank you very much. It covers: 1. column-standardized matrix, 2. covariance matrix, 3. eigenvectors, 4. eigenvalues, 5. orthogonal matrix of v1, v2, ..., vd, 6. projection, 7. projection at 100%, 75%, 60%, 50%. In general, for an arbitrary matrix the eigenvectors are NOT always orthogonal. But for a special type of matrix, a symmetric matrix, the eigenvalues are always real and the corresponding eigenvectors are always orthogonal.
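A minimal NumPy sketch (not from the lecture; the data here is synthetic) checking the orthogonality claim above: the covariance matrix of column-standardized data is symmetric, so its eigenvectors come out orthonormal.

```python
import numpy as np

# Synthetic, column-standardized data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                       # 100 samples, 3 features
X = (X - X.mean(axis=0)) / X.std(axis=0)            # column standardization
S = X.T @ X / X.shape[0]                            # covariance matrix (symmetric)

eigvals, eigvecs = np.linalg.eigh(S)                # eigh: for symmetric matrices
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))  # True: eigenvectors are orthonormal
```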
@rishabhshirke1175, 3 years ago
7:57 This is always true only if it's a symmetric matrix.
@debanjandas4877, 5 years ago
Applied AI Course: at 1:57 the covariance matrix is given as Cov(X) = S = X^T X, but in the previous video we saw Cov(X) = S = (1/n)(X^T X). Why this difference?
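One way to see what the 1/n factor changes (a sketch on made-up data, not the course's answer): it only rescales the eigenvalues; the eigenvectors, and therefore the PCA directions, are the same either way.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
X = X - X.mean(axis=0)                      # mean-centered data

S_no_n = X.T @ X                            # without the 1/n factor
S_with_n = X.T @ X / X.shape[0]             # with the 1/n factor

w1, V1 = np.linalg.eigh(S_no_n)
w2, V2 = np.linalg.eigh(S_with_n)
print(np.allclose(w1 / X.shape[0], w2))     # eigenvalues just scale by 1/n
print(np.allclose(np.abs(V1), np.abs(V2)))  # eigenvectors match (up to sign)
```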
@msravya7694, 5 years ago
Sir, what is the purpose of using the covariance matrix?
@AppliedAICourse, 5 years ago
The goal of PCA is to find projection directions along which the variance of the projected data points is maximum. If you write this goal in mathematical terms and simplify the resulting objective, a matrix appears in the resulting eigenvalue equation. This matrix is nothing but the covariance matrix. Hence, the covariance matrix arises naturally from the goal/objective of PCA. Please go through the link: www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/
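A hedged NumPy sketch of that point (synthetic data, not the course's code): the projected variance is largest along the eigenvector of the covariance matrix with the largest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.multivariate_normal([0.0, 0.0], [[3.0, 1.5], [1.5, 1.0]], size=500)
X = X - X.mean(axis=0)                      # mean-center
S = X.T @ X / X.shape[0]                    # covariance matrix

eigvals, eigvecs = np.linalg.eigh(S)        # eigenvalues in ascending order
v_top = eigvecs[:, -1]                      # eigenvector with the largest eigenvalue

u = rng.normal(size=2)
u /= np.linalg.norm(u)                      # a random unit direction for comparison
print((X @ v_top).var(), (X @ u).var())     # variance along v_top is the larger one
```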
@gurudakshin6420, 5 years ago
I didn't understand anything from this. Do I need to watch any prior videos related to PCA in order to understand this video's concepts?
@AppliedAICourse, 5 years ago
Yes, these lectures are in a sequence. You need to first watch the videos on linear algebra and then watch PCA for a clear understanding. You can watch these videos in order on our website, as these are our sample/free videos. Please log in to AppliedAICourse.com and check out the sample videos, which are arranged in a clean order starting from the basics of Python.
@akhilkrishna8521, 4 years ago
Sir, why can't we take v2 as our vector having maximum variance in a particular direction?
@AppliedAICourse, 4 years ago
Because V1 has higher variance than V2: if you were to pick a single dimension with maximum variance, you would pick V1 and not V2.
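A small sketch (assumed NumPy setup, not from the video) of exactly this choice: sort the eigenpairs by eigenvalue, keep v1, and project 2D data to 1D along it.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.multivariate_normal([0.0, 0.0], [[4.0, 1.2], [1.2, 0.5]], size=300)
X = X - X.mean(axis=0)
S = X.T @ X / X.shape[0]

eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]            # sort eigenvalues in descending order
v1, v2 = eigvecs[:, order[0]], eigvecs[:, order[1]]

print((X @ v1).var() >= (X @ v2).var())      # True: v1 preserves more variance than v2
X_1d = X @ v1                                # the 2D -> 1D projection used by PCA
```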
@prativadas4794, 5 years ago
Can we get the answer for why u1 = v1?
@questforprogramming, 5 years ago
Check the optimization videos, which are paid content on their website.