
Principal Component Analysis (The Math) : Data Science Concepts 

ritvikmath
163K subscribers
91K views

Let's explore the math behind principal component analysis!
---
Like, Subscribe, and Hit that Bell to get all the latest videos from ritvikmath ~
---
Check out my Medium: /ritvikmathematics
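A minimal sketch of the procedure the video derives, for readers who want to follow along in code (NumPy; the function name and the random demo data are illustrative, and rows of X are observations):

```python
import numpy as np

def pca_project(X, k):
    """Sketch: project an (n x d) data matrix onto its top-k principal components."""
    X_centered = X - X.mean(axis=0)           # subtract the mean of each feature
    S = (X_centered.T @ X_centered) / len(X)  # sample covariance matrix (divides by n)
    eigvals, eigvecs = np.linalg.eigh(S)      # eigh: eigensolver for symmetric matrices
    top = np.argsort(eigvals)[::-1][:k]       # indices of the k largest eigenvalues
    U = eigvecs[:, top]                       # (d x k) matrix of principal directions
    return X_centered @ U                     # (n x k) projected coordinates

# Hypothetical usage on random data
X = np.random.default_rng(0).standard_normal((100, 5))
Z = pca_project(X, 2)
print(Z.shape)  # (100, 2)
```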

Published: Sep 10, 2024

Comments: 168
@loveena419
@loveena419 3 years ago
Finally, a video that explains the math behind PCA so clearly. Went through all the other videos and it helped a lot! Thank you!
@joachimguth6226
@joachimguth6226 4 years ago
Very well presented. You are a great teacher. Hopefully you are going to cover the entire AI space.
@ritvikmath
@ritvikmath 4 years ago
That is the goal!
@warrenbaker4124
@warrenbaker4124 3 years ago
@@ritvikmath Oh wow!!! I'm so happy to see you're taking this on. I'm a huge fan and this is a real highlight for me. Thanks for all you do!!
@Moiez101
@Moiez101 1 year ago
@@ritvikmath I fully support that goal! I just started with data science, bro. Loving your videos, you're a great teacher.
@qaarloshilaal2778
@qaarloshilaal2778 3 years ago
Thanks infinitely for all your videos, you're literally the best at explaining these concepts in a clear and excellent way so that we can continue with what we have to study/do! Huge respect, man.
@ritvikmath
@ritvikmath 3 years ago
You're very welcome!
@pigtowndanzee
@pigtowndanzee 4 years ago
Love your teaching style. Keep these videos coming!
@kaeruuuu_
@kaeruuuu_ 2 years ago
Necessary videos:
1. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-X78tLBY3BMk.html (Vector Projections)
2. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-glaiP222JWA.html (Eigenvalues & Eigenvectors)
3. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-6oZT72-nnyI.html (Lagrange Multipliers)
4. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-e73033jZTCI.html (Derivative of a Matrix)
5. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-152tSYtiQbw.html (Covariance Matrix)
@TamNguyen-qi8di
@TamNguyen-qi8di 3 years ago
Dear ritvikmath, Thank you so much, sir, for your clear explanation. Even in my last year of college, I am still struggling with the basics of statistics. With your help, I have been thriving in class and am looking to graduate from college this semester. Your videos have been so, so helpful, and I wish you amazing health so you can continue with your content. I wish you could have been my professor in college. Thank you for putting out such high-quality content. Words can't describe how much I appreciate you, sir. Thank you. You have changed my life.
@ritvikmath
@ritvikmath 3 years ago
Thanks for the kind words. Wishing you much success!
@riteshsaha6881
@riteshsaha6881 1 month ago
This is super helpful. Way better than my professor's explanation
@vinceb8041
@vinceb8041 3 years ago
I've been wrestling for a while to get all the intuitive and computational components of PCA, and seeing it all come together here helps tremendously! Great as always, 10/10 video :)
@clxdyy.luveditxs
@clxdyy.luveditxs 5 months ago
Thank God I found your channel. I am studying for a master's degree in computer science at a prestigious university, which costs me a lot of money, but your channel is very useful for digging deeper and understanding many things. Keep up the good work!
@vinceb8041
@vinceb8041 3 years ago
12:20 Quick note on why going down the list of eigenvalues is legit: the covariance matrix is symmetric, and it can be shown that for such a matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal.
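A quick numerical check of that fact, as a sketch (NumPy's `eigh` is the symmetric-matrix eigensolver; the random matrix here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                     # symmetrize, as with a covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # eigh assumes symmetric input
# The eigenvectors come back orthonormal: V^T V = I
print(np.allclose(eigvecs.T @ eigvecs, np.eye(4)))  # True
```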
@amaramar4969
@amaramar4969 6 months ago
I had to go through the prerequisite videos to clarify my concepts first, but after that this PCA explanation is amazing! I think you are equivalent to 10 college professors out there in terms of teaching skills. I hope you get a proportional amount of money, and the college professors feel ashamed and work harder to catch up to your standards. Again, amazing!
@pratik.patil87
@pratik.patil87 8 months ago
Thanks Ritvik, I went through multiple resources to figure out this exact question: why do the eigenvectors and eigenvalues of a covariance matrix represent the direction and strength of the biggest increase in variance? Your video clarifies it beautifully. One question still, though: I understand the equation we maximize, but why do we need the constraint (u^T u = 1)?
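One way to see why the constraint is needed: without it, the objective u^T S u is unbounded, since scaling u by c scales the objective by c^2. Fixing u^T u = 1 pins down the scale, and the Lagrangian then recovers the eigenvalue equation:

```latex
\max_{u}\; u^{\top} S u \quad \text{s.t.} \quad u^{\top} u = 1, \qquad
\mathcal{L}(u, \lambda) = u^{\top} S u - \lambda \left( u^{\top} u - 1 \right), \qquad
\nabla_{u} \mathcal{L} = 2 S u - 2 \lambda u = 0 \;\Longrightarrow\; S u = \lambda u .
```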
@bhajman123
@bhajman123 3 years ago
By far the most accessible description of PCA... finally I was able to clearly connect the covariance matrix and the eigenvalues to variance maximization.
@_arkadij
@_arkadij 7 months ago
Very appreciative of the explanation of why we end up using the vectors corresponding to the biggest eigenvalues. Thanks so much
@paulbrown5839
@paulbrown5839 3 years ago
This is a very strong video. It requires proper study. I hope you do more of this great stuff. Thank You!
@DeRocks1607
@DeRocks1607 3 months ago
You are a great teacher... I finally understood it.
@jhonportella5618
@jhonportella5618 3 years ago
Great, great video. I really appreciate your effort and good teaching methodology. I have a question on the projection math: in your projection video you obtained P = (X·U)U, but here you used P = (U^T X)U. Maybe this is a silly question, but I would really appreciate it if you could tell me why this equivalence is possible. Many thanks
@shivamkak7981
@shivamkak7981 11 months ago
Such a well curated explanation of PCA, thanks so much!
@resoluation345
@resoluation345 6 months ago
The best series to explain the maths behind PCA
@BleachWizz
@BleachWizz 4 years ago
I'm loving your content; you're showing a part of math that is not usually shown: the part where you actually use it, where you make your choices and why you're choosing them. It's nice to understand the equations and why they give you a 0 at the sweet spot, but it's also nice to be reminded that it not only works but was built to work with that intention. So in the end you still need to figure out how to get your problem to fit one of these, and what you can choose in these big generic operations to fit them to your problem.
@ritvikmath
@ritvikmath 4 years ago
Thanks for the feedback! I do try to focus a lot more on the "why" questions rather than the "how" questions.
@christinejiang6386
@christinejiang6386 5 months ago
Wow! Thank you! I watched all the videos before watching this one; they really help a lot!
@user-xw5cg7by6t
@user-xw5cg7by6t 1 year ago
This video is super great! I was wondering why the covariance matrix is used to compute PCA, but this video made my doubts clear!!
@ritvikmath
@ritvikmath 1 year ago
Glad it was helpful!
@sidddddddddddddd
@sidddddddddddddd 1 year ago
What you've called the closed form of the covariance matrix is actually the biased estimator of the covariance matrix \Sigma. If you divide by (N-1) instead of N, you get the unbiased estimator of \Sigma. Awesome video! Thanks :D
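For reference, NumPy exposes both estimators; a small sketch (variable names illustrative):

```python
import numpy as np

X = np.random.randn(50, 3)                     # 50 observations, 3 features
S_biased = np.cov(X, rowvar=False, bias=True)  # divides by N, the video's closed form
S_unbiased = np.cov(X, rowvar=False)           # default divides by N - 1
```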
@akrylic_
@akrylic_ 4 years ago
There's a property of transposes around 6:45 that you could have mentioned, and I got tripped up for a second. The reason why you can write u^T (xi - xbar) as (xi - xbar)^T u is that (AB)^T = (B^T)(A^T). It's a cool trick, but not obvious
@ritvikmath
@ritvikmath 4 years ago
Very true, thanks for filling in the missing step!
@zechengchang3444
@zechengchang3444 3 years ago
Can you explain more? How does (AB)^T = (B^T)(A^T) have anything to do with u^T (xi - xbar)? Thanks.
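A short derivation that connects the two, for reference: u^T (xi - xbar) is a 1x1 matrix (a scalar), so it equals its own transpose, and (AB)^T = (B^T)(A^T) unpacks it:

```latex
u^{\top} (x_i - \bar{x})
= \left( u^{\top} (x_i - \bar{x}) \right)^{\top}
= (x_i - \bar{x})^{\top} \left( u^{\top} \right)^{\top}
= (x_i - \bar{x})^{\top} u .
```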
@alphar85
@alphar85 3 years ago
I stopped at 01:33 and I am going to watch the other 5 videos. You are such a blessing, mate.
@bilalbayrakdar7100
@bilalbayrakdar7100 1 month ago
Bro, you are the best. Thanks for your effort.
@mashakozlovtseva4378
@mashakozlovtseva4378 4 years ago
Everything was clearly understood from the math side! Thank you for the link to your Medium account!
@zilezile4942
@zilezile4942 4 years ago
Good morning. If you have difficulty understanding statistical models and programming them with the R software; if you have difficulty understanding where the principal components come from when you do principal component analysis; if you need to discover statistics for functional data, in particular functional principal component analysis; if you have no idea how to model with a functional linear model... and you like clear and detailed explanations, click on this link: amikour.wordpress.com/nos-formations/
@thinkingAutomata
@thinkingAutomata 3 years ago
Thanks Ritvik. Excellent explanation of PCA. Good job, well done!
@Chill_Magma
@Chill_Magma 1 year ago
Straight to the point and thorough. You deserve to be subscribed to from my 3 accounts
@ShubhamYadav-ut9ho
@ShubhamYadav-ut9ho 3 months ago
Amazing explanation as always
@robertbillette4671
@robertbillette4671 2 years ago
Like everyone else has mentioned, amazing clarity and style.
@133839297
@133839297 1 year ago
You have a gift for teaching.
@mathematicality
@mathematicality 2 years ago
Simple and straight to the point. Absolutely well done!
@josephgan1262
@josephgan1262 3 years ago
Thanks for the amazing video! Can anyone please explain why the projection is (u1^T Xi) u? In the projection video it is (Xi · u) u. Are they equivalent?
@volsurf1274
@volsurf1274 3 years ago
Concise, clear and superbly explained. Thanks!
@ritvikmath
@ritvikmath 3 years ago
Glad it was helpful!
@yarenlerler67
@yarenlerler67 1 year ago
Ahh, such a clean explanation. I really appreciate it! I will have a practical statistics exam for astrophysics soon, and I was having some problems with the theory part. All your videos were very helpful! I hope I'm going to get a good grade on the exam. :)
@ajanasoufiane3903
@ajanasoufiane3903 5 years ago
Great video, it would be nice if you could show the big picture through the SVD decomposition :)
@knp4356
@knp4356 4 years ago
Hey Ritvik, it would be great if you could generate some problems for viewers to solve. Watching is great, but supplementing with actual problems would drive the points into viewers' heads. You could then post solutions on your Medium site. Hopefully at least 4-5 problems per video. I've watched many videos on DS subjects, but something in your teaching method makes it simpler to understand. Thanks.
@ritvikmath
@ritvikmath 4 years ago
I honestly really appreciate that you're trying to help me be more effective at what I do. I think it's a great idea and I'll look into it. Thanks :)
@jaivratsingh9966
@jaivratsingh9966 2 years ago
Simply excellent!
@Chill_Magma
@Chill_Magma 1 year ago
Seeing your videos increases my confidence in math stuff :DDD
@berkoec
@berkoec 3 years ago
Such a well-explained video - keep up the great work!
@ritvikmath
@ritvikmath 3 years ago
Thanks a ton!
@cll2598
@cll2598 2 months ago
Epic explanation
@suvikarhu4627
@suvikarhu4627 2 years ago
@ritvikmath 5:02 I don't understand where this projection formula (proj(xi) = (u^T xi) u) comes from. The projection video does not say that. What the projection video says is proj(xi) = (xi · u) u. No transpose there! Where did you get that transpose from? And isn't the dot product missing? Another question: at 5:50, why do you take only the magnitude of the vector?
@ahmadawad4782
@ahmadawad4782 4 years ago
Watched many videos about linear algebra and PCA. You're the one who made it clear for me. Thanks!
@Rockyzach88
@Rockyzach88 1 year ago
Just finished the LA section in the Deep Learning book, and I can tell this is going to help supplement and fill in the gaps in my understanding. Good vid.
@ritvikmath
@ritvikmath 1 year ago
I hope so!
@mwave3388
@mwave3388 2 years ago
I'm preparing for a job interview. Thanks, the best PCA video I found.
@543phi
@543phi 4 years ago
Thanks for this video! As a Data Science student, your lecture helped clarify a lot... I appreciate your teaching style.
@mainakmukherjee3444
@mainakmukherjee3444 1 year ago
We find the expression for the variance along the vector onto which we are going to project the data, and then maximize it, because the vector for which the variance is highest (max eigenvalue) retains the most information about the data after dimensionality reduction.
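In symbols, the quantity that comment describes, with S the sample covariance matrix and the maximization taken over unit vectors:

```latex
\operatorname{Var}\left( u^{\top} x \right)
= \frac{1}{n} \sum_{i=1}^{n} \left( u^{\top} (x_i - \bar{x}) \right)^{2}
= u^{\top} S u, \qquad \max_{\| u \| = 1}\; u^{\top} S u .
```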
@martinw.9786
@martinw.9786 2 years ago
Thank you very much for the explanations; very, very well done. Your references to the mathematical background are key!
@proxyme3628
@proxyme3628 1 year ago
Brilliant explanation of why the eigenvector is the one that comes out of the maximization; I never saw such a great explanation before. I wish your course were on Coursera. I do not think any textbook explains the eigenvalue as a Lagrange multiplier and the eigenvector as maximizing variance. Thanks so much.
@MaxDavidsonArgentina
@MaxDavidsonArgentina 4 years ago
Thanks for sharing your knowledge. It's great to have people like you helping out!
@erfanbayat3974
@erfanbayat3974 3 months ago
this video is amazing
@gc6327
@gc6327 4 years ago
Hi Ritvik, can you do a video on factor analysis? That would be huge! Thanks buddy!
@nuamaaniqbal6373
@nuamaaniqbal6373 2 years ago
Can't thank you enough!! You are truly the boss!
@sandeepc2833
@sandeepc2833 4 years ago
Cleared most of my doubts. Thanks a lot.
@paulntalo1425
@paulntalo1425 3 years ago
You have made it clear. Thank you
@GeoffryGifari
@GeoffryGifari 2 months ago
Hmmm, I noticed that if two categories are strongly correlated, the plot will look close to a straight line. Going to multidimensional space, that "line" looks like the vector u1 in the video, onto which the data are projected. Does that mean PCA will perform better the more correlated two (or more) categories are?
@nahidakhter8646
@nahidakhter8646 3 years ago
Beautifully explained! Thanks so much!
@kisholoymukherjee
@kisholoymukherjee 2 years ago
Hi ritvik, thanks for the video. Can you please tell me how the vector projection formula is being used to calculate the projection of xi on u here? The formulas in the two videos seem to be quite different. I would really appreciate it if you could help me understand the underlying math.
@ArpitAnand-yd7tr
@ArpitAnand-yd7tr 1 year ago
That's just a dot product between the potential u1 and Xi. It gives the magnitude of the projection in the direction of the unit vector u.
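In other words, the two formulas agree whenever u is a unit vector, since the dot product and the matrix product are the same scalar:

```latex
\operatorname{proj}_{u}(x_i)
= \frac{x_i \cdot u}{u \cdot u}\, u
\;\overset{\| u \| = 1}{=}\; (x_i \cdot u)\, u
= \left( u^{\top} x_i \right) u .
```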
@muhammadghazy9941
@muhammadghazy9941 2 years ago
Thank you man, appreciate it
@Sriram-kj6kl
@Sriram-kj6kl 2 years ago
Your videos help a lot, man. Thank you 👍
@kakabudi
@kakabudi 2 years ago
Really great video! Thanks for explaining this concept wonderfully!
@aravindsaraswatula2561
@aravindsaraswatula2561 2 months ago
Awesome video
@user-kw6ib6ks1q
@user-kw6ib6ks1q 6 months ago
Great explanation. Really appreciate it. Thanks!
@ritvikmath
@ritvikmath 6 months ago
Glad it was helpful!
@cameronbaird5658
@cameronbaird5658 1 year ago
Phenomenal video, thank you for the hard work 👏
@Cybrean1
@Cybrean1 3 years ago
Excellent presentation and delivery … wish you all the success!
@ritvikmath
@ritvikmath 3 years ago
Thank you! You too!
@yurongluo447
@yurongluo447 6 months ago
Your video is helpful for us. Can you create one video explaining Independent Component Analysis in detail? Thanks.
@Markks100
@Markks100 1 year ago
I don't understand why the projected form of Xi on U1 is (U1^T Xi)U. From your lecture on vector projections, P = (X·U)U, so why the change?
@santiagolicea3814
@santiagolicea3814 1 year ago
This is a great explanation, thanks a lot. It'd be great if you could also make a video showing a practical example with some dataset, showing how you use the eigenvector projection matrix to transform the initial dataset.
@nandhinin799
@nandhinin799 4 years ago
Clearly explained, helped me greatly in understanding the basis of PCA.
@simranjoharle4220
@simranjoharle4220 1 year ago
Your videos are extremely helpful! Thank you!
@ritvikmath
@ritvikmath 1 year ago
Glad you like them!
@arun_kanthali
@arun_kanthali 2 years ago
Great explanation. Thank you 👍
@diegolazareno8020
@diegolazareno8020 5 years ago
Never stop making these videos!!! One on Logistic Regression would be nice
@ritvikmath
@ritvikmath 5 years ago
Hey I appreciate the kind words! I do have a vid on logistic regression here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9zw76PT3tzs.html
@brianogrady37
@brianogrady37 2 months ago
I wish you had specified earlier on which values represent the Principal Components. But great video regardless.
@XXZSaikou
@XXZSaikou 5 months ago
Nicely explained! But I noticed you didn't mention the need to standardize the original data for PCA. Is standardization a little trick to make things faster, or is it needed in the underlying math?
@deplo
@deplo 4 years ago
Hi Ritvikmath, thank you for your super informative videos! I took all the courses on this topic, but I was wondering if you could expand it with factor analysis and correspondence analysis. It would be interesting to know how different methods work and relate to each other, because it would provide a deeper perspective. Thanks
@thirumurthym7980
@thirumurthym7980 3 years ago
At 4:54 you refer to the projection video for how you arrive at the projection formula. There is no mention of U transpose in that projection video.
@shashanksundi5669
@shashanksundi5669 3 years ago
Just perfect !! Thank you :)
@quark37
@quark37 1 year ago
Fun video, thank-you. And thanks for all the pre-req videos. Question: I've seen other videos that describe PCA vectors as orthogonal, but using eigenvectors they would not necessarily be orthogonal, right? What is the correct way to think about the orthogonality of PCA vectors? Thanks. *I think I answered my own question: the eigenvectors in question are those of the covariance matrix of the related variables. That matrix is symmetric, so the eigenvectors will be orthogonal. Correct?
@DarkShadow-tm2dk
@DarkShadow-tm2dk 3 years ago
I HOPE YOU WILL REPLY 🛑🛑🛑 Aren't we supposed to standardize the data before applying PCA? And if we do standardize the data, then the mean = 0, so at 8:12 the second part of the equation gets cancelled, right? So the equation changes
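For what it's worth, that reading is consistent with the formula: if the data are centered so that the mean is zero, the mean term drops out and the projected variance simplifies:

```latex
\frac{1}{n} \sum_{i=1}^{n} \left( u^{\top} (x_i - \bar{x}) \right)^{2}
\;\xrightarrow{\;\bar{x} = 0\;}\;
\frac{1}{n} \sum_{i=1}^{n} \left( u^{\top} x_i \right)^{2} .
```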
@AG-dt7we
@AG-dt7we 6 months ago
Thanks for such amazing videos. I have one question: in the projection video you derived the projection as (X · V / ||V||) u, but here you started with (u1^T Xi) u. What is the difference? It would be helpful if you could point me to some resources!
@subhabhadra619
@subhabhadra619 2 years ago
Awesomely presented.
@herberthubert6828
@herberthubert6828 3 years ago
you rock, thank you
@mmarva3597
@mmarva3597 3 years ago
Thank you very much!! Really helpful
@404nohandlefound
@404nohandlefound 1 year ago
Could you please explain how this links to SVD?
@poornanagasai262
@poornanagasai262 1 year ago
It's really a great explanation. One question: from the vector projection video, it is clear that the projection onto a vector u has the value (u·x)u, where (u·x) is the magnitude and u is the unit vector. Here comes my question: in this video (the math behind PCA) you used (u^T x)u for the projected vector. What is the difference between using u and u^T (u transpose)? Can you please answer me?
@rajathjain314
@rajathjain314 4 years ago
Very Intuitive, Great Job Ritvik!
@fahimfaisal4660
@fahimfaisal4660 2 years ago
Excellent
@rabiizahir2885
@rabiizahir2885 2 years ago
Thanks a lot.
@user-or7ji5hv8y
@user-or7ji5hv8y 3 years ago
I think the links to those videos are missing in the notes section.
@erlint
@erlint 9 months ago
Shouldn't you also constrain u_2 to be orthogonal to u_1 if you want 2 dimensions? That way each dimension's principal component will be orthogonal to the others. Or is that just an SVD and PCA relationship thing?
@seetaramdantu3190
@seetaramdantu3190 3 years ago
Excellent... well explained
@ritvikmath
@ritvikmath 3 years ago
Glad it was helpful!
@MohamedMostafa-kg6gk
@MohamedMostafa-kg6gk 3 years ago
Thank you for this great explanation.
@ritvikmath
@ritvikmath 3 years ago
You are welcome!
@georgegkenios486
@georgegkenios486 3 years ago
Amazing work mate!
@ritvikmath
@ritvikmath 3 years ago
Thanks a lot!
@alejandropalaciosgarcia2767
@alejandropalaciosgarcia2767 3 years ago
Bro, you are awesome
@Tankwell-cq5ky
@Tankwell-cq5ky 2 years ago
Very well presented - well done! 😊😊
@ernestanonde3218
@ernestanonde3218 2 years ago
great video
@danfirth3017
@danfirth3017 4 months ago
My only question: my lecture notes use the variance matrix instead of the covariance matrix. Is this an issue, or does it still work?
@AshishKGor
@AshishKGor 2 years ago
Thanks sir.
@PR-ud4fp
@PR-ud4fp 1 year ago
Thanks 😊