
Principal Component Analysis (PCA) | Part 2 | Problem Formulation and Step by Step Solution

CampusX
250K subscribers · 83K views
Published: 26 Oct 2024

Comments: 173
@ADESHKUMAR-yz2el · 1 year ago
"Guru is Brahma, Guru is Vishnu, Guru is the god Maheshwara; the Guru is verily the supreme Brahman, salutations to that Guru." You are the real teacher, sir. 💫
@katadermaro · 3 years ago
This is hands down the best PCA explanation I have seen on the internet. Period!
@abhinavkatiyar6950 · 2 years ago
True
@sidindian1982 · 2 years ago
indeed
@tapanmahata8330 · 1 month ago
No
@anushilabiswas6947 · 7 days ago
@@tapanmahata8330 So who is the better one?
@sudhanshusingh5594 · 2 years ago
You are the perfect example of feature extraction, sir: you convert high-dimensional knowledge into 2D and make it easy to understand. Thank you so much, sir.
@amartyatalukdar1024 · 8 months ago
I have studied eigenvalues and eigenvectors, multiple times, but this video explained to me the depth of it in a very simple way! One of the best teachers out there.
@HarshSingh-ms1tu · 1 month ago
Arre bro, hello.
@ug1880 · 7 days ago
What an explanation! I had been searching for this explanation for many months...
@11aniketkumar · 1 year ago
"Guru is Brahma, Guru is Vishnu, Guru is the god Maheshwara; the Guru is the supreme Brahman himself, salutations to that Guru." I bow to you countless times, sir, for sharing this knowledge with all of us through YouTube.
@ujjalroy1442 · 16 days ago
The rarest and best discussion of PCA... Thanks a lot, sir.
@bibhutibaibhavbora8770 · 11 months ago
OMG, in one video he explains the most difficult linear algebra topic, applies it to machine learning, and also shows us the code. Hats off!
@Lets_do_code-vl7im · 10 months ago
It's a three-part video, and he explains it in a way that even a layman can understand the concept.
@AltafAnsari-tf9nl · 1 year ago
You are simply amazing. I bet no one can teach PCA so well.
@avinashpant9860 · 2 years ago
I am always amazed at how important the concepts of eigenvectors and eigenvalues are; they are among the most important ideas in quantum mechanics. Every operator (e.g., energy, momentum) in quantum mechanics is a linear operator, and our aim is usually to find the corresponding eigenvectors and eigenvalues. The time-independent Schrödinger equation usually takes the form of an eigenvalue equation, Hψ = Eψ. It's so amazing to see these concepts finding their role in machine learning as well. My love for math keeps on growing. As always, thank you for your amazing videos.
@lightyagami7085 · 2 years ago
No one has ever explained eigenvectors in such a simple way. You are awesome!!
@abdulwahabkhan1086 · 3 days ago
The dedication that shows in your content proves what a good teacher you are. I always used to wonder what I would ever do after learning eigenvectors and eigenvalues. Today this video gave me the answer. Many thanks!
@pulimiyashwanth9925 · 1 year ago
This channel is so underrated; with this PCA video anyone can understand dimensionality reduction. Thank you, sir, for the hard work.
@nishantgoyal6657 · 9 months ago
One of the best videos I have found on PCA. You have great skills, brother.
@farhansarguroh8680 · 1 year ago
My head hurts; this is so descriptive, apt, and worth all the time. The best. Kudos!
@alkalinebase · 1 year ago
Honestly, everything I know I owe to you. Thank you for being the real HERO in need!!!
@sahilkirti1234 · 7 months ago
00:02 PCA aims to reduce dimensionality while maintaining data essence.
02:55 Projection and unit vectors for PCA.
10:27 Principal Component Analysis (PCA) helps find the direction of maximum variance.
12:48 Variance measures the spread of data.
19:22 PCA helps in understanding the spread and orientation of data.
21:56 PCA provides complete information about data spread and orientation.
27:10 PCA involves transformations and changing directions of vectors.
29:39 A linear transformation does not change an eigenvector's direction.
34:24 PCA uses eigenvectors for linear transformation.
36:36 PCA helps identify vectors with the highest variation in the data.
41:55 PCA allows transforming data and creating new dimensions.
44:15 PCA involves transforming the dataset to a new coordinate system.
49:14 Using PCA to find the best two-dimensional representation of 3D data.
52:07 PCA involves transforming and transporting the data.
Crafted by Merlin AI.
@krishnendubarman8490 · 1 year ago
You are really a good teacher. I am at IIT Bombay, Environmental Engineering, M.Tech, but I wanted to learn ML, and this playlist is so far the most understandable for me.
@shubhamagrahari9745 · 11 months ago
Bro, you would have been better off doing CS at a private college instead.
@bahubaliavenger472 · 3 months ago
@@shubhamagrahari9745 Lmao
@narendraful · 1 month ago
Great explanation!! Thanks for putting in so much effort and time!! Can't believe all of this is free!!
@laxmimansukhani1825 · 1 month ago
Awesome!!! Thank you for going to such depths and visualizations to teach PCA!! Proud to learn from you!! 👃👃
@vibhasharma6097 · 2 months ago
It's 4 in the morning, and I can say that I understood PCA from this particular video alone... best explanation!!! PERIODT.
@bhavikpunmiya9641 · 7 months ago
Thank you so much, sir. You not only cleared my doubts about how PCA works, but also, for the first time, gave me the mathematical intuition behind eigenvalues, eigenvectors, and even matrix transformations, which I had been trying to learn for so many years. The best explanation I've seen on this topic.
@samikshakolhe5086 · 2 years ago
The most epic PCA explanation ever seen on YouTube, never yet done by any DS YouTuber. Hats off to your teaching skills, sir.
@world4coding · 1 year ago
Sir, watching this was pure joy. What can I even say now, sir. This was the best PCA video. Love you, sir.
@singnsoul6443 · 11 months ago
I am blown away by understanding the true meaning of eigenvectors. I always knew the definition, but now I have understood the meaning. You are a savior!
@rajpoot_011 · 3 months ago
Sir, this is really the best explanation of PCA. I was struggling to learn PCA before, but after watching this video most of my concepts are clear. Thank you, sir, for this valuable content.
@satyamgupta4808 · 1 year ago
Sir, no one on YouTube taught this so intuitively. Even paid courses can't teach in this much depth.
@varunahlawat9013 · 1 year ago
The most satisfying machine learning lecture I've ever seen, by far 🤩🤩
@akshaythakor5501 · 10 months ago
I was actually learning PCA for the first time. When I watched the video the first time I didn't understand it, but when I watched it a second time all the topics became very clear. This video is amazing.
@ChandrakalaKurubaTechieStuffs · 2 months ago
Nitish bhaiya, you nailed it.
@vashugarg2072 · 1 year ago
The best teaching skills I have ever seen, for all machine learning concepts. Hats off to you, sir! 🎉🎉🎊🎊
@QAMARRAZA-pm6nc · 7 months ago
How can I thank you? What a wonderful teacher, available for free, for the help of so many students.
@aounhaider8335 · 1 year ago
You have cleared up a concept that no other instructor on YouTube explained well. Great job! ❤❤
@soumilyade1057 · 1 year ago
45:44 I think it's going to be (3,1), and when transposed it becomes (1,3), which is then multiplied with the matrix representing the dataset: (1,3) × (3,1000). This representation is valid too.
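A minimal NumPy sketch of the shapes this comment describes; the 3-feature × 1000-point dataset and the unit-vector values are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 1000))             # hypothetical dataset: 3 features x 1000 points
u = np.array([[1.0], [2.0], [2.0]]) / 3.0  # a (3,1) unit vector (norm = 1)

proj = u.T @ X                             # (1,3) @ (3,1000) -> (1,1000)
print(proj.shape)                          # (1, 1000): one projected coordinate per point
```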
@tr-GoodVibes · 2 years ago
This is called teaching! Thanks for this wonderful explanation.
@pravinshende.DataScientist · 2 years ago
Wow, what content! You are playing a big role in making me a data scientist. Thank you, sir!
@akshaypatil8155 · 1 year ago
Hi Pravin, if you have got the job, could you guide me a little? I have questions about how work is distributed in the data science department of a company and how the department operates. Could you please share your email?
@shubhamagrahari9745 · 11 months ago
@@akshaypatil8155 no 👎
@nitinchityal583 · 1 year ago
Speechless... you deserve a million subscribers at least.
@campusx-official · 1 year ago
Natural Language Processing (NLP): ru-vid.com/group/PLKnIA16_RmvZo7fp5kkIth6nRTeQQsjfX
@ali75988 · 10 months ago
A small explanation of the shortcut in the lecture at 16:04: the actual covariance formula subtracts xmean and ymean, but here both means were zero, which is why the shortcut sum(x*y)/3 works. The full formula is covariance(x,y) = sum[(x - xmean)(y - ymean)] / n. For the same reason, the covariance matrix has variances on its diagonal (22:57): when both features are the same x, covariance(x,x) = sum[(x - xmean)(x - xmean)] / n, which is exactly the formula for variance.
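A quick NumPy check of this observation, with made-up zero-mean values; it shows the shortcut agreeing with the full covariance formula, and cov(x, x) reducing to the variance:

```python
import numpy as np

x = np.array([-2.0, 0.0, 2.0])   # toy zero-mean feature (values assumed)
y = np.array([-1.0, 0.0, 1.0])   # toy zero-mean feature

general  = np.mean((x - x.mean()) * (y - y.mean()))  # full covariance formula
shortcut = np.mean(x * y)                            # valid only because both means are 0
print(general, shortcut)                             # identical: 1.333... each

print(np.mean((x - x.mean()) ** 2))  # cov(x, x) is just the variance of x
```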
@rajshah7734 · 2 months ago
Thanks, this really helped me!!
@hasanrants · 1 month ago
Thank you, Nitish, for this masterclass explanation.
@zeronpz · 2 months ago
The way you simplify things 👏
@vijendravaishya3431 · 3 months ago
For those wondering why there are three eigenvectors every time: the covariance matrix is symmetric, and a real symmetric n×n matrix has n linearly independent, orthogonal eigenvectors. The zero vector is not considered an eigenvector even though it satisfies Ax = λx; likewise, there can be up to n linearly independent eigenvectors for an n×n symmetric matrix.
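A small NumPy check of this property, using random made-up data; np.linalg.eigh is the eigensolver intended for symmetric matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(100, 3))        # made-up data: 100 samples x 3 features
C = np.cov(data, rowvar=False)          # 3x3 covariance matrix: real and symmetric

w, V = np.linalg.eigh(C)                # eigh handles symmetric matrices
print(np.allclose(V.T @ V, np.eye(3)))  # True: the 3 eigenvectors are orthonormal
```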
@ritugujela8345 · 2 years ago
Thank you so much, sir. You always leave us awestruck with your remarkable explanations and in-depth knowledge. I never knew this topic could be explained with this much clarity. The teacher I never knew I needed in my life ❤️✨
@responsiblecitizens8422 · 6 days ago
GOAT video on PCA.
@soyam7 · 1 month ago
Bro, I feel like crying every time. Why is this guy so good? 😭💓💓💓💯💯
@SonuK7895 · 2 years ago
This channel is a goldmine 🙌
@monicakumar6769 · 1 year ago
This is the best video I have watched on this topic!
@rafibasha1840 · 2 years ago
Thanks for the excellent video, bro. @16:21, in covariance we subtract the mean from the values and then multiply, right?
@deepanshugoel3790 · 1 year ago
You have made a complex topic like PCA so easy for us to understand.
@bangarrajumuppidu8354 · 3 years ago
What an amazing explanation, very intuitive. I am following your whole series, sir.
@muhammadumair1280 · 2 years ago
Love from Karachi, Pakistan.
@osho_magic · 1 year ago
I have watched and understood the linear algebra playlist at 3Blue1Brown, but you clarified my doubts even further. Thanks.
@susamay · 1 year ago
You are the best, Nitish. Thanks for all these.
@priyadarshichatterjee7933 · 11 months ago
The most elegant explanation of eigenvectors and eigenvalues I have ever seen. Thanks, sir, for this one.
@ShahFahad-ez1cm · 3 months ago
We are not actually projecting the vectors onto the unit vector; we are projecting them onto the line spanned by the unit vector.
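A tiny NumPy sketch of that distinction, with made-up numbers: x·u is the coordinate of a point along the line spanned by the unit vector u, and scaling u by that coordinate gives the projected point on the line:

```python
import numpy as np

u = np.array([0.6, 0.8])   # unit vector defining the line (norm = 1)
x = np.array([3.0, 1.0])   # an arbitrary data point

t = x @ u                  # scalar coordinate of x along the line
p = t * u                  # the projected point, which lies on the line
print(t, p)                # 2.6 [1.56 2.08]
```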
@anoopkaur6119 · 1 year ago
Awesome video. OMG, you explained every concept so clearly. Thanks a lot, sir.
@sudiptahalder423 · 1 year ago
Best explanation of PCA I've ever seen!! ❤️
@ssh0059 · 5 months ago
Wow, the best video on PCA on the internet.
@aryastark4064 · 11 months ago
You are a saviour to my sinking boat ❣. Thanks a lot.
@TheMLMine · 10 months ago
Great step-by-step explanation
@Rupesh_IITBombay · 8 months ago
Thank you, sir. Such a crisp explanation...
@amirman6 · 2 years ago
A pretty good explanation of doing PCA computationally without using sklearn.
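For readers who want that pipeline in code: a minimal from-scratch PCA sketch in NumPy, following the steps discussed in the video (center, covariance, eigendecomposition, sort, project); the data here is random and purely illustrative:

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples, n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)           # 1. center each feature
    C = (Xc.T @ Xc) / len(Xc)         # 2. covariance matrix
    w, V = np.linalg.eigh(C)          # 3. eigendecomposition (C is symmetric)
    order = np.argsort(w)[::-1]       # 4. sort eigenvalues, largest first
    W = V[:, order[:k]]               # 5. top-k eigenvectors as columns
    return Xc @ W                     # 6. project onto the new axes

X = np.random.default_rng(0).normal(size=(1000, 3))
print(pca(X, 2).shape)                # (1000, 2)
```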
@pravinshende.DataScientist · 2 years ago
"Wow" is my first reaction after watching this video...
@morancium · 1 year ago
One small correction at 52:55: the eigenvectors are the COLUMNS of the matrix returned by np.linalg.eig(), not the rows, which you have used... please correct me if I am wrong.
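The comment matches NumPy's documented convention: for np.linalg.eig, column v[:, i] is the eigenvector for eigenvalue w[i]. A small check with a made-up non-symmetric matrix (chosen so that rows and columns actually differ):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # non-symmetric, so rows and columns differ
w, V = np.linalg.eig(A)

for i in range(2):
    print(np.allclose(A @ V[:, i], w[i] * V[:, i]))  # True: column i is an eigenvector
    print(np.allclose(A @ V[i, :], w[i] * V[i, :]))  # False: row i generally is not
```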
@sameerabanu3115 · 1 year ago
Extremely superb explanation, kudos!
@adityabhatt04 · 2 years ago
This is even better than Josh Starmer's video.
@gauravpundir97 · 1 year ago
Great explanation @Nitish
@pravinshende.DataScientist · 2 years ago
Your content is the best ever! Thank you, sir!
@islamicinsights6342 · 3 years ago
Wow, thanks sir, you are the best. But why haven't you made further videos on unsupervised learning? I am waiting. Please reply.
@ParthivShah · 7 months ago
Thank You Sir.
@pavangoyal6840 · 1 year ago
Request you to continue the deep learning series.
@pramodshaw2997 · 2 years ago
God bless you. Wonderful session.
@ashishshejwal8514 · 1 year ago
Speechless, too good to grasp.
@jiteshsingh6030 · 2 years ago
I am going mad 😶; you are truly a legend 🔥
@nitinchityal583 · 1 year ago
Please do a series on time series analysis and NLP.
@anshuman4mrkl · 3 years ago
Amazingly explained. 🤩👏🏻
@namansethi1767 · 2 years ago
Thanks, sir, for this amazing explanation.
@kindaeasy9797 · 9 months ago
The largest eigenvector will correspond to the largest eigenvalue, but more than one eigenvector can correspond to a single eigenvalue; in fact, there is an entire eigenspace (except the 0 vector, of course)!! In the R2 plane, there are uncountably many eigenvectors corresponding to the largest eigenvalue.
@kindaeasy9797 · 9 months ago
Oh, I figured it out. I think if the eigenvectors are linearly dependent they have the same direction, and direction is what matters; and if we have a linearly independent one, then we have one more u that works equally well.
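A quick NumPy check of the point made in these two comments, with a made-up symmetric matrix: any nonzero scalar multiple of an eigenvector is still an eigenvector for the same eigenvalue, so the whole line (eigenspace) shares one direction:

```python
import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])                # toy symmetric matrix, eigenvalues 1 and 3
w, V = np.linalg.eig(C)
v = V[:, np.argmax(w)]                    # unit eigenvector of the largest eigenvalue

for k in (2.0, -5.0, 0.1):                # any nonzero multiple stays an eigenvector
    print(np.allclose(C @ (k * v), w.max() * (k * v)))  # True each time
```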
@descendantsoftheheroes_660 · 1 year ago
Guru ji, where are your feet (so I may touch them)... God bless you, sir.
@core4032 · 2 years ago
Sir, I want to know how the libraries work internally. It would help if you gave a basic explanation of that; otherwise, this series is very awesome.
@faizanhussain1411 · 15 days ago
If you are facing an error, you should know that in pandas version 2.0 and above the append() method is deprecated and removed. Replace append() with concat().
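A minimal sketch of the fix this comment suggests; the DataFrame contents and column names are hypothetical:

```python
import pandas as pd

df  = pd.DataFrame({"PC1": [0.1], "PC2": [0.2]})   # hypothetical column names
row = pd.DataFrame({"PC1": [0.3], "PC2": [0.4]})

# df = df.append(row)                         # worked before pandas 2.0, now removed
df = pd.concat([df, row], ignore_index=True)  # the pandas 2.0+ replacement
print(df)
```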
@sushantsingh1133 · 5 months ago
You are the best in the business.
@abhijitkumar7831 · 5 months ago
Amazing Tutorial
@pramodshaw2997 · 2 years ago
One question: do we need to sort the eigenvectors by their eigenvalues, largest first, and then choose the eigenvectors accordingly? Also, does the sum of the top K eigenvalues show how many eigenvectors we need to take (in the case of high-dimensional data)?
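A sketch of one common way to do what this question asks, with a made-up 3×3 covariance matrix: sort the eigenvalues in descending order, keep the eigenvectors aligned, and pick K from the cumulative explained-variance ratio:

```python
import numpy as np

C = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 1.0]])      # toy covariance matrix (assumed)

w, V = np.linalg.eigh(C)             # eigh returns eigenvalues in ascending order
order = np.argsort(w)[::-1]          # largest eigenvalue first
w, V = w[order], V[:, order]         # keep eigenvectors aligned with eigenvalues

ratio = np.cumsum(w) / w.sum()       # cumulative explained-variance ratio
k = int(np.searchsorted(ratio, 0.95)) + 1  # smallest k that explains >= 95%
print(w.round(2), ratio.round(3), k)
```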
@brajesh2334 · 3 years ago
Excellent explanation...
@AyushPatel · 3 years ago
Sir, I just wanted to ask: can we write our own machine learning algorithms from scratch instead of using sklearn and TensorFlow? Please make a video about that. I have been following your whole series. Sir, do reply. Thanks for your efforts.
@campusx-official · 3 years ago
Yes, we can do that. Will make videos on those topics sometime in the future.
@AyushPatel · 3 years ago
@@campusx-official OK, thanks. Will that video come in this playlist?
@RitikaSharma-pt9ox · 1 year ago
Thank you, sir, for this amazing video.
@utkarshtripathi9118 · 1 month ago
Sir, can you teach us how to fine-tune models from the basics, please?
@somanshkumar1325 · 2 years ago
Brilliant explanation! Thank you so much :)
@gauravagrawal8078 · 10 months ago
Excellent explanation.
@lijindurairaj2982 · 2 years ago
This was very useful for me, thank you :)
@harsh2014 · 1 year ago
Thanks for the explanations!
@josebgeorge227 · 6 months ago
Hi sir, just to understand the concept well: when we transform the data D, do we use the matrix of eigenvectors (calculated from the covariance matrix), or the covariance matrix itself? It's the matrix of eigenvectors, right?
@AbcdAbcd-ol5hn · 1 year ago
😭😭😭😭thanks a lot sir, thank you so much
@user-up2vg6ld4l · 20 days ago
While calculating the variance of the data on the u axis, why are we using the mean xi of the original data (the data on the two axes)?
@DataScienceWithAhmad · 2 months ago
Amazing, sir. Love you.
@BAMEADManiyar · 1 year ago
I think there can only be n eigenvalues for an n×n matrix, and n unit eigenvectors for it, but there can be as many eigenvectors as you like: just multiply a unit eigenvector by some scalar k to get more eigenvectors. :)
@saru_mangla · 1 year ago
Just so awesome! I can't describe it!
@suchetharchand2725 · 2 months ago
Bro, what a lesson you've taught. Thank you!
@VIP-ol6so · 6 months ago
clearly explained
@hello-iq6yz · 1 year ago
Amazing clarity!!!