Principal Component Analysis (PCA) | Part 1 | Geometric Intuition

CampusX
227K subscribers
87K views

This video focuses on providing a clear geometric intuition behind PCA. Learn the basics and set the foundation for understanding how PCA simplifies your data while preserving the important information in it.
============================
Do you want to learn from me?
Check my affordable mentorship program at: learnwith.camp...
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:44 - What is PCA
05:16 - Benefits of using PCA
07:33 - Geometric Intuition
25:01 - What is variance and why is it important?
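For readers who want to follow along in code, here is a minimal sketch of the kind of dimensionality reduction the video builds intuition for. It assumes scikit-learn and uses a small made-up dataset, not the example from the video:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                    # 100 samples, 3 features
X[:, 2] = X[:, 0] + 0.1 * rng.normal(size=100)   # make the third feature nearly redundant

pca = PCA(n_components=2)                        # keep the 2 directions with the most variance
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                           # (100, 2)
print(pca.explained_variance_ratio_)             # share of total variance kept by each new axis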

Published: 18 Aug 2024
Comments: 85
@aienthu2071 · 1 year ago
So grateful for the videos that you make. I have burnt my pockets and spent hours on various courses just for the sake of effective learning, but most of the time I end up coming back to CampusX videos. Thank you so much.
@hritikroshanmishra3630 · 1 year ago
So, did it work out?
@investmentplan880 · 8 months ago
Same
@henrystevens3993 · 2 years ago
Unbelievable... nobody taught me PCA like this. Sir, 5/5 for your teaching 🙏🙏 God bless you ❤️
@vikramraipure6366 · 2 years ago
I am interested in doing group study with you, reply to me bro.
@prasadagalave9762 · 5 months ago
04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
08:30 PCA transforms higher-dimensional data into lower-dimensional data while preserving its essence.
12:45 Feature selection involves choosing the most important features for predicting the output.
17:00 Feature selection is based on the spread of data along different axes.
21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
25:30 PCA finds new coordinate axes that maximize variance.
29:45 Variance is a good measure to differentiate the spread of two data sets.
33:54 Variance is important in PCA to maintain the relationships between data points when reducing dimensions.
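The "spread along different axes" idea in this summary can be reproduced in a couple of lines; the numbers below are made up for illustration and are not the ones used in the video:

import numpy as np

rooms     = np.array([2, 3, 3, 4, 5, 6])   # well spread out along this axis
washrooms = np.array([1, 1, 2, 2, 2, 2])   # almost constant along this axis

print(rooms.var(), washrooms.var())        # rooms has the larger variance
# Dropping the low-variance axis (washrooms) loses less information than dropping
# the high-variance one; PCA generalises this by first rotating the axes so that
# the directions it keeps carry the maximum possible variance.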
@SumanPokhrel0 · 1 year ago
It's like I'm watching a Netflix series: at first you're introduced to different terms and methods and their use cases, and in the last 10 minutes of the video everything adds up and you realize what these strategies are and why they are in use. Amazing.
@rafibasha1840 · 2 years ago
You have done good research on every topic bro, nice explanation. I am so happy I found this channel, and at the same time I feel bad for not finding it earlier.
@DataScienceSchools · 2 years ago
Exactly. same here
@makhawar8423 · 1 year ago
Can't get a better understanding of PCA than this. Saved so much time and energy. Thanks a lot.
@krishnakanthmacherla4431 · 2 years ago
Wow, I regret not getting to this channel earlier. Very clear, told like a story; I could explain it to a 6-year-old and make them understand ❤️👏
@geetikagupta5 · 1 year ago
I am loving this channel more and more every time I see a video here. The way content is presented and created is really awesome. Keep inspiring and motivating us. I am learning a lot here.
@kirtanmevada6141 · 8 months ago
Totally an awesome playlist for learning data science/mining or ML. Thank you so much sir! Means a lot!!
@eshandkt · 2 years ago
One of the finest explanations of PCA I have ever seen. Thank you, sir!
@nawarajbhujel8266 · 7 months ago
To have this level of teaching, one should have a deep understanding of both the theoretical and the practical aspects. You have proved it again. Thanks for providing such valuable teaching.
@sidindian1982 · 2 years ago
Sir, the way you explained the curse of dimensionality and its solutions in the previous video -- just mind-blowing. YOU ARE GOD.
@avinashpant9860 · 1 year ago
Awesome explanation, and the best part is how he drops important info in between the topic; the interpretation of the scatter plot in this video is so good that I wouldn't find it even in a dedicated scatter plot video. So perfect.
@prankurjoshi9672 · 1 year ago
No words to express how precious your teaching is....
@ParthivShah · 5 months ago
Thank You Sir.
@rahulkumarram5237 · 1 year ago
Beautifully explained !!! Probably the best analogy one could come up with. Thank you, sir.
@fahadmehfooz6970 · 1 year ago
Never have I seen a better explanation of PCA than this!
@zaedgtr6910 · 9 months ago
Amazing explanation... No one can explain PCA as easily as you have done. Better than IIT professors.
@narmadaa2106 · 10 months ago
Excellent, sir. I have listened to different video lectures on PCA, but I didn't understand it properly. Yours is the best one. Thank you so much.
@SameerAli-nm8xn · 1 year ago
First of all, the playlist is amazing; you have done a really good job explaining the concepts and intuitions behind the algorithms. I was wondering, could you create a separate playlist for the ARIMA, SARIMAX and LSTM algorithms? I really want to see those algorithms covered in a future class.
@raghavsinghal22 · 2 years ago
Best video on PCA. I'll definitely recommend it to my friends 🙂
@balrajprajesh6473 · 2 years ago
I thank God for blessing me with this teacher.
@shubhankarsharma2221 · 1 year ago
Very nicely explained topics. One of the best teachers of ML.
@siddiqkawser2153 · 1 month ago
U rock dude! Really appreciate that
@motivatigyan6417 · 1 year ago
You are outstanding for me, sir... I'm not able to understand it until I watch your video.
@jawadali1753 · 2 years ago
Your teaching style is amazing, you are a gem.
@vikramraipure6366 · 2 years ago
I am interested in doing group study with you, reply to me bro.
@armanmehdikazmi5390 · 8 months ago
hats off to you sirrrr
@DataScienceSchools · 2 years ago
Wow, how simply you did it.
@ritesh_b · 1 year ago
Thanks for the great explanation, please keep explaining in this way.
@VIP-ol6so · 4 months ago
great example
@sachinahankari · 5 months ago
The variance of the grocery-shop feature is greater than that of the number of rooms, but you have shown the reverse.
@beb57swatimohapatra21 · 10 months ago
Best course for ML
@beautyisinmind2163 · 1 year ago
Damn, you are the Messiah in ML teaching
@harsh2014 · 1 year ago
Thanks for the explanations!
@user-nv9fk2jg5m · 9 months ago
You are so good at this, I'm like "where have you been all this time?"
@ytg6663 · 3 years ago
Very beautiful 👍👍🙏❤️🔥
@MARTIN-101 · 1 year ago
Sir, you have no idea how much you are helping data learners like me. Thanks a lot. How can I help you? Is there anywhere I can pay you as a token of appreciation?
@aadirawat4230 · 2 years ago
Such an underrated channel for ML.
@DeathBlade007 · 1 year ago
Amazing Explanation
@shivendrarajput4413 · 2 months ago
that is what we do
@Ishant875 · 7 months ago
I have a doubt: if one variable is in the range 0 to 1 and another is in the range 0 to 1000 (so it has more variance/spread), why does choosing the 2nd variable just by looking at variance make sense? It may just be a matter of units, like km vs cm. For this problem we use scaling. Am I right?
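For what it's worth, a small sketch of the standardisation step this question is about (assuming scikit-learn; the data is made up). Scaling each feature to unit variance first keeps a feature's units from dominating the principal components:

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = np.column_stack([
    rng.uniform(0, 1, 200),      # feature measured on a 0-1 scale
    rng.uniform(0, 1000, 200),   # a similar quantity expressed in different units
])

# Without scaling, the second column dominates the variance purely because of its units,
# so standardise each feature to mean 0 and unit variance before running PCA.
X_scaled = StandardScaler().fit_transform(X)
X_pca = PCA(n_components=1).fit_transform(X_scaled)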
@sahilkirti1234 · 5 months ago
you are the god
@jazz5314 · 2 years ago
Wowww!!!! Best video
@761rishabh · 3 years ago
Nice Presentation sir
@kindaeasy9797 · 7 months ago
But in the geometric intuition of PCA, if I rotate the axes clockwise, then the variance of rooms will decrease, right? And if I do the same process with washrooms on the x-axis and rooms on the y-axis, then washrooms would get selected, right??
@mukeshkumaryadav350 · 1 year ago
amazing explanation
@jiteshsingh6030 · 2 years ago
Just Wow 🔥 😍
@pankajbhatt8315 · 2 years ago
Amazing explanation!!
@qaiserali6773 · 2 years ago
Great content!!!
@rafibasha4145 · 2 years ago
Hi bro, please make videos on feature selection techniques.
@amanrajdas4540 · 2 years ago
Sir, your videos are really amazing; I have learned a lot from them. But I have a doubt about feature construction and feature extraction: they both look similar. So can you please tell me the one major difference between the two?
@vikramraipure6366 · 2 years ago
I am interested in doing group study with you, reply to me bro.
@yashjain6372 · 1 year ago
best explanation
@msgupta07 · 2 years ago
Amazing explanation... Can you share the OneNote for Windows 10 notes for this entire series, "100 Days of Machine Learning"?
@1234manasm · 1 year ago
Very nice explanation. May I know which hardware you use to write on the notepad?
@ambarkumar7805 · 1 year ago
What is the difference between feature extraction and feature construction, as both reduce the number of features?
@arpitchampuriya9535 · 2 years ago
Excellent
@pkr1kajdjasdljskjdjsadjlaskdja · 7 months ago
Bro, why is this video not going viral? Thank you sir ❤
@mustafachenine7942 · 2 years ago
Is it possible to have an example with pictures, classifying them into two categories, where the dimensions are reduced with PCA and the classification is done with KNN? That would be great, please.
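A rough sketch of that PCA-then-KNN workflow, using scikit-learn's bundled digits dataset as a stand-in for the pictures (it has ten classes rather than two; purely illustrative):

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)             # 8x8 images flattened into 64 features
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))              # accuracy using only the 20 PCA features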
@x2diaries506 · 1 year ago
Dear sir, I am confused about the variance formula and your interpretation. Kindly recheck.
@AyushPatel · 3 years ago
Sir, I just wanted to ask: can we write our own machine learning algorithms from scratch instead of using sklearn and tensorflow? Please make a video about that. I have been following your whole series. Sir, do reply. Thanks for your efforts.
@ytg6663 · 3 years ago
Yes, you can write them yourself... Yes you can...
@vikramraipure6366 · 2 years ago
I am interested in doing group study with you, reply to me bro.
@AyushPatel · 2 years ago
@vikramraipure6366 Actually, I am currently working on some other project, so... I am sorry. Thanks for the proposal!
@yashwanthyash1382 · 1 year ago
My suggestion is to use the sklearn library for existing algorithms. If that doesn't work, create your own algorithm.
@surajghogare8931 · 2 years ago
Clever explanation.
@0Fallen0 · 1 year ago
24:24 Aha! So PCA finds an alternate coordinate system and uses the change-of-basis matrix to transform the data.
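That change-of-basis view can be checked numerically. A small sketch with toy data: an eigendecomposition of the covariance matrix in NumPy, compared against scikit-learn's PCA (which may flip the sign of an axis):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [1.0, 0.5]])   # correlated toy data

Xc = X - X.mean(axis=0)                  # centre the data
cov = np.cov(Xc, rowvar=False)           # covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvectors are the new coordinate axes
order = np.argsort(eigvals)[::-1]        # largest variance first
basis = eigvecs[:, order]                # change-of-basis matrix

X_new = Xc @ basis                       # data expressed in principal-axis coordinates
X_skl = PCA(n_components=2).fit_transform(X)

print(np.allclose(np.abs(X_new), np.abs(X_skl)))   # True, up to a possible sign flip per axis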
@arshad1781 · 3 years ago
thanks
@learnfromIITguy · 1 year ago
solid
@Star-xk5jp · 7 months ago
Day3: Date:11/1/24
@ayushtulsyan4695 · 1 year ago
Bro, please put together a playlist on statistical applications in data science.
@namansethi1767 · 2 years ago
Thank you sir.
@vikramraipure6366 · 2 years ago
I am interested in doing group study with you, reply to me bro.
@namansethi1767 · 2 years ago
Done... Give me your mobile number... I will call you when I am free.
@vatsalshingala3225 · 1 year ago
❤❤❤❤❤❤❤❤❤❤❤❤❤❤
@devnayyar9536 · 1 year ago
Sir, will we get your notes?
@murumathi4307 · 3 years ago
Does this work the same as SVM? 🤔
@campusx-official · 3 years ago
Yes
@murumathi4307 · 3 years ago
@campusx-official Thank you sir... your class is awesome 🙏
@sahilkirti1234 · 5 months ago
04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
08:30 PCA transforms higher-dimensional data into lower-dimensional data while preserving its essence.
12:45 Feature selection involves choosing the most important features for predicting the output.
17:00 Feature selection is based on the spread of data along different axes.
21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
25:30 PCA finds new coordinate axes that maximize variance.
29:45 Variance is a good measure to differentiate the spread of two data sets.
33:54 Variance is important in PCA to maintain the relationships between data points when reducing dimensions.
Crafted by Merlin AI.