
1 Principal Component Analysis | PCA | Dimensionality Reduction in Machine Learning by Mahesh Huddar 

Mahesh Huddar
91K subscribers
284K views

Published: Oct 5, 2024

Comments: 97
@kenway346 · 1 year ago
I noticed that your channel contains the entirety of Data Mining taught at the Master's level! Thank you very much, subscribing immediately!
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@ishu_official · 9 months ago
Super explanation... today is my machine learning paper.
@MaheshHuddar · 9 months ago
Thanks, and welcome. Do like, share and subscribe.
@ankitjha_03 · 5 months ago
Mine is tomorrow!
@nurulsyuhadah984 · 3 months ago
How was it?
@junaidahmad218 · 9 months ago
This man has in-depth knowledge of this topic.
@MaheshHuddar · 9 months ago
Thank you. Do like, share and subscribe.
@varshabiradar1482 · 2 days ago
You have explained it very well. It would be good if you also explained what the use of calculating the PCA is in the geometrical representation you showed at the end.
@VDCreatures-kc6uf · 10 months ago
Super explanation... the best channel on RU-vid to learn machine learning and ANN topics ❤❤
@MaheshHuddar · 10 months ago
Thank you. Do like, share and subscribe.
@User22_2g · 5 days ago
Great work, sir! Really very helpful for my exams. Very grateful to you 🙏 One suggestion, sir: kindly share practice problems in the description or in the comment box. Could you please do it as soon as possible? Thanks a lot, sir 😀
@TrueTalenta · 1 year ago
Amazing step-by-step outline! I love it💌, so I subscribe!
@MaheshHuddar · 1 year ago
Thank you. Do like, share and subscribe.
@gamingaddaaa2141 · 15 days ago
Thank you, sir, for the information.
@priyalmaheta690 · 6 months ago
The content and teaching are very good. Please also provide the notes; that would be helpful.
@MaheshHuddar · 6 months ago
Thank you. Do like, share and subscribe.
@venkateshwarlupurumala6283 · 1 year ago
Very clear explanation, sir... Thank you so much...
@MaheshHuddar · 1 year ago
Welcome. Please do like, share and subscribe.
@jambulingamlogababu8914 · 11 months ago
Excellent teaching. Salute to you, sir.
@MaheshHuddar · 11 months ago
Welcome. Do like, share and subscribe.
@NandeeshBilagi · 6 months ago
Clear and nice explanation. Thanks for the video.
@MaheshHuddar · 6 months ago
Welcome. Do like, share and subscribe.
@radhay4291 · 11 months ago
Thank you very much. Very clear explanation, and it is easy to understand.
@MaheshHuddar · 11 months ago
Welcome. Do like, share and subscribe.
@thilagarajthangamuthu2935 · 1 year ago
Thank you, sir. Clear and easy to understand. Thank you.
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@rodrigorcbb · 1 year ago
Thanks for the video. Great explanation!
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@krishnachaitanya3089 · 1 year ago
That's the clearest explanation I have seen.
@MaheshHuddar · 1 year ago
Thank you. Do like, share and subscribe.
@PRANEETHAMALAKAPALLI · 9 months ago
Thank you, sir, for this wonderful concept.
@MaheshHuddar · 9 months ago
Welcome. Do like, share and subscribe.
@Straight_Forward615 · 6 months ago
Thanks a lot for this wonderful lecture.
@MaheshHuddar · 6 months ago
Welcome! Do like, share and subscribe.
@Dinesh-be8ys · 1 year ago
Thank you for uploading videos like this.
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@yashtiwari4696 · 1 year ago
Sir, please upload content on ensemble methods: bagging, boosting, and random forest.
@MaheshHuddar · 1 year ago
Ensemble Learning: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-eNyUfpGBLts.html Random Forest: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-kPq328mJNE0.html
@ashishlal7212 · 5 months ago
Thank you so much; today is my data mining and ML paper.
@MaheshHuddar · 5 months ago
Welcome. Do like, share and subscribe.
@Blackoutfor10days · 5 months ago
Can you add the concept of the hidden Markov model to your machine learning playlist?
@MaheshHuddar · 5 months ago
Sure, working on it.
@Blackoutfor10days · 5 months ago
@@MaheshHuddar okay 👍
@Blackoutfor10days · 5 months ago
@@MaheshHuddar my exam is near
@kapras711 · 1 year ago
Super explanation... very easy to understand without any hang-ups, sir. Thanks... Inspr KVV.Prasad
@MaheshHuddar · 1 year ago
Thank you. Do like, share and subscribe.
@priya1912 · 9 months ago
Thank you so much.
@MaheshHuddar · 9 months ago
Welcome. Do like, share and subscribe.
@yuva_india123 · 1 year ago
Thanks, sir, for your explanation 🎉
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@RaviShankar-gm9of · 8 months ago
Super Bhayya ...
@shubhangibaruah3940 · 9 months ago
Thank you sir, you were amazing 🤩
@MaheshHuddar · 9 months ago
Welcome. Please do like, share and subscribe.
@SaifMohamed-de8uo · 8 months ago
Thank you so much, you are a great professor.
@MaheshHuddar · 8 months ago
You are very welcome. Do like, share and subscribe.
@srinivas664 · 2 months ago
Nice presentation, thank you sir.
@sinarezaei4288 · 7 months ago
Thank you very much, Master Huddar ❤
@MaheshHuddar · 7 months ago
Welcome. Do like, share and subscribe.
@waidapapa1514 · 1 year ago
Why are we not dealing with e2? I mean, why don't we do e2^T.[cov matrix]?
@rohanshah8129 · 1 year ago
Here, we considered 2 dimensions as the "high-dimensional" data for the example. One of the main use cases of PCA is dimensionality reduction. So, if you want, you can use e2 and get the second PC. But then think about it: from 2 variables, we again got 2 variables. That's why he has shown only PC1. However, in reality we generally use 2 PC axes (it mostly depends on your data). If the data has a lot of variables, then 3 or 4 can also be good, but we don't generally go beyond that. In that case you will need e2, e3 and e4 as well. So this is how it works.
@aravind25 · 4 days ago
Should we consider e2 in place of e1? @@rohanshah8129
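To illustrate the reply above with a minimal numpy sketch (hypothetical 2-D data, not the video's example): projecting the mean-centered data onto e1 alone reduces two variables to one (PC1), while projecting onto both e1 and e2 merely rotates the data and keeps two columns.

    import numpy as np

    # Hypothetical 2-D data: 5 samples, 2 variables
    X = np.array([[2.0, 3.0], [3.0, 5.0], [4.0, 4.0], [5.0, 7.0], [6.0, 8.0]])
    Xc = X - X.mean(axis=0)                      # mean-center the data
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))
    e1 = eigvecs[:, np.argmax(eigvals)]          # eigenvector of the largest eigenvalue
    pc1 = Xc @ e1                                # shape (5,): 2 variables reduced to 1
    both = Xc @ eigvecs                          # shape (5, 2): still 2 columns, just rotated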
@Husain8570 · 1 month ago
Best explanation
@putridisperindag6986 · 10 months ago
Thank you very much, sir, for your explanation in that video. I am still confused, so I would like to ask how to get the values [-4.3052, 3.7361, 5.6928, -5.1238]. How can I get those values? I still don't get it. Thank you, sir.
@jvbrothers5454 · 9 months ago
Yeah, I'm also confused about how he got them; I'm getting different values: 0.3761, 5.6928, -5.128
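For anyone stuck at the same step: the values come from projecting each mean-centered sample onto the first eigenvector, PC1_i = e1 · (x_i − mean). A small numpy sketch, assuming the worked example uses the four samples (4, 11), (8, 4), (13, 5), (7, 14) and e1 ≈ (0.5574, −0.8303); under those assumptions it reproduces [-4.3052, 3.7361, 5.6928, -5.1238].

    import numpy as np

    # Assumed data from the worked example: rows are samples (X1, X2)
    X = np.array([[4.0, 11.0], [8.0, 4.0], [13.0, 5.0], [7.0, 14.0]])
    Xc = X - X.mean(axis=0)              # subtract the mean (8, 8.5)
    C = np.cov(Xc.T)                     # covariance matrix [[14, -11], [-11, 23]]
    eigvals, eigvecs = np.linalg.eigh(C)
    e1 = eigvecs[:, np.argmax(eigvals)]  # first eigenvector, ~ +/-(0.5574, -0.8303)
    e1 = e1 if e1[0] > 0 else -e1        # pick the sign used in the hand calculation
    print(Xc @ e1)                       # ~ [-4.3052  3.7361  5.6928 -5.1238]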
@HamidAli-ff2zn · 1 year ago
Thank you so much, sir, amazing explanation ♥♥♥
@MaheshHuddar · 1 year ago
Welcome. Do like, share and subscribe.
@nikks9969 · 7 months ago
Hello sir, thank you for your explanation. I have a doubt: at 08:17, why have you considered only the first equation?
@MaheshHuddar · 7 months ago
You will get the same answer with the second equation. You can use either the first or the second, no issues.
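A quick check of why either equation works (assuming the covariance matrix in the worked example is [[14, -11], [-11, 23]]): at an eigenvalue, det(C − λI) = 0, so the two rows of (C − λI)e = 0 are linearly dependent and give the same ratio between the eigenvector's components.

    import numpy as np

    C = np.array([[14.0, -11.0], [-11.0, 23.0]])  # assumed covariance matrix
    lam = np.linalg.eigvalsh(C)[-1]               # largest eigenvalue ~ 30.3849
    print((14 - lam) / 11)                        # e_y/e_x from the first equation  ~ -1.4895
    print(11 / (23 - lam))                        # e_y/e_x from the second equation ~ -1.4895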
@RaviShankar-gm9of · 8 months ago
Linear discriminant analysis, please make a video, bhayya.
@ManjupriyaR-p9r · 11 months ago
Thank you very much sir
@MaheshHuddar · 11 months ago
Welcome. Do like, share and subscribe.
@Ateeq10 · 8 months ago
Thank you
@MaheshHuddar · 8 months ago
Welcome. Do like, share and subscribe.
@mango-strawberry · 4 months ago
Thanks a lot.
@MaheshHuddar · 4 months ago
You are most welcome. Do like, share and subscribe.
@Abhilashaisgood · 4 months ago
Thank you, sir. How do I calculate the 2nd PC?
@MaheshHuddar · 4 months ago
Select the second eigenvector and multiply it with the given feature matrix.
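If it helps, here is a short cross-check with scikit-learn, assuming the same four samples as in the worked example. PCA mean-centers the data internally; column 0 of the scores is PC1 and column 1 is PC2 (the projection onto the second eigenvector). The signs of the components may be flipped relative to the hand calculation.

    import numpy as np
    from sklearn.decomposition import PCA

    X = np.array([[4.0, 11.0], [8.0, 4.0], [13.0, 5.0], [7.0, 14.0]])  # assumed data
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)    # column 0 = PC1, column 1 = PC2
    print(pca.components_)           # rows are the eigenvectors e1 and e2
    print(scores)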
@zafar151 · 7 months ago
Excellent
@MaheshHuddar · 7 months ago
Thank you. Do like, share and subscribe.
@demodemo-o4z · 1 year ago
Hi sir, great explanation of PCA. But when I searched how to compute the covariance matrix for more than 2 variables, everything said covariance is only defined between 2 variables. How do I calculate the covariance matrix if a dataset has more than 2 variables? Could you please give an explanation of that?
@fintech1378 · 1 year ago
You need to do it for all pairwise combinations.
@shahmirkhan1502 · 1 year ago
@fintech1378 is right. You need to do pairwise combinations. For example, for 4 variables, your covariance matrix will be 4x4 with the following combinations:
cov(a,a)  cov(a,b)  cov(a,c)  cov(a,d)
cov(b,a)  cov(b,b)  cov(b,c)  cov(b,d)
cov(c,a)  cov(c,b)  cov(c,c)  cov(c,d)
cov(d,a)  cov(d,b)  cov(d,c)  cov(d,d)
@rohanshah8129 · 1 year ago
If there are n variables, the covariance matrix will be of n×n shape.
@parthibdey6005 · 7 months ago
Is this covariance for reducing 4 to 1? @@shahmirkhan1502
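A minimal sketch of the general case with hypothetical data: numpy's np.cov builds the full n×n matrix of all pairwise covariances in one call (rows of the input are treated as variables).

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(4, 10))   # 4 hypothetical variables (a, b, c, d), 10 samples each
    C = np.cov(data)                  # 4x4 matrix; C[i, j] = cov(variable i, variable j)
    print(C.shape)                    # (4, 4)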
@MadaraUchiha-wj8sl · 8 months ago
Thank you, sir.
@MaheshHuddar · 8 months ago
Welcome. Do like, share and subscribe.
@advancedappliedandpuremath · 1 year ago
Sir, the book name, please?
@muhammadsaad3793 · 10 months ago
Nice!
@MaheshHuddar · 10 months ago
Thank you. Do like, share and subscribe.
@aefgfaefafe · 1 year ago
بحبككككككككككككككككككككككككككككككككككككككككك يا سوسو
@MaheshHuddar · 1 year ago
What does it mean..?
@abishekraju4521 · 1 year ago
@@MaheshHuddar According to Google Translate: "I love you sooo"
@jameykeller5055 · 9 months ago
Devru sir neevu (Kannada: "You are a god, sir")
@MaheshHuddar · 9 months ago
Do like, share and subscribe.
@brucewayne.64 · 10 months ago
Thanks Sir
@MaheshHuddar · 10 months ago
Welcome. Do like, share and subscribe.