
Eigenvalues & Eigenvectors: Data Science Basics

ritvikmath
166K subscribers · 146K views

Published: 29 Sep 2024

Comments: 136
@maditea · 3 years ago
*me learning linear algebra for the first time even though i passed the class three or four years ago*
@EdeYOlorDSZs · 3 years ago
Oof
@jonathanokorie9857 · 2 years ago
lol
@spiderjerusalem4009 · 1 year ago
Hate it. It seems more like theory than intuition, let alone a super-rigorous yet dumbed-down, overrated book such as Axler's. All students would end up memorizing things for the sake of getting through all the complexity, since the intuition is nowhere to be found.
@paingzinkyaw331 · 4 years ago
I just subscribed!
@ritvikmath · 4 years ago
yay! thanks :)
@TheDroidMate · 4 years ago
Just came across your vids by accident... now I'm asking how I could not have got here earlier!
@tobydunbar1153 · 3 years ago
You have an easy, inclusive and coherent way of teaching. Great job!! 👌
@ritvikmath · 3 years ago
Thank you! 😃
@batuhantekmen6607 · 4 years ago
The last 2-3 mins are invaluable. I knew eigenvalues and eigenvectors but didn't know where to use them. Thanks very much!
@wirelessboogie · 6 months ago
Same here, cheers to the author of the video!
@romainthomas8238 · 2 years ago
As a native French speaker, I understand your videos better than most of the French courses I've read or watched! Thanks a lot, you save me a lot of time and desperation :D
@ritvikmath · 2 years ago
Je suis heureux d'aider! (Sorry, I only took 3 years of French in high school 😁)
@jairomejia616 · 1 year ago
Take the last section of the video, knowing that "eigen" is German for "own", and you will never forget the importance of eigenvalues and eigenvectors.
@nexus1226 · 4 years ago
Just came across your channel... Your videos are absolutely amazing! I'm in a multivariate analysis course, where I need to refresh my linear algebra skills, so these videos are really helpful.
@ritvikmath · 4 years ago
No problem!
@junaidasghar8462 · 4 years ago
Which analysis course are you doing?
@Ivan-mp6ff · 4 months ago
What an amazing example of an eigenvector: helping a fish find fish with the same "figures". Surely it has been used in dating apps 😂
@visheshmp · 2 years ago
You are amazing... anyone who is watching this video, please don't miss the last few minutes.
@sepidehmalektaji3770 · 2 years ago
And the best eigenvector/eigenvalue explanation prize goes to... this guy! Good job man, great video.
@sirivilari6796 · 4 years ago
Really did a good job man. Appreciate your time and valuable information.
@sirginirgin4808 · 3 years ago
Many thanks. You summarised an important chapter of linear algebra in less than 12 minutes.
@siddhantrai7529 · 3 years ago
The last 30 secs taught me more than the last 3 months. Thank you, sir. Your way of teaching is impeccable. I am absolutely stunned by the last-minute intuition. MIND = BLOWN
@ritvikmath · 3 years ago
Great to hear!
@ryansolomon2778 · 3 years ago
How this man has not blown up bigger than someone like blackpenredpen is beyond me. I am in Calculus II right now, and this video made perfect sense to me.
@ritvikmath · 3 years ago
That's the goal :)
@djangoworldwide7925 · 1 year ago
Evals and evecs are everywhere in DS.
@saraaltamirano · 4 years ago
I have searched and searched for an explanation like this one; it took me months to finally find an explanation that anyone can understand. You are a talented teacher, thank you!
@marcowen1506 · 4 years ago
There's a really old maths book by Bostock and Chandler that has a good explanation of this too.
@saulflores5052 · 4 years ago
Thank you! I took linear algebra over a year ago and your explanations clear up so many questions I had.
@ritvikmath · 4 years ago
Great to hear!
@chathuraedirisuriya6535 · 4 years ago
Excellent explanation. Very useful in my learning of mathematical modelling of infectious disease. Thank you.
@PaddyMcCarthy2.1 · 1 year ago
All teachers seem to fail at the stage where they include the identity matrix [02:07]. Why is that? Because they understand that putting the identity matrix in does not affect the vector or lambda, but they never tell you this vital bit of information. And they still wonder why people fail to understand mathematics. They had to learn it themselves, but they are not including this vital piece of knowledge in their explanations. It now seems a trivial point to them, but for a student starting out, it is not trivial.

In fact, any student with a basic understanding of algebra would wonder why only one side of the equation Ax = λx is being multiplied by I, the identity matrix. Surely this breaks the rule of algebra which says that whatever you multiply one side of an equation by, you must multiply the other side by. And yet here we see only one side of the equation being multiplied by I, without any explanation as to why you can do that. It's time you start explaining why it is all right to multiply λx by I on one side and not the other: answer, because, as any identity does, it does not change the number; hence the word identity.

[02:23] Let's subtract, Ax - λx = λIx, ready? Ax - λx - [λIx] = λIx - [λIx]. Okay, how does that equal Ax - λIx = 0? Well, λIx - [λIx] equals zero, so the right-hand side of the equation is fine. But what about the left-hand side? What we have is Ax - λx - [λIx]. Okay, so let's apply a little algebra: like terms can be added or subtracted; there are no like terms, so nothing can be subtracted here. Amazingly, ritvikmath seems to think these can be subtracted. Actually, in his calculation the term λx just magically disappears, so he is left with Ax - λIx = 0.

He could have got to this result a different way. Start with Ax = λx, then subtract λx from both sides (as the laws of algebra suggest), giving Ax - λx = λx - λx, which results in Ax - λx = 0. Now he has a choice to include the identity matrix: Ax - λIx = 0. See, same result: nothing magical, nothing disappears, every step accounted for.

His main argument is right. And I look forward to his video on determinants of matrices, i.e. proving that a matrix is non-invertible.
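For reference, the derivation under discussion can be written out with every step shown. The key facts are that Ix = x for any vector x (so λx and λIx are the same vector), and that the factored form needs I so that A - λI is a matrix minus a matrix:

```latex
\begin{aligned}
A\mathbf{x} &= \lambda\mathbf{x}
  && \text{definition of an eigenpair}\\
A\mathbf{x} &= \lambda I\mathbf{x}
  && \text{since } I\mathbf{x} = \mathbf{x}\text{, nothing changes}\\
A\mathbf{x} - \lambda I\mathbf{x} &= \mathbf{0}
  && \text{subtract } \lambda I\mathbf{x} \text{ from both sides}\\
(A - \lambda I)\,\mathbf{x} &= \mathbf{0}
  && \text{factor out } \mathbf{x}\text{; } A - \lambda I \text{ is matrix minus matrix}
\end{aligned}
```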
@AG-dt7we · 1 month ago
How does this property of a vector (an eigenvector staying on its own span even after transformation by A) help in problem solving (related to ML)?
@VictorOrdu · 1 year ago
Lovely! I have a question: if the scalar (eigenvalue) is negative, when multiplied, is the vector's direction not changed by 180 degrees?
@mlaursen · 3 months ago
Why couldn't you have been my teacher when I was studying eigenvectors? Sigh.
@donalddavis8033 · 2 years ago
This is way better than the explanation I had in my linear algebra course long ago!
@user-or7ji5hv8y · 3 years ago
Wow, I don't know why professors rarely provide motivation like this.
@123gregery · 2 years ago
You are good. I knew some linear algebra but I couldn't get the "feel" of it. Watching this Data Science Basics series has changed that.
@nicholasnelson1005 · 1 year ago
Didn't really go over finding the eigenvector 😕 just solved the system of equations and left it be.
@EdeYOlorDSZs · 3 years ago
Funny how the mathematical understanding behind it is very important to grasp, even though we will never have to calculate eigenvectors and eigenvalues by hand after university.
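Indeed, in practice this is one library call. A minimal sketch (assuming NumPy; the matrix is an arbitrary 2x2 example, not necessarily the one from the video):

```python
import numpy as np

# An arbitrary 2x2 example matrix (illustration only)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and right eigenvectors;
# column i of vecs is the unit eigenvector paired with vals[i].
vals, vecs = np.linalg.eig(A)
print(vals)   # e.g. [3. 1.]

# Verify the defining property A v = lambda v for each pair
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```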
@paulntalo1425 · 3 years ago
You have not just made my day but my career. I have been following you for two days and it seems you just keep cracking the rocket science. I'm doing an MSc in Data Science. Thank you so much.
@arsemabes · 3 years ago
this just made me so happy... THANK U!
@anujasebastian8034 · 3 years ago
I've been learning eigenvalues and eigenvectors, and I solved a bunch of problems without even understanding what I was doing... Thanks a lot for that explanation!!!
@mcwulf25 · 4 years ago
Good explanation of the math. But after 40 years I still struggle with what eigenvalues really are. Your fish example was better than most I have heard, but I am still missing something vital.
@ritvikmath · 4 years ago
It's definitely a tricky concept and I'm glad this video helped a little bit. It took me a long time to understand too. I think the easiest explanation is that an eigenvector is one that the matrix maps to a multiple of itself (so that the input vector and the output vector both point in the same direction). Why does this matter? Because pointing in the same direction ensures the same ratios between the individual vector components, which loosely means that the input and output vectors have the same proportions.
@mcwulf25 · 4 years ago
@@ritvikmath That helps. I have come across it in PCA and in quantum mechanics. (Yes, I am eclectic.) Another question: do eigenvalues HAVE to be real?
@rajgopalmanoharan · 11 months ago
Thank you, really helpful, awesome explanation :)
@kewtomrao · 2 years ago
How did you assume the shape of x was (2,1)? Was it because of the two eigenvalues?
@ranjbar · 2 years ago
My man gave the fish a mohawk :))) Thanks for the content though. Much love.
@CaterpillarOGM · 1 year ago
It would be useful to be able to like these videos more than once to express how appreciated they are by a newbie! Thank you!
@chandrikasaha6301 · 6 months ago
Which video should I follow for the importance of invertibility?
@terrym2007 · 6 months ago
Actually, the concepts are foundational...
@lilmoesk899 · 4 years ago
Great explanation. Thanks.
@iwasforcedtomakethis8818 · 3 years ago
wtf is around his neck
@caiobustani5223 · 2 years ago
Probably protection for a recently done tattoo... just guessing here since I've noticed it too!
@komelmerchant6772 · 4 years ago
These are awesome videos! They really intuitively connect theoretical concepts in linear algebra with applications in ways that I was never explicitly taught! Keep up the great work!
@NishantAroraarora007 · 4 years ago
I have a quick question: at 1:01 you mention that lambda is a real number; can't this be extended to imaginary numbers as well? Btw, thanks for your great work!
@OrionConstellationHome · 2 years ago
Yes.
@marcogelsomini7655 · 3 months ago
11:35 Useful to think about the same "ratio". Thank you, boss.
@theodoresweger4948 · 1 year ago
You not only explained the math operations but also gave the insight into why we are doing it. Thanks for the enlightenment.
@ritvikmath · 1 year ago
No problem!
@minxxdia1132 · 4 years ago
This video made it all look so, so simple. Thank you very much!
@nikolaosnikolaou1556 · 1 year ago
Really nice explanation, thank you!!
@yingchen8028 · 4 years ago
I love your videos... super helpful not just to refresh the knowledge but also to understand it in a more intuitive way!! Thank you so much!
@ritvikmath · 4 years ago
No problem!
@SoreneSorene · 3 years ago
Hey, special thanks for that last application example 😊
@scottlivezey9479 · 4 years ago
I haven't had a reason to dive into this kind of topic for over 20 years: I only saw it during undergrad and grad school. But I enjoyed your technique of going through it.
@ritvikmath · 4 years ago
Thanks!
@rezaerabbi2492 · 3 years ago
Could you upload a video on the hat matrix?
@danieljulian4676 · 9 months ago
Your presentation skills are top-notch. Since this is the first of your videos I've watched, I don't yet know whether you devote another video to other properties of eigenvectors. You stress the collinearity, but don't talk about the way the hypervolume of some set of vectors collapses. Maybe you do this in a video where you define the determinant; maybe your mentioning the null space of the matrix covers this. At any rate, I'll say at this point that I'll probably find all your presentations worthwhile. Best wishes in growing your channel.
@sanketannadate4407 · 4 months ago
God-tier explanation.
@Ahmad_Alhasanat · 3 years ago
Wondering who hit dislike!!
@blueis910 · 3 years ago
You're helping me a lot with refreshing these concepts! So happy I found your channel!
@ritvikmath · 3 years ago
Awesome! Thank you!
@user-or7ji5hv8y · 3 years ago
Does matrix A have to be square?
@mustafizurrahman5699 · 1 year ago
Awesome, splendid, mesmerising.
@nandakumarcheiro · 3 years ago
On aerodrome screens, eigenvectors and eigenvalues for different landing planes could be manipulated to avoid collisions by adjusting the graphics accordingly, correct? That might have been a better explanation.
@ernestanonde3218 · 2 years ago
just found you & I LOVE YOU
@AnDr3s0 · 3 years ago
Thanks man! Really good explanation! Keep it up!
@ritvikmath · 3 years ago
Thanks, will do!
@ChristianLezcano-n2u · 1 year ago
Thank you, I learned faster and easier with your explanation. You rock!
@bocckoka · 4 years ago
I think it's important to point out what an operator can do to a vector (A*x) in general, and then point out that these eigendirections are special, because there the operator's effect is just scaling. And this is useful, because...
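A short sketch of that contrast (assuming NumPy; the matrix and vectors are arbitrary illustrations):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, 0.0])   # a generic vector: A changes its direction
v = np.array([1.0, 1.0])   # an eigendirection of A (eigenvalue 3)

print(A @ x)  # [2. 1.] -- no longer along (1, 0)
print(A @ v)  # [3. 3.] -- still along (1, 1), just scaled by 3
```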
@小宇-b1r · 1 month ago
I'm crying
@alinazem6662 · 1 year ago
Killed it. Period.
@MohamedMostafa-kg6gk · 3 years ago
Thank you for this great explanation.
@ritvikmath · 3 years ago
Glad it was helpful!
@gello95 · 4 years ago
I appreciate you making these videos, as they have helped so much in understanding abstract machine learning/data science concepts! :) Cheers to you!
@sahil0094 · 3 years ago
Cheers to me. I taught him.
@user-xj4gg9jm3q · 2 years ago
This cleared up so much, including the importance of why we need eigenvectors. Tysm!!
@unzamathematicstutormwanaumo · 7 months ago
You are good 👍
@sali6989 · 2 years ago
Thank you, I love this topic. It gives me a lot of the foundation of basic data science, most of all in machine learning.
@danielwiczew · 4 years ago
Great video!
@agamchug595 · 2 years ago
This is amazing! Thank you so much for making these videos.
@umehmoses8118 · 1 year ago
Thank you
@Fat_Cat_Fly · 3 years ago
soooo good!
@luiswilbert2377 · 2 years ago
great job
@edphi · 1 year ago
Best video on eigen
@marcosfuenmayor563 · 3 years ago
amazing !!
@neurite001 · 4 years ago
Talking about fishy vectors... 8:31
@JasonBjörne89 · 4 years ago
Your videos are absolutely top-notch. Keep it up!
@adisimhaa65 · 8 days ago
thank you soooo much
@ritvikmath · 7 days ago
You're welcome!
@abhinavmishra9401 · 4 years ago
This is amazing. You are amazing.
@ritvikmath · 4 years ago
thanks!
@theodoresweger4948 · 4 years ago
Thank you, very well explained, and I like the fish analogy.
@prakashb1278 · 3 years ago
This helps a lot!! Thank you so much
@andreo1030 · 4 years ago
Thanks for your clear explanation
@reemalshanbari · 4 years ago
So we're always going to use only one eigenvalue, am I right?
@MichaelGoldenberg · 3 years ago
Very clean and clear presentation.
@DataScience-s8q · 1 year ago
WOW!! Bless you, man 🌺
@paulbrown5839 · 3 years ago
Good video! Thanks.
@robertpenoyer9998 · 3 years ago
Math and engineering classes always seem to treat Ax = λx as an abstraction. I wish someone would say at the beginning of the discussion that Ax = λx means that an eigenvector is a vector that points in the same direction after it's been operated on by A.
@s25412 · 3 years ago
They could point in a different direction though, if lambda is a negative number.
@robertpenoyer9998 · 3 years ago
@@s25412 My comment about direction was a generality. Of course, A might transform x so that it points in the opposite direction, but the eigenvector will point along the same line as it was pointing before being operated on by A. A scalar multiple of an eigenvector is also an eigenvector.
@s25412 · 3 years ago
@@robertpenoyer9998 Thank you
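A quick numeric illustration of that exchange (assuming NumPy; the matrix is an arbitrary example chosen to have a negative eigenvalue):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # eigenvalues are +1 and -1

v = np.array([1.0, -1.0])   # eigenvector with eigenvalue -1
print(A @ v)                # [-1.  1.], i.e. -1 * v

# The output is flipped 180 degrees but still lies on the same
# line (the span of v) -- which is the sense of "same direction" here.
```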
@karthikeya9803 · 4 years ago
The explanation is awesome.
@sahirbansal7027 · 4 years ago
Hey, can we subtract the mean of each column from the column, so as to make it zero-mean, before calculating the cov matrix? And in some textbooks it is divided by n-1 instead of n. Why is that? Thanks
@neel_in_germany · 4 years ago
I think because with (n-1) the estimator is unbiased...
@paingzinkyaw331 · 4 years ago
It is because of the difference between "population" and "sample": with a sample we divide by n-1 instead of n for a more accurate (unbiased) estimate.
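Both points are easy to check in code. A sketch assuming NumPy, whose np.cov centers each variable automatically and whose ddof argument switches between the 1/(n-1) and 1/n conventions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))    # 100 observations of 2 variables

# np.cov expects variables in rows, so pass the transpose.
C_sample = np.cov(X.T, ddof=1)   # divide by n-1: unbiased sample estimate
C_pop = np.cov(X.T, ddof=0)      # divide by n: population / ML convention

# The same thing by hand: center each column, then average outer products.
Xc = X - X.mean(axis=0)
assert np.allclose(C_sample, Xc.T @ Xc / (len(X) - 1))
assert np.allclose(C_pop, Xc.T @ Xc / len(X))
```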
@saadelmadafri8050 · 4 years ago
great fish!
@ritvikmath · 4 years ago
haha thanks!
@sarfrazjaved330 · 2 years ago
This man is a real gem.
@尾﨑元恒-u8q · 3 years ago
I have discovered a new theorem.
@himanihasani · 4 years ago
great explanation!!
@brownsugar85 · 4 years ago
These videos are amazing!
@matthewchunk3689 · 4 years ago
2:51 Thank you. A teacher who melds big pictures with equations.
@beaudjangles · 4 years ago
Fantastic
@SNSaadu1999 · 4 years ago
Does Ax = λx hold for all x?
@MayankGoel447 · 4 years ago
Nope
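Right: the equation only holds for the special eigendirections of A. A worked counterexample (the matrix and vectors are arbitrary illustrations):

```latex
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad
A\begin{pmatrix} 1 \\ 0 \end{pmatrix}
  = \begin{pmatrix} 2 \\ 1 \end{pmatrix}
  \neq \lambda \begin{pmatrix} 1 \\ 0 \end{pmatrix}
  \text{ for any } \lambda, \qquad
A\begin{pmatrix} 1 \\ 1 \end{pmatrix}
  = \begin{pmatrix} 3 \\ 3 \end{pmatrix}
  = 3\begin{pmatrix} 1 \\ 1 \end{pmatrix}.
```

So (1, 0) is not an eigenvector of this A, while (1, 1) is, with λ = 3.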
@nicoleluo6692 · 1 year ago
🌹🌹🌹
@ritvikmath · 1 year ago
🌹 🌹 🌹
@premstein16 · 4 years ago
Hi, could you please do the computation for eigenvalue -2? I'm eager to know how to plug in x1 and x2.
@amjedbelgacem8218 · 1 year ago
You are literally a godsend and a savior. Machine learning is becoming clearer with every video of yours I watch, fam. Thank you!