Hate it. Seems like more theory than intuition, let alone a super rigorous, dumbed-down, overrated book such as Axler's. All students would end up doing is memorizing things for the sake of getting through the complexity, since the intuition is nowhere to be found.
As a native French speaker, I understand your videos better than most French courses I've read or watched! Thanks a lot, you save me a lot of time and desperation :D
Take the last section of the video, knowing that "eigen" is the German word for "own", and you will never forget the importance of eigenvalues and eigenvectors.
Just came across your channel... Your videos are absolutely amazing! I'm in a multivariate analysis course where I need to refresh my linear algebra skills, so these videos are really helpful.
The last 30 seconds taught me more than the last 3 months. Thank you, sir. Your way of teaching is impeccable. I am absolutely stunned by the last-minute intuition. MIND = BLOWN
How this man has not blown up bigger than someone like blackpenredpen is beyond me. I am in Calculus II right now, and this video made perfect sense to me.
I have searched and searched for an explanation like this one; it took me months to finally find an explanation that anyone can understand. You are a talented teacher, thank you!
All teachers seem to fail at the stage where they include the identity matrix [02:07]. Why is that? The reason is that they understand that putting the identity matrix in does not affect the vector or lambda, but they never tell you this vital bit of information. And they still wonder why people fail to understand mathematics. They had to learn it themselves, but they are not including this vital piece of knowledge in their explanations. It now seems a trivial point to them, but for a student starting out, it is not trivial. In fact, any student with a basic understanding of algebra would wonder why only one side of the equation Ax = λx is being multiplied by I, the identity matrix; surely this breaks the rule of algebra which says that whatever you multiply one side of the equation by, you must multiply the other side by. And yet here we see only one side of the equation being multiplied by I, the identity matrix, without any explanation as to why you can do that. It's time you start explaining why it is alright to multiply λx by I, the identity matrix, on one side and not the other side of the equation: the answer is that, as any identity does, it does not change the number, hence the word identity. [02:23] Let's subtract: Ax - λx = λIx, ready? Ax - λx - [λIx] = λIx - [λIx]. Okay, how does that equal Ax - λIx = 0? Well, λIx - [λIx] equals zero, so the right-hand side of the equation is fine. But what about the left-hand side? What we have is Ax - λx - [λIx]. Okay, so let's apply a little algebra: like terms can be added or subtracted. No like terms, so nothing can be subtracted here. Amazingly, ritvikmath seems to think these can be subtracted. Actually, in his calculations the term λx just magically disappears, so he is left with Ax - λIx = 0. He could have got to this result a different way. Let's start out with Ax = λx, then subtract λx from both sides (as the laws of algebra suggest); that would give Ax - λx = λx - λx, which results in Ax - λx = 0. Now he has a choice to include the identity matrix: Ax - λIx = 0. See, same result. Nothing magical, nothing disappears, every step accounted for. His main argument is right. And I look forward to his video on determinants of matrices, i.e. proving that a matrix is non-invertible.
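(A quick numerical sketch of the step discussed above, just for illustration; the 2x2 matrix and eigenpair below are made up, not from the video. It checks that multiplying by the identity matrix changes nothing, so Ax - λx and Ax - λIx are the same quantity, and both reduce to the zero vector for an eigenpair.)

```python
import numpy as np

# A concrete matrix with a known eigenpair: for this A, x = [1, 1] is an
# eigenvector with eigenvalue lambda = 3, since A @ x = [3, 3] = 3 * x.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])
lam = 3.0
I = np.eye(2)

# Multiplying by the identity changes nothing: I @ x is exactly x.
print(np.allclose(I @ x, x))                                 # True

# So lam * x and lam * (I @ x) are the same vector, and the two ways of
# writing the subtraction agree.
print(np.allclose(A @ x - lam * x, A @ x - lam * (I @ x)))   # True

# Both reduce to the zero vector, i.e. (A - lam * I) @ x = 0.
print(np.allclose((A - lam * I) @ x, np.zeros(2)))           # True
```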
How does this property of an eigenvector, that it stays in the same direction even after transformation by A, help in problem solving (related to ML)?
Funny how the mathematical understanding behind it is so important to grasp, even though we will never have to calculate the eigenvectors and eigenvalues by hand after university.
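(For what it's worth, a minimal sketch of how that is done in practice, assuming NumPy and a made-up 2x2 matrix: one library call instead of a hand calculation.)

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are the
# corresponding (unit-length) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # e.g. [3. 1.]
print(eigenvectors)   # columns are the eigenvectors

# Sanity check: A @ v should equal lambda * v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```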
You have not just made my day but my career. I have been following you for two days and it seems you just keep cracking the rocket science. I'm doing an MSc in Data Science. Thank you so much.
I've been learning eigenvalues and eigenvectors and solved a bunch of problems without even understanding what I was doing... Thanks a lot for that explanation!!
Good explanation of the math. But after 40 years I still struggle with what eigenvalues really are. Your fish example was better than most I have heard, but I am still missing something vital.
It's definitely a tricky concept and I'm glad this video helped a little bit. It took me a long time to understand too. I think the easiest explanation is that an eigenvector is one that the matrix maps to a multiple of itself (so that the input vector and the output vector both point along the same direction). Why does this matter? Because pointing in the same direction ensures the same ratios between the individual vector components, which loosely means that the input and output vectors have the same proportions.
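(A tiny sketch of that "same direction, same proportions" point, using a made-up diagonal matrix purely for illustration: an eigenvector comes out as a scaled copy of itself, a non-eigenvector has its direction changed, and any scalar multiple of an eigenvector is again an eigenvector.)

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 5.0]])

v = np.array([1.0, 0.0])   # an eigenvector of A (eigenvalue 2)
w = np.array([1.0, 1.0])   # not an eigenvector of A

print(A @ v)          # [2. 0.]  -> still points along [1, 0], just scaled by 2
print(A @ w)          # [2. 5.]  -> no longer a multiple of [1, 1]; direction changed

# A scalar multiple of an eigenvector is also an eigenvector (same eigenvalue):
print(A @ (10 * v))   # [20.  0.] = 2 * (10 * v)
```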
@@ritvikmath That helps. I have come across it in PCA and in quantum mechanics. (Yes, I am eclectic). Another question: do eigenvalues HAVE to be real?
These are awesome videos! They really intuitively connect theoretical concepts in linear algebra with applications in ways that I was never explicitly taught! Keep up the great work!
I have a quick question: at 1:01 you mention that lambda is a real number; can't this be extended to imaginary numbers as well? Btw, thanks for your great work!
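(Not the video's example, but a quick way to see that eigenvalues need not be real, assuming NumPy: a 90-degree rotation matrix turns every nonzero real vector, so it has no real eigenvectors and its eigenvalues come out complex.)

```python
import numpy as np

# A 90-degree rotation in the plane: no nonzero real vector keeps its
# direction, so there are no real eigenvalues.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigenvalues, _ = np.linalg.eig(R)
print(eigenvalues)   # approximately [0.+1.j, 0.-1.j] -> purely imaginary
```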
I haven’t had a reason to dive into this kind of topic for over 20 years: only saw it during undergrad & grad school. But I enjoyed your technique of going through it.
Your presentation skills are top-notch. Since this is the first of your videos I've watched, I don't yet know whether you devote another video to other properties of eigenvectors. You stress the collinearity, but don't talk about the way the hypervolume of some set of vectors collapses. Maybe you do this in a video where you define the determinant. Maybe your mentioning the null space of the matrix covers this. At any rate, I'll say at this point that I'll probably find all your presentations worthwhile. Best wishes in growing your channel.
On aerodrome screens, the eigenvectors and eigenvalues for different landing planes may be manipulated to avoid collision by adjusting the graphics accordingly, correct? That might have been a better explanation.
I think it's important to point out what an operator can do to a vector (A*x) in general, and then point out that these eigen directions are special, because here the operator's effect is just scaling. And this is useful, because...
Math and engineering classes always seem to treat Ax = λx as an abstraction. I wish someone would say at the beginning of the discussion that Ax = λx means that an eigenvector is a vector that points in the same direction after it's been operated on by A.
@@s25412 My comment about direction was a generality. Of course, A might transform x so that it points in the opposite direction, but the eigenvector will point along the same line as it was pointing before being operated on by A. A scalar multiple of an eigenvector is also an eigenvector.
Hey, can we subtract the mean of each column from that column, so as to make it zero-mean, before calculating the covariance matrix? And in some textbooks it is divided by n-1 instead of n. Why is that? Thanks
It is because of the difference between a "population" and a "sample": if you have the whole population you divide by n, but with a sample you divide by n-1 to correct the bias in the estimate, so it's more accurate.
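(To make both points concrete, a small sketch with made-up data, assuming NumPy: center each column first, and the only difference between the two versions of the covariance matrix is dividing by n versus n-1; np.cov divides by n-1 by default.)

```python
import numpy as np

# Toy data: rows are observations, columns are variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Mean-center each column before forming the covariance matrix.
Xc = X - X.mean(axis=0)

cov_population = (Xc.T @ Xc) / len(X)         # divide by n
cov_sample = (Xc.T @ Xc) / (len(X) - 1)       # divide by n-1 (Bessel's correction)

# np.cov uses n-1 by default; pass ddof=0 to divide by n instead.
print(np.allclose(np.cov(X, rowvar=False), cov_sample))               # True
print(np.allclose(np.cov(X, rowvar=False, ddof=0), cov_population))   # True
```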