
The Matrix Transpose: Visual Intuition 

Sam Levey
1K subscribers
26K views

Published: 25 Sep 2024

Comments: 91
@fullaccess2645 · 3 months ago
This is exactly what I was looking for. Whenever I tried to research into the transpose I always got confused by all this talk of dual spaces and whatnot, this is the clearest explanation yet.
@samlevey3263 · 3 months ago
Thanks!
@blakjak002 · 3 months ago
@@samlevey3263 No, thank you!
@asfasdfsd8476 · 7 days ago
Bro casually explained SVD in 20 seconds better than most can do while covering a different topic.
@samlevey3263 · 6 days ago
Thanks!!
@00000ghcbs · 2 months ago
I remember asking my linear algebra teacher this exact thing and he just looked at me weird and said, "you just change rows and column". I stopped asking him stuff after that.
@roeetoledano2147 · 9 days ago
same with my professor. I'm so glad I came across this and 3b1b videos, they make me realize just how beautiful all of this is
@roeetoledano6242 · 1 month ago
Such an amazing video. I'm shocked that you only have 924 subscribers. You explained this so well, and so elegantly, it really is truly amazing. Linear algebra is so beautiful. Thank you.
@samlevey3263 · 29 days ago
Thanks!
@evelyndai8200 · 3 months ago
NEW MANIM YOUTUBER ❤❤
@brandonklein1 · 1 month ago
As a graduate student in physics, this was very helpful in grounding the definition of unitary transformations, thanks so much and beautiful video!
@SchienexKun · 1 month ago
Bro you're the 2nd 3B1B! Keep going ✨🔥
@michaelthomas5288 · 3 months ago
This was a great video! I thought the background music was distracting though.
@samlevey3263 · 3 months ago
Thanks for the feedback :)
@SINxGREED · 1 month ago
Thanks man, I discovered retrosynthwave!! Have an ice cream from my side!
@klevisimeri607 · 3 months ago
Just so you know I have been searching for this for years! Also the music is very nice!
@evandrofilipe1526 · 3 months ago
I realised at 11:00 that this falls out really nicely in GA. Applying a linear transformation that is an orthogonal transformation is the same as applying a rotor. Applying a rotor looks like a' = RaR~, where R is a unit versor, a is the original vector, a' is the transformed vector, and ~ represents the reversion operation. If R isn't a unit versor, then a' = RaR⁻¹, as this expands to a' = (RaR~)/(RR~). It seems like the transpose is analogous to reversion, which is pretty cool.
@minerharry · 3 months ago
It’s interesting there doesn’t seem to be a spot in the formula for sigma - I guess that makes sense because GA can’t really do axis-specific stretching with the basic operations?
@evandrofilipe1526 · 3 months ago
@@minerharry Kind of true, shearing doesn't really happen
@minerharry · 3 months ago
My first linear algebra course was very abstract, and while I loved it for that, the transpose was always so opaque. The ⟨Av, w⟩ = ⟨v, Aᵀw⟩ definition is so ungodly complicated, and while we could use it algebraically, we never understood what it meant. Having this video then would have saved me so much headache. At first I was a little confused why you were talking so much about preserving dot products, but when you introduced vbar->v and I saw where the formula was headed it absolutely blew my mind. I also totally agree with your conventions - marking x and v as different like with that bar would have saved me so many headaches too. I also saw some places mention you needed *two* scalar products for the transpose, which always confused me - but that notation makes it so clear why: one is a scalar product in the input space, and one's in the output space. TL;DR awesome video and now I'm gonna rewatch it 3 times to make sure I didn't miss anything 😊
@minerharry · 3 months ago
Oh, and I love how clear this makes the spectral theorem too! When you mentioned the case where r1 = r2^-1, I literally thought to myself “oh, I bet that means the eigenvectors are nice, since it’s rotated axis aligned scaling” before even realizing that simplifying the formula would make it obvious that the matrix is symmetric. So cool!
@samlevey3263 · 3 months ago
Thanks for the kind words, and I'm glad it helped! I agree, I spent a long time staring at that equation before it clicked.
@locustedfox · 3 months ago
Really amazing video! All concepts were clearly explained with enough math and geometrical support, amazing.
@samlevey3263 · 3 months ago
Thanks!
@SuryanshGupta-lw9fz · 3 months ago
What an amazing explanation ! Please keep posting more such quality content. Hats off !!
@Filup · 3 months ago
This is an excellent video. My linear algebra studies took me through the "Linear Algebra Done Right" approach, and so we didn't use many matrices, which had its pros and cons. It is funny to me that this kind of study can often leave students without a solid grasp of what is really going on unless they put in the extra effort. For me, it wasn't until I did a course on Fourier Analysis (which was really just a functional analysis course in disguise imo) that I really had to understand a lot of this stuff algebraically. While the geometric understanding is helpful, it is funny how much it doesn't matter later on. Math is weird.
@physiologic187 · 3 months ago
I guess it does matter. For me, a geometric intuition helps me understand what exactly is going on, and eventually over time the visual aspects become so ingrained in memory and automated that working with more abstract algebraic terms is more convenient. The visual intuition becomes a subconscious way to process the information even though we use algebraic representations consciously.
@Filup · 3 months ago
@@physiologic187 Yeah, that's a fair take
@samlevey3263 · 3 months ago
@@physiologic187 I think you're right!
@arbodox · 3 months ago
This is such a great video, it really helped give me an intuition of what transposes and SVDs are. If only I had watched this video before I took my linear algebra final 2 days ago...
@samlevey3263 · 3 months ago
😅
@billwindsor4224 · 3 months ago
Thanks @arbodox. Your comment is valuable, coming from someone fresh out of a Linear Algebra final exam, for pointing out the key areas of focus in this video 👍.
@MrAndreaCaso · 3 months ago
Great video, thank you for posting. Just thought the background music was a tad too loud, though.
@aykoch · 2 months ago
Just had my exam on computer vision, which relies in many cases on SVD and all sorts of transformation matrices. This video brings much clarity; if only I had seen this before my exam 😅
@inverse_of_zero · 15 days ago
hi there, fellow math educator :) this is my first time watching your channel. great visuals and explanations! my only criticism is that the music was a bit loud and distracting. at the very least i'd say reduce the music volume (or do away with it), and if you keep music, i'd choose music with much lower tempo. the choice for this video felt a bit too 'fast' personally. but otherwise, great content! i'm looking forward to future videos. best of luck :)
@bobsmithy3103 · 11 days ago
was doing some graphics programming and this clears things up. thank you!
@mathephilia · 3 months ago
Interestingly, this "sort of inverse" behavior (the fact that the transpose is an involution for finite dimensional vector spaces), where "adding, subtracting or multiplying an element and its dual (= transpose) respectively gives a sort of symmetry, antisymmetry or symmetric square", finds other analogies throughout mathematics. By this I mean: the sum A + Aᵀ = 2 sym(A) is symmetric, the difference A - Aᵀ = 2 skew(A) is antisymmetric, and the product AAᵀ is symmetric, and in fact a "sort of squaring" of A, called the Gram matrix of A (and which is very important in the SVD).

One simple analogical case is the complex conjugate (and you have a similar analogy with your four A, Aᵀ, A⁻¹ and (A⁻¹)ᵀ, with respectively z, z̄, z⁻¹ and z̄⁻¹, neatly making an axis-aligned rectangle on a circle of radius |z|). The sum z + z̄ = 2 Re(z) is real, the difference z - z̄ = 2 Im(z) is imaginary, and the product zz̄ = |z|² is real, and in fact a "sort of squaring" of z, called the quadratic norm of z. Of course, this also applies to conjugate transposes (of which the above case is just a 1×1 example), etc.

There's also things to say about any integral operator of the form g ↦ ∫ f(x) g(x) dx (which is pretty much the transpose of the function f seen as a vector, as it acts as a covector (linear form) over function spaces, which are just infinite dimensional vector spaces), and you similarly have interesting duality properties of some linear operators, etc, but that's worth a video of its own, surely.

Great video in any case, beautiful work. Thanks a lot!
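[Editor's note] The decomposition in the comment above is easy to check numerically. A minimal pure-Python sketch, using a made-up 3×3 matrix A: sym(A) comes out symmetric, skew(A) antisymmetric, they recombine to A, and the Gram matrix AAᵀ is symmetric.

```python
# Check: A = sym(A) + skew(A), with A·Aᵀ the (symmetric) Gram matrix.
# The matrix A here is an arbitrary example, chosen so arithmetic is exact.

def transpose(M):
    return [list(row) for row in zip(*M)]

def add(M, N):
    return [[a + b for a, b in zip(r, s)] for r, s in zip(M, N)]

def scale(c, M):
    return [[c * x for x in row] for row in M]

def matmul(M, N):
    Nt = transpose(N)
    return [[sum(a * b for a, b in zip(row, col)) for col in Nt] for row in M]

A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0],
     [7.0, 8.0, 10.0]]

At   = transpose(A)
sym  = scale(0.5, add(A, At))                # (A + Aᵀ)/2 — symmetric part
skew = scale(0.5, add(A, scale(-1.0, At)))   # (A − Aᵀ)/2 — antisymmetric part
gram = matmul(A, At)                         # A·Aᵀ — the Gram matrix

assert sym == transpose(sym)                 # symmetric
assert skew == scale(-1.0, transpose(skew))  # antisymmetric
assert add(sym, skew) == A                   # parts recombine to A
assert gram == transpose(gram)               # Gram matrix is symmetric
```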
@devilgamingandmusic9841 · 3 months ago
My man, you popped up in my recommendations for both of my accounts and I am glad.
@好了-t4d · 3 months ago
I have watched the video through once, and I will rewatch it when I have free time, because some of the concepts I don't entirely understand 😊 Good video, appreciate your work.
@123ah_bun · 2 months ago
I like your video, as I often get confused while watching other linear algebra videos. The recap part in the beginning is very neat.
@frostwinter8775 · 3 months ago
I often find math videos hard to follow, but I really like this one! My question is: why is preserving the dot product in an orthogonal transformation special? Are there any more interesting properties we can get out of it?
@samlevey3263 · 3 months ago
Here's some more reading on orthogonal matrices. Reflections and rotations are pretty useful for all sorts of things :)
@shimamooo · 3 months ago
Most matrices do not preserve the dot product; that's what makes it special. Another special property that arises from preserving the dot product: orthogonal matrices are linear transformations that simply rotate/reflect a vector, and don't change its length (orthogonal matrices are norm-preserving). As an example, suppose v = (1, 1) and we apply the orthogonal matrix A that rotates 90deg counterclockwise (what is this 2x2 orthogonal matrix A?). Then, Av = (-1, 1) as expected. Notice how v and Av have the same norm. Now notice that the norm of a vector v is sqrt(v dot v). Thus the fact that orthogonal transformations preserve the dot product leads to our interesting property that orthogonal transformations are norm-preserving.
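[Editor's note] The norm-preservation claim in the comment above can be verified in a few lines; this sketch uses the 90° counterclockwise rotation matrix [[0, −1], [1, 0]] as the orthogonal matrix.

```python
# Orthogonal matrices preserve the dot product, hence the norm:
# rotating v = (1, 1) by 90° CCW leaves v·v unchanged.

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[0.0, -1.0],
     [1.0,  0.0]]   # 90° CCW rotation; columns are orthonormal

v  = [1.0, 1.0]
Av = matvec(A, v)

assert Av == [-1.0, 1.0]          # (1, 1) rotated 90° CCW
assert dot(Av, Av) == dot(v, v)   # dot product (and norm) preserved
```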
@godfreypigott · 3 months ago
@@samlevey3263 _"Here's some more reading on orthogonal matrices."_ Does that mean you intended to include a link with your post?
@samlevey3263 · 3 months ago
@@godfreypigott Whoops, I meant to link to the Wikipedia page :) en.wikipedia.org/wiki/Orthogonal_matrix
@korigamik · 3 months ago
thanks man, really love that you share the source code as well!
@LilithumDrone · 3 months ago
It is good that you talked about the inverse transpose matrix and used the SVD to show it. And I think there is a more visually intuitive way to show the geometric relationship between them.

Say we have a full rank 3×3 matrix {a1 | a2 | a3} (the aᵢ are column vectors) and its inverse transpose {a1' | a2' | a3'}. We find that:
a1' is perpendicular to the plane span{a2, a3}, while the dot product of a1 and a1' is 1
a2' is perpendicular to the plane span{a1, a3}, while the dot product of a2 and a2' is 1
a3' is perpendicular to the plane span{a1, a2}, while the dot product of a3 and a3' is 1

In fact, in crystallography, if a1, a2, a3 are the basis of some crystal's lattice, then a1', a2', a3' happen to be the basis of its reciprocal lattice. However, in crystallography textbooks the inverse-transpose relationship is hardly mentioned; rather, a1', a2', a3' are defined as:
a1' = (a2 × a3) / det {a1 | a2 | a3}
a2' = (a3 × a1) / det {a1 | a2 | a3}
a3' = (a1 × a2) / det {a1 | a2 | a3}

See, with such a definition we can easily get the perpendicularity properties I mentioned, but it is hard to notice the inverse-transpose nature of the matrix {a1' | a2' | a3'}. Could you make a video to visualize the connection between the reciprocal lattice, the inverse transpose matrix, and the intuition that each column of the inverse transpose is perpendicular to the other two columns of the original matrix? That would really help a lot.
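[Editor's note] The biorthogonality in the comment above checks out numerically. This sketch builds the reciprocal basis from the cross-product formulas (i.e., the columns of the inverse transpose) for a hypothetical full-rank lattice basis and verifies aᵢ·aᵢ' = 1 and the perpendicularity claims.

```python
# Reciprocal basis via cross products: a1' = (a2 × a3)/det, cyclically.
# These are exactly the columns of the inverse transpose of {a1 | a2 | a3}.

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# A hypothetical (full-rank) lattice basis, as three column vectors.
a1 = [1.0, 0.0, 0.0]
a2 = [0.5, 1.0, 0.0]
a3 = [0.3, 0.2, 2.0]

d = dot(a1, cross(a2, a3))   # det of {a1 | a2 | a3} (scalar triple product)

r1 = [x / d for x in cross(a2, a3)]   # reciprocal vectors = columns of (A⁻¹)ᵀ
r2 = [x / d for x in cross(a3, a1)]
r3 = [x / d for x in cross(a1, a2)]

for a, r in [(a1, r1), (a2, r2), (a3, r3)]:
    assert abs(dot(a, r) - 1.0) < 1e-12          # paired dot products are 1
assert abs(dot(r1, a2)) < 1e-12 and abs(dot(r1, a3)) < 1e-12  # r1 ⊥ a2, a3
assert abs(dot(r2, a1)) < 1e-12 and abs(dot(r2, a3)) < 1e-12  # r2 ⊥ a1, a3
assert abs(dot(r3, a1)) < 1e-12 and abs(dot(r3, a2)) < 1e-12  # r3 ⊥ a1, a2
```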
@gowthamjothiramalingam6911 · 2 months ago
You are going to blow up to millions very quickly... mark my words!! Commenting here to get at least thousands of likes from a million views 😉😁
@padraiggluck2980 · 2 months ago
An excellent presentation. It must have taken a lot of work to put together. ⭐️
@ThuocGiamDau · 19 days ago
This video is so beautiful !
@mostshenanigans · 3 months ago
If I had studied more and thought more like in this video, instead of just memorizing the formulas back in college, I would have become... about the same me, but slightly more intellectually superior.
@johnstuder847 · 2 months ago
Thank you for the video. Very clear explanation of transpose and SVD. I am motivated by applications… The SVD is so insanely powerful! Could you make a video that illustrates how the unitary rotation/scale/rotation of the SVD solve a problem? That would be so helpful! Thank you for sharing.
@tomasnuti9868 · 3 months ago
Amazing video!! Thanks
@spiderjerusalem4009 · 3 months ago
Axler explored this from a pure algebraic point of view 👍🏻
@veraphine · 1 month ago
Excellent exposition!!!
@jotaro6390 · 3 months ago
Please continue creating videos
@andresyesidmorenovilla7888 · 3 months ago
Simply beautiful. Thank you so much!
@mtirado · 1 month ago
This was very well done!
@Lexxaro · 2 months ago
I don't really get the premise of the explanation. Wouldn't there be infinitely many new vectors v bar that satisfy x dot v = x bar dot v bar with x bar = A x, and thus infinitely many matrices to get us there? What is so special about the transpose of the inverse of A then? It does of course satisfy the equation, but so could infinitely many other matrices, no?
@samlevey3263 · 2 months ago
For a given fixed x that is true, but if you want to use the same matrices for any arbitrary x and v, then you have to use A and the A-inverse-transpose.
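[Editor's note] The point in this reply can be illustrated numerically: the identity x·v = (Ax)·((A⁻¹)ᵀv) holds for every pair x, v at once, which is what singles out the inverse transpose. A minimal sketch with a hypothetical invertible 2×2 matrix (its inverse transpose computed by hand):

```python
# x·v = (A x)·((A⁻¹)ᵀ v) for arbitrary x and v, not just one fixed pair.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

A = [[2.0, 1.0],
     [0.0, 3.0]]                      # det A = 6

# (A⁻¹)ᵀ, computed by hand: A⁻¹ = (1/6)·[[3, -1], [0, 2]], then transposed.
A_inv_T = [[0.5, 0.0],
           [-1.0 / 6.0, 1.0 / 3.0]]

for x, v in [([1.0, 2.0], [3.0, 4.0]),
             ([-2.0, 5.0], [0.5, 7.0]),
             ([1.0, 0.0], [0.0, 1.0])]:
    lhs = dot(x, v)
    rhs = dot(matvec(A, x), matvec(A_inv_T, v))
    assert abs(lhs - rhs) < 1e-9      # dot product preserved for every pair
```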
@Lexxaro · 2 months ago
@@samlevey3263 Ok, that makes sense. Thanks for the response!
@nikbl4k · 3 months ago
@15:00 ... I like the 3D ring/cascade illustration that suddenly appears. I have to rewatch it to see where it came from, but kudos in advance.
@garfieldgray6746 · 3 months ago
Impressive!
@samlevey3263 · 3 months ago
Thanks!
@chainetravail2439 · 2 months ago
Is the choice of x and v as names for the vectors, instead of u and v, a deliberate choice so that students don't mess up because u and v are written very similarly? If yes, this is another sign of the care you put into the video, which is very good.
@SchienexKun · 1 month ago
Can you explain what a cofactor is, geometrically?.. ❤
@williammartin4416 · 3 months ago
Is the manim code for this video available?
@samlevey3263 · 3 months ago
Thanks for the question, I've just uploaded it here: github.com/slevey087/transpose-video
@williammartin4416 · 3 months ago
@@samlevey3263 Thanks!
@randompuppy789 · 3 months ago
Great video.
@SobTim-eu3xu · 3 months ago
Great video❤😊 I have no words❤❤
@imsleepy620 · 3 months ago
The algebraic operation still seems a bit mysterious. I would imagine there's more to a transpose hidden in its algebraic operation?
@samlevey3263 · 3 months ago
What do you have in mind?
@marcusbluestone2822 · 3 months ago
Great video
@berkanc_1436 · 1 month ago
underrated af
@leeris19 · 3 months ago
Bro I remember last month going down a rabbit hole on this thing
@samlevey3263 · 3 months ago
Yeah, that's what happened to me too, and I figured I'd report back 😅
@ominollo · 2 months ago
Beautiful 😻
@MissPiggyM976 · 3 months ago
Well done, many thanks!
@Nerdwithoutglasses · 2 months ago
Having a hard time with the background music. If you are not sure whether the music is too loud, you should remove it. The content is nice though. Hope you do better next time.
@FriedRatBurrito · 3 months ago
I like this video
@zaccandels6695 · 3 months ago
Excellent video.
@tricanico · 3 months ago
The constant background music is unnecessary and distracting. Great video though. Thanks for sharing.
@shahulrahman2516 · 3 months ago
Great video
@manfredbogner9799 · 3 months ago
Very good
@xaviergonzalez5828 · 3 months ago
Great! New subscriber!
@omridrori3286 · 2 months ago
Wowwwwwwwww ❤❤❤❤❤❤
@lexellyx_3827 · 3 months ago
Okay, I really want to thank you for this content. It's incredible. I saw it in class and was a little lost as to the true meaning of these formulas. This video was perfect for that. Keep going 🫶🏻
@andro_8085 · 3 months ago
Awesome video
@samlevey3263 · 3 months ago
Thanks!