Check out ProPrep with a 30-day free trial to see how it can help you improve your performance in STEM subjects: www.proprep.uk/info/TOM-Crawford
I subscribed to your channel a couple of months ago but hadn't watched a single video until this one showed up on my home page. This is the best presentation and proof of the spectral theorem I have seen. Beautiful logic and clarity of thought. Thank you.
I guess the base case for the induction here is A = [a], a 1×1 (automatically symmetric) matrix whose single entry is any real number a. Then R = [1] with R^-1 = R^T = [1], so that R^T A R = [1][a][1] = [a], which is diagonal, as every 1×1 matrix is.
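To see the theorem in action beyond the 1×1 base case, here's a minimal sketch (plain Python; the function name `diagonalise_2x2` is my own, not from the video) that explicitly diagonalises a 2×2 symmetric matrix A = [[a, b], [b, c]] by finding an orthogonal R with R^T A R diagonal:

```python
import math

def diagonalise_2x2(a, b, c):
    """Return (R, eigenvalues) for the symmetric matrix [[a, b], [b, c]]."""
    if b == 0:
        # Already diagonal: R is the identity.
        return [[1.0, 0.0], [0.0, 1.0]], [a, c]
    # Eigenvalues from the characteristic polynomial
    # lambda^2 - (a + c) lambda + (ac - b^2) = 0.
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(tr * tr - 4 * det)  # = sqrt((a - c)^2 + 4b^2) >= 0
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    # Unit eigenvector for lam1: the first row of (A - lam1 I) v = 0
    # gives v proportional to (b, lam1 - a).
    n1 = math.hypot(b, lam1 - a)
    v1 = (b / n1, (lam1 - a) / n1)
    # Since A is symmetric with distinct eigenvalues here, the second
    # eigenvector is simply orthogonal to the first.
    v2 = (-v1[1], v1[0])
    # Columns of R are the orthonormal eigenvectors.
    R = [[v1[0], v2[0]], [v1[1], v2[1]]]
    return R, [lam1, lam2]
```

For example, `diagonalise_2x2(2.0, 1.0, 2.0)` gives eigenvalues 3 and 1, and the returned R satisfies R^T A R = diag(3, 1) up to floating-point error.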
Fella, I was always told I was thick as muck at Maths at school. Yelled at by my teachers, and sent out of the class for not understanding algebra. Would have really loved someone like you to inspire me. Instead I've been terrified of Maths my whole life.
In case anyone wants to know why the first statement of part (II) is equivalent to saying there is an orthogonal matrix R such that R^-1 A R is diagonal, the intuition can be found in 3b1b's video about eigenvectors, starting from here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-PFDu9oVAE-g.html Thanks a lot for this really clear proof, Tom - there are loads of examples online, but thanks for actually walking us through it :)
Do I understand correctly that _v'_ is the component-wise conjugate, i.e. _v = (a + bi, c + di) => v' = (a - bi, c - di)?_ If so, is the inner product of v with its conjugate v', i.e. _v^T v',_ really equal to the inner product of v with itself, i.e. _v^T v,_ as shown at ~10:55?
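To see the distinction this comment is asking about, here's a small sketch using Python's built-in complex numbers (variable names are my own): for a complex vector v, the quantity v^T v' = Σ v_i · conj(v_i) = Σ |v_i|² is always real (and positive for v ≠ 0), whereas the "plain" product v^T v = Σ v_i² can be genuinely complex:

```python
# A complex vector and its two candidate "inner products with itself".
v = [1 + 2j, 3 - 1j]

# Sum of v_i * conj(v_i) = sum of |v_i|^2: always real, equals |v|^2.
dot_with_conjugate = sum(x * x.conjugate() for x in v)

# Sum of v_i^2: no conjugation, so this is generally complex
# and is NOT an inner product on complex vectors.
plain_dot = sum(x * x for x in v)

print(dot_with_conjugate)  # (15+0j): real
print(plain_dot)           # (5-2j): complex
```

So the two only coincide when v happens to be real, which is exactly why the conjugate appears in the complex inner product.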
Isn't the proof by induction a bit of overkill here? ;-) Just considering e_i instead of e_1 and deducing that A_{i,i} = 1, and A_{i,j} = 0 for i ≠ j, does the trick, no?
What does part II of the theorem say if R were replaced by C or some other field? Is the proof any different if it were done for linear maps between arbitrary inner-product spaces instead of Euclidean spaces? And what does the theorem say if the dimension is infinite?
The theorem also works over C, but you need to change from symmetric matrices to Hermitian matrices (i.e. matrices that equal their *conjugate* transpose). The proof works in the same way for arbitrary inner product spaces. If the dimension is infinite, one essentially gets into functional analysis and there are various spectral theorems - e.g. the same statement as the basic spectral theorem holds for compact self-adjoint operators on a (real or complex) Hilbert space. Generalising beyond that, in order for the statement to remain true, you also have to generalise your notion of eigenvectors, and this rapidly gets rather complicated.
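To illustrate the Hermitian case concretely, here's a minimal sketch (plain Python; the function name is my own) computing the eigenvalues of a 2×2 Hermitian matrix [[a, b], [conj(b), c]] with a, c real. The point is that the discriminant (a - c)² + 4|b|² is non-negative, so the eigenvalues come out real, just as in the real symmetric case:

```python
import math

def hermitian_eigenvalues(a, b, c):
    """Eigenvalues of the 2x2 Hermitian matrix [[a, b], [conj(b), c]],
    where a and c are real and b may be complex."""
    tr = a + c
    det = a * c - abs(b) ** 2            # real, since |b|^2 is real
    # tr^2 - 4 det = (a - c)^2 + 4|b|^2 >= 0, so both roots are real.
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2
```

For instance, the Hermitian matrix [[1, i], [-i, 1]] has eigenvalues 2 and 0, both real even though the matrix has complex entries.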