
24. Markov Matrices; Fourier Series 

MIT OpenCourseWare
5M subscribers
112K views

Published: 29 Sep 2024

Comments: 93
@mitocw · 5 years ago
Audio channels fixed!
@snownontrace · 4 years ago
It would be nice to fix other audio channels as well :D
@quirkyquester · 4 years ago
Thank you!
@abdulmukit4420 · 3 years ago
In the middle of my PhD, with all the stress, Dr. Strang's lectures are the only relaxing time I have. Nothing else really feels so great. What a legend. I wish I could attend his lectures physically.
@mississippijohnfahey7175 · 2 years ago
You should.. uh.. get a guitar or maybe go on a hike or learn to cook a new dish from scratch. Glad to hear Gilbert's lectures are peaceful for you, but the stress of academia can be monumental and debilitating over time. Hobbies outside of academia can save your life. Fishing is a great hobby--lots of time to think about science or whatever you want, but you're exercising, and spending time in nature.. best of luck on your degree!!!
@seungchullee221 · 1 year ago
@@mississippijohnfahey7175 great!
@NeyVasconcellosJr · 4 months ago
Yes. You are right
@euler12 · 27 days ago
Does it make sense that in the middle of my PhD I put everything on hold and binge-followed lectures + book chapters + exercises to once and for all fill the gaps in my knowledge in crucial areas in wireless and DL? It is taking time, but it feels like time well spent.
@jeffery777 · 2 years ago
Thank you Professor Strang! But I think at 14:43 it is in the nullspace of (A-I)^T rather than the nullspace of A^T, isn't it?
@kulikeke9386 · 1 year ago
I cannot agree more. A little mistake.
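A quick numerical check of the point raised in this thread, in Python (a minimal sketch; the 3x3 matrix is only an illustration, not necessarily the one on the board):

import numpy as np

# Columns of a Markov matrix sum to 1, so (1,1,1)A = (1,1,1): the vector
# (1,1,1) lies in the left nullspace of A - I, not in the left nullspace of A.
A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])
ones = np.ones(3)

print((A - np.eye(3)).T @ ones)  # ~ [0, 0, 0]: (1,1,1) is in N((A - I)^T)
print(A.T @ ones)                # [1, 1, 1]: clearly not in N(A^T)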
@jefthervieira1 · 4 years ago
Strang: That is the first time in the history of linear algebra in which an eigenvalue has a component 3300. LA: *Oh, not today, no!*
@MalgosO · 3 years ago
I finished this course a while ago. Still, going through Markov chains in probability was confusing till I came back and watched this, and again Mr Strang came to the rescue. I don't know how much I need to thank you for your online courses for it to be enough, as simply saying thank you doesn't do you justice. I just want to say that you, Mr Strang, are one of a minority of people who indeed make this world a better place. Thank you, from some corner of this Earth.
@mayimark · 11 months ago
same boat
@henryzhu7309 · 4 years ago
This lecture is amazing. Prof. Strang gives an intuitive perspective on Fourier series and how it is related to orthogonal vectors. I knew the Fourier transform pretty well, but now I have a deeper understanding of it.
@yuchujian8837 · 2 years ago
I really appreciate the way he related the Fourier series to orthogonal vectors. Before watching this video, all I knew was memorizing the formula.
@SachinKumar-dy4hh · 10 months ago
My professor in signals and systems explained in a single class all the prerequisites from linear algebra leading up to Fourier series; I can guarantee you that not a single person, unless they had already mastered linear algebra, understood her. This guy is the most amazing professor I've ever seen; he makes complex things really simple. @@yuchujian8837
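A concrete illustration, in Python, of the coefficients-as-projections idea discussed in this thread (a minimal sketch with a made-up test function, not taken from the lecture):

import numpy as np

# On [0, 2*pi] the functions 1, cos(kx), sin(kx) are mutually orthogonal,
# so each Fourier coefficient is an inner product (integral) with the
# corresponding basis function, just like c_k = q_k^T v for orthonormal q_k.
x = np.linspace(0, 2*np.pi, 200001)
f = 3 + 2*np.cos(x) - 5*np.sin(2*x)        # hypothetical test function

a0 = np.trapz(f, x) / (2*np.pi)            # constant term  -> ~ 3
a1 = np.trapz(f*np.cos(x), x) / np.pi      # cos(x) coeff   -> ~ 2
b2 = np.trapz(f*np.sin(2*x), x) / np.pi    # sin(2x) coeff  -> ~ -5
print(a0, a1, b2)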
@-LSC · 4 years ago
20:28 when you're about to say "Oh f***!" and you remember the camera is rolling! Haha gotta love Prof. Strang.
@douglasstrother6584 · 4 years ago
Professor Strang's introduction to projections using Fourier series as an example generalizes to other orthogonal functions (Legendre polynomials, Bessel functions, etc.). This is pretty cool, because you'll see this stuff ad nauseam in electrodynamics, quantum mechanics and statistical mechanics.
@douglasstrother6584 · 4 years ago
Taking a measurement on a Quantum System (COLLAPSING THE WAVEFUNCTION!!) is a physical dot product.
@EigenCharlie · 4 years ago
Which of those topics do you think are useful for quantum computing? (applied quantum mechanics)
@baswanthoruganti7259 · 4 years ago
Yes, the collapse of the wavefunction is a central concept in quantum theory. Measurement can be understood as projection of the wavefunction onto one of the orthonormal basis states, i.e., taking the dot product between the wavefunction and a basis state.
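A tiny numerical sketch of that projection picture, in Python (the state and basis below are made up for illustration, not anything from the lecture):

import numpy as np

# A normalized 2-component state and an orthonormal basis; the measurement
# probabilities are the squared magnitudes of the projections (dot products).
psi = np.array([1.0, 1.0]) / np.sqrt(2)              # hypothetical state
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

probs = [abs(b @ psi)**2 for b in basis]
print(probs)   # [0.5, 0.5], and the probabilities sum to 1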
@rogiervdw · 4 years ago
My god, this is so beautiful. So much insight making things fall into place. "Finding coefficients in a Fourier series is exactly like an expansion in an orthonormal basis." Now I get it!
@baswanthoruganti7259 · 4 years ago
I completely understood in math terms the concept of wavefunction collapse by measurement (projection) after watching this brilliant lecture...
@zarehdarakjian2476 · 3 years ago
Professor Strang: You have the amazing ability to bring out new rabbits out of old hats! Hats off!
@tabrisvan1319 · 3 years ago
14:38, shouldn't that (1,1,1) be in the left nullspace of (A - I) instead of the left nullspace of A, as demonstrated in the tape?
@daniel_liu_it · 3 years ago
Yeah, I had this problem too 😂 he may have been wrong
@yuanheli8566 · 3 years ago
yes, he might be wrong
@Antonio_Serdar · 3 years ago
Yes
@berglingmurphy7899 · 1 month ago
thx bro😉
@joachimschaeffer9541 · 3 years ago
This is the moment in time where I have to say thank you! In my opinion this lecture connects everything that happened until now beautifully and builds the foundation for solving most advanced engineering problems.
@ivanmtz7440 · 4 years ago
Great lectures! I just wanted to comment because I am learning a lot from all his lectures! And I already have a degree in physics!
@RC-bm9mf · 4 years ago
Cool! Good to know that, how encouraging! Did you get a PhD or a BS?
@muonneutrino_ · 3 months ago
Prof. Strang's lectures are legendary 😭I've been following this course for a while and it has been a delight to see the concepts unfold in such an elegant and coherent way
@neoneo1503 · 3 years ago
The link between orthonormal vectors and trigonometric functions through the example of Fourier series! A great example, with a great connection between algebra and functions, at 44:00. Thanks!
@armenmkrtumyan6675 · 4 years ago
I am completely lost in the last two lectures
@quirkyquester · 4 years ago
Me too haha. I guess it's because it's somehow related to calculus or differential equations; if you are not quite familiar with those topics, it might make it harder to see what the lecture is actually trying to prove, and the actual use of these matrices. However, you'll probably remember this stuff when you actually need it in the future; then you can pick up this knowledge again and hopefully solve the puzzle.
@santiagoarce5672 · 4 years ago
Ah man, I know that feeling. You get to the end of a lecture and realise that you haven't understood anything for several lectures. It can really be worth it to rewatch carefully.
@daniel_liu_it · 3 years ago
Here is the question: why did you come back to this lecture and leave a comment, lol?
@mohammedal-haddad2652 · 4 years ago
Great video, great lecture, great professor.
@kensaberu5983 · 3 years ago
Camera: I see you're learning about Markov matric- (3:43) Heyy nice shirt ma dude
@yuchenzhao6411 · 4 years ago
16:02 is it in N(A-I) instead of N(A)?
@mohammedal-haddad2652 · 4 years ago
Yes.
@mauriciobarda · 4 years ago
He's using A for both the original A and A - I; that's why he says (1,1,1) is in the left nullspace.
@shubhamtalks9718 · 4 years ago
yes.
@poiuwnwang7109 · 3 years ago
Yes, I figured the same thing.
@chiaochao9550 · 3 years ago
@@mauriciobarda No, (1,1,1) is in N(A^T); x1 is in N(A).
@delsonlee9212 · 2 years ago
It is worth pointing out that the A in the notation N(A) (16:00) is a generic reference to a matrix, not the Markov matrix used as the concrete example. Clear up this confusion and we can better understand why the eigenvector with eigenvalue 1 is in the null space of that 'A'.
@indraneel6601 · 2 months ago
He came. Put his hands in his pockets. Explained Fourier series using projections. He left. 🛐 Chad professor.
@ribamarsantarosa4465 · 9 months ago
45:08 ... and what would a "transpose" of a function be? Nevertheless, it's nice to listen to lessons in the voice of Frank Sinatra!! :)
@freeeagle6074 · 2 years ago
Professor Strang can almost always explain a concept from an angle other professors rarely touch upon. Excitingly, that angle seems always to be the right angle most appropriate to understand that concept.
@유현준-l5b · 4 years ago
44:25 Isn't the right-hand side of the vector equation weird? I think it must be V^T*W = W1*V1^(T) + ... so it becomes a dot product.
@berrycoolcat · 4 years ago
The notation is just kinda sloppy there. v1, v2,... are actually elements of the vector v
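Writing the analogy out explicitly (standard notation, in the spirit of the lecture):

$$v^{T}w = v_1 w_1 + v_2 w_2 + \cdots + v_n w_n, \qquad f^{T}g = \int_0^{2\pi} f(x)\,g(x)\,dx,$$

so the dot product of vectors becomes an integral of a product of functions.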
@danialnoorizadeh2689 · 3 years ago
I just came for the first 5 min then I watched it till the end! So engrossing!
@easwaranramamurthy7590 · 3 years ago
Can anyone say "u-nought" better than Gilbert Strang?
@ze2411 · 4 years ago
Prof Strang the G.O.A.T!!!
@rajershigpt · 4 years ago
This is certainly God level! At times he not only predicts the answers to himself but also to the audience, just to assure them that they can do it too. And then he pauses for a few extra seconds so that others can still follow!
@zhiweithean800 · 9 months ago
15:42 Shouldn't X1 be in N(A-I) instead of X1 in N(A)?
@lorendisney5068 · 4 years ago
The eigenvector calculation around minute 33 confused me until I saw how the zero in row three gives an extra degree of freedom in choosing the vector. The check is to multiply by row 2. A nice trick.
@saubaral · 4 years ago
Hey Yuxuan Wang! Congratulations on making it here :)
@amosbatalden5871 · 5 months ago
I'm taking 18.06 now, but Gilbert Strang retired just before I took the class. Very, very sad. His lectures here are perfect though, so I can just watch these in my dorm instead of walking all the way to lecture.
@elamvaluthis7268 · 4 years ago
You are the emancipation of algebra. Thank you, sir.
@seungchullee221 · 1 year ago
In the Fourier series explanation, I think (1, cos, sin, ...) are not unit vectors, because the inner product of cos with cos is pi, not 1. So I guess, to get an orthonormal basis, we would need to normalize the cos vector (divide it by sqrt(pi)). So, in my understanding, 'orthogonal basis' is the correct term for the functions (1, cos, sin, ...) as a basis for the set of all possible Fourier series (a vector space).
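Spelling out the normalization (a standard computation; note the basis function itself is scaled by $1/\sqrt{\pi}$, while the $1/\pi$ appears in the coefficient formula):

$$\int_0^{2\pi}\cos^2 x\,dx = \pi \;\Longrightarrow\; \lVert\cos x\rVert = \sqrt{\pi},$$

so an orthonormal basis would be $\{\,1/\sqrt{2\pi},\ \cos x/\sqrt{\pi},\ \sin x/\sqrt{\pi},\ \cos 2x/\sqrt{\pi},\ \dots\}$, while with the unnormalized orthogonal basis the projection gives $a_1 = \frac{1}{\pi}\int_0^{2\pi} f(x)\cos x\,dx$.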
@OmegaQuark · 10 months ago
My goodness, this is beautiful. The insights given by Prof. Strang, especially on Fourier series, are out of this world! Thank you very much for your intuition on these profound topics; it is very valuable.
@vivekdabholkar5965 · 11 months ago
Prof. Strang, you are truly amazing, even for a Ph.D., in terms of your lucid, casual but very deep explanations, which encourage thinking more than number crunching! It is really an honor to listen to your lectures!
@nandakumarcheiro · 3 years ago
Hawking radiation compels us to reconsider the black hole, as the singularity becomes an unsteady flow toward vaporisation, which in a way supports Einstein's refusal to accept the black hole as a singularity.
@mauriziolazzarini3018 · 3 years ago
I didn't know about Markov matrices; they are very interesting. So all n x n matrices with each entry equal to 1/n are Markov matrices, with eigenvalues 0 and 1 (1 because they are Markov matrices, 0 because their rank is equal to 1).
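A quick check of that claim in Python (a minimal sketch; n = 4 is an arbitrary choice):

import numpy as np

n = 4
A = np.full((n, n), 1.0 / n)        # every entry 1/n: columns sum to 1, rank 1
eigs = np.linalg.eigvals(A)
print(np.round(np.sort(eigs.real), 10))   # n-1 zeros and a single eigenvalue 1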
@satyajit1512 · 2 years ago
Little notation mistake at around 16 min: it should be the null space of (A-I)^T.
@chiaochao9550 · 3 years ago
35:17 Fourier Series
@daniel_liu_it · 3 years ago
dddd :D here i am
@hangjiang858 · 6 months ago
amazing lecture
@sistaseetaram9008 · 3 years ago
I have never understood Fourier series this clearly.
@michaeltesfaalem3446 · 5 months ago
free enlightenment
@mtlee8977 · 4 months ago
Those lectures are pure gold. Gorgeous!
@15997359 · 2 years ago
PLEASE DO NOT CODE A.I. TO USE ALL THIS FREE INFORMATION, ESPECIALLY MIT LECTURES ON YOUTUBE 😭😭🤣 I HOPE THE AVERAGE JOE TAKES NOTE AND STARTS STUDYING ASAP
@Robocat754 · 1 year ago
AI can now solve university-level math problems. But I doubt it can truly understand even basic arithmetic.
@agapedsky6087 · 9 months ago
Life-changing lecture, really.
@wasabibleach · 2 years ago
Great analogy of infinite basis for functions
@middlevoids · 1 year ago
Just beautiful
@bigskywing1019 · 2 years ago
He is a legend!
@WaDaSiYehSan · 5 years ago
20:23
@mrmr4737 · 4 years ago
Mind is officially blown.
@anandnadar3080 · 3 years ago
Brilliant!
@DerekWoolverton · 3 years ago
Some smart people moved from CA to MA, and some stayed in CA and made a billion. If only the probabilities had gone more my way.
@ControlTheGuh · 4 years ago
Thank you MIT
@sansha2687 · 4 years ago
03:15, 3:45
@utkarsh-21st · 4 years ago
Excellent!
@georgesadler7830 · 3 years ago
The invention and development of Google is due to Markov matrices. The things that we see and use every day in our lives are built on mathematical theory and concepts.
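The Google connection is PageRank, which models the web as a Markov chain and looks for the steady-state eigenvector (eigenvalue 1). Below is a minimal power-iteration sketch in Python on a made-up 3-page link matrix; it is purely illustrative, not Google's actual algorithm:

import numpy as np

# Column-stochastic link matrix: column j holds the probabilities of moving
# from page j to each page.  The steady state is the eigenvector with
# eigenvalue 1, approached here by repeated multiplication (power method).
P = np.array([[0.0, 0.5, 0.3],
              [0.5, 0.0, 0.7],
              [0.5, 0.5, 0.0]])

rank = np.ones(3) / 3          # start from the uniform distribution
for _ in range(100):
    rank = P @ rank            # one step of the Markov chain
print(rank)                    # converges to the steady-state ranking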