Eigenvalue Power Method | Lecture 30 | Numerical Methods for Engineers 

Jeffrey Chasnov
82K subscribers
38K views

Published: Sep 7, 2024

Comments: 22
@simonebozzardi1301 2 years ago
Our Lord and Savior Jeffrey Chasnov. There's no way I'd be able to understand all of this by myself. You truly are a blessing to all of us self-taught students out here.
@lifelyrics5659 1 year ago
What alien language did he just say?
@nerium1440 1 year ago
The convergence part was super helpful, way better explained than in other sources.
@AjdinKocan 4 months ago
Indeed, this man is single-handedly responsible for me understanding this; everyone else explains it like it's Defense Against the Dark Arts.
@mariomariovitiviti 12 days ago
The "trick" is used to reduce the dimensions down to a scalar :)
@marcellocali8985 1 year ago
Why is there the X_p transposed when we solve for lambda_1? I can't understand this part.
@OlehAbramov 1 year ago
It took me some time to figure it out, but the idea is that we simply take the projection of X_p+1 onto X_p, so the X_p transpose is just part of the projection formula. The projection finds the coefficient which, when multiplied by X_p, produces the scaled version of X_p that is closest to X_p+1. We know that when p is sufficiently large, both X_p+1 and X_p are simply multiples of e_1, so they are collinear, which means X_p+1 can be expressed as X_p times some coefficient, and that is exactly the coefficient the projection gives us. Now if we compare the formulas for X_p+1 and X_p, the only difference is an extra factor of lambda, so lambda_1 equals the projection coefficient of X_p+1 onto X_p.
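For anyone who wants to see this numerically, here is a minimal sketch of the power method in Python with NumPy, estimating the dominant eigenvalue via the projection coefficient x_p^T x_{p+1} / (x_p^T x_p) described above. The test matrix, starting vector, and iteration count are illustrative choices, not taken from the lecture:

```python
import numpy as np

def power_method(A, num_iters=50):
    """Estimate the dominant eigenvalue and eigenvector of A.

    Each iteration multiplies by A and normalizes; the eigenvalue
    estimate is the projection coefficient of x_{p+1} onto x_p.
    """
    x = np.ones(A.shape[0])  # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(num_iters):
        x_next = A @ x
        # lambda_1 ~ (x_p^T x_{p+1}) / (x_p^T x_p), the projection coefficient
        lam = (x @ x_next) / (x @ x)
        x = x_next / np.linalg.norm(x_next)  # normalize to avoid overflow
    return lam, x

# Example: the dominant eigenvalue of [[2, 1], [1, 2]] is 3
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_method(A)
print(lam)  # ~ 3.0
```

Once the iterates are (numerically) collinear, the projection coefficient stops changing, which is a convenient convergence check.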
@omerjunedi5874 1 year ago
When did Mike Ehrmantraut become a math professor?????
@TheTylerdurden152 3 years ago
Did you learn how to write backwards just so you could lecture from behind a glass pane and not be in the way of the camera/student view? If so, I commend educators like yourself. Thank you.
@ProfJeffreyChasnov 3 years ago
Haha!
@minahany5894 2 years ago
I think it's more like he has the glass pane between himself and the camera, and the camera simply reflects everything he writes.
@osamaelzubair1203 11 months ago
How did you get the lambda_1 formula? Where did the X-transpose come from?
@mariomariovitiviti 12 days ago
In order to reduce to a scalar, you use the dot product :)
@bellacrazyable 1 year ago
A genius, thank you so much!
@Ralster 10 months ago
At 2:15 you say that the n eigenvectors are linear combinations of the other n vectors, but they're also linearly independent? How can they be both?
@lucrece4836 9 months ago
As far as I understand, he says that the column vectors can be written as linear combinations of the eigenvectors, since the eigenvectors span R^n.
@sunnytian4765 10 months ago
Literally watching this minutes before my exam because I couldn't understand my lecture.
@SvetiK1324 2 years ago
Thank you!!
@JohnJTraston 1 year ago
Great! I can see your head really well. Just not the text in front of you.
@jaholt11 1 year ago
Are you writing backwards on a glass pane you're standing behind, or is it some camera trickery?
@megdutton7327 1 year ago
Pretty sure he's writing normally, then when he edits the video he mirrors it so the writing faces us.
@ashwiniashu254 2 years ago
Hello sir... I have a doubt about the power theorem... so can I talk with you, sir?