(ML 15.2) Newton's method (for optimization) in multiple dimensions 

mathematicalmonk
92K subscribers
73K views

Published: 27 Oct 2024

Comments: 33
@yupeng8847 7 years ago
For the first time, I understand why I need the Hessian matrix when I use Newton's method. Great video!!!
@ВладиславГнип-д4т 3 years ago
Absolutely fantastic video, finally understood how this method should work for any other functions, not only quadratic ones.
@eigenchris 10 years ago
I've been looking for this derivation. Thanks yet again for a wonderful video. :)
@ShortSnoph 3 years ago
wow it's eigenchris
@aurkom 2 years ago
Amazing, crisp and clear explanation.
@helenjones8779 12 years ago
Excellent video. Thank you very much for sharing. This video gave me tons of help in understanding and doing my homework. God bless.
@BerkayCelik 11 years ago
Note that in some notations the gradient term is also written as (x-a)^T times the gradient; here you took the transpose of the gradient instead, and the two are the same since the product is a scalar.
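
For reference, the identity this comment points at, written out in standard Taylor-expansion notation (my own restatement, not copied from the video):

```latex
% Second-order Taylor expansion of f around the point a:
f(x) \approx f(a) + \nabla f(a)^{\mathsf{T}}(x-a)
       + \tfrac{1}{2}(x-a)^{\mathsf{T}} H_f(a)\,(x-a)
% The linear term is a scalar, so it equals its own transpose:
\nabla f(a)^{\mathsf{T}}(x-a) = (x-a)^{\mathsf{T}}\nabla f(a)
```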
@IslamEldifrawi 2 years ago
An awesome video, thanks a lot!
@AnkurDeka 7 years ago
That was very concise and clear. Thanks a lot!
@chen-fay-4614 2 years ago
Thank you! Very helpful.
@iamstein 10 years ago
Nice video. If you ever get a chance to add Gauss-Newton, that'd be awesome! I can do the math for it, but I don't quite get the intuition.
@MichaelZibulevsky 6 years ago
See Gauss-Newton at ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-cxh1RWMRc1Q.html
@SmilerBFC 9 years ago
How do you differentiate a transpose!!
@craigmatthews4517 6 years ago
Not seeing how you came up with -2a^T H x as the middle term at 4:22?
@nay__im 6 years ago
It's derived from x^T H (-a) + (-a)^T H x. These two terms are equal: x^T is 1 x n, H is n x n, and a is n x 1, so each product is a 1 x 1 matrix, i.e. a scalar. A scalar equals its own transpose, so the first term can be rewritten as (-a)^T H x, and the sum becomes (-a)^T H x + (-a)^T H x = -2a^T H x. (See the numeric check after this thread.)
@gautamsharma4288 5 years ago
It works because H is symmetric.
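
A quick numeric sanity check of the simplification discussed in this thread (my own NumPy sketch, not code from the video; the matrix and vectors are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
H = (M + M.T) / 2            # symmetric, as Hessians of smooth functions are
x = rng.standard_normal(n)
a = rng.standard_normal(n)

# The two cross terms from expanding (x - a)^T H (x - a):
cross = x @ H @ (-a) + (-a) @ H @ x
# The claimed simplification, using "scalar = its own transpose" and H = H^T:
simplified = -2 * (a @ H @ x)
assert np.isclose(cross, simplified)
```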
@ShreyanGoswami 8 years ago
I did not understand where the Newton-Raphson method comes into the picture. We approximated the function using a Taylor series and found the minimum. Don't we stop after that?
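
For anyone with the same question: minimizing the quadratic model once only lands at the minimizer of the approximation, not of f itself (unless f is exactly quadratic), so the method repeats the step from the new point. A minimal NumPy sketch of that loop (my own illustration, not code from the video; grad and hess are hypothetical callables returning the gradient and Hessian):

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method for minimization: repeatedly jump to the
    minimizer of the local quadratic (second-order Taylor) model."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # gradient ~ 0: stationary point
            break
        d = np.linalg.solve(hess(x), -g)   # model minimizer solves H d = -g
        x = x + d
    return x

# Example: f(x) = x1^4 + x2^2 is not quadratic, so several steps are needed.
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
print(newton_minimize(grad, hess, x0=[1.0, 1.0]))   # approaches [0, 0]
```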
@danielemingolla 3 years ago
Could someone explain how he calculated the derivative of q(x), please? How does he go from b^T x to b after differentiation?
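
The rule being asked about is the standard identity grad(b^T x) = b: the i-th partial derivative of b^T x = sum_j b_j x_j is just b_i. A finite-difference check (my own NumPy sketch, not from the video):

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps = 5, 1e-6
b = rng.standard_normal(n)
x = rng.standard_normal(n)

f = lambda x: b @ x   # f(x) = b^T x

# Central differences approximate each partial derivative of f at x.
num_grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                     for e in np.eye(n)])
assert np.allclose(num_grad, b)   # the gradient of b^T x is b itself
```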
@김동완-j8j 1 year ago
Thank you 😃😃
@MyAmrutha 13 years ago
Very good tutorial!!!
@JKG114 6 years ago
Do you need to calculate the inverse of the Hessian? Would the quadratic curve you'd get from going in the direction of the gradient lead to the minimum of the approximating surface?
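
On the first question: in practice the inverse is not formed explicitly; the Newton step d = -H^{-1} g is computed by solving the linear system H d = -g, which is cheaper and numerically safer. A small sketch of the equivalence (my own illustration, with a made-up Hessian and gradient):

```python
import numpy as np

H = np.array([[4.0, 1.0], [1.0, 3.0]])    # a symmetric positive-definite Hessian
g = np.array([1.0, 2.0])                  # gradient at the current point

step_via_inverse = -np.linalg.inv(H) @ g  # what the formula literally says
step_via_solve = np.linalg.solve(H, -g)   # what implementations actually do
assert np.allclose(step_via_inverse, step_via_solve)
```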
@avibank 7 years ago
Fantastic!
@MrSukalpo 11 years ago
I think the step size is missing, usually gamma in standard notation, which is chosen to ensure that the Wolfe conditions are satisfied at each step...
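
For context: the pure Newton step uses step size 1; a damped variant scales the step by a gamma found by line search. The sketch below backtracks on the sufficient-decrease (Armijo) condition only, which is the first of the two Wolfe conditions; a full Wolfe search also checks curvature. This is my own illustration, not code from the video; f, grad, and hess are hypothetical callables:

```python
import numpy as np

def damped_newton_step(f, grad, hess, x, c1=1e-4):
    """One damped Newton step: backtrack gamma until the Armijo
    (sufficient-decrease) condition holds."""
    g = grad(x)
    d = np.linalg.solve(hess(x), -g)   # pure Newton direction
    gamma = 1.0
    while f(x + gamma * d) > f(x) + c1 * gamma * (g @ d) and gamma > 1e-12:
        gamma /= 2                     # halve the step until enough decrease
    return x + gamma * d

# Example on f(x) = x1^4 + x2^2:
f    = lambda x: x[0]**4 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
print(damped_newton_step(f, grad, hess, np.array([1.0, 1.0])))
```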
@JonDavidOrr 8 years ago
I'm having a hard time understanding what c will be in actual applications of this method, such as for some given f of two variables x1 and x2. Oh well.
@stevenhawking3637 7 years ago
Thanks.
@MBasaar 9 years ago
this is all nonsense... why don't you show an example, brother?
@paramsraman3948 9 years ago
What's wrong with you? What examples do you need? This is a brilliant intuition and explanation of the math. Go and read other optimization books/tutorials and you will realize how easy this video makes it to understand the concepts. Please stop making comments if you can't say something meaningful.
@MBasaar 9 years ago
Okay, sorry... thanks btw for the vids :)
@SooZoodimp 8 years ago
are you on your period or what... what is this changing of moods?
@Leon-pn6rb 7 years ago
did not understand
@muuubiee 2 years ago
This is probably the worst video to date about this. You're transposing a single variable vector???