
L1.2 - Introduction to unconstrained optimization: first- and second-order conditions (vector case) 

aa4cc

A continuation of an introduction to unconstrained optimization within a course on "Optimal and Robust Control" (B3M35ORR, BE3M35ORR) taught at the Faculty of Electrical Engineering, Czech Technical University in Prague. We derive first- and second-order necessary and sufficient conditions of optimality for functions of vector arguments and discuss some caveats when extending the results based on the directional derivative (derivative along a fixed direction) to the full vector case.
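For reference, the conditions discussed in the video take the following standard form for a twice continuously differentiable f: R^n -> R (a summary in standard notation, not a verbatim transcript of the slides):

\nabla f(x^\ast) = 0 \quad \text{(first-order necessary condition)}

\nabla^2 f(x^\ast) \succeq 0 \quad \text{(second-order necessary condition: Hessian positive semidefinite)}

\nabla f(x^\ast) = 0 \ \text{and} \ \nabla^2 f(x^\ast) \succ 0 \quad \text{(second-order sufficient condition: } x^\ast \text{ is a strict local minimum)}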

Science

Published: 22 Aug 2024

Comments: 14
@Dominus_Ryder · 6 years ago
This lecture series was just what I needed, at a time when I needed it the most! Appreciate it!
@welidbenchouche · 5 years ago
Hey, great lectures. Just one thing: I think you forgot to add the square at the Hessian matrix at #5:30, the one in the box.
@aa4cc · 5 years ago
True, the upper index 2 is missing in the box. Note that it should not be interpreted as squaring. It is just one possible notation for the matrix of second (mixed) derivatives.
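For reference, the standard definition of that symbol is

\nabla^2 f(x) = \left[ \frac{\partial^2 f(x)}{\partial x_i \, \partial x_j} \right]_{i,j=1}^{n},

i.e., the Hessian matrix of second (mixed) partial derivatives; the superscript 2 marks second differentiation, not squaring.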
@ahmedgailani533 · 4 years ago
Thanks. What is meant by saying we stay within the distance of epsilon from the critical point?
@MaksymCzech · 4 years ago
Exactly that: you select epsilon and then stay in a neighborhood whose points lie within a distance less than epsilon from the critical point.
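In standard notation (not quoted from the video): x^\ast is a local minimum of f if there exists \varepsilon > 0 such that

f(x^\ast) \le f(x) \quad \text{for all } x \text{ with } \lVert x - x^\ast \rVert < \varepsilon.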
@sharachchandrabhat8428 · 2 years ago
Great lecture! I have a question about the caveat. The value alpha''(0) = d1^2 + ..., where d = (d1, d2) and alpha''(0) means alpha'' evaluated at alpha = 0. Since alpha''(0) = 0 for some d, namely when d1 = 0, the sufficient condition is not satisfied; hence alpha = 0 need not be a minimum. So why was the assertion made that alpha = 0 is a minimum when checked using the directional derivative?
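The exact example function from the video is not quoted in this thread. The caveat itself can be reproduced with a hypothetical stand-in such as f(x1, x2) = x1^2 + x2^3, whose restriction alpha(t) = f(t*d) also has a second derivative proportional to d1^2 at t = 0; a minimal sketch in Python:

import sympy as sp

t, d1, d2 = sp.symbols('t d1 d2', real=True)

# Hypothetical stand-in for the lecture's example: f(x1, x2) = x1^2 + x2^3,
# restricted to the line x = t*d with direction d = (d1, d2).
alpha = (t*d1)**2 + (t*d2)**3

# Second derivative of the restriction at t = 0:
print(sp.diff(alpha, t, 2).subs(t, 0))   # -> 2*d1**2, which vanishes when d1 = 0

# Along the direction d = (0, 1) the restriction is t**3, so t = 0 is not
# a minimum on that line; the directional sufficient test is inconclusive
# exactly where d1 = 0, and the origin is not a local minimum of f.
print(alpha.subs({d1: 0, d2: 1}))        # -> t**3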
@xiaohaoyuan3658 · 4 years ago
Thanks for the great lecture. I have a small question at #6:55 about the condition that the second-order term dominates (the right-hand side of the inequality): should it be O(a^3) or O(a^2)? I noticed that in the previous lecture it seemed to be O(a^3).
@aa4cc · 4 years ago
You are perfectly right. It should be O(alpha^3).
@xiaohaoyuan3658 · 4 years ago
@aa4cc Thanks for your reply. Have a nice day!
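For context on the correction above, the expansion in question is the standard second-order Taylor formula

f(x + \alpha d) = f(x) + \alpha \, \nabla f(x)^\top d + \frac{\alpha^2}{2} \, d^\top \nabla^2 f(x) \, d + \mathcal{O}(\alpha^3),

so the remainder that the quadratic term must dominate is indeed of order \alpha^3, not \alpha^2.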
@lachlanpage7819 · 4 years ago
Why did you at first say the Hessian needed to be positive semidefinite and then in your final statement say it needs to be positive definite? Was this the difference between a necessary condition and a sufficient condition?
@aa4cc · 4 years ago
Indeed, the Hessian matrix being positive semidefinite is a necessary condition of optimality, while the stricter requirement of the Hessian being positive definite is a sufficient condition of optimality.
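As a quick numerical illustration of the difference (a sketch; the function name classify_hessian is illustrative, not from the lecture):

import numpy as np

def classify_hessian(H, tol=1e-10):
    # Eigenvalues of a symmetric Hessian at a stationary point decide the test.
    eig = np.linalg.eigvalsh(H)
    if np.all(eig > tol):
        return "positive definite -> sufficient condition holds (strict local minimum)"
    if np.all(eig >= -tol):
        return "positive semidefinite -> necessary condition holds (test inconclusive)"
    return "has a negative eigenvalue -> not a local minimum"

print(classify_hessian(np.array([[2.0, 0.0], [0.0, 1.0]])))   # definite
print(classify_hessian(np.array([[2.0, 0.0], [0.0, 0.0]])))   # only semidefinite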
@malekhammou7572 · 3 years ago
What can we conclude if the first-order condition is satisfied and the second-order condition is not?
@aa4cc · 3 years ago
Do you mean the second-order necessary condition, or the second-order sufficient one? In the former case, the point is neither a minimum nor a maximum: it is a saddle point. To explain the latter case, consider two functions, f(x) = x^3 and g(x) = x^4, both at x = 0. Both satisfy the first-order necessary condition and the second-order necessary condition at x = 0 (first derivative equal to zero and second derivative nonnegative). So far so good. But both fail to satisfy the second-order sufficient condition (second derivative strictly positive), and yet one of them is minimized at x = 0 while the other is not (just picture the graphs). To detect this, we would have to study higher-order derivatives. For f(x), the third derivative is nonzero. Recall that in Taylor's expansion, the third derivative comes with odd powers of the independent variable. That means the corresponding contribution can have either sign, and the point can be neither a minimum nor a maximum. For g(x), it is the fourth derivative that is nonzero, and the corresponding term in the Taylor expansion only contains even powers of the independent variable; hence for a positive fourth derivative the function is minimized. Hope this helps.
@malekhammou7572 · 3 years ago
@aa4cc Thank you. Now I see things better!
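The x^3 versus x^4 reasoning in the reply above can be checked symbolically; a minimal sketch in Python:

import sympy as sp

x = sp.symbols('x', real=True)

for func in (x**3, x**4):
    # Find the order of the first nonzero derivative at x = 0: an odd order
    # means neither minimum nor maximum; an even order with a positive value
    # means a local minimum.
    for n in range(1, 5):
        val = sp.diff(func, x, n).subs(x, 0)
        if val != 0:
            print(func, '-> first nonzero derivative: order', n, ', value', val)
            break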