
Newton's Method for optimization 

OptiML PSE
628 subscribers · 7K views

Material is based on the book Convex Optimization by Stephen Boyd and Lieven Vandenberghe, Chapter 9 (Unconstrained minimization).
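For readers who want to follow along, below is a minimal sketch (mine, not the channel's) of the method covered by that chapter: the damped Newton iteration with backtracking line search, as in Boyd & Vandenberghe's Algorithm 9.5. The test function and parameter values are illustrative assumptions.

```python
import numpy as np

def newton_minimize(f, grad, hess, x0, tol=1e-8, alpha=0.25, beta=0.5):
    """Damped Newton's method with backtracking line search
    (a sketch of Boyd & Vandenberghe, Algorithm 9.5).
    f, grad, hess are callables for the objective, gradient, Hessian."""
    x = np.asarray(x0, dtype=float)
    while True:
        g, H = grad(x), hess(x)
        dx = np.linalg.solve(H, -g)   # Newton step: solve H dx = -g
        lam2 = -g @ dx                # Newton decrement squared, g^T H^{-1} g
        if lam2 / 2 <= tol:           # stopping criterion from the book
            return x
        t = 1.0                       # backtracking line search (Alg. 9.2)
        while f(x + t * dx) > f(x) + alpha * t * g @ dx:
            t *= beta
        x = x + t * dx

# Illustrative strongly convex example (not from the video); minimum at (0, 0).
f    = lambda x: x[0]**2 + np.exp(x[1]) - x[1]
grad = lambda x: np.array([2 * x[0], np.exp(x[1]) - 1])
hess = lambda x: np.diag([2.0, np.exp(x[1])])
print(newton_minimize(f, grad, hess, [3.0, 2.0]))  # -> approx [0. 0.]
```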

Published: 12 Aug 2024

Comments: 5
@medad5413 · 2 years ago
I had my "aha moment" here when you multiplied the grad by the delta to calculate the directional derivative and then the resulting term resembled the second term of the multi-variate Taylor series. Thank you very much.
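A quick numerical check of the identity the comment describes: the directional derivative ∇f(x)ᵀΔ is exactly the second term of the multivariate Taylor expansion f(x + Δ) ≈ f(x) + ∇f(x)ᵀΔ + ½ΔᵀHΔ. The function below is an illustrative assumption, not the one used in the video.

```python
import numpy as np

# Illustrative convex quadratic with closed-form gradient and Hessian.
f    = lambda x: x[0]**2 + 3 * x[1]**2 + x[0] * x[1]
grad = lambda x: np.array([2 * x[0] + x[1], 6 * x[1] + x[0]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 6.0]])

x, d = np.array([1.0, -2.0]), np.array([1e-3, 2e-3])

# Taylor expansion: f(x) + grad·d (directional derivative) + 0.5 d·H·d.
taylor = f(x) + grad(x) @ d + 0.5 * d @ hess(x) @ d
print(f(x + d), taylor)  # agree (exactly here, since f is quadratic)
```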
@123XTSK · 1 year ago
A well-visualized, coherent presentation of a topic that seems easy but is genuinely difficult to grasp intuitively.
@Falconoo7383 · 3 years ago
great work...
@matthewjames7513 · 2 years ago
Thanks for the video! You mention around 5:27 that 'our Hessian will be positive definite whenever our problem is convex'. Why is this the case?
@csikjarudi · 1 year ago
A convex problem can be approximated locally by a convex quadratic function. For a twice-differentiable function, convexity is equivalent to the Hessian being positive semidefinite everywhere; under the strong-convexity assumption used in Boyd's analysis of Newton's method, the Hessian is positive definite.
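To make this concrete, here is a small check (the test functions are illustrative assumptions, not taken from the video): a Cholesky factorization of a symmetric matrix succeeds exactly when the matrix is positive definite, so it can be used to test a Hessian numerically.

```python
import numpy as np

def is_positive_definite(H):
    """Cholesky succeeds iff the symmetric matrix H is positive definite."""
    try:
        np.linalg.cholesky(H)
        return True
    except np.linalg.LinAlgError:
        return False

# Hessian of the strongly convex f(x) = x0^2 + exp(x1) - x1 (illustrative).
hess = lambda x: np.diag([2.0, np.exp(x[1])])
print(is_positive_definite(hess(np.array([0.5, -1.0]))))  # True, at any x

# Contrast: the saddle (non-convex) quadratic f(x) = x0^2 - x1^2.
print(is_positive_definite(np.diag([2.0, -2.0])))         # False
```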