
Applied Optimization - Steepest Descent 

purdueMET (63K subscribers)
62K views
Steepest descent is a simple, robust minimization algorithm for multi-variable problems. I show you how the method works and then run a sample calculation in Mathcad so you can see the intermediate results.
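For readers who want to experiment outside Mathcad, here is a minimal steepest-descent sketch in Python. The test function, tolerances, and line-search constants are my own choices for illustration, not the ones from the video:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f by stepping along the negative gradient, using a
    backtracking line search to pick the step length d each iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:                 # gradient ~ 0: (local) minimum found
            break
        u = -g / gnorm                  # unit steepest-descent direction
        d = 1.0                         # trial step length
        # backtrack until we get sufficient decrease (Armijo condition)
        while d > 1e-12 and f(x + d * u) > f(x) - 0.5 * d * gnorm:
            d *= 0.5
        x = x + d * u
    return x

# Made-up test function: a quadratic bowl with its minimum at (1, 2)
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] - 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] - 2.0)])

x_min = steepest_descent(f, grad, x0=[0.0, 0.0])
```

The inner while-loop is the 1-D search for d mentioned in the video; any one-dimensional minimizer could be substituted there.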

Published: 13 Jul 2024

Comments: 46
@TheEfties 4 years ago
He said it

@stanikaisla297 7 months ago
This madlad actually did it

@gerardo8av 4 years ago
Wow! I was never good at maths. I'm a physician, and quite old, 55 years of age, and I could understand everything. Oh, and yes: it's the COVID-19 lockdown motivation... Thank you so much. You can explain anything, I bet!

@KeesJansma7689 4 years ago
I didn't understand a thing in my textbook, but this is really clear! Thank you, sir.
@omercix 4 years ago
I love you, finally finished my assignment with your help :)

@letpieau1660 3 years ago
Love your lecturing style! Thank you so much.

@robertgawlik2674 3 years ago
I'm so glad I found this video. Thank you very much.

@venumadhavrallapalli 2 years ago
Guess a starting point, look in the search direction (the gradient vector), search along a 1-D variable, repeat. Very clear explanation, thank you.
@yihengliu34 3 years ago
Brilliant professor, thank you!

@Furzgranate666 5 years ago
This is quality education!

@beyzabutun565 3 years ago
This was very helpful. Thank you so much!

@OsmanNal 3 years ago
This was really good. Thank you!

@paolaalvarado5352 3 years ago
Thank you so much, your explanation was very clear!

@mwont 2 years ago
Amazing explanation. Thank you.

@monicabrasow5402 4 years ago
Great! You are a genius!

@robothegreatful 5 years ago
Great! Thank you!

@user-fo6jd7ts5t 4 years ago
Very clear. Thanks.

@tiborcamargo5732 5 years ago
Such a great video, congratulations.

@purdueMET 5 years ago
Thanks :-)

@elnursalmanov7054 5 years ago
Thank you very much for the video.

@123XTSK 1 year ago
Excellent!
@sanjayksau 2 years ago
Beautiful explanation. Is there any video on the conjugate direction method as well?

@krishnadas6832 2 years ago
That was beautiful. Thank you very much.

@purdueMET 2 years ago
Wow, thanks :-) When I originally made this video, I thought it might be too specialized to get many views. I'm very pleased to have been wrong.

@collinsdon-pedro1085 1 year ago
Wow!!!! Thank you.
@chinmaypatil9386 5 years ago
Great!

@forinternet9079 1 year ago
Thank you.
@a.m.4654 5 years ago
Thank you, you helped me alot :D

@briancito_mud 5 years ago
helped*
@rmttbj 4 years ago
24:21 Could you provide some guidance (a link to an example would also be fine) on how to reach d = 0.179 if we calculate the value manually? Thank you very much :)

@dailyenglishphrases461 3 years ago
If you write out the f(d) function, you'll see that it depends only on d, so you can find its minimum with any one-dimensional search algorithm, such as bisection, golden-section search, or Newton-Raphson, or it can be solved analytically. (Maybe you are no longer interested, but others could be :) )
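As a concrete illustration of the 1-D search described in that reply, here is a golden-section search in Python. The slice function phi below is made up so the example is self-contained; the actual f(d) from the video would come from substituting the search direction into the objective:

```python
import math

def golden_section_min(phi, a, b, tol=1e-8):
    """Golden-section search: shrink the bracket [a, b] around the
    minimizer of a unimodal function phi until it is narrower than tol."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/golden-ratio ~ 0.618
    c = b - inv_phi * (b - a)                # interior probe points
    d = a + inv_phi * (b - a)
    while (b - a) > tol:
        if phi(c) < phi(d):                  # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Hypothetical 1-D slice along the search direction; minimum at d = 0.179
phi = lambda d: (d - 0.179) ** 2 + 1.0
d_star = golden_section_min(phi, 0.0, 1.0)
```

(This sketch re-evaluates phi at both probes every pass; a production version would cache one evaluation per iteration.)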
@ZinzinsIA 2 years ago
Very interesting, but there are many things I don't understand. When we consider the gradient, we consider it at a particular point; when we draw the gradient vector, it is a vector in the same space and coordinate system, but originating from the point where we calculated it, not from the origin of the coordinate space, right? Also, if I understand the gradient, like the derivative, as a slope giving the direction of biggest change, I don't get the intuition for why the gradient lying on this slope is oriented toward the direction of steepest ascent. Does it have anything to do with basis orientation/direction? When we draw a slope and say it's the slope of the derivative at a particular point, that does not tell us whether it is going up or down; the rate of change could be toward the decreasing side of the slope, so why do we say the gradient always points toward steepest ascent?
@DouglasHPlumb 5 years ago
That "d" is what brought me here; so what are the methods to find it other than setting up another one-dimensional minimization problem?

@DouglasHPlumb 5 years ago
There is also finding that orthogonal vector along the line...

@velagasohith949 2 years ago
What is the basic difference between the steepest descent method and the Marquardt method?
@BSplitt 5 years ago
For these videos, can you please disable the clock in the background?

@WytseZ 5 years ago
I didn't notice it until I read your comment; now I can't even watch this video...

@9-mananshah741 5 years ago
@@WytseZ 😂😂😂😂😂😂😂😂
@judkilolo 4 years ago
What is the unit of d?
@rowdyghatkar 4 years ago
The video was great... But what year is this... Did anyone else get some 90s vibes 😀...

@brandonrobertson6586 2 years ago
"Boats Boats Boats" - Laura Bell Bundy
@Krautmaster86 4 years ago
I suggest putting a mic on your t-shirt =)
@1995a1995z 5 years ago
Lol, sorry to bring this up, but it sounded like you said the N-word at 10:53. Great tutorial though, much appreciated.

@mcveigh1983 5 years ago
lol, nearest N*

@robertlockwood5948 5 years ago
I concur with both parts of that statement.

@darrondb6786 3 years ago
Knew I'd found someone else that heard it lol