Steepest descent is a simple, robust minimization algorithm for multi-variable problems. I show you how the method works and then run a sample calculation in Mathcad so you can see the intermediate results.
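For readers who want to experiment outside Mathcad, here is a minimal Python sketch of the basic fixed-step variant of the method. The objective f(x, y) = x^2 + 3y^2, the starting point, and the step size are all illustrative assumptions, not the example worked in the video:

```python
import numpy as np

def f(p):
    # Illustrative objective: a simple quadratic bowl (an assumption,
    # not the function from the video).
    x, y = p
    return x**2 + 3*y**2

def grad_f(p):
    # Analytic gradient of f.
    x, y = p
    return np.array([2*x, 6*y])

def steepest_descent(p, step=0.1, tol=1e-6, max_iter=1000):
    # Repeatedly step opposite the gradient until the gradient is tiny.
    for i in range(max_iter):
        g = grad_f(p)
        if np.linalg.norm(g) < tol:
            break
        p = p - step * g
    return p, i

p_min, iters = steepest_descent(np.array([2.0, 1.0]))
print(p_min, iters)  # converges toward the minimum at (0, 0)
```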
Wow! I was never good at maths. I'm a physician, and quite old: 55 years of age. And I could understand everything. Oh, and yes: it's the COVID-19 lockdown motivation... Thank you so much. You can explain anything, I bet!
24:21. Could you provide some guidance (a link to an example would also be fine) as to how to reach d = 0.179 if we are to calculate the value manually? Thank you very much :)
If you write out the fd(d) function, you'll see that it depends only on d, so you can find its minimum with any one-dimensional search algorithm, such as bisection, golden-section search, or Newton-Raphson, or you can solve it analytically. (Maybe you're no longer interested, but others might be :) )
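To make that concrete, here is a small Python sketch of a golden-section line search along the steepest-descent direction. The objective f(x, y) = x^2 + 3y^2, the starting point, and the bracketing interval [0, 1] are illustrative assumptions; the video's own function and the value d = 0.179 are not reproduced here:

```python
import numpy as np

PHI = (np.sqrt(5) - 1) / 2  # golden-ratio conjugate, about 0.618

def golden_section(fd, a, b, tol=1e-6):
    # Golden-section search for the minimum of a unimodal 1-D function fd on [a, b].
    while abs(b - a) > tol:
        x1 = b - PHI * (b - a)
        x2 = a + PHI * (b - a)
        if fd(x1) < fd(x2):
            b = x2  # minimum lies in [a, x2]
        else:
            a = x1  # minimum lies in [x1, b]
    return (a + b) / 2

def f(p):
    # Illustrative objective (an assumption, not the video's function).
    x, y = p
    return x**2 + 3*y**2

# One exact line search: minimize f along the negative gradient from p.
p = np.array([2.0, 1.0])
g = np.array([2 * p[0], 6 * p[1]])   # gradient of f at p
fd = lambda d: f(p - d * g)          # f restricted to the descent ray; depends on d only
d_star = golden_section(fd, 0.0, 1.0)
print(d_star)  # analytic answer for this f and p: 52/248, about 0.2097
```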
Very interesting, but there are many things I don't understand. Often when considering the gradient, we consider it at a particular point. When we draw the gradient vector, it's a vector in the same space and coordinate system, but shouldn't it originate from the point where we calculated the gradient, rather than from the origin of the coordinate space? Also, if I understand the gradient correctly, like a derivative it's a slope giving the direction of biggest change, but I don't get the intuition for why the gradient lying on this slope is oriented toward the direction of steepest ascent. Does it have anything to do with basis orientation/direction? When we draw a slope and say it's the slope of the derivative at a particular point, that does not tell us whether it's going up or down; the rate of change could be toward the decreasing side of the slope. So why do we say the gradient always points toward steepest ascent?
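A note for readers with the same question (this is a general fact about gradients, not anything specific to the video): the sign convention falls out of the directional derivative. For a unit vector u, the rate of change of f at a point p in direction u is

D_u f(p) = \nabla f(p) \cdot u = \lVert \nabla f(p) \rVert \cos\theta,

where \theta is the angle between u and \nabla f(p). This is largest when \theta = 0, i.e. when u points along the gradient (steepest ascent), and most negative when \theta = \pi, i.e. when u points opposite it (steepest descent). And yes, the gradient vector is conventionally drawn with its tail at the point p where it was evaluated, not at the origin.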