Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Re: How to define Hessian for fminunc.
Posted:
Oct 15, 2013 8:22 AM


On 10/14/2013 12:24 PM, Céldor wrote:
> Alan_Weiss <aweiss@mathworks.com> wrote in message
> <l3gtao$mqh$1@newscl01ah.mathworks.com>...
> ...
>> When you start at a point where the gradient of the objective is
>> zero, fminunc sees that the gradient is zero and realizes that it is
>> at a local extremum (min, max, or saddle point). It stops because the
>> first-order optimality measure (the norm of the gradient) is zero. It
>> does not check second-order conditions in this case.
>>
>> I hope this helps. You can read about the fminunc algorithms here:
>> http://www.mathworks.com/help/optim/ug/unconstrained-nonlinear-optimization-algorithms.html
>>
>> Alan Weiss
>> MATLAB mathematical toolbox documentation
>
> Hi Alan Weiss,
>
> Thank you for your reply.
> That is not how it is supposed to behave. Is there any other method in
> this toolbox that checks both the first and the second derivative and
> returns the minimum even if the gradient norm is 0?
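The stopping behavior described in the quoted reply can be reproduced with a quasi-Newton solver from SciPy, used here as a stand-in for fminunc, which is MATLAB-only; the objective f and all names below are purely illustrative. Starting exactly at a saddle point, the first-order optimality measure is already zero, so the solver returns without moving; the eigenvalues of the Hessian then reveal that the point is not a minimum:

```python
import numpy as np
from scipy.optimize import minimize

# f(x, y) = x^2 + y^4 - y^2 has a saddle point at the origin:
# the gradient is zero there, but the Hessian diag(2, -2) is indefinite.
f = lambda v: v[0]**2 + v[1]**4 - v[1]**2
grad = lambda v: np.array([2.0 * v[0], 4.0 * v[1]**3 - 2.0 * v[1]])

# Start exactly at the saddle: the initial gradient norm is zero, so the
# first-order stopping test is satisfied immediately and x0 is returned.
res = minimize(f, x0=np.zeros(2), jac=grad, method="BFGS")
print(res.x)  # [0. 0.]

# Second-order check: the analytic Hessian of f at the origin.
H = np.array([[2.0, 0.0], [0.0, -2.0]])
eigs = np.linalg.eigvalsh(H)
print(eigs)   # one negative eigenvalue: a saddle, not a minimum
```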
You can try patternsearch (requires a Global Optimization Toolbox license) or fminsearch. Neither relies on gradient information, but both are slower solvers, and fminsearch is less reliable.
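fminsearch uses the derivative-free Nelder-Mead simplex method. A sketch with SciPy's implementation of the same method, on the same illustrative saddle-point objective, shows why a zero gradient at the start does not stall it: the simplex probes nearby points instead of evaluating a gradient.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective: saddle at the origin, minima at (0, ±1/sqrt(2))
# with value -1/4.
f = lambda v: v[0]**2 + v[1]**4 - v[1]**2

# Nelder-Mead uses only function values at simplex vertices, so the zero
# gradient at x0 is invisible to it and the simplex walks off the saddle.
res = minimize(f, x0=np.zeros(2), method="Nelder-Mead")
print(res.x, res.fun)
```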
It seems to me that you might want to use fminunc and restart any run that does not move from the initial point: nudge the initial point and see whether the solver returns to it or finds a new optimum. See http://www.mathworks.com/help/optim/ug/when-the-solver-succeeds.html
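The restart-and-nudge idea can be sketched as a small wrapper, again in Python with SciPy rather than MATLAB; the function names, tolerances, and noise scale are all assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective: saddle at the origin, minima at (0, ±1/sqrt(2)).
f = lambda v: v[0]**2 + v[1]**4 - v[1]**2
grad = lambda v: np.array([2.0 * v[0], 4.0 * v[1]**3 - 2.0 * v[1]])

def minimize_with_restart(x0, tries=5, noise=1e-3, seed=0):
    """Rerun the solver from a nudged start whenever it does not move."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(tries):
        res = minimize(f, x, jac=grad, method="BFGS")
        if np.linalg.norm(res.x - x) > 1e-8:  # the solver actually moved
            return res
        x = x + noise * rng.standard_normal(x.size)  # nudge and retry
    return res

# Starting at the saddle, the first run returns x0 unchanged; the nudged
# second run descends to a genuine minimum.
res = minimize_with_restart([0.0, 0.0])
print(res.x, res.fun)
```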
Alan Weiss MATLAB mathematical toolbox documentation



