On 10/14/2013 12:24 PM, Céldor wrote:
> Alan_Weiss <email@example.com> wrote in message
> <firstname.lastname@example.org>...
> ...
>> When you start at a point where the gradient of the objective is
>> zero, fminunc sees that the gradient is zero and realizes that it is
>> at a local extremum (min, max, or saddle point). It stops because the
>> first-order optimality measure is zero (the norm of the gradient). It
>> does not check second-order conditions in this case.
>>
>> I hope this helps. You can read about the fminunc algorithms here:
>> http://www.mathworks.com/help/optim/ug/unconstrained-nonlinear-optimization-algorithms.html
>>
>> Alan Weiss
>> MATLAB mathematical toolbox documentation
>
> Hi Alan Weiss,
>
> Thank you for your reply.
> That is not how it is supposed to work. Is there any other method in
> this toolbox that would check both the first and the second derivative
> and return the minimum even if the gradient norm is 0?
You can try patternsearch (requires a Global Optimization Toolbox license) or fminsearch. Neither relies on gradient information, so a zero gradient at the start point does not stop them. Both are slower solvers than fminunc, though, and fminsearch is less reliable.
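For concreteness, here is a minimal sketch (my own example objective, not from the documentation). f(x) = x(1)^4 - x(1)^2 + x(2)^2 has a saddle point at [0 0], where the gradient is exactly zero, and two true minima at [+/-1/sqrt(2), 0]:

    % Objective returning both the value and the analytic gradient,
    % so fminunc sees an exactly zero gradient at the start point.
    fun = @(x) deal(x(1)^4 - x(1)^2 + x(2)^2, [4*x(1)^3 - 2*x(1); 2*x(2)]);
    opts = optimset('GradObj','on');

    % fminunc stops at iteration 0: the first-order optimality
    % measure (norm of the gradient) is 0 at [0 0].
    [xU,fU] = fminunc(fun, [0 0], opts)

    % fminsearch (Nelder-Mead) never looks at the gradient; its
    % initial simplex perturbs the start point, so it walks off the
    % saddle and finds one of the two minima, near [+/-0.7071 0]
    % with f = -0.25.
    [xS,fS] = fminsearch(@(x) x(1)^4 - x(1)^2 + x(2)^2, [0 0])

patternsearch(@(x) x(1)^4 - x(1)^2 + x(2)^2, [0 0]) behaves similarly if you have the Global Optimization Toolbox, since it also uses no gradient information.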