On 10/12/2013 8:20 PM, Céldor wrote:
> First of all, apologies for asking a trivial question, but I am just
> learning multivariable calculus and the optimization toolbox in
> MATLAB at the same time :)
>
> I was testing my understanding of the optimization toolbox in MATLAB
> on a simple 2D function and tried to find some local minima, but for
> particular points, MATLAB's FMINUNC function returns the exact
> starting point with the following comment: "Initial point is a local
> minimum." I have provided both a gradient and a user-supplied
> Hessian. I don't know whether it is MATLAB or me. Perhaps I made a
> mistake in defining the Hessian, but I think all is correct.
>
> For random starting points, the function always gives a proper (the
> closest) minimum. However, for certain points, at exact maxima,
> fminunc returns exactly the same point with the above statement.
> Should I use a different optimization algorithm, or is it simply a
> MATLAB thing I need to accept? I thought providing the Hessian would
> sort it out. The Hessian is a negative-definite matrix at all of the
> problematic points, and that should indicate that the point is a
> maximum! I checked via eig(hessian) to make sure the Hessian was
> negative-definite, which was true. It looks like fminunc does not
> take the Hessian into account at all. What should I do to make
> fminunc work with Hessians?
> **snip**
When you start fminunc at a point where the gradient of the objective is zero, it sees that the first-order optimality measure (the norm of the gradient) is zero and stops there. A zero gradient only tells it that the point is a stationary point, which could be a minimum, a maximum, or a saddle point; fminunc does not check the second-order (Hessian) conditions in this case, so supplying a negative-definite Hessian does not change the outcome. The practical workaround is to perturb the starting point slightly so that the gradient is nonzero.
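The same behavior can be reproduced outside MATLAB. Below is a minimal sketch using SciPy's `minimize` with BFGS as a stand-in for fminunc (the test function `f` and starting points are my own illustrative choices, not from the original post): started exactly at a local maximum, the solver returns the start point immediately because the gradient is zero; started a tiny perturbation away, it descends to a true minimum.

```python
import numpy as np
from scipy.optimize import minimize

# f has a local maximum at (0, 0) and minima of value -2 at (+/-pi, +/-pi).
def f(x):
    return np.cos(x[0]) + np.cos(x[1])

def grad(x):
    return np.array([-np.sin(x[0]), -np.sin(x[1])])

# Start exactly at the maximum: the gradient there is zero, so the
# first-order optimality test is already satisfied and the solver
# stops without taking a single step, just like fminunc.
res_stuck = minimize(f, x0=np.zeros(2), jac=grad, method="BFGS")
print(res_stuck.x, res_stuck.nit)   # stays at [0, 0] after 0 iterations

# Perturb the start point slightly: the gradient is now nonzero, so the
# solver moves and converges to a genuine minimum with f = -2.
res_ok = minimize(f, x0=np.array([1e-3, 1e-3]), jac=grad, method="BFGS")
print(res_ok.x, res_ok.fun)
```

The perturbation only needs to be large enough that the gradient norm exceeds the solver's first-order tolerance (gtol in SciPy, OptimalityTolerance in MATLAB).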