Matt J


Re: Nonlinear optimization
Posted: Mar 7, 2013 5:39 PM


> > It looks like finding lambda for quasiNewtonLM would be much more
> > efficient than for true NewtonLM.
>
> Not sure I understand your statement. In the LM method, finding lambda
> is based on the same empirical rules, which work equally well regardless
> of whether a Hessian approximation or the true Hessian is chosen.
=================
I don't see how that can be. For nonconvex functions, where the Hessian need not be positive definite, an empirically chosen lambda could easily leave H+lambda*I singular, or at least not positive definite, in which case the resulting step need not be a descent direction. Since the eigenvalues of H+lambda*I are eig(H)+lambda, you would have to choose lambda > -min(eig(H)) to be sure that didn't happen, and that would require an eigenanalysis of H.
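A minimal sketch of the point above, using a small indefinite matrix made up for illustration (the specific H and lambda values are assumptions, not from the discussion):

```python
import numpy as np

# Hypothetical indefinite Hessian for illustration
H = np.array([[1.0,  0.0],
              [0.0, -3.0]])

lam_min = np.linalg.eigvalsh(H).min()   # -3.0, so H is not positive definite

# An "empirically" chosen damping that is too small: H + lambda*I stays indefinite
lam_small = 1.0
print(np.linalg.eigvalsh(H + lam_small * np.eye(2)).min())  # -2.0, still indefinite

# The eigenvalue-based bound: any lambda > -min(eig(H)) restores positive definiteness
lam_safe = -lam_min + 1e-6
print(np.linalg.eigvalsh(H + lam_safe * np.eye(2)).min() > 0)  # True
```

The safe choice requires knowing min(eig(H)), which is exactly the eigenanalysis cost that a purely empirical lambda update avoids.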

