
Topic: Non-linear optimization
Replies: 32   Last Post: Mar 8, 2013 2:22 AM

Bruno Luong

Posts: 8,797
Registered: 7/26/08
Re: Non-linear optimization
Posted: Mar 8, 2013 2:22 AM

"Matt J" wrote in message <khbsaf$6jf$1@newscl01ah.mathworks.com>...
> "Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <khb5mn$7bg$1@newscl01ah.mathworks.com>...

> One can see that the empirical lambda-tuning rules in the original LM papers should in theory be applicable to true Hessians and not require an eig(H) operation. However, it's still easy to imagine that if the algorithm lands in a non-convex region where H is not positive definite, you might have to solve
>
> (H+lambda*I)*x=-g
>
> for several lambda before a descent direction is found.


Yes, and usually lambda is adjusted automatically (by multiplying it by a constant factor) until the acceptance criteria are fulfilled. A simple check, such as a decrease of the objective function, warrants (with high probability) that lambda is large enough, even if H is not positive definite.
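
To make that concrete, here is a minimal sketch of such a damping loop on a toy objective (this is nobody's production code; the toy function, the starting lambda and the factor of 10 are arbitrary illustrative choices):

% One damped-Newton / LM-style step with automatic lambda adjustment.
% Toy non-convex objective whose true Hessian is indefinite near x1 = 0.
fun  = @(x) x(1)^4 - 2*x(1)^2 + x(2)^2;      % objective
grad = @(x) [4*x(1)^3 - 4*x(1); 2*x(2)];     % gradient
hess = @(x) [12*x(1)^2 - 4, 0; 0, 2];        % true Hessian

x      = [0.1; 1];     % start where H is not positive definite
lambda = 1e-3;         % initial damping
f0     = fun(x);
for k = 1:50
    % Solve the damped system (H + lambda*I)*dx = -g
    dx = (hess(x) + lambda*eye(2)) \ (-grad(x));
    if fun(x + dx) < f0
        x      = x + dx;       % objective decreased: accept the step
        lambda = lambda/10;    % relax the damping for the next step
        break
    else
        lambda = lambda*10;    % no decrease: increase damping and retry
    end
end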

That having been said, I don't know many people who actually use the true Hessian in non-linear optimization.

Bruno


