The Math Forum


Math Forum » Discussions » Software » comp.soft-sys.matlab


Topic: Non-linear optimization
Replies: 32   Last Post: Mar 8, 2013 2:22 AM


Matt J

Posts: 4,997
Registered: 11/28/09
Re: Non-linear optimization
Posted: Mar 6, 2013 9:05 PM

"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <kh8jcc$4qj$>...
> > As for Levenberg-Marquardt, it should still converge under this scheme, though perhaps slowly. Working with F directly would without question be better, but then you have to code it yourself. I have searched for general Levenberg-Marquardt routines on the FEX and didn't find them. :-(
> Not a generic LM, but rather than reinventing the wheel, one might save some work by adapting the specific cost function and the Jacobian calculation of this code.
> Admittedly, it takes a minimum of skill to make the Jacobian calculation correct and fast.


Bear in mind also that generic LM applied to a scalar cost F(x) doesn't use the Jacobian of F, but rather the Hessian of F, or equivalently the Jacobian of F's gradient. This not only means a possibly expensive Hessian calculation; when you solve the update equation

(Hessian(x) + lambda*eye(N)) * dx = -gradient(x),   xnew = x + dx

you also have to choose lambda so that Hessian(x) + lambda*eye(N) is positive definite. This can be difficult for non-convex F.
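To make that concrete, here is a minimal sketch (in Python/NumPy rather than MATLAB, purely for illustration) of one common way to pick lambda: grow it until a Cholesky factorization of the damped Hessian succeeds, then solve for the step. This is a standard damped-Newton heuristic, not Marquardt's original scaling, and the starting value and growth factor below are arbitrary assumptions.

```python
import numpy as np

def damped_newton_step(H, g, lam0=1e-3, grow=10.0, max_tries=50):
    """Solve (H + lam*I) dx = -g, increasing lam geometrically until
    H + lam*I is positive definite (i.e., its Cholesky factorization
    succeeds). Returns the step dx and the lambda that was used."""
    N = H.shape[0]
    lam = lam0
    for _ in range(max_tries):
        try:
            # cholesky raises LinAlgError unless H + lam*I is positive definite
            np.linalg.cholesky(H + lam * np.eye(N))
        except np.linalg.LinAlgError:
            lam *= grow
            continue
        dx = np.linalg.solve(H + lam * np.eye(N), -g)
        return dx, lam
    raise RuntimeError("could not find a positive-definite damping")

# An indefinite Hessian, as at a saddle point of a non-convex F:
# undamped Newton would step toward the saddle, but with enough
# damping the step is a guaranteed descent direction (g'*dx < 0).
H = np.array([[1.0, 0.0], [0.0, -2.0]])
g = np.array([1.0, 1.0])
dx, lam = damped_newton_step(H, g)
```

Note that as lambda grows, dx shrinks toward a small gradient-descent step, which is exactly the behavior LM relies on far from a minimum.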

These considerations make me wonder whether applying LM to F(x) directly really is preferable.



© The Math Forum at NCTM 1994-2018. All Rights Reserved.