The Math Forum

Math Forum » Discussions » Software » comp.soft-sys.matlab


Topic: Non-linear optimization
Replies: 32   Last Post: Mar 8, 2013 2:22 AM


Matt J

Posts: 4,997
Registered: 11/28/09
Re: Non-linear optimization
Posted: Mar 7, 2013 4:36 PM

"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <khavr6$imm$>...
> "Matt J" wrote in message <khaui3$efe$>...

> >
> > I'm not sure what link with quasi-Newton that you're referring to. If you're saying that
> >
> > (H+lambda*I) x=gradient
> >
> > is an LM generalization of Newton's method,

> Quasi-Newton -> Replace the Hessian by an approximation of it, usually built from first derivatives, such as the BFGS formula, or H ~ J'*J for a least-squares cost function, where J is the Jacobian of the model.

> >then yes, I'm sure Newton-LM would converge faster; however, each iteration looks costly. You would have to know the minimum eigenvalue of H in order to make (H+lambda*I) positive definite.

> Both the BFGS and J'*J approximations provide convex quadratic approximations. Therefore there is no need to bother with such details about positive definiteness.


That much I understand. Maybe I didn't understand what you meant by quasi-Newton being "quasi-efficient". It looks like finding lambda for quasi-Newton-LM would be much more efficient than for true Newton-LM.
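For concreteness, the damped step under discussion — solve (H + lambda*I) dx = -gradient with H approximated by J'*J — can be sketched on a toy least-squares problem. This is an illustrative NumPy sketch, not code from the thread; the exponential-fit model, the starting guess, the fixed lambda, and the iteration count are all assumptions made for the example.

```python
import numpy as np

# Toy zero-residual problem: fit y = exp(a*t) for the scalar parameter a.
# a_true, lam, and the iteration count are illustrative assumptions.
t = np.linspace(0.0, 1.0, 20)
a_true = 1.5
y = np.exp(a_true * t)

def residual(a):
    return np.exp(a * t) - y

def jacobian(a):
    # d(residual)/da; a single column because there is one parameter.
    return (t * np.exp(a * t)).reshape(-1, 1)

def lm_step(a, lam):
    # One damped quasi-Newton (LM-style) step with H ~ J'*J.
    # J'*J is positive semidefinite, so J'*J + lam*I is positive
    # definite for any lam > 0 -- no eigenvalue check is needed.
    r = residual(a)
    J = jacobian(a)
    H = J.T @ J
    g = J.T @ r                      # gradient of 0.5*||r||^2
    dx = np.linalg.solve(H + lam * np.eye(H.shape[0]), -g)
    return a + dx.item()

a = 1.0                              # starting guess
for _ in range(15):
    a = lm_step(a, lam=1e-3)
```

Because this is a zero-residual problem, the iteration has its fixed point at the exact solution regardless of the (small) fixed lambda; an adaptive lambda schedule, as in a full LM implementation, is omitted here for brevity.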



© The Math Forum at NCTM 1994-2018. All Rights Reserved.