Matt J
Posts: 4,997
Registered: 11/28/09
Re: Non-linear optimization
Posted: Mar 6, 2013 5:56 PM
"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <kh8g3b$p6i$1@newscl01ah.mathworks.com>... > "Matt J" wrote in message <kh8dvs$imu$1@newscl01ah.mathworks.com>... > > > > > But yes, the above is equivalent to minimizing F(x). That's what we want. Now, however, you can feed f(x) to LSQNONLIN and run its Levenberg-Marquardt routine. > > And in what sense this contortion supposes to do any good? > > The Gauss-Newton approximation of the square-root will certainly less accurate than working with F directly. Especially when the lower-bound is careless chosen. =================
It was never really about doing any "good". It was about fulfilling the OP's request. :-)
The OP wanted a way to apply Gauss-Newton to this problem and I, at least, don't see any other way. Granted, Gauss-Newton would probably do badly with a poorly chosen lower bound, but with some tuning a tighter bound might be identifiable. For example, one could re-run the optimization with increasing values of f_low until a solution with F(x) - f_low close to zero was found. The OP has also revealed that many terms in his cost function are already squares. Only the non-square terms need a lower-bound estimate, which should make a tight f_low easier to identify.
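In case it helps, here is a rough, untested sketch of that loop. Only the sqrt(F(x) - f_low) wrapping and the call to LSQNONLIN's Levenberg-Marquardt routine come from the discussion above; the toy Rosenbrock cost, the deliberately loose starting bound, the 0.9 update factor, and the tolerance are placeholders the OP would have to replace and tune:

% Toy stand-in for the OP's scalar cost F(x) (placeholder)
F = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
x0 = [-1.2; 1];      % initial guess (placeholder)
f_low = -1;          % deliberately loose lower bound on min F(x)

opts = optimset('Algorithm','levenberg-marquardt','Display','off');

for k = 1:10
    % scalar residual whose square is F(x) - f_low
    resid = @(x) sqrt(max(F(x) - f_low, 0));
    x0 = lsqnonlin(resid, x0, [], [], opts);

    gap = F(x0) - f_low;
    if gap < 1e-8           % bound is essentially tight; stop
        break
    end
    % Raise the bound toward F(x0) and re-run. In practice one must be
    % careful not to push f_low above the true minimum of F.
    f_low = f_low + 0.9*gap;
end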
As for Levenberg-Marquardt, it should still converge under this scheme, though perhaps slowly. Working with F directly would without question be better, but then you have to code it yourself. I have searched the FEX for general Levenberg-Marquardt routines and didn't find any. :-(