Matt J
Posts:
4,992
Registered:
11/28/09


Re: Nonlinear optimization
Posted:
Mar 6, 2013 5:56 PM


"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <kh8g3b$p6i$1@newscl01ah.mathworks.com>... > "Matt J" wrote in message <kh8dvs$imu$1@newscl01ah.mathworks.com>... > > > > > But yes, the above is equivalent to minimizing F(x). That's what we want. Now, however, you can feed f(x) to LSQNONLIN and run its LevenbergMarquardt routine. > > And in what sense this contortion supposes to do any good? > > The GaussNewton approximation of the squareroot will certainly less accurate than working with F directly. Especially when the lowerbound is careless chosen. =================
It was never really about doing any "good". It was about fulfilling the OP's request. :)
The OP wanted a way to apply Gauss-Newton to this problem and I, at least, don't see any other way. Granted, Gauss-Newton would probably do badly with a poorly chosen lower bound, but with some tuning, a tighter bound might be identifiable. For example, one could rerun the optimization with increasing f_low until a solution with F(x) - f_low close to zero was found. The OP has also revealed that many terms in his cost function are already squares. It's only the non-square terms that require a lower-bound estimate, so that would probably help to make a tight f_low identifiable.
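Just to make the scheme concrete, here's a minimal sketch in Python using SciPy's least_squares with method="lm" (a Levenberg-Marquardt routine, standing in for LSQNONLIN). The cost F, the starting point, the initial f_low, and the bound-update rule are all illustrative assumptions on my part, not from the original problem:

```python
import numpy as np
from scipy.optimize import least_squares

def F(x):
    # Illustrative scalar cost; its true minimum is 1 at x = 3.
    return (x[0] - 3.0)**4 + 1.0

def solve_with_bound(f_low, x0):
    # Residual r(x) = sqrt(F(x) - f_low). Since sum(r^2) = F(x) - f_low,
    # minimizing the squared residual is equivalent to minimizing F(x),
    # provided f_low is a valid lower bound on F. The max() guards
    # against the sqrt of a negative number if f_low overshoots.
    r = lambda x: np.array([np.sqrt(max(F(x) - f_low, 0.0))])
    return least_squares(r, x0, method="lm")

# Rerun with increasing f_low until F(x*) - f_low is close to zero,
# as suggested above. The 0.5 step factor is an arbitrary choice.
x0 = np.array([0.0])
f_low = -5.0
for _ in range(30):
    sol = solve_with_bound(f_low, x0)
    gap = F(sol.x) - f_low
    if gap < 1e-6:
        break
    f_low += 0.5 * gap   # raise the bound toward the attained value
    x0 = sol.x
```

Note that a loose f_low (here, -5 against a true minimum of 1) flattens the residual and slows Levenberg-Marquardt down, which is exactly the accuracy concern raised above; tightening the bound between reruns mitigates it.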
As for Levenberg-Marquardt, it should still converge under this scheme, though perhaps slowly. Working with F directly would without question be better, but then you have to code it yourself. I have searched for general Levenberg-Marquardt routines on the FEX and didn't find any. :(

