


Math Forum » Discussions » Software » comp.soft-sys.matlab

Topic: optimization with 6 variables, fminsearch, fminunc
Replies: 2   Last Post: May 3, 2013 2:07 PM

Alan Weiss

Posts: 1,262
Registered: 11/27/08
Re: optimization with 6 variables, fminsearch, fminunc
Posted: May 3, 2013 8:59 AM

On 5/2/2013 11:29 PM, runcyclexcski wrote:
> Hi all,
>
> I am fitting large numbers (millions) of 5x5 matrices of int16 to 2-D gaussians (6 variables). A typical matrix contains a higher-intensity value at (3,3), with intensity decaying towards the periphery (it's a CCD image of a delta function, i.e. a diffraction-limited spot).
>
> I have great results with fminsearch, it consistently converges to a solution within ~300 iterations. The problem is that it takes too long - about 0.02 s per matrix - which scales up quickly with millions of matrices. I would like to speed this up at least 10 fold (with a method other than running 10 cores at once).
>
> I tried to run fminunc with the exact same parameters as fminsearch, and I am getting the same (slow) performance. It converges to the same result in 40 iterations, but overall takes the same amount of time per matrix as the 300 of the simplex, i.e. each iteration takes 10x longer than the simplex.
>
> Would running mmx precompiled code help? Would predefining derivatives help?
>
> I define my error function as a square of differences:
>
> C = (X(i)-x0)^2/(2*sx^2);
> D = (Y(k)-y0)^2/(2*sy^2);
> error = error+(I(i,k)-b-A*exp(-1*(C + D)))^2;
>
> Thank you in advance!


lsqnonlin or lsqcurvefit would almost certainly be faster. You could
also impose bounds (which might speed things up a bit), and get a good
initial guess from the mean or mode of your previously fitted values.
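A minimal sketch of what that could look like with lsqcurvefit, assuming a parameter vector p = [A, x0, y0, sx, sy, b] matching the model in your error function (the starting point and bounds below are illustrative guesses, not tuned values):

```matlab
% Pixel coordinates of the 5x5 patch; I is the int16 image patch
[X, Y] = meshgrid(1:5, 1:5);
xdata = cat(3, X, Y);                   % pack both grids into one array

% 2-D Gaussian model matching the residual in the original post:
% I(i,k) ~ b + A*exp(-(X-x0)^2/(2*sx^2) - (Y-y0)^2/(2*sy^2))
gauss2d = @(p, xd) p(6) + p(1) * exp( ...
    -(xd(:,:,1) - p(2)).^2 ./ (2*p(4)^2) ...
    -(xd(:,:,2) - p(3)).^2 ./ (2*p(5)^2));

Id = double(I);                          % lsqcurvefit needs double, not int16
p0 = [max(Id(:))-min(Id(:)), 3, 3, 1, 1, min(Id(:))];  % guess from the data
lb = [0,   1, 1, 0.1, 0.1, -Inf];        % optional bounds on p
ub = [Inf, 5, 5, 5,   5,    Inf];

p = lsqcurvefit(gauss2d, p0, xdata, Id, lb, ub);
```

For your batch of millions of patches, the previous fit's p would likely be a better p0 than the data-driven guess above.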

If you go to the trouble of calculating the Jacobian, it might be even
faster. If you have Symbolic Math Toolbox, you can do this calculation
automatically. See
http://www.mathworks.com/help/optim/ug/symbolic-math-toolbox-calculates-gradients-and-hessians.html
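For a 2-D Gaussian the derivatives are simple enough to write by hand, too. Here is a hedged sketch of a residual function returning both the residual vector and its Jacobian (the function name is made up for illustration; check the signs and the option name against your release's documentation):

```matlab
function [F, J] = gaussResid(p, X, Y, I)
% Residuals and Jacobian for a 2-D Gaussian, p = [A, x0, y0, sx, sy, b].
% lsqnonlin squares and sums the residuals itself.
Ex = exp(-(X - p(2)).^2 ./ (2*p(4)^2));
Ey = exp(-(Y - p(3)).^2 ./ (2*p(5)^2));
G  = Ex .* Ey;                           % unscaled Gaussian shape
F  = p(6) + p(1)*G - double(I);
F  = F(:);
if nargout > 1                           % Jacobian requested by the solver
    dA  = G(:);
    dx0 = p(1) * G(:) .* (X(:) - p(2)) / p(4)^2;
    dy0 = p(1) * G(:) .* (Y(:) - p(3)) / p(5)^2;
    dsx = p(1) * G(:) .* (X(:) - p(2)).^2 / p(4)^3;
    dsy = p(1) * G(:) .* (Y(:) - p(3)).^2 / p(5)^3;
    db  = ones(numel(G), 1);
    J   = [dA, dx0, dy0, dsx, dsy, db];  % one column per parameter
end
end
```

You would then tell the solver to use it, e.g.

```matlab
opts = optimset('Jacobian', 'on');
p = lsqnonlin(@(p) gaussResid(p, X, Y, I), p0, lb, ub, opts);
```

With an analytic Jacobian the solver skips the finite-difference evaluations of the objective, which is usually where most of the per-iteration time goes for a cheap model like this.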

Good luck,

Alan Weiss
MATLAB mathematical toolbox documentation





© Drexel University 1994-2014. All Rights Reserved.
The Math Forum is a research and educational enterprise of the Drexel University School of Education.