Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Re: optimization with 6 variables, fminsearch, fminunc
Posted: May 3, 2013 8:59 AM


On 5/2/2013 11:29 PM, runcyclexcski wrote:
> Hi all,
>
> I am fitting large numbers (millions) of 5x5 matrices of int16 to 2D
> Gaussians (6 variables). A typical matrix contains a higher-intensity
> value at (3,3), with intensity decaying towards the periphery (it's a
> CCD image of a delta function, i.e. a diffraction-limited spot).
>
> I have great results with fminsearch; it consistently converges to a
> solution within ~300 iterations. The problem is that it takes too long
> (about 0.02 s per matrix), which scales up quickly with millions of
> matrices. I would like to speed this up at least 10-fold (with a
> method other than running 10 cores at once).
>
> I tried to run fminunc with the exact same parameters as fminsearch,
> and I am getting the same (slow) performance. It converges to the same
> result in 40 iterations, but overall takes the same amount of time per
> matrix as the 300 iterations of the simplex, i.e. each iteration takes
> 10x longer than the simplex.
>
> Would running mmx precompiled code help? Would predefining derivatives
> help?
>
> I define my error function as a sum of squared differences:
>
> C = (X(i)-x0)^2/(2*sx^2);
> D = (Y(k)-y0)^2/(2*sy^2);
> error = error + (I(i,k)-b-A*exp(-1*(C+D)))^2;
>
> Thank you in advance!
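As a side note, the quoted objective accumulates the squared residuals in a double loop over pixels; the same sum of squares can be computed vectorized over the whole 5x5 grid, which is typically much faster per evaluation in MATLAB. A sketch (the parameter names A, b, x0, y0, sx, sy come from the quoted residual; the ordering of p is an assumption):

```matlab
% Vectorized sum-of-squares objective for one 5x5 image I.
% p = [A, x0, sx, y0, sy, b] is an assumed parameter ordering.
[Xg, Yg] = meshgrid(1:5, 1:5);   % pixel coordinates of the 5x5 frame
sse = @(p, I) sum(sum(( double(I) - ( p(6) + p(1) * exp( ...
        -(Xg - p(2)).^2 / (2*p(3)^2) ...
        -(Yg - p(4)).^2 / (2*p(5)^2)) ) ).^2));
% usable directly, e.g.: pFit = fminsearch(@(p) sse(p, I), p0);
```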
lsqnonlin or lsqcurvefit would almost certainly be faster. Also, you could impose bounds (this might speed things up a bit), and get a good initial guess from the mean or mode of your previously fitted values.
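A hypothetical sketch of the lsqcurvefit route for one 5x5 int16 image I (the parameter ordering p = [A, x0, sx, y0, sx, b] and the bound values are assumptions to be tuned to your data):

```matlab
% Fit one 5x5 image I with lsqcurvefit; p = [A, x0, sx, y0, sy, b]
% (assumed ordering), warm-started from the previous fit pPrev.
[Xg, Yg] = meshgrid(1:5, 1:5);
model = @(p, xdata) p(6) + p(1) * exp( ...
    -(Xg - p(2)).^2 / (2*p(3)^2) - (Yg - p(4)).^2 / (2*p(5)^2));
lb = [0,   1, 0.1, 1, 0.1, 0  ];  % peak inside frame, positive widths
ub = [Inf, 5, 5,   5, 5,   Inf];  % example bounds only
opts = optimset('Display', 'off');
p = lsqcurvefit(model, pPrev, [], double(I), lb, ub, opts);
```

The warm start matters here: with millions of near-identical spots, seeding each fit from the previous solution (or a running mean of solutions) usually cuts the iteration count substantially.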
If you go to the trouble of calculating the Jacobian, it might be even faster. If you have Symbolic Math Toolbox, you can do this calculation automatically. See http://www.mathworks.com/help/optim/ug/symbolic-math-toolbox-calculates-gradients-and-hessians.html
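For this model the Jacobian is also easy to write by hand, since G = b + A*exp(-(x-x0)^2/(2*sx^2) - (y-y0)^2/(2*sy^2)) has closed-form partial derivatives. A sketch (same assumed ordering p = [A, x0, sx, y0, sy, b]); it can be supplied to lsqcurvefit by turning on the Jacobian option, e.g. optimset('Jacobian','on'):

```matlab
function [G, J] = gaussAndJac(p, ~)
% Model values and analytic Jacobian for the 2D Gaussian + baseline.
[Xg, Yg] = meshgrid(1:5, 1:5);
A = p(1); x0 = p(2); sx = p(3); y0 = p(4); sy = p(5); b = p(6);
E = exp(-(Xg - x0).^2/(2*sx^2) - (Yg - y0).^2/(2*sy^2));
G = b + A*E;                                 % model, 5x5
% Jacobian: one row per pixel (column-major), one column per parameter
J = [ E(:), ...                              % dG/dA
      A*E(:).*(Xg(:) - x0)/sx^2, ...         % dG/dx0
      A*E(:).*(Xg(:) - x0).^2/sx^3, ...      % dG/dsx
      A*E(:).*(Yg(:) - y0)/sy^2, ...         % dG/dy0
      A*E(:).*(Yg(:) - y0).^2/sy^3, ...      % dG/dsy
      ones(numel(E), 1) ];                   % dG/db
end
```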
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation



