spasmous <email@example.com> writes:

>For a general matrix (rectangular, non-symmetric), is there an iterative method for solving J*x=b that only uses multiplies J*v and no transpose multiplies J'*u? I know there are many methods based on J'*J.
>
>The problem I'm facing is a nonlinear optimization over thousands of variables with hundreds of thousands of measurements. It is well behaved, but each step requires the Jacobian, which can barely be held in memory. Plus, it seems wasteful to calculate the entire Jacobian (thousands of function evaluations) when probably only a few principal directions are needed on each step of the optimization.
>
>The Jacobian is formed numerically by adding a small "h" to each element in turn, so it costs thousands of evaluations. A cheaper alternative I have tried is to approximate J using a few (small-norm) random vectors Y=[y1 y2 y3 ...] such that Q = J*Y. Then solve Q*(pinv(Y)*x)=b to recover something resembling x.
>
>This kind of works, but I hate to use a kludge if there is something better available. It seems to me that I would need an approximate way to solve J*x=b with only multiplications.
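For reference, the random-sketch approach described in the quoted post can be written in a few lines of NumPy. This is a minimal sketch: the sizes and the dense stand-in for J are illustrative; in the real setting each column of Q would come from one Jacobian-vector product (one directional finite difference), so only k function evaluations are needed instead of n.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 200, 50, 10           # measurements, variables, sketch size (illustrative)
J = rng.standard_normal((m, n)) # dense stand-in; in practice J is matvec-only
b = rng.standard_normal(m)

# Probe the Jacobian along k random directions: Q = J*Y, no J' needed.
Y = rng.standard_normal((n, k))
Q = J @ Y                       # k Jacobian-vector products

# Solve the small least-squares problem Q*c = b, then lift back: x = Y*c,
# which is the "Q*(pinv(Y)*x) = b" trick with c playing the role of pinv(Y)*x.
c, *_ = np.linalg.lstsq(Q, b, rcond=None)
x_approx = Y @ c
```

The recovered x_approx lives in the span of the random directions Y, which is why it only "resembles" the true least-squares solution.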
Since in your case the system will be incompatible and you obviously want the least-squares solution, there is no hope for this: J' will necessarily come into play somehow, and indeed the two standard solvers for your problem, LSQR and SVDPACK, both need multiplication with J'. But what if you switched to the max-norm? Then you could use LP technology and J' wouldn't appear.

HTH,
Peter
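To illustrate the point about LSQR: SciPy's implementation accepts a matrix-free operator, but that operator must supply both J*v and J'*u, so a transpose-free least-squares solve is not on offer (a minimal sketch; the dense J here stands in for the real matvec-only Jacobian).

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(1)
m, n = 200, 50
J = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# LSQR never forms J explicitly, but it requires BOTH products:
A = LinearOperator((m, n),
                   matvec=lambda v: J @ v,     # J*v
                   rmatvec=lambda u: J.T @ u)  # J'*u -- unavoidable for LSQR

x = lsqr(A, b)[0]   # least-squares solution of J*x = b
```

Omitting rmatvec makes lsqr fail at the first transpose product, which is exactly the obstruction described above.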