For a general matrix (rectangular, non-symmetric), is there an iterative method for solving J*x = b that uses only products J*v and no transpose products J'*u? I know there are many methods based on J'*J.
The problem I'm facing is a nonlinear optimization over thousands of variables with hundreds of thousands of measurements. It is well behaved, but each step requires the Jacobian, which can barely be held in memory. It also seems wasteful to compute the entire Jacobian (thousands of function evaluations) when only a few principal directions are probably needed at each step of the optimization.
The Jacobian is formed numerically by adding a small "h" to each element in turn, so it costs thousands of function evaluations. A cheaper alternative I have tried is to approximate J using a few (small-norm) random vectors Y = [y1 y2 y3 ...], computing Q = J*Y. Then I solve Q*(pinv(Y)*x) = b to recover something resembling x.
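To make the idea concrete, here is a minimal sketch of the approach described above, in NumPy rather than MATLAB. The toy residual function `F`, the fixed matrix `A`, and the helper names `jvp` and `sketch_solve` are all illustrative stand-ins, not part of any library: `jvp` forms a directional product J*v with a single extra function evaluation, and `sketch_solve` restricts the solution to the span of a few random directions and solves the resulting small least-squares problem (via `lstsq` rather than an explicit `pinv`, which is numerically equivalent here).

```python
import numpy as np

# Toy residual F: R^5 -> R^8, standing in for the real model.
# A is a fixed random matrix; the quadratic term makes F mildly nonlinear.
A = np.random.default_rng(42).standard_normal((8, 5))

def F(x):
    y = A @ x
    return y + 0.01 * y**2

def jvp(F, x, v, h=1e-6):
    """Directional derivative J(x) @ v by finite differences:
    one extra function evaluation instead of a full Jacobian."""
    return (F(x + h * v) - F(x)) / h

def sketch_solve(F, x, b, k=5, rng=np.random.default_rng(0)):
    """Approximate solution of J(x) dx = b using only products J*y:
    draw k random directions Y, form Q = J*Y one matvec at a time,
    solve the small least-squares problem Q c ~= b, return dx = Y c."""
    n = x.size
    Y = rng.standard_normal((n, k))
    Q = np.column_stack([jvp(F, x, Y[:, j]) for j in range(k)])
    c, *_ = np.linalg.lstsq(Q, b, rcond=None)
    return Y @ c

x0 = np.zeros(5)
b = F(np.full(5, 0.1)) - F(x0)   # a nearly consistent right-hand side
dx = sketch_solve(F, x0, b, k=5)
print(np.linalg.norm(jvp(F, x0, dx) - b))  # linearized residual
```

With k equal to the number of variables the random directions span the whole space and the solve is essentially exact for the linearized system; with k much smaller than n it only captures the components of dx lying in the random subspace, which is the source of the approximation error.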
This sort of works, but I hate to use a kludge if something better is available. It seems that what I need is an approximate way to solve J*x = b using only multiplications by J.