
Topic: Pseudoinverse in MATLAB
Replies: 12   Last Post: Mar 19, 2013 12:55 AM

Bruno Luong

Posts: 8,813
Registered: 7/26/08
Re: Pseudoinverse in MATLAB
Posted: Mar 19, 2013 12:55 AM

"Daniel Vasilaky" wrote in message <ki7d5a$r3s$1@newscl01ah.mathworks.com>...
> "Greg Heath" <heath@alumni.brown.edu> wrote in message <ki1ogv$l9g$1@newscl01ah.mathworks.com>...

> Thank you all for your comments!
> Is my algorithm a continuous function of the data for a fixed perturbation parameter?
> In theory, yes; in practice one has to deal with round-off error.
> Here is what the algorithm can do:
> Let A be an m-by-n matrix with m >= n (for m = n, A is ill-conditioned; for m > n, A'A is ill-conditioned).
> Consider the equation Ax = b, let x_0 be some initial prior solution, and let c > 0 be a perturbation parameter in ||Ax - b|| + c||x - x_0||.


This can be solved with backslash (taking the penalty as ||Ax - b||^2 + c*||x - x_0||^2):

[A; sqrt(c)*eye(size(A,2))] \ [b; sqrt(c)*x0]

This penalization is well known; among the fields that use it are climate and weather forecasting.
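
As a quick check (a minimal sketch; the random test data and the squared-norm form of the penalty are assumptions of mine, not from the original post), the backslash solve agrees with the closed-form regularized normal equations:

% Sketch: min ||A*x - b||^2 + c*||x - x0||^2 via an augmented backslash solve
m = 50; n = 10;
A  = randn(m, n);           % illustrative data only
b  = randn(m, 1);
x0 = randn(n, 1);           % prior solution
c  = 1e-2;                  % perturbation (regularization) parameter

x_bs = [A; sqrt(c)*eye(n)] \ [b; sqrt(c)*x0];     % augmented least squares
x_ne = (A'*A + c*eye(n)) \ (A'*b + c*x0);         % regularized normal equations
norm(x_bs - x_ne)                                 % should be near machine precision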

> The extended algorithm produces a sequence x_0, x_1, x_2, ... which converges exponentially fast to the solution of
> Ax = b for m = n, and for m > n to the solution of A'Ax = A'b (the normal equations), independent of the chosen initial values x_0 and c > 0.
> Simple Tikhonov is the special case in which x_0 is set to the zero vector, but zero is usually not the best choice in most applications.
> Of course a well-chosen x_0 and c > 0 will reduce the number of successive approximations needed, but convergence is always guaranteed. Each successive approximation takes mn flops. A reasonable stopping rule for the successive approximations is: "STOP as soon as the relative residual error is greater than the prior residual error," i.e., stop if
> ||Ax_i - b||/||b|| > ||Ax_(i-1) - b||/||b||.
> This basically says STOP if round-off error starts to make things worse. This may be overkill, because experiments show that one can stop much sooner.
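
The thread does not spell out the recurrence itself, so the sketch below substitutes the classical iterated Tikhonov update, which has the same fixed point (the normal equations A'Ax = A'b) and also converges geometrically; it is meant only to illustrate the quoted stopping rule, and it reuses A, b, x0, c, n from the sketch above.

% Stand-in iteration: iterated Tikhonov refinement
%   x_{k+1} = x_k + (A'*A + c*I) \ (A'*(b - A*x_k))
% stopped when the relative residual stops decreasing.
M = A'*A + c*eye(n);                 % could be factorized once and reused
x = x0;
relres_prev = Inf;
for k = 1:100
    x_try  = x + M \ (A'*(b - A*x));
    relres = norm(A*x_try - b) / norm(b);
    if relres > relres_prev          % round-off is starting to make things worse
        break
    end
    x = x_try;
    relres_prev = relres;
end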


Using conjugate gradient, it would probably converge even faster.
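
For instance (again a sketch, with the same assumptions and variables as above), MATLAB's LSQR, a conjugate-gradient-type Krylov method for least squares, can be applied directly to the augmented system:

Aaug = [A; sqrt(c)*eye(n)];
baug = [b; sqrt(c)*x0];
x_cg = lsqr(Aaug, baug, 1e-10, 200);   % CG-type iterative solve of the penalized problem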

>
> Loosely speaking, the proof and derivation of the above result reduce Richard Bellman's functional recurrence equation to what appears to be a recurrence involving approximate Householder-type projections, i.e.,
> (I - vv'/(c + v'v)), tempered by the parameter c > 0. The recurrence relation turns out to be a generalization of the Gram-Schmidt process, similar to but not the same as Householder orthogonalization. This in turn produces an approximate QR factorization of A which solves the extended perturbed problem. The orthogonalization is done once and takes approximately 2mn^2 flops.


This is again well-known stuff.
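
For reference, the quoted factor is a damped version of the rank-one projector that appears in Gram-Schmidt; a small numerical check (illustrative values only, reusing n and c from above):

v  = randn(n, 1);
Pc = eye(n) - (v*v')/(c + v'*v);   % tempered factor from the quoted recurrence
P0 = eye(n) - (v*v')/(v'*v);       % ordinary projector onto the complement of v
norm(Pc*v)                         % small but nonzero: c*norm(v)/(c + v'*v)
norm(P0*v)                         % zero up to round-off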

> It is possible to speed things up with Björck's modified Gram-Schmidt (MGS) trick.
> The extended algorithm works well in my anomaly-detection and anatomical-image-part-reconstruction algorithms and has been accepted for publication in a biomedical engineering journal. However, the extended algorithm is only referenced there, not discussed. The derivation of the algorithm is quite tricky and the linear algebra is laborious, but the convergence proof is not difficult. Not being from the math community, I don't know which journal to submit it to that might accept it. Any suggestions would be greatly appreciated.
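
For context, a bare-bones textbook modified Gram-Schmidt QR looks like this (the standard MGS, not the poster's tempered variant); MGS orthogonalizes the remaining columns against each new column one at a time, which gives it better round-off behavior than classical Gram-Schmidt:

% Textbook modified Gram-Schmidt QR of A (m >= n), for context only
[mA, nA] = size(A);
Q = A;  R = zeros(nA);
for j = 1:nA
    R(j,j) = norm(Q(:,j));
    Q(:,j) = Q(:,j) / R(j,j);
    for k = j+1:nA
        R(j,k) = Q(:,j)' * Q(:,k);
        Q(:,k) = Q(:,k) - Q(:,j) * R(j,k);
    end
end
x_qr = R \ (Q'*b);   % least-squares solution of min ||A*x - b|| (in exact arithmetic)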


Please make sure there is something really novel in your algorithm before working on the publication.

Bruno



