The Math Forum


Math Forum » Discussions » Software » comp.soft-sys.matlab


Topic: Solving sparse Ax=b without certain eigenvector spaces
Replies: 11   Last Post: Aug 8, 2014 3:25 AM

John D'Errico

Re: Solving sparse Ax=b without certain eigenvector spaces
Posted: Aug 3, 2014 3:05 PM

"Bruno Luong" <b.luong@fogale.findmycountry> wrote in message <lrkmrd$roo$>...
> "Hui " <> wrote in message <lrjnpr$klo$>...
> > Hi, I'm trying to solve a sparse Ax=b problem. The size of A is about 10^4 by 10^4. A is pretty singular, and its first few eigenvalues are about 10^(-12), 10^(-6) and 10^(-4). I'm able to find the first few eigenvalues and eigenvectors using the command eigs, which takes only one or two seconds. Now I want to find x such that Ax=b, but I want to exclude the first three eigenvectors.
> >

> Let's assume the eigenvectors form the matrix E of size n x 3, where n is the size of A.
> Then force the solution to be orthogonal to E as follows:
> x = [ A ; E' ] \ [ b ; zeros(size(E,2),1) ]
> You might want to rescale the columns of E to make the augmented system well conditioned.
> % Bruno

Bruno, as I am sure you understand, that does not FORCE x
to be orthogonal to those vectors, but merely biases it towards
orthogonality. Essentially, you have appended a few new
rows to the matrix, which penalize the sum of squares to
the extent that the solution is not orthogonal.
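To see the soft-constraint behavior, here is a small runnable sketch of Bruno's augmented system. The n=100 sprandsym test matrix, the 'smallestabs' option, and the weight w are illustrative assumptions, not from the thread; in the actual problem A is the 10^4 x 10^4 sparse matrix and E the n x 3 block from eigs.

```matlab
% Small dense-ish stand-in for the poster's sparse problem (assumption).
n = 100;
A = sprandsym(n, 0.1);               % random sparse symmetric test matrix
b = randn(n, 1);
[E, ~] = eigs(A, 3, 'smallestabs');  % eigenvectors to exclude (n x 3)

% Append E' as extra rows; the zeros on the right-hand side penalize
% (but do not force) any component of x along the columns of E.
w = norm(A, 1);                      % rescaling weight (assumption)
x = [A; w*E'] \ [b; zeros(3, 1)];

% E'*x is small but generally not exactly zero: a soft constraint.
disp(norm(E'*x))
```

A larger weight w pushes E'*x closer to zero, but it never becomes exactly zero in finite precision.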

If the goal is to kill off any component of the solution
that lies in the subspace spanned by those eigenvectors,
then there are two solutions that I see.

1. Deflate the matrix A by removing those terms from the
eigen-decomposition of A. Thus, for symmetric A, we can write A in the form

A = E1*d1*E1' + E2*d2*E2' + E3*d3*E3' + E4*d4*E4' + ...

where E_i and d_i are the eigenvectors and eigenvalues of A.

So we can create a new matrix, Ahat, that has no component
in those subspaces.

Ahat = A - (E1*d1*E1' + E2*d2*E2' + E3*d3*E3')

And of course, if E is an Nx3 matrix of eigenvectors and d a
vector of the corresponding eigenvalues, we can do the
deflation as

Ahat = A - E*diag(d)*E';

Now solve the system using the deflated matrix:

x = Ahat\b;

Since Ahat is now exactly singular in those directions, backslash
may complain about rank deficiency. A pinv solve returns the
minimum-norm solution instead:

x = pinv(Ahat)*b;
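The deflation approach can be sketched end to end as follows. The small sprandsym test matrix is an assumption for illustration; note also that subtracting E*diag(d)*E' destroys sparsity, so for the real 10^4 x 10^4 case one might prefer to apply Ahat as a function handle rather than form it explicitly.

```matlab
% Deflation sketch on a small symmetric test matrix (assumption).
n = 100;
A = sprandsym(n, 0.1);
b = randn(n, 1);

[E, D] = eigs(A, 3, 'smallestabs');  % smallest eigenpairs of A
d = diag(D);

% Remove the three eigen-components; the result is dense.
Ahat = A - E*diag(d)*E';

% Ahat is exactly singular in span(E), so use the pseudoinverse,
% which returns the minimum-norm least-squares solution.
x = pinv(full(Ahat)) * b;
```

For the full-size problem, pinv of a 10^4 x 10^4 dense matrix would be expensive; an iterative solver applied to the deflated operator would scale better.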

2. The alternative solution is to subtract off any component
of the solution in the subspace spanned by those vectors.
So, assuming that the columns of E are orthonormal (as the
eigenvectors of a symmetric A will be), something like this
should work:

x = A\b;
x = x - E*(E'*x);
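The projection approach above can be checked with the same small test setup (the sprandsym matrix is again an illustrative assumption):

```matlab
% Projection sketch: solve, then project out span(E).
% Assumes columns of E are orthonormal, which holds for the
% eigenvectors of a symmetric A as returned by eigs.
n = 100;
A = sprandsym(n, 0.1);
b = randn(n, 1);
[E, ~] = eigs(A, 3, 'smallestabs');

x = A \ b;
x = x - E*(E'*x);   % E*(E'*x) is the component of x in span(E)

% After projection, E'*x is zero to rounding error.
disp(norm(E'*x))
```

Note the order of operations: E'*x is a 3 x 1 vector of coefficients, and E*(E'*x) maps it back into the n-dimensional space.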

