
Topic: Nx2N lapped orthogonal transform
Replies: 13   Last Post: Sep 12, 2013 7:38 PM

Leon Aigret

Posts: 31
Registered: 12/2/12
Re: Nx2N lapped orthogonal transform
Posted: Sep 12, 2013 7:38 PM

On Thu, 05 Sep 2013 14:05:56 +0100, Robin Chapman
<R.J.Chapman@ex.ac.uk> wrote:

>On 05/09/2013 12:00, James Dow Allen wrote:
>> Leon Aigret <aigret.not@myrealbox.invalid> might have writ, in
>> news:vsmc29lmof1ah5npckgve85u0rp2sjuh0o@4ax.com:
>>
>>> The AB^t = 0 condition translates to the requirement that MS^t is both
>>> its own transpose and its own inverse, with drastic consequences for
>>> its eigenvectors and eigenvalues.
>>
>> This is the essential point, and leads directly to the unique parametric
>> form I sought. Do such matrixes, which are both symmetric and orthogonal,
>> arise often?
>
>They correspond to vector subspaces of R^n. For such a subspace V
>it acts as +1 on V and -1 on the orthogonal complement of V.


So let T1 and T2 be the matrices that, in the usual way for linear
transformations, represent the orthogonal projections of R^N onto the
eigenspaces of MS^t for the eigenvalues 1 and -1 respectively. Then
T1 + T2 = 1 and
T1 - T2 = MS^t, so M = (T1 - T2) S. From M = A + B and S = A - B it
then follows that A = 1/2 (M + S) = 1/2 (T1 - T2 + 1) S = T1 S and
similarly B = -T2 S. Ignoring the last minus sign for aesthetic
reasons would also work because (A, B) is a valid combination iff
(A, -B) is.
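
(For anyone who wants to see this numerically: a minimal NumPy sketch,
where the random S and the random symmetric orthogonal stand-in for
MS^t are placeholders of my own choosing, not part of the problem.)

import numpy as np

rng = np.random.default_rng(0)
N = 6

# A random orthogonal S, and a random symmetric orthogonal matrix
# Q diag(+-1) Q^t playing the role of MS^t.
S, _ = np.linalg.qr(rng.standard_normal((N, N)))
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
MSt = Q @ np.diag(rng.choice([-1.0, 1.0], size=N)) @ Q.T
M = MSt @ S

# T1 + T2 = 1 and T1 - T2 = MS^t give the two eigenprojections.
I = np.eye(N)
T1, T2 = (I + MSt) / 2, (I - MSt) / 2

A, B = T1 @ S, -T2 @ S
print(np.allclose(A + B, M), np.allclose(A - B, S))  # M = A + B, S = A - B
print(np.allclose(A @ B.T, np.zeros((N, N))))        # A B^t = 0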

Presumably, this is the easy answer. A "very easy" answer could now
use the same idea but skip the exploration part:

First step would still be the conversion from verbal to formal:
Multiplying the transpose of the big matrix with the matrix itself
results in a matrix with diagonal components A^t A + B^t B and
off-diagonal components A^t B and B^t A, so the conditions for
orthogonality are that A^t A + B^t B = 1 (the identity matrix) and that
A^t B = 0 (which is equivalent to B^t A = 0).
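
(As a concrete check of that block computation, here is a NumPy
sketch; the finite block-circulant matrix is my own stand-in for the
lapped transform's big matrix, not something fixed by the problem.)

import numpy as np

rng = np.random.default_rng(1)
N, K = 3, 4  # block size N, number of block rows K (K >= 3)

A = rng.standard_normal((N, N))
B = rng.standard_normal((N, N))

# Block row i of the big matrix holds A in block column i and B in
# block column i+1 (mod K).
H = np.zeros((K * N, K * N))
for i in range(K):
    j = (i + 1) % K
    H[i*N:(i+1)*N, i*N:(i+1)*N] = A
    H[i*N:(i+1)*N, j*N:(j+1)*N] = B

G = H.T @ H
print(np.allclose(G[:N, :N], A.T @ A + B.T @ B))  # diagonal block
print(np.allclose(G[:N, N:2*N], A.T @ B))         # off-diagonal block
print(np.allclose(G[N:2*N, :N], B.T @ A))         # its transpose partner
print(np.allclose(G[:N, 2*N:3*N], 0))             # remaining blocks vanish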

First conclusion would still be that A + B is an orthogonal matrix,
since (A + B)^t (A + B) = A^t A + B^t B + A^t B + B^t A = 1.
Now the A^t B = 0 condition means that the columns of A, interpreted
as vectors, are orthogonal to the columns of B, and therefore, with
matrices interpreted as linear transformations, the images of A and B
are orthogonal linear subspaces of R^N. They are complementary since
for every y in R^N there is an x such that y = (A + B) x = Ax + Bx.

This demonstrates that for every pair (A, B) there is a unique
combination of an orthogonal transformation and an ordered pair of
complementary orthogonal subspaces such that A and B are each the
composition of one of the two orthogonal projections of R^N onto those
subspaces with the orthogonal transformation.
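
(The parameters can in fact be read off again: the orthogonal
transformation is A + B, and the projections are A(A + B)^t and
B(A + B)^t. A NumPy sketch of my own, building a valid pair via the
construction of the next paragraph and then recovering it:)

import numpy as np

rng = np.random.default_rng(2)
N, k = 6, 2  # k = dimension of the chosen subspace (arbitrary)

# A valid (A, B) pair built from known parameters S, T1, T2.
S, _ = np.linalg.qr(rng.standard_normal((N, N)))  # orthogonal transformation
V, _ = np.linalg.qr(rng.standard_normal((N, k)))  # orthonormal basis of a subspace
T1 = V @ V.T                                      # projection onto the subspace
T2 = np.eye(N) - T1                               # projection onto its complement
A, B = T1 @ S, T2 @ S

# Recovering the parameters from (A, B) alone.
S_rec = A + B
T1_rec, T2_rec = A @ S_rec.T, B @ S_rec.T
print(np.allclose(S_rec, S), np.allclose(T1_rec, T1), np.allclose(T2_rec, T2))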

In the opposite direction, starting with an arbitrary orthogonal
transformation S and a complementary pair of orthogonal subspaces with
corresponding orthogonal projections T1 and T2 and defining A = T1 S
and B = T2 S produces a valid (A, B) combination:

A^t B = 0 because the images of A and B are orthogonal, and Pythagoras
guarantees that for every x in R^N one has
x^t A^t A x + x^t B^t B x = (Ax)^t Ax + (Bx)^t Bx = (Sx)^t Sx = x^t x,
which is equivalent to A^t A + B^t B = 1. (Not completely trivially:
a symmetric matrix is determined by its quadratic form, and
A^t A + B^t B is symmetric, so the identity in x forces the matrix
identity.)
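
(A quick NumPy check of this direction; the dimension k of the first
subspace is an arbitrary choice of mine.)

import numpy as np

rng = np.random.default_rng(3)
N, k = 6, 4

S, _ = np.linalg.qr(rng.standard_normal((N, N)))  # arbitrary orthogonal S
V, _ = np.linalg.qr(rng.standard_normal((N, k)))  # basis of an arbitrary subspace
T1 = V @ V.T                                      # projection onto the subspace
T2 = np.eye(N) - T1                               # projection onto the complement
A, B = T1 @ S, T2 @ S

print(np.allclose(A.T @ A + B.T @ B, np.eye(N)))  # A^t A + B^t B = 1
print(np.allclose(A.T @ B, np.zeros((N, N))))     # A^t B = 0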

So the described modeling is indeed a parametrization.

Leon


