On Thu, 05 Sep 2013 14:05:56 +0100, Robin Chapman <R.J.Chapman@ex.ac.uk> wrote:
>On 05/09/2013 12:00, James Dow Allen wrote:
>> Leon Aigret <email@example.com> might have writ, in
>> news:firstname.lastname@example.org:
>>
>>> The AB^t = 0 condition translates to the requirement that MS^t is both
>>> its own transpose and its own inverse, with drastic consequences for
>>> its eigenvectors and eigenvalues.
>>
>> This is the essential point, and leads directly to the unique parametric
>> form I sought. Do such matrixes, which are both symmetric and orthogonal,
>> arise often?
>
>They correspond to vector subspaces of R^n. For such a subspace V
>it acts as +1 on V and -1 on the orthogonal complement of V.
So let T1 and T2 be the matrices that, in the usual way for linear transformations, represent the orthogonal projections of R^N onto the subspaces with eigenvalues 1 and -1 respectively. Then T1 + T2 = 1 and T1 - T2 = MS^t, so M = (T1 - T2) S. From M = A + B and S = A - B it then follows that A = 1/2 (M + S) = 1/2 (T1 - T2 + 1) S = T1 S, and similarly B = -T2 S. Ignoring the last minus sign for aesthetic reasons would also work, because (A, B) is a valid combination iff (A, -B) is.
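In coordinates this recovery looks as follows (a toy 2x2 example of my own: S a rotation and MS^t the axis-swap reflection). Since T1 + T2 = 1 and T1 - T2 = MS^t, one has T1 = (1 + MS^t)/2 and T2 = (1 - MS^t)/2, and then A = T1 S = (M + S)/2 and B = -T2 S = (M - S)/2:

```python
import math

# Toy example (my own choices, not from the thread): S a rotation, M = Q S
# with Q the axis-swap reflection, so that M S^t = Q is symmetric orthogonal.

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(P):
    return [[P[j][i] for j in range(2)] for i in range(2)]

def add(P, Q, s=1.0):
    return [[P[i][j] + s * Q[i][j] for j in range(2)] for i in range(2)]

def scale(P, c):
    return [[c * P[i][j] for j in range(2)] for i in range(2)]

def close(P, Q, tol=1e-12):
    return all(abs(P[i][j] - Q[i][j]) < tol for i in range(2) for j in range(2))

th = math.pi / 3
S = [[math.cos(th), -math.sin(th)],
     [math.sin(th),  math.cos(th)]]       # orthogonal
Q = [[0.0, 1.0], [1.0, 0.0]]              # symmetric orthogonal
M = matmul(Q, S)                          # so M S^t = Q

I = [[1.0, 0.0], [0.0, 1.0]]
MSt = matmul(M, transpose(S))
T1 = scale(add(I, MSt), 0.5)              # (1 + MS^t)/2: projection, +1 eigenspace
T2 = scale(add(I, MSt, -1.0), 0.5)        # (1 - MS^t)/2: projection, -1 eigenspace

A = matmul(T1, S)                         # A = T1 S = (M + S)/2
B = scale(matmul(T2, S), -1.0)            # B = -T2 S = (M - S)/2

assert close(add(A, B), M)                # M = A + B
assert close(add(A, B, -1.0), S)          # S = A - B
assert close(matmul(T1, T1), T1)          # T1 really is a projection
print("recovered A = T1 S and B = -T2 S from M and S")
```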
Presumably, this is the easy answer. A "very easy" answer could now use the same idea but skip the exploration part:
The first step would still be the conversion from verbal to formal: multiplying the transpose of the big matrix with the matrix itself results in a matrix with diagonal components A^t A + B^t B and off-diagonal components A^t B and B^t A, so the conditions for orthogonality are that A^t A + B^t B = 1 (the identity matrix) and that A^t B = 0 (which is equivalent to B^t A = 0).
The first conclusion would still be that A + B is an orthogonal matrix. Now the A^t B = 0 condition means that the columns of A, interpreted as vectors, are orthogonal to the columns of B, and therefore, with the matrices interpreted as linear transformations, the images of A and B are orthogonal linear subspaces of R^N. They are complementary, since A + B is invertible: for every y in R^N there is an x such that y = (A + B) x = Ax + Bx.
This demonstrates that for every pair (A, B) there is a unique combination of an orthogonal transformation and an ordered pair of complementary orthogonal subspaces such that each of A and B is the composition of the orthogonal transformation followed by one of the two orthogonal projections of R^N onto those subspaces.
In the opposite direction, starting with an arbitrary orthogonal transformation S and a complementary pair of orthogonal subspaces with corresponding orthogonal projections T1 and T2 and defining A = T1 S and B = T2 S produces a valid (A, B) combination:
A^t B = 0 because the images of A and B are orthogonal, and Pythagoras guarantees that for every x in R^N one has x^t A^t A x + x^t B^t B x = (Ax)^t Ax + (Bx)^t Bx = (Sx)^t Sx = x^t x, which is, not completely trivially, equivalent to A^t A + B^t B = 1 (the equivalence holds because A^t A + B^t B is symmetric, and a symmetric matrix whose quadratic form vanishes for every x must be zero).
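The forward direction can also be checked numerically (again a toy 2x2 example with my own choices: S an arbitrary rotation, T1 = u u^t the projection onto an arbitrary line, T2 = 1 - T1 its complement):

```python
import math

# Forward direction: with S orthogonal, T1 = u u^t and T2 = 1 - T1,
# the pair A = T1 S, B = T2 S satisfies both orthogonality conditions.

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(P):
    return [[P[j][i] for j in range(2)] for i in range(2)]

def close(P, Q, tol=1e-12):
    return all(abs(P[i][j] - Q[i][j]) < tol for i in range(2) for j in range(2))

th = 0.7
S = [[math.cos(th), -math.sin(th)],
     [math.sin(th),  math.cos(th)]]

u = (0.6, 0.8)                                             # unit vector
T1 = [[u[i] * u[j] for j in range(2)] for i in range(2)]   # projection onto span(u)
I = [[1.0, 0.0], [0.0, 1.0]]
T2 = [[I[i][j] - T1[i][j] for j in range(2)] for i in range(2)]

A = matmul(T1, S)
B = matmul(T2, S)

AtB = matmul(transpose(A), B)
AtA_plus_BtB = [[matmul(transpose(A), A)[i][j] + matmul(transpose(B), B)[i][j]
                 for j in range(2)] for i in range(2)]

assert close(AtB, [[0.0, 0.0], [0.0, 0.0]])   # A^t B = 0
assert close(AtA_plus_BtB, I)                 # A^t A + B^t B = 1
print("(A, B) = (T1 S, T2 S) passes both conditions")
```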
So the described modeling is indeed a parametrization.