```
Date: Apr 28, 2004 7:56 PM
Author: Dave Rusin
Subject: Re: Failing Linear Algebra:

In article <20040428191317.09060.00000518@mb-m17.aol.com>, Anonymous wrote:

>>(a) Prove that the set  M  of all n by n matrices is a vector space (using
>>    familiar matrix addition and scalar multiplication.)
>
>Let set M be all nXn matrices:

[Stick to fewer than 80 columns. What you displayed was probably supposed
to be a "picture" of three matrices, A, B, and C.]

Niggling little point: "Let ... be all nXn matrices" is a bit wrong.
The thing you're defining isn't going to be a matrix, and I don't see how
it can be all matrices, since there are different matrices and a thing
can't be both A and B if A and B are different. What you mean to say is
"Let ... be the SET of all nXn matrices."

Did you know that at this level, mathematics is 90% grammar? Diagram
your sentence: you have a singular subject, and so you need a singular
subject-complement; "the set" works, but "all ..." does not.

>Let s, t be scalars, elements of R.  And let the above 3 matrices be called A,
>B, C, respectively.
>
>I'd show that A + B = [deleted]
>also exists in set M.

OK.

>Then show A + B (above) = B + A, which is true because (a11 + b11) = (b11 +
>a11), same for all (aij + bij) = (bij + aij), since addition of scalars is
>commutative.

Good. (By the way, you can indicate subscripts with an underscore: write
a_ij  or  a_{i,j}  or something like that.)

>Then show (A+B) + C = A + (B+C) (associative).

Yes. No need to show me the proof; it's similar to the previous one.

>Then I'd multiply the zero matrix by A and show that it equals 0 matrix, which
>is also in M.

NO! Being a vector space makes no demand that you be able to multiply two
elements of  M  together in any way. Maybe you mean to show that multiplying
A  by the _scalar_ 0  gives the zero matrix.

>Multiply I^n by A, to show that I*A = A.

NO again; the key thing is to show that multiplying by the NUMBER 1 returns
the matrix  A. (In other words, you need an axiom to prevent silly things
from slipping in under the radar as a "vector space". Many theorems would
fail if you didn't insist that  1*A = A, because a person could say, "Oh,
here's my vector space, and my definition of scalar multiplication is
that  c*A  means the zero matrix, no matter what  c  is." This is the only
axiom that prevents that.)

>multiply s(A) and show that:
>is also in M.

OK.

>(st)(A) = (s(tA))

This has to be verified; you haven't done it yet, but yes, it's pretty
trivial too.

>And, there's the eight, right?

Um, you can't have copied the axioms right. There are different ways to
present vector spaces, but there have to be axioms which combine scalar
multiplication with both additions (of numbers and of vectors). Something
like: for all numbers  c, d  and all vectors  v  we have

  (c+d) v = (c v) + (d v)

and another one. (P.S. -- don't neglect the quantifiers "for all ...";
many people do, and I think it only adds to the confusion.)

You will probably need to copy your book's set of axioms to the newsgroup,
since it looks like they're a little different from what I think is
most common. (I don't suppose it says something like, "We say that
(V, +, *, 0)  is a vector space if ...", does it?)

>>What is its dimension?
>
>N, I guess, since it has N rows.

NO! Find a basis and count how many elements are in that basis.
Of course, this will probably get us into a discussion of what a basis is...

>So, in homogenous form, there is a set of N
>equations each with N variables, right?

This "homogeneous form" stuff which you've mentioned before does not
really apply. You must be thinking of some specific applications of
vector spaces.

>>(b) Prove that the map  f(x) = x^t  is a linear transformation from M to M

Apologies; I did not make this notation clear. Putting a little "t" northeast
of the name of a matrix means to take its transpose. Maybe your book writes
x'  for this, or uses some other notation.

>For any matrix A in set M (same A as above), f(x) = x^t maps every element aij
>in A to bij = (aij)^2 in matrix B.

Ack! Ptui! I hope first of all that you don't think that this is how you
square a matrix. Second, neither the matrix-squaring map  f(x) = x^2,
nor the "Kronecker-square" map which you just described, is a linear
transformation! (New exercise: prove this!)

[wildly bogus proof deleted]

>>What is its kernel?

Try again for the transpose map. Incidentally, if you apply the definition
of the kernel to the (nonlinear) map  f(x) = x^2, you get non-trivial
things in the "kernel" when  n > 1.

>>(c) Compute the eigenvalues of  f  and find the eigenspaces.
>
>I don't know how this would work without actual numbers and an actual matrix.
>Since the example is an nXn matrix, we don't know how to calculate the
>determinant.  We need det (A-lambda*I) to find the eigenvalues and
>eigenvectors.

NO! You yourself sort of gave the definition of eigenvectors in an
earlier post. You did not mention determinants, nor should you have!
I will give a hint: What is  f o f ? (That is, what happens if you apply
f  twice in a row?)

dave
```
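The claims in the exchange above — several of the vector-space axioms for M, the linearity of the transpose map, the non-linearity of entrywise squaring, and the hint that f∘f = identity forces eigenvalues +1 and -1 (with symmetric and antisymmetric matrices as the eigenspaces) — can all be checked on a concrete example. Here is a small sketch in plain Python (not part of the original post; matrices are nested lists, and all names here are my own):

```python
# Minimal matrix helpers for M = set of n x n matrices (nested lists).
def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def smul(c, X):          # scalar multiplication c*X
    return [[c * x for x in row] for row in X]

def transpose(X):        # the map f(X) = X^t from the problem
    return [list(row) for row in zip(*X)]

def entrywise_square(X): # the student's (incorrect) reading of x^t
    return [[x * x for x in row] for row in X]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 5.0], [6.0, 7.0]]
s, t = 2.0, -3.0

# A few vector-space axioms for M:
assert add(A, B) == add(B, A)                            # A + B = B + A
assert smul(1, A) == A                                   # 1*A = A
assert smul(s * t, A) == smul(s, smul(t, A))             # (st)A = s(tA)
assert smul(s + t, A) == add(smul(s, A), smul(t, A))     # (s+t)A = sA + tA

# f(X) = X^t is linear:
assert transpose(add(A, B)) == add(transpose(A), transpose(B))
assert transpose(smul(s, A)) == smul(s, transpose(A))

# ...but entrywise squaring is not (additivity fails here):
assert entrywise_square(add(A, B)) != add(entrywise_square(A),
                                          entrywise_square(B))

# The hint: f o f is the identity, so an eigenvalue c must satisfy c^2 = 1.
assert transpose(transpose(A)) == A

# Eigenspaces: S symmetric (f(S) = S, eigenvalue +1), K antisymmetric
# (f(K) = -K, eigenvalue -1); every matrix splits as A = S + K.
S = smul(0.5, add(A, transpose(A)))
K = smul(0.5, add(A, smul(-1, transpose(A))))
assert transpose(S) == S
assert transpose(K) == smul(-1, K)
assert add(S, K) == A

print("all checks passed")
```

Of course, a finite check on one 2x2 example is evidence, not a proof — the point of the exercise is to argue from the axioms for arbitrary n — but running something like this is a quick way to catch a wrong conjecture (such as the entrywise-squaring map being linear) before trying to prove it.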