The Math Forum




Topic: Failing Linear Algebra:
Replies: 54   Last Post: Jan 10, 2007 12:47 PM

Russell Blackadar

Posts: 586
Registered: 12/12/04
Re: Failing Linear Algebra:
Posted: Apr 29, 2004 3:24 AM

On 28 Apr 2004 22:32:04 GMT, Anonymous wrote:

>Russell:
>

>> If the kernel
>>is {0}, i.e. the set of rows (or the set of columns) of the matrix is
>>linearly independent, and the matrix is square, then the matrix is
>>invertible and you have an isomorphism. Otherwise, one or more rows
>>or columns is a linear combination of the others; and that means there
>>are nonzero vectors which, when multiplied by your matrix, give zero.
>>In other words, those vectors are members of the kernel of your linear
>>transformation. See how it all fits together?

>
>Right. So, if a matrix is homogenous and in echelon form, the number of free
>variables is the same as the kernel, because if any free variable is set equal
>to 0, then the system holds.


To my knowledge, the term "homogeneous" is applied to a certain matrix
equation, not to the matrix itself. And the kernel is a set, not a
number; you should be talking about the *dimension* of the kernel in
this context. Also you should be aware that the "free variables" you
speak of here, as far as I understand you, are not themselves vectors
in your space; they are *components* of the nx1 column vector _x_ that
you are multiplying by your matrix to get the zero column vector _0_.
Also the salient point of these variables being "free" is that they
can be set to *anything* (not just zero) and the system holds. That
means there are many, many column vectors satisfying such a
homogeneous matrix equation, a whole r-dimensional subspace of them,
where r is the number of free variables.
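
To make that concrete, here's a rough sketch in Python using SymPy (my
choice of tool, not anything from your course); the matrix below is just
a made-up example with two pivots, so two of its four variables are free.

from sympy import Matrix

# A made-up 3x4 matrix already in row-echelon form, with pivots in
# columns 0 and 2; columns 1 and 3 correspond to free variables.
A = Matrix([[1, 2, 0, 3],
            [0, 0, 1, 4],
            [0, 0, 0, 0]])

kernel_basis = A.nullspace()   # a basis of the kernel, i.e. of {x : A*x = 0}
r = len(kernel_basis)          # dimension of the kernel
print(r)                       # 2, the number of free variables

# Any linear combination of the basis vectors is again a solution,
# so the solutions of A*x = 0 form a whole r-dimensional subspace.
v = 5*kernel_basis[0] - 7*kernel_basis[1]
print(A * v)                   # the zero column vector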

>Therefore, the number of basic variables is the
>same as the image, right?

Its dimension, I think you mean. I'm not familiar with the term
"basic variables", but yes, the dimension of the image is n-r in the
terminology I used above. The basic variables are the components of _x_
that are determined by the free ones; in particular, they must all be
zero once every free variable is set to zero, assuming the matrix is in
row-echelon form.
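
Continuing the same made-up example (again just a sketch, with SymPy
standing in for hand computation), you can check the rank-nullity
bookkeeping directly: dim(image) + dim(kernel) = n.

from sympy import Matrix

A = Matrix([[1, 2, 0, 3],
            [0, 0, 1, 4],
            [0, 0, 0, 0]])

n = A.cols                # number of columns, i.e. number of variables
r = len(A.nullspace())    # dim of kernel = number of free variables
rank = A.rank()           # dim of image = number of basic (pivot) variables
print(rank, n - r)        # both are 2: rank-nullity in action

rref_form, pivot_cols = A.rref()
print(pivot_cols)         # (0, 2): the columns of the basic variables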

Btw you are again not thinking abstractly enough for my taste; this
matrix equation (or system of equations if you prefer to call it that)
is one very nice application of matrix theory, but by no means is it
the only one.

>
>>Btw you asked elsewhere what an eigenvalue is. The definition is
>>simplicity itself, learn it!

>
>Yeah, I know the definition. I think I stated it in a post already.
>

>>Btw I found nothing objectionable
>>about Lipschutz from an abstract point of view; if he made it any
>>simpler he would mislead, but he treads that fine line very nicely
>>IMHO.

>
>OK, thanks. I'll be sure to use it to study.


Of course, your mileage may vary. If you do use it, be sure not to
neglect the exercises that are proofs. Indeed, if triage is needed,
you should probably concentrate mainly on proofs over the next couple
of weeks.

Your single most damning confusion, as far as I can tell, is that you
do not yet seem able to distinguish between a list (or set or space) of
vectors and the n-tuple that describes (or is) one vector. Of course
every component of a vector does also happen to be a vector in R, but
in most cases R is a *different* vector space than the one you are
considering, i.e. the components are *not* vectors in the context of
the problem or proof you are working on. So to avoid confusion, you
shouldn't call them vectors.

Take special care to notice when a list is surrounded by {} -- that is
a set, usually in this context a set of vectors -- and when it is
surrounded by () -- that is (usually, in this context) an n-tuple, and
it (usually) refers to a *single* vector. Perhaps it is an unknown or
arbitrary vector, but it is *one* nonetheless. And nonethemore.

E.g. {(1,0,0), (0,1,0), (0,0,1)} is a set of three vectors; it's one
of the possible basis sets for the vector space R^3. 1, 0, 0 are the
three components of the vector (1,0,0) in R^3.
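
If it helps, here's one last sketch (again my own SymPy illustration,
not anything from Lipschutz) showing the difference: basis_set below is
a collection of three vectors, while v is one single vector with three
components.

from sympy import Matrix

basis_set = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # three vectors of R^3
M = Matrix(basis_set)    # stack them as the rows of a 3x3 matrix

print(M.rank())          # 3: linearly independent, so they form a basis of R^3

v = basis_set[0]         # ONE vector, the 3-tuple (1, 0, 0)
print(len(v))            # 3 -- its components, which are numbers, not vectors of R^3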

