The Math Forum




Topic: Failing Linear Algebra:
Replies: 54   Last Post: Jan 10, 2007 12:47 PM

Grey Knight

Posts: 10
Registered: 12/13/04
Re: Failing Linear Algebra:
Posted: Apr 28, 2004 8:51 AM

Anonymous wrote in message news:<c98b1ba0.0404271403.8eb3e99@posting.google.com>...
> harrisq@tcs.inf.tu-dresden.de (Mitch Harris) wrote in message news:<c6e1ep$aov3s$1@uni-berlin.de>...

[NB: I write <<M>> for matrices, <x> for vectors, <<A'>> for the
inverse of A, <<A+>> for its transpose and det(A) for its determinant.
<<I>> represents the identity.]

> >the determinant is what?

Some interesting facts about determinants:
det(<<A>> <<B>>) = det(<<A>>) det(<<B>>)
If one row or column of <<A>> consists of all zeroes, det(<<A>>)=0
If any two rows or columns of <<A>> are identical, det(<<A>>)=0
det(a <<A>>) = a^n det(<<A>>) for n-by-n matrices
det(<<A'>>) = 1/det(<<A>>), provided det(<<A>>) not= 0 (otherwise
<<A'>> doesn't even exist)
det(<<A+>>) = det(<<A>>)
det(<<X>> <<A>> <<X'>>) = det(<<A>>) provided det(<<X>>) not= 0
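
If you want to see these facts in action, here's a quick numerical
sketch (Python with numpy; the example matrices are arbitrary, just
something to plug in):

import numpy as np

n = 3
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])
B = np.random.rand(n, n)

# det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))
# det(aA) = a^n det(A) for n-by-n matrices
print(np.isclose(np.linalg.det(5.0 * A), 5.0**n * np.linalg.det(A)))
# det(A') = 1/det(A), assuming A is invertible
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A)))
# det(A+) = det(A)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))

Each of these should print True.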

>
> >eigenvectors are what?
> If T(*x*) = (constant)(*x*), then (*x*) is the eigenvector and
> (constant [lambda]) is the eigenvalue, right?


A matrix <<A>> represents a linear transformation; call its
eigenvalues l_i and the corresponding eigenvectors <x_i>. Then any
point which can be written a*<x_i> will transform to the point
(l_i)*a*<x_i>; that is, points starting on the line parallel to <x_i>
through the origin will stay on that line. The eigenvalue tells you
how they move along it. Interestingly, if you apply the transformation
<<A>> to a vector <x> a large number of times, the result tends to
become parallel to the eigenvector whose corresponding eigenvalue has
the largest absolute value (ie: the <x_i> such that |l_i| > |l_j| for
all other j), provided <x> has a nonzero component along that <x_i>.
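
That repeated-application trick is called power iteration; a minimal
sketch (numpy again, with an arbitrary example matrix and a starting
vector that I'm assuming has a nonzero component along the dominant
eigenvector):

import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
x = np.array([1., 0.])            # arbitrary starting vector

for _ in range(50):
    x = A @ x
    x = x / np.linalg.norm(x)     # renormalise so the entries don't blow up

print("approx dominant eigenvector:", x)
print("approx dominant eigenvalue: ", x @ A @ x)   # Rayleigh quotient, since |x| = 1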

> I still need to understand what has to be done to solve for the eigenvalues
> and eigenvectors though.


If you calculate det(<<A>>-l*<<I>>) for an n-by-n matrix <<A>>, you
will get a degree-n polynomial in l; generally you'll be expected to
deal with 2-by-2 matrices, so this polynomial is just a quadratic. The
solutions of det(<<A>>-l*<<I>>) = 0, ie: the roots of this polynomial,
are your eigenvalues. The eigenvectors can then be found by solving
<<A>> <x_i> = l_i <x_i>; that is, (<<A>> - l_i<<I>>)<x_i> = <0>.
(This is also where the expression det(<<A>>-l*<<I>>) for the
eigenvalues comes from, BTW: that equation only has a nonzero solution
<x_i> when <<A>> - l_i<<I>> is singular, ie: when its determinant is
zero.)
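
For the 2-by-2 case you can do the whole thing by hand; here's a sketch
in Python (the entries are picked arbitrarily, and I'm assuming the
eigenvalues come out real):

import math

# A = [[a, b], [c, d]]; det(A - l*I) = l^2 - (a + d)*l + (a*d - b*c)
a, b, c, d = 2.0, 1.0, 1.0, 3.0
trace, det = a + d, a*d - b*c
disc = math.sqrt(trace**2 - 4*det)     # assumes real eigenvalues
l1 = (trace + disc) / 2
l2 = (trace - disc) / 2
print("eigenvalues:", l1, l2)

# Eigenvector for l1: solve (A - l1*I)<x> = <0>.
# One convenient solution is (b, l1 - a), valid whenever b != 0.
print("eigenvector for l1:", (b, l1 - a))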

Interestingly, if you construct a matrix <<L>> = diag(l_i) from the
eigenvalues (that is, the diagonal elements of <<L>> are the l_i; the
other elements are zero) and another matrix <<X>> by composing the
eigenvectors as columns (that is, each column of <<X>> is one of the
eigenvectors), making sure to get the eigenvalues in the same order as
the eigenvectors, then (provided the eigenvectors are linearly
independent, so that <<X'>> actually exists)
<<A>> = <<X>> <<L>> <<X'>>
which is a similarity transformation! <<L>> is much easier to work
with than <<A>>, so this pays off greatly. In particular you can
compute powers easily:

<<A>> ^ m = <<A>> <<A>> ... <<A>>
          = (<<X>> <<L>> <<X'>>) (<<X>> <<L>> <<X'>>) ... (<<X>> <<L>> <<X'>>)
          = <<X>> <<L>> (<<X'>> <<X>>) <<L>> (<<X'>> <<X>>) ... (<<X'>> <<X>>) <<L>> <<X'>>
          = <<X>> <<L>> <<I>> <<L>> <<I>> ... <<I>> <<L>> <<X'>>
          = <<X>> (<<L>> ^ m) <<X'>>

which is far easier to calculate; since <<L>> is diagonal, <<L>>^m
just means you raise each of the diagonal elements to the m-th power
separately!
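
In numpy terms (a sketch, assuming <<A>> really is diagonalizable so
that <<X'>> exists):

import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
m = 5

evals, X = np.linalg.eig(A)     # columns of X are the eigenvectors
L = np.diag(evals)              # diagonal matrix of the eigenvalues
A_m = X @ np.linalg.matrix_power(L, m) @ np.linalg.inv(X)   # X L^m X'

# Same answer as multiplying A out m times:
print(np.allclose(A_m, np.linalg.matrix_power(A, m)))       # True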

> similar
> > matrices behave how?)
> I don't know what a similar matrix is. Is that where one of them
> equals the other times a scalar? If so, how does one determine if two
> matrices *are* similar?


Two matrices <<A>> and <<B>> are similar iff
<<A>> = <<X>> <<B>> <<X'>>
for some matrix <<X>>, which must obviously be nonsingular (ie:
det(<<X>>) not= 0), otherwise <<X'>> doesn't even exist!
If <<B>> represents some linear transformation with respect to the
basis vectors <e_i>, then <<A>> represents the same transformation
with respect to the basis vectors <d_i> = <<X>> <e_i>. In particular,
similar matrices describe the same transformation, so they share the
same determinant, trace and eigenvalues.
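
A quick way to convince yourself of the "same eigenvalues" point
(another numpy sketch; <<X>> here is just some arbitrary nonsingular
matrix):

import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
X = np.array([[1., 2.],
              [0., 1.]])             # any nonsingular matrix will do

B = np.linalg.inv(X) @ A @ X          # B = X' A X, so A = X B X'
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(B)))  # same eigenvalues as A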


