
Topic: Failing Linear Algebra:
Replies: 54   Last Post: Jan 10, 2007 12:47 PM

 Grey Knight Posts: 10 Registered: 12/13/04
Re: Failing Linear Algebra:
Posted: Apr 28, 2004 8:51 AM

> harrisq@tcs.inf.tu-dresden.de (Mitch Harris) wrote in message news:<c6e1ep$aov3s$1@uni-berlin.de>...

[NB: I write <<M>> for matrices, <x> for vectors, <<A'>> for the
inverse of A, <<A+>> for its transpose and det(A) for its determinant.
<<I>> represents the identity.]

> >the determinant is what?

det(<<A>> <<B>>) = det(<<A>>) det(<<B>>)
If one row or column of <<A>> consists of all zeroes, det(<<A>>)=0
If any two rows or columns of <<A>> are identical, det(<<A>>)=0
det(a <<A>>) = a^n det(<<A>>) for n-by-n matrices
det(<<A'>>) = 1/det(<<A>>) provided det(<<A>>) not= 0; otherwise
<<A'>> doesn't exist
det(<<A+>>) = det(<<A>>)
det(<<X>> <<A>> <<X'>>) = det(<<A>>) provided det(<<X>>) not= 0
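These identities are easy to spot-check numerically (the matrix values below are arbitrary, chosen just for illustration):

```python
import numpy as np

# Two arbitrary 3-by-3 matrices, purely for checking the identities
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# det(A B) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# det(a A) = a^n det(A) for an n-by-n matrix
a, n = 5.0, 3
assert np.isclose(np.linalg.det(a * A), a**n * np.linalg.det(A))

# det(A') = 1/det(A)  and  det(A+) = det(A)
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))
```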

>
> >eignevectors are what?
> If T(*x*) = (constant)(*x*), then (*x*) is the eigenvector and
> (constant [lambda]) is the eigenvalue, right?

A matrix <<A>> represents a linear transformation; call its
eigenvalues l_i and the corresponding eigenvectors <x_i>. Then any
point which can be written a*<x_i> will transform to the point
(l_i)*a*<x_i>; that is, points starting on the line parallel to <x_i>
through the origin will stay on that line. The eigenvalue tells you
how they move along it. Interestingly, if you apply the transformation
<<A>> to almost any vector <x> a large number of times, the result
tends to become parallel to the eigenvector whose corresponding
eigenvalue has the largest absolute value (ie: the <x_i> such that
|l_i| > |l_j| for all other l_j).
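This is the idea behind power iteration; a quick demonstration (the 2-by-2 matrix below is hypothetical, with eigenvalues 3 and 1):

```python
import numpy as np

# Hypothetical symmetric matrix: eigenvalue 3 with eigenvector (1,1),
# eigenvalue 1 with eigenvector (1,-1)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, 0.0])      # arbitrary starting vector
for _ in range(50):           # apply A many times, renormalising each step
    x = A @ x
    x = x / np.linalg.norm(x)

# x is now (nearly) parallel to the dominant eigenvector (1,1)/sqrt(2),
# whose eigenvalue 3 has the largest absolute value
print(x)
```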

> I still need to understand what has to be done to solve for the eigenvalues
> and eigenvectors though.

If you calculate det(<<A>>-l*<<I>>) for an n-by-n matrix <<A>>, you
will get a degree-n polynomial in l; generally you'll be expected to
deal with 2-by-2 matrices, so this polynomial is just a quadratic. The
solutions of this polynomial are your eigenvalues. The eigenvectors
can then be found by solving the problem <<A>> <x_i> = l_i <x_i>; that
is, (<<A>> - l_i <<I>>) <x_i> = <0>.
(This is where the expression det(<<A>>-l*<<I>>) for the eigenvalues
comes from, BTW)
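A worked 2-by-2 example (the matrix values are picked arbitrarily so the roots come out nicely):

```python
import numpy as np

# Hypothetical example: A = [[4, 1], [2, 3]]
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# det(A - l*I) = (4-l)(3-l) - 2 = l^2 - 7l + 10 = (l-5)(l-2),
# so the eigenvalues are l = 5 and l = 2
coeffs = np.poly(A)              # characteristic polynomial coefficients
eigenvalues = np.roots(coeffs)   # its roots are the eigenvalues
print(sorted(eigenvalues))       # approximately [2.0, 5.0]

# The eigenvector for l = 5 solves (A - 5I)x = 0; x = (1, 1) works
assert np.allclose(A @ np.array([1.0, 1.0]), 5 * np.array([1.0, 1.0]))
```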

Interestingly, if you construct a matrix <<L>> = diag(l_i) from the
eigenvalues (that is, the diagonal elements of <<L>> are the l_i; the
other elements are zero) and another matrix <<X>> whose columns are
the eigenvectors, making sure to get the eigenvalues in the same order
as the eigenvectors, then (provided the eigenvectors are linearly
independent, so that <<X'>> exists)
<<A>> = <<X>> <<L>> <<X'>>
which is a similarity transformation! <<L>> is much easier to work
with than <<A>>, so this pays off greatly. In particular you can
compute powers easily:

<<A>>^m = <<A>> <<A>> ... <<A>>
        = (<<X>> <<L>> <<X'>>) (<<X>> <<L>> <<X'>>) ... (<<X>> <<L>> <<X'>>)
        = <<X>> <<L>> (<<X'>> <<X>>) <<L>> (<<X'>> <<X>>) ... <<L>> <<X'>>
        = <<X>> <<L>> <<I>> <<L>> <<I>> ... <<L>> <<X'>>
        = <<X>> (<<L>>^m) <<X'>>

which is far easier to calculate; since <<L>> is diagonal, <<L>>^m
just means you raise each of the diagonal elements to the m-th power
separately!
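Checking this with the same hypothetical matrix as before:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # hypothetical example matrix

l, X = np.linalg.eig(A)          # eigenvalues l_i; eigenvectors as columns of X
L = np.diag(l)                   # the diagonal matrix <<L>>

m = 6
# A^m = X L^m X', where L^m just raises each diagonal element to the m-th power
A_m = X @ np.diag(l**m) @ np.linalg.inv(X)

# Agrees with multiplying A by itself m times
assert np.allclose(A_m, np.linalg.matrix_power(A, m))
```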

> similar
> > matrices behave how?)
> I don't know what a similar matrix is. Is that where one of them
> equals the other times a scalar? If so, how does one determine if two
> matrices *are* similar?

Two matrices <<A>> and <<B>> are similar iff
<<A>> = <<X>> <<B>> <<X'>>
for some <<X>>, which must obviously be nonsingular (ie: det(<<X>>)
not= 0), otherwise <<X'>> doesn't even exist!
If <<B>> represents some linear transformation with basis vectors
<e_i>, then <<A>> represents the same transformation with the basis
vectors <d_i> = <<X>> <e_i>
