Anonymous wrote in message news:<firstname.lastname@example.org>...
> email@example.com (Mitch Harris) wrote in message news:<firstname.lastname@example.org>...
[NB: I write <<M>> for matrices, <x> for vectors, <<A'>> for the inverse of <<A>>, <<A+>> for its transpose, and det(<<A>>) for its determinant. <<I>> represents the identity.]
> >the determinant is what?
Some interesting facts about determinants:

  det(<<A>> <<B>>) = det(<<A>>) det(<<B>>)
  If one row or column of <<A>> consists of all zeroes, det(<<A>>) = 0
  If any two rows or columns of <<A>> are identical, det(<<A>>) = 0
  det(a <<A>>) = a^n det(<<A>>) for n-by-n matrices
  det(<<A'>>) = 1/det(<<A>>), provided det(<<A>>) not= 0; otherwise <<A'>> doesn't exist
  det(<<A+>>) = det(<<A>>)
  det(<<X>> <<A>> <<X'>>) = det(<<A>>), provided det(<<X>>) not= 0
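These are easy to check numerically if you don't feel like proving them. Here's a minimal numpy sketch (my own example, not from the thread; the random matrices and names are mine) verifying each identity on 3-by-3 matrices:

    # My own check of the determinant facts above on random 3-by-3 matrices.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    X = rng.standard_normal((3, 3))   # nonsingular with probability 1
    a, n = 2.5, 3

    print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))
    print(np.isclose(np.linalg.det(a * A), a**n * np.linalg.det(A)))
    print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))
    print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))
    print(np.isclose(np.linalg.det(X @ A @ np.linalg.inv(X)), np.linalg.det(A)))

All five prints come out True (up to floating-point roundoff).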
> > >eigenvectors are what?
> If T(*x*) = (constant)(*x*), then (*x*) is the eigenvector and
> (constant [lambda]) is the eigenvalue, right?
A matrix <<A>> represents a linear transformation; call its eigenvalues l_i and the corresponding eigenvectors <x_i>. Then any point which can be written a*<x_i> will transform to the point (l_i)*a*<x_i>; that is, points starting on the line through the origin parallel to <x_i> will stay on that line. The eigenvalue tells you how they move along it.

Interestingly, if you apply the transformation <<A>> to a vector <x> a large number of times, the result tends to become parallel to the eigenvector whose corresponding eigenvalue has the largest absolute value (ie: the <x_i> such that |l_i| > |l_j| for all j not= i), provided <x> has a nonzero component along that eigenvector.
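That "apply it many times" observation is exactly the power-iteration method. A minimal sketch (the 2-by-2 example matrix is my own choice):

    # Power iteration: repeatedly apply <<A>> and renormalize; <x> converges
    # toward the eigenvector with the largest |eigenvalue|.
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    x = np.array([1.0, 0.0])       # almost any starting vector will do

    for _ in range(50):
        x = A @ x
        x = x / np.linalg.norm(x)  # normalize so the entries don't blow up

    # The Rayleigh quotient then estimates the dominant eigenvalue.
    print("dominant eigenvalue ~", x @ A @ x)
    print("eigenvector ~", x)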
> I still need to understand what has to be done to solve for the eigenvalues
> and eigenvectors though.
If you calculate det(<<A>> - l*<<I>>) for an n-by-n matrix <<A>>, you will get a degree-n polynomial in l; generally you'll be expected to deal with 2-by-2 matrices, so this polynomial is just a quadratic. The roots of this polynomial are your eigenvalues. The eigenvectors can then be found by solving the problem <<A>> <x_i> = l_i <x_i>; that is,

  (<<A>> - l_i <<I>>) <x_i> = <0>

(This system has a nonzero solution <x_i> only when <<A>> - l_i <<I>> is singular, ie: has zero determinant; that is where the equation det(<<A>> - l*<<I>>) = 0 for the eigenvalues comes from, BTW.)
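For the 2-by-2 case the quadratic is l^2 - trace(<<A>>)*l + det(<<A>>), so you can do the whole thing with the quadratic formula. A worked sketch (the example matrix is mine):

    # Eigenvalues of a 2-by-2 by hand, via the characteristic polynomial
    # l^2 - trace(A)*l + det(A), compared against numpy's answer.
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    tr = A[0, 0] + A[1, 1]                        # trace = 7
    dt = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]    # determinant = 10

    disc = np.sqrt(tr**2 - 4 * dt)
    l1, l2 = (tr + disc) / 2, (tr - disc) / 2
    print("by quadratic formula:", l1, l2)        # 5.0, 2.0
    print("by numpy:", np.linalg.eigvals(A))

The eigenvector for l1 = 5 then comes from solving (<<A>> - 5<<I>>)<x> = <0>, which gives <x> = (1, 1) up to scale.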
Interestingly, if you construct a matrix <<L>> = diag(l_i) from the eigenvalues (that is, the diagonal elements of <<L>> are the l_i; the other elements are zero) and another matrix <<X>> by composing the eigenvectors as columns (that is, each column of <<X>> is one of the eigenvectors), making sure to get the eigenvalues in the same order as the eigenvectors, then

  <<A>> = <<X>> <<L>> <<X'>>

which is a similarity transformation! (This requires the eigenvectors to be linearly independent, so that <<X'>> exists.) <<L>> is much easier to work with than <<A>>, so this pays off greatly. In particular you can compute powers easily:

  <<A>>^m = <<X>> <<L>>^m <<X'>>

which is far easier to calculate; since <<L>> is diagonal, <<L>>^m just means you raise each of the diagonal elements to the m-th power separately!
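In numpy terms, a sketch (my own example; this assumes <<A>> is diagonalizable, as discussed above):

    # Compute <<A>>^m via the eigendecomposition <<A>> = <<X>> <<L>> <<X'>>.
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    m = 8

    evals, X = np.linalg.eig(A)    # columns of X are the eigenvectors
    L_m = np.diag(evals**m)        # raise each diagonal entry to the m-th power
    A_m = X @ L_m @ np.linalg.inv(X)

    print(np.allclose(A_m, np.linalg.matrix_power(A, m)))   # True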
> > similar matrices behave how?)
> I don't know what a similar matrix is. Is that where one of them
> equals the other times a scalar? If so, how does one determine if two
> matrices *are* similar?
Two matrices <<A>> and <<B>> are similar iff <<A>> = <<X>> <<B>> <<X'>> for some <<X>>, which must obviously be nonsingular (ie: det(<<X>>) not= 0), otherwise <<X'>> doesn't even exist! If <<B>> represents some linear transformation with basis vectors <e_i>, then <<A>> represents the same transformation with the basis vectors <d_i> = <<X>> <e_i>.
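One consequence worth knowing: since they represent the same transformation in different bases, similar matrices have the same eigenvalues. A quick numerical check (my own sketch, with random matrices of my choosing):

    # Similar matrices <<A>> = <<X>> <<B>> <<X'>> share a spectrum.
    import numpy as np

    rng = np.random.default_rng(1)
    B = rng.standard_normal((3, 3))
    X = rng.standard_normal((3, 3))   # nonsingular with probability 1
    A = X @ B @ np.linalg.inv(X)

    print(np.sort(np.linalg.eigvals(B)))
    print(np.sort(np.linalg.eigvals(A)))   # same eigenvalues, up to roundoff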