
Ask Dr. Math - Questions and Answers from our Archives
_____________________________________________

Finding Eigenvalues and Eigenvectors

Date: 06/27/2007 at 17:20:32
From: Paul
Subject: linear algebra eigenvalues/eigenvectors

(5,-2,2)      (5-x,-2,2)
(4,-3,4)  ->  (4,-3-x,4),   det = -x^3+9x^2-23x+15 = 0
(4,-6,7)      (4,-6,7-x)

In this 3x3 square matrix, how would you find the eigenvalues and 
eigenvectors?  We believe the eigenvalues are 5,1,3 but we cannot 
seem to come up with eigenvectors.  Thanks, Paul




Date: 06/28/2007 at 00:16:44
From: Doctor Ricky
Subject: Re: linear algebra eigenvalues/eigenvectors

Hi Paul,

Thanks for writing Dr. Math!

Eigenvalues and eigenvectors are actually somewhat interesting in 
that the eigenvectors provide bases for a null space.  Remember, the 
null space of a matrix is the set of vectors that satisfy the 
homogeneous system (where all the equations are set equal to zero); 
here the matrix in question will turn out to be A - LI.  So we will 
define our matrix A as your matrix:

       [5  -2  2]
  A =  [4  -3  4]
       [4  -6  7]

Now, remember that an eigenvalue is a scalar L for which there is 
some NONZERO column vector X such that multiplying X by L gives the 
same result as multiplying X by our matrix A.  i.e.

  AX = LX      where L is the scalar (written as lambda, typically)

This means that:

  AX - LX = 0     where 0 is the zero vector

which is the same as:

  (A - LI)(X) = 0    where I is the identity matrix

We can factor X out on the right because it is multiplied on the 
right in both terms, and we need the identity matrix to turn the 
scalar L into the square matrix LI (note that LIX = LX).  Also, 
remember that matrix multiplication is not generally commutative, so 
the order here is important: the column vector X sits on the right 
of A - LI.
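
As a quick sanity check (not part of the pencil-and-paper method, and 
assuming you have Python with the NumPy library available), you can 
verify the relation AX = LX numerically for the eigenpair L = 5, 
X = [1  1  1]^T that we will find near the end of this discussion:

  import numpy as np

  # The matrix from the question
  A = np.array([[5, -2,  2],
                [4, -3,  4],
                [4, -6,  7]])

  # Eigenpair found later: L = 5 with eigenvector [1, 1, 1]^T
  L = 5
  X = np.array([1, 1, 1])

  print(A @ X)                       # [5 5 5]
  print(L * X)                       # [5 5 5]
  print(np.allclose(A @ X, L * X))   # True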

Now, how do we find the values of L that work?  The key point is 
that we want a NONZERO vector X satisfying this equation.  If the 
square matrix A - LI were invertible, we could multiply both sides of

  (A - LI)(X) = 0

on the left by its inverse and conclude that X = 0, which is exactly 
the one solution we are not allowed to use.  So A - LI must NOT be 
invertible, and a little review of theory would tell you that a 
square matrix fails to be invertible exactly when its determinant is 
zero.

We know that our original matrix A is square, and since the matrix LI 
is just a square matrix with L on the diagonal and zeros everywhere 
else, A - LI is a square matrix also, so its determinant makes sense.  
Putting this together, the eigenvalues are exactly the values of L 
for which

  det(A - LI) = 0

This may have seemed like a lot of explanation to get to this step, 
but a nice review of the topic and why we do each step is important 
to truly grasping the process.

So now let's look at A - LI:

         [5  -2  2]     [L  0  0]   [5-L    -2      2]
  A-LI = [4  -3  4]  -  [0  L  0] = [4    -3-L      4]
         [4  -6  7]     [0  0  L]   [4      -6    7-L]

Using the Laplace (cofactor) expansion, which tells us that we can 
find the determinant of a large matrix by breaking it down into 
minors and cofactors of smaller submatrices, along the top row of the 
matrix, we get:

  det(A-LI) = (5-L)[(-3-L)(7-L)-(4)(-6)] - (-2)[(4)(7-L)-(4)(4)]
              +(2)[(4)(-6)-(-3-L)(4)]
              = 0   [because we said the det(A-LI)=0]

which simplifies to what you found,

  15 - 23L + 9L^2 - L^3 = 0
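
If you want to double-check that algebra by machine (purely optional, 
and assuming Python with the SymPy library is available), you can 
have the computer build A - LI and expand its determinant 
symbolically:

  import sympy as sp

  L = sp.symbols('L')
  A = sp.Matrix([[5, -2,  2],
                 [4, -3,  4],
                 [4, -6,  7]])

  # Expand det(A - L*I) to get the characteristic polynomial
  char_poly = sp.expand((A - L * sp.eye(3)).det())
  print(char_poly)   # -L**3 + 9*L**2 - 23*L + 15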

At this point, we can use the rational root theorem (if we are without 
a graphing utility that can help us find roots) to find the solutions 
of this equation.  The rational root theorem says that the only 
possible rational roots of this equation are the factors of the 
constant term divided by the factors of the leading coefficient, that 
is, the coefficient of the highest power of L.  In this case, the 
constant term and the leading coefficient are 15 and -1, so the 
possible rational roots are:

  +-1, +-3, +-5, +-15

Using synthetic division or by simply plugging these possible values 
in for lambda, we find that the solutions are 1, 3, and 5.
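
Here is one way to run that candidate check by brute force in Python 
(just an illustration, not how you would do it by hand):

  # Characteristic polynomial from above
  def p(L):
      return -L**3 + 9*L**2 - 23*L + 15

  # Rational root theorem candidates: factors of 15 over factors of 1
  candidates = [1, -1, 3, -3, 5, -5, 15, -15]
  print([L for L in candidates if p(L) == 0])   # [1, 3, 5]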

The matrix associated with L = 1 is (A-I), which is:

        [4 -2  2]
  A-I = [4 -4  4]
        [4 -6  6]

The eigenvector is a nonzero column vector X that, when our matrix 
(A - LI) is multiplied by it on the right (because of our initial 
setup), gives the zero vector.  In other words,

             [4 -2  2] [X1]
  (A-I)(X) = [4 -4  4]*[X2] = 0
             [4 -6  6] [X3]

We see on inspection that this will occur when X1=0, X2=1, and X3=1, 
which gives us the eigenvector associated with L = 1 as 
X = [0  1  1]^T, where ^T means transposed so that it is a column 
vector.

Keep in mind that this is just ONE eigenvector that makes this true.  
We could have just as easily used the column vector X = [0  -1  -1]^T, 
or any other nonzero scalar multiple of our first vector.  All of 
these multiples lie along the same line, so they span the same 
one-dimensional eigenspace, and any one of them can serve as a basis 
for it.
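
To see this in code (again just an optional check with NumPy), note 
that (A - I)X is the zero vector for X = [0  1  1]^T and stays zero 
for any scalar multiple of X:

  import numpy as np

  A = np.array([[5, -2,  2],
                [4, -3,  4],
                [4, -6,  7]])
  X = np.array([0, 1, 1])

  print((A - np.eye(3)) @ X)          # [0. 0. 0.]
  print((A - np.eye(3)) @ (-7 * X))   # [0. 0. 0.]  (any multiple works)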

Now, to find the eigenvector associated with L = 3, we have the 
matrix:

              [2 -2  2] [X1]
  (A-3I)(X) = [4 -6  4]*[X2] = 0
              [4 -6  4] [X3] 

Now, since the last two rows are the same, we can eliminate the last 
row, leaving the reduced system:

  [2 -2  2]   [X1]
  [4 -6  4] * [X2] = 0
              [X3]

On inspection, we see that the vector [1 0 -1]^T satisfies this 
relationship, so it is an eigenvector for L = 3 and forms a basis for 
the null space of A - 3I.
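
If you would rather let the computer solve the homogeneous system, 
SymPy's nullspace method (again assuming SymPy is available) produces 
the same line of solutions:

  import sympy as sp

  A = sp.Matrix([[5, -2,  2],
                 [4, -3,  4],
                 [4, -6,  7]])

  # Basis for the null space of A - 3I, i.e. the eigenspace for L = 3.
  # SymPy returns [-1, 0, 1]^T, a scalar multiple of our [1, 0, -1]^T.
  print((A - 3 * sp.eye(3)).nullspace())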

Now, for our final eigenvalue L = 5.  Substituting this in for L in 
our matrix, we get:

              [0 -2  2] [X1]
  (A-5I)(X) = [4 -8  4]*[X2] = 0
              [4 -6  2] [X3]

Although this matrix might seem a little more difficult to find an 
eigenvector for on inspection, the first row tells us that 
-2*X2 + 2*X3 = 0, so X2 = X3.  Using that in the second row, 
4*X1 - 8*X2 + 4*X3 = 4*X1 - 4*X2 = 0, so X1 = X2 as well, and an 
eigenvector for L = 5 could be [1  1  1]^T.
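
And one last optional spot check with NumPy:

  import numpy as np

  A = np.array([[5, -2,  2],
                [4, -3,  4],
                [4, -6,  7]])

  print((A - 5 * np.eye(3)) @ np.array([1, 1, 1]))   # [0. 0. 0.]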

Note that each eigenvector we found forms a basis of the null space 
of A - LI for its own eigenvalue L, otherwise known as the eigenspace 
of L.  Since eigenvectors belonging to distinct eigenvalues are 
linearly independent, the three column vectors together form a 
linearly independent set (in fact, a basis for all of 3-dimensional 
space).
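
For completeness, NumPy can also compute all of the eigenvalues and 
eigenvectors in one call.  Keep in mind that it scales each 
eigenvector to length 1, so the columns it returns are scalar 
multiples of the vectors we found by hand (and the order of the 
eigenvalues may differ):

  import numpy as np

  A = np.array([[5, -2,  2],
                [4, -3,  4],
                [4, -6,  7]])

  vals, vecs = np.linalg.eig(A)
  print(vals)   # approximately [1. 3. 5.] (order may vary)
  # Each column of vecs is a unit-length eigenvector for the
  # corresponding eigenvalue -- a multiple of [0 1 1]^T, [1 0 -1]^T,
  # or [1 1 1]^T.
  print(vecs)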

Hopefully this helped illustrate the process of finding eigenvalues 
and eigenvectors.  If you are having any more difficulty, the Dr. 
Math Archive does have some articles on both topics.  Otherwise, if 
you have any more questions for me, feel free to write back!

- Doctor Ricky, The Math Forum
  http://mathforum.org/dr.math/ 
Associated Topics:
College Linear Algebra
High School Linear Algebra

