Eigenvectors and Eigenvalues

Date: 12/22/97 at 20:01:59
From: David Chong
Subject: Eigenvectors and eigenvalues

It's about eigenvectors and eigenvalues. I understand how to get the
eigenvectors and eigenvalues from a 2x2 matrix - but with a 3x3 matrix
I'm pretty clueless, because I can't understand the notes!

Here's what I understand about them. The equation:

   S' = R S Rt     (where Rt is the transpose of R)

gives you the equivalent transformation, S', in a new coordinate
system. S is the original transformation, and R is the coordinate
matrix of the points in the new set of coordinates. Is this correct?

Here's something else I'm pretty unsure of: a 3x3 symmetric matrix is
supposed to give 3 orthogonal unit vectors. WHY? How are they related
to the original matrix? What do they show about it, and what do the
eigenvalues represent in this case?

How do you determine a matrix S by selecting your own eigenvectors and
eigenvalues? What does this matrix then represent? What does it show?

Thanks,
David


Date: 12/23/97 at 14:52:39
From: Doctor Tom
Subject: Re: Eigenvectors and eigenvalues

Hi David,

There's quite a bit to this, so you may have to use what I write in
combination with your textbook.

For any square (n x n) matrix M, you can find the eigenvalues by
solving the following equation:

   determinant(M - lI) = 0

Usually the "l" is the Greek letter lambda; I is the identity matrix.
When you expand the determinant above, you will get a polynomial of
degree n, so it will have n roots (not necessarily distinct). The
roots are the eigenvalues, and each root's multiplicity is the maximum
number of eigenvectors corresponding to that eigenvalue.

For example, suppose for some 7x7 matrix you go to the trouble of
solving the 7th degree polynomial, and it has the following 7 roots:
3, 3, 3, 2, 2, 1, 0.
There are at most 3 eigenvectors (and at least one) having eigenvalue
3, at most 2 and at least 1 having eigenvalue 2, and exactly one
eigenvector corresponding to each of the eigenvalues 1 and 0. When I
say "at most 3," I mean "at most 3 linearly independent."

To find the eigenvectors, just plug in variables for the coordinates
and solve the resulting equations. For example, if the 3x3 matrix M
has 5 as an eigenvalue, expand the following (treating (a, b, c) as a
row vector):

   (a, b, c)M = (5a, 5b, 5c)

You'll get a pile of conditions on a, b, and c, and the sets of a, b,
and c that satisfy the conditions will be eigenvectors.

Any symmetric matrix can be diagonalized, which means that it's
equivalent to a matrix with entries only on the diagonal, and in that
coordinate system the eigenvectors are orthogonal. This is a theorem
that's not obvious, but it is true.

-Doctor Tom, The Math Forum
 Check out our web site! http://mathforum.org/dr.math/


Date: 12/24/97 at 10:14:19
From: Mr Y K Chong
Subject: More eigenvectors and eigenvalues!

Thanks for helping me out last time, but I still don't understand some
parts of these eigenvector and eigenvalue bits. How would you
diagonalise this matrix:

   ( 2  3  1 )
   ( 3  1  5 )
   ( 1  5  4 )

Because I sure can't do it! I can work out the eigenvalues now, thanks
to your explanation, which is some progress! :)

What good does diagonalising a matrix do? What do eigenvalues and
eigenvectors actually show about a matrix? Something I don't get is
the point of the eigenvalues: if you have the eigenvectors, why do you
need to multiply by the eigenvalues, when all you wanted was a
direction?

Suppose I have an object in 3D space and I want to 'move' the
coordinate axes so that the new (often referred to as "primed"?) axes
are aligned with one of the faces of this object. Obviously I will
need 3 eigenvectors, but how do I go about picking them? How do I then
select the eigenvalues?

The equation:

   S = U L Ut

where Ut is the transpose of U, U holds the eigenvectors, and L the
eigenvalues. What does S actually give?
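As a supplement to the explanation above (not part of the original exchange): the eigenvalue/eigenvector machinery Doctor Tom describes by hand can be checked numerically with NumPy. The sketch below uses a symmetric matrix I made up for illustration; `numpy.linalg.eigh` is specialized for symmetric matrices and returns orthonormal eigenvectors, which illustrates the "symmetric matrices give orthogonal vectors" theorem.

```python
import numpy as np

# A symmetric 3x3 matrix (invented for this example); for a symmetric
# matrix the row- and column-eigenvector conventions coincide.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# The eigenvalues are the roots of det(M - l*I) = 0; eigh returns them
# in ascending order, with the eigenvectors as the columns of V.
vals, V = np.linalg.eigh(M)
print(np.round(vals, 6))   # here: 1, 3, 5 (a repeat would show a multiplicity)

# Each column v of V satisfies M v = l v, the defining equation:
for l, v in zip(vals, V.T):
    assert np.allclose(M @ v, l * v)

# And because M is symmetric, the eigenvectors are orthonormal:
assert np.allclose(V.T @ V, np.eye(3))
```

Solving the system of conditions on a, b, c by hand, as in the message above, gives the same vectors up to scaling.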
Is it the transformation each point of the 3D object has to undergo to
get from the old set of axes to the new set? If you can help me again,
I'd be grateful!

David


Date: 12/24/97 at 14:07:19
From: Doctor Tom
Subject: Re: More eigenvectors and eigenvalues!

Hi David,

If M is your matrix above and I calculate det(M - lI), I get:

   -l^3 + 7*l^2 + 21*l - 49

If I set this to zero, I get 1.6439... and two other real roots. There
are three roots, but they are horrible irrational numbers, so I doubt
that the example above is copied from a book correctly. Of course in
the real world the eigenvalues will almost always be horrible
irrational numbers, but as exercises in books they are usually rigged
to come out as integers or simple fractions, so you'll learn something
about linear algebra instead of messing with arithmetic.

Let me do a different example that does work out nicely:

   ( 3  2  4 )
   ( 2  0  2 ) = M
   ( 4  2  3 )

   det(M - lI) = -l^3 + 6*l^2 + 15*l + 8

Set it equal to zero, and we need to solve:

   (l+1)(l+1)(l-8) = 0

The three eigenvalues are -1, -1, and 8.

For eigenvalue -1:

           ( 3  2  4 )
   (a b c) ( 2  0  2 ) = (3a+2b+4c, 2a+2c, 4a+2b+3c) = -1(a b c)
           ( 4  2  3 )

So:

   3a+2b+4c = -a        4a+2b+4c = 0
   2a   +2c = -b   or   2a+ b+2c = 0
   4a+2b+3c = -c        4a+2b+4c = 0

There are two independent solutions: (1 -2 0) and (0 2 -1), for
example.

For eigenvalue 8:

           ( 3  2  4 )
   (a b c) ( 2  0  2 ) = (3a+2b+4c, 2a+2c, 4a+2b+3c) = 8(a b c)
           ( 4  2  3 )

   3a+2b+4c = 8a        -5a+2b+4c = 0
   2a   +2c = 8b   or    2a-8b+2c = 0
   4a+2b+3c = 8c         4a+2b-5c = 0

There is a solution: (2 1 2).

Let P be the matrix whose rows are these eigenvectors:

       ( 1 -2  0 )                   (  5  4  2 )
   P = ( 0  2 -1 )    P^(-1) = 1/9 * ( -2  2  1 )
       ( 2  1  2 )                   ( -4 -5  2 )

Then:

              ( -1  0  0 )
   PMP^(-1) = (  0 -1  0 )
              (  0  0  8 )

Of course, since the matrix is symmetric, you knew that it was
diagonalizable, and its diagonal elements are -1, -1, and 8.

The eigenvectors are the vectors whose direction is preserved under
multiplication by the matrix. There are lots of cases where this is
important.
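A quick numerical check of the worked example (my addition, using NumPy): with the eigenvectors stacked as the rows of P, the product P M P^(-1) should come out as the diagonal matrix of eigenvalues -1, -1, 8.

```python
import numpy as np

# The matrix from the worked example above.
M = np.array([[3.0, 2.0, 4.0],
              [2.0, 0.0, 2.0],
              [4.0, 2.0, 3.0]])

# Rows of P are the row eigenvectors found by hand:
P = np.array([[1.0, -2.0,  0.0],   # eigenvalue -1
              [0.0,  2.0, -1.0],   # eigenvalue -1
              [2.0,  1.0,  2.0]])  # eigenvalue  8

# With row eigenvectors, PM = DP, so D = P M P^(-1):
D = P @ M @ np.linalg.inv(P)
assert np.allclose(D, np.diag([-1.0, -1.0, 8.0]))

# Cross-check: the roots of det(M - l*I) = -l^3 + 6l^2 + 15l + 8
# are indeed -1 (twice) and 8.
assert np.allclose(np.sort(np.linalg.eigvals(M)), [-1.0, -1.0, 8.0])
```

Note that because the eigenvectors here were not chosen orthonormal, P^(-1) is not simply the transpose of P; with orthonormal rows it would be.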
For example, imagine that the vector represents the proportions of a
population of animals of various ages, and the matrix represents the
survival and reproduction of these age classes into the next
generation. If you start with some population distribution and iterate
for a long time, the vectors will tend toward the eigenvector
corresponding to the largest eigenvalue. That eigenvector therefore
represents a stable population configuration.

In physics, if you write down the matrix for a 3D solid that
corresponds to its moment of inertia tensor, the eigenvectors are in
the directions of the three principal axes - in other words, the solid
will spin without wobbling around any of the three eigenvectors, and
not about any other axis.

There are a gazillion other examples from the real world where this is
useful.

I don't understand your last question exactly, but if you have three
vectors and you want to find a transformation that takes them to three
others, here's how to do it. Suppose the original vectors and their
targets are:

   (a1 b1 c1) => (A1 B1 C1)
   (a2 b2 c2) => (A2 B2 C2)
   (a3 b3 c3) => (A3 B3 C3)

You just need to find a matrix M such that:

   (a1 b1 c1)     (A1 B1 C1)
   (a2 b2 c2) M = (A2 B2 C2)
   (a3 b3 c3)     (A3 B3 C3)

so invert the matrix on the left (the one with a1, b1, ..., c3) and
multiply both sides on the left by that inverse. That gives M.

-Doctor Tom, The Math Forum
 Check out our web site! http://mathforum.org/dr.math/
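The invert-and-multiply recipe above can be sketched in NumPy (the vectors here are invented for illustration; the original exchange gives no concrete numbers). Stacking the original vectors as the rows of A and the targets as the rows of B, the equation A M = B gives M = A^(-1) B, and `numpy.linalg.solve` does the inversion and multiplication in one numerically stable step.

```python
import numpy as np

# Three original row vectors (must be linearly independent, so that
# the matrix A is invertible) and their three targets:
A = np.array([[1.0, 0.0, 0.0],    # (a1 b1 c1)
              [0.0, 1.0, 1.0],    # (a2 b2 c2)
              [1.0, 1.0, 0.0]])   # (a3 b3 c3)
B = np.array([[2.0, 0.0, 1.0],    # (A1 B1 C1)
              [1.0, 3.0, 0.0],    # (A2 B2 C2)
              [0.0, 2.0, 2.0]])   # (A3 B3 C3)

M = np.linalg.solve(A, B)         # solves A M = B, i.e. M = A^(-1) B

# Each original row vector now maps exactly onto its target:
assert np.allclose(A @ M, B)
```

If the three original vectors were linearly dependent, A would be singular and no such M would be determined, which is why three independent vectors are needed.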
© 1994- The Math Forum at NCTM. All rights reserved.