Problem Solving: Eigenvalues and Eigenvectors

PROFESSOR: Hi guys. Today we're going to play around with the basics of eigenvalues and eigenvectors. We're going to do the following problem: we're given this invertible matrix A, and we'll find the eigenvalues and eigenvectors not of A, but of A squared and of A inverse minus the identity. This problem might seem daunting at first. Squaring a 3 by 3 matrix, or taking the inverse of a 3 by 3 matrix, is a fairly computationally intensive task. But if you've seen Professor Strang's lecture on eigenvalues and eigenvectors, you shouldn't be all too worried. So I'll give you a few moments to think of your own line of attack, and then you'll see mine.

Hi again. OK, so the observation that makes our life really easy is the following one. Say v is an eigenvector of the matrix A with associated eigenvalue lambda. Then, if we hit v with A squared, well, we can write that as A times A*v, but A*v is lambda*v, right? So we have A*lambda*v.

Lambda is a scalar, so we can move it in front and get lambda*A*v, and when we plug in A*v equals lambda*v once more, that's just lambda squared times v. So what we've found out is that if v is an eigenvector for A, then it's also an eigenvector for A squared, just with the eigenvalue squared.
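Written out as one chain of equalities, that argument is:

```latex
A^{2}v \;=\; A(Av) \;=\; A(\lambda v) \;=\; \lambda (Av) \;=\; \lambda(\lambda v) \;=\; \lambda^{2} v .
```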

Similarly, what happens if we hit v with A inverse? In this case we can write v as A*v over lambda, given, of course, that lambda is non-zero. But the eigenvalues of an invertible matrix are always non-zero, which is an exercise you should do yourselves. So if we then take out the A and combine it with A inverse, that's the identity, and we get 1 over lambda times v. So v is also an eigenvector for A inverse, with eigenvalue the reciprocal of lambda.
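In the same compact form, substituting v = A*v over lambda:

```latex
A^{-1}v \;=\; A^{-1}\,\frac{Av}{\lambda} \;=\; \frac{1}{\lambda}\,(A^{-1}A)\,v \;=\; \frac{1}{\lambda}\,v .
```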

OK, and from here, of course, A inverse minus the identity applied to v gives 1 over lambda minus 1 times v, so the eigenvalue of A inverse minus the identity is 1 over lambda minus 1. So what we've figured out is this: we just need to find the eigenvalues and eigenvectors of A, and then we have a way of getting the eigenvalues and eigenvectors of A squared and of A inverse minus the identity.
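As a sanity check, here is a minimal NumPy sketch of these facts, assuming A is the 3 by 3 matrix whose entries can be read off from the determinant set up below, with rows (1, 2, 3), (0, 1, -2), (0, 1, 4). It verifies that A squared and A inverse minus the identity have eigenvalues lambda squared and 1 over lambda minus 1, with the same eigenvectors as A.

```python
import numpy as np

# The matrix A from the problem (entries read off from det(A - lambda*I) below).
A = np.array([[1.0, 2.0,  3.0],
              [0.0, 1.0, -2.0],
              [0.0, 1.0,  4.0]])

lam, V = np.linalg.eig(A)   # eigenvalues of A and eigenvectors (columns of V)
I = np.eye(3)

print(np.sort(lam))                                      # [1. 2. 3.]
print(np.sort(np.linalg.eigvals(A @ A)))                 # [1. 4. 9.]  = lambda^2
print(np.sort(np.linalg.eigvals(np.linalg.inv(A) - I)))  # approx [-0.667 -0.5 0.] = 1/lambda - 1

# Same eigenvectors: each column v of V satisfies A^2 v = lambda^2 v
# and (A^{-1} - I) v = (1/lambda - 1) v.
for l, v in zip(lam, V.T):
    assert np.allclose(A @ A @ v, l**2 * v)
    assert np.allclose((np.linalg.inv(A) - I) @ v, (1 / l - 1) * v)
```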

OK, so how do we find the eigenvalues? Well, what does it mean for lambda to be an eigenvalue of A? It means that the matrix A minus lambda times the identity is singular, which is precisely the case when its determinant is 0, OK? So we need to solve the following equation: the determinant of the matrix with rows 1 minus lambda, 2, 3; 0, 1 minus lambda, -2; and 0, 1, 4 minus lambda is equal to 0.
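That is, the equation we need to solve is:

```latex
\det(A - \lambda I) \;=\;
\begin{vmatrix}
1-\lambda & 2 & 3 \\
0 & 1-\lambda & -2 \\
0 & 1 & 4-\lambda
\end{vmatrix}
\;=\; 0 .
```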

OK, it's fairly obvious which column we should use to expand this determinant. We should use the first column, because it has only one non-zero entry, and so the determinant is equal to 1 minus lambda times the determinant of the 2 by 2 matrix 1 minus lambda, -2; 1, 4 minus lambda. I'm going to do the computation up here.

That's 1 minus lambda times lambda squared minus 5 lambda plus 6, which is a fairly familiar quadratic, and we can write it as a product of linear factors: lambda minus 2 times lambda minus 3. So the three eigenvalues of A are 1, 2, and 3.
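Written out, the cofactor expansion along the first column gives:

```latex
\det(A - \lambda I)
\;=\; (1-\lambda)\bigl[(1-\lambda)(4-\lambda) - (-2)(1)\bigr]
\;=\; (1-\lambda)(\lambda^{2} - 5\lambda + 6)
\;=\; (1-\lambda)(\lambda - 2)(\lambda - 3).
```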

OK, so the first half of our problem is done. Now we just need to find the eigenvector associated with each of these eigenvalues. How do we do that? Well, let's see. Let's figure out what the eigenvector associated with lambda equals 1 is. We know that the eigenvector needs to be in the null space of A minus lambda times the identity, so here A minus the identity. Let me write this out: its rows are 0, 2, 3; 0, 0, -2; and 0, 1, 3.

And we see that the first column is all 0, so the first variable will be our free variable if we want to solve this linear system of equations. We can just set it to 1, and it's not hard to see that the other two entries have to be 0, so the eigenvector is (1, 0, 0).
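Here's a quick check of that back-substitution, again assuming the same matrix A as in the earlier sketch: NumPy's eigenvector for lambda equal to 1 is a multiple of (1, 0, 0), and it lies in the null space of A minus the identity.

```python
import numpy as np

A = np.array([[1.0, 2.0,  3.0],
              [0.0, 1.0, -2.0],
              [0.0, 1.0,  4.0]])

lam, V = np.linalg.eig(A)

# Pick the eigenvector whose eigenvalue is (numerically) 1 ...
v1 = V[:, np.argmin(np.abs(lam - 1.0))]
v1 = v1 / v1[0]                  # scale so the free variable equals 1
print(v1)                        # [1. 0. 0.]

# ... and confirm it lies in the null space of A - I.
print(np.allclose((A - np.eye(3)) @ v1, 0))   # True
```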

We can do the same procedure with the other two eigenvalues, and we'll get an eigenvector for each eigenvalue. And in the end-- let me go back here-- I'm going to put our results in a little table, with a column for A squared and a column for A inverse minus the identity, and the first row will be the eigenvalues.

So it's going to be: if lambda is an eigenvalue of A, then we saw that lambda squared will be the eigenvalue of A squared, and 1 over lambda minus 1 will be the eigenvalue of A inverse minus the identity. And the eigenvectors will be the same. OK, we're done.
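For concreteness, the arithmetic in that first row of the table, lambda going to lambda squared and to 1 over lambda minus 1 for lambda equal to 1, 2, 3, can be tabulated in a few lines of Python:

```python
# Eigenvalue table: lambda for A, lambda^2 for A^2, 1/lambda - 1 for A^{-1} - I.
print("  A   A^2   A^-1 - I")
for lam in (1, 2, 3):
    print(f"{lam:>3} {lam**2:>5} {1/lam - 1:>10.4f}")
# Output:
#   A   A^2   A^-1 - I
#   1     1     0.0000
#   2     4    -0.5000
#   3     9    -0.6667
```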
