
Well, the title says it all. I am confused: the characteristic polynomial is used to derive the eigenvalues, so all of its roots are eigenvalues. Now if I have another set of values which satisfies the characteristic polynomial, then those should also be eigenvalues, right?

I am looking at this matrix:

$$\left(\begin{array}{rrr} 3 & -1 & -1\\ -1 & 3 & -1\\ -1 & -1 & 3 \end{array}\right)$$

The eigenvalues come out as $1,4,4$, but if I use $1,1,1$ as the values of lambda and put them into the characteristic equation, it still seems to be satisfied. This is all very confusing; maybe some of my concepts are wrong.

Edit

Let me explain my confusion a little more, and where my thinking went wrong.

If I have my eigenvalues, then the characteristic equation is given by $|A - \lambda I| = 0$, right? But in my not-so-well-conditioned mind I made a huge mistake. Let me first tell you that this was taken from a multiple-choice question where 1,1,1 and 1,4,4 were two of the answers. So I thought I would just put the eigenvalues into the characteristic equation and see if they satisfy it. A calculator can do the determinant calculation (Casio fx-991MS) and save a lot of time (I could solve the three-variable equation as well, but deriving that equation seems like a long and error-prone process). But in my mighty dumbness I put all three values into the characteristic equation at the same time. So of course 1,1,1 seemed to satisfy it as eigenvalues. What I should have done was check each number separately. Sorry for kind of a foolish question.
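For what it's worth, the "check each candidate separately" procedure is easy to script. Here is a sketch using NumPy (an assumption on my part; the thread itself uses a pocket calculator): a scalar $\lambda$ is an eigenvalue exactly when $\det(A-\lambda I)=0$, so each candidate is substituted on its own.

```python
import numpy as np

# The matrix from the question.
A = np.array([[ 3.0, -1.0, -1.0],
              [-1.0,  3.0, -1.0],
              [-1.0, -1.0,  3.0]])

def is_eigenvalue(A, lam, tol=1e-9):
    """lam is an eigenvalue of A iff det(A - lam*I) is (numerically) zero."""
    return abs(np.linalg.det(A - lam * np.eye(A.shape[0]))) < tol

# Each candidate is tested on its own, never all three at once.
print([is_eigenvalue(A, lam) for lam in (1, 4, 2)])  # -> [True, True, False]
```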

  • The eigenvalues of the matrix are precisely the roots of the characteristic polynomial.
  • How can "$1,4,4$" be "the eigenvectors"? Eigenvectors are vectors, so they should be elements of $\mathbb{R}^3$.
  • @arturo Sorry, I meant to say eigenvalues.
  • Certainly if $1$ is a zero of the characteristic polynomial, then every time you plug $1$ into the characteristic polynomial you will get $0$, whether you plug it in once or three times; I don't understand what you mean when you say that you "use $1,1,1$ as values of lambda".
  • @arturo See the edit above; I now understand my huge mistake.
  • @Rick_2047: I'm still not clear on what you mean by "putting all values at the same time" or "checking separately". Perhaps: plug in $1$; if the result is $0$, factor $\lambda-1$ out of the characteristic polynomial to get something like $p(\lambda)=(\lambda-1)q(\lambda)$, then plug the next value into $q(\lambda)$ to see if it equals $0$? If so, yes, this will work, provided you factor out correctly.
  • @arturo Post the second comment as an answer; I need to accept it as the correct answer.
  • @arturo What I was doing was using 1 as the eigenvalue for the first row, and the same for the second and third rows, then thinking that 1,1,1 were the eigenvalues. So if I had checked 1,4,5, I would have used 1 for the first row, 4 for the second row, and 5 for the third row. I was checking all the values at the same time. I know that's a monstrous mistake to make, but there you go.
  • @Rick_2047: Ah, then your problem is **much more serious**. That is *not* what an eigenvalue is, and that is not how you check them. You are confusing the entries of an eigenvector with the eigenvalues. This is a serious conceptual error! $(1,1,1)^t$ *is* an eigenvector that happens to correspond to the eigenvalue $\lambda=1$. But then, so is $(2,2,2)^t$: $A(2,2,2)^t = (2,2,2)^t$. This does **not** mean that $2$ is an eigenvalue. And $(0,1,-1)^t$ is an eigenvector corresponding to $4$, but neither $0$ nor $-1$ is an eigenvalue.
  • @arturo I understand that mistake now, but if I am given a small set of values (say 4 or 5), I can put each one into the characteristic equation (the right way, using it for all of the rows) and find out which ones are the eigenvalues. Isn't that right? Considering the second edit you made, though, eigenvectors can make this a very dangerous technique to use, so to hell with it; I should just check the sum of the eigenvalues against the sum of the principal diagonal.
  • @Rick_2047: Evaluating the characteristic polynomial at a scalar will tell you whether the scalar is an eigenvalue, but it will not by itself give you the multiplicity. If your options had been "1,4,4" and "1,1,4", simply evaluating the characteristic polynomial to see whether each number gives $0$ would not have let you distinguish the correct answer from the incorrect one.
  • I don't want to confuse matters too much, but if you are dealing with real matrices, the characteristic polynomial can have complex roots which do not correspond to real eigenvalues. For example, the matrix $\left( \begin{array}{cc} 0 & 1\\-1 & 0 \end{array} \right)$ represents a rotation of the plane $\mathbb{R}^2$. This rotation has no real eigenvectors. However, the characteristic polynomial is $x^2+1$, which has roots $\pm i$ in $\mathbb{C}$.
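The rotation-matrix example from the last comment can also be checked numerically; `np.linalg.eigvals` works over the complex numbers, so it returns the roots $\pm i$ even though there are no real eigenvectors (again a NumPy sketch, not part of the original thread):

```python
import numpy as np

# Rotation of R^2 by 90 degrees: characteristic polynomial x^2 + 1.
R = np.array([[ 0.0, 1.0],
              [-1.0, 0.0]])

eigs = np.linalg.eigvals(R)   # computed over C, so the complex roots appear
print(eigs)                   # +i and -i (in some order)
```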

2 Answers


For an $n\times n$ matrix $A$, a scalar $\lambda$ is an eigenvalue of $A$ if and only if it is a zero of the characteristic polynomial of $A$.

Why is this? Remember that $\lambda$ is an eigenvalue of $A$ if and only if there is a nonzero vector $\mathbf{v}$ such that $A\mathbf{v}=\lambda\mathbf{v}$; this is equivalent to the existence of a nonzero vector $\mathbf{v}$ such that $(A-\lambda I)\mathbf{v}=\mathbf{0}$. That means that the nullspace of the matrix $A-\lambda I$ (remember, $I$ is the identity matrix) is not just the zero vector, which means, necessarily, that $A-\lambda I$ is not invertible. Since it is not invertible, that means that its determinant is $0$; its determinant happens to equal the characteristic polynomial evaluated at $\lambda$, so this shows that if $\lambda$ is an eigenvalue of $A$, then $\lambda$ is a zero of the characteristic polynomial.

Conversely, if $\lambda$ is a zero of the characteristic polynomial of $A$, then the determinant of $A-\lambda I$ is zero, which means that $A-\lambda I$ is not invertible, which means there is a nonzero vector $\mathbf{w}$ such that $(A-\lambda I)\mathbf{w}=\mathbf{0}$. This shows that $\mathbf{w}$ is an eigenvector of $A$ with eigenvalue $\lambda$, so $\lambda$ is an eigenvalue.

For the matrix you have, $$A = \left(\begin{array}{rrr} 3 & -1 & -1\\ -1 & 3 & -1\\ -1 & -1 & 3 \end{array}\right).$$ The characteristic polynomial $p(t)$ is: $$\begin{align*} p(t)=\det(A-tI) &= \left|\begin{array}{ccc} 3-t & -1 & -1\\ -1 & 3-t & -1\\ -1 & -1 & 3-t \end{array}\right|\\ &= (3-t)\left|\begin{array}{cc} 3-t & -1\\ -1 & 3-t \end{array}\right| +\left|\begin{array}{cc} -1 & -1\\ -1 & 3-t \end{array}\right| - \left|\begin{array}{cc} -1 & -1\\ 3-t & -1 \end{array}\right|\\ &= (3-t)\Bigl((3-t)^2-1\Bigr) + (t-4) - (4-t)\\ &= (3-t)\Bigl(t^2 -6t +8\Bigr) +2(t-4)\\ &= (3-t)(t-4)(t-2) + 2(t-4)\\ &= (t-4)\Bigl(2 - (t-2)(t-3)\Bigr) \\ &= -(t-4)(t^2-5t+6-2)\\ &= -(t-4)(t^2-5t+4)\\ &= -(t-4)^2(t-1). \end{align*}$$ Since $\lambda$ is an eigenvalue of $A$ if and only if $p(\lambda)=0$, this says that the matrix $A$ has two distinct eigenvalues: $\lambda=4$, with algebraic multiplicity $2$, and $\lambda=1$, with algebraic multiplicity $1$.
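As a sanity check on the hand computation, `np.poly` returns the coefficients of $\det(tI-A)=(t-1)(t-4)^2$, which is the polynomial above up to an overall sign (a NumPy sketch, assumed tooling):

```python
import numpy as np

A = np.array([[ 3.0, -1.0, -1.0],
              [-1.0,  3.0, -1.0],
              [-1.0, -1.0,  3.0]])

coeffs = np.poly(A)           # coefficients of det(tI - A): 1, -9, 24, -16
eigs = np.linalg.eigvalsh(A)  # A is symmetric, so eigvalsh applies; sorted output
print(np.round(coeffs))       # coefficients 1, -9, 24, -16
print(np.round(eigs))         # eigenvalues 1, 4, 4
```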

What are the corresponding eigenvectors?

For $\lambda=1$, you want vectors $(a,b,c)^t$ (where $t$ denotes the transpose) such that $A(a,b,c)^t = (a,b,c)^t$. Equivalently, you want the nullspace of $A-I$, minus $\mathbf{0}$. It is not hard to verify that $(1,1,1)^t$ is an eigenvector corresponding to $\lambda=1$, and that every eigenvector corresponding to $\lambda=1$ is a nonzero scalar multiple of $(1,1,1)^t$ (is this where you got confused? This is a vector, not a list of eigenvalues).

For $\lambda=4$, you want vectors $(a,b,c)^t$ that lie in the nullspace of $A-4I$. Here every entry of $A-4I$ equals $-1$, so you want $a+b+c=0$, and the nullspace is spanned by the vectors $(1,0,-1)^t$ and $(0,1,-1)^t$; you can verify that each of these is an eigenvector corresponding to $\lambda=4$ and that they are linearly independent, so the eigenvectors corresponding to $\lambda=4$ are exactly the nonzero linear combinations of these two.
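Both eigenvector claims are easy to confirm directly from $A\mathbf{v} = \lambda\mathbf{v}$ (a NumPy sketch):

```python
import numpy as np

A = np.array([[ 3.0, -1.0, -1.0],
              [-1.0,  3.0, -1.0],
              [-1.0, -1.0,  3.0]])

v1 = np.array([1.0, 1.0, 1.0])      # eigenvector for lambda = 1
print(np.allclose(A @ v1, 1 * v1))  # True

# The lambda = 4 eigenspace is a + b + c = 0, spanned by:
for v in (np.array([1.0, 0.0, -1.0]), np.array([0.0, 1.0, -1.0])):
    print(np.allclose(A @ v, 4 * v))  # True
```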


The characteristic polynomial of your matrix is

$$p(\lambda)=\lambda^3 - 9 \lambda^2 + 24 \lambda - 16 =(\lambda-1)(\lambda-4)(\lambda-4)$$

Now you can read the eigenvalues directly from the factorized polynomial, namely $1,4,4$. That $p(1)=0$ shows that $1$ is an eigenvalue, but for $1,1,1$ to be all the eigenvalues of your matrix you would need $p(\lambda)=(\lambda-1)^3$, which is not the case here.
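To make the distinction concrete: evaluating $p$ at a number only certifies membership, while the multiplicities come from the factorization. A sketch with NumPy's `poly1d` (assumed tooling, not from the answer), which also illustrates the factor-out-and-repeat procedure suggested in the comments:

```python
import numpy as np

# p(x) = x^3 - 9x^2 + 24x - 16 = (x - 1)(x - 4)^2
p = np.poly1d([1, -9, 24, -16])

# Membership test: p vanishes at 1 and at 4, but not at, say, 2.
print(p(1), p(4), p(2))                      # 0 0 4

# Multiplicity via factoring: divide out (x - 1) and inspect the quotient.
q, r = np.polydiv(p.coefficients, [1, -1])
print(np.round(q), np.round(r))              # quotient (x - 4)^2, remainder 0
```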

  • If it's "$p(x)$", then the variable should be $x$; if the variable is $\lambda$, shouldn't it be "$p(\lambda)$"? (-;
  • Yes :) I changed this afterwards because the OP seems to use lambda rather than $x$, so I thought he would understand it better.