
I have a matrix $A = \left(\begin{matrix} -5 & -6 & 3\\3 & 4 & -3\\0 & 0 & -2\end{matrix}\right)$ for which I am trying to find the eigenvalues and eigenvectors. In this case, I have a repeated eigenvalue, $\lambda_1 = \lambda_2 = -2$, and $\lambda_3 = 1$.
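(For reference, these come from expanding $\det(A - \lambda I)$ along the third row: $$\det(A - \lambda I) = (-2-\lambda)\det\left(\begin{matrix} -5-\lambda & -6\\3 & 4-\lambda\end{matrix}\right) = (-2-\lambda)(\lambda^2 + \lambda - 2) = -(\lambda+2)^2(\lambda-1),$$ whose roots are $-2$, twice, and $1$.)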

Substituting $\lambda_1 = \lambda_2 = -2$ into $A - \lambda I$ and row-reducing, I get the matrix $\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right)$.
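(Explicitly, $A - (-2)I = A + 2I = \left(\begin{matrix} -3 & -6 & 3\\3 & 6 & -3\\0 & 0 & 0\end{matrix}\right)$; adding the first row to the second clears it, and dividing the first row by $-3$ gives the reduced form above.)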

To solve $\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right)$ $\left(\begin{matrix} e_1\\e_2\\e_3\end{matrix}\right) = \left(\begin{matrix} 0\\0\\0\end{matrix}\right)$, I set $e_2$ and $e_3$ as free variables $s$ and $t$, respectively, solved for $e_1$ (giving $e_1 = -2e_2 + e_3$), and put the result into vector form:

$$\left[\begin{matrix} -2s + t\\s\\t\end{matrix}\right] = s \left[\begin{matrix} -2\\1\\0\end{matrix}\right] + t \left[\begin{matrix}1\\0\\1\end{matrix}\right]$$
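As a sanity check, both of these vectors satisfy $Av = -2v$: $$A\left[\begin{matrix} -2\\1\\0\end{matrix}\right] = \left[\begin{matrix} 4\\-2\\0\end{matrix}\right] = -2\left[\begin{matrix} -2\\1\\0\end{matrix}\right], \qquad A\left[\begin{matrix} 1\\0\\1\end{matrix}\right] = \left[\begin{matrix} -2\\0\\-2\end{matrix}\right] = -2\left[\begin{matrix} 1\\0\\1\end{matrix}\right].$$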

So for the first two eigenvectors, I found $v_1 = [-2, 1, 0]^{T}$ and $v_2 = [1, 0, 1]^T$. However, when I check my answer on Wolfram Alpha, the two vectors appear in the opposite order: $v_1$ is listed second and $v_2$ first. Does this matter?

If so, should I instead find the first eigenvector for $\lambda_1$, and then, using the same reduced matrix, set the product equal to $v_1$ instead of the zero vector when solving for $\lambda_2$? That is, solving the above system for $\lambda_1$: $$e_1 = -2e_2 + e_3 \rightarrow 1 = -2(0) + 1 $$ $$v_1 = [1, 0, 1]^T$$

And then for $\lambda_2$:

$$\left(\begin{matrix} 1 & 2 & -1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right) \left(\begin{matrix} e_1\\e_2\\e_3\end{matrix}\right) = \left(\begin{matrix} 1\\0\\1\end{matrix}\right)$$

However, using $e_3$ as a free variable and setting $e_3 = 0$, I don't get the vector I originally found for $v_1$:

$$e_1 + 2e_2 = 1 \rightarrow 1(2) + 2(-1/2) = 1$$ $$v_2 = [2, -1/2, 0]^T$$

What am I doing wrong here?

  • What do you mean, "$v_1$ is assigned to the latter while $v_2$ is assigned to the former"? $\lambda_1$ and $\lambda_2$ are **equal**; both vectors are eigenvectors of the **single** eigenvalue $\lambda=-2$. You don't have "two" eigenvalues, you have a single eigenvalue with algebraic and geometric multiplicity $2$. (2012-05-14)
  • Your answer is correct. However, you should realize that any two vectors $w,y$ such that $\operatorname{span}\{w, y\} = \operatorname{span}\{v_1, v_2\}$ are also valid answers. Think "eigenspace" rather than a single eigenvector when you have repeated (non-defective) eigenvalues. (2012-05-14)
  • To put the same thing into slightly different words: what you have here is a two-dimensional *eigenspace*, and any two vectors that form a basis for that space will do as linearly independent eigenvectors for $\lambda=-2$. WolframAlpha wants to give an answer, not a dissertation, so it makes what is essentially an arbitrary choice among all the possible correct answers. You are free to make a different choice. (2012-05-14)
  • @Arturo: I realize this is the case, but for some reason I just wasn't sure whether it was relevant what order the vectors appeared in the $P$ matrix, hence my confusion about $v_1$ and $v_2$ being "assigned" to one label or the other (given your statement, it should have been obvious to me that it doesn't matter at all, but I was second-guessing myself). Thanks. (2012-05-14)
  • Okay, that clears it up a bunch. Realizing that I am finding an eigenspace for repeated eigenvalues makes it much clearer to me. Though the actual algebra isn't too difficult at this point, the concepts are a little difficult to grasp at times. Thanks for clarifying this point for me. (2012-05-14)
  • @Dylan: The order in which the column vectors appear in the $P$ matrix does not matter, as long as the corresponding eigenvalues appear in the same order in the diagonal matrix. In other words, the decomposition $A = PDP^{-1}$ of a diagonalizable matrix is not unique: there are $n!$ different $P$ matrices for a given diagonalizable $n\times n$ matrix. (A concrete example follows these comments.) (2012-05-14)
  • @Joel: Thanks. Indeed, working the problem through to the resulting diagonal matrix, it is clear that it comes out exactly the same regardless of the order in which the vectors appear in the $P$ matrix. (2012-05-14)
  • @Joel, there are a lot more than $n!$ different $P$ matrices, since you can multiply any column of $P$ by any nonzero number. There are (at most) $n!$ different diagonal matrices corresponding to any given diagonalizable matrix. (2012-05-14)
  • @GerryMyerson: Of course! I didn't think of that. (2012-05-14)
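To make the comment thread's point concrete: taking $v_3 = [1, -1, 0]^T$ as an eigenvector for the remaining eigenvalue $1$ (one possible choice, which can be checked directly), both column orderings $$P = \left(\begin{matrix} -2 & 1 & 1\\1 & 0 & -1\\0 & 1 & 0\end{matrix}\right) \quad\text{and}\quad P' = \left(\begin{matrix} 1 & -2 & 1\\0 & 1 & -1\\1 & 0 & 0\end{matrix}\right)$$ satisfy $AP = PD$ with the same $D = \operatorname{diag}(-2, -2, 1)$, since each column equation is just $Av_i = \lambda_i v_i$ and the swapped columns share the eigenvalue $-2$. Hence $PDP^{-1} = P'DP'^{-1} = A$.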

1 Answer


It is not a good idea to label your eigenvalues $\lambda_1$, $\lambda_2$, $\lambda_3$: there are not three eigenvalues, only two, namely $\lambda_1=-2$ and $\lambda_2=1$.

Now for the eigenvalue $\lambda_1$, there are infinitely many eigenvectors. If you throw the zero vector into the set of all eigenvectors for $\lambda_1$, then you obtain a vector space, $E_1$, called the eigenspace of the eigenvalue $\lambda_1$. This vector space has dimension at most the multiplicity of $\lambda_1$ in the characteristic polynomial of $A$. In this case, looking at the characteristic polynomial of $A$, we see that the dimension of the eigenspace $E_1$ is at most two.

As you determined, the dimension of $E_1$ is exactly two, since you found two linearly independent eigenvectors for $\lambda_1$. Your eigenvectors $v_1$ and $v_2$ form a basis of $E_1$. It does not matter that WA listed them in the opposite order; they are still two linearly independent eigenvectors for $\lambda_1$, and any eigenvector for $\lambda_1$ is a linear combination of $v_1$ and $v_2$.
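For instance, the sum $v_1 + v_2 = [-1, 1, 1]^T$ is again an eigenvector for $\lambda_1$: $$A\left[\begin{matrix} -1\\1\\1\end{matrix}\right] = \left[\begin{matrix} 2\\-2\\-2\end{matrix}\right] = -2\left[\begin{matrix} -1\\1\\1\end{matrix}\right].$$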

Now you need to find the eigenvectors for $\lambda_2$. Note that the dimension of this eigenspace must be exactly one: it is at least one because $\lambda_2$ is an eigenvalue, and at most one because the multiplicity of $\lambda_2$ in the characteristic polynomial is $1$.
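(If you want to check your work afterwards: row-reducing $A - I = \left(\begin{matrix} -6 & -6 & 3\\3 & 3 & -3\\0 & 0 & -3\end{matrix}\right)$ yields $\left(\begin{matrix} 1 & 1 & 0\\0 & 0 & 1\\0 & 0 & 0\end{matrix}\right)$, so $e_3 = 0$ and $e_1 = -e_2$; one choice of eigenvector is $[1, -1, 0]^T$, matching the $v_3$ used in the example above.)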