
I'm a bit confused when trying to calculate generalized eigenvectors when there is more than one eigenvector for a specific eigenvalue.

Example: $A=\begin{pmatrix}0&1&-1&-1\\0&0&0&0\\0&-1&2&2\\0&1&-2&-2\end{pmatrix}$

This matrix has only the eigenvalue $\lambda=0$, with algebraic multiplicity $m=4$.

$Eig_\lambda(A)=span\bigg\{\begin{pmatrix}1\\0\\0\\0\end{pmatrix},\begin{pmatrix}1\\0\\1\\-1\end{pmatrix}\bigg\}=span\{v_1,v_2\}$

so the geometric multiplicity is $g=2$.

From what I know, I'd say that we should now be able to find $4-2=2$ generalized eigenvectors.

What I usually do to calculate generalized eigenvectors, if we have an eigenvector $x_1$ for some eigenvalue $p$, is:

$(A-pI)x_1=0$ [gives us the ordinary eigenvector]

$(A-pI)x_2=x_1$

$(A-pI)x_3=x_2$

so that we get the generalized eigenvectors $x_2,x_3$
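This chain-solving step can be sketched numerically, as a sanity check (a minimal numpy sketch using the matrix $A$ from above, where $\lambda=0$ so $A-\lambda I=A$; `np.linalg.lstsq` returns a particular solution of the singular but consistent system):

```python
import numpy as np

# The matrix A from the example; lambda = 0, so (A - lambda*I) = A.
A = np.array([[0,  1, -1, -1],
              [0,  0,  0,  0],
              [0, -1,  2,  2],
              [0,  1, -2, -2]], dtype=float)

# Geometric multiplicity = dim ker(A) = n - rank(A).
n = A.shape[0]
rank = np.linalg.matrix_rank(A)
print(n - rank)  # 2 linearly independent eigenvectors

# Chain step: solve A w = v for an eigenvector v.  Since A is singular,
# lstsq is used; when the system is consistent, the result is an exact
# particular solution, i.e. a generalized eigenvector.
v = np.array([1.0, 0.0, 0.0, 0.0])   # an eigenvector of A
w, *_ = np.linalg.lstsq(A, v, rcond=None)
print(np.allclose(A @ w, v))         # True
```

Any kernel element can be added to $w$, which is why the solutions below form a two-parameter family.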

Back to my example: if I do this (note that $(A-\lambda I)=A$):

$Aw_1=v_1$ gives:

$w_1\in \{\begin{pmatrix}0\\1\\1\\0\end{pmatrix}+t\begin{pmatrix}1\\0\\0\\0\end{pmatrix}+s\begin{pmatrix}0\\0\\-1\\1\end{pmatrix} | s,t\in\mathbb R\}$

$Aw_2=v_2$ gives:

$w_2\in \{\begin{pmatrix}0\\2\\1\\0\end{pmatrix}+o\begin{pmatrix}1\\0\\0\\0\end{pmatrix}+u\begin{pmatrix}0\\0\\-1\\1\end{pmatrix} | u,o\in\mathbb R\}$

We choose $t=s=o=u=1$ and get

$w_1 = \begin{pmatrix}1\\1\\0\\1\end{pmatrix}$

$w_2 = \begin{pmatrix}1\\2\\0\\1\end{pmatrix}$

So, we get the matrix $M=(v_1,w_1,v_2,w_2)$.

Is that correct? How do I go about such a case?

1 Answer


The eigenspace for the eigenvalue $\lambda=0$ is given by: $$ A\mathbf x= \begin{bmatrix}0&1&-1&-1\\ 0&0&0&0\\ 0&-1&2&2\\ 0&1&-2&-2 \end{bmatrix} \begin{bmatrix} x\\y\\z\\t \end{bmatrix}= \begin{bmatrix} 0\\0\\0\\0 \end{bmatrix} $$ which gives: $$ \begin{bmatrix} x\\y\\z\\t \end{bmatrix}= \begin{bmatrix} x\\0\\-t\\t \end{bmatrix} $$ so we can choose two linearly independent eigenvectors: $$\mathbf v_1= \begin{bmatrix} 0\\0\\-1\\1 \end{bmatrix}\qquad \mathbf v_2= \begin{bmatrix} 1\\0\\0\\0 \end{bmatrix} $$ Now, using $\mathbf v_1$, we can find a generalized eigenvector by searching for a solution of: $$ \begin{bmatrix}0&1&-1&-1\\ 0&0&0&0\\ 0&-1&2&2\\ 0&1&-2&-2 \end{bmatrix} \begin{bmatrix} x\\y\\z\\t \end{bmatrix}=\begin{bmatrix} 0\\0\\-1\\1 \end{bmatrix} $$ which gives a vector of the form $$ \begin{bmatrix} x\\y\\z\\t \end{bmatrix}= \begin{bmatrix} x\\-1\\-1-t\\t \end{bmatrix} $$ and, for $x=t=0$, we can choose the vector $\mathbf w_1=[0,-1,-1,0]^T$.

In the same way we can find the generalized eigenvector $\mathbf w_2=[0,2,1,0]^T$ as a solution of $A\mathbf x=\mathbf v_2$.

Now we have the matrix $$ M=[\mathbf v_1,\mathbf w_1,\mathbf v_2, \mathbf w_2]= \begin{bmatrix} 0&0&1&0\\ 0&-1&0&2\\ -1&-1&0&1\\ 1&0&0&0 \end{bmatrix} $$ with the inverse: $$ M^{-1}=\begin{bmatrix} 0&0&0&1\\ 0&1&-2&-2\\ 1&0&0&0\\ 0&1&-1&-1 \end{bmatrix} $$ and a Jordan decomposition of the matrix $A$ is: $$A= \begin{bmatrix}0&1&-1&-1\\ 0&0&0&0\\ 0&-1&2&2\\ 0&1&-2&-2 \end{bmatrix}= \begin{bmatrix} 0&0&1&0\\ 0&-1&0&2\\ -1&-1&0&1\\ 1&0&0&0 \end{bmatrix} \begin{bmatrix} 0&1&0&0\\ 0&0&0&0\\ 0&0&0&1\\ 0&0&0&0 \end{bmatrix} \begin{bmatrix} 0&0&0&1\\ 0&1&-2&-2\\ 1&0&0&0\\ 0&1&-1&-1 \end{bmatrix}= MJM^{-1} $$
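The decomposition can be checked numerically; a minimal numpy sketch, using the matrices $M$, $J$, and $M^{-1}$ exactly as written above:

```python
import numpy as np

A = np.array([[0,  1, -1, -1],
              [0,  0,  0,  0],
              [0, -1,  2,  2],
              [0,  1, -2, -2]], dtype=float)

# Columns of M: v1, w1, v2, w2 (eigenvector / generalized pair per block).
M = np.array([[ 0,  0, 1, 0],
              [ 0, -1, 0, 2],
              [-1, -1, 0, 1],
              [ 1,  0, 0, 0]], dtype=float)

# Jordan form: two 2x2 nilpotent blocks for lambda = 0.
J = np.array([[0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# The inverse stated above.
M_inv = np.array([[0, 0,  0,  1],
                  [0, 1, -2, -2],
                  [1, 0,  0,  0],
                  [0, 1, -1, -1]], dtype=float)

print(np.allclose(np.linalg.inv(M), M_inv))  # True
print(np.allclose(M @ J @ M_inv, A))         # True: A = M J M^{-1}
```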

This decomposition is not unique, in the sense that the matrix $M$ (and $M^{-1}$) can be different, because we can choose different eigenvectors and generalized eigenvectors.

If, as in the OP, we choose: $$\mathbf v'_1= \begin{bmatrix} 1\\0\\0\\0 \end{bmatrix}\qquad \mathbf v'_2= \begin{bmatrix} 1\\0\\1\\-1 \end{bmatrix} $$ then the generalized eigenvectors that satisfy the equations: $$ A\mathbf w'_1=\mathbf v'_1 \qquad A\mathbf w'_2=\mathbf v'_2 $$ become: $$ \mathbf w'_1= \begin{bmatrix} 1\\2\\0\\1 \end{bmatrix} \qquad \mathbf w'_2= \begin{bmatrix} 1\\3\\1\\1 \end{bmatrix} $$

(this seems to be the mistake in the OP: the stated solution sets for $w_1$ and $w_2$ do not actually solve $Aw_1=v_1$ and $Aw_2=v_2$) and we have a matrix $$ S= \begin{bmatrix} 1&1&1&1\\ 0&2&0&3\\ 0&0&1&1\\ 0&1&-1&1 \end{bmatrix} $$ and a Jordan decomposition: $$ SJS^{-1}=\begin{bmatrix} 1&1&1&1\\ 0&2&0&3\\ 0&0&1&1\\ 0&1&-1&1 \end{bmatrix} \begin{bmatrix} 0&1&0&0\\ 0&0&0&0\\ 0&0&0&1\\ 0&0&0&0 \end{bmatrix} \begin{bmatrix} 1&-2&2&3\\ 0&2&-3&-3\\ 0&1&-1&-2\\ 0&-1&2&2 \end{bmatrix}= \begin{bmatrix}0&1&-1&-1\\ 0&0&0&0\\ 0&-1&2&2\\ 0&1&-2&-2 \end{bmatrix}=A $$
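The same check works for $S$; a short numpy sketch that also confirms the chain relations $A\mathbf w'_i=\mathbf v'_i$ column by column:

```python
import numpy as np

A = np.array([[0,  1, -1, -1],
              [0,  0,  0,  0],
              [0, -1,  2,  2],
              [0,  1, -2, -2]], dtype=float)

# Columns of S: v1', w1', v2', w2'.
S = np.array([[1, 1,  1, 1],
              [0, 2,  0, 3],
              [0, 0,  1, 1],
              [0, 1, -1, 1]], dtype=float)

J = np.array([[0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# Chain relations, one per Jordan block: A w' = v'.
print(np.allclose(A @ S[:, 1], S[:, 0]))         # True
print(np.allclose(A @ S[:, 3], S[:, 2]))         # True
print(np.allclose(S @ J @ np.linalg.inv(S), A))  # True: A = S J S^{-1}
```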

Finally, note that the eigenspace of the eigenvalue $\lambda=0$ is the kernel of $A$. Since $A^2=0$, the vectors $\mathbf u_1=A\mathbf e_2$ and $\mathbf u_2=A\mathbf e_3$ also lie in the kernel, so they are eigenvectors of $A$, and $\mathbf e_2$ and $\mathbf e_3$ are the corresponding generalized eigenvectors; so another matrix that gives a Jordan decomposition is $N=[\mathbf u_1,\mathbf e_2,\mathbf u_2,\mathbf e_3]$.
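This last construction can also be sketched in numpy: since $A^2=0$, the columns $A\mathbf e_2$ and $A\mathbf e_3$ lie in $\ker A$, and $N=[A\mathbf e_2,\mathbf e_2,A\mathbf e_3,\mathbf e_3]$ again gives a Jordan decomposition:

```python
import numpy as np

A = np.array([[0,  1, -1, -1],
              [0,  0,  0,  0],
              [0, -1,  2,  2],
              [0,  1, -2, -2]], dtype=float)

J = np.array([[0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

e2, e3 = np.eye(4)[:, 1], np.eye(4)[:, 2]

# A is nilpotent of index 2, so A e2 and A e3 are in ker(A), i.e. they are
# eigenvectors, and e2, e3 are the corresponding generalized eigenvectors.
print(np.allclose(A @ A, 0))                     # True: A^2 = 0
N = np.column_stack([A @ e2, e2, A @ e3, e3])
print(np.allclose(N @ J @ np.linalg.inv(N), A))  # True: A = N J N^{-1}
```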

  • so the process itself was right? (2017-02-04)
  • Yes, the process is correct. See: https://en.wikipedia.org/wiki/Generalized_eigenvector (2017-02-05)
  • Hmm, I'm still confused. What was my mistake? You have $v_1=(0,0,-1,1)$ and I have $v_1=(0,0,1,-1)$, but both are in the kernel, so it shouldn't matter, should it? (2017-02-05)
  • Also: the given solution does this: it says $A^2=0$, then complements $\ker(A-\lambda I)$ with the vectors $e_2, e_3$ to a basis of $\mathbb R^4$. Note that $e_2,e_3\not\in\ker(A-\lambda I)$. But then $Ae_2$ and $Ae_3$ are in $\ker(A-\lambda I)$, so they get $S=(Ae_2,e_2,Ae_3,e_3)$. What exactly happens here? (2017-02-05)
  • Right! The mistake is not in the first step. I've edited my answer and added something. (2017-02-06)