  1. For a complex square matrix $M$, a maximal set of linearly independent eigenvectors for an eigenvalue $\lambda$ is found by solving $(M - \lambda I) x = 0$ directly as a homogeneous linear system and taking a basis of the solution subspace.
  2. For a complex square matrix $M$, a generalized eigenvector for an eigenvalue $\lambda$ with algebraic multiplicity $c$ is defined as a vector $u$ such that $(M - \lambda I)^c u = 0$. I wonder: is a generalized eigenbasis in the Jordan decomposition also determined by finding a basis of the solution subspace of $(M - \lambda I)^c u = 0$ directly, in the same way as for an eigenbasis? Or is it more difficult to solve directly as a homogeneous linear system, so that some tricks are helpful?
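For concreteness, the two computations described above can be sketched in sympy (the matrix chosen here is just an illustration, not from the question):

```python
import sympy as sp

# Example: a matrix with eigenvalue 1 of algebraic multiplicity 2
M = sp.Matrix([[1, 1], [0, 1]])
lam = 1
I = sp.eye(2)

# Ordinary eigenvectors: basis of the null space of (M - lam*I)
eigvecs = (M - lam * I).nullspace()
print(len(eigvecs))   # geometric multiplicity: 1

# Generalized eigenvectors: null space of (M - lam*I)^c with c = 2
genvecs = ((M - lam * I) ** 2).nullspace()
print(len(genvecs))   # algebraic multiplicity: 2
```

Note that the second null space is strictly larger than the first, which is exactly the gap the question is about: a basis of the larger space need not contain any ordinary eigenvector.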

Thanks!

1 Answer


Look at the matrix $M=\pmatrix{1&1\cr0&1\cr}$. Taking $\lambda=1$ and $c=2$, $(M-\lambda I)^c$ is the zero matrix, so any two linearly independent vectors will do as a basis for the solution space of $(M-\lambda I)^cu=0$. But that's not what you want: first, find as many linearly independent eigenvectors as you can; then go hunting for generalized eigenvectors.
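To make the answer's point concrete, here is a sketch in sympy (the particular matrix and chain vector are my choice, not from the answer). It uses a matrix similar to the one above, and builds a Jordan chain by picking a generalized eigenvector $u$ with $(M-I)u \ne 0$ and letting $v = (M-I)u$ be the genuine eigenvector:

```python
import sympy as sp

# A matrix similar to [[1, 1], [0, 1]]: trace 2, determinant 1, not the identity
M = sp.Matrix([[2, 1], [-1, 0]])
N = M - sp.eye(2)           # M - lam*I for the double eigenvalue lam = 1

# (M - I)^2 = 0, so ANY basis of C^2 solves (M - I)^2 u = 0 ...
assert (N ** 2).is_zero_matrix

# ... but a Jordan basis needs a chain: pick u with N*u != 0;
# then v = N*u is a genuine eigenvector and (v, u) is the chain.
u = sp.Matrix([1, 0])
v = N * u                   # v = [1, -1], and N*v = 0
assert N * v == sp.zeros(2, 1)

P = sp.Matrix.hstack(v, u)  # columns: eigenvector first, then generalized
print(P.inv() * M * P)      # Matrix([[1, 1], [0, 1]]) -- the Jordan block
```

Picking two arbitrary solutions of $(M-I)^2 u = 0$ (say $[1,1]$ and $[1,-1]$) would generally not yield this chain structure, which is why solving the single homogeneous system $(M-\lambda I)^c u = 0$ is not enough by itself.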

  • In general, I don't know. In this example, since $P$ can be any invertible matrix, $P^{-1}MP$ can be any matrix similar to $M$, and that means it can be any matrix with trace $2$ and determinant $1$, other than the identity matrix. – 2012-11-27