5

Is there any relation between the eigenvalues of a possibly non-Hermitian matrix $A$ and those of $\exp(A)$?

For Hermitian matrices, the eigenvalues of $\exp(A)$ are just the exponentials of the corresponding eigenvalues of $A$. But is there any relation in general?

Thanks.

3 Answers

6

The same relation holds. To see this, bring $A$ into Jordan normal form and observe that $\exp{(TAT^{-1})} = T\exp(A)T^{-1}$. Since powers of an upper triangular matrix are again upper triangular, the diagonal entries of the exponential of an upper triangular matrix are the exponentials of its diagonal entries; those diagonal entries are exactly the eigenvalues, and you're done.
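
For a quick numerical illustration (a minimal sketch, assuming NumPy and SciPy are available; the matrix below is just an arbitrary non-Hermitian, non-diagonalizable example), one can compare the eigenvalues of $\exp(A)$ with the exponentials of the eigenvalues of $A$:

```python
import numpy as np
from scipy.linalg import expm

# A is non-Hermitian and not diagonalizable: its characteristic polynomial is
# (lambda - 2)^2, but A - 2I has rank 1, so there is only one eigendirection.
A = np.array([[3.0, 1.0],
              [-1.0, 1.0]])

eig_A = np.linalg.eigvals(A)            # approximately [2, 2]
eig_expA = np.linalg.eigvals(expm(A))   # approximately [e^2, e^2]

print(np.sort(eig_expA))
print(np.sort(np.exp(eig_A)))           # should agree up to rounding
```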

What is special about the Hermitian case, though, is that $\exp{(iA)}$ is a unitary matrix, because $\exp(iA)^{\ast} = \exp(-iA^{\ast}) = \exp{(-iA)}$ and $\exp(A + B) = \exp(A)\exp(B)$ for commuting matrices.
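
And a corresponding sketch of the unitarity claim, under the same NumPy/SciPy assumption and with an arbitrary Hermitian example matrix:

```python
import numpy as np
from scipy.linalg import expm

# A Hermitian matrix: A equals its conjugate transpose.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

U = expm(1j * A)
print(np.allclose(U @ U.conj().T, np.eye(2)))   # expected: True (U is unitary)
```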

  • 0
    I did a numerical experiment: for example, for {{1, 2}, {3, 2}} I got eigenvalues {4, -1}. But computing $T\exp(A)T^{-1}$ and then its eigenvalues, I got {20.89374391, 1.131891102} instead of {e^4, e^{-1}}, which is {54.59815003, 0.3678794412}. Anything wrong here? (See the sketch after these comments.) (2011-01-29)
  • 0
    Just for you to check, here $T$ = {{0.5547001962, -0.7071067812}, {0.8320502943, 0.7071067812}}. (2011-01-29)
  • 0
    The reason for the formula I gave is that $\exp(TAT^{-1}) = \sum \frac{(TAT^{-1})^n}{n!} = \sum T \frac{A^{n}}{n!} T^{-1} = T \left(\sum \frac{A^{n}}{n!} \right) T^{-1} = T\exp(A)T^{-1}$. (2011-01-29)
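
The experiment from these comments can be reproduced as follows (a sketch assuming NumPy and SciPy; $A$ and $T$ are the matrices quoted above, and any invertible $T$ would give the same eigenvalues):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [3.0, 2.0]])                  # eigenvalues 4 and -1
T = np.array([[0.5547001962, -0.7071067812],
              [0.8320502943, 0.7071067812]])

B = T @ expm(A) @ np.linalg.inv(T)          # similar to exp(A), so same eigenvalues
print(np.sort(np.linalg.eigvals(B)))        # approximately [0.3679, 54.5982]
print(np.sort(np.exp(np.linalg.eigvals(A))))  # i.e. {e^{-1}, e^4}
```
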
5

The same result holds in general. This is probably easiest to see using the Jordan canonical form.

Results of this form go under the name "spectral mapping theorem". If $f$ is any power series whose disk of convergence contains the eigenvalues of $A$, then the eigenvalues of $f(A)$ are precisely the values of $f$ at the eigenvalues of $A$; this can be seen first for polynomials using the Jordan form and then by taking limits. You can also apply holomorphic functions using Cauchy integrals in contexts where power series aren't applicable, as described in a more general setting in the Wikipedia article on holomorphic functional calculus. The spectral mapping theorem holds there, too.
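
As an illustration of the spectral mapping idea for a function other than $\exp$ (a sketch assuming SciPy is available; `scipy.linalg.cosm` computes the matrix cosine via its power series, and the example matrix is an arbitrary non-symmetric one):

```python
import numpy as np
from scipy.linalg import cosm   # matrix cosine

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])      # non-symmetric, eigenvalues 1 and 3

print(np.sort(np.linalg.eigvals(cosm(A))))   # approximately [cos(3), cos(1)]
print(np.sort(np.cos(np.linalg.eigvals(A))))  # same values
```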

  • 0
    I did an experiment, as posted in the comments under Theo's reply. Could you please check what might be wrong there? I did not get this conclusion from my experiment. Thanks. (2011-01-29)
  • 0
    @Qiang Li: I don't know what you did exactly, but those eigenvalues for $T\exp(A)T^{-1}$ are incorrect. They are $e^4$ and $\frac{1}{e}$. (2011-01-29)
  • 0
    I made a mistake; it is true. Thank you. :) I also chose Theo's reply as the answer, because it sounds more concrete to me. Thank you also. (2011-01-29)
4

If you are willing to forget about problems of convergence for a moment, think first with polynomials: let

$$ f(t) = a_0 + a_1 t + a_2 t^2 + \cdots + a_n t^n $$

and $A$ any square matrix. Then

$$ f(A) = a_0 I + a_1 A + a_2 A^2 + \cdots + a_n A^n \ . $$

Assume $v$ is an eigenvector of $A$ with eigenvalue $\lambda$: $Av = \lambda v$. Then

$$ \begin{align} f(A)v &= a_0 Iv + a_1 Av + a_2 A^2v + \cdots + a_n A^nv \\ &= a_0 v + a_1 \lambda v + a_2 \lambda^2 v + \cdots + a_n \lambda^n v \\ &= \left( a_0 + a_1 \lambda + a_2 \lambda^2 + \cdots + a_n \lambda^n \right) v \\ &= f(\lambda ) v \ . \end{align} $$

That is, $v$ is also an eigenvector of $f(A)$ with eigenvalue $f(\lambda )$.
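
Here is a small numerical check of this identity (a sketch assuming NumPy; the polynomial and the matrix are arbitrary choices):

```python
import numpy as np

# f(t) = 1 + 2t + 3t^2, applied to a non-symmetric matrix A.
A = np.array([[1.0, 2.0],
              [3.0, 2.0]])
lam, V = np.linalg.eig(A)              # eigenvalues 4 and -1, with eigenvectors
v = V[:, 0]                            # eigenvector for lam[0]

fA = np.eye(2) + 2 * A + 3 * A @ A     # f(A) = I + 2A + 3A^2
flam = 1 + 2 * lam[0] + 3 * lam[0]**2  # f(lambda)

print(np.allclose(fA @ v, flam * v))   # expected: True
```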

The same result holds for analytic functions such as $\exp$: if

$$ f(t) = \sum_{n=0}^\infty a_n t^n $$

then $v$ is again an eigenvector of $f(A)$ with eigenvalue $f(\lambda)$, provided the series converges at $A$; for $\exp$ the series converges for every square matrix.

So, no matter what kind of square matrix $A$ is: if $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $v$ is also an eigenvector of $\exp(A)$, with eigenvalue $\exp(\lambda)$.
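
A final sketch of this conclusion (again assuming NumPy and SciPy; the example matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, -2.0],
              [1.0,  3.0]])            # non-symmetric, eigenvalues 1 and 2
lam, V = np.linalg.eig(A)
E = expm(A)

for k in range(len(lam)):
    v = V[:, k]
    print(np.allclose(E @ v, np.exp(lam[k]) * v))   # expected: True, True
```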

  • 0
    I like this answer. This explicitly handles both containments for the case where $A$ is diagonalizable, and I suppose the rest follows by density of the diagonalizable matrices and continuity of eigenvalues. (2011-01-29)
  • 0
    @Jona. Thank you. :-) (2011-01-29)