Is there any relation between the eigenvalues of possibly non-Hermitian matrix A and those of exp(A)?
For hermitian matrices, they are just exponentials of the corresponding values. But in general, any relation?
Thanks.
The same relation holds. To see this, bring $A$ into Jordan normal form and observe that $\exp{(TAT^{-1})} = T\exp(A)T^{-1}$. Since powers of an upper triangular matrix are upper triangular, the diagonal entries of the exponential of an upper triangular matrix are the exponentials of the diagonal entries of the original matrix, and you're done.
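As a quick numerical sanity check of this claim (a sketch using NumPy, with a truncated Taylor series standing in for the matrix exponential since this thread uses no particular library):

```python
import numpy as np

def expm_taylor(M, terms=40):
    """Matrix exponential via truncated Taylor series; fine for small, well-scaled matrices."""
    out = np.eye(M.shape[0], dtype=complex)
    term = out.copy()
    for k in range(1, terms):
        term = term @ M / k   # A^k / k!
        out = out + term
    return out

# A non-Hermitian (non-symmetric) matrix with eigenvalues 1 and 2.
A = np.array([[0.0, 1.0], [-2.0, 3.0]])

eig_A = np.sort(np.linalg.eigvals(A))
eig_expA = np.sort(np.linalg.eigvals(expm_taylor(A)))

# Eigenvalues of exp(A) match exp of eigenvalues of A.
print(np.allclose(np.exp(eig_A), eig_expA))  # True
```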
What is special about the Hermitian case, though, is that $\exp{(iA)}$ is a unitary matrix because $\exp(iA)^{\ast} = \exp{(-iA)}$ and $\exp(A + B) = \exp(A)\exp(B)$ for commuting matrices.
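The unitarity of $\exp(iA)$ for Hermitian $A$ can likewise be checked numerically (same truncated-Taylor sketch as above; the specific matrix is just an example):

```python
import numpy as np

def expm_taylor(M, terms=40):
    # truncated Taylor series; adequate for small, well-scaled matrices
    out = np.eye(M.shape[0], dtype=complex)
    term = out.copy()
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

# A Hermitian matrix (equal to its own conjugate transpose).
A = np.array([[2.0, 1 - 1j], [1 + 1j, 3.0]])
U = expm_taylor(1j * A)

# exp(iA) is unitary: U U* = I.
print(np.allclose(U @ U.conj().T, np.eye(2)))  # True
```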
The same result holds in general. This is probably easiest to see using the Jordan canonical form.
Results of this form go under the name "spectral mapping theorem". If $f$ is any power series whose disk of convergence contains the eigenvalues of $A$, then the eigenvalues of $f(A)$ are precisely the values of $f$ evaluated at the eigenvalues of $A$, as can be seen first with polynomials using Jordan form and then taking limits. You can also apply holomorphic functions using Cauchy integrals in contexts where power series aren't applicable, as seen in a more general context in the Wikipedia article on holomorphic functional calculus. The spectral mapping theorem holds there, too.
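The Jordan-form route covers even non-diagonalizable matrices, where the eigenvector argument below doesn't by itself account for the full spectrum. A small numerical illustration with a polynomial $p$ applied to a Jordan block (the matrix and polynomial are just examples):

```python
import numpy as np

# Non-diagonalizable matrix: a 2x2 Jordan block with eigenvalue 2 (twice).
J = np.array([[2.0, 1.0], [0.0, 2.0]])

def p(t):
    return t**2 - 3*t + 1

# p applied to the matrix: matrix powers, with the identity for the constant term.
pJ = J @ J - 3*J + np.eye(2)

# Both eigenvalues of p(J) equal p(2) = -1, as the spectral mapping theorem predicts.
print(np.allclose(np.linalg.eigvals(pJ), p(2.0)))  # True
```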
If you are willing to forget about problems of convergence for a moment, think first with polynomials: let
$ f(t) = a_0 + a_1 t + a_2 t^2 + \cdots + a_n t^n $
and $A$ any square matrix. Then
$ f(A) = a_0 I + a_1 A + a_2 A^2 + \cdots + a_n A^n \ . $
Assume $v$ is an eigenvector of $A$ with eigenvalue $\lambda$: $Av = \lambda v$. Then
$ \begin{align} f(A)v &= a_0 Iv + a_1 Av + a_2 A^2v + \cdots + a_n A^nv \\ &= a_0 v + a_1 \lambda v + a_2 \lambda^2 v + \cdots + a_n \lambda^n v \\ &= \left( a_0 + a_1 \lambda + a_2 \lambda^2 + \cdots + a_n \lambda^n \right) v \\ &= f(\lambda ) v \ . \end{align} $
That is, $v$ is also an eigenvector of $f(A)$ with eigenvalue $f(\lambda )$.
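The computation above can be replayed numerically for a concrete polynomial and matrix (both chosen here purely for illustration):

```python
import numpy as np

# f(t) = 1 + 2t + t^2, applied to a non-symmetric matrix A with eigenvalues 1 and 2.
A = np.array([[0.0, 1.0], [-2.0, 3.0]])
fA = np.eye(2) + 2*A + A @ A

lam, V = np.linalg.eig(A)
for i in range(2):
    v = V[:, i]
    f_lam = 1 + 2*lam[i] + lam[i]**2
    # f(A) v = f(lambda) v for each eigenpair.
    print(np.allclose(fA @ v, f_lam * v))  # True
```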
The same result holds for analytic functions, such as $\exp$: if
$ f(t) = \sum_{n=0}^\infty a_n t^n $
then also $v$ is an eigenvector of $f(A)$ with eigenvalue $f(\lambda )$.
So, no matter what kind of square matrix $A$ is: if $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $v$ is also an eigenvector of $\exp (A)$ with eigenvalue $\exp (\lambda )$.
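This conclusion can be checked directly on the eigenvectors themselves (a sketch using a truncated Taylor series for $\exp(A)$; the matrix is an arbitrary non-Hermitian example):

```python
import numpy as np

def expm_taylor(M, terms=40):
    # truncated Taylor series for exp(M); fine for small, well-scaled matrices
    out = np.eye(M.shape[0], dtype=complex)
    term = out.copy()
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0.0, 1.0], [-2.0, 3.0]])   # non-Hermitian, eigenvalues 1 and 2
E = expm_taylor(A)

lam, V = np.linalg.eig(A)
for i in range(len(lam)):
    v = V[:, i]
    # exp(A) v = exp(lambda) v for each eigenpair.
    print(np.allclose(E @ v, np.exp(lam[i]) * v))  # True
```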