Can anyone give a hint for proving this? I think some kind of inverse function theorem argument would work, but I wasn't able to make it precise — for instance, using the continuity of the mapping $X \mapsto \exp(X)$, or the fact that this mapping has full rank near $A$ when $A$ is sufficiently close to the identity...
$\exp(X)=A$ has a solution if $A$ is sufficiently close to the identity matrix
3 Answers
Since $\exp(X) = I + X + O(X^2)$, the derivative of $\exp$ at $0$ is the identity map. The inverse function theorem then shows that $\exp$ is a diffeomorphism from a neighbourhood of $0$ onto a neighbourhood of $I = \exp(0)$; in particular, $\exp(X) = A$ has a solution for every $A$ sufficiently close to $I$.
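A quick numerical sanity check (not a proof) of the key fact above — that the derivative of $\exp$ at $0$ is the identity map — can be done with a finite difference; the test direction `E` below is an arbitrary choice:

```python
# Check that expm(t*E) ~ I + t*E for small t, i.e. the directional
# derivative of the matrix exponential at 0 in direction E is E itself.
import numpy as np
from scipy.linalg import expm

n = 3
t = 1e-6
rng = np.random.default_rng(0)
E = rng.standard_normal((n, n))

# Finite-difference approximation of the derivative of exp at 0:
deriv = (expm(t * E) - np.eye(n)) / t
print(np.linalg.norm(deriv - E))  # small: the derivative acts as the identity on E
```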
If $A=I+B$, then the usual Taylor series for the logarithm, $\ln(1+x)=x-\frac12 x^2+\frac13x^3-\frac14x^4\pm\ldots$, helps. Show that plugging in a matrix $B$ still converges if $\|B\|$ is small. Then convince yourself that the identity "$\log$ is the inverse of $\exp$" transfers from real power series to matrix power series.
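A minimal sketch of this approach: compute $\log(I+B)$ by the truncated Taylor series $B - \frac12 B^2 + \frac13 B^3 - \ldots$ (which converges for $\|B\| < 1$) and check numerically that the matrix exponential recovers $A$. The truncation order and the particular $B$ are arbitrary choices for illustration:

```python
import numpy as np
from scipy.linalg import expm

def log_series(B, terms=50):
    """log(I + B) via the alternating Taylor series; needs ||B|| < 1."""
    X = np.zeros_like(B)
    P = np.eye(B.shape[0])          # running power B^k
    for k in range(1, terms + 1):
        P = P @ B                   # P = B^k
        X += ((-1) ** (k + 1) / k) * P
    return X

B = 0.1 * np.array([[0.0, 1.0], [2.0, 1.0]])   # small perturbation of I
A = np.eye(2) + B
X = log_series(B)
print(np.linalg.norm(expm(X) - A))  # close to 0: exp inverts the series log
```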
I agree with your ideas, but isn't it actually quite hard to prove $\log(\exp(A)) = A$ and $\exp(\log(A)) = A$ for a matrix $A$? – 2012-10-31
$\exp(X)$ is always an invertible matrix, regardless of whether $X$ is. So a necessary condition for a solution to exist is that $A$ be invertible. Now consider the case where $A$ is invertible and also diagonalizable (this is less special than it sounds: the set of diagonalizable matrices is in fact dense in the space of all matrices). Then $A=S\Lambda S^{-1}$ for some invertible matrix $S$. Look for a solution of the form $X=SDS^{-1}$, where $D$ is a diagonal matrix to be found, with diagonal entries $D_{i,i}=d_i$. It is straightforward to verify that \begin{align} \exp(X)=S\exp(D)S^{-1}. \end{align} So we are left with solving the equation \begin{align} \exp(D)=\Lambda, \end{align} that is, the scalar equations $e^{d_i}=\lambda_{i}$, which always have complex solutions because each $\lambda_i \neq 0$ ($A$ is invertible). Note that this works for any invertible and diagonalizable matrix, irrespective of whether it is close to the identity or not.
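The diagonalization argument above can be illustrated numerically: diagonalize an invertible $A$, take a complex logarithm of each eigenvalue, and conjugate back. The particular matrix $A$ below is an arbitrary example, deliberately not close to the identity:

```python
# Construct X = S @ diag(log(lam)) @ inv(S) from A = S @ diag(lam) @ inv(S)
# and verify exp(X) = A. Assumes A is invertible and diagonalizable.
import numpy as np
from scipy.linalg import expm

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # invertible, diagonalizable, not near I

lam, S = np.linalg.eig(A)           # A = S @ diag(lam) @ inv(S)
D = np.diag(np.log(lam.astype(complex)))  # complex log handles negative lam_i too
X = S @ D @ np.linalg.inv(S)

print(np.linalg.norm(expm(X) - A))  # close to 0: exp(X) = A
```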
Thanks for the very general argument. – 2012-10-31