Let $X \in \mathbf M_n(\mathbb{R})$ be an arbitrary real $n\times n$ matrix. How can we prove rigorously that $$\exists\, b>0 \ \ \forall\, t \text{ with } |t|\le b:\quad \det (I + t X) \neq 0\,?$$ If necessary, we could also assume that $t \ge 0.$
How can one prove that $I + t X$ is invertible for small enough $|t|$?
-
I have seen it in a couple of papers in computer science, along with mathbf for vectors. But I will try to avoid it from now on, on math.stackexchange ;-) – 2012-07-03
5 Answers
Note that for $t\neq 0$, $\det(I+tX) = t^n\det(\frac{1}{t}I+X) = 0$ only if $-\frac{1}{t}$ is an eigenvalue of $X$. If $t=0$ the statement is trivial, since $\det(I)=1$, so we exclude this case.
Since $X$ has only finitely many eigenvalues, their magnitudes are bounded. For all nonzero $t$ with $|t|$ sufficiently small, $\left|\frac{-1}{t}\right| = \frac{1}{|t|}$ will be larger than this bound, so $-\frac{1}{t}$ will not be an eigenvalue of $X$, and hence $\det(I+tX) \neq 0.$
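As a quick numerical sanity check of this argument (just a sketch assuming NumPy; the random matrix, the seed, and the margin $b = \frac{1}{2\rho}$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
X = rng.standard_normal((n, n))      # an arbitrary real n x n matrix

eigs = np.linalg.eigvals(X)          # finitely many eigenvalues
rho = np.max(np.abs(eigs))           # bound on their magnitudes (spectral radius)

b = 0.5 / rho                        # any b < 1/rho works (rho > 0 for this random X)
for t in np.linspace(-b, b, 9):
    # for t != 0 we have |-1/t| >= 2*rho > rho, so -1/t is not an eigenvalue of X
    assert abs(np.linalg.det(np.eye(n) + t * X)) > 0
```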
-
Nice one! I especially like that it only relies on basic linear algebra. – 2012-07-03
A matrix is invertible iff its determinant is nonzero. Moreover, the determinant is a continuous function, so $GL_n$ is open as a subset of the space of all $n\times n$ matrices; in particular, there is a neighborhood of $I$ containing no singular matrices.
In particular, shifting $I$ (or any other invertible matrix) by a small enough multiple of any given matrix $X$ (not necessarily invertible) does not move it out of that neighborhood, so the shifted matrix remains invertible.
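A small numerical illustration of this openness argument (a sketch assuming NumPy; the particular $M$ and $X$ below are arbitrary, and $X$ is made singular on purpose to stress that it need not be invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = np.diag(np.arange(1.0, n + 1))   # an invertible matrix (det = 24)
X = rng.standard_normal((n, n))
X[0] = X[1]                          # make X singular: the argument does not care

# shifting M by a small multiple of X stays inside the open set GL_n
for t in [1e-1, 1e-2, 1e-3]:
    print(t, np.linalg.det(M + t * X))   # stays close to det(M) = 24, hence nonzero
```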
Let $\lVert\cdot\rVert$ be a submultiplicative norm on the set of $n\times n$ real matrices, denoted $\mathbf M_n(\Bbb R)$ (we can take $\lVert M\rVert:=\sup_{x\neq 0}\frac{\lVert Mx\rVert}{\lVert x\rVert}$, where the norm on the right is the Euclidean norm). This norm makes $\mathbf M_n(\Bbb R)$ complete (since it is a norm on a finite-dimensional vector space over $\Bbb R$). For $A\in \mathbf M_n(\Bbb R)$ of norm $<1$, we have $(I-A)\sum_{j=0}^{+\infty}A^j=I=\sum_{j=0}^{+\infty}A^j(I-A),$ hence $I-A$ is invertible (the series is normally convergent, and since the space is complete, it converges).
If $X=0$, the statement is obviously true, and if $X\neq 0$, we can take any $b<\frac 1{\lVert X\rVert}$: for $|t|\le b$ the matrix $A:=-tX$ satisfies $\lVert A\rVert = |t|\,\lVert X\rVert<1$, so $I+tX=I-A$ is invertible.
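Numerically, the same bound can be checked directly (a sketch assuming NumPy, using the spectral norm as the submultiplicative norm; the factor $0.9$ is an arbitrary safety margin):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
X = rng.standard_normal((n, n))

norm_X = np.linalg.norm(X, 2)        # operator norm induced by the Euclidean norm
b = 0.9 / norm_X                     # any b < 1/||X|| works

for t in np.linspace(-b, b, 7):
    A = -t * X
    assert np.linalg.norm(A, 2) < 1                   # hypothesis of the lemma above
    assert abs(np.linalg.det(np.eye(n) - A)) > 0      # so I + tX = I - A is invertible
```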
The determinant of a matrix is a polynomial (and hence continuous) function of its $n^2$ entries. As $t\to 0$, the diagonal entries of $I+tX$ tend to $1$ and all other entries tend to $0$, so the determinant tends to $\det(I) = 1.$ Hence there is a neighbourhood of $t=0$ on which $\det(I+tX)$ is positive, so all those matrices are invertible.
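For instance, with $n = 2$ and $X = (x_{ij})$ this is explicit: $$\det(I + tX) = \det\begin{pmatrix}1 + t x_{11} & t x_{12}\\ t x_{21} & 1 + t x_{22}\end{pmatrix} = 1 + t\,(x_{11}+x_{22}) + t^2\,(x_{11}x_{22}-x_{12}x_{21}) \xrightarrow[t\to 0]{} 1.$$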
$\det(A)$ is a polynomial function in the entries of $A$. The solution set to $\det(A) = 0$ is a closed subset, and so the set of invertible matrices is an open set. In particular, there is an open neighborhood $U$ of $I$ such that every matrix in $U$ is invertible. By choosing $t$ sufficiently small, we guarantee $I + tX \in U$.
$\det(I + tX)$ is a polynomial function of $t$ whose value at $t=0$ is $\det(I) = 1$; in particular it is not identically zero, so $\det(I + tX) = 0$ has only finitely many roots, and $t=0$ is not one of them. Therefore, we can find an open interval containing $0$ on which $I + tX$ is invertible.
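To see this numerically (a sketch assuming NumPy: the polynomial is recovered by interpolation at $n+1$ sample points, and the random $X$ and the factor $\frac12$ are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
X = rng.standard_normal((n, n))

# det(I + tX) is a polynomial of degree <= n in t; recover it from n+1 samples
ts = np.linspace(-2.0, 2.0, n + 1)
vals = [np.linalg.det(np.eye(n) + t * X) for t in ts]
coeffs = np.polyfit(ts, vals, n)     # coefficients, highest degree first

roots = np.roots(coeffs)             # finitely many roots, and t = 0 is not one of them
b = 0.5 * np.min(np.abs(roots))      # I + tX is invertible for every |t| <= b
print(roots, b)
```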
We can compute the Taylor series for $(I + tX)^{-1}$ about 0:
$ (I + tX)^{-1} = I - t X + t^2 X^2 - t^3 X^3 + \cdots $
It's not difficult to see that the right-hand side converges in every entry of the matrix (e.g. by the ratio test together with an upper bound on the entries of $X^n$) for $t$ in an interval of positive radius around $0$. Multiplying through by $(I + tX)$, one checks that the sum really is the inverse.
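A quick check of this convergence (a sketch assuming NumPy; the value of $t$, the number of terms, and the random $X$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
X = rng.standard_normal((n, n))
t = 0.1 / np.linalg.norm(X, 2)       # small enough for the series to converge

# partial sums of I - tX + t^2 X^2 - t^3 X^3 + ...
S = np.zeros((n, n))
term = np.eye(n)
for _ in range(30):
    S += term
    term = term @ (-t * X)

# compare with the actual inverse: the difference is at machine-precision level
print(np.max(np.abs(S - np.linalg.inv(np.eye(n) + t * X))))
```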