
Suppose I have an idempotent matrix $A$, that is, $A^2=A$. From its properties, if $A$ is not the identity matrix, then it is singular. Through trial and error, I can see that $I+A$ is invertible for every such $A$.

But how can I show that $I+A$ is indeed invertible, and then find its inverse? I searched and found that $(I+A)^{-1}=\frac{1}{2}(2I-A)$, so $I+A$ must have an inverse, but how was this formula derived?

  • 0
    Are you assuming only idempotence, or also symmetry? (2011-08-21)
  • 0
    @Geoff: The eigenvalues of $A$ are repeated $0$ and $1$. The eigenvalues of $I+A$ are repeated $1$. So the determinant shows that $I+A$ is indeed invertible. Is this right? But how can I make use of the eigenvalues to derive the inverse formula for $I+A$? (2011-08-21)
  • 0
    @cardinal: It need not be symmetric; it can be a non-symmetric matrix too. (2011-08-21)
  • 0
    Potentially it was derived by simple guess and check: $(I + A)(I - \frac{1}{2}A) = I$, where idempotence is used since $-\frac{1}{2} A^2 = -\frac{1}{2} A$, so you're done (a numerical check of this appears after these comments). (2011-08-21)
  • 6
    More generally, if $A^2 = A$, then $(I + a A)(I + tA) = I + (a + t + at) A$. So solve $a + t + at = 0$ for $t$. (2011-08-21)
  • 0
    @Robert: How did you derive the part $I+(a+t+at)A$? (2011-08-21)
  • 2
    $(I + a A)(I + tA) = (I + a A) + t(I + a A)A = I + aA + tA + atA^2 = \dots$, and then use idempotence of $A$. (2011-08-21)
  • 0
    Oh yes... the eigenvalues should be $1$ and $2$. Thanks everyone for the help! (2011-08-21)
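
The guess-and-check identity and the more general relation from the comments are easy to verify numerically. Here is a minimal sketch (the rank-one projection used for $A$ is just an illustrative choice):

```python
import numpy as np

# An example idempotent matrix: projection onto the line spanned by (1, 2)
v = np.array([[1.0], [2.0]])
A = v @ v.T / (v.T @ v)          # A @ A == A (a rank-1 projection)
I = np.eye(2)

assert np.allclose(A @ A, A)                      # idempotence
assert np.allclose((I + A) @ (I - 0.5 * A), I)    # (I + A)(I - A/2) = I

# The general relation (I + aA)(I + tA) = I + (a + t + at)A:
a = 3.0
t = -a / (1 + a)                                  # solves a + t + at = 0
assert np.allclose((I + a * A) @ (I + t * A), I)
print("all identities verified")
```

Any value $a \neq -1$ works in the last check; $a=1$, $t=-\tfrac12$ recovers $(I+A)^{-1} = I - \tfrac12 A$.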

2 Answers

8

If $A$ is a diagonalizable invertible matrix, you can use Lagrange's Interpolation Formula to invert it. More precisely, if the distinct eigenvalues are $\lambda_1,\dots,\lambda_k$, then we have $$A^{-1}=\sum_{i=1}^k\ \frac{1}{\lambda_i}\ \prod_{j\not=i}\ \frac{A-\lambda_jI}{\lambda_i-\lambda_j}\quad.$$ More generally, the inverse of an invertible matrix $A$ is a polynomial in $A$, which depends only on the minimal polynomial of $A$, and which is given by a simple formula. If you're interested, I'll be happy to give you more details.
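
As a quick sanity check of this formula, here is a minimal numpy sketch (the idempotent matrix and the hard-coded eigenvalues $1$ and $2$ are illustrative assumptions taken from the question):

```python
import numpy as np

# M = I + A for a rank-1 projection A; the distinct eigenvalues of M are 1 and 2.
A = np.array([[0.2, 0.4], [0.4, 0.8]])      # A @ A == A
M = np.eye(2) + A
lams = [1.0, 2.0]                           # distinct eigenvalues of M

# Lagrange interpolation formula for the inverse.
Minv = np.zeros((2, 2))
for i, li in enumerate(lams):
    term = np.eye(2) / li
    for j, lj in enumerate(lams):
        if j != i:
            term = term @ (M - lj * np.eye(2)) / (li - lj)
    Minv += term

assert np.allclose(Minv @ M, np.eye(2))
assert np.allclose(Minv, np.eye(2) - A / 2)   # matches (I + A)^{-1} = I - A/2
```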

EDIT 1. @rcollyer kindly asked for more details. Here they are.

Let $A$ be a complex invertible square matrix with minimal polynomial $f\in\mathbb C[X]$, and let $$f=(X-\lambda_1)^{m(1)}\cdots(X-\lambda_k)^{m(k)}$$ be the factorization of $f$, where the $\lambda_j$ are the distinct eigenvalues of $A$.

There is a unique polynomial $g$ of degree less than the degree $d$ of $f$ such that $g(A)=A^{-1}$. Moreover $g$ is given by the following recipe.

For any rational fraction $\varphi\in\mathbb C(X)$ defined at $\lambda_j$, let $T_j(\varphi)$ be the Taylor approximation of $\varphi$ at $X=\lambda_j$ of degree less than $m(j)$.

Put $g_j:=T_j(1/X)$.

Then $g$ is the unique solution of degree less than $d$ to the congruences $$g\equiv g_j\bmod(X-\lambda_j)^{m(j)},\quad 1\le j\le k.$$ More precisely, $g$ is given by $$g=\sum_{j=1}^k\ T_j\!\!\!\!\left(g_j\ \frac{(X-\lambda_j)^{m(j)}}{f}\right)\ \frac{f}{(X-\lambda_j)^{m(j)}}\quad.$$

Again, I'll be happy to offer any further explanation I can give.
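
Here is a sympy sketch of this recipe (a sketch only: the example matrix is an arbitrary non-diagonalizable one, and its characteristic polynomial, which happens to coincide with its minimal polynomial, is used for $f$):

```python
import sympy as sp

X = sp.symbols('X')

# Example: an invertible, non-diagonalizable matrix. Its characteristic
# polynomial (X - 2)^2 (X - 3) coincides with its minimal polynomial here.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])
f = (X * sp.eye(3) - A).det()           # annihilating polynomial of A
mults = sp.roots(f)                     # {lambda_j: m(j)}

def T(expr, lam, m):
    """Taylor approximation of expr at X = lam, of degree less than m."""
    return sp.series(expr, X, lam, m).removeO()

# g = sum_j T_j( g_j * (X - lam_j)^m(j) / f ) * f / (X - lam_j)^m(j)
g = sp.Integer(0)
for lam, m in mults.items():
    g_j = T(1 / X, lam, m)                      # T_j(1/X)
    cofactor = sp.cancel(f / (X - lam)**m)      # f with the (X - lam)^m factor removed
    g += T(sp.cancel(g_j / cofactor), lam, m) * cofactor

# Evaluate the polynomial g at the matrix A (Horner scheme) and compare with A^{-1}.
gA = sp.zeros(3, 3)
for c in sp.Poly(sp.expand(g), X).all_coeffs():  # coefficients, highest degree first
    gA = gA * A + c * sp.eye(3)

assert gA == A.inv()
print(sp.expand(g))
```

For this matrix the printed polynomial is $\tfrac{1}{12}X^{2}-\tfrac{7}{12}X+\tfrac{4}{3}$, a degree-$2$ polynomial $g$ with $g(A)=A^{-1}$.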

EDIT 2. How to prove the above claims? There are three ingredients:

(1) The canonical epimorphism $\mathbb C[X]\twoheadrightarrow\mathbb C[A]$ induces an isomorphism $\mathbb C[X]/(f)\overset\sim\to\mathbb C[A]$.

(2) By the Chinese Remainder Theorem, the natural morphism from $\mathbb C[X]/(f)$ to the product of the $\mathbb C[X]/(X-\lambda_j)^{m(j)}$ is an isomorphism.

(3) Taylor's formula enables one to invert the above isomorphism.

I'll just say a few words about (3). Suppose for simplicity $k=3$. So, we want to solve $$g\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ $$g\equiv g_2\bmod(X-\lambda_2)^{m(2)},$$ $$g\equiv g_3\bmod(X-\lambda_3)^{m(3)}.$$ Suppose we can solve the system $$h_1\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ $$h_1\equiv 0\bmod(X-\lambda_2)^{m(2)},$$ $$h_1\equiv 0\bmod(X-\lambda_3)^{m(3)};$$ the system $$h_2\equiv 0\bmod(X-\lambda_1)^{m(1)},$$ $$h_2\equiv g_2\bmod(X-\lambda_2)^{m(2)},$$ $$h_2\equiv 0\bmod(X-\lambda_3)^{m(3)};$$ and the system $$h_3\equiv 0\bmod(X-\lambda_1)^{m(1)},$$ $$h_3\equiv 0\bmod(X-\lambda_2)^{m(2)},$$ $$h_3\equiv g_3\bmod(X-\lambda_3)^{m(3)}.$$ Then we just set $g:=h_1+h_2+h_3$. How do we solve the system for $h_1$? The last two congruences tell us that $h_1$ must be of the form $(X-\lambda_2)^{m(2)}(X-\lambda_3)^{m(3)}u$, and we only have to solve $$(X-\lambda_2)^{m(2)}\ (X-\lambda_3)^{m(3)}\ u\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ which we can write as $$T_1\Big((X-\lambda_2)^{m(2)}\ (X-\lambda_3)^{m(3)}\ u\Big)=T_1(g_1),$$ where $T_1$ denotes, as above, the Taylor approximation at $X=\lambda_1$ of degree less than $m(1)$. This gives $$T_1(u)=T_1\!\!\left(\frac{g_1}{(X-\lambda_2)^{m(2)}(X-\lambda_3)^{m(3)}}\right),$$ whence the formula.
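
For a concrete small illustration (with $k=2$ rather than $3$, the data being purely hypothetical, the same example as in the sketch above), take $f=(X-2)^2(X-3)$, so that $m(1)=2$, $m(2)=1$, $g_1=T_1(1/X)=\tfrac12-\tfrac14(X-2)$ and $g_2=T_2(1/X)=\tfrac13$. The recipe gives $$h_1=(X-3)\,T_1\!\left(\frac{g_1}{X-3}\right)=(X-3)\left(-\tfrac12-\tfrac14(X-2)\right),\qquad h_2=(X-2)^2\,T_2\!\left(\frac{g_2}{(X-2)^2}\right)=\tfrac13(X-2)^2,$$ so that $$g=h_1+h_2=\tfrac12-\tfrac14(X-2)+\tfrac1{12}(X-2)^2,$$ and one checks directly that $Xg\equiv1\bmod(X-2)^2$ and $Xg\equiv1\bmod(X-3)$.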

EDIT 3. The general formula appears in the entry Hermite interpolation formula of the Encyclopaedia of Mathematics, edited by Michiel Hazewinkel.

  • 0
    I am interested in more detail, if you would. A reference would be sufficient. (2011-08-21)
  • 0
    Dear @rcollyer: Thanks for your interest. I don't really know a convenient reference for this, but it's so simple that it's easier to describe (and prove) the statements. The main point is that it's somewhat artificial to restrict oneself to computing the inverse of an invertible matrix. The argument works for any "function" of your matrix, like the exponential, and one can define in a precise way what is meant here by "function". ... (2011-08-21)
  • 0
    Dear @rcollyer: ... I tried to describe the case of the exponential function [here](http://math.stackexchange.com/questions/33851/how-to-calculate-the-matrix-exponential-explicitly-for-a-matrix-which-isnt-diago/34139#34139). I'll try to add something to the above answer for you, but perhaps not right away (it's almost 11pm where I am right now, and I'm not a night bird...). (2011-08-21)
  • 0
    Dear @rcollyer, I've just edited the post. (2011-08-21)

1

If a square $n \times n$ matrix $M$ has nonzero determinant, then by the Cayley-Hamilton theorem $M^{-1}$ exists and is a polynomial in $M$ of degree $\leq n-1$.

Here $M = I + A$, and since $A^2 = A$ the matrix $M$ satisfies a polynomial identity of degree $2$ (written out below), so the same argument applies with $2$ in place of $n$: the inverse, if it exists, is a linear function of $A$. It does exist, by the eigenvalue argument in the comments. The coefficients of the linear function can be found by various means, such as specializing $A$ to $0$ or $I$, or solving for $x$ and $y$ such that $(I+A)(xI+yA)=I$.
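
For instance, the last approach can be carried out mechanically with sympy; a minimal sketch, using an arbitrary rank-one projection as a stand-in for $A$:

```python
import sympy as sp

x, y = sp.symbols('x y')

# A concrete idempotent matrix (a rank-1 projection), standing in for A.
A = sp.Matrix([[1, 2], [2, 4]]) / 5
I = sp.eye(2)
assert A * A == A

# Solve (I + A)(x I + y A) = I entrywise for x and y.
eqs = (I + A) * (x * I + y * A) - I
print(sp.solve(list(eqs), [x, y]))      # {x: 1, y: -1/2}, i.e. (I + A)^{-1} = I - A/2
```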

The eigenvalue argument and the formula for the inverse assume that the matrix entries are taken from a field (or ring) where division by $2$ is possible. In characteristic 2 the statement is false; if $2=0$ then $A=I$ is idempotent but $I+A=0$.

Another method is to write the idempotence of $A$ as a condition on $M$:

$(M-I)^2 = M-I$, i.e. $M^2 - 3M + 2I = 0$, which is the same as $M(3I - M)=2I$. The inverse of $M$ is therefore $(3I - M)/2 = (3I - (I+A))/2 = I - A/2$.
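
A quick numerical check of this route (the random orthogonal projection used for $A$ is just an illustrative choice):

```python
import numpy as np

# Random orthogonal projection A onto a 2-dimensional subspace of R^4 (idempotent).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
A = B @ np.linalg.inv(B.T @ B) @ B.T
I = np.eye(4)
M = I + A

assert np.allclose(A @ A, A)                        # idempotence
assert np.allclose(M @ M - 3 * M + 2 * I, 0)        # (M - I)^2 = M - I, rearranged
assert np.allclose(np.linalg.inv(M), I - A / 2)     # M^{-1} = I - A/2
```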