8

Question:

If a square matrix $A$ satisfies $A^2=I$ and $\det A>0$, show that $A+I$ is non-singular.

I tried supposing a non-zero vector $x$ with $(A+I)x=0$, i.e. $Ax=-x$, but failed to derive a contradiction.

I also tried to find the inverse of $A+I$ directly by supposing $(A+I)^{-1}=\alpha I +\beta A$, but that approach doesn't work either.

(Update: according to the answers below, the claim in the question is false.)

3 Answers

10

Since $A^2=I$, we have $(I+A)(I-A)=I-A^2=0$. So if $I+A$ were invertible, multiplying on the left by $(I+A)^{-1}$ would give $I-A=0$; hence, for $A$ satisfying $A^2=I$, $I+A$ is invertible if and only if $A=I$.

This works for $A$ in any ring with unit in which $2$ is invertible, not only a ring of square matrices.

  • For example, the $2 \times 2$ diagonal matrix $A$ with diagonal entries $1$ and $-1$ satisfies $A^2=I$ and $A \neq I$, and $A+I=\operatorname{diag}(2,0)$ is singular. (Note, though, that $\det A=-1$, so it does not satisfy the question's hypothesis $\det A>0$.)
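
For anyone who wants to see the identity in action, here is a quick numerical sanity check on the $\operatorname{diag}(1,-1)$ example from the comment above (a minimal sketch, assuming NumPy is available):

```python
import numpy as np

# A is an involution (A^2 = I) that is not I itself.
A = np.diag([1.0, -1.0])
I = np.eye(2)

print(np.allclose(A @ A, I))                             # True: A^2 = I
print(np.allclose((I + A) @ (I - A), np.zeros((2, 2))))  # True: (I+A)(I-A) = 0
print(np.linalg.det(I + A))                              # 0.0:  I + A is singular
```
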
18

$-I$ of even size is a counterexample: $(-I)^2=I$, $\det(-I)=(-1)^n=1>0$ when $n$ is even, and $-I+I=0$ is certainly singular.
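
To spell this out concretely for $n=4$ (a minimal NumPy sketch; any even size works):

```python
import numpy as np

n = 4                                  # any even size works
A = -np.eye(n)

print(np.allclose(A @ A, np.eye(n)))   # True: A^2 = I
print(np.linalg.det(A))                # 1.0:  det A = (-1)^n > 0 for even n
print(np.linalg.det(A + np.eye(n)))    # 0.0:  A + I = 0 is singular
```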

14

This seems to be false: consider the $3 \times 3$ diagonal matrix with diagonal entries $1,-1,-1$. Similarly, taking an $n \times n$ diagonal matrix with two entries $-1$ and all the rest equal to $1$ gives a counterexample for any $n \geq 2$.

(A comment on how I came up with this: the matrix $A+I$ is singular iff $-I-A$ is singular iff $-1$ is an eigenvalue of $A$. The condition $A^2 = I$ forces each of the eigenvalues to be $\pm 1$ and the condition $\operatorname{det} A > 0$ forces the number of instances of $-1$ to be even, but this is not enough to give the result.)

Of course the above reasoning would also lead to Ricky Demer's counterexample, and probably should have. For some reason I thought of the above first.
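
For completeness, the eigenvalue reasoning can be checked numerically on the $\operatorname{diag}(1,-1,-1)$ counterexample (again a minimal sketch, assuming NumPy):

```python
import numpy as np

A = np.diag([1.0, -1.0, -1.0])

print(np.linalg.eigvals(A))          # [ 1. -1. -1.]: every eigenvalue is +/-1
print(np.linalg.det(A))              # 1.0: an even number of -1's, so det A > 0
print(np.linalg.det(A + np.eye(3)))  # 0.0: -1 is an eigenvalue, so A + I is singular
```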