
Now, I'm wondering how to prove that a square matrix $A$ is regular if and only if $\text{adj}(A)$ is regular. It seems simple, but I might be looking in the wrong direction.

Properties which are used:

$\text{adj}(AB) = \text{adj}(B)\text{adj}(A)\:\ (1)$

$\text{adj}(I) = I\:\ (2)$
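Both properties can be spot-checked on small integer matrices (where everything is exact) with a self-contained adjugate built from cofactors; the helpers `det`, `adj`, and `matmul` below are ad hoc sketches, not library functions:

```python
def det(M):
    """Determinant by Laplace expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def adj(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = len(M)
    cof = [[(-1) ** (i + j) * det([r[:j] + r[j + 1:] for k, r in enumerate(M) if k != i])
            for j in range(n)] for i in range(n)]
    return [[cof[j][i] for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Product of two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[1, 2, 0], [3, 1, 4], [2, 0, 1]]
B = [[2, 1, 1], [0, 3, 1], [1, 0, 2]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

assert adj(matmul(A, B)) == matmul(adj(B), adj(A))  # property (1)
assert adj(I) == I                                  # property (2)
```

Of course this only checks the identities on examples; the proofs of (1) and (2) themselves are standard.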

$(\Rightarrow)$

Assuming $A$ is regular.

$AX = XA = I$, where $X$ is its inverse.

$\text{adj}(AX) = \text{adj}(XA) = \text{adj}(I)$

$\text{adj}(X)\text{adj}(A) = \text{adj}(A)\text{adj}(X) = I$, which completes this direction:

$\text{adj}(A)$ is regular and $\text{adj}(X)$ is its inverse.

$(\Leftarrow)$

Assuming $\text{adj}(A)$ is regular.

$\text{adj}(A)X = X\text{adj}(A) = I$, where $X$ is the inverse ($X = \text{adj}(B)$).

According to $(1)$ and $(2)$, I can write this as...

$\text{adj}(BA) = \text{adj}(AB) = \text{adj}(I)$...

Now, here I'm stuck. I do not know whether the adjugate matrix is unique for each matrix, or whether I can just reverse the process to get $BA = AB = I$ and show that $A$ is regular.

  • If $n$ is the size of the (square) matrix, the statement to be proved is false (in the direction $\Leftarrow$) for $n=1$: the adjugate will be $(1)$, hence regular, irrespective of $A$; in particular this is so for $A=(0)$. The case $n=1$ is of course not very interesting, but a complete proof will have to use $n\neq1$ at some point. (2013-08-12)

2 Answers


Use this property $\text{adj}(A)A=A\text{adj}(A)=\det(A)I.$
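In the $2\times 2$ case this identity is easy to verify concretely, since the adjugate has the well-known closed form $\text{adj}\begin{pmatrix}a&b\\c&d\end{pmatrix}=\begin{pmatrix}d&-b\\-c&a\end{pmatrix}$. A minimal sketch (helper names are illustrative):

```python
def adj2(M):
    """Adjugate of a 2x2 matrix: swap the diagonal, negate the off-diagonal."""
    (a, b), (c, d) = M
    return [[d, -b], [-c, a]]

def det2(M):
    """Determinant of a 2x2 matrix."""
    (a, b), (c, d) = M
    return a * d - b * c

def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

A = [[3, 1], [4, 2]]
dI = [[det2(A), 0], [0, det2(A)]]  # det(A) * I, with det(A) = 2

assert matmul2(adj2(A), A) == dI
assert matmul2(A, adj2(A)) == dI
```

Taking determinants of $\text{adj}(A)A=\det(A)I$ then gives $\det(\text{adj}(A))\det(A)=\det(A)^n$, which is the key to both directions.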

  • In the opposite direction, what is proved here is actually "if $A$ is not invertible *and nonzero* then $\operatorname{adj}(A)$ is not invertible". I think the answer should mention this detail, which in fact (correctly) prevents it from providing a "proof" for the case $n=1$. (2013-08-12)

Here's a solution that relies on the notion of rank. Let $A$ be a matrix of size $n\times n$, and let $0\leq r\leq n$ be its rank. Recall that $A\times \text{Adj}(A)=\text{Adj}(A)\times A=\det(A)I$.

The rank of $\text{Adj}(A)$ equals

  • $0$ if $r\leq n-2$,
  • $1$ if $r=n-1$,
  • $n$ if $r=n$.

Indeed, if $r\leq n-2$, then every coefficient of $\text{Adj}(A)$ equals $0$: any $n-1$ columns of $A$ are linearly dependent, and they obviously remain so after a row of coefficients is erased from them, so all cofactors are $0$, and $\text{Adj}(A)=0$.

If $r=n-1$, then $A$ has $n-1$ linearly independent columns; say the first $n-1$ columns are linearly independent. Over a field, the column rank and the row rank coincide, so among these columns there are $n-1$ linearly independent rows, and the $(n-1)\times(n-1)$ determinant these rows and columns define is non-zero; hence at least one cofactor is $\neq 0$, and $\text{Adj}(A)\neq 0$. On the other hand, because $0=\text{det}(A)I=A\times \text{Adj}(A)$, the columns of $\text{Adj}(A)$ all lie in the nullspace of $A$, which is one-dimensional by the rank theorem, so $0<\text{rank}(\text{Adj}(A))\leq 1$.

If $r=n$, then the above equation (and the fact that $\text{det}(A)\neq 0$) tells you that $\text{Adj}(A)$ is invertible, with inverse $\frac{A}{\det(A)}$. This analysis shows that $\text{Adj}(A)$ is invertible iff $A$ is invertible.
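The three rank cases can be illustrated numerically. Here is a self-contained sketch over the rationals for $n=3$; the `det`, `adj`, and `rank` helpers are ad hoc, not library functions:

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def adj(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = len(M)
    cof = [[(-1) ** (i + j) * det([r[:j] + r[j + 1:] for k, r in enumerate(M) if k != i])
            for j in range(n)] for i in range(n)]
    return [[cof[j][i] for j in range(n)] for i in range(n)]

def rank(M):
    """Rank via Gaussian elimination over the rationals (exact, no rounding)."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [M[i][j] - f * M[r][j] for j in range(len(M[0]))]
        r += 1
    return r

full = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]   # rank 3 (det = 5)
mid  = [[1, 0, 0], [0, 1, 0], [0, 0, 0]]   # rank n-1 = 2
low  = [[1, 2, 3], [2, 4, 6], [3, 6, 9]]   # rank 1 <= n-2

assert (rank(full), rank(adj(full))) == (3, 3)
assert (rank(mid),  rank(adj(mid)))  == (2, 1)
assert (rank(low),  rank(adj(low)))  == (1, 0)
```

Examples do not replace the argument, but they make the trichotomy $\operatorname{rank}(\text{Adj}(A))\in\{0,1,n\}$ concrete.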

  • @MarcvanLeeuwen you're right. (2013-08-12)