
I had my linear algebra exam today, and one of the questions was the following.

Given $ A \in \mathbb{R}^{n \times n}$, prove that:

$$\mathrm{adj}(\mathrm{adj}(A)) = (\mathrm{det}(A))^{n-2} \cdot A.$$

Of course, I was not able to prove this identity; otherwise I wouldn't be posting it here. But I'm still curious how one can prove it.

Could someone point me in the right direction?

2 Answers


We use the identities
$$\tag 1\operatorname{adj}(A)\cdot A=A\cdot\operatorname{adj}(A)=\det A \cdot I_n$$ and $$\tag 2\operatorname{adj}(AB)=\operatorname{adj}(B)\cdot \operatorname{adj}(A).$$ Since $\operatorname{adj}(c\,I_n)=c^{n-1}I_n$, identity (1) gives $$\operatorname{adj}(\operatorname{adj}(A)\cdot A)=\operatorname{adj}(\det A\cdot I_n)=(\det A)^{n-1}\cdot I_n,$$ while (2) rewrites the left-hand side, so $$\operatorname{adj}(A)\cdot \operatorname{adj}(\operatorname{adj}(A))=(\det A)^{n-1}I_n.$$ Multiplying on the left by $A$ and using (1) again, we get $$\det A \cdot \operatorname{adj}(\operatorname{adj}(A))=(\det A)^{n-1}\cdot A.$$ If $\det A\neq 0$, dividing by $\det A$ yields the desired equality; if $\det A=0$ and $n\geq 2$, one checks separately that $\operatorname{adj}(\operatorname{adj}(A))=0$ (see the comments below).
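The identity can also be verified numerically. Below is a minimal sketch in Python, using exact rational arithmetic from the standard-library `fractions` module; the helper names `det` and `adj` are my own, with the adjugate computed as the transpose of the cofactor matrix:

```python
from fractions import Fraction

def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def adj(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = len(M)
    C = [[(-1) ** (i + j) * det([row[:j] + row[j+1:]
           for k, row in enumerate(M) if k != i])
          for j in range(n)] for i in range(n)]
    return [[C[j][i] for j in range(n)] for i in range(n)]  # transpose

# A concrete invertible 3x3 example.
A = [[Fraction(x) for x in row] for row in [[2, 1, 0], [1, 3, 1], [0, 1, 2]]]
d = det(A)                       # det(A) = 8, so A is invertible
lhs = adj(adj(A))                # adj(adj(A))
rhs = [[d ** (3 - 2) * a for a in row] for row in A]  # det(A)^{n-2} * A
print(lhs == rhs)                # True
```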

  • 0
Alright, I thought about $A^{-1} = \frac{\operatorname{adj}(A)}{\operatorname{det}A}$ but forgot about $\operatorname{adj}(AB) = \operatorname{adj}(B) \cdot \operatorname{adj}(A)$. Thank you, now that I see it, it seems quite easy. Damn, missed out on 10 points! (2011-12-19)
  • 0
@Ief2 You can combine that formula with $(A^{-1})^{-1}=A$, and it works; of course you also need to treat the case $\det(A)=0$ separately. It works since $A= (A^{-1})^{-1} = \frac{\operatorname{adj}(A^{-1})}{\det(A^{-1})}$, and $\operatorname{adj}(A^{-1})= \operatorname{adj}\!\left(\frac{\operatorname{adj}(A)}{\det(A)}\right) = \frac{1}{\det(A)^{n-1}} \operatorname{adj}(\operatorname{adj}(A))$... But this is exactly the same proof, written in a more complicated way. (2011-12-19)
  • 1
@N.S. Thank you for pointing that out. I think my exam question stated that $A$ was an invertible matrix. But if that were not the case and $\operatorname{det}A = 0$, could I just substitute the $0$ and conclude that $\operatorname{adj}(\operatorname{adj}A) = 0$ only if $A = 0$? I doubt it, because if $A$ consists of $2$'s only, its adjugate matrix will also be zero, as will be $A$'s determinant. (2011-12-19)
  • 2
@Ief2 If $A$ is not invertible, the relation to prove becomes $\operatorname{adj}(\operatorname{adj}(A))=0$, which is true. It is a little harder to prove, but can probably be proven the following way. Step 1: if $\operatorname{rank}(A) \leq n-2$, then $\operatorname{adj}(A) =0_n$. Step 2: since $A \cdot \operatorname{adj}(A)=0$, if $n \geq 3$ then either $\operatorname{rank}(A) \leq n-2$ or $\operatorname{rank}(\operatorname{adj}(A)) \leq n-2$. This would solve the problem in the case $n \geq 3$; the rest is simple... I am sure there is a much simpler solution... (2011-12-20)
  • 0
@N.S. Thank you for your comments. It clears things up. (2011-12-20)

The equality holds over any commutative ring.

Short proof. It suffices to show that the equality holds for diagonal matrices, which is straightforward.

Slightly longer proof. Let $$ (a_{ij})_{i,j=1}^n $$ be indeterminates. It is enough to check that the equality holds for the matrix $$ A\in M_n(\mathbb Q(a_{11},\dots,a_{nn})) $$ whose $(i,j)$ entry is $a_{ij}$. But this is clear since $A$ is semi-simple (hence diagonalizable over an algebraic closure, reducing to the diagonal case).

More details.

Why does it suffice to check the equality in this particular case?

Let $B$ be in $M_n(K)$, where $K$ is a commutative ring. The statement we must prove says that a certain matrix $F(B)$, depending on $B$, is zero. But each entry of $F(B)$ is a polynomial in the entries of $B$, and the coefficients of this polynomial are integers depending only on $n$. A polynomial identity with integer coefficients that holds over $\mathbb Q(a_{11},\dots,a_{nn})$ therefore holds over every commutative ring, by specializing the indeterminates to the entries of $B$.

Why is $A$ semi-simple?

Because the discriminant of its characteristic polynomial is nonzero.
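The polynomial nature of the identity is easy to see in the smallest case, $n=2$: with $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ one computes
$$\operatorname{adj}(A)=\begin{pmatrix}d&-b\\-c&a\end{pmatrix},\qquad \operatorname{adj}(\operatorname{adj}(A))=\begin{pmatrix}a&b\\c&d\end{pmatrix}=(\det A)^{0}\,A,$$
and every entry on both sides is a polynomial with integer coefficients in $a,b,c,d$, so the same equality holds when the entries are specialized into any commutative ring. (For $n=2$ the exponent $n-2=0$ is read as $(\det A)^0=1$, even when $\det A=0$.)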

  • 0
Ok, maybe this level of maths is a little bit too high for me. But you say that this equality holds over a commutative ring. I believe you when you say that, but that's not the only case, right? In general, square matrices don't form a commutative ring, right? (2011-12-20)
  • 1
Dear Ief2: There is a misunderstanding about this commutative ring stuff. You asked the question for a **real** matrix. But neither Davide's answer nor mine uses the assumption that the entries of the matrix are real. This shows that the statement holds for matrices with entries in **any** commutative ring. I agree that $n$ by $n$ matrices with entries in a commutative ring form a **non**-commutative ring. A slightly different way of phrasing the argument I gave is this... (2011-12-20)
  • 0
... Let's work over the complex numbers. Your equality holds for diagonalizable matrices, which are dense, and both sides of the equality are continuous in $A$. (2011-12-20)
  • 0
Alright, now I know what you meant when talking about the commutative ring. As for the rest of your answer, I'll certainly read it again when I'm a little more educated in math. I think I'm just not ready for it yet :). (2011-12-20)