
Prove that for every matrix $A$ in $K^{n\times n}$, where $K$ is a field, there exists a $B$ in $K^{n\times n}$ such that $AB = BA = (\det A) \times I$ ($I$ denotes the unit matrix).

Later edit: Sure, for $\det A \ne 0$, $B = A^*$ satisfies the equalities, where $A^*$ is the adjugate (classical adjoint) matrix of $A$. I would be interested in finding a $B \ne O_n$ when $\det A = 0$.

  • 0
    Use of elementary abstract algebra would be appreciated. Thanks:)2011-12-20
  • 5
    Do you know about [adjugate of a matrix](http://en.wikipedia.org/wiki/Adjugate_matrix)?2011-12-20
  • 0
    Yes. I updated my question.2011-12-20
  • 1
    The identity $\operatorname{adj} A \cdot A = A \cdot \operatorname{adj} A = (\det A) \cdot I$ holds for all matrices $A$, nonsingular or singular (and is a simple consequence of Cramer's rule). What's special about a nonsingular matrix $A$ is that this identity allows you to express the inverse of the matrix in terms of the adjugate: $$ A^{-1} = \frac{1}{\det A} \operatorname{adj} A. $$ It is this part that fails when you have $\det A = 0$. But your question does not relate to this.2011-12-20
  • 0
    @Srivatsan: I think you can turn your comments to answer.2011-12-20
  • 1
    Only problem is, sometimes $\operatorname{adj} A$ is $O_n$: for example, if $A = (a_{ij})$ with $a_{ij} = 1$ for all $i, j$ (and $n \ge 3$). And $B \neq O_n$ is crucial for me when $\det A = 0$. What I actually mean to prove here is that $A$ is a zero divisor. (A concrete check of this example appears in the sketch after these comments.)2011-12-21
  • 0
    Just a minor remark: the quest for $B \neq 0$ is hopeless when $n = 0$; and in that case $A$ is not even singular!2011-12-21
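
For a concrete check of the two points above, here is a small SymPy sketch (the $3 \times 3$ all-ones matrix is just an illustrative choice): the adjugate identity holds even though $A$ is singular, but here the adjugate itself is the zero matrix, so it cannot serve as the desired non-zero $B$.

```python
# Sanity check with SymPy: the adjugate identity holds for a singular A,
# but the adjugate of the 3x3 all-ones matrix is the zero matrix.
import sympy as sp

A = sp.Matrix(3, 3, lambda i, j: 1)      # all entries equal to 1, so det A = 0
adjA = A.adjugate()

assert adjA * A == A.det() * sp.eye(3)   # adj(A) * A = (det A) * I
assert A * adjA == A.det() * sp.eye(3)   # A * adj(A) = (det A) * I
assert adjA == sp.zeros(3)               # here the adjugate itself vanishes
```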

2 Answers

1

When $A$ is nonsingular, the adjugate matrix $\operatorname{adj} A$ already works. However, if $A$ is singular, this approach fails because it is possible that $\operatorname{adj} A = 0$. Here's one alternate approach for this case.

Since the space of $n \times n$ matrices is finite-dimensional, the infinite list of matrices $\{ I, A, A^2, \ldots \}$ is linearly dependent. Pick a linear dependence $$ a_m A^{m} + a_{m-1} A^{m-1} + \cdots + a_{1} A^1 + a_0 I = 0 $$ (with $a_m \neq 0$) among the powers of $A$ such that $m$ is as small as possible, and define $$B = a_m A^{m-1} + a_{m-1} A^{m-2} + \cdots + a_{1} I . \tag{$\dagger$}$$

Now, first note that $-a_0 I = A B = B A$. Taking determinants on both sides and using $\det A = 0$, we get $(-1)^n a_0^n = \det A \cdot \det B = 0$. So $a_0 = 0$, and hence $AB = BA = 0$. Finally, $(\dagger)$ says that $B$ is a linear combination of $I, A, \ldots, A^{m-1}$ with $a_m \neq 0$; if $B$ were zero, this would be a linear dependence among powers of $A$ of degree less than $m$, contradicting our choice of $m$. Hence $B \neq 0$.
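
Here is a minimal SymPy sketch of this construction; the helper name `annihilator_B` and the test matrix are illustrative choices, and the shortest dependence is found by a null-space computation rather than by hand.

```python
# A sketch of the construction above: find the shortest linear dependence
# among I, A, A^2, ... and read off B from its coefficients.
import sympy as sp

def annihilator_B(A):
    """For a singular square matrix A, return a non-zero B with
    A*B == B*A == 0, following the minimal-dependence argument."""
    n = A.rows
    powers = [sp.eye(n)]                          # I, A, A^2, ...
    while True:
        # Flatten the powers into columns and look for a linear dependence.
        M = sp.Matrix.hstack(*[P.reshape(n * n, 1) for P in powers])
        null = M.nullspace()
        if null:
            coeffs = null[0]                      # a_0, a_1, ..., a_m, m minimal
            break
        powers.append(powers[-1] * A)
    # B = a_m A^{m-1} + ... + a_1 I ; the a_0 I term is dropped (a_0 = 0 anyway).
    B = sp.zeros(n)
    for k in range(1, len(powers)):
        B += coeffs[k] * powers[k - 1]
    return B

A = sp.Matrix([[1, 1], [1, 1]])                   # singular: det A = 0
B = annihilator_B(A)
assert B != sp.zeros(2)
assert A * B == sp.zeros(2) and B * A == sp.zeros(2)
```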

  • 0
    I'm afraid I have no knowledge of finite-dimensional spaces, nor do I understand the idea before 'why?'.2011-12-21
  • 0
    @MihaiBogdan The "why?" is easy: $$ a_0 I = - A \cdot (a_m A^{m-1} + a_{m-1} A^{m-2} + \cdots + a_1 I) .$$ Now take determinants on both sides. The right side is zero (because $\det A = 0$), whereas the left side is $a_0^n$. So $a_0 = 0$.2011-12-21
  • 0
    Ok. I get this part now.2011-12-21
  • 0
    For the other part, I am using rudimentary [linear algebra](http://en.wikipedia.org/wiki/Linear_algebra). I am not sure if the answer can be rewritten to avoid it, but let me think for some time.2011-12-21
  • 0
    I understand now! Thank you so much!2011-12-21
2

Here is another way of producing a non-zero $B$ when $A$ is singular.

Let $x$ be a non-zero column vector such that $Ax = 0$. Since the transpose $A^\mathrm T$ is also singular, we can find a non-zero $y \in K^n$ such that $A^\mathrm T y = 0$, i.e. $y^\mathrm T A = 0$. Then $B = xy^\mathrm T$ works: $AB = (Ax)y^\mathrm T = 0$ and $BA = x(y^\mathrm T A) = 0$, and $B \neq 0$ because if $x_i$ and $y_j$ are non-zero, then the $ij$-th entry of $B$ is $x_iy_j \neq 0$. [Many thanks to Srivatsan for showing me that this is enough.]
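
Here is a small SymPy sketch of this rank-one construction; the example matrix is just an illustrative choice.

```python
# Rank-one construction B = x * y^T from null vectors of A and A^T.
import sympy as sp

A = sp.Matrix([[1, 2], [2, 4]])   # singular: the second row is twice the first
x = A.nullspace()[0]              # non-zero x with A x = 0
y = A.T.nullspace()[0]            # non-zero y with A^T y = 0, i.e. y^T A = 0
B = x * y.T                       # a rank-one matrix, hence non-zero

assert B != sp.zeros(2)
assert A * B == sp.zeros(2) and B * A == sp.zeros(2)
```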