
Let $A, B \in K^{n\times n}$ such that $AB$ is a diagonal matrix whose diagonal entries are all non-zero.

Is $A$ invertible?

Since $AB$ is a diagonal matrix with non-zero diagonal entries, $AB$ is invertible. Also, neither $A$ nor $B$ can contain a zero row or column, or $AB$ would too, which it doesn't. However, not containing zero rows or columns is not sufficient for a matrix to be invertible, right?

Please be aware that I don't know much about linear algebra yet, so a thorough explanation would be much appreciated.

  • $A(B(AB)^{-1})=I$, done. (2012-11-18)
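The one-line comment above can be checked numerically: if $AB$ is invertible, then $B(AB)^{-1}$ is a right inverse of $A$. A minimal NumPy sketch (the particular matrices are made-up illustrations, chosen so that $AB$ comes out diagonal with non-zero diagonal entries):

```python
import numpy as np

# Illustrative 2x2 example: pick an invertible A and a diagonal D with
# non-zero entries, then choose B so that A @ B == D.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
D = np.diag([2.0, 3.0])          # plays the role of AB
B = np.linalg.solve(A, D)        # B = A^{-1} D, hence A @ B == D

right_inverse = B @ np.linalg.inv(A @ B)   # the candidate B (AB)^{-1}

# A times that candidate is the identity, so A is invertible:
print(np.allclose(A @ right_inverse, np.eye(2)))   # True
```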

3 Answers


Yes, both $A$ and $B$ are invertible, provided they are square in the first place. More generally, $A$ and $B$ both have full rank, since $$\operatorname{rank}(AB) \leq \min \{\operatorname{rank}(A), \operatorname{rank}(B)\}.$$ Since $AB$ is diagonal with non-zero diagonal entries, $\operatorname{rank}(AB) = n$, so both factors have rank $n$. If $A \in \mathbb{R}^{n \times m}$ and $B \in \mathbb{R}^{m \times n}$, this forces $m \geq n$. If $m = n$, we can also conclude that $A$ and $B$ are invertible.
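The non-square case described above can be illustrated concretely. In this made-up example with $m = 3 > n = 2$, the product $AB$ is diagonal with non-zero entries, so both rectangular factors must have full rank $n$ even though neither is invertible:

```python
import numpy as np

# Hypothetical rectangular factors (values are made up): A is 2x3, B is 3x2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # rank 2
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])               # rank 2
AB = A @ B                               # 2x2, equals diag(1, 1)

# rank(AB) <= min(rank(A), rank(B)); here rank(AB) = n = 2 forces
# rank(A) = rank(B) = 2, i.e. full rank, although m = 3 > n = 2.
print(np.linalg.matrix_rank(AB),
      np.linalg.matrix_rank(A),
      np.linalg.matrix_rank(B))   # 2 2 2
```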


$$AB = \Lambda$$ Since $\Lambda$ is a diagonal matrix with non-zero diagonal entries, it is invertible: $$\left( AB \right)^{-1} = \Lambda^{-1}$$ If a product is invertible, then each factor is invertible as well, so $$\left( AB \right)^{-1} = B^{-1}A^{-1} = \Lambda^{-1},$$ and therefore $A^{-1} = B\Lambda^{-1}$.

  • My prof's script contains this rule with the constraint that $A, B \in GL(n, K)$. Is this really true for all matrices in $K^{n\times n}$? (2012-11-18)
  • @Kaster, your proof assumes what must be proved, i.e. that $\,A\,$ is invertible. To write $\,(AB)^{-1}=B^{-1}A^{-1}\,$ is meaningless *unless* we know a priori that both $\,A,B\,$ are invertible. (2012-11-18)
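Whatever one thinks of the derivation, the final identity $A^{-1} = B\Lambda^{-1}$ itself is easy to sanity-check numerically. A small sketch (the matrices below are arbitrary made-up choices arranged so that $AB = \Lambda$):

```python
import numpy as np

# Arrange A @ B == Lam for a diagonal Lam with non-zero entries.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
Lam = np.diag([4.0, 5.0])        # the diagonal product AB
B = np.linalg.solve(A, Lam)      # B = A^{-1} Lam, hence A @ B == Lam

A_inv_candidate = B @ np.linalg.inv(Lam)   # B Lam^{-1}

# B Lam^{-1} agrees with the actual inverse of A:
print(np.allclose(A_inv_candidate, np.linalg.inv(A)))   # True
```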

Use the determinant:

$$AB=\begin{pmatrix} a_1&0&\cdots&0\\ 0&a_2&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&a_n\end{pmatrix}\,\,,\,a_i\neq 0\Longrightarrow \det A\cdot\det B=\det AB=\prod_{i=1}^na_i\neq 0\Longrightarrow$$

$$\Longrightarrow \det A\neq 0\Longrightarrow \,\,A\;\;\text{ is regular and thus invertible}$$
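The determinant argument above can be verified on a small made-up example: $\det A \cdot \det B = \det AB$ equals the product of the diagonal entries, which is non-zero, so $\det A \neq 0$.

```python
import numpy as np

# Made-up example arranged so that A @ B equals a diagonal D.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])            # det A = 3
D = np.diag([6.0, -2.0])              # product of diagonal entries: -12
B = np.linalg.solve(A, D)             # ensures A @ B == D

lhs = np.linalg.det(A) * np.linalg.det(B)   # det A * det B
rhs = np.prod(np.diag(D))                   # prod of diagonal entries of AB

# The multiplicativity of det forces det A != 0:
print(np.isclose(lhs, rhs), rhs != 0)   # True True
```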

  • Thanks for your answer, though I'll have to do a bit of reading to understand it, because we haven't covered the determinant yet. (2012-11-18)
  • Well, then re-take your argument: a matrix is non-invertible (i.e. singular) iff upon reducing it (by rows or columns, it doesn't matter) we get at some point one row (column) of zeros. If this were the case for $\,A\,$, then in $\,AB\,$ (after reducing $A$) we'd get one row (column) of zeros, which is impossible since, as you pointed out, $\,AB\,$ is invertible. (2012-11-18)
  • Makes sense. I wasn't exactly sure whether I could assume $A$ to be reduced already. So this also means that if $A$ had linearly dependent columns (and could thus be reduced to have zero columns), the columns of $AB$ would also be linearly dependent, right? (2012-11-18)
  • Indeed so, @Christian (2012-11-18)
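The point settled in the comments above — linearly dependent columns in $A$ force linearly dependent columns in $AB$ — can also be checked numerically. A made-up example where $A$ has rank $1$:

```python
import numpy as np

# A with linearly dependent columns (second column = 2 * first), rank 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])       # an arbitrary 2x2 matrix

AB = A @ B

# rank(AB) <= rank(A) = 1 < 2, so the columns of AB are dependent too,
# and AB can never be a diagonal matrix with non-zero diagonal entries.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(AB))   # 1 1
```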