2

$A$ is a square matrix, and there is a matrix $D$ such that $AD=I$.

I need help to prove either (a) or (b) below:

(a) Show that for each $\vec b$, $A\vec x=\vec b$ has a unique solution.

OR

(b) Show that $A$ is invertible.


For (a), $AD\vec b=I\vec b = \vec b$, so obviously the given equation has at least one solution $\vec x =D\vec b$. But how to show that the solution is unique?

For (b), I'm guessing we should show either that $DA=I$ or that $DA=AD$ (and thus $D=A^{-1}$), but I'm not sure how to do this without assuming that $A$ is invertible, which is exactly what we need to show.
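
To make the setup concrete, here is a toy $2\times 2$ instance I made up myself (it is not from the textbook; any example with $AD=I$ would do): $$A=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad D=\begin{pmatrix}1&-1\\0&1\end{pmatrix},\qquad AD=\begin{pmatrix}1&0\\0&1\end{pmatrix}=I,$$ and for any $\vec b=(b_1,b_2)^T$ the vector $\vec x=D\vec b=(b_1-b_2,\,b_2)^T$ does satisfy $A\vec x=\vec b$. In this example $DA=I$ also happens to hold, which is exactly the fact I don't know how to prove in general.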

  • 1
    Note that by the definition of an _Inverse Matrix_ it is required to be the unique matrix such that: $${\bf A}{\bf A}^{-1}={\bf A}^{-1}{\bf A}={\bf I}$$2012-07-19
  • 0
    What is *your* definition of "invertible matrix"?2012-07-19
  • 0
    @DonAntonio By "invertible matrix", I mean it must be left- *and* right- invertible.2012-07-19
  • 0
    probably duplicate of http://math.stackexchange.com/questions/3852/if-ab-i-then-ba-i2012-07-19

6 Answers

0

Since $AD=I$, the range of $A$ is the full vector space: For any vector $\vec v$ you can find a vector $\vec w$ so that $A\vec w=\vec v$: Just use $\vec w=D\vec v$. Denote the dimension of the vector space with $d$ (I assume we are in a finite-dimensional vector space, because otherwise I would not know how to define a square matrix). Thus the range of $A$ has dimension $d$.

Now assume there are two vectors $\vec v_1\neq\vec v_2$ such that $A\vec v_1=A\vec v_2$. Then by linearity, $A(\vec v_1-\vec v_2)=\vec 0$. Now take the vector $\vec b_1 := \vec v_1-\vec v_2$ (which by assumption is nonzero) and extend it with $d-1$ other vectors to a basis $\vec b_1,\dots,\vec b_d$ of the vector space. Now look at the image of an arbitrary vector $\vec w=\sum_{i=1}^d w_i\vec b_i$: since $\vec b_1$ is, by construction, mapped to $\vec 0$, every vector is mapped to $\sum_{i=2}^d w_i A\vec b_i$, that is, into the span of $A\vec b_2,\dots,A\vec b_d$. But that span can have dimension at most $d-1$, which contradicts the previously shown fact that the range of $A$ is the whole vector space. Thus the assumption $\vec v_1\neq\vec v_2$ but $A\vec v_1=A\vec v_2$ cannot be true, so the solution of $A\vec x=\vec b$ is unique.
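
To see the dimension count in action, here is a small illustration with numbers of my own choosing (it is not needed for the proof): $$A=\begin{pmatrix}1&0\\1&0\end{pmatrix},\qquad A\begin{pmatrix}0\\1\end{pmatrix}=\vec 0,\qquad A\begin{pmatrix}w_1\\w_2\end{pmatrix}=w_1\begin{pmatrix}1\\1\end{pmatrix}.$$ Here $d=2$, the kernel contains the nonzero vector $\vec b_1=(0,1)^T$, and the range is spanned by $(1,1)^T$, so it has dimension $1=d-1<d$; such an $A$ can never satisfy $AD=I$.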

4

Let's use a determinant argument.

$AD=I \Longrightarrow \det(AD)=\det(I) \Longrightarrow \det(A)\det(D)=1.$

If $A$ or $D$ were non-invertible, then $\det(A)=0$ or $\det(D)=0$, so the product $\det(A)\det(D)$ would be $0$. By the above, this cannot be the case.

Therefore, both $A$ and $D$ are invertible and belong to $GL_n(\mathbb{F})$. The inverse is unique, so left-multiplying $AD=I$ by $A^{-1}$ gives $D = A^{-1}$.

Showing part (b) implies part (a): if the inverse $A^{-1}$ exists, then $x=A^{-1}b$ is a solution of $Ax=b$, and any solution satisfies $A^{-1}Ax = A^{-1}b \Longrightarrow x = A^{-1}b$, so the solution is unique.
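
For concreteness, here is a $2\times 2$ instance of this bookkeeping, with numbers picked only for illustration: $$A=\begin{pmatrix}2&1\\1&1\end{pmatrix},\qquad D=\begin{pmatrix}1&-1\\-1&2\end{pmatrix},\qquad AD=I,$$ $$\det(A)\det(D)=(2\cdot 1-1\cdot 1)\,(1\cdot 2-(-1)(-1))=1\cdot 1=1\neq 0,$$ so both matrices are invertible, and left-multiplying $AD=I$ by $A^{-1}$ gives $D=A^{-1}$ (hence also $DA=I$).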

  • 0
    $A$ is not invertible; it only has, by assumption, an inverse on the right. If you assume that invertible matrices form a group whose law is the matrix product, then you implicitly say that $A$ is invertible, which is not clear.2012-07-19
  • 0
    This is nice @Ed, yet I can't understand why you can't deduce that **both** $\,A,D\,$ are invertible after you showed $\,\det A\cdot \det D=1\neq 0\,$...?2012-07-19
  • 0
    Yeah, actually I just noticed that, too. I have updated the response to use this stronger argument. Group theoretic considerations are still used to show uniqueness -- probably not necessary, but illuminating.2012-07-19
  • 0
    @DonAntonio if either $\det(A)$ or $\det(D)$ were equal to zero, then $\det(A)\cdot\det(D)$ could not be $1$ :).2012-07-19
  • 0
    @Ed This is a great proof, and by far the most straightforward, thus the most elegant, IMO. Thank you!2012-07-19
4

Claim: A square matrix $\,A\,$ is invertible iff $\,AD=I\,$ for some (square) matrix $\,D\,$

The proof of the above depends on the following

Proposition: if $\,G\,$ is a group with unit $\,1\,$, $\,x\in G\,$, and $\,xx=x\,$, then $\,x=1\,$.

So, using associativity: $$(DA)(DA)=D(AD)A=DIA=DA\Longrightarrow DA=I=AD$$

Putting $\,D=A^{-1}\,$ gives us the usual notation for the inverse, and the above settles both questions positively: $$(a) \,\,Ax=b\Longrightarrow A^{-1}Ax=A^{-1}b\Longrightarrow x=A^{-1}b$$ $(b)\,$ It's done above.

Added: Following the remarks in the comments, and since we cannot assume any of $\,D,A,DA, AD\,$ is invertible (and thus talking of a group is out of order), I shall try an approach proposed by Rolando (assume the matrices are $\,n\times n\,$):

We're given $\,AD=I\,$. Either in terms of matrices or of linear operators, this means that $\,\dim \operatorname{Im}(AD)=n\,$ (the image is full, i.e. $\,AD\,$ is onto). Now we have a general claim whose proof is elementary:

Claim: If we have functions $\,f:X\to Y\,$, $\,g:Y\to Z\,$, then $\,g\circ f\,$ onto $\,\Longrightarrow\,g\,$ onto.

From this it follows that $\,A\,$ is onto, and for a square matrix $\,A\,$ is onto $\,\Longleftrightarrow \dim \operatorname{Im}(A)=n\Longleftrightarrow \dim\ker A=0\,$ (rank–nullity), which means $\,A\,$ is a bijection and, thus, invertible.
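
For completeness, the one-line proof of the claim (in my own phrasing): $$z\in Z\;\Longrightarrow\;\exists\, x\in X:\,(g\circ f)(x)=z\;\Longrightarrow\; g\bigl(f(x)\bigr)=z\;\Longrightarrow\; z\in\operatorname{Im}(g)\,,$$ so $\,g\,$ is onto. Applied with $\,f=D\,$, $\,g=A\,$ and $\,g\circ f=AD=I\,$ (which is obviously onto), this gives that $\,A\,$ is onto.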

  • 1
    Proposition: in x∈G is a group with unit 1 and xx=1 , then x=1 ????2012-07-19
  • 0
    @Rolando, thanks: that was a stupid typo. I already fixed it.2012-07-19
  • 1
    I think you want to say "if $G$ is a group with unit $1$ and $x\in G$ is such that $xx=x$, then $x=1$.", or something like that.2012-07-19
  • 2
    But to use that you must be aware that the matrix product is a group law for invertible matrices, so you assume that $DA$ is invertible, which is not clear imo.2012-07-19
  • 0
    Well, I do not need $\,DA\,$ to be invertible, but rather each of them, and you're right: the claim, which is painfully easy to prove, cannot be used unless we already know we're in, at least, a monoid (no need of a group for this). Good point.2012-07-19
  • 0
    I guess I shall wait until the OP tells us what is his definition of "invertible matrix".2012-07-19
  • 0
    Assuming that $A$ and $D$ are both invertible is more restrictive. I can't see a direct proof of your statement for monoids.2012-07-19
  • 0
    all these proofs just don't work imo. You have to use a non-trivial result, i.e. the rank nullity theorem.2012-07-19
  • 0
    Proof: Let $\,M\,$ be a monoid with unit $\,1\,$, $\,x\in M\,$, and let $\,x'\in M\,$ be its inverse; then $$x=x\cdot 1=x(xx')=(xx)x'=xx'=1$$ In fact, you can see that in the above it is enough to assume $\,x'\,$ is a right inverse of $\,x\,$.2012-07-19
  • 0
    Ah ok, I understand. Thank you. So I think you should change your statement to: $G$ is a monoid and $x$ has a right inverse; then $x^2=x$ implies $x=1$.2012-07-19
  • 0
    @Rolando, I think you're rushing too much. The proof I provided you is of the claim: "In a monoid where every element has a right inverse, if some element multiplied by itself equals itself, then that element is the monoid's unit". I think this was clear to you but alas, it wasn't. Who knows what you mean by "wrong proof".2012-07-19
  • 0
    @DonAntonio, yes, I understood the statement with the monoids. The fact is that if you want to apply this lemma to the matrix $DA$, you choose that $DA$ lies in the monoid of matrices with multiplication as the law. Of course, not every element of that monoid has a right inverse. So, applying this lemma to $DA$, you assume that it has a right inverse, and I don't see why it should have one. No offense, I just want to understand.2012-07-19
  • 0
    @DonAntonio Hey, can you explain (using first principles) why $BX=B$ implies that $X=I$? Remember, we cannot assume that DA is invertible! How about directly proving (a) without making use of result (b)? (BTW, the second-to-last line has a typo: it is missing "x=" near the end of the sentence. Also, your claim above should contain just "if" instead of "if and only if", because this is what you have proven as well as what I asked for.) Cheers2012-07-19
  • 0
    @Rolando, no: I don't want to use that with the matrices. I was just *point-answering* your doubt about the proposition in my answer, that's all. About the matrices I still have to think over how to mend the very first part using the OP's definition of "invertible matrix", as most text books I know of define it as a left *and* right invertible matrix and then it is shown that both one-sided inverses are the same.2012-07-19
  • 0
    @Rolando, please do check the *added* part in my answer. Any doubt please write back, and thanks for the idea.2012-07-19
  • 0
    @Ryan, I added some lines to my answer avoiding to assume any of $\,DA,A,D\,$ is invertible. As for your question in your last comment: $\,BX=B\Longrightarrow X=I\,$ is true *if* $\,B\,$ is **invertible**, otherwise it can fail big time, e.g. $$\begin{pmatrix}0&2\\0&2\end{pmatrix} \begin{pmatrix}0&1\\0&1\end{pmatrix}= \begin{pmatrix}0&2\\0&2\end{pmatrix}\rlap{\;\;\;/} \Longrightarrow\begin{pmatrix}0&1\\0&1\end{pmatrix}\neq I $$2012-07-19
  • 0
    @DonAntonio I have resolved the issue regarding the presumption of invertibility by using a simple determinant argument in my answer below, which should bring us to your original argument.2012-07-19
  • 0
    @DonAntonio Thanks for completing the proof! It seems perfect now. The argument of ed is also nice.2012-07-19
  • 0
    @DonAntonio Yes, that's what I was trying to point out-- i.e. that your initial proof seemed to invalidly use the assumption that B is invertible. So I was wondering whether you had an alternative explanation for that step. Thanks for your answer.2012-07-19
3

Here is a proof which does not use group theory or vector spaces:

I will establish the equivalence of the statements below. The details are left for you to fill in. Once you have established (b) by the sketch below, establishing (a) is trivial.

The following are equivalent for any square matrix $A$:

  1. $A$ has a left inverse.
  2. $Ax = 0$ has only the trivial solution.
  3. The reduced row echelon form of $A$ is the identity matrix.
  4. $A$ is a product of elementary matrices.
  5. $A$ is invertible.
  6. $A$ has a right inverse.

$1\implies 2$: If $A$ has a left inverse $L$ then $Ax=0 \implies LAx=L0\implies Ix=x=0$.

$2\implies 3$: Since $Ax=0$ has only the trivial solution, Gauss–Jordan elimination reduces the augmented matrix $[A\mid 0]$ to $[I\mid 0]$; in particular, the reduced row echelon form of $A$ is the identity matrix.
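
For instance, with a $2\times 2$ matrix of my own choosing, $A=\begin{pmatrix}1&2\\3&4\end{pmatrix}$: $$\left(\begin{array}{cc|c}1&2&0\\3&4&0\end{array}\right)\xrightarrow{R_2-3R_1}\left(\begin{array}{cc|c}1&2&0\\0&-2&0\end{array}\right)\xrightarrow{-\frac12 R_2}\left(\begin{array}{cc|c}1&2&0\\0&1&0\end{array}\right)\xrightarrow{R_1-2R_2}\left(\begin{array}{cc|c}1&0&0\\0&1&0\end{array}\right),$$ so the coefficient block reduces to $I$ and the only solution is the trivial one.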

$3\implies 4$: Since $E_1\cdots E_kA=I$ and each elementary matrix is invertible (with the inverse also an elementary matrix), we get $A=E_k^{-1}\cdots E_1^{-1}$, a product of elementary matrices.

$4\implies 5$: Each elementary matrix is invertible.

$5\implies 6$: Trivial.

$6\implies 1$: If $R$ is a right inverse of $A$, then $A$ is a left inverse of $R$, so by $1\implies 5$ $R$ is invertible. Multiplying $AR=I$ on the right by $R^{-1}$ gives $A=R^{-1}$, hence $RA=I$, i.e. $R$ is a left inverse of $A$.
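
As a small addendum of mine (not strictly part of the chain above): once (b) is established and $A^{-1}$ exists, the given right inverse is automatically two-sided, which also settles (a): $$DA=(A^{-1}A)(DA)=A^{-1}(AD)A=A^{-1}IA=I,$$ so $D=A^{-1}$ and, for each $b$, the system $Ax=b$ has the unique solution $x=Db$.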

  • 0
    It is such cleverness to invoke the previously established <$1 \implies 5$> within the crucial last step <$6 \implies 1$>! And yes, the fact that your proof involves only beginning linear algebra concepts instead of vector spaces or group theory makes it immensely useful and worthy, IMO. This is precisely the type of first-principles proof I was after, although I really like the simplicity of Ed's determinant argument too.2012-07-19
  • 0
    Does this still work if we omit the initial assumption that A has a left inverse?2012-07-19
  • 0
    This proof shows that all the statements are equivalent. But your problem statement does not make any of those statements. In other words, there is nothing here that guarantees that your matrix A *must* have a left inverse, or that D is the right inverse of A. All it says is that *if* D is the right inverse of A, then A is invertible, so we still need to prove that that is the case.2012-07-19
  • 0
    @Ed For the purpose of answering my original question, <$5 \implies 6$> can be omitted, but 1 is important because <$1 \implies 5$> is used to prove <$6 \implies 1$>. What Shahab has done is to prove part of the "Invertible Matrix Theorem", which contains a set of equivalent statements about the invertibility of A. He has done a better job than my textbook author David Lay, who made a small but critical mistake. This is actually the motivation for my original question. :)2012-07-19
  • 0
    @Ed The supposition is found in my original question: "A is a square matrix, and there is a matrix D such that AD=I." I.e. I was asking for a proof that <$6 \implies 1$>. Mathstackexchange is one of the best things about the internet.2012-07-19
  • 0
    Yes, but the inference that right-multiplying by D and obtaining I is the same thing as saying that D is the right inverse requires the assumption that we are within a group.2012-07-19
  • 0
    @Ed Oh. But throughout Shahab's proof, "X has a right inverse Y" is simply used to mean exactly that "XY=I" (without inferring any other meaning or property). So I'm quite comfortable with it.2012-07-19
0

For any $b$ in your vector space, you have $$AD(b)=I(b)=b.$$ In particular $A$ is surjective (and $D$ is injective). Since, by the rank–nullity theorem, an endomorphism of a finite-dimensional vector space is surjective if and only if it is bijective, $A$ is invertible.
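
Spelling out the rank–nullity step (my own unpacking, assuming $A$ is $n\times n$): $$\dim\ker A+\dim\operatorname{Im}A=n,\qquad A\ \text{surjective}\;\Longrightarrow\;\dim\operatorname{Im}A=n\;\Longrightarrow\;\dim\ker A=0,$$ so $A$ is also injective, hence bijective and invertible.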

0

First, we need to prove that $A$ is invertible. Given $AD=I$, we have $|AD|=1$, so $|A||D|=1$, hence $|A| \neq 0$ and $|D|\neq 0$. Since $|A|\neq 0$, $A$ is invertible.

For the other step, you need to prove that $D=A^{-1}$. One way is the following (see the other answers for alternative approaches):

$AD=I \Rightarrow AD=AA^{-1} \Rightarrow AD-AA^{-1}=0 \Rightarrow A(D-A^{-1})=0\,.$ Since $A$ is invertible, multiplying on the left by $A^{-1}$ gives $D-A^{-1}=0$, i.e. $D = A^{-1}$.

(Note that the invertibility of $A$ is really needed here: for a general nonzero matrix $A$, the equation $AM=0$ does not force $M=0$.)

  • 1
    and what is $A^{-1}$?2012-07-19
  • 2
    @Mhenni In Step 1, we're not allowed to assume that $A^{-1}$ exists. And in Step 4, it is quite possible that the product of two non-zero matrices is zero.2012-07-19
  • 0
    See the last edit.2012-07-21