
Let $A,B$ be $n\times n$ matrices such that $BA = I_n$, where $I_n$ is the identity matrix.
$(1)$ Suppose that there exists an $n\times n$ matrix $C$ such that $AC = I_n$. Using properties of matrix multiplication only (Theorem 2, sec. 2.1, p. 113), show that $B = C$.
$(2)$ Show that if $A, B$ satisfy $BA = I_n$, then a matrix $C$ satisfying $AC = I_n$ exists. Hint: Consider the equation $A\vec x = \vec e_i$, where $\vec e_i$ is an element of the standard basis of $\mathbb{R}^n$. Show that this equation has a unique solution for each $\vec e_i$.
Theorem 2 (the basic properties of matrix multiplication)

Attempt: $(1)$ Ehhm... since $BA=I_n$, $B$ is the inverse of $A$. Same with $C$. And since a matrix has a unique inverse, $B=C$. But I don't see how to prove it using the properties of multiplication only. Maybe something like this:
$$A=B^{-1}I_n=B^{-1}, \qquad A=I_nC^{-1}=C^{-1}, \qquad C^{-1}=B^{-1}, \qquad C=B.$$

So, I am not sure about this one because I am using the inverse of a matrix, but I am told to only use the properties of multiplication. Hints please.

$(2)$ Let's say that $BA=I_3$. Then $\vec e_1=\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix},\ \vec e_2=\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},\ \vec e_3=\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$. I want $AC=I_3$, where $C=\begin{pmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \\ c_{31} & c_{32} & c_{33} \end{pmatrix}$, so in particular $A\begin{pmatrix} c_{11} \\ c_{21} \\ c_{31} \end{pmatrix}= \vec e_1$. I am trying to use the hint here ($A\vec x=\vec e_i$ has a unique solution). How do I show that this system is consistent and has a unique solution? Thanks.

2 Answers


HINTS:

(1) You have $AC = I_n$. What happens if you multiply both sides of the equation on the left by $B$ and use $BA=I_n$?

(2) What happens if you multiply both sides of $Ax=e_i$ on the left by $B$ and use $BA=I_n$? Do you get a solution $x$? Is it unique?

Afterwards you have to assemble all $n$ solutions of the equations $Ax=e_i$ into the matrix $C$. Part (1) then tells you that $C=B$.
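
For a purely numerical illustration (not a proof) of the construction these hints describe, here is a small sketch with numpy; the particular invertible matrix $A$ below is an arbitrary choice for demonstration, and $B$ is computed so that $BA=I_n$ holds, which the problem simply assumes:

```python
import numpy as np

# An invertible 3x3 matrix A, chosen arbitrarily for illustration (assumption).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# A matrix B with BA = I_3; here it is computed numerically, while the
# problem simply assumes such a B exists.
B = np.linalg.inv(A)
assert np.allclose(B @ A, np.eye(3))

# Hint (2): multiplying A x = e_i on the left by B gives x = B e_i,
# so each system A x = e_i has the solution x_i = B e_i.
solutions = [B @ e for e in np.eye(3)]  # e runs over e_1, e_2, e_3

# Assemble the n solutions as the columns of C ...
C = np.column_stack(solutions)

# ... and check that AC = I_3 and, as hint (1) predicts, C = B.
assert np.allclose(A @ C, np.eye(3))
assert np.allclose(C, B)
print("AC = I and C = B (numerically)")
```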


Your statement "since $BA=I_n$, then $B$ is the unique inverse of $A$" is true, but the point of the problem is to walk you through a proof of this statement; i.e. it's not a fact you can take for granted.

Recall that the traditional definition of a matrix inverse requires two equations to hold separately: $AB = I$ and $BA = I$. The fact that you actually only need one of them to hold for the other to be true, with uniqueness, is pretty nontrivial and is not at all obvious from the definition. For example, it's definitely not true for nonsquare matrices: if $A$ is $2\times 3$ and $B$ is $3\times 2$ such that $AB = I_2$, it will not be true that $BA = I_3$ (indeed $BA$ has rank at most $2$).
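
To see this concretely, here is a quick numerical check (the specific matrices are just one convenient choice, not the only one): with the $2\times 3$ matrix $A$ and $3\times 2$ matrix $B$ below, $AB = I_2$ holds, but $BA$ is a $3\times 3$ matrix of rank $2$ and so cannot equal $I_3$.

```python
import numpy as np

# One convenient nonsquare example (assumption: chosen only for illustration).
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # 2 x 3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])        # 3 x 2

assert np.allclose(A @ B, np.eye(2))       # AB = I_2
print(B @ A)                               # BA has a zero row and column, so BA != I_3
assert not np.allclose(B @ A, np.eye(3))
```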