6

Proving that a right (or left) inverse of a square matrix is unique using only basic matrix operations

-- i.e. without any reference to higher-order matters like rank, vector spaces, or whatever :).

More precisely, armed with the knowledge only of:

  • the rules of matrix equality, addition, and multiplication, the distributive law, and friends
  • Gauss-Jordan elimination, and the solution cases for a linear system read off from its reduced row-echelon form

Thanks in advance.

  • 0
    @Arturo That's what I was thinking to try, thanks! (Once again you help me :) ) – 2012-02-17

3 Answers

6

So, let us suppose that $A$ is a square matrix, and that $B$ is a matrix such that $BA=I$; that is, $B$ is a left inverse of $A$. You want to show that $B$ is the unique left inverse of $A$.

Note that a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $B\mathbf{b}$: if $A\mathbf{x}=\mathbf{b}$, then $\mathbf{x} = I\mathbf{x} = BA\mathbf{x} = B\mathbf{b}.$
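
Not part of the argument, but here is a quick numerical sanity check of this step (a sketch with made-up matrices; `numpy` is assumed, and for a square $A$ the left inverse used is just $A^{-1}$):

```python
import numpy as np

# Made-up example: a square A and a left inverse B (here simply B = A^{-1}).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)           # satisfies B @ A = I

b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)      # a solution of A x = b

# x = I x = (B A) x = B (A x) = B b, so the solution is forced to be B b:
assert np.allclose(B @ A, np.eye(2))
assert np.allclose(x, B @ b)
```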

If $CA=I$, then again a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $C\mathbf{b}$. Thus, $B\mathbf{b}=C\mathbf{b}$ for any $\mathbf{b}$ for which the system has a solution.

If we can show that $A\mathbf{x}=\mathbf{e}_i$ has a solution for each $i$, where $\mathbf{e}_i$ is the $i$th standard basis vector ($1$ in the $i$th entry, $0$s elsewhere), this will show that $B=C$, since their corresponding columns $B\mathbf{e}_i$ and $C\mathbf{e}_i$ agree.

The system $A\mathbf{x}=\mathbf{0}$ always has the solution $\mathbf{x}=\mathbf{0}$, and by the above it has at most one solution, namely $B\mathbf{0}=\mathbf{0}$; so the homogeneous system has only the trivial solution. For a square matrix this means every column of the reduced row-echelon form has a pivot (no free variables), so the reduced row-echelon form of $A$ is $I$. Because the reduced row-echelon form of $A$ is $I$, performing row reduction on the augmented matrix $[A\mid\mathbf{e}_i]$ yields $[I\mid\mathbf{y}]$ for some $\mathbf{y}$, and $\mathbf{y}$ is a solution to $A\mathbf{x}=\mathbf{e}_i$. Since this vector equals both $\mathbf{b}_i=B\mathbf{e}_i$ (the $i$th column of $B$) and $\mathbf{c}_i=C\mathbf{e}_i$ (the $i$th column of $C$), as noted above, the $i$th columns of $B$ and $C$ are equal; thus $B=C$, and $A$ has a unique left inverse.
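
The row reduction can be done for all the $\mathbf{e}_i$ at once by reducing $[A \mid I]$. Below is a small illustration of this step (the `gauss_jordan` helper is my own hypothetical sketch, not a standard routine, and the matrices are made up):

```python
import numpy as np

def gauss_jordan(M):
    """Reduce M to reduced row-echelon form (partial pivoting)."""
    M = M.astype(float)
    rows, cols = M.shape
    r = 0
    for c in range(cols):
        if r == rows:
            break
        p = r + np.argmax(np.abs(M[r:, c]))   # best pivot in column c
        if np.isclose(M[p, c], 0.0):
            continue                          # no pivot here; free column
        M[[r, p]] = M[[p, r]]                 # swap the pivot row up
        M[r] /= M[r, c]                       # scale the pivot to 1
        for i in range(rows):                 # clear column c elsewhere
            if i != r:
                M[i] -= M[i, c] * M[r]
        r += 1
    return M

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)                          # a left inverse: B A = I

# Since RREF(A) = I, reducing [A | I] leaves the solutions y_i on the right:
reduced = gauss_jordan(np.hstack([A, np.eye(2)]))
Y = reduced[:, 2:]                            # column i solves A x = e_i
assert np.allclose(A @ Y, np.eye(2))
assert np.allclose(Y, B)                      # the columns agree with B
```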

Now, let us suppose that $A$ is a square matrix and has a right inverse, $AB=I$. We want to show that $B$ is the unique right inverse of $A$. Taking transposes, we get $I = I^T = (AB)^T = B^TA^T$. By what was proven above, $B^T$ is the unique left inverse of $A^T$. If $AC=I$, then $C^TA^T=I^T = I$, so $C^T=B^T$, hence $C=B$. Thus, $B$ is the unique right inverse of $A$.
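
Again just as a numerical sanity check of the transpose step (made-up matrix, `numpy` assumed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)                      # here A B = I as well

assert np.allclose(A @ B, np.eye(2))
assert np.allclose(B.T @ A.T, np.eye(2))  # (A B)^T = B^T A^T = I
```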

2

Suppose you have square matrices $A$ and $B$ with $BA = I$. To show that $B$ is the unique left inverse of $A$, it suffices to show that $A$ is invertible, because then $B$ is the unique inverse. (If $C$ is an inverse for $A$, we have $B = B(AC) = (BA)C = C$.)

The equation $Ax = 0$ has a solution, namely $x = 0$, and for every vector $v$ with $Av = 0$ we have $v = BAv = 0$; hence the equation $Ax = 0$ has only the trivial solution. This shows that applying Gauss-Jordan elimination to $A$ gives the identity matrix: there are elementary matrices $E_1, \dots, E_k$ with $E_k \cdots E_1 A = I$, so $A = E_1^{-1} \cdots E_k^{-1}$ is a product of elementary matrices, and these are all invertible. This shows that $A$ is invertible and we are done.
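
To make the last step concrete: recording each row operation as an elementary matrix $E_k$ gives $E_k \cdots E_1 A = I$, hence $A = E_1^{-1} \cdots E_k^{-1}$. The sketch below (my own hypothetical `elementary_factors` helper, with a made-up matrix) does exactly this and reassembles $A$:

```python
import numpy as np

def elementary_factors(A):
    """Row-reduce A to I, returning the elementary matrices used, in order."""
    A = A.astype(float)
    n = A.shape[0]
    ops = []
    for c in range(n):
        p = c + np.argmax(np.abs(A[c:, c]))
        assert not np.isclose(A[p, c], 0.0)  # A is assumed invertible
        if p != c:                           # swap rows c and p
            E = np.eye(n); E[[c, p]] = E[[p, c]]
            A = E @ A; ops.append(E)
        E = np.eye(n); E[c, c] = 1.0 / A[c, c]   # scale the pivot to 1
        A = E @ A; ops.append(E)
        for i in range(n):                   # clear the rest of column c
            if i != c and not np.isclose(A[i, c], 0.0):
                E = np.eye(n); E[i, c] = -A[i, c]
                A = E @ A; ops.append(E)
    return ops

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
ops = elementary_factors(A)

# E_k ... E_1 A = I, so A = E_1^{-1} ... E_k^{-1}, a product of
# elementary matrices (inverses of elementary matrices are elementary):
product = np.eye(2)
for E in ops:
    product = product @ np.linalg.inv(E)
assert np.allclose(product, A)
```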

1

HINT Show that $I_m = AA^{-1} = A^{-1}A = I_n$; if $m \neq n$, then the two identity matrices have different dimensions... so $AA^{-1}$ and $A^{-1}A$ cannot even be equal!
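
A quick shape check of the point (hypothetical non-square example, `numpy` assumed):

```python
import numpy as np

A = np.ones((2, 3))     # a 2x3 matrix
B = np.ones((3, 2))     # any candidate inverse must be 3x2

print((A @ B).shape)    # (2, 2) -- would have to equal I_2
print((B @ A).shape)    # (3, 3) -- would have to equal I_3
# I_2 and I_3 have different sizes, so A B and B A can never both
# be "the" identity; a non-square matrix has no two-sided inverse.
```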

  • 0
    Huh, that's nice! :) – 2013-11-25