
Some background: I was in class today, and the professor was proving the following: given an $n$ by $m$ matrix $A$ and a matrix $B$ such that $AB=I$ (so $B$ is necessarily $m$ by $n$ and $I$ is $n$ by $n$), if it is also the case that $BA=I$, then $A$ must be square (that is, $m=n$). We didn't manage to get through the proof, however, because we couldn't find a reason why $A$ actually had to be square in that case: it seemed to me that we would simply have $BA=I$ where this $I$ is $m$ by $m$. So in both cases we would get an identity matrix, just of different sizes. (That sounds strange, but as far as I recall, as long as the product yields an identity, $B$ counts as an inverse, even if the size of the identity depends on whether you multiply from the left or from the right.)

I was thinking of something along the lines of overdetermined systems: if $A$ is invertible from the right (and not square), the corresponding system has to have more equations than variables. But that did not lead me anywhere. Thanks!

  • @Arturo: whoops. I slightly misread the question. (I was under the impression that the question was asymmetric in $m$ and $n$.) – 2011-01-17

3 Answers


Here is another nice exercise which also yields the desired conclusion: let $A$ be an $n\times m$ matrix and $B$ an $m \times n$ matrix. If $m < n$, then $\det(AB) = 0$. (The argument is pretty much the same as in Arturo's answer: all you need is that the image of a linear map has at most the dimension of the domain.)
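
A sketch of why the exercise settles the question (this elaboration is mine, not part of the original answer): every column of $AB$ is a linear combination of the columns of $A$, so

$$\operatorname{rank}(AB) \;\le\; \operatorname{rank}(A) \;\le\; m.$$

Hence if $m < n$, the $n\times n$ matrix $AB$ is singular and $\det(AB) = 0$, contradicting $AB = I_n$ (which has determinant $1$); so $m \ge n$. Swapping the roles of $A$ and $B$ and using $BA = I_m$ gives $n \ge m$, so $m = n$.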


HINT $\ \ $ $n \;=\; \operatorname{trace}(AB) \;=\; \operatorname{trace}(BA) \;=\; m$
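
Spelling the hint out (a short sketch of my own, using the entrywise formula for the trace):

$$\operatorname{trace}(AB) \;=\; \sum_{i=1}^{n}\sum_{j=1}^{m} a_{ij}\,b_{ji} \;=\; \sum_{j=1}^{m}\sum_{i=1}^{n} b_{ji}\,a_{ij} \;=\; \operatorname{trace}(BA),$$

and $\operatorname{trace}(I_k) = k$, so $AB = I_n$ and $BA = I_m$ give $n = m$.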

  • Almost a "proof without words"! – 2012-02-21

Assuming your matrices have real coefficients, you can use the Rank-Nullity Theorem, which is essentially what you are talking about. All you need to know is that a homogeneous system of linear equations with more unknowns than equations always has nontrivial solutions.

Suppose that $A$ is $n\times m$, $B$ is $m\times n$, and $AB=I_n$, $BA=I_m$.

Consider the system of equations $A\mathbf{x}=\mathbf{0}$, where $\mathbf{x}$ is the column vector of $m$ unknowns. Any solution $\mathbf{s}$ must equal $\mathbf{0}$: if $A\mathbf{s}=\mathbf{0}$, then $\mathbf{0} = B\mathbf{0} = B(A\mathbf{s}) = (BA)\mathbf{s} = I_m\mathbf{s}=\mathbf{s}.$ So the only solution to $A\mathbf{x}=\mathbf{0}$ is the trivial one, which tells you that there are at most as many unknowns as equations. Since $A\mathbf{x}=\mathbf{0}$ has $n$ equations and $m$ unknowns, you get $n\geq m$.

Now, repeat the argument with the system $B\mathbf{x}=\mathbf{0}$; multiplying by $A$ on the left you conclude that the system has at least as many equations as unknowns, because the trivial solution is the only solution. So $m\geq n$.

Putting the two together you get $n=m$.
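
To see the asymmetry concretely, here is a small numerical sanity check (my own sketch in NumPy; the particular matrix is just an illustrative choice, not from the answer): a $3\times 2$ matrix can have a left inverse, but by the argument above it cannot also have a right inverse.

```python
import numpy as np

# Illustrative example (not from the answer): A is 3x2 (n = 3, m = 2)
# with orthonormal columns, and B is its transpose.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = A.T  # 2x3

print(B @ A)  # equals I_2, so B is a left inverse of A
print(A @ B)  # diag(1, 1, 0), NOT I_3 -- A has no right inverse
```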

(Notice that you do not need to assume at the outset that the same matrix $B$ works on both sides. If you have $m\times n$ matrices $B$ and $C$ such that $CA=I_m$ and $AB=I_n$, then $C=CI_n = C(AB) = (CA)B = I_mB = B$; or you can run the argument above with $C$ in the first part instead of $B$ to conclude $m=n$ first, and then conclude that $C=B=A^{-1}$.)

(The argument actually breaks down if your matrices have coefficients in more general rings; in particular, if you allow the entries to come from noncommutative rings, then you can have nonsquare matrices $A$ and $B$ such that $AB$ and $BA$ are different-sized identities.)
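
One standard construction of such an example (my own addition, sketched; this is the classic example of a ring without invariant basis number): let $V$ be a vector space with countable basis $e_0, e_1, e_2, \dots$ and let $R = \operatorname{End}(V)$. Define $s_1, s_2, t_1, t_2 \in R$ by $s_1(e_k) = e_{2k}$, $s_2(e_k) = e_{2k+1}$, $t_1(e_{2k}) = e_k$, $t_1(e_{2k+1}) = 0$, $t_2(e_{2k+1}) = e_k$, $t_2(e_{2k}) = 0$. Then $t_i s_j = \delta_{ij}$ and $s_1 t_1 + s_2 t_2 = 1$, so the matrices

$$A = \begin{pmatrix} s_1 & s_2 \end{pmatrix}, \qquad B = \begin{pmatrix} t_1 \\ t_2 \end{pmatrix}$$

over $R$ satisfy $AB = I_1$ and $BA = I_2$.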

  • @InterestedQuest: The trace of a square matrix is the sum of the entries along the main diagonal. It is an easy exercise that if $A$ is $m\times n$ and $B$ is $n\times m$, then the traces of $AB$ and of $BA$ are the same (find a formula for the $ij$-th entry of $AB$ in terms of the entries of $A$ and of $B$). And the trace of the $k\times k$ identity is easily seen to be $k$. – 2011-01-18
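
A quick numerical check of that exercise (my own sketch; the random rectangular matrices are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 7))  # m = 4, n = 7
B = rng.standard_normal((7, 4))

# AB is 4x4 and BA is 7x7, yet their traces agree.
print(np.trace(A @ B), np.trace(B @ A))
print(np.allclose(np.trace(A @ B), np.trace(B @ A)))  # True
```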