
Let $V$ be a $k$-dimensional subspace of $\mathbb{R}^n$, with $k < n$. Choose an orthonormal basis $(a_1, \ldots, a_k)$ of $V$ and let $A$ be the $k \times n$ matrix whose rows are $(a_1^T, \ldots , a_k^T)$. Define a linear mapping $L: V \rightarrow \mathbb{R}^k$ by

$$ L(x) = Ax. $$

From the definition of $A$ it is clear that

$$ L^{-1}(y) = A^T y. $$

Therefore, $L$ has an inverse, while $A$ (being rectangular) does not. This seems odd...


While writing this it came to me that $A$ probably isn't the matrix representing $L$ in any basis, since such should be a $k \times k$ matrix, but I am a bit confused - what then is $A$?
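As a numerical sanity check on the formulas above, here is a minimal numpy sketch (the dimensions $n=5$, $k=2$, the random seed, and the particular subspace are made up for illustration). It shows that $A^T$ undoes $A$ on vectors of $V$, but not on arbitrary vectors of $\mathbb{R}^n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2

# Orthonormal basis (a_1, ..., a_k) of a random k-dimensional
# subspace V of R^n, obtained via a QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
A = Q.T                       # k x n matrix whose rows are the a_i^T

# For x in V (a linear combination of the a_i), A^T (A x) recovers x.
x = Q @ rng.standard_normal(k)            # an element of V
assert np.allclose(A.T @ (A @ x), x)

# But for a generic y in R^n it does not: A^T A is the orthogonal
# projection onto V, not the identity on R^n.
y = rng.standard_normal(n)
assert not np.allclose(A.T @ (A @ y), y)
```

So the formula $L^{-1}(y) = A^T y$ only makes sense because $L$ is defined on $V$, not on all of $\mathbb{R}^n$.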

  • 1
    Indeed, $A$ does not represent $L$ relative to any basis; even though $V$ "lives" inside of $\mathbb{R}^n$, when you take a basis for $V$ and you write vectors of $V$ in terms of that basis, the coordinate vectors are elements of $\mathbb{R}^k$, not of $\mathbb{R}^n$. Remember that the coordinate matrix of $L$ is a matrix that is computed using **coordinate vectors**.2012-01-27
  • 0
    Why is it "clear" that $L^{-1}(y) = A^T y$?2012-01-27
  • 0
    @Qiaochu: At first blush I thought the same, but it works: $L$ maps $a_i$ to the $\mathbf{e}_i\in\mathbb{R}^k$, because $a_1,\ldots,a_k$ are an orthonormal basis; so $L^{-1}\colon\mathbb{R}^k\to V$ maps $\mathbf{e}_i$ to $a_i$, and that gives $A^T$ (assuming the $a_i$ were column vectors).2012-01-27
  • 0
    You might want to think about a simple example, say, let $V$ be the $x$-axis in 2-space with basis $(1,0)$, etc.2012-01-27
  • 0
    It depends how you define what matrix inverse means! Indeed, $A^\top$ is the pseudo-inverse (http://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_pseudoinverse#Orthonormal_columns_or_rows) of $A$.2012-01-27
  • 0
    Also note that $V$ needs to be a proper subset of $\mathbb{R}^n$ in order to be surjective. Thus, $L^{-1}(\cdot)$ cannot accept an arbitrary vector as input, or it is not the proper inverse of $L$ (in the same way that $A^\top$ is not the proper inverse of $A$ but just the pseudo-inverse).2012-01-27
  • 0
    @Hauke: The problem begins by saying that $V$ is a $k$-dimensional subspace of $\mathbb{R}^n$, with $k\lt n$.2012-01-27
  • 0
    @Arturo: if you submit your first comment as an answer, I will accept it as it basically clears my original problem (and I get it now). To all others: sorry if the question was a bit muddy, however writing it down helped me to see it clearer ;-)2012-01-27
  • 2
    @koletenbert: Then why not write out an answer yourself? Point out your confusion, and how it has been clarified, and what the answer is. (-:2012-01-28
  • 0
    There's nothing wrong with posting an answer to your own question, and accepting it (if no one finds anything wrong with it). In fact, doing so has been encouraged.2012-01-28
  • 0
    These are the subtle points of linear algebra that shock graduate students the second time around when they realize they didn't really know it anywhere near as well as they thought they did.2012-01-28
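The pseudo-inverse remark in the comments can be checked directly: for a matrix with orthonormal rows, the Moore-Penrose pseudo-inverse coincides with the transpose, which is a one-sided inverse only. A short numpy sketch (dimensions and seed are arbitrary):

```python
import numpy as np

# Matrix with orthonormal rows, built via QR as in the question.
Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((5, 2)))
A = Q.T                                       # 2 x 5

# For orthonormal rows, the pseudo-inverse is exactly the transpose.
assert np.allclose(np.linalg.pinv(A), A.T)

# A^T is only a right inverse of A, not a left inverse.
assert np.allclose(A @ A.T, np.eye(2))        # identity on R^k
assert not np.allclose(A.T @ A, np.eye(5))    # projection, not identity on R^n
```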

2 Answers

3

It seems people would like me to answer my own question... well then, why not.


My original assumption was that if a linear mapping $L$ can be written as

$$ L(x) = Ax $$

for some matrix $A \in \mathbb{R}^{k \times n}$, then $A$ represents $L$ with respect to some basis (in fact, the canonical bases of $\mathbb{R}^n$ and $\mathbb{R}^k$ in this case, as $L$ seems to go from $\mathbb{R}^n$ to $\mathbb{R}^k$).

However, there is a subtlety about the domain of $L$ in the problem as posted originally: here, $L$ goes from a $k$-dimensional subspace of $\mathbb{R}^n$ to $\mathbb{R}^k$. Therefore, both the domain and the range of $L$ are $k$-dimensional vector spaces, and any matrix representing $L$ will necessarily be $k \times k$.

Now, what does such a matrix look like? To find it, choose bases for the domain $V$ and range $\mathbb{R}^k$ of $L$. For example, $(a_1, \ldots, a_k)$ and $(e_1, \ldots, e_k)$ will do, in which case the matrix will turn out to be the $k \times k$ identity matrix (which is obviously invertible).

Thus, to answer the question in the title, if a linear map is invertible, any matrix representing it will be so as well.
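The identity-matrix claim above can be verified numerically. In the bases $(a_1, \ldots, a_k)$ and $(e_1, \ldots, e_k)$, the $j$-th column of the coordinate matrix is the coordinate vector of $L(a_j) = A a_j = e_j$. A small sketch (with an arbitrary subspace of $\mathbb{R}^4$, $k = 2$):

```python
import numpy as np

# Orthonormal basis of a random 2-dimensional subspace V of R^4.
Q, _ = np.linalg.qr(np.random.default_rng(2).standard_normal((4, 2)))
A = Q.T

# j-th column of the coordinate matrix of L is L(a_j) = A a_j.
M = np.column_stack([A @ Q[:, j] for j in range(2)])

# The coordinate matrix is the k x k identity, hence invertible.
assert np.allclose(M, np.eye(2))
```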

2

I think that a great part of your difficulty was that you were suffering from a mathematical education that hadn’t paid sufficient attention to the importance of the domain, the target space, and the image (=“range”) of a map. Do you know the following about ordinary maps $f\colon X \to Y$ between sets?

The map $f$ is one-to-one if and only if it has a left inverse, i.e. there is a map $g\colon Y \to X$ such that $g\circ f=$ identity on $X$. And $f$ is onto (meaning that the image is equal to all of the target space) if and only if $f$ has a right inverse, i.e. there is a map $h\colon Y \to X$ such that $f\circ h=$ identity on $Y$. The latter is not quite trivial: in fact it depends on (and I think is equivalent to) the Axiom of Choice. The proofs are easy enough, nothing fancy, and if you haven’t seen the facts, you should spend a little time to prove them for yourself.

The significance of the above is that the same results are true for maps between vector spaces. You have constructed a left inverse of the inclusion map $V\to {\mathbb{R}}^n$, which of course is a one-to-one map, but your new map is a left inverse only. Try composing them in the opposite direction!
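The point about one-sided inverses can be made concrete in matrix form (a sketch with made-up dimensions $n = 4$, $k = 2$): with respect to the basis $(a_i)$ of $V$ and the standard basis of $\mathbb{R}^n$, $A^T$ is the matrix of the inclusion $V \to \mathbb{R}^n$, and $A$ is a left inverse of it, but not a right inverse:

```python
import numpy as np

# Orthonormal basis of V as columns of Q; A = Q^T as in the question.
Q, _ = np.linalg.qr(np.random.default_rng(3).standard_normal((4, 2)))
A = Q.T

# One direction: A composed after the inclusion A^T is the identity on V.
assert np.allclose(A @ A.T, np.eye(2))

# The opposite direction: A^T A is the projection onto V,
# not the identity on R^n, so A is a left inverse only.
assert not np.allclose(A.T @ A, np.eye(4))
```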

  • 0
    "You have constructed a left inverse of the inclusion map $V\to\mathbb{R}^n$." Except that none of the maps have $\mathbb{R}^n$ as domain or codomain: the map $L$ goes from $V$ to $\mathbb{R}^k$, the map $L^{-1}$ goes from $\mathbb{R}^k$ to $V$. I'm not sure your reading here is accurate; the real issue is that although $L$ can be given by the formula $L(x) = Ax$, $A$ is not the coordinate matrix of $L$.2012-01-28
  • 0
    @ArturoMagidin, you are completely correct. I’ll leave my rather careless response up, but in its defense, I will say: just what is his matrix $A$? From its shape, it has to be a map from an $n$-dimensional space to one of dim. $k$. Isn’t it the orthogonal projection from the big space to the small? OP chose to restrict it to the subspace, so that as he says, in that sense it’s identity, but the matrix is more than that: it is in fact the left inverse I mentioned; and the transpose matrix is the matrix of inclusion, with respect to his basis in the subspace, the standard basis in the big.2012-01-29
  • 0
    Yes, $A$ is the coordinate matrix of a transformation from all of $\mathbb{R}^n$ to $\mathbb{R}^k$, which happens to restrict to the identity on $V$.2012-01-29