
Let $k$ be a field and $n$, $m$ positive integers, and define $\mathcal{L}\left(k^{n},k^{m}\right)\equiv\left\{ T:k^{n}\to k^{m}\mid T\ \text{is a linear transformation}\right\}$.

How do you prove that $\mathcal{L}\left(k^{n},k^{m}\right)\cong M_{m\times n}(k)$, where the last expression means that the two vector spaces are isomorphic?

I'm an undergraduate student taking a first course in linear algebra.


2 Answers


A more fundamental (and clearer) way to prove that two vector spaces are isomorphic is to exhibit an invertible linear transformation between them (this is in fact the definition of an isomorphism). So, in general, for two arbitrary finite-dimensional vector spaces we can state the following.

Theorem

Let $V$ and $W$ be finite-dimensional vector spaces over a field $F$, with $\mathrm{dim}(V)=n$ and $\mathrm{dim}(W)=m$. Let $\beta$ be an ordered basis of $V$ and $\gamma$ an ordered basis of $W$. Then the transformation $\Phi:\mathcal{L}(V,W)\to \mathcal{M}_{m\times n}$ given by

\begin{equation} \Phi(T)=\left[ T \right]_{\beta}^{\gamma} \end{equation}

is an isomorphism.

Note: $\left[ T \right]_{\beta}^{\gamma}$ is the matrix representation of the linear transformation $T$. Let $\beta=\{v_1,v_2,\dots,v_n\}$ and $\gamma=\{w_1,w_2,\dots,w_m\}$. If $T(v_j)=\sum_{k=1}^m B_{kj} w_k$, then the $j$-th column of $B:=\left[ T \right]_{\beta}^{\gamma}$ (denoted $B_{\dot \ ,\ j}$) is $(B_{1j},B_{2j},\dots,B_{mj})^T$, for $j\in \{1,2,\dots,n\}$.
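As a minimal numerical sketch of this construction (assuming $k=\mathbb{R}$, the standard bases for $\beta$ and $\gamma$, and a made-up map $T$ just for illustration), the matrix of $T$ is built column by column from the images of the basis vectors:

```python
import numpy as np

# A hypothetical linear map T: R^3 -> R^2, chosen only as an example.
def T(v):
    x, y, z = v
    return np.array([x + 2*y, 3*z])

n = 3
# The j-th column of [T] is the coordinate vector of T(e_j).
B = np.column_stack([T(np.eye(n)[:, j]) for j in range(n)])
print(B)
# Sanity check: multiplying by B agrees with applying T.
v = np.array([1.0, 1.0, 1.0])
print(np.allclose(B @ v, T(v)))  # True
```

The same recipe works for any pair of ordered bases, as long as the columns hold coordinates with respect to $\gamma$.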

Proof

The path is to show that $\Phi$ is linear and invertible (then it's an isomorphism).

(I) $\Phi$ is linear. Let $U$, $T$ $\in \mathcal{L}(V,W)$ and $c\in F$. Then

\begin{align} \Phi(U+cT) & = \left[ U+cT \right]_{\beta}^{\gamma} \end{align}

But, if $U(v_j)=\sum_{k=1}^m A_{kj} w_k$ then the $j$-th column of $A:=\left[ U \right]_{\beta}^{\gamma}$ (denoted as $A_{\dot \ ,\ j}$) is $(A_{1j},A_{2j},\dots,A_{mj})^T$. So

\begin{align} (U+cT)(v_j) & = U(v_j)+cT(v_j) \\ & = \sum_{k=1}^m A_{kj} w_k+c\sum_{k=1}^m B_{kj} w_k \\ & = \sum_{k=1}^m (A_{kj} +c B_{kj}) w_k \end{align}

Which implies that the $j$-th column of $C:=\left[ U+cT \right]_{\beta}^{\gamma}$ (denoted as $C_{\dot \ ,\ j}$) is

\begin{align} C_{\dot \ ,\ j}&=(A_{1j}+cB_{1j},A_{2j}+cB_{2j},\dots,A_{mj}+cB_{mj})^T \\ &=(A_{1j},A_{2j},\dots,A_{mj})^T+c(B_{1j},B_{2j},\dots,B_{mj})^T \\ &=A_{\dot \ ,\ j}+cB_{\dot \ ,\ j} \end{align}

Since this is valid for $j\in \{1,2,\dots,n\}$, we conclude that $C=A+cB$, or $\left[ U+cT \right]_{\beta}^{\gamma}=\left[ U \right]_{\beta}^{\gamma}+c\left[ T \right]_{\beta}^{\gamma}$, i.e. $\Phi(U+cT)=\Phi(U)+c\Phi(T)$.
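The linearity identity $\Phi(U+cT)=\Phi(U)+c\Phi(T)$ can also be checked numerically. Here is a small sketch (assuming $F=\mathbb{R}$ and standard bases; the maps $U$ and $T$ are random, hypothetical examples):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, c = 2, 3, 5.0

def matrix_of(f, n):
    """Column j of [f] is f(e_j) in standard coordinates."""
    return np.column_stack([f(np.eye(n)[:, j]) for j in range(n)])

# Two hypothetical linear maps R^3 -> R^2, given by random matrices.
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))
U = lambda v: A @ v
T = lambda v: B @ v

lhs = matrix_of(lambda v: U(v) + c * T(v), n)  # Phi(U + cT)
rhs = matrix_of(U, n) + c * matrix_of(T, n)    # Phi(U) + c Phi(T)
print(np.allclose(lhs, rhs))  # True
```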

(II) $\Phi$ is invertible. To show that this is true, it's enough to show that $\Phi$ is both injective and surjective.

A standard theorem (you can check it in any linear algebra book) says that if two linear transformations have the same matrix representation with respect to the same pair of ordered bases, then they are the same linear transformation. This implies that $\Phi$ is injective.

On the other hand, $\Phi$ is surjective because given $M \in \mathcal{M}_{m\times n}$ (with entries $M_{ij}$), there exists a linear transformation $L\in \mathcal{L}(V,W)$ such that $\Phi(L)=\left[ L \right]_{\beta}^{\gamma}=M$: namely, the transformation defined on the basis by $L(v_j)=\sum_{k=1}^m M_{kj}w_k$, so that the coordinate vector of $L(v_j)$ is exactly the $j$-th column of $M$.
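The surjectivity argument is constructive: from any matrix $M$ you can build the map $L$ and recover $M$ as its matrix representation. A brief sketch (again assuming $F=\mathbb{R}$ and standard bases; the matrix $M$ below is arbitrary):

```python
import numpy as np

# The inverse direction of Phi: given M, define L(v) = M v.
M = np.array([[1.0,  2.0, 0.0],
              [0.0, -1.0, 4.0]])  # an arbitrary 2x3 matrix
L = lambda v: M @ v               # L(e_j) is the j-th column of M

# Recover the matrix of L column by column; it is M again, so Phi(L) = M.
recovered = np.column_stack([L(np.eye(3)[:, j]) for j in range(3)])
print(np.array_equal(recovered, M))  # True
```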

We conclude that $\Phi$ is an isomorphism. $\blacksquare$


For two finite-dimensional vector spaces to be isomorphic, they merely need to have the same dimension. A standard basis for $M_{m\times n}(k)$ should be clear, and has $mn$ elements. A standard basis for the other space consists of the maps $f_{ij}$ defined by $f_{ij}(\vec{e}_k)=\delta_{jk}\vec{e}_i$. So there is again a basis of $mn$ elements.
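To make the count concrete, here is a sketch of the standard basis of $M_{m\times n}(k)$ (assuming $k=\mathbb{R}$): the matrix units $E_{ij}$, with a $1$ in entry $(i,j)$ and zeros elsewhere. There are $mn$ of them, and every matrix is a unique linear combination of them.

```python
import numpy as np

m, n = 2, 3
# Build the matrix units E_ij: the standard basis of M_{m x n}.
units = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        units.append(E)
print(len(units))  # 6, i.e. m*n

# Any matrix decomposes as M = sum_{i,j} M[i,j] * E_ij.
M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
combo = sum(M[i, j] * units[i * n + j] for i in range(m) for j in range(n))
print(np.array_equal(combo, M))  # True
```

Under the isomorphism of the first answer, $f_{ij}$ corresponds exactly to $E_{ij}$.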

If $m=n$, then each vector space is actually an algebra, because you can "multiply" two vectors together and the result is a vector of the same type. With $n\times n$ matrices, the multiplication is matrix multiplication; in the space of linear maps, the multiplication is composition. In this case the two algebras are indeed isomorphic, but a mere vector space isomorphism is not sufficient: it would need to satisfy $\phi(ab)=\phi(a)\phi(b)$ as well. The vector space isomorphism hinted at in the first paragraph (sending $f_{ij}$ to $1_{ij}$) does have this feature.
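The multiplicative property $\phi(ab)=\phi(a)\phi(b)$ is just the familiar fact that the matrix of a composition is the product of the matrices. A quick numerical sketch (assuming $k=\mathbb{R}$, standard bases, and two random hypothetical maps):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
f = lambda v: A @ v  # two hypothetical linear maps R^n -> R^n
g = lambda v: B @ v

def matrix_of(h, n):
    """Column j of [h] is h(e_j) in standard coordinates."""
    return np.column_stack([h(np.eye(n)[:, j]) for j in range(n)])

# The matrix of the composition f o g equals the product of the matrices.
lhs = matrix_of(lambda v: f(g(v)), n)
rhs = matrix_of(f, n) @ matrix_of(g, n)
print(np.allclose(lhs, rhs))  # True
```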

  • @Michael No problem. That is the Kronecker delta: it evaluates to $1$ if $j=k$, and $0$ otherwise.