
I have a homework problem that asks me to prove if $T(B)$ is a basis for $W$, then $T$ is an isomorphism. Here is the full question:

Let $V$ be a vector space of dimension $n$ over a field $F$ and let $B=\{v_{1},v_{2},\ldots,v_{n}\}$ be a basis for $V$. Let $W$ be a vector space over $F$. Let $T \in\hom(V,W)$ such that $T(B)=\{T(v_{1}),T(v_{2}),...,T(v_{n})\}$ is a set of $n$ distinct vectors in $W$. Prove that if $T(B)$ is a basis for $W$, then $T$ is an isomorphism.

I know that if I prove $\ker(T)=\{0\}$ (only the zero vector), then $T$ is one-to-one. By the dimension theorem it is then also onto. However, I cannot prove it is one-to-one. Please help me out.


3 Answers

0

Let me expand on Asaf's answer above. You already said that if you can prove that $T$ is injective, then by the Rank-Nullity Theorem and the fact that $\dim V = \dim W$ it follows that $T$ is surjective too, and hence an isomorphism.

Definition of a basis: A collection of $n$ vectors $w_1, \ldots, w_n$ is said to be a basis for a vector space $W$ if the following hold:

(1) Any $w \in W$ can be written as a linear combination of $w_1, \ldots, w_n$. Viz. you can always find scalars $a_1, \ldots, a_n \in F$ such that $w = a_1w_1 + \ldots + a_nw_n.$

(2) The vectors $w_1, \ldots, w_n$ are linearly independent. This means that the only way of writing the zero vector $0$ as a linear combination of $w_1, \ldots, w_n$ is $0 = 0\cdot w_1 + 0\cdot w_2 + \ldots + 0\cdot w_n.$

We say that the dimension of $W$ is the number of vectors in this collection, which in this case is $n$.

Suppose that there is a vector $x = a_1v_1 + \ldots + a_nv_n$ in the kernel of $T$. Then $T(x) = a_1T(v_1) + \ldots + a_nT(v_n) = 0$, by definition of $x \in \ker T$ and using linearity.

However, by assumption we know that $T(B) = \{T(v_1), \ldots, T(v_n)\}$ is a basis for $W$. Therefore by (2) above the only way to write $0$ as a linear combination of the $T(v_i)$ is with $a_1 = a_2 = \ldots = a_n = 0$.

Therefore our original vector $x = 0\cdot v_1 + 0\cdot v_2 + \ldots + 0\cdot v_n = 0$, so that $x = 0$. We have proven that $\ker T \subseteq \{0\}$. For the reverse inclusion it is clear that $0 \in \ker T$ because $T$ is a linear transformation. Therefore $\{0\} \subseteq \ker T$, so that $\ker T = \{0\}$.

Hence you have proven that $T$ is injective.
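The same argument can be sanity-checked numerically with a small sketch (the matrix below is an assumed example, not from the problem): if the columns of a matrix $A$ are the vectors $T(v_i)$ and those columns form a basis, then $A$ has full rank, so by rank-nullity its kernel is trivial.

```python
import numpy as np

# Assumed example: columns of A are T(v_1), T(v_2), T(v_3) written in
# some basis of W.  They form a basis of R^3, so A has full rank.
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
assert rank == n  # columns are linearly independent and span R^3

# By rank-nullity, dim ker A = n - rank = 0: only x = 0 maps to 0.
kernel_dim = n - rank
print(kernel_dim)  # 0, so the map is injective
```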

Doing it the other way round: Suppose you did not know injectivity but wanted to prove surjectivity. By definition (1) above and the fact that the collection $\{T(v_1),\ldots, T(v_n)\}$ is a basis for $W$, every vector $w \in W$ can be written as a linear combination of the $T(v_i)$. Viz. for all $w \in W$, there exist $a_1, \ldots, a_n$ such that

$ w= a_1T(v_1 ) +\ldots + a_nT(v_n).$

But then $T$ is a linear transformation, so $a_1T(v_1) + \ldots + a_nT(v_n) = T(a_1v_1 + \ldots + a_nv_n).$ However, $a_1v_1 + \ldots + a_nv_n$ is just some vector lying in $V$. Furthermore, our choice of the vector $w \in W$ was a priori arbitrary. It follows that any vector $w\in W$ is the image of some vector in $V$. This is exactly saying that $T$ is surjective. Done.
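This surjectivity step can also be sketched concretely (the numbers are assumed, not from the thread): given any target $w$, solving the linear system whose columns are the $T(v_i)$ produces the coefficients $a_i$, and the preimage is $x = a_1v_1 + \ldots + a_nv_n$.

```python
import numpy as np

# Assumed example: columns of A are T(v_1), T(v_2), T(v_3), a basis of R^3.
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

w = np.array([2.0, -1.0, 5.0])  # an arbitrary vector in W
a = np.linalg.solve(A, w)       # coefficients with a_1 T(v_1)+...+a_n T(v_n) = w

# The preimage of w is x = a_1 v_1 + ... + a_n v_n; check that T(x) = w:
assert np.allclose(A @ a, w)
```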

Extra exercise: Prove that a linear map $T$ between finite dimensional vector spaces is injective iff its kernel is trivial.

2

Suppose $T(x)=0$. Write $x$ as a linear combination of the $v_i$, apply $T$ using linearity to get a linear combination of the $T(v_i)$, then note that those vectors are a basis, hence, linearly independent, and draw a conclusion.

  • 0
    The $v_i$ are a basis for $V$. That means every $v$ in $V$ can be written as a linear combination of the $v_i$. 2012-04-19
2

To expand Gerry's hint a bit: suppose $x\in V$. Since $\{v_1,\ldots,v_n\}$ is a basis for $V$, we can write $x=\alpha_1 v_1+\ldots+\alpha_n v_n$ for scalars $\alpha_1,\ldots,\alpha_n\in F$. Suppose that $T(x)=0$; then we have:

$0=T(x)=T(\alpha_1 v_1+\ldots+\alpha_n v_n) = \alpha_1 T(v_1)+\ldots+\alpha_n T(v_n)$

Now use the assumption on $\{T(v_1),\ldots,T(v_n)\}$ to conclude that $\alpha_i=0$ for all $i$ and therefore $x=0$.
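As a concrete sketch of this computation (the matrix is an assumed example): writing $\alpha_1 T(v_1)+\ldots+\alpha_n T(v_n)=0$ as a homogeneous linear system, linear independence of the $T(v_i)$ forces every $\alpha_i=0$.

```python
import numpy as np

# Assumed example: columns are T(v_1), T(v_2), T(v_3), linearly independent.
T_of_B = np.array([[1.0, 0.0, 2.0],
                   [0.0, 1.0, 1.0],
                   [0.0, 0.0, 3.0]])

# T(x) = 0 becomes the homogeneous system  T_of_B @ alpha = 0.
# Since the columns are independent, the only solution is alpha = 0.
alpha = np.linalg.solve(T_of_B, np.zeros(3))
assert np.allclose(alpha, 0.0)  # hence x = 0 and ker T = {0}
```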

  • 0
    Ok, by the way: if $v = (a,b,c) \in \mathbb{R}^3$ and $\{(1,1,1),(1,1,0),(1,0,0)\}$ is a basis, then $v = c(1,1,1)+(b-c)(1,1,0)+(a-b)(1,0,0)$. Is this kind of thing what you said? 2012-04-19
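The coordinate formula in this last comment checks out; here is a quick numerical verification (with random values standing in for $a, b, c$):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = rng.standard_normal(3)  # arbitrary coordinates of v

v = np.array([a, b, c])
# The comment's formula: v = c(1,1,1) + (b-c)(1,1,0) + (a-b)(1,0,0)
recon = (c * np.array([1.0, 1.0, 1.0])
         + (b - c) * np.array([1.0, 1.0, 0.0])
         + (a - b) * np.array([1.0, 0.0, 0.0]))
assert np.allclose(v, recon)  # yes, exactly this kind of thing
```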