
I have a homework problem that asks me to prove if $T(B)$ is a basis for $W$, then $T$ is an isomorphism. Here is the full question:

Let $V$ be a vector space of dimension $n$ over a field $F$ and let $B=\{v_{1},v_{2},\ldots,v_{n}\}$ be a basis for $V$. Let $W$ be a vector space over $F$. Let $T \in\hom(V,W)$ such that $T(B)=\{T(v_{1}),T(v_{2}),...,T(v_{n})\}$ is a set of $n$ distinct vectors in $W$. Prove that if $T(B)$ is a basis for $W$, then $T$ is an isomorphism.

I know that if I prove $\ker(T)=\{0\}$ (only the zero vector), then $T$ is one-to-one. The dimension theorem then also gives onto. However, I cannot prove it is one-to-one. Please help me out.

  • 0
    Perhaps it would be easier to show $T$ is onto. What can you say about the linear span of the basis elements $T(v_1)$, $T(v_2)$, $\ldots\,$, $T(v_n)$?2012-04-19
  • 0
    If $\ker(T) = \{0\}$ then $\dim(\operatorname{im}(T)) = 4-0 = 4$, so $\operatorname{im}(T) = \mathbb{R}^4$. That shows $T$ is onto. But I cannot prove $\ker(T) = \{0\}$.2012-04-19
  • 0
    David was suggesting proving "onto" first, directly, then deducing one-one by dimension theorem.2012-04-19
  • 0
    Oh, okay. And I wrote it wrong in my comment as well.2012-04-19

3 Answers

0

Let me expand on Asaf's answer above. You already said that if you can prove that $T$ is injective, then by the Rank–Nullity Theorem and the fact that $\dim V = \dim W$ it follows that $T$ is surjective too, and hence an isomorphism.

Definition of a basis: A collection of $n$ vectors $w_1, \ldots, w_n$ is said to be a basis for a vector space $W$ if the following hold:

(1) Any $w \in W$ can be written as a linear combination of $w_1, \ldots, w_n$. Viz. you can always find scalars $a_1, \ldots, a_n \in F$ such that $w = a_1w_1 + \ldots + a_nw_n.$

(2) The vectors $w_1, \ldots, w_n$ are linearly independent. This means that the only way of writing the zero vector $0$ as a linear combination of $w_1, \ldots, w_n$ is as $$0 = 0\cdot w_1 + 0\cdot w_2 + \ldots + 0\cdot w_n.$$

We say that the dimension of $W$ is the number of vectors in this collection, which in this case is $n$.

Suppose that there is a vector $x = a_1v_1 + \ldots + a_nv_n$ in the kernel of $T$. Then this means that $T(x) = a_1T(v_1) + \ldots + a_n T(v_n) = 0$, by definition of $x \in \ker T$ and using linearity.

However, by assumption we know that $\mathcal{B} = \{T(v_1), \ldots, T(v_n)\}$ is a basis for $W$. Therefore by (2) above we know that the only way to write $0$ as a linear combination of the $T(v_i)$ is with $a_1 = a_2 = \ldots = a_n = 0$.

Therefore our original vector $x = 0\cdot v_1 + 0\cdot v_2 + \ldots + 0 \cdot v_n = 0$, so that $x = 0$. So we have proven that $\ker T \subseteq \{0\}$. For the reverse inclusion it is clear that $0 \in \ker T$ because $T$ is a linear transformation. Therefore $\{0\} \subseteq \ker T$, so that $\ker T = \{0\}$.

Hence you have proven that $T$ is injective.

Doing it the other way round: Suppose you did not know injectivity but wanted to prove surjectivity. By definition (1) above and the fact that the collection $\{T(v_1),\ldots, T(v_n)\}$ is a basis for $W$, every vector $w \in W$ can be written as a linear combination of the $T(v_i)$. Viz. for all $w \in W$, there exist $a_1, \ldots, a_n$ such that

$$ w= a_1T(v_1 ) +\ldots + a_nT(v_n).$$

But then $T$ is a linear transformation, so that $$a_1T(v_1) + \ldots + a_nT(v_n) = T(a_1v_1 + \ldots + a_nv_n).$$ However, $a_1v_1 + \ldots + a_nv_n$ is just some vector lying in $V$, and our choice of the vector $w \in W$ was a priori just any vector in $W$. It follows that any vector $w\in W$ is the image of some vector in $V$. This is exactly saying that $T$ is surjective. Done.

Extra exercise: Prove that a linear map $T$ between finite dimensional vector spaces is injective iff its kernel is trivial.

  • 0
    Yes! Thank you very much!! It helped me so much. By the way, on the third line under the definition box, isn't it $T(B) = \{T(v_{1}),\ldots, T(v_{n})\}$?2012-04-19
  • 0
    @yoodaniel Well, I just gave it a different name this time and called it $\mathcal{B}$ instead of what you called it above. $\mathcal{B} = $ `\mathcal{B}`. By the way, if you write $B = \{v_1, \ldots, v_n\}$, I don't think it makes sense to write $T(B)$. Anyway, this is just notation. If my answer above answered your queries, please accept it to mark that.2012-04-19
  • 0
    I checked it; is it right? I am new here.2012-04-19
  • 0
    @yoodaniel Yeah that is right. Also, on this site you can upvote and downvote answers by clicking on the up arrows or down arrows next to answers. If you like anything, you can upvote it, dislike it, you can downvote it!2012-04-19
  • 0
    haha Thank you for letting me know, but unfortunately that action requires at least 15 reputation. I have only 13. I think you are pretty good at linear algebra. I am doing homework and get stuck so many times. I envy you so much.2012-04-19
  • 0
    @yoodaniel With time you will gain more experience!2012-04-19
  • 0
    @yoodaniel Can you do the extra exercise?2012-04-19
  • 0
    Well, I am not sure; I am trying now. Why?2012-04-19
2

Suppose $T(x)=0$. Write $x$ as a linear combination of the $v_i$, apply $T$ using linearity to get a linear combination of the $T(v_i)$, then note that those vectors are a basis, hence, linearly independent, and draw a conclusion.

  • 0
    Could you explain more specifically?2012-04-19
  • 0
    Sure. What in particular would you like me to explain?2012-04-19
  • 0
    I understand: let $v \in \ker(T)$; then $v \in V$ and $T(v)=0$. But after that, how am I going to explain... so $v=\ldots$? You said lin. comb. Yes, that was what I thought too, but how? Like, there is $c$ in $F$, and $cv_{i}$?2012-04-19
  • 0
    The $v_i$ are a basis for $V$. That means every $v$ in $V$ can be written as a linear combination of the $v_i$.2012-04-19
2

To expand Gerry's hint a bit: suppose $x\in V$. Since $\{v_1,\ldots,v_n\}$ is a basis for $V$, we can write $x=\alpha_1 v_1+\ldots+\alpha_n v_n$ for scalars $\alpha_1,\ldots,\alpha_n\in F$. Suppose that $T(x)=0$; then we have:

$$0=T(x)=T(\alpha_1 v_1+\ldots+\alpha_n v_n) = \alpha_1 T(v_1)+\ldots+\alpha_n T(v_n)$$

Now use the assumption on $\{T(v_1),\ldots,T(v_n)\}$ to conclude that $\alpha_i=0$ for all $i$ and therefore $x=0$.

  • 0
    Why is it a sum? The sum of all of them? Isn't $v \in V$ just one $v_{i}$? I mean, shouldn't it be only one vector? Yours is the sum of all the vectors.2012-04-19
  • 0
    @yoodaniel: Do you know that every $x\in V$ can be written as a unique sum of the basis $\{v_1,\ldots, v_n\}$?2012-04-19
  • 0
    No, never heard of it...2012-04-19
  • 0
    @yoodaniel: What is the definition of a basis, then?2012-04-19
  • 0
    Linearly independent and spans V2012-04-19
  • 0
    @yoodaniel: And what does that mean that it spans $V$?2012-04-19
  • 0
    um... span(B)=V?2012-04-19
  • 0
    @yoodaniel: But how do you define $\mathrm{span}(B)$? When is a vector in that set?2012-04-19
  • 0
    Oh, is it ... the definition of Span? The intersection of all subspaces B of V?2012-04-19
  • 2
    @yoodaniel: Are you asking or telling? Also, this seems to be inaccurate. Let me give you another piece of advice about mathematics: at the introductory level it almost always follows directly from the definitions (and the theorems that you were given). Sit with your lecture notes/book open and check the definition of every symbol and term that you are not 101% certain you know perfectly.2012-04-19
  • 0
    Ok.. by the way, if $v = (a,b,c) \in \mathbb{R}^3$, and $\{(1,1,1),(1,1,0),(1,0,0)\}$ is a basis, then $v = c(1,1,1)+(b-c)(1,1,0)+(a-b)(1,0,0)$. Is this kind of thing what you said?2012-04-19