Here is my homework question:

Let $V$ and $W$ be vector spaces over a field $F$, and let $T \colon V \to W$ be an isomorphism. Let $X = \{v_1, v_2, \dots, v_n\}$ be a subset of $V$, and recall that $T(X) = \{T(v_1), T(v_2), \dots, T(v_n)\}$.

a. Prove that if $X$ is linearly independent, then $T(X)$ is also linearly independent.

b. Prove that if $X$ spans $V$, then $T(X)$ spans $W$.

I am trying to solve part (a), but I don't know how to use the isomorphism here. My work so far: Suppose $X$ is linearly independent. Then for all $a_{1}v_{1},a_{2}v_{2},...,a_{n}v_{n}$ = 0. So, $a_{1},a_{2},...,a_{n}$ = 0. But I can't go further...

  • By definition, isomorphisms are injective (i.e. monomorphisms) and surjective (i.e. epimorphisms). (2012-04-19)
  • @FernandoMartin Yes, I know that. (2012-04-19)
  • As for your work, if $X$ is linearly independent, then $a_1v_1+\dots+a_nv_n=0$ iff $a_1=\dots=a_n=0$, but $a_1v_1+\dots+a_nv_n=0$ certainly doesn't hold for all vectors (unless, of course, your space is the trivial vector space). (2012-04-19)
  • a) follows from the fact that $T$ is injective; b) follows from the fact that $T$ is surjective. (2012-04-19)
  • Judging from this and your previous question, and the discussion that ensued, you really, really, really need to go learn the basic definitions. You need to know **exactly** what linearly independent means, to begin with. (2012-04-19)
  • @GerryMyerson Umm, I know the definition of linearly independent... (2012-04-19)
  • Maybe you do, but your writing "Then for all $a_1v_1,a_2v_2,\dots,a_nv_n=0$. So, $a_1,a_2,\dots,a_n=0$" suggests quite the opposite. (2012-04-19)
  • Intuitively, if the linear transform does not reduce the number of dimensions (for example, does not project 3D space onto a 2D plane), then it is reversible, and of course it will map a basis (a set of linearly independent vectors) to another basis. If the transform projects onto a lower-dimensional space, then it will "crush" the linear independence of the vectors. E.g. in a 2D space, if we have three vectors, one of them must necessarily be redundant. All three vectors lack a component normal to the 2D plane in which they lie, and so they cannot form a basis spanning 3-space. (2012-04-19)
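The "crushing" intuition in the last comment can be checked numerically. The sketch below uses a hypothetical example (NumPy, with independence of a finite set tested via matrix rank): projecting three independent vectors from $\Bbb R^3$ onto a plane destroys their independence.

```python
import numpy as np

# Projection R^3 -> R^2 onto the xy-plane: a non-injective linear map.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Columns v1=(1,0,0), v2=(0,1,0), v3=(1,1,1): linearly independent in R^3.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]]).T

# A set of column vectors is independent iff the rank equals the column count.
print(np.linalg.matrix_rank(X))      # 3: independent in R^3
print(np.linalg.matrix_rank(P @ X))  # 2: three vectors in R^2 must be dependent
```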

2 Answers


I understand that this is a homework question. However, based on your previous question, I am willing to give a little extra help to get the ball rolling on the basic ideas in these problems. I will help you out with $(a)$, because $(b)$ follows almost immediately from the definition of $T$ being an isomorphism (in particular, that it is surjective).

For $(a)$, you have an implication $p \implies q$, where $p$ and $q$ are respectively the statements

$p$: $X$ is linearly independent

$q$: $T(X)$ is linearly independent.

If you don't know about the contrapositive yet, it is a powerful way of proving statements. In your context it says that proving $p \implies q$ is equivalent to proving $\neg q \implies \neg p$, where "$\neg$" denotes logical negation.

So suppose we know $\neg q$. That is, we assume that the vectors in the collection $T(X)$ are linearly dependent. Now you will fill in the following details:

1) What does it mean for the vectors in $T(X)$ to be linearly dependent? You just need to apply the definition of linear dependence.

After you have written that down, use the linearity of $T$ to deduce that $(\ldots \ldots)$. You can fill in $(\ldots \ldots)$ here by looking at question 2) below:

2) Is there a vector in the kernel of $T$ now? Proceed to 3) below.

3) Using the fact that $T$ is an isomorphism, can you conclude from here that $X$ is linearly dependent? Remember, we started off with $\neg q$ and we want to prove $\neg p$, where $\neg p$ is the statement "$X$ is linearly dependent."

$\textbf{Edit:}$ Here is my response to your solution below: You cannot start by saying "suppose that $a_1 = a_2 = \ldots = a_n = 0$." This is nonsensical. In fact, the only thing your proof shows is that $T$ maps zero to zero: since you started off by assuming that all the $a_i$'s are zero, $T$ applied to any linear combination with these coefficients is zero anyway. Do you see why this does not lead to a proof of what you want to prove?

What you need to do is this: suppose we have a linear combination of the $T(v_i)$'s that equals zero. That is, there are scalars $a_1, a_2, a_3, \ldots, a_n$ such that

$$a_1 T(v_1) + \ldots + a_n T(v_n) = 0.$$

Remember: you want to conclude from here, using information about $X$, that this linear combination can only be the trivial one. Now, by linearity of $T$, this means that the vector $a_1v_1 + \ldots + a_nv_n$ lies in the kernel of $T$. But $T$ is an isomorphism, so in particular it is injective and its kernel is $\{0\}$; hence $a_1v_1 + \ldots + a_nv_n = 0$.

Now, because $X$ is linearly independent, this forces us to conclude that $a_1 = a_2 = \ldots = a_n = 0$.

Do you see how to prove it correctly now?
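If a numerical sanity check helps, here is a sketch of part $(a)$ with a hypothetical concrete isomorphism (an invertible matrix, using NumPy); independence of a finite set of column vectors is checked via matrix rank.

```python
import numpy as np

# A hypothetical invertible matrix T on R^3: invertible <=> an isomorphism.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
assert np.linalg.det(T) != 0  # det = 5, so T is invertible

# Columns v1, v2: a linearly independent set in R^3.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]]).T

TX = T @ X  # columns T(v1), T(v2)

# Rank equals the number of columns iff the columns are independent.
print(np.linalg.matrix_rank(X))   # 2
print(np.linalg.matrix_rank(TX))  # 2: T preserved independence
```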

  • How can I post my solution? (2012-04-19)
  • @yoodaniel That is also possible, but then again there can be many ways to solve a problem! Post your solution as an answer and I will check it. (2012-04-19)
  • Sorry, I can't post an answer yet because of the reputation requirement, but here is my answer: $a_{1},...,a_{n}$ = 0. So $a_{1}v_{1}+a_{2}v_{2}+...+a_{n}v_{n}$ = 0. Then, $a_{1}v_{1}+a_{2}v_{2}+...+a_{n}v_{n}$ is an element in $\{0\}$. So, $a_{1}v_{1}+a_{2}v_{2}+...+a_{n}v_{n}$ is in $\operatorname{Ker}(T)$. $T(a_{1}v_{1}+a_{2}v_{2}+...+a_{n}v_{n})$ = 0. So $a_{1}T(v_{1})+a_{2}T(v_{2})+...+a_{n}T(v_{n}) = 0$. Thus $T(X)$ is a linearly independent set. (2012-04-19)
  • Maybe I need to start the opposite way... (2012-04-19)
  • @yoodaniel Please see my response to your proof above. (2012-04-19)

To expand on Martin's Hint:

Try to prove the following:

Let $V$ and $W$ be vector spaces over a field $\Bbb F$, and let $T$ be a linear map from $V$ to $W$.

  1. The following are equivalent:

    • $T$ is injective.
    • $T$ takes each linearly independent set in $V$ to a linearly independent set in $W$.
  2. The following are equivalent:

    • $T$ is surjective.
    • $T$ takes each spanning set of $V$ to a spanning set of $W$.
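A numerical illustration of statement 2, under a hypothetical example: a surjective (here, invertible) linear map sends a spanning set of $V$ to a spanning set of $W$. The standard basis spans $\Bbb R^3$, so its image under an invertible $T$ should again have full rank 3.

```python
import numpy as np

# A hypothetical invertible matrix T on R^3; invertible => surjective.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
assert np.linalg.det(T) != 0  # det = 3, so T is invertible

X = np.eye(3)  # columns e1, e2, e3 span R^3
TX = T @ X     # columns T(e1), T(e2), T(e3)

# Three columns span R^3 iff the matrix has rank 3.
print(np.linalg.matrix_rank(TX))  # 3: T(X) spans R^3
```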
  • OK, I am on it... (2012-04-19)
  • I'd suggest rewriting, something like, "$T$ takes *each* linearly independent set...," lest someone think it's enough to show that $T$ takes *some* linearly independent set to a linearly independent set. (2012-04-19)
  • @GerryMyerson I agree. I will make it clear. (2012-04-19)