
Prove that any subspace of a vector space $V$ is the null space of some linear transformation $V \rightarrow V$.

So far I have:
Let $W$ be a subspace of $V$, and let $(e_1, e_2, \ldots, e_r)$ be the basis of $W$, where $r \leq \dim(V)$.
It seems I need to find a linear transformation $T: V \rightarrow V$ such that:
$T(s) = 0$ if $s \in W$,
$T(s) \neq 0$ if $s \notin W$.
So $T(x) = 0$ if $x$ is a linear combination of $(e_1, e_2, \ldots, e_r)$, and $T(x) \neq 0$ if it is not.
How do I construct the matrix of this linear transformation?

  • I wouldn't say "the" basis: there are lots of possible choices! Now, do you know that you can extend this basis to a basis $\{e_1, \ldots, e_r, f_{r + 1}, \ldots, f_{\dim V}\}$ for $V$? (2012-05-06)
  • How do you feel about projections? (2012-05-06)

1 Answer


You are doing well: first find a basis $(e_1,\ldots,e_r)$ for $W$.

Then, remember that every linearly independent subset of $V$ can be extended to a basis. Since $(e_1,\ldots,e_r)$ is linearly independent, we can extend it to a basis: $(e_1,\ldots,e_r,f_1,\ldots,f_{n-r})$, where $n=\dim(V)$, $r=\dim(W)$, $r\leq n$.
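
For instance (a concrete illustration only, not part of the general argument): if $V = \mathbb{R}^3$ and $W$ is the plane spanned by $e_1 = (1,0,0)$ and $e_2 = (0,1,0)$, then $f_1 = (0,0,1)$ extends this to a basis of $V$.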

Then remember that a linear transformation can be specified by saying what it does to a basis: proceeding as you proposed, send each of $e_1,\ldots,e_r$ to $0$.

How do we ensure nothing else is mapped to $0$? How about mapping each $f_i$ to itself?
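
Explicitly, the map being proposed is determined on the basis by $$T(e_i) = 0 \quad (1 \leq i \leq r), \qquad T(f_j) = f_j \quad (1 \leq j \leq n-r),$$ extended linearly to all of $V$.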

Suppose $a_1e_1+\cdots+a_re_r + b_1f_1+\cdots+b_{n-r}f_{n-r}$ is mapped to $0$. That means that $$\begin{align*} 0 &= T(a_1e_1+\cdots+a_re_r+b_1f_1+\cdots+b_{n-r}f_{n-r})\\ &= a_1T(e_1)+\cdots+a_rT(e_r) + b_1T(f_1) + \cdots + b_{n-r}T(f_{n-r})\\ &= a_10 + \cdots +a_r0 + b_1f_1 + \cdots + b_{n-r}f_{n-r}\\ &= b_1f_1+\cdots+b_{n-r}f_{n-r}. \end{align*}$$ What can we conclude about $b_1,\ldots,b_{n-r}$?

Now, it is easy to find the matrix of $T$ with respect to the basis $(e_1,\ldots,e_r,f_{1},\ldots,f_{n-r})$. What is it? Express the image of each basis vector in terms of the basis vectors, and those are the columns of the matrix of $T$.
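
(If you want to check your answer: the columns described above assemble into the block diagonal matrix $$[T]_{\mathcal{B}} = \begin{pmatrix} 0_{r \times r} & 0 \\ 0 & I_{n-r} \end{pmatrix},$$ that is, a diagonal matrix with $r$ zeros followed by $n-r$ ones on the diagonal.)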

  • For $0 = b_1f_1 + \cdots + b_{n-r}f_{n-r}$ to hold, $b_1, \ldots, b_{n-r}$ all have to be $0$, since $f_1, \ldots, f_{n-r}$ are linearly independent. This means that $T(x) = 0$ only if $x \in W$. But before that, I don't understand why $T(e_i) = 0$ for $1 \leq i \leq r$. Why does it have to be so? (2012-05-06)
  • Actually, I got that part; I'm looking for a transformation where $T(e_i) = 0$ and $T(f_j) = f_j$. (2012-05-06)
  • OK, so let $B$ be the change-of-basis matrix from the basis $(e_1,\ldots,e_r,f_1,\ldots,f_{n-r})$ to the standard basis. Let $A$ be the matrix of the linear transformation $T$ with respect to the standard basis. The matrix I'm looking for is $C$, the matrix of $T$ with respect to the basis $(e_1,\ldots,e_r,f_1,\ldots,f_{n-r})$. $C = B^{-1}A$. Is this correct? (2012-05-06)
  • The matrix with respect to the basis $(e_1,\ldots,e_r,f_1,\ldots,f_{n-r})$ is very easy; it's a diagonal matrix that has $0$'s in the first $r$ positions (corresponding to the fact that the $e_i$ map to $0$) and $1$'s in the last $n-r$ positions (corresponding to the fact that the $f_i$ map to themselves). If you want to find the matrix of this transformation with respect to the standard basis (which would depend on what $W$ actually is), you *would* use $B$, but the matrix would be $BCB^{-1}=A$, not what you wrote. (See the numerical sketch after this thread.) (2012-05-06)
  • OK, got it. But I don't really need $A$ for this proof, right? Can I just say that, given $[v]_B \in V$, $T([v]_B) = 0$ only if $[v]_B \in W$? (2012-05-06)
  • @user1376993: As I understand your question, you are asked for a *linear transformation*, not for a matrix. The linear transformation $T$ determined by its action on the basis $(e_1,\ldots,e_r,f_1,\ldots,f_{n-r})$ is a perfectly good linear transformation. But if in your class, by "linear transformation" you always understand "a matrix relative to the standard basis", then you would need $A$. But **no**, it's not $[v]_B$ that needs to be in $W$; it's $v$ itself that needs to be in $W$. Don't confuse the coordinate vector of $v$ with $v$. $T$ is defined in terms of vectors, not coordinates. (2012-05-06)
  • Yes, I only need a linear transformation. I used matrices because I know that multiplying a vector by a matrix is always a linear transformation. If I don't use them, don't I still have to prove that $T$ actually is a *linear* transformation, i.e. that $T(x + y) = T(x) + T(y)$ and $T(cx) = cT(x)$? (2012-05-06)
  • @user1376993: Given a basis $(b_1,\ldots,b_n)$ of a vector space, and given **any** vectors $v_1,\ldots,v_n$, we define a linear transformation by saying $T(b_i) = v_i$; this **uniquely** determines a linear transformation that is defined as follows: given an arbitrary vector $v$, write it in terms of the basis, $v=\alpha_1b_1+\cdots+\alpha_nb_n$. Then $T(v)$ is $$\begin{align*}T(v) &= T(\alpha_1b_1+\cdots+\alpha_nb_n)\\&=\alpha_1T(b_1)+\cdots+\alpha_nT(b_n)\\&= \alpha_1 v_1+\cdots+\alpha_nv_n.\end{align*}$$ This is **always** a linear transformation, obtained by "extending linearly". (cont) (2012-05-06)
  • @user1376993: (cont.) If you haven't proven this result, then you do need to prove it, but it is a very basic result: a linear transformation is completely determined by what it does to a basis, so if you **explicitly say** what it does to the basis, you are done. You get a linear transformation from that definition. (2012-05-06)
  • Oh, OK, thanks for your help! Sorry for taking so long to understand. (2012-05-06)
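
To make the change-of-basis exchange above concrete, here is a minimal numerical sketch in NumPy. The specific choices ($V = \mathbb{R}^2$, $W = \operatorname{span}\{(1,1)\}$, $e_1 = (1,1)$, $f_1 = (0,1)$) are hypothetical illustrations, not part of the proof; the point is only to verify that $A = BCB^{-1}$ kills exactly $W$:

```python
import numpy as np

# Hypothetical example: V = R^2, W = span{(1, 1)}.
# Basis of W extended to a basis of V: e1 = (1, 1), f1 = (0, 1).
e1 = np.array([1.0, 1.0])
f1 = np.array([0.0, 1.0])

# B: change-of-basis matrix whose columns are e1 and f1.
B = np.column_stack([e1, f1])

# C: matrix of T in the basis (e1, f1): T(e1) = 0, T(f1) = f1.
C = np.diag([0.0, 1.0])

# A: matrix of T in the standard basis, via A = B C B^{-1}.
A = B @ C @ np.linalg.inv(B)

print(A @ e1)  # [0. 0.]  -- e1 (in W) is sent to 0
print(A @ f1)  # [0. 1.]  -- f1 is mapped to itself
print(A @ (2 * e1 + 3 * f1))  # [0. 3.] = 3 * f1, as extending linearly predicts
```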