3

Given $$ F(x) = \left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1 - 4x_2 \end{matrix} \right), \qquad x = \left(\begin{matrix} x_1 \\ x_2 \end{matrix} \right),$$ prove or disprove that $F$ is a linear function.

I've tried to prove it, but I'm not sure it's right:

$$F(x) = Mx \Rightarrow M = \left(\begin{matrix} 0 & -1 \\ 1 & 2 \\ 3 & -4 \end{matrix}\right)$$

Let $u,v \in \mathbb{R}^2$. Then:

$$F(u+v) = M(u+v) = Mu+Mv = F(u)+F(v)$$

and

$$F(\lambda u) = M(\lambda u) = \lambda Mu = \lambda F(u)$$

so the function $F$ is linear.

Using the matrix $M$ was not shown in any of the examples I've seen; I came up with the technique myself, and I think it's nicer (though almost certainly I've reinvented the wheel). What's the technical term for $M$?

I suspect there is a missing step in going from the given function to the form $F(x)=Mx$; $M$ looks like the coefficient vectors arranged column-wise.

Is the proof formally correct?

  • 1
    $M$ is the matrix that *represents* your linear map $F:\mathbb{R}^2\to\mathbb{R}^3$. (2012-11-12)
  • 0
    @wj32 How do I write it formally? (2012-11-12)
  • 0
    How "formal" do you need it to be? I think it would be sufficient for you to state that $F(x)=Mx$ and then conclude that $F$ is linear from the properties of matrix multiplication.2012-11-12
  • 0
    I think if I could see it more formally, I could get more insight and a better feel for what I've learned so far. (2012-11-12)
  • 1
    I don't have time to type up a full answer, but here's a short explanation. Given bases for your finite-dimensional spaces $\mathbb{R}^2$ and $\mathbb{R}^3$, there is a unique matrix that represents each linear map $f:\mathbb{R}^2\to\mathbb{R}^3$. So if you're given a linear map $f$ then you can always write $f(x)=Mx$ for some matrix $M$, and vice versa. Here you've chosen to use the "standard basis" for each of the spaces. (2012-11-12)
  • 0
    @wj32 Hm, but the problem is, I don't know if $F$ is linear or not, that's what I have to prove. Can I still use it and see in the end if I come to a contradiction? (2012-11-12)
  • 0
    If you have not seen this method in your course: presumably you are supposed to check directly from the definition of "linear" this time. Then, in a day or two, you will learn the matrix method of doing it, and you will appreciate it more at that time. (2012-11-12)

2 Answers

2

The main point is that there is a matrix $M$ such that $F(x)=M\cdot x$ for all vectors $x$. In fact, for a mapping $F$ between finite-dimensional vector spaces with fixed bases, this is equivalent to being linear. That multiplying by a matrix is indeed a linear mapping is then straightforward: it relies only on the nice properties of matrix multiplication, and you proved it formally correctly.

So what should be emphasized is that indeed $F(x)=M\cdot x$ for all $x$; this can easily be seen by expanding the product on the right-hand side.
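
For instance, expanding the product for the $M$ above gives $$M\cdot x = \left(\begin{matrix} 0 & -1 \\ 1 & 2 \\ 3 & -4 \end{matrix}\right) \left(\begin{matrix} x_1 \\ x_2 \end{matrix}\right) = \left(\begin{matrix} 0\cdot x_1 - 1\cdot x_2 \\ 1\cdot x_1 + 2\cdot x_2 \\ 3\cdot x_1 - 4\cdot x_2 \end{matrix}\right) = \left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1-4x_2 \end{matrix}\right) = F(x).$$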

$M$ is called the matrix belonging to the linear map $F$, with respect to the standard bases.

Let $V$ and $W$ be vector spaces, with basis $\mathcal B:=(b_1,\ldots,b_n)$ in $V$ and basis $\mathcal C:=(c_1,\ldots,c_m)$ in $W$. If $F:V\to W$ is a linear mapping, then define its matrix with respect to the bases $\mathcal B$ and $\mathcal C$ as $$[F]^{\mathcal B}_{\mathcal C} := \left[ [F(b_1)]_{\mathcal C}\ \ldots\ [F(b_n)]_{\mathcal C} \right], $$ where $[w]_{\mathcal C}$ denotes the column vector $\in \Bbb R^m$ of coordinates of $w$ in basis $\mathcal C$, i.e. $$[w]_{\mathcal C}=\pmatrix{\alpha_1\\ \vdots \\ \alpha_m} \iff w=\alpha_1 c_1+\ldots+\alpha_m c_m. $$ Then, by linearity, one can check that for all $v\in V$ we have $$[F]^{\mathcal B}_{\mathcal C}\cdot [v]_{\mathcal B} = [F(v)]_{\mathcal C}\ . $$ Prove it first for the basis vectors $v=b_i$; a sketch of that step is below.
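
A sketch of that first step: $[b_i]_{\mathcal B}=e_i$, the $i$-th standard basis vector, and multiplying a matrix by $e_i$ picks out its $i$-th column, which here is $[F(b_i)]_{\mathcal C}$ by construction, so $$[F]^{\mathcal B}_{\mathcal C}\cdot [b_i]_{\mathcal B} = [F]^{\mathcal B}_{\mathcal C}\cdot e_i = [F(b_i)]_{\mathcal C}.$$ The general case then follows by writing $v=\alpha_1 b_1+\ldots+\alpha_n b_n$ and using the linearity of $F$ and of the coordinate map $w\mapsto [w]_{\mathcal C}$.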

  • 0
    +1. Can you elaborate just a tiny little bit on the parts regarding "bases", "fixed bases" (**never heard this one**) and "standard bases"? I'm not sure I've understood the concepts well enough from my readings to feel free to "play" with them like that (and my learning material is in German). (2012-11-12)
  • 0
    For instance, I've found $F(x) = \left(\begin{matrix} x_2 \\ x_1^2 \end{matrix} \right)$ as another example; I think there's no such matrix $M$, and $F$ doesn't look linear to me either. How would I prove it in that case? By contradiction? (2012-11-12)
  • 0
    To show that it is not linear, exhibit explicit numbers $x_1$, $x_2$ and $\lambda$ such that $F(\lambda x)\ne \lambda F(x)$, or a counterexample to $F(x)+F(y)=F(x+y)$. Try to find the simplest counterexample (one such example is sketched after these comments). (2012-11-12)
  • 0
    Thanks, this has cleared my mind :-) (2012-11-12)
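
For the map $F(x) = \left(\begin{matrix} x_2 \\ x_1^2 \end{matrix} \right)$ from the comments above, one minimal counterexample (a sketch; many choices work): take $u = \left(\begin{matrix} 1 \\ 0 \end{matrix}\right)$ and $\lambda = 2$. Then $$F(2u)=\left(\begin{matrix} 0 \\ 4 \end{matrix}\right) \ne \left(\begin{matrix} 0 \\ 2 \end{matrix}\right) = 2\,F(u),$$ so $F$ fails homogeneity and is not linear.
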
1

You've shown that your map can be represented by a matrix and then shown that it must be linear based on the properties of matrix multiplication.

In fact, most mathematicians would view the problem the other way round: matrices are introduced as a neat way of encoding a linear map. So while it is certainly true that any map that can be represented by a matrix $A$ is linear, it is also true that any linear map can be represented by a matrix.

To see why this is, consider your example, where you are dealing with the map which takes $\left(\begin{matrix} x_1 \\ x_2 \end{matrix} \right)$ to $\left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1 - 4x_2 \end{matrix} \right)$. In mathematics, we say that this is an example of a map from $\mathbb{R}^2$ (vectors with two components) to $\mathbb{R}^3$ (vectors with three components). I will now show that every linear map from $\mathbb{R}^2$ to $\mathbb{R}^3$ can be represented by a matrix in this way. The general case for maps from $\mathbb{R}^m$ to $\mathbb{R}^n$ is treated in exactly the same way.

First note that we can write any vector $a=\left(\begin{matrix} a_1 \\ a_2 \end{matrix} \right)$ in $\mathbb{R}^2$ as $$a_1\left( \begin{matrix} 1 \\ 0 \end{matrix} \right)+a_2\left( \begin{matrix} 0 \\ 1 \end{matrix} \right)=a_1e_1+a_2e_2$$ and any vector $b= \left( \begin{matrix} b_1 \\ b_2 \\ b_3 \end{matrix} \right)$ in $\mathbb{R}^3$ as $$b_1\left( \begin{matrix} 1 \\ 0 \\ 0 \end{matrix} \right) + b_2\left( \begin{matrix} 0 \\ 1 \\ 0 \end{matrix} \right) + b_3\left( \begin{matrix} 0 \\ 0 \\ 1 \end{matrix} \right)=b_1e_1+b_2e_2+b_3e_3$$. Now let $\alpha$ be a linear map from $\mathbb{R}^2$ to $\mathbb{R}^3$. Then, if $a\in \mathbb{R}^2$, $$\alpha(a)=\alpha(a_1e_1+a_2e_2)=a_1\alpha(e_1)+a_2\alpha(e_2)$$

Now, since for $i=1,2$ we have $\alpha(e_i) \in \mathbb{R}^3$, we can write $$\alpha(e_i)=m_{1i}e_1+m_{2i}e_2+m_{3i}e_3$$ where the $m_{ij}$ are the components of the images of the vectors $e_i$ under $\alpha$. Writing the $m_{ij}$ as a matrix $$M=\left( \begin{matrix} m_{11} & m_{12} \\ m_{21} & m_{22} \\ m_{31} & m_{32} \end{matrix} \right)$$

it is quick to check that applying $\alpha$ to the vector $a=\left( \begin{matrix} a_1 \\ a_2 \end{matrix}\right)$ is exactly equivalent to multiplying by the matrix $M$.
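
To make this concrete with your example: here $\alpha=F$, and $$F(e_1)=\left(\begin{matrix} 0 \\ 1 \\ 3 \end{matrix}\right), \qquad F(e_2)=\left(\begin{matrix} -1 \\ 2 \\ -4 \end{matrix}\right),$$ so placing these images as columns recovers exactly the matrix $M=\left(\begin{matrix} 0 & -1 \\ 1 & 2 \\ 3 & -4 \end{matrix}\right)$ from your question.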

Your question asks the opposite: given a matrix $M$, does multiplication by $M$ always define a linear map? Your calculations (which are perfectly precise) show that the answer is yes, although it's such a basic result that I think showing that the map can be represented by a matrix is probably enough for that question.

Moral: for vector spaces over the real numbers, linear maps are matrices are linear maps. Finding a matrix representation is a great way to show that a map is linear.

I'm not sure what was intended for this question, but you could show that the map is linear directly (without inventing matrices). That might be a slightly nicer example, as it doesn't assume properties of matrix multiplication which are non-obvious (though trivial to prove).
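
For completeness, a sketch of that direct check, using nothing but the definition: for $u,v\in\mathbb{R}^2$ and $\lambda\in\mathbb{R}$, $$F(\lambda u+v)=\left(\begin{matrix} -(\lambda u_2+v_2) \\ (\lambda u_1+v_1)+2(\lambda u_2+v_2) \\ 3(\lambda u_1+v_1)-4(\lambda u_2+v_2) \end{matrix}\right) =\lambda\left(\begin{matrix} -u_2 \\ u_1+2u_2 \\ 3u_1-4u_2 \end{matrix}\right) +\left(\begin{matrix} -v_2 \\ v_1+2v_2 \\ 3v_1-4v_2 \end{matrix}\right) =\lambda F(u)+F(v),$$ which checks additivity and homogeneity in one go.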