You've shown that your map can be represented by a matrix and then shown that it must be linear based on the properties of matrix multiplication.
In fact, most mathematicians would view the problem the other way round: matrices are introduced as a neat way of encoding a linear map. So while it is certainly true that any map that can be represented by a matrix $A$ is linear, it is also true that any linear map can be represented by a matrix.
To see why this is, consider your example, where you are dealing with the map which takes $\left(\begin{matrix} x_1 \\ x_2 \end{matrix} \right)$ to $\left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1 - 4x_2 \end{matrix} \right)$. In mathematics, we say that this is an example of a map from $\mathbb{R}^2$ (vectors with two components) to $\mathbb{R}^3$ (vectors with three components). I will now show that every linear map from $\mathbb{R}^2$ to $\mathbb{R}^3$ can be represented by a matrix in this way. The general case of maps from $\mathbb{R}^m$ to $\mathbb{R}^n$ is treated in exactly the same way.
First note that we can write any vector $a=\left(\begin{matrix} a_1 \\ a_2 \end{matrix} \right)$ in $\mathbb{R}^2$ as $a_1\left( \begin{matrix} 1 \\ 0 \end{matrix} \right)+a_2\left( \begin{matrix} 0 \\ 1 \end{matrix} \right)=a_1e_1+a_2e_2$ and any vector $b= \left( \begin{matrix} b_1 \\ b_2 \\ b_3 \end{matrix} \right)$ in $\mathbb{R}^3$ as $b_1\left( \begin{matrix} 1 \\ 0 \\ 0 \end{matrix} \right) + b_2\left( \begin{matrix} 0 \\ 1 \\ 0 \end{matrix} \right) + b_3\left( \begin{matrix} 0 \\ 0 \\ 1 \end{matrix} \right)=b_1e_1+b_2e_2+b_3e_3$ (I use $e_i$ for the standard basis vectors in both spaces). Now let $\alpha$ be a linear map from $\mathbb{R}^2$ to $\mathbb{R}^3$. Then, if $a\in \mathbb{R}^2$, $\alpha(a)=\alpha(a_1e_1+a_2e_2)=a_1\alpha(e_1)+a_2\alpha(e_2)$.
Now, since for $i=1,2$, $\alpha(e_i) \in \mathbb{R}^3$, we can write $\alpha(e_i)=m_{1i}e_1+m_{2i}e_2+m_{3i}e_3$, where the $m_{ij}$ are the components of the images of the vectors $e_i$ under $\alpha$. Writing the $m_{ij}$ as a matrix $M=\left( \begin{matrix} m_{11} & m_{12} \\ m_{21} & m_{22} \\ m_{31} & m_{32} \end{matrix} \right)$,
it is quick to check that applying $\alpha$ to the vector $a=\left( \begin{matrix} a_1 \\ a_2 \end{matrix}\right)$ is exactly equivalent to multiplying $a$ by the matrix $M$.
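To make this concrete, here is your example worked through: the columns of $M$ are just the images of $e_1$ and $e_2$ under the map,

$$\alpha(e_1)=\left(\begin{matrix} 0 \\ 1 \\ 3 \end{matrix}\right), \quad \alpha(e_2)=\left(\begin{matrix} -1 \\ 2 \\ -4 \end{matrix}\right), \quad\text{so}\quad M=\left(\begin{matrix} 0 & -1 \\ 1 & 2 \\ 3 & -4 \end{matrix}\right),$$

and indeed $M\left(\begin{matrix} x_1 \\ x_2 \end{matrix}\right)=\left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1-4x_2 \end{matrix}\right)$, which is exactly the map you started with.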
Your question asks the converse: given a matrix $M$, does multiplication by $M$ always define a linear map? Your calculations (which are perfectly correct) show that the answer is yes, although it's such a basic result that I think showing that the map can be represented by a matrix is probably enough for that question.
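If you like, you can also reassure yourself numerically (this is not a proof, just a sanity check; it uses NumPy, and the matrix is the one from your example):

```python
import numpy as np

# Matrix of the example map (x1, x2) -> (-x2, x1 + 2*x2, 3*x1 - 4*x2):
# its columns are the images of the standard basis vectors e1 and e2.
M = np.array([[0, -1],
              [1,  2],
              [3, -4]])

rng = np.random.default_rng(0)
x, y = rng.random(2), rng.random(2)  # two arbitrary vectors in R^2
c, d = 2.0, -3.0                     # arbitrary scalars

# Linearity of multiplication by M: M(c*x + d*y) = c*(M x) + d*(M y).
lhs = M @ (c * x + d * y)
rhs = c * (M @ x) + d * (M @ y)
assert np.allclose(lhs, rhs)

# M applied to e_i recovers the i-th column, i.e. the image of e_i.
assert np.allclose(M @ np.array([1, 0]), [0, 1, 3])
assert np.allclose(M @ np.array([0, 1]), [-1, 2, -4])
```

Of course, a numerical check on a few random vectors is no substitute for the two-line algebraic argument, but it can catch a mis-copied matrix.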
Moral: for finite-dimensional vector spaces over the real numbers, linear maps are matrices are linear maps. Finding a matrix representation is a great way to show that a map is linear.
I'm not sure what was intended for this question, but you could show that the map is linear directly (without introducing matrices). That might be a slightly nicer approach, as it doesn't assume properties of matrix multiplication which are non-obvious (though trivial to prove).
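For example, writing $\alpha$ for your map, additivity can be checked directly on components:

$$\alpha\left(\begin{matrix} x_1+y_1 \\ x_2+y_2 \end{matrix}\right)=\left(\begin{matrix} -(x_2+y_2) \\ (x_1+y_1)+2(x_2+y_2) \\ 3(x_1+y_1)-4(x_2+y_2) \end{matrix}\right)=\left(\begin{matrix} -x_2 \\ x_1+2x_2 \\ 3x_1-4x_2 \end{matrix}\right)+\left(\begin{matrix} -y_2 \\ y_1+2y_2 \\ 3y_1-4y_2 \end{matrix}\right)=\alpha\left(\begin{matrix} x_1 \\ x_2 \end{matrix}\right)+\alpha\left(\begin{matrix} y_1 \\ y_2 \end{matrix}\right),$$

and homogeneity, $\alpha(cx)=c\,\alpha(x)$, follows by the same sort of componentwise computation.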