
Can someone give me an example illustrating physical significance of the matrix-vector multiplication?

  1. Does multiplying a vector by a matrix transform it in some way?
  2. Do left and right multiplication signify two different things?
  3. Is a matrix a scalar-like thing? (EDIT)

Thank you.

3 Answers


The physical significance depends on the matrix. The main point is that multiplication by a matrix (in the usual sense, with the matrix on the left) represents the action of a linear transformation. We'll work with a single basis throughout, which we'll use to represent both our matrices and our vectors. Note that by linearity, if we have a vector $x = c_1e_1 + \ldots + c_ne_n$ and a linear transformation $L$, then $ \begin{eqnarray*} L(x) &=& L(c_1e_1 + \ldots + c_ne_n) \\ &=& L(c_1e_1) + \ldots + L(c_ne_n) \\ &=& c_1L(e_1) + \ldots + c_nL(e_n). \end{eqnarray*} $

This means that any linear transformation is uniquely determined by its effect on a basis. So to define one, we only need to define its effect on a basis. This is the matrix

$ \left(L(e_1) \; \ldots \; L(e_n)\right) = \left( \begin{array}{ccc} a_{11} & \ldots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \ldots & a_{nn} \end{array} \right) $

where $a_{ij}$ is the $i$th component of $L(e_j)$.

Let's call this matrix $M_L$. We want to define the multiplication of $M_L$ with a vector $x$ so that $M_L \cdot x = L(x)$, and there's only one way to do this. Because the $j$th column of $M_L$ is just $L(e_j)$, and in light of our decomposition of the action of $L$ in terms of the $L(e_j)$, we can see that

$ M_L \cdot x = \left( \begin{array}{ccc} a_{11} & \ldots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \ldots & a_{nn} \end{array} \right) \cdot \left( \begin{array}{c} c_1 \\ \vdots \\ c_n \end{array} \right) $

must equal

$ c_1\left( \begin{array}{c} a_{11} \\ \vdots \\ a_{n1} \end{array} \right) + \ldots + c_n\left( \begin{array}{c} a_{1n} \\ \vdots \\ a_{nn} \end{array} \right) = \left( \begin{array}{c} c_1a_{11} + \ldots + c_na_{1n} \\ \vdots \\ c_1a_{n1} + \ldots + c_na_{nn} \end{array} \right) $

which is the standard definition of a vector multiplied on the left by a matrix.
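As a concrete sketch of this column view (the matrix and coordinates below are made-up values, standing in for $M_L$ and the $c_j$), applying the matrix to a basis vector picks out the corresponding column $L(e_j)$, and applying it to a general coordinate vector forms the linear combination of columns derived above:

```python
import numpy as np

# A made-up 2x2 matrix standing in for M_L.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Applying M to the basis vector e_1 returns the first column, L(e_1).
e1 = np.array([1.0, 0.0])
assert np.allclose(M @ e1, M[:, 0])

# For a general x with coordinates c_1, c_2, M @ x is the linear
# combination c_1 * (first column) + c_2 * (second column).
c = np.array([5.0, -2.0])
assert np.allclose(M @ c, c[0] * M[:, 0] + c[1] * M[:, 1])
```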

EDIT: In response to the question "Is a matrix a scalar thing": kind of, but no.

If you consider the most basic linear equation in one variable, $y = mx$, where everything in sight is a scalar, then a matrix generalizes the role played by $m$ to higher dimensions and a vector generalizes the role played by $y$ and $x$ to higher dimensions. But matrices don't commute multiplicatively. So that's one big thing that's different. But they're strikingly similar in a lot of ways. We can define the function of matrices $f(A) = A^2$ and we can differentiate it with respect to $A$. When we do this in one variable with the map $f(x) = x^2$, we get the linear map $f_x'(h) = 2xh$, but when we do it with matrices, we get the linear map $f_A'(H) = AH + HA$. If matrices commuted, then that would just be $2AH$!
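This derivative formula can be checked numerically (a sketch with random matrices; `t` is a small step size chosen for illustration): since $(A + tH)^2 = A^2 + t(AH + HA) + t^2H^2$, the finite difference should match $AH + HA$ up to an error of order $t$.

```python
import numpy as np

# Numerical check that the derivative of f(A) = A^2 in direction H
# is AH + HA, i.e. (f(A + tH) - f(A)) / t ≈ AH + HA for small t.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
H = rng.standard_normal((3, 3))
t = 1e-6

finite_diff = ((A + t * H) @ (A + t * H) - A @ A) / t
derivative = A @ H + H @ A

# The leftover error is t * H^2, which is tiny for small t.
assert np.max(np.abs(finite_diff - derivative)) < 1e-4
```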

EDIT2:

My "add comment" button isn't working for some reason. The $e_j$'s are a basis, $e_1, \ldots, e_n$. I think the best thing to do would be to wait for your teacher to get around to it. I sometimes forget that people introduce matrices before vector spaces and linear transformations. It will all make much more sense then. The main point of a basis though is that it's a set of vectors so that every vector in the given space can be written as a unique linear combination of them.

  • thanks. How about that Gilbert Strang book? MIT prof., I guess...? (2019-03-25)

I believe that parts 2 and 3 of your question have been answered well. I'd like to take a stab at part 1, though the other answers to this part are probably better.

There's an interesting way of thinking about the application of a matrix to a vector using the Singular Value Decomposition (SVD) of a matrix. Let $A$ be an $m \times n$ rectangular matrix. Then the SVD of $A$ is given by $A = U \Sigma V^T$, where $U$ is an $m \times m$ unitary matrix, $\Sigma$ is an $m \times n$ diagonal matrix of so-called singular values, and $V$ is an $n \times n$ unitary matrix. For more on the SVD, check out the Wikipedia article: http://en.wikipedia.org/wiki/Singular_value_decomposition

That same article contains a proof that every matrix has an SVD. Given that fact, we can now think of matrix-vector multiplication in terms of the SVD. Let $\bf x$ be a vector of length $n$. We can write the matrix-vector multiplication as ${\bf b} = A {\bf x}$. But $A{\bf x} = U\Sigma V^T {\bf x}$.

Since $V$ is unitary, multiplying by $V^T$ does not change the magnitude of ${\bf x}$. A unitary matrix applied to a vector only changes the direction of the vector (rotates it by some angle, possibly with a reflection). So the product $V^T {\bf x}$ rotates ${\bf x}$.

$\Sigma$ is a diagonal matrix. Its entries directly multiply the corresponding entries of the rotated vector, thus scaling it (stretching or shrinking it) along the axes into which $V^T$ rotated ${\bf x}$. However, remember that the rotated vector $V^T{\bf x}$ is a vector of length $n$ while $\Sigma$ has dimensions $m \times n$. This means that $\Sigma$ is also embedding the vector $V^T{\bf x}$ in an $m$-dimensional space, i.e., changing the dimension of the vector. If $m=3$ and $n=2$, for example, $\Sigma$ scales the 2D vector $V^T{\bf x}$ in 2 dimensions and then "places" it in a 3D space.

Finally, we have the product of the unitary matrix $U$ with the $m$-dimensional vector $\Sigma V^T{\bf x}$. $U$ rotates that vector in the $m$-dimensional space.

Every matrix thus potentially rotates, scales, embeds, and then rotates again any vector it is applied to. When $m=n$, of course, a matrix-vector product doesn't involve any embedding: simply a rotation, a scaling, and another rotation. Like a little assembly line.
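The assembly line above can be sketched in NumPy (with a random $3 \times 2$ matrix, just for illustration): apply the SVD factors one stage at a time and check that the result matches $A{\bf x}$.

```python
import numpy as np

# Rotate-scale/embed-rotate picture of A @ x via the SVD, stage by stage.
rng = np.random.default_rng(1)
m, n = 3, 2
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A)    # full SVD: U is m x m, Vt is n x n
Sigma = np.zeros((m, n))       # rectangular "diagonal" matrix of
Sigma[:n, :n] = np.diag(s)     # singular values

rotated = Vt @ x               # rotate/reflect in R^n (norm unchanged)
scaled = Sigma @ rotated       # scale, and embed into R^m
result = U @ scaled            # rotate/reflect in R^m

assert np.allclose(result, A @ x)
```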

2

I will interpret physical significance geometrically.

  1. Does multiplying a vector by a matrix transform it in some way?

Yes. It performs a combination of rotations, reflections, scalings, and shears. (A translation is not a linear map, so matrix multiplication alone cannot produce one; that requires an affine transformation.) If you keep studying linear algebra you will learn how to write down the matrix for each of these operations.
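Two of these operations can be sketched in NumPy (the angle and scale factors below are arbitrary choices for illustration): a rotation matrix turns a vector without changing its length, and a diagonal matrix scales it along the axes.

```python
import numpy as np

# Rotation by 90 degrees and an axis-aligned scaling, acting on a 2D vector.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrix
S = np.diag([2.0, 0.5])                           # scale axes by 2 and 1/2

v = np.array([1.0, 0.0])
rotated = R @ v    # (0, 1), up to floating-point error
scaled = S @ v     # (2, 0)
```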

  2. Do left and right multiplication signify two different things?

Yes: for a column vector, only one of these is defined. If you have an $m\times n$ matrix $M$ and a vector $V\in\mathbb{R}^n$, then only the multiplication $MV$ is defined.
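A quick shape check makes this concrete (a sketch with a made-up $3 \times 2$ matrix): NumPy accepts $MV$ and produces a vector in $\mathbb{R}^m$, while $VM$ fails because the dimensions don't line up.

```python
import numpy as np

M = np.ones((3, 2))    # m = 3, n = 2
v = np.ones(2)         # vector in R^n

result = M @ v         # defined: result lives in R^3
assert result.shape == (3,)

# v @ M treats v as a row vector of length 2, which cannot multiply
# a 3 x 2 matrix on the left, so NumPy raises an error.
try:
    v @ M
    undefined = False
except ValueError:
    undefined = True
assert undefined
```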

  3. Is a matrix a scalar thing? (EDIT)

A matrix is most definitely not a scalar.

  • Yes, you can think about it like this. There are a lot of equivalent ways to think about it :). (2011-04-04)