
I'm trying to study Linear Algebra :) In my textbook, the author said $\begin{align} u_1=\begin{pmatrix}1 & 0 & 1 & 0 & -1\end{pmatrix} \end{align}$ $\begin{align} x_1=\begin{pmatrix}-1 \\ -2 \\ 1 \\ 0 \\ 0\end{pmatrix} \end{align}$ $\begin{align} u_1 \cdot x_1 = 0 \end{align}$

However, I can't understand how I can take the dot product of $u_1$ and $x_1$.

$u_1$ is written horizontally (a row vector), while $x_1$ is written vertically (a column vector). Doesn't this violate the definition of the dot product?


2 Answers


You are right: it's not the regular dot product here, but the usual matrix multiplication.

The dot product, also known as an inner product, is a function $\mathcal{V}\times\mathcal{V}\longrightarrow\mathbb{R}$. Here you have $u_1\in\mathbb{R}^{1\times5}$ but $x_1\in\mathbb{R}^{5\times1}$, so they aren't really in the same vector space, which is why speaking of an inner product is a bit of a stretch.

But the matrix product $u_1\cdot x_1$ will give you the same value as the dot product $\langle u_1^T, x_1\rangle$. The main difference here is that the dot product is commutative, so $\langle u_1^T, x_1\rangle = \langle x_1, u_1^T\rangle$, but the matrix product is not!
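A quick check of this in NumPy (my own sketch, not from the original post), using the vectors from the question:

```python
import numpy as np

# u1 is a 1x5 row vector, x1 is a 5x1 column vector, as in the question.
u1 = np.array([[1, 0, 1, 0, -1]])           # shape (1, 5)
x1 = np.array([[-1], [-2], [1], [0], [0]])  # shape (5, 1)

# Matrix product (1x5)·(5x1) -> a 1x1 matrix containing 0.
print(u1 @ x1)                              # [[0]]

# Same value as the dot product of the underlying 5-vectors.
print(np.dot(u1.ravel(), x1.ravel()))       # 0

# Reversing the order gives a 5x5 matrix (an outer product),
# so matrix multiplication is not commutative.
print((x1 @ u1).shape)                      # (5, 5)
```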


Several kinds of 'products' can be defined among vectors and matrices.

The dot product for vectors is sometimes also denoted by $\langle x,y\rangle$, and indeed it is defined primarily when $x$ and $y$ come from the 'same space'; usually column vectors are preferred in usage. There is also the notion of the transpose $M^T$ of a matrix $M$, which exchanges the indices and thus turns a row vector into a column vector and vice versa.

Now, if $\bf x$ and $\bf y$ are both column vectors, with coordinates $(x_i)_i$ and $(y_i)_i$ then their dot product is $\langle {\bf x},{\bf y}\rangle := x_1y_1+x_2y_2+x_3y_3+\ldots$ And, if we have a matrix $A$ with rows $\pmatrix{{\bf a_1}\\{\bf a_2}\\{\bf a_3}\\ \vdots}$ and a matrix $B$ with columns $\pmatrix{{\bf b_1}&{\bf b_2}&{\bf b_3}&\ldots}$, then their matrix product is (can be) defined as the matrix $A\cdot B:= (\langle {\bf a_i}^T,{\bf b_j}\rangle )_{i,j}\ .$
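The entrywise description above can be sketched in NumPy (a small illustration of my own, with an arbitrary choice of $A$ and $B$): the $(i,j)$ entry of $A\cdot B$ is the dot product of row $i$ of $A$ with column $j$ of $B$.

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3x2 matrix
B = np.array([[7, 8, 9], [10, 11, 12]])  # 2x3 matrix

# Build A·B entry by entry: dot product of row i of A with column j of B.
entrywise = np.array([[np.dot(A[i, :], B[:, j])
                       for j in range(B.shape[1])]
                      for i in range(A.shape[0])])

# It agrees with the built-in matrix product.
print(np.array_equal(entrywise, A @ B))  # True
```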