I have to solve the following problem (exact wording):
Given a matrix $A_{m \times n} \in \mathbb{R}^{m \times n}$ with $r = \mathrm{rank}(A)$, the goal is to find the vectors $V=\{v_1,v_2,\ldots,v_n\}$ of an orthogonal basis of $\mathrm{im}(A)$ using the Gram-Schmidt algorithm.
Let $a_1,\ldots,a_n \in \mathbb{R}^m$ be the column vectors of $A$. We define $u_1,\dots,u_n \in \mathbb{R}^m$ by means of the Gram-Schmidt algorithm,
$ u_r = a_r - \sum_{s=1}^{r-1} \mathrm{proj}_{u_s}(a_r) \qquad \text{for } r = 1,\ldots,n, $
where
$ \mathrm{proj}_{u_s}(a_r) = \begin{cases} \dfrac{u_s\cdot a_r}{u_s\cdot u_s}\,u_s & \text{for } u_s \neq 0\\[2mm] 0 & \text{for } u_s = 0 \end{cases} $
a) What is the value of $u_r$ if $a_r \in \mathrm{span}(a_1,\dots,a_{r-1})$ with $r \leq n$? How can $\{v_1,\ldots,v_k\}$ be expressed in terms of $\{u_1,\ldots,u_n\}$?
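To make sure I read the recurrence correctly, here is a minimal NumPy sketch of how I would implement it, under the assumption that the dot $\cdot$ denotes the ordinary Euclidean inner product (which is exactly the part I ask about below); the function name `gram_schmidt` and the tolerance `tol` are my own choices:

```python
import numpy as np

def gram_schmidt(A, tol=1e-12):
    """Return the vectors u_1, ..., u_n, computed column by column from A."""
    m, n = A.shape
    U = np.zeros((m, n))
    for r in range(n):                 # u_r = a_r - sum_{s < r} proj_{u_s}(a_r)
        a_r = A[:, r]
        u_r = a_r.copy()
        for s in range(r):
            u_s = U[:, s]
            denom = u_s @ u_s          # u_s . u_s
            if denom > tol:            # proj is defined as 0 when u_s = 0
                u_r = u_r - (u_s @ a_r) / denom * u_s
        U[:, r] = u_r
    return U
```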
What I don't understand is how $\mathrm{proj}_{u_s}(a_r)$ is defined: as far as I understand, $u_s$ and $a_r$ are both column vectors of size $m$, so how can they be multiplied with one another?
My hunch is that $u_r$ is zero if $a_r \in \mathrm{span}(a_1,\dots,a_{r-1})$, since $a_r$ is then linearly dependent on the previous columns and can thus be expressed as a linear combination of $u_1,\dots,u_{r-1}$.
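To test this hunch numerically, I ran the sketch above on an example matrix (my own choice) whose third column is a linear combination of the first two, so that $a_3 \in \mathrm{span}(a_1, a_2)$:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [1.0, 1.0, 5.0]])        # a_3 = 2*a_1 + 3*a_2, so rank(A) = 2

U = gram_schmidt(A)                    # sketch from above
print(np.round(U, 10))                 # third column comes out as the zero vector
```

The third column of `U` does come out as (numerically) zero, which seems to support the hunch, but I would like confirmation that this is the intended reading of the definition.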