Given a vector $v$ and an orthonormal basis $\beta = \{q_1, q_2, \dots, q_n\}$, we know we can decompose $v$ into a linear combination of $\beta$ as follows:
$$v = \langle q_1, v \rangle q_1 + \langle q_2, v \rangle q_2 + \dots + \langle q_n, v \rangle q_n.$$
My textbook then states that the following sums are equal:
$$v = \sum_{i=1}^n (q_i^*v)q_i = \sum_{i=1}^n(q_iq_i^*)v $$
My first question: is the equality above simply a consequence of the associativity of matrix multiplication, combined with moving each $q_i$ to the left of the scalar $q_i^*v$ rather than the right, as in the first sum?
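To see the two sums side by side, here is a small numerical sketch (a hypothetical 3-dimensional example using NumPy; the random basis and vector are my own, not from the textbook). It computes both $\sum_i (q_i^*v)q_i$ and $\sum_i (q_iq_i^*)v$ and checks that they agree, and that with a full orthonormal basis they reconstruct $v$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random 3x3 orthonormal basis via QR; columns are q_1, q_2, q_3.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
v = rng.standard_normal(3)

# First sum: scalar coefficients (q_i^* v) times basis vectors q_i.
lhs = sum((Q[:, i] @ v) * Q[:, i] for i in range(3))

# Second sum: rank-one matrices (q_i q_i^*) applied to v.
rhs = sum(np.outer(Q[:, i], Q[:, i]) @ v for i in range(3))

assert np.allclose(lhs, rhs)  # same vector either way
assert np.allclose(lhs, v)    # a full orthonormal basis reconstructs v
```

Each term agrees because $(q_i^*v)q_i = q_i(q_i^*v) = (q_iq_i^*)v$: the scalar $q_i^*v$ commutes with the vector $q_i$, and the regrouping is associativity.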
My second question is more about checking that my interpretation is correct:
When constructing an orthogonal projection matrix, the author shows that the SVD of an orthogonal projector has the form
$$P = Q\Sigma Q^*.$$
Looking at the reduced SVD, we then have $P = QQ^*$, where $Q$ is reduced to the columns we want to project onto.
The author then claims that $QQ^*v = \sum_{i=1}^n (q_iq_i^*)v$.
So both the left- and right-hand sides above are vectors, namely the projection of $v$ onto the span of the columns we kept in our reduced $Q$.
If our reduced $Q$ has $n-2$ columns, then the projection matrix $P = QQ^*$ produces a vector $Pv$ that is the projection of $v$ onto the subspace spanned by the $n-2$ orthonormal columns we chose.
If we instead choose $Q$ to have full rank ($n$ orthonormal columns, so $Q$ is unitary), then $QQ^*$ is the identity matrix and $Pv$ simply equals $v$.
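This interpretation can also be checked numerically. The sketch below (my own hypothetical example in $\mathbb{R}^4$, not from the textbook) drops two columns from a random orthonormal basis, forms $P = QQ^*$, and verifies that $Pv$ lies in the chosen subspace, that $P$ behaves like an orthogonal projector ($P^2 = P$, $P^* = P$), and that the full-rank case gives the identity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Full orthonormal basis of R^n; columns of Qfull are q_1, ..., q_n.
Qfull, _ = np.linalg.qr(rng.standard_normal((n, n)))
v = rng.standard_normal(n)

# Reduced Q: keep the first n-2 columns; P = QQ^* projects onto their span.
Q = Qfull[:, : n - 2]
P = Q @ Q.conj().T

Pv = P @ v
# Pv lies in span(Q): its coefficients against the dropped columns vanish.
assert np.allclose(Qfull[:, n - 2 :].T @ Pv, 0)
# P is an orthogonal projector: P^2 = P and P^* = P.
assert np.allclose(P @ P, P)
assert np.allclose(P, P.conj().T)

# Full-rank case: with all n columns, QQ^* is the identity, so Pv = v.
assert np.allclose(Qfull @ Qfull.conj().T, np.eye(n))
```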