
Given a vector $v$ and an orthonormal basis $\beta = \{q_1, q_2, \ldots, q_n\}$, we know we can decompose $v$ into a linear combination of the elements of $\beta$ as follows:

$$v = \langle q_1, v \rangle q_1 + \langle q_2, v \rangle q_2 + \cdots + \langle q_n, v \rangle q_n$$

My textbook states that the following sums are equal:

$$v = \sum_{i=1}^n (q_i^*v)q_i = \sum_{i=1}^n(q_iq_i^*)v $$

My first question: Is the equality above simply a result of the associativity of matrix multiplication, together with moving $q_i$ to the left side of the scalar product instead of the right side, as in the first sum?
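To convince myself, here is a small numerical check (not from the textbook; the random basis and dimension are just for illustration): build an orthonormal basis via QR, then compare the two sums term by term.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# QR factorization of a random matrix gives orthonormal columns q_1, ..., q_n
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
v = rng.standard_normal(n)

# Sum 1: the scalar (q_i^* v) times the vector q_i
sum1 = sum((Q[:, i] @ v) * Q[:, i] for i in range(n))

# Sum 2: the outer-product matrix (q_i q_i^*) applied to v
sum2 = sum(np.outer(Q[:, i], Q[:, i]) @ v for i in range(n))

print(np.allclose(sum1, sum2))  # the two sums agree
print(np.allclose(sum1, v))     # and, with a full basis, both recover v
```

Since each term satisfies $(q_i^* v)q_i = q_i(q_i^* v) = (q_i q_i^*)v$, the sums match term by term, not just in total.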

My second question is more about making sure my interpretation is correct:

For constructing an orthogonal projection matrix, the author shows that the SVD of an orthogonal projector has the form

$$P = Q \Sigma Q^*.$$

So looking at the reduced SVD we have $P = QQ^*$, where $Q$ is reduced to the columns we want to project onto.

Then he claims that $QQ^*v = \sum_{i=1}^n (q_i q_i^*)v$.

So both the left- and right-hand sides of the above will be a vector that is the projection of $v$ onto the columns of $Q$ that we chose in our reduced version of $Q$.

If our reduced $Q$ matrix has $n-2$ columns, then our projection matrix $P = QQ^*$ will produce a vector $Pv$ that is the projection of $v$ onto the subspace spanned by the $n-2$ orthonormal columns we have chosen.

If we choose $Q$ to have full rank ($n$ orthonormal columns), then $QQ^*$ will be the identity matrix and $Pv$ will simply equal $v$.
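Here is a sketch of that interpretation in NumPy (the dimension and random data are my own choices, not from the book): drop two columns of an orthonormal $Q$, form $P = QQ^*$, and check that $P$ behaves like a projector, while the full-rank $QQ^*$ is the identity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Full set of n orthonormal columns
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
v = rng.standard_normal(n)

Qr = Q[:, :n - 2]           # keep only n-2 orthonormal columns
P = Qr @ Qr.conj().T        # projector onto span of the kept columns
Pv = P @ v

# P is idempotent: projecting a second time changes nothing
print(np.allclose(P @ Pv, Pv))
# With all n columns, QQ^* is the identity, so Pv would just be v
print(np.allclose(Q @ Q.conj().T, np.eye(n)))
```

The idempotence check ($P(Pv) = Pv$) is exactly what distinguishes a projector, and the second check confirms the full-rank case collapses to the identity.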

  • 1
Hint: if $w$ and $v$ are vectors, then $w^{*}v$ is a scalar, hence commutes with other vectors. That explains the first correspondence. For the "then he claims" -- that can't be right, for $v$ does not appear on the left-hand side. You might want to just TRY all this with a single vector like $[1\ 2]^t$ in $\Bbb R^2$; it should become quite clear once you do. (2017-02-13)
  • 0
For the 1st part would we have: $(q_i^*v)q_i = q_i(q_i^*v)$ (since, as you said, a scalar commutes with a vector). Then by assoc. we have $(q_iq_i^*)v$. @JohnHughes (2017-02-14)
  • 1
Ah... I see you've edited the "then he claims" part, making the second part of my comment irrelevant. Good. (2017-02-14)
  • 0
For the second part I've edited the line with "then he claims". I did in fact forget the $v$, thanks for that catch. I suppose this is then much more obvious, as it's clear we are just taking the sum of the outer products to represent matrix multiplication. In terms of seeing it clearly, I think I'm seeing it more clearly when considering that the sum of the first $n$ outer products equals the sum of the first $n$ projections onto an orthonormal vector $q_i$, since, as we show in part 1 for the sum: $(q_i^*v)q_i = (q_iq_i^*)v$. @JohnHughes (2017-02-14)
  • 0
When I say "seeing it more clearly", I'm referring to seeing how cutting down the full-rank orthogonal matrix $Q$ to $n-2$ columns will project a vector $v$ onto that subspace. (2017-02-14)
