
Let $\mathbf{v}_1,\ldots,\mathbf{v}_n$ be vectors in an $m$-dimensional space $V$. Taking these as the column vectors of a matrix $M$, let $ M = U\Sigma V^\ast $ be its singular value decomposition. Now, I have a problem where $n>m$ and some of the $\mathbf{v}_i$ are equal*. Is there an easy (meaning, computationally efficient – it is a numerical application) way to calculate $U\Sigma V^\ast$ from $U'\Sigma' V'^\ast$, the SVD of $M$ with duplicate columns removed, or will I just need to brute-force it? Specifically, I only need the left SVD, i.e. $U$ and $\Sigma$. It seems to me like this should be possible, but I have no idea how to approach it.


*In fact, I envision cases where most of them are duplicates, which is why I'm concerned about the performance penalty of calculating the full decomposition of $M$.


2 Answers


I don't think there is any simple relationship between the SVD of $M$ and the SVD of $M'$, but if you keep inserting new columns into $M$ and want to compute $U$ and $\Sigma$ after each insertion, and if the number of columns of $M$ far exceeds $m$, then what you perhaps need is a rank-1 update of the symmetric eigenvalue decomposition of $MM^\ast$.

You see, if $MM^\ast = U\Lambda U^\ast$ is an eigenvalue decomposition of $MM^\ast$, the columns of $U$ and the diagonal of $\sqrt{\Lambda}$ are respectively the left singular vectors and the singular values of $M$. So, if the right singular vectors are not needed, an SVD problem can be converted into a symmetric eigenvalue decomposition problem. Now, if you add a new column $x$ to $M_{\textrm{old}}$ to get a new matrix $M_{\textrm{new}}$, then $M_{\textrm{new}}M_{\textrm{new}}^\ast = M_{\textrm{old}}M_{\textrm{old}}^\ast + xx^\ast$. So, given an eigenvalue decomposition of $M_{\textrm{old}}M_{\textrm{old}}^\ast$, you need to find a new eigenvalue decomposition after $M_{\textrm{old}}M_{\textrm{old}}^\ast$ is incremented by the rank-1 matrix $xx^\ast$.
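
For concreteness, here is a minimal NumPy sketch of the two facts above: the eigendecomposition of $MM^\ast$ yields $U$ and $\Sigma$, and appending a column $x$ changes the Gram matrix by the rank-1 term $xx^\ast$. It simply recomputes the eigendecomposition with `numpy.linalg.eigh`; an actual rank-1 updating routine, as described in the literature mentioned below, would avoid redoing that from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 6
M_old = rng.standard_normal((m, n))
x = rng.standard_normal((m, 1))           # new column to append
M_new = np.hstack([M_old, x])

# Left singular vectors / singular values from the symmetric eigenproblem:
# M M* = U diag(sigma^2) U*, so the eigenvectors of M M* are the left
# singular vectors and the square roots of its eigenvalues are the
# singular values.
lam, U = np.linalg.eigh(M_new @ M_new.T)
order = np.argsort(lam)[::-1]             # eigh returns ascending order
lam, U = lam[order], U[:, order]
sigma = np.sqrt(np.clip(lam, 0.0, None))

# Check against a direct SVD (columns of U may differ by a sign).
U_svd, s_svd, _ = np.linalg.svd(M_new, full_matrices=False)
print(np.allclose(sigma, s_svd))          # True

# The quantity to be updated: appending the column x is a rank-1 update
# of the Gram matrix,  M_new M_new* = M_old M_old* + x x*.
print(np.allclose(M_new @ M_new.T, M_old @ M_old.T + x @ x.T))  # True
```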

There are lots of papers on rank-1 updates of the symmetric eigenvalue decomposition; a web search for the topic turns up a number of them.


Since adding columns doesn't change the rank in your case, the left singular vectors $U$ remain the same. But the same cannot be said about the singular values and the right singular vectors. To see this, look at the Frobenius norm of the new matrix: it changes, so the singular values must change (since the squared Frobenius norm equals the sum of the squares of the singular values). Also, when you add a new column, the dimension of the row space increases (geometrically). So you can only reuse the $U$ obtained from the computation before adding the duplicate columns. To find the singular values, call the new matrix $A_1$ (after adding the duplicate columns) and compute the square roots of the eigenvalues of $A_1^{T}A_1$; these are the singular values. I don't think that, in general, you can find the new singular values using the previous decomposition, other than $U$.
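
As a quick numerical illustration of this argument (a NumPy sketch; the matrix sizes here are arbitrary): appending a duplicate column increases the squared Frobenius norm, hence changes the singular values, and the singular values of $A_1$ can be read off as the square roots of the eigenvalues of $A_1^{T}A_1$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
A1 = np.hstack([A, A[:, [0]]])            # append a duplicate of column 0

s = np.linalg.svd(A, compute_uv=False)
s1 = np.linalg.svd(A1, compute_uv=False)

# Squared Frobenius norm equals the sum of squared singular values,
# so appending a (nonzero) duplicate column must change the singular values.
print(np.sum(s**2), np.linalg.norm(A)**2)     # equal
print(np.sum(s1**2), np.linalg.norm(A1)**2)   # equal, but larger than above

# Singular values of A1 as square roots of the eigenvalues of A1^T A1
# (the eigenvalues beyond rank(A1) are zero, up to round-off).
lam = np.linalg.eigvalsh(A1.T @ A1)[::-1]
print(np.sqrt(np.clip(lam[:3], 0.0, None)))   # matches s1
print(s1)
```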

  • 1
    Left singular vectors are basis vectors of (a superspace of) the column space, but the converse is not true. Adding a column to $M$ can change its left singular vectors if the right singular vectors $V$ do not form a permutation matrix. You may try some small examples. For instance, $M'=\begin{pmatrix}1&1\\0&1\end{pmatrix}$ and $M=\begin{pmatrix}1&1&1\\0&1&1\end{pmatrix}$. In both cases, $\Sigma'$ and $\Sigma$ have distinct singular values, so $U'$ and $U$ are uniquely determined up to multiplication by unit diagonal matrices. But $U'$ and $U$ are quite different. (2012-11-26)
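
For what it's worth, the counterexample in the comment above is easy to check numerically (the columns of $U'$ and $U$ are only determined up to sign, but they are visibly not sign flips of each other here):

```python
import numpy as np

M_prime = np.array([[1.0, 1.0],
                    [0.0, 1.0]])
M = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])

U_prime, S_prime, _ = np.linalg.svd(M_prime)
U, S, _ = np.linalg.svd(M)

print(U_prime)            # left singular vectors of M'
print(U)                  # left singular vectors of M: not the same directions
print(S_prime, S)         # distinct singular values in both cases
```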