
In linear algebra, the singular value decomposition of $A_{m\times n}$ with $\operatorname{rank}(A) = r$ is defined as

$ A_{m\times n} = U\Sigma V^T = \sum^{r}_{k=1}u_k\sigma_kv_k^T $

Is it true that for each $k = 1,\dots,r$, the term $u_k\sigma_kv_k^T$ in the sum above is the matrix $A_k$ whose entries are all zero except in row $k$, where it has the values of row $k$ of $A$? I have found this to hold for a number of examples, but I am unsure whether it is true in general.

If you could include a brief explanation with your answer that would be great, thanks!

  • Thanks, fixed the sigma vs. sum issue. Thanks also for the counterexample; guess I am wrong then. – 2012-01-15

1 Answer


It seems my comment has answered the question, so I'm posting it here with a little elaboration.

Let $A = \begin{bmatrix}2 & 1 \\ 1 & 2\end{bmatrix}$. Its singular value decomposition is $A = U\Sigma V^T$, where
$$U = V = \frac1{\sqrt{2}}\begin{bmatrix}1 & -1 \\ 1 & 1\end{bmatrix}, \qquad \Sigma = \begin{bmatrix}3 & 0 \\ 0 & 1\end{bmatrix}.$$
Assuming that by $u_k$ and $v_k$ you mean the $k$th columns of $U$ and $V$, we have
$$u_1\sigma_1v_1^T = \frac32\begin{bmatrix}1 & 1 \\ 1 & 1\end{bmatrix}, \qquad u_2\sigma_2v_2^T = \frac12\begin{bmatrix}1 & -1 \\ -1 & 1\end{bmatrix}.$$
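You can check this example numerically; here is a quick sketch using NumPy (note that `np.linalg.svd` returns $V^T$ directly, and the signs of the singular vectors are only determined up to a simultaneous flip of $u_k$ and $v_k$, which leaves each rank-one term unchanged):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.svd returns U, the singular values, and V^T (here called Vt)
U, s, Vt = np.linalg.svd(A)
print(s)  # singular values: [3. 1.]

# The rank-one terms u_k * sigma_k * v_k^T
term1 = s[0] * np.outer(U[:, 0], Vt[0, :])  # (3/2) * [[1, 1], [1, 1]]
term2 = s[1] * np.outer(U[:, 1], Vt[1, :])  # (1/2) * [[1, -1], [-1, 1]]

# The terms sum back to A, but neither one is a row of A padded with zeros
assert np.allclose(term1 + term2, A)
```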

In general, $u_k\sigma_kv_k^T$ is not a matrix with the $k$th row or column equal to that of $A$ and zeros everywhere else. Rather, it is a rank-one matrix that represents the following linear transformation: given an input vector $x$, take its component along $v_k$ and discard the rest, scale that component by $\sigma_k$, and make it point along $u_k$ instead. (Apart from the scaling by $\sigma_k$, this is something like a rank-one version of an orthogonal transformation; I can't remember whether there's a specific name for it.) What the singular value decomposition tells you is how to represent your rank-$r$ matrix as a linear combination of $r$ such rank-one linear transformations.
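That description of a single term can be sketched directly in code. This is an illustrative snippet (the helper name `rank_one_apply` is mine, not from the answer), using $u_1$, $\sigma_1$, $v_1$ from the example above:

```python
import numpy as np

def rank_one_apply(u, sigma, v, x):
    """Apply the rank-one map u sigma v^T to x: project x onto v,
    scale the component by sigma, and send the result along u."""
    return sigma * np.dot(v, x) * u

# u_1, sigma_1, v_1 from the 2x2 example
u = np.array([1.0, 1.0]) / np.sqrt(2)
v = np.array([1.0, 1.0]) / np.sqrt(2)
sigma = 3.0

x = np.array([2.0, 0.0])

# Same result as explicitly forming the matrix u sigma v^T and multiplying
M = sigma * np.outer(u, v)
assert np.allclose(rank_one_apply(u, sigma, v, x), M @ x)
```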