
I used a LAPACK library to compute the singular value decomposition (SVD) of a symmetric matrix in C#, for example:

$\left( \begin{array}{ccc} 1 & 2 & 5 \\ 2 & 1 & 7 \\ 5 & 7 & 1 \end{array} \right) = \left( \begin{array}{ccc} -0.467 & -0.328 & -0.821 \\ -0.582 & -0.586 & 0.564 \\ -0.666 & 0.741 & 0.083 \end{array} \right) \left( \begin{array}{ccc} 10.62 & 0 & 0 \\ 0 & 6.74 & 0 \\ 0 & 0 & 0.88 \end{array} \right) \left( \begin{array}{ccc} -0.467 & -0.582 & -0.666 \\ 0.328 & 0.586 & -0.741 \\ 0.821 & -0.564 & -0.083 \end{array} \right) $

But how do I get the eigenvectors and eigenvalues from $USV^T$? I read that $S$ gives the eigenvalues, but apparently the sign must sometimes be changed; likewise, $U$ gives the eigenvectors, but there is a sign issue there too. Is there a rule for getting the correct sign?

1 Answer


The singular values of a matrix $A$ (not necessarily square) are the square roots of the eigenvalues of $A^TA$. Squaring the entries of $S$ will NOT give the eigenvalues of $A$, but of $A^TA$. Likewise, the singular vectors are eigenvectors of those products: the columns of $U$ are eigenvectors of $AA^T$, and the columns of $V$ are eigenvectors of $A^TA$. A priori, there is no direct connection between the eigenvalues of $A$ and the singular values of $A$, or between eigenvectors and singular vectors.

Of course, there are some relations. A diagonal matrix scales the length of a vector by at most its largest entry in absolute value, and orthogonal matrices preserve length, so the largest singular value is the largest factor by which $A$ can scale the length of a vector. If $A$ is symmetric, the same description holds for the eigenvalue of largest absolute value. In general, however, if $\lambda$ is the largest eigenvalue of $A$, then $|\lambda|$ will be no larger than the largest singular value, and typically strictly smaller. For example, with the matrix

$\left( \begin{array}{cc} 1 & 2 \\ 3 & 4 \end{array} \right)$

the eigenvalues are $5.372$ and $-0.372$, while the singular values are $5.465$ and $0.366$.
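These numbers can be checked with a short pure-Python sketch; the helper `eig2` is just the closed-form quadratic formula for a $2\times 2$ matrix (my own illustrative name, not part of any library):

```python
import math

def eig2(a, b, c, d):
    """Eigenvalues of a real 2x2 matrix [[a, b], [c, d]], assuming a real spectrum."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr / 4 - det)   # discriminant of the characteristic polynomial
    return tr / 2 + disc, tr / 2 - disc

a, b, c, d = 1.0, 2.0, 3.0, 4.0           # A = [[1, 2], [3, 4]]
eigvals = eig2(a, b, c, d)                # eigenvalues of A

# A^T A = [[10, 14], [14, 20]]; its eigenvalues are the squared singular values of A.
AtA = (a*a + c*c, a*b + c*d, a*b + c*d, b*b + d*d)
sigmas = tuple(math.sqrt(v) for v in eig2(*AtA))

print(eigvals)   # approx (5.372, -0.372)
print(sigmas)    # approx (5.465, 0.366)
```

Note that $|{-0.372}| > 0.366$ and $5.372 < 5.465$: the eigenvalues and singular values interlace differently and neither set determines the other here.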

If $A$ is symmetric, the eigendecomposition $A = Q\Lambda Q^T$ is essentially an SVD: take $S = |\Lambda|$, $U = Q$, and let $V$ be $Q$ with the sign of each column flipped wherever the corresponding eigenvalue is negative. That sign flip is exactly the sign issue you observed. Unfortunately, I know of no good way to go from one decomposition to the other for general (non-symmetric) matrices.
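A minimal pure-Python sketch of this conversion, using a small hypothetical symmetric example whose eigendecomposition is known in closed form (eigenvalues $3$ and $-1$, eigenvectors $(1,1)/\sqrt2$ and $(1,-1)/\sqrt2$):

```python
import math

# Hypothetical symmetric example: A = [[1, 2], [2, 1]].
s = 1 / math.sqrt(2)
Q = [[s, s], [s, -s]]          # columns are orthonormal eigenvectors
lam = [3.0, -1.0]              # corresponding eigenvalues

# Turn the eigendecomposition into an SVD: sigma_i = |lambda_i|, U = Q,
# and flip the sign of V's i-th column whenever lambda_i < 0.
sigma = [abs(l) for l in lam]
U = [row[:] for row in Q]
V = [[Q[i][j] * (1.0 if lam[j] >= 0 else -1.0) for j in range(2)]
     for i in range(2)]

# Check: U @ diag(sigma) @ V^T should reproduce A.
recon = [[sum(U[i][k] * sigma[k] * V[j][k] for k in range(2))
          for j in range(2)] for i in range(2)]
print(recon)   # approx [[1, 2], [2, 1]]
```

Going the other way, one reads the eigenvalues off $S$ by comparing the columns of $U$ and $V$: where they agree the eigenvalue is $+\sigma_i$, where they differ by a sign it is $-\sigma_i$.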


There is a nice relationship for normal matrices, that is, matrices which satisfy $AA^*=A^*A$ (or, in the case of real matrices, $AA^T=A^TA$). A matrix is normal if and only if it can be diagonalized by a unitary matrix, that is, $A=USU^*$ where $UU^*=I$ and $S$ is diagonal. In this case, $AA^*=(USU^*)(US^*U^*)^* = U(SS^*)U^*$, where $SS^*$ is diagonal and contains the squared moduli of the eigenvalues of $A$; therefore the singular values of $A$ are in fact the absolute values of the eigenvalues of $A$.
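As a quick illustration (a sketch, not a general algorithm): a $90°$ rotation is normal but not symmetric, its eigenvalues are $\pm i$, and its singular values are both $1 = |\pm i|$, which the following pure-Python snippet confirms:

```python
import cmath

# A 90-degree rotation: normal, since A A^T = A^T A = I, but not symmetric.
A = [[0.0, -1.0], [1.0, 0.0]]

# Eigenvalues are the roots of x^2 - tr(A) x + det(A) = x^2 + 1, i.e. +-i.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = cmath.sqrt(tr * tr / 4 - det)
eigvals = (tr / 2 + disc, tr / 2 - disc)      # (1j, -1j)

# Singular values are the square roots of the eigenvalues of A^T A = I,
# so both equal 1 -- matching the moduli |+-i| = 1.
moduli = [abs(l) for l in eigvals]
print(moduli)   # [1.0, 1.0]
```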

Unfortunately, this makes it easy to go from an eigenvalue decomposition of $A$ to an SVD, but not the other way around.

  • @cMinor For a symmetric matrix, the SVD is the eigendecomposition; no work is required to go from one to the other. You will have $U=V$ to begin with (or rather, you can: technically, one could flip the signs of any of the columns of $U$ to produce a $V$ that works for the SVD, so really you can ignore $V$ entirely). I don't know what you mean by the sign of $U$, but yes, the diagonal entries of $S$ will be the eigenvalues. However, for a general normal matrix they are only the absolute values of the eigenvalues, and when $A$ is not normal, there is no clear relationship. – 2012-01-23