I have something that looks similar to an eigenvalue decomposition, $K = U^T\Sigma U$, where $U \in R^{d \times n}$ and $\Sigma \in R^{d \times d}$ is diagonal. $K$ is full rank and $U^TU = I_n$ is the identity in $R^{n \times n}$. Oddly, $d$ is much larger than $n$, $d \gg n$. I would like to compute the eigenvalue decomposition of $K$ efficiently. Are there any tricks to do this efficiently?
Overcomplete matrix decomposition
-
0Does $\Sigma$ have positive diagonals? – 2017-03-10
-
0Yes, it does, and $K$ is positive semi-definite. – 2017-03-10
1 Answer
If $\Sigma$ is positive semi-definite with $m \le d$ positive diagonal entries, we may write $$ \Sigma = P^t \left[ \begin{array}{cc} \Gamma^2 & 0 \\ 0 & 0 \end{array} \right] P $$ where $P$ is a permutation matrix and the $m \times m$ diagonal matrix $\Gamma^2$, which collects the positive entries, is positive definite. Then the matrix $K$ can be written in the form $$ K = W^t \Gamma^2 W = (\Gamma W)^t (\Gamma W) $$ where $W$ is the top $m \times n$ block of $PU$, $$ W = P_{11} U_1 + P_{12} U_2 , $$ $$ P = \left[ \begin{array}{cc} P_{11} & P_{12} \\ P_{21} & P_{22} \end{array} \right] , \;\;\; U = \left[ \begin{array}{c} U_1 \\ U_2 \end{array} \right] . $$ The matrix $W$ is an $m \times n$ matrix and $P_{11}$ is an $m \times m$ matrix.
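A minimal NumPy sketch of this reduction, using illustrative random data (the sizes and variable names are my assumptions, not from the question): since $P$ merely moves the nonzero diagonal entries of $\Sigma$ to the top, forming $\Gamma$ and $W$ amounts to selecting the rows of $U$ at which $\Sigma$ is nonzero.

```python
import numpy as np

# Illustrative sizes: d >> n, with m <= d positive diagonal entries in Sigma.
rng = np.random.default_rng(0)
d, n, m = 50, 5, 30

# Build U with orthonormal columns (U^T U = I_n) and a PSD diagonal Sigma
# with m positive entries scattered along the diagonal.
U, _ = np.linalg.qr(rng.standard_normal((d, n)))
sigma_diag = np.zeros(d)
sigma_diag[rng.choice(d, size=m, replace=False)] = rng.uniform(0.5, 2.0, size=m)

# The permutation step amounts to selecting the rows of U where Sigma is nonzero.
nz = sigma_diag > 0                # the entries that P moves to the top
Gamma = np.sqrt(sigma_diag[nz])    # m positive square roots: diag(Gamma)^2 = Gamma^2
W = U[nz, :]                       # m x n: the top block of P U

# Check the identity K = U^T Sigma U = (Gamma W)^T (Gamma W).
K = U.T @ np.diag(sigma_diag) @ U
GW = Gamma[:, None] * W
assert np.allclose(K, GW.T @ GW)
```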
The eigenvalues of the matrix $K$ are given by the singular values squared of the matrix $\Gamma W$. The corresponding eigenvectors are given by the right singular vectors of $\Gamma W$.
The SVD of $\Gamma W$ is usually computed in two steps:
Step 1: Compute the QR-factorisation of $\Gamma W$: $$ QR = \Gamma W $$ where the $n \times n$ matrix $R$ is upper triangular. The matrix $Q$, which has orthonormal columns, is not explicitly computed. Only $R$ is required.
Step 2: Compute the SVD of $R$, $$ R = \Phi S \Psi^t $$ where $\Phi^t \Phi = \Psi^t \Psi = I$ and $S$ is the singular value matrix. This can be done without explicitly computing $\Phi$. The required eigenvectors are given by $\Psi$. The eigenvalues of $K$ are given by the squares of the singular values.
The nominally expensive part is computing the SVD of $R$ but $R$ is a small $n \times n$ matrix and hence inexpensive.
If it is known that $m$ is nearly equal to $d$, then the reduction using the permutation matrix can be skipped. In that case, we proceed by computing the $QR$-factorisation of $\Sigma^{\frac{1}{2}}U$ instead of $\Gamma W$.
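The two steps above can be sketched in NumPy (again with illustrative random data; `Gamma` and `W` come from the permutation reduction described earlier, and all names are assumptions). Note that `mode='r'` returns only the triangular factor, so $Q$ is never formed:

```python
import numpy as np

# Illustrative setup: U with orthonormal columns, diagonal PSD Sigma, d >> n.
rng = np.random.default_rng(1)
d, n, m = 50, 5, 30
U, _ = np.linalg.qr(rng.standard_normal((d, n)))
sigma_diag = np.zeros(d)
sigma_diag[rng.choice(d, size=m, replace=False)] = rng.uniform(0.5, 2.0, size=m)
nz = sigma_diag > 0
Gamma, W = np.sqrt(sigma_diag[nz]), U[nz, :]
GW = Gamma[:, None] * W            # m x n

# Step 1: R factor only; Q is not explicitly computed.
R = np.linalg.qr(GW, mode='r')     # n x n upper triangular

# Step 2: SVD of the small n x n matrix R = Phi S Psi^t.
_, S, Psi_t = np.linalg.svd(R)

eigvals = S**2                     # eigenvalues of K (squared singular values)
eigvecs = Psi_t.T                  # eigenvectors of K (right singular vectors)

# Verify against a dense eigendecomposition of K = U^T Sigma U.
K = U.T @ np.diag(sigma_diag) @ U
assert np.allclose(np.sort(eigvals), np.sort(np.linalg.eigvalsh(K)))
assert np.allclose(K @ eigvecs, eigvecs * eigvals)
```

The dense check forms $K$ only to validate the sketch; in practice one would work with `R` alone, which is what makes the procedure cheap when $d \gg n$.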
-
0Thanks but I think you misunderstood the question. In the case I am interested in, $d$ is larger than $n$, and hence I have an overcomplete decomposition. Naively I could orthogonalise $U$, but that would be as costly as performing a regular SVD. – 2017-03-10
-
0Sorry for misunderstanding your question. I have revised the answer. – 2017-03-10
-
0Thanks @Vini for the detailed answer. However, is there any computational advantage to doing this over directly computing the SVD of $K$, as the QR-factorisation will, I assume, cost $O(n^3)$? – 2017-03-11
-
1First, you should compute the eigenvalue decomposition of $K$, not the SVD. Secondly, the complexity of the QR-factorisation is $O(mn^2)$. If $m < d$, this is cheaper than forming $K$ and decomposing it directly. Thirdly, by not forming $K$ explicitly and working with a Cholesky-type factorisation, there is an accuracy improvement: this is a "square-root" algorithm, and we do not have a numerically damaging $A^tA$ or $AA^t$ type step. Finally, this algorithm works even when $U$ is not an orthogonal matrix. – 2017-03-11