
I posted this on cstheory yesterday but did not get an answer, so I am posting it here.

I have a low rank matrix given as $AB^T$ where $A,B \in \mathbb{R}^{n \times p}$ and $p \ll n$. (I know $A$ and $B$ separately)

One efficient way to get the SVD is to compute reduced QR factorizations of $A$ and $B$, i.e.

$A = Q_A R_A$ and $B = Q_B R_B$.

Both factorizations can be computed at $\mathcal{O}(np^2)$ cost.

I can then compute the SVD of the $p \times p$ matrix $R_A R_B^T = U_1 \Sigma V_1^T$, which costs $\mathcal{O}(p^3)$.

Hence this will give me $$AB^T = (Q_A U_1) \Sigma (Q_B V_1)^T$$

The total cost is $\mathcal{O}(np^2)$.
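
For concreteness, here is a minimal NumPy sketch of the procedure above (the function name `lowrank_svd` is mine, and I assume dense in-memory arrays):

```python
import numpy as np

def lowrank_svd(A, B):
    """SVD of the product A @ B.T without ever forming the n-by-n matrix.

    A, B: (n, p) arrays with p << n. Returns U (n, p), s (p,), Vt (p, n)
    with A @ B.T == U @ np.diag(s) @ Vt up to roundoff.
    """
    QA, RA = np.linalg.qr(A)                 # reduced QR, O(n p^2)
    QB, RB = np.linalg.qr(B)                 # reduced QR, O(n p^2)
    U1, s, V1t = np.linalg.svd(RA @ RB.T)    # p-by-p SVD, O(p^3)
    return QA @ U1, s, V1t @ QB.T            # (Q_A U_1) Sigma (Q_B V_1)^T
```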

However, I am wondering whether there are other efficient approaches that reduce the coefficient in front of $np^2$. If I am right, the coefficient in front of $np^2$ is $3$ if we go the QR route. The reason I am interested in minimizing the coefficient is that my $n$ is really ginormous, so cutting the constant on the $\mathcal{O}(n)$ term would be a significant saving when I implement this.

I am wondering if we could exploit the fact that the left singular vectors must span the column space of $A$ and the right singular vectors the column space of $B$,

i.e. we know that if $AB^T = U \Sigma V^T$, then

$$A = U \alpha \text{ and } B = V \beta$$

and we want $\alpha \beta^T$ to be diagonal while $U$ and $V$ have orthonormal columns.
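
As a quick numerical sanity check of this observation, here is a snippet (reusing the hypothetical `lowrank_svd` sketch above, on random data) verifying that $\alpha = U^T A$ and $\beta = V^T B$ indeed satisfy $\alpha \beta^T = \Sigma$:

```python
rng = np.random.default_rng(0)
n, p = 10_000, 5
A = rng.standard_normal((n, p))
B = rng.standard_normal((n, p))

U, s, Vt = lowrank_svd(A, B)
alpha = U.T @ A   # A = U @ alpha, since col(U) = col(Q_A) contains col(A)
beta = Vt @ B     # B = V @ beta with V = Vt.T
print(np.allclose(alpha @ beta.T, np.diag(s)))   # True: alpha beta^T = Sigma
```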

EDIT: What I essentially want is an exact rank-$r$ truncated SVD of $AB^T$, not an approximate or randomized one.

  • Any sparsity structure to your matrix? Have you looked at Fast SVD? That can be handy, in particular, if $r \ll p$. (2011-03-18)
  • @cardinal: There is no structure in $A$ and $B$. And what Fast SVD are you talking about? Could you provide a link? I am not looking for approximate or randomized algorithms. (2011-03-18)
  • I recall a MATLAB package for doing fast exact truncated SVDs. I don't know if **[this](http://www.mathworks.com/help/techdoc/ref/svds.html)** or **[this](http://www.stanford.edu/~dgleich/programs/fastsvds.m)** is helpful at all. (2011-03-18)
  • @cardinal: I don't think that would help here, since my matrix is already in the low-rank form $AB^T$ with $A, B$ thin. (2011-03-18)
  • Have you seen [this](http://dx.doi.org/10.1137/S0895479897325578)? (2011-07-21)

1 Answer


I presume you're already aware that merely multiplying out $\mathbf A$ and $\mathbf B^T$ exposes you to numerical instability, for much the same reason that one is better off computing the singular value decomposition of a matrix $\mathbf A$ than the eigendecomposition of $\mathbf A^T \mathbf A$.
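
To make that concrete, here is the standard Läuchli-type illustration of the squared-condition-number effect (not your exact setting, but the same mechanism the comparison with $\mathbf A^T \mathbf A$ refers to):

```python
import numpy as np

eps = 1e-8
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])

s_min = np.linalg.svd(A, compute_uv=False).min()   # smallest singular value of A
lam_min = np.linalg.eigvalsh(A.T @ A).min()        # smallest eigenvalue of the formed A^T A

print(s_min**2)   # ~1e-16: the true smallest eigenvalue of A^T A
print(lam_min)    # ~0 (pure roundoff): eps^2 was destroyed when A^T A was formed
```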

Luckily, you don't have to go through that trouble: there is such a thing as the "product-induced singular value decomposition", which allows you to decompose the product without ever forming it. See the paper and its references for details.

  • The paper doesn't answer the question. Even in the procedure I gave above, I never multiply out $A$ and $B^T$. The paper has essentially the same algorithm, but takes numerical stability into account. (2011-03-18)
  • There is still a lot of literature on "product eigenvalue" algorithms; at the very least, try searching for "product SVD" or "generalized SVD" in the usual places. (2011-03-18)