
I have a sparse $m \times n$ matrix $X$, with $m > n$, and I am trying to solve the least squares problem $Xa = b$. I need to solve it multiple times for different $b$'s, so I compute a Cholesky decomposition of $Y = X^TX$ once and solve the normal equations $Ya = X^Tb$.

The problem is that I need to add $k$ rows ($k < n$) to $X$, where each additional row contains only one nonzero element. Then $Y$ becomes $Y' = X^TX + W^TW$, where $W$ is the $k \times n$ matrix consisting of the additional rows. Since each row of $W$ has a single nonzero entry, $W^TW$ is a diagonal matrix, so equivalently $Y' = X^TX + \operatorname{diag}(v)$, where $v$ is an $n \times 1$ vector whose entries are the squares of the nonzero elements of the additional rows (summed per column).
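For concreteness, here is a minimal numpy check of that structural claim, with made-up dimensions and values: when every row of $W$ has exactly one nonzero, $W^TW$ collapses to a diagonal matrix whose entries are per-column sums of squares.

```python
import numpy as np

# Hypothetical small example: k = 3 added rows, n = 5 columns, one nonzero per row.
n, k = 5, 3
W = np.zeros((k, n))
W[0, 1] = 2.0   # row 0: nonzero in column 1
W[1, 3] = -1.5  # row 1: nonzero in column 3
W[2, 1] = 0.5   # row 2: nonzero in column 1 (repeats are allowed)

WtW = W.T @ W

# Build the diagonal directly: each row contributes its squared entry
# to the column where that entry sits.
d = np.zeros(n)
for row in W:
    j = np.nonzero(row)[0][0]
    d[j] += row[j] ** 2

# W^T W has no off-diagonal entries, since no two columns of W
# share a nonzero in the same row.
assert np.allclose(WtW, np.diag(d))
```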

I do not want to recompute the decomposition each time rows are added. I looked at the Woodbury matrix identity applied to $Y' = X^TX + W^TW$, but it is not very useful when $k$ is close to $n$. Is there a special case for $Y' = X^TX + \operatorname{diag}(v)$ (perhaps something like the Sherman–Morrison formula)?
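To make the Woodbury route concrete (all data below is made up, and a dense factorization stands in for the sparse one): the identity reuses the existing factor of $Y$ at the price of $k$ extra triangular solves and one $k \times k$ system, which is exactly why it stops paying off as $k$ approaches $n$.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
m, n, k = 50, 8, 3

X = rng.standard_normal((m, n))
Y = X.T @ X
c = cho_factor(Y)          # Cholesky of the original normal matrix, reused below

# Added rows: one nonzero per row, at made-up (column, value) positions.
W = np.zeros((k, n))
for i, (j, w) in enumerate([(0, 2.0), (4, -1.0), (2, 0.5)]):
    W[i, j] = w

rhs = rng.standard_normal(n)   # plays the role of X^T b in the question

# Woodbury: (Y + W^T W)^{-1} rhs
#   = Y^{-1} rhs - Y^{-1} W^T (I_k + W Y^{-1} W^T)^{-1} W Y^{-1} rhs
Yinv_rhs = cho_solve(c, rhs)
Yinv_Wt = cho_solve(c, W.T)              # n x k: k solves against the old factor
S = np.eye(k) + W @ Yinv_Wt              # k x k capacitance matrix
a = Yinv_rhs - Yinv_Wt @ np.linalg.solve(S, W @ Yinv_rhs)

# Verify against a direct solve with the rebuilt normal matrix.
assert np.allclose(a, np.linalg.solve(Y + W.T @ W, rhs))
```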

Other suggestions would also be appreciated (including pointers to related references, a different decomposition, or a different approach altogether). Thank you.

  • Thanks for your time and concern, Will Jagy. Really appreciated. (2012-02-08)

1 Answer


Your use of notation is quite confusing.

The one thing that would appear to work is $k$ applications of Sherman–Morrison, once for each added row. When adding row $i$, where $1 \leq i \leq k$, let the row vector called $v^T$ in the Wikipedia article be the new row, and let the column vector $u$ be the transpose of that row.
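A minimal numpy sketch of those $k$ sequential updates, with made-up data: each added row $w$ changes the normal matrix by the symmetric rank-one term $w^T w$, so Sherman–Morrison applies once per row with $u = v = w^T$. Maintaining the explicit inverse is for illustration only; in practice one would update the factor itself or use the Woodbury form above rather than form $Y^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 40, 6
X = rng.standard_normal((m, n))
Y = X.T @ X
Yinv = np.linalg.inv(Y)   # stand-in for solves with the existing Cholesky factor

# Added rows: one nonzero per row, as (column index, value); made-up data.
rows = [(1, 1.5), (4, -2.0), (1, 0.75)]

# Sherman-Morrison, once per added row:
#   (Y + u u^T)^{-1} = Y^{-1} - (Y^{-1} u)(Y^{-1} u)^T / (1 + u^T Y^{-1} u)
# The denominator is positive because Y^{-1} is symmetric positive definite.
for j, w in rows:
    u = np.zeros(n)
    u[j] = w
    Yinv_u = Yinv @ u
    Yinv -= np.outer(Yinv_u, Yinv_u) / (1.0 + u @ Yinv_u)

# Check against a direct rebuild of the updated normal matrix.
W = np.zeros((len(rows), n))
for i, (j, w) in enumerate(rows):
    W[i, j] = w
assert np.allclose(Yinv, np.linalg.inv(Y + W.T @ W))
```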

  • @J.M. I just use the normal equations because my current library does not support sparse QR decomposition (or anything related) to solve $Xa = b$. I know the normal equations may become singular (even though $X$ has independent columns) due to finite precision, but that is unlikely to happen in my problem. Also (2012-01-12)