
Assume an $N \times N$ matrix $A$ and a length $N$ vector $b$. I've already solved the system $Ax = b$ for $x$ using standard methods. (If you want you can assume that I have the inverse of $A$ as well.)

Now, I have a block matrix
$$A' = \begin{pmatrix} A & V \\ U & 0 \end{pmatrix}$$

and a block vector $b' = (b, b^*)^T$. Here $U$ is a $1 \times N$ matrix, $V$ is an $N \times 1$ matrix, and the zero in the lower-right corner is a scalar (or a $1 \times 1$ matrix, if you prefer).

In other words, $A'$ is a matrix created by adding one row and one column to $A$, and $b'$ is a vector created by adding one element to $b$. I want to solve the system $A'x' = b'$ for $x'$. This is the same system as $Ax = b$ except that it has one extra equation and one extra variable to solve for. Is there an efficient (less than $O(N^3)$) algorithm for this, given that I've already computed the solution to $Ax = b$?

1 Answer


Look up the Sherman-Morrison-Woodbury formula; it answers precisely this question. You will need one additional solve with $A$ (against the new column $V$) though, or, if you have the inverse of $A$, the solution can be updated in $\mathcal{O}(N^2)$ steps. A nice thing about the Sherman-Morrison-Woodbury formula is that it works even when $V \in \mathbb{R}^{N \times p}$ and $U \in \mathbb{R}^{p \times N}$, at a cost of $\mathcal{O}(pN^2)$. The special case $p = 1$ is also called the Sherman-Morrison formula.
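For the bordered system in the question, the update amounts to a block-elimination (Schur-complement) step: eliminate the top block to get $x_2 = (Ux - b^*)/(U A^{-1} V)$ and then $x_1 = x - A^{-1}V\,x_2$. A minimal sketch in NumPy, assuming $A^{-1}$ and $x = A^{-1}b$ are already available (the function name and random test setup are illustrative, not from the answer):

```python
import numpy as np

def solve_bordered(Ainv, x, U, V, b_star):
    """Solve [[A, V], [U, 0]] [x1; x2] = [b; b_star] in O(N^2),
    given Ainv = A^{-1} and the old solution x = Ainv @ b."""
    w = Ainv @ V                        # O(N^2): the one extra "solve", against V
    s = (U @ w).item()                  # 1x1 Schur complement U A^{-1} V (sign folded in)
    x2 = ((U @ x).item() - b_star) / s  # new variable
    x1 = x - w.ravel() * x2             # correct the old solution
    return np.append(x1, x2)

# Quick check against a dense solve of the full bordered system
rng = np.random.default_rng(0)
N = 5
A = rng.standard_normal((N, N)) + N * np.eye(N)   # well-conditioned test matrix
b = rng.standard_normal(N)
U = rng.standard_normal((1, N))
V = rng.standard_normal((N, 1))
b_star = rng.standard_normal()

Ainv = np.linalg.inv(A)
x = Ainv @ b
A_prime = np.block([[A, V], [U, np.zeros((1, 1))]])
b_prime = np.append(b, b_star)
print(np.allclose(solve_bordered(Ainv, x, U, V, b_star),
                  np.linalg.solve(A_prime, b_prime)))
```

Note that only matrix-vector products with $A^{-1}$ appear, so the whole update is $\mathcal{O}(N^2)$; without a stored inverse, one triangular back-substitution from an existing factorization of $A$ serves the same purpose.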

  • Nah, I actually found it funny when I realized how obvious an optimization I was missing. (2011-08-17)