
Assume an $N \times N$ matrix $A$ and a length $N$ vector $b$. I've already solved the system $Ax = b$ for $x$ using standard methods. (If you want you can assume that I have the inverse of $A$ as well.)

Now, I have a block matrix $$ A' = \begin{pmatrix} A & V \\ U & 0 \\ \end{pmatrix} $$

and a block vector $b' = (b, b^*)^T$. $U$ is a $1 \times N$ matrix, $V$ is an $N \times 1$ matrix, and the zero in the lower-right corner is a scalar (or a $1 \times 1$ matrix if you prefer).

In other words $A'$ is a matrix created by adding one row and one column onto $A$ and $b'$ is a vector created by adding one element onto $b$. I want to solve the system $A'x' = b'$ for $x'$. This is the same system as $Ax=b$ except that it has one extra equation and one extra variable to be solved for. Is there an efficient (less than $O(N^3)$) algorithm for this given that I've already computed the solution to $Ax=b$?

1 Answer


Look up the Sherman–Morrison–Woodbury formula; it answers precisely this question. You will need the solution of $Ay = b$, which you already have, or, if you have the inverse of $A$, the solution can be updated in $\mathcal{O}(N^2)$ operations. A nice feature of Sherman–Morrison–Woodbury is that it works even when $V \in \mathbb{R}^{N \times p}$ and $U \in \mathbb{R}^{p \times N}$, at a cost of $\mathcal{O}(pN^2)$. The special case $p = 1$ is also called the Sherman–Morrison formula.
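Concretely, once $y = A^{-1}b$ and $A^{-1}$ are in hand, the bordered system can be solved by block elimination in $\mathcal{O}(N^2)$. Here is a minimal NumPy sketch; the function name `solve_bordered` and the argument layout are my own, not from the answer. $u$ is the added row, $v$ the added column, and $\beta$ the added right-hand-side entry:

```python
import numpy as np

def solve_bordered(Ainv, y, u, v, b, beta):
    """Solve [[A, v], [u, 0]] @ [x; lam] = [b; beta] in O(N^2),
    given Ainv = inverse of A and y = A^{-1} b (already computed).
    u and v are length-N vectors; beta is the new RHS entry."""
    w = Ainv @ v                     # one matrix-vector product: O(N^2)
    lam = (u @ y - beta) / (u @ w)   # last equation fixes the new unknown
    x = y - lam * w                  # O(N) update of the known solution
    return x, lam
```

Eliminating $x = y - \lambda w$ from the last equation $u \cdot x = \beta$ gives the scalar $\lambda$; note this assumes the Schur complement $u \cdot w$ is nonzero, i.e. the bordered matrix is nonsingular.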

  • I saw that before I posted here. The problem is that the matrix multiplies involved in applying it make it also $O(N^3)$. You have to multiply two $N \times N$ matrices at some point, which is an $O(N^3)$ operation. – 2011-08-17
  • @dsimcha: If you have the inverse of $A$, all you will be doing is matrix–vector products, and hence the additional cost will only be $\mathcal{O}(N^2)$. – 2011-08-17
  • @Sirvarm: Look at equation 1 in the Wikipedia article you linked to. IIUC, $A^{-1} U (C - V A^{-1} U) V$ is an $N \times N$ matrix. You then multiply this monstrosity by $A^{-1}$, hence $O(N^3)$. – 2011-08-17
  • @dsimcha: You multiply a **vector** by a matrix. Hence the cost is $\mathcal{O}(N^2)$. Note that $U$ and $V$ are vectors. – 2011-08-17
  • Right, but $(C - V A^{-1} U)$ is a scalar, $A^{-1} U$ is a column vector, and $V$ is a row vector. Thus this multiplication produces a matrix. – 2011-08-17
  • I really don't understand the issue. $A^{-1}U$ is a column vector and $V$ is a row vector. Multiplying a row vector with a column vector gives a scalar, and taking the inner product of two vectors costs $\mathcal{O}(N)$. – 2011-08-17
  • But a column vector on the LHS times a row vector on the RHS produces an $N \times N$ matrix. – 2011-08-17
  • A column vector times a row vector costs $\mathcal{O}(N^2)$, not $\mathcal{O}(N^3)$. Why don't you try this for yourself and check? The product of two $N \times N$ matrices is $\mathcal{O}(N^3)$. – 2011-08-17
  • Right, but this is only part of the algorithm. You're multiplying a column vector by a row vector to get an $N \times N$ matrix, then you're multiplying that by another $N \times N$ matrix. That's where it becomes $\mathcal{O}(N^3)$. – 2011-08-17
  • (Slaps self in forehead.) Right, because matrix multiplication is associative. Thanks for your help and for getting that through my thick head. – 2011-08-17
  • No problem. Sorry if my previous comment was a bit rude; I shall delete it. – 2011-08-17
  • Nah, I actually found it funny when I realized how obvious an optimization I was missing. – 2011-08-17