3

I need to solve the following equation for $\boldsymbol{\mathrm{B}}$

$\boldsymbol{\mathrm{A}}_{m\times p}=\boldsymbol{\mathrm{B}}^{-1}_{m\times m}\boldsymbol{\mathrm{C}}_{m\times p}$

The problem is that the matrix $\boldsymbol{\mathrm{C}}$ is not invertible. Is there any way to approximate or solve this equation for $\boldsymbol{\mathrm{B}}$? Thanks for your time and help.

Note

$p$ could be $1$, in which case $\mathrm{C}\mathrm{C}^{T}$ is not invertible.

  • 0
    My bad, I realized you can't left-multiply by $B$, since $A$ is an $m\times p$ matrix. As mentioned, there are still counterexamples unless you have stricter conditions. Also, these matrices are of strange sizes. 2011-12-02

3 Answers

3

If $m \le p$ and $\operatorname{rank}(\boldsymbol{\mathrm{C}}) = m$, then the $p\times m$ matrix $\boldsymbol{\mathrm{D}}=\boldsymbol{\mathrm{C}}^T(\boldsymbol{\mathrm{C}} \boldsymbol{\mathrm{C}}^T)^{-1}$ is a right inverse of $\boldsymbol{\mathrm{C}}$, i.e. we have $\boldsymbol{\mathrm{C}}\boldsymbol{\mathrm{D}}= \boldsymbol{\mathrm{I}}_{m\times m}$. So multiplying both sides of the equation on the right by $\boldsymbol{\mathrm{D}}$ will give you $\boldsymbol{\mathrm{B}}^{-1}$.
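
A minimal numpy sketch of this computation (the test matrices `B_true` and `C` below are made up for illustration, not part of the question):

```python
import numpy as np

# Recover B via the right inverse D = C^T (C C^T)^{-1},
# assuming m <= p and rank(C) = m.
rng = np.random.default_rng(0)
m, p = 3, 5
B_true = rng.standard_normal((m, m))      # invertible with probability 1
C = rng.standard_normal((m, p))           # rank m with probability 1
A = np.linalg.solve(B_true, C)            # A = B^{-1} C

D = C.T @ np.linalg.inv(C @ C.T)          # right inverse: C D = I_m
B_inv = A @ D                             # A D = B^{-1} C D = B^{-1}
B = np.linalg.inv(B_inv)

print(np.allclose(C @ D, np.eye(m)))      # True
print(np.allclose(B, B_true))             # True
```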

  • 0
    I think the solution would not be unique in that case. To be continued... 2011-12-03
2

What you seem to need here is to compute a Moore-Penrose pseudoinverse (essentially the same as Michael's answer, but recast in different form).

Start with

$\mathbf A=\mathbf B^{-1}\mathbf C$

Now $\mathbf C$ always has a singular value decomposition (SVD): $\mathbf C=\mathbf U\mathbf \Sigma\mathbf V^\top$, where $\mathbf U$ and $\mathbf V$ are orthogonal, and $\mathbf \Sigma$ is diagonal. The Moore-Penrose inverse can be computed as

$\mathbf C^+=\mathbf V\mathbf \Sigma^+\mathbf U^\top$

where $\mathbf \Sigma^+$ is obtained by reciprocating only the nonzero diagonal entries (or, in inexact arithmetic, reciprocate the entries that are larger than $\varepsilon\sigma_1$, where $\sigma_1$ is the largest diagonal entry, and $\varepsilon$ is machine epsilon.) This coincides with the usual inverse if $\mathbf C$ is in fact invertible.

Having the SVD on hand, we can do the following:

$\begin{align*} \mathbf A&=\mathbf B^{-1}\mathbf C\\ \mathbf A&=\mathbf B^{-1}(\mathbf U\mathbf \Sigma\mathbf V^\top)\\ \mathbf A\mathbf V&=\mathbf B^{-1}\mathbf U\mathbf \Sigma\\ \mathbf A\mathbf V\mathbf \Sigma^+&=\mathbf B^{-1}\mathbf U\\ \mathbf A\mathbf V\mathbf \Sigma^+\mathbf U^\top&=\mathbf B^{-1}\\ \mathbf B&=(\mathbf A\mathbf V\mathbf \Sigma^+\mathbf U^\top)^{-1} \end{align*}$
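
Here is a rough numpy version of the chain above; `B_true` and `C` are illustrative test data, and the thresholding mirrors the $\varepsilon\sigma_1$ rule described earlier:

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 3, 5
B_true = rng.standard_normal((m, m))                 # invertible with probability 1
C = rng.standard_normal((m, p))                      # rank m with probability 1
A = np.linalg.solve(B_true, C)                       # A = B^{-1} C

U, s, Vt = np.linalg.svd(C, full_matrices=False)     # C = U diag(s) Vt
tol = np.finfo(float).eps * s.max()                  # epsilon * sigma_1
s_plus = np.divide(1.0, s, out=np.zeros_like(s), where=s > tol)  # reciprocate only "nonzero" entries

B_inv = A @ Vt.T @ np.diag(s_plus) @ U.T             # A V Sigma^+ U^T = B^{-1} when rank(C) = m
B = np.linalg.inv(B_inv)
print(np.allclose(B, B_true))                        # True
```

Note that `np.linalg.pinv(C)` performs essentially the same thresholded SVD construction, so `B_inv = A @ np.linalg.pinv(C)` is an equivalent shortcut.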

  • 0
    Neither my answer nor this one is complete. The next step is to say what happens if either $m>p$ or the matrix $\mathbf C$ otherwise fails to have rank as large as $m$. There would be a non-unique solution, and the task would be to describe the space of solutions. 2011-12-03
0

If you want to do this computationally (with particular numbers), then let $x_j$ be the $j$th row of $B^{-1}$, and let $a_j$ be the $j$th row of $A$. Then you're just trying to solve $x_j C = a_j$ for $x_j$, which is a standard Gaussian elimination problem. If $C$ has rank $m$ then the solution will exist and be unique; if $C$ has rank less than $m$ then there will either be no solution or infinitely many solutions. Once you have $B^{-1}$ or a candidate for $B^{-1}$, then it's equally easy to compute its inverse (again by Gaussian elimination) or show that it's not invertible.
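
A row-by-row sketch of this idea, using numpy's least-squares solver in place of hand-rolled Gaussian elimination (`B_true` and `C` are made-up test data):

```python
import numpy as np

rng = np.random.default_rng(2)
m, p = 3, 5
B_true = rng.standard_normal((m, m))      # invertible with probability 1
C = rng.standard_normal((m, p))
A = np.linalg.solve(B_true, C)            # A = B^{-1} C

rows = []
for j in range(m):
    # solve x_j C = a_j, i.e. C^T x_j^T = a_j^T; when rank(C) = m the
    # least-squares solution is exact and unique
    x_j, *_ = np.linalg.lstsq(C.T, A[j], rcond=None)
    rows.append(x_j)

B_inv = np.vstack(rows)                   # candidate for B^{-1}
B = np.linalg.inv(B_inv)
print(np.allclose(B, B_true))             # True
```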

  • 1
    The problem is that Gaussian elimination isn't terribly reliable for rank determination in inexact arithmetic. (If exact arithmetic is available, it's fine.) Golub and Van Loan have a nice discussion of the subtleties of "numerical rank" in their book. 2011-12-03