
I am given a general $n \times d$ matrix $B$ and need to compute two similarity matrices $R \in \mathbb{R}^{n \times n}$ and $S \in \mathbb{R}^{d \times d}$ as follows:

$R = BSB^\top$ and $S = B^\top RB$.

Inserting one equation into the other yields $S = B^\top B\, S\, B^\top B$. I assume an exact solution doesn't always exist; in that case, I'm interested in an approximate solution that minimizes $\|S - B^\top B S B^\top B\|_F$.
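
(Side note, my own reformulation: writing $M = B^\top B$ and using $\operatorname{vec}(MXM) = (M \otimes M)\operatorname{vec}(X)$ for symmetric $M$, the equation becomes

$$\bigl(I_{d^2} - M \otimes M\bigr)\operatorname{vec}(S) = 0,$$

so a nonzero exact solution of the unconstrained equation exists iff $1$ is an eigenvalue of $M \otimes M$, i.e. iff two eigenvalues $\lambda_i, \lambda_j$ of $M$ satisfy $\lambda_i \lambda_j = 1$.)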

Is there a name for this kind of problem? And is there an efficient way to solve it (possibly using some iterative algorithm)?

Note: The zero matrix would satisfy the above equation, but I need similarity matrices, i.e. symmetric matrices whose diagonal elements are all equal to $1$.
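
For small $d$, here is a brute-force sketch of what I have in mind (plain NumPy, my own ad-hoc parametrization, not an established algorithm): with $M = B^\top B$, the residual $M S M - S$ is affine in the off-diagonal entries of $S$ once the diagonal is pinned to $1$, so the minimization becomes an ordinary linear least-squares problem with $d(d-1)/2$ unknowns.

```python
import numpy as np

def approximate_S(B):
    """Symmetric S with unit diagonal minimizing ||S - B^T B S B^T B||_F."""
    d = B.shape[1]
    M = B.T @ B                               # d x d Gram matrix
    A = np.kron(M, M) - np.eye(d * d)         # A @ vec(S) = vec(M S M - S)

    # Parametrize symmetric S with unit diagonal: vec(S) = P @ x + c,
    # where x holds the d*(d-1)/2 free off-diagonal entries.
    pairs = [(i, j) for i in range(d) for j in range(i + 1, d)]
    P = np.zeros((d * d, len(pairs)))
    for k, (i, j) in enumerate(pairs):
        P[i * d + j, k] = 1.0                 # entry S[i, j]
        P[j * d + i, k] = 1.0                 # mirror entry S[j, i]
    c = np.eye(d).reshape(-1)                 # fixes the diagonal to 1

    # Ordinary linear least squares in the free parameters x.
    x, *_ = np.linalg.lstsq(A @ P, -A @ c, rcond=None)
    return (P @ x + c).reshape(d, d)

B = np.random.randn(6, 3)
S = approximate_S(B)
R = B @ S @ B.T
print(np.linalg.norm(S - B.T @ B @ S @ B.T @ B))  # residual of the fixed-point equation
```

This is only feasible for small $d$ because of the $d^2 \times d^2$ Kronecker product, which is why I'm asking whether there is a smarter (possibly iterative) way.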

  • The zero matrix seems to work - do you want to exclude it? (2012-05-22)
  • @MarkBennet: Thanks, I simply forgot to check. The zero matrix is excluded, see the edit. (2012-05-22)
  • This seems relevant: http://en.wikipedia.org/wiki/Lyapunov_equation (2012-05-22)
  • @RahulNarain: Unfortunately, this method is not restricted to similarity matrices and always returns the zero matrix. (2012-05-22)

1 Answer