
Consider matrices $Y\in\mathbb{R}^{n\times n}$ and $X\in\mathbb{R}^{n\times m}$ where $m\geq n$. $X$ is unknown, but $Y=XX'$, which implies that $Y$ is positive semi-definite (and positive definite when $X$ has full row rank). (I see no reason why this couldn't alternately be expressed as a positive semi-definite problem with $Y=X'X$; a different $Y\in\mathbb{R}^{m\times m}$ would still be the known matrix.)

What is the easiest method to find $X$? I was thinking of minimizing the Frobenius norm, but I wasn't sure whether there is some relatively straightforward approach that I'm missing.

  • 1
    Notice the typographical difference between $Y\epsilon\mathbb{R}^{n\times n}$ and $Y\in\mathbb{R}^{n\times n}$. Writing \in rather than \epsilon not only makes the symbol look different, but also results in proper spacing, since those conventions are built into the software. $\TeX$ is fairly sophisticated. (I changed it in the posting.) (2012-09-11)
  • 0
    When I see epsilon, I think of the [Hilbert choice operator](http://en.wikipedia.org/wiki/Epsilon_calculus#Hilbert_notation), but of course the [arity](http://en.wikipedia.org/wiki/Arity) is different. (2012-09-11)
  • 0
    @MichaelHardy Thanks. (2012-09-12)

2 Answers

1

Consider $X = I$; then $X' = I$ and $Y = I$. On the other hand, if $X = -I$, then $X' = -I$, but still $Y = I$. Are there any other assumptions you can make about $X$?

In fact, for any $X$, $(-X)(-X)' = XX'$, so there are always at least two solutions.

Additionally, if $$X_1 = \left(\begin{array}{cc} a_1 & b_1 \\ 0 & 0\end{array}\right),\quad X_2 = \left(\begin{array}{cc} a_2 & b_2 \\ 0 & 0\end{array}\right)$$

with $a_1^2 + b_1^2 = a_2^2 + b_2^2$ then $Y = X_1 X_1' = X_2 X_2'$.

I think something needs to be said about the singular values of $X$ to get better uniqueness; perhaps that they are all distinct (non-degenerate)?
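The $2\times 2$ example above is easy to check numerically. A minimal sketch using NumPy, with the illustrative choice $(a_1,b_1)=(3,4)$ and $(a_2,b_2)=(5,0)$ so that $a_1^2+b_1^2=a_2^2+b_2^2=25$:

```python
import numpy as np

# Two matrices whose first rows satisfy a1^2 + b1^2 = a2^2 + b2^2
X1 = np.array([[3.0, 4.0], [0.0, 0.0]])
X2 = np.array([[5.0, 0.0], [0.0, 0.0]])

Y1 = X1 @ X1.T
Y2 = X2 @ X2.T

# Both products equal [[25, 0], [0, 0]], even though X1 != X2
print(np.allclose(Y1, Y2))  # True
```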

  • 0
    $Y$ is some known matrix. It could be the identity matrix, or it could be anything else, so long as it is positive definite. (2012-09-11)
  • 0
    Ok, but what I'm saying is that there might not be a unique $X$ such that $Y = XX'$, unless we can make some more assumptions (or unless uniqueness isn't a concern). (2012-09-11)
  • 0
    I think the non-uniqueness can get worse than this as well; see the edit to my answer. (2012-09-11)
  • 1
    For any orthogonal matrix $R$, if $X$ satisfies $X X' = Y$, then so does $X R$. So there's always a lot of non-uniqueness. (2012-09-12)
  • 1
    In fact, if $n=m$ and $Y$ is invertible, that's all the non-uniqueness there is: if $X X' = Z Z' = Y$ is invertible, then $Z$ and $Z'$ are invertible, and $Z^{-1} X (Z^{-1} X)' = Z^{-1} X X' (Z')^{-1} = Z^{-1} Z Z' (Z')^{-1} = I$, so $Z^{-1} X$ is orthogonal, i.e. $X = Z R$ for an orthogonal matrix $R$. (2012-09-12)
  • 0
    Ahh... that makes sense. Hence the condition that $Y$ is positive definite, not just positive semi-definite (which is what I thought the problem originally stated, though I'm either wrong or it was edited). (2012-09-12)
  • 0
    @BaronVT I see your point now. What kinds of conditions would need to be imposed on the eigenvalues? In the particular matrices I'm working with, it will often be the case that several are equal to each other. (2012-09-12)
  • 0
    @John Well, as Robert Israel pointed out, in the square case, if $Y$ is invertible, then $X$ is invertible, so the eigenvalues are non-zero, and the non-uniqueness is characterized as above. (2012-09-12)
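The $X \mapsto XR$ non-uniqueness described in these comments is easy to verify numerically. A minimal sketch assuming NumPy, with a random orthogonal factor built from the Q of a QR decomposition; since $RR' = I$, the factors $X$ and $XR$ produce the same $Y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 5
X = rng.standard_normal((n, m))

# Random m x m orthogonal matrix: the Q factor of a QR decomposition
R, _ = np.linalg.qr(rng.standard_normal((m, m)))

Z = X @ R  # a different factor of the same Y
print(np.allclose(X @ X.T, Z @ Z.T))  # True, since Z Z' = X R R' X' = X X'
```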
0

You also need $Y$ to be symmetric. If you want $X$ to be (lower) triangular, the Cholesky decomposition is an algorithm that computes such a factorization.

  • 0
    You might need full pivoting if the matrix is singular. ($Y = XX'$ already implies that $Y$ is symmetric.) (2012-09-11)
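As a concrete illustration of the Cholesky route, a sketch assuming NumPy: `np.linalg.cholesky` returns the lower-triangular factor $L$ with $LL' = Y$ for a symmetric positive definite $Y$ (the matrix below is made positive definite by construction, just for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
Y = A @ A.T + n * np.eye(n)  # symmetric positive definite by construction

L = np.linalg.cholesky(Y)    # lower-triangular factor with L L' = Y
print(np.allclose(L @ L.T, Y))  # True
```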