
Given a (symmetric) positive definite matrix ${\bf A}\in\mathbb{R}^{N\times N}$, I know that it can always be expressed as a sum of $N$ rank-one matrices using the singular value decomposition ${\bf U\Sigma V^\ast}$ of the corresponding Cholesky factor $\bf L$: $${\bf A} = {\bf L}{\bf L}^{\rm T} = {\bf U\Sigma V^\ast}({\bf U\Sigma V^\ast})^{\rm T} = \sum_{i,j=1}^N \sigma_i \sigma_j {\bf u}_i {\bf v}^{\rm T}_i {\bf v}_j {\bf u}^{\rm T}_j = \sum_{i=1}^N \sigma_i^2 {\bf u}_i {\bf u}^{\rm T}_i,$$ with ${\bf u}_i$ being the $i$-th column of $\bf U$; the last step uses the orthonormality ${\bf v}_i^{\rm T}{\bf v}_j = \delta_{ij}$. So the matrix terms here are all outer products (dyads) of a vector with itself.
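This identity is easy to verify numerically. The following sketch (with a made-up random positive definite matrix) builds the Cholesky factor, takes its SVD, and reconstructs $\bf A$ from the $N$ dyads:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical symmetric positive definite test matrix.
N = 4
B = rng.standard_normal((N, N))
A = B @ B.T + N * np.eye(N)

# Cholesky factor L with A = L L^T, then its SVD L = U Sigma V^T.
L = np.linalg.cholesky(A)
U, sigma, Vt = np.linalg.svd(L)

# Rebuild A as a sum of N rank-one dyads sigma_i^2 u_i u_i^T.
A_sum = sum(s**2 * np.outer(U[:, i], U[:, i]) for i, s in enumerate(sigma))

print(np.allclose(A, A_sum))  # True
```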

Can such a matrix also be expressed as the sum of a diagonal positive definite matrix $\bf D$ of full rank with $D_{ii} = d_i^2 > 0$ and a series of $M$ outer products ${\bf x}{\bf x}^{\rm T}$ (${\bf x}\in \mathbb{R}^N$)? That is:

$${\bf A} = {\bf D} + \sum_{k=1}^M {\bf x}_k{\bf x}_k^{\rm T}$$

Basically, this is just one large system of $N(N+1)/2$ quadratic equations with $N(M+1)$ unknowns $\{d_i, x_{k,i}\}$. Using small test values for $N$ and $M$, I was able to find numerical solutions to this equation system for specific example matrices, but it would be nice to have an analytic solution. I suspect this should always be possible, at least for large enough values of $M$.

If so, what is the minimal value of $M$ for which such a decomposition exists? What can be said about the relationship of ${\bf d}={\rm diag}({\bf D})$ and the different ${\bf x}_k$? Is there some $M$ for which the decomposition is unique?

If not, how can a counterexample be constructed?

Some thoughts on this:

  • for $N=1$, this is trivial, as one can always choose $\bf D=A$ and $M=0$
  • for $N\geq2$, I thought of choosing $D_{ii} = A_{ii}$ as a starting point and applying the Cholesky/SVD strategy to the difference ${\bf A}-{\bf D}$ to get the ${\bf x}_k$. However, already for $N=2$ this doesn't work, since the difference matrix (which then has a zero diagonal) is not positive semidefinite. So I guess a related question is: For which $N\times N$ diagonal matrices ${\bf D}$ does ${\bf A}-{\bf D}$ remain positive semidefinite?
  • from a geometric perspective, a positive definite matrix ${\bf A}$ defines an $N$-dimensional ellipsoid $\{{\bf z}\in\mathbb{R}^N:{\bf z}^{\rm T}{\bf A}{\bf z}={\rm const.}\}$. If the matrix is diagonal, the semi-axes of this ellipsoid are aligned with the coordinate axes. In contrast, an outer-product-like term ${\bf x}{\bf x}^{\rm T}$ defines a pair of hyperplanes $\{{\bf z}\in\mathbb{R}^N:{\bf z}^{\rm T}{\bf x}{\bf x}^{\rm T}{\bf z}=|{\bf x}^{\rm T}{\bf z}|^2={\rm const.}\}$. So the above problem is equivalent to asking whether any $N$-dimensional ellipsoid can be understood as a "superposition" of an axis-aligned ellipsoid and a series of hyperplane pairs, although the meaning of "superposition" is somewhat vague in this context...
  • the above system of equations can be written using the tensor product $\otimes$: $ \left(\sum_{i,j} a_{i,j}\,{\bf e}_i\otimes{\bf e}_j\right) = \left(\sum_i d_i^2\,{\bf e}_i\otimes{\bf e}_i\right) + \sum_k \left(\sum_i x_{k,i}{\bf e}_i\right) \otimes \left(\sum_j x_{k,j}{\bf e}_j\right) $. Maybe some tensor algebra can come in handy?
  • The number of rank-one terms needed to represent $A-D$ is simply the rank of $A-D$, so the question of minimizing $M$ can be equivalently expressed as follows: Given a symmetric positive definite matrix $A$, what is the lowest-rank positive semidefinite matrix of the form $A-D$ where $D$ is diagonal? As my comment on Batman's answer shows, you can always achieve rank $N-1$, but maybe it's possible to do better. (2017-01-05)
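The rank-$(N-1)$ claim in the last bullet can be checked numerically: subtracting the smallest eigenvalue times the identity yields a positive semidefinite matrix that is singular by construction. A sketch (the matrix `A` is a made-up random example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric positive definite example.
N = 5
B = rng.standard_normal((N, N))
A = B @ B.T + np.eye(N)

xi = np.linalg.eigvalsh(A)[0]    # smallest eigenvalue (eigvalsh sorts ascending)
R = A - xi * np.eye(N)           # positive semidefinite, singular by construction

print(np.linalg.matrix_rank(R))  # N - 1 (generically)
```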

1 Answer


Let $A$ be positive definite, and let $\xi$ be its smallest eigenvalue. Then $A-\frac{\xi}{2} I$ is also positive definite (this just shifts every eigenvalue down by $\frac{\xi}{2}$). Write its eigendecomposition as $A - \frac{\xi}{2} I = \sum_i \lambda_i u_i u_i^T$, where $\lambda_i, u_i$ are the eigenvalues and eigenvectors of $A - \frac{\xi}{2}I$.

Then, you can write $A = \xi/2 I + \sum_i (\sqrt{\lambda_i} u_i) (\sqrt{\lambda_i} u_i)^T$.
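This construction is straightforward to verify numerically. A minimal sketch with numpy, using a made-up positive definite matrix: take $D = \frac{\xi}{2} I$ and the $N$ dyads from the eigendecomposition of $A - \frac{\xi}{2} I$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical positive definite example.
N = 4
B = rng.standard_normal((N, N))
A = B @ B.T + np.eye(N)

xi = np.linalg.eigvalsh(A)[0]                       # smallest eigenvalue
lam, U = np.linalg.eigh(A - 0.5 * xi * np.eye(N))   # eigenpairs of the shifted matrix

# D = (xi/2) I plus N rank-one terms (sqrt(lam_i) u_i)(sqrt(lam_i) u_i)^T.
D = 0.5 * xi * np.eye(N)
A_rebuilt = D + sum(l * np.outer(U[:, i], U[:, i]) for i, l in enumerate(lam))

print(np.allclose(A, A_rebuilt))  # True
```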

Note that you can play around with this; as long as you replace $\frac{\xi}{2}$ with something positive and $\leq \xi$, this idea works. Will it give you the minimum number of terms in the sum? Not necessarily. For example, take $A$ to be diagonal with positive entries. You can then just take $D=A$ with $M=0$, whereas this construction still gives you $N$ terms in the sum.

  • You can do slightly better: $A-\xi I$ is a positive semidefinite matrix of rank $N-1$ or less, so you can do the same thing but with one fewer outer product (i.e. one of the $\lambda_i$ is zero, and in the notation of the question, we have $M=N-1$). But I suspect this may still not be the best possible. (2017-01-05)
  • Yeah -- my last comment alluded to that. You can have $M=0$ if $A$ is a diagonal matrix with positive diagonal. But dealing with the minimality is trickier in general. (2017-01-05)
  • That's true. I was just saying that if you're going to subtract a multiple of the identity, there's no reason not to use $\xi I$ itself. That way you always get a rank-$(N-1)$ matrix instead of rank $N$. (2017-01-05)
  • Sorry it took so long to get back to you: had some start-of-year things to attend to :) Yes, I have thought of choosing $D=\xi I$, but as you say, I suspect there may be an even better solution. The extreme case of diagonal $A$ can be resolved with $M=0$, and there is presumably a whole set of matrices which can be constructed with just $M=1$ term in the sum. I'm just wondering whether a general statement can be made about the matrices expressible with a given $M$. How large is this set for each value of $M$, compared to the full set of positive definite matrices? (2017-01-13)
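The $M=N-1$ variant discussed in the comments can be sketched explicitly: subtract $\xi I$, then keep only the strictly positive eigenpairs of the (singular) difference as the dyads ${\bf x}_k{\bf x}_k^{\rm T}$. Again, the matrix `A` below is a made-up example, and the eigenvalue cutoff `1e-10` is an assumed numerical tolerance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical positive definite example.
N = 5
B = rng.standard_normal((N, N))
A = B @ B.T + np.eye(N)

xi = np.linalg.eigvalsh(A)[0]                  # smallest eigenvalue
lam, U = np.linalg.eigh(A - xi * np.eye(N))    # lam[0] ~ 0 by construction

# Keep only the strictly positive eigenpairs: M = N - 1 dyads x_k x_k^T.
xs = [np.sqrt(l) * U[:, i] for i, l in enumerate(lam) if l > 1e-10]
A_rebuilt = xi * np.eye(N) + sum(np.outer(x, x) for x in xs)

print(len(xs), np.allclose(A, A_rebuilt))
```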