
Let $\Sigma$ be a symmetric positive definite matrix of dimensions $n \times n$. Is there a numerically robust way of checking whether it can be decomposed as $\Sigma = \mathcal{D} + v^t v$, where $v$ has dimensions $r \times n$ with $r < n-1$ and $\mathcal{D}$ is diagonal with positive elements?

So far, given $\Sigma$, I search for the minimal $k$ for which a diagonal matrix $\mathcal{D}$ with positive entries and $\mathrm{rank}(\Sigma - \mathcal{D}) = k$ exists.

I should add that even when a solution exists, it may be non-unique.

Example:

$ \Sigma = \left( \begin{array}{cccc} 6 & 6 & 7 & 0 \\ 6 & 11 & 12 & -3 \\ 7 & 12 & 20 & -6 \\ 0 & -3 & -6 & 9 \\ \end{array} \right) \, \text{ where } \mathcal{D} = \left( \begin{array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 4 \\ \end{array} \right) \text{ and } v^t = \left( \begin{array}{cc} 2 & 1 \\ 3 & 0 \\ 4 & -1 \\ -1 & 2 \\ \end{array} \right) $
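For concreteness, this example can be verified numerically; here is a quick NumPy sketch (the variable names are mine):

```python
import numpy as np

# The example matrix, diagonal part, and factor v^t (n x r) from above
Sigma = np.array([[6,  6,  7,  0],
                  [6, 11, 12, -3],
                  [7, 12, 20, -6],
                  [0, -3, -6,  9]], dtype=float)
D = np.diag([1.0, 2.0, 3.0, 4.0])
vt = np.array([[ 2,  1],
               [ 3,  0],
               [ 4, -1],
               [-1,  2]], dtype=float)   # v^t, so v = vt.T

# Sigma should equal D + v^t v exactly
print(np.allclose(Sigma, D + vt @ vt.T))      # True
# and Sigma - D should have rank r = 2
print(np.linalg.matrix_rank(Sigma - D))       # 2
```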

$\Sigma$ admits a different representation as well; in fact, it is one member of a one-parameter family:

$ \mathcal{D} = \mathrm{diag}\left(\frac{17}{116}, \frac{1655}{679}, \frac{192}{97}, \frac{137}{26} \right) \text{ and } v^t = \left( \begin{array}{cc} 0 & 679 \\ -78 & 696 \\ -156 & 812 \\ 97 & 0 \\ \end{array} \right) \cdot \left( \begin{array}{cc} \sqrt{2522} & 0 \\ 0 & 2 \sqrt{19691} \\ \end{array} \right)^{-1} $

I have two questions. Why is there a parametric family of solutions? Can a solution be found by methods of linear algebra?

Thank you

1 Answer


If you accept numerical accuracy on the order of $[10^{-14},10^{-10}]$, then the following Linear Matrix Inequality (LMI) approach is robust:

I have tried to solve the related problem: find a diagonal matrix $D\succ 0$ such that $$S := \Sigma - D \succeq 0, \qquad \operatorname{rank}(S) \leq n-2.$$ Rank constraints are always problematic (non-convex), since going up (increasing the rank) is easy but going down is typically very hard (the rank of a matrix is lower semi-continuous).

Anyway, I have set up a simple LMI problem with a diagonal matrix $D$ as follows: $$ \min_{\displaystyle\substack{ D\succ 0,\\S\succeq 0}} \ \ \operatorname{tr}(S) $$ In other words, I simply minimized the sum of the eigenvalues of $S$, hoping that some of them would come quite close to zero. And much to my surprise, two of them usually turned out to be arbitrarily close to zero (so far, only a few cases ended up with one). This is also equivalent to maximizing the trace of $D$ while keeping $S$ positive semi-definite.

If you have MATLAB and Robust Control Toolbox you can test it for yourself with the following code (any LMI solver would do):

```matlab
% Generate some pos def matrix
n = 5;
Sr = rand(n);
Sr = 2*(Sr+Sr');
Sr = 2.5*max(abs(eig(Sr)))*eye(n) + Sr;
% Uncomment the next line for the particular matrix in the question
%Sr = [6 6 7 0;6 11 12 -3;7 12 20 -6;0 -3 -6 9]; n = 4;

% Declare LMIs
setlmis([])
D = lmivar(1,[ones(n,1) zeros(n,1)]);
lmiterm([1 1 1 D],1,1);
lmiterm([-1 1 1 0],Sr);
lmis = getlmis;
c = mat2dec(lmis,-eye(n));
[cop,xop] = mincx(lmis,c,[1e-12 200 1e9 20 0]);
Dnum = dec2mat(lmis,xop,D);
disp('The eigenvalues of the difference')
eig(Sr-Dnum)
disp('The rank of the difference')
rank(Sr-Dnum)
disp('The reciprocal condition number of the difference')
rcond(Sr-Dnum)
```

But, as we can expect, $(S_r-D_{num})$ is numerically a full-rank matrix, although its reciprocal condition number (rcond) is generally around $10^{-14}$.

Finally, I have tried your matrix and ended up with $$ D = \begin{pmatrix} 0.75669 &0 &0 &0\\ 0 &2.1392 &0 &0\\ 0 &0 &2.6752 &0\\ 0 &0 &0 &4.4885 \end{pmatrix} $$ After making sure that $m$ of the eigenvalues are indeed close to zero, obtaining the low-rank variable is simply numerical massaging (this part can be substantially improved, in my opinion):

```matlab
[K,L,M] = svd(Sr-Dnum); M = M';
m = 2;                   % Select how many eigs are close to zero
K = K(:,1:n-m);
L = L(1:n-m,1:n-m);
M = M(1:n-m,:);
T = K*L*M
disp('The rank of V^T V')
rank(T)                  % indeed n-m
V = K*sqrt(L);
% Testing the result
disp('The error of \Sigma - D - v^T v')
Sr - Dnum - V*V'
```
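The same rank truncation can be reproduced in NumPy with a symmetric eigendecomposition. I plug in the (rounded) $D$ reported above, so the reconstruction error is small but not zero; the variable names are mine:

```python
import numpy as np

Sigma = np.array([[6,  6,  7,  0],
                  [6, 11, 12, -3],
                  [7, 12, 20, -6],
                  [0, -3, -6,  9]], dtype=float)
# D as reported by the LMI solver (rounded to 5 digits)
Dnum = np.diag([0.75669, 2.1392, 2.6752, 4.4885])

S = Sigma - Dnum
w, U = np.linalg.eigh(S)           # eigenvalues in ascending order
m = 2                              # how many eigenvalues are ~ 0
w_top, U_top = w[m:], U[:, m:]     # keep the n - m largest
V = U_top * np.sqrt(w_top)         # v^t, an n x (n-m) factor
# Reconstruction error = magnitude of the discarded eigenvalues (small)
print(np.abs(S - V @ V.T).max())
```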

Numerical result for your variable $v^t$ (rounded off further by the display): $$ v^t = \begin{pmatrix} -1.826 &1.3817\\ -2.9418 &0.45479\\ -4.1423 & -0.40799\\ 1.2817 & 1.6938 \end{pmatrix} $$

We also have a necessary condition, using the orthogonal complement $V_\perp$ of $v^t$: $$ V_\perp^T\Sigma V_\perp = V_\perp^T D V_\perp $$ This difference also suffers from numerical errors (around $10^{-12}$). Nevertheless, if you are searching for a quick and dirty approximation, this comes pretty close. By fiddling with the solution $D_{num}$ (which means manual labor), you can even obtain exact results.
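This condition is easy to sanity-check on the exact example from the question, where $\Sigma - D = v^t v$ holds exactly (a NumPy sketch; I take $V_\perp$ as an orthonormal basis of the null space of $v$, obtained from the SVD):

```python
import numpy as np

Sigma = np.array([[6,  6,  7,  0],
                  [6, 11, 12, -3],
                  [7, 12, 20, -6],
                  [0, -3, -6,  9]], dtype=float)
D = np.diag([1.0, 2.0, 3.0, 4.0])
vt = np.array([[2, 1], [3, 0], [4, -1], [-1, 2]], dtype=float)

# Orthogonal complement of the columns of v^t: the last right-singular
# vectors of v = vt.T (a 2 x 4 matrix of rank 2)
_, _, Wt = np.linalg.svd(vt.T)
Vperp = Wt[2:].T                    # 4 x 2 basis of the null space of v

# Projecting onto the complement annihilates the low-rank part v^t v:
print(np.allclose(Vperp.T @ Sigma @ Vperp, Vperp.T @ D @ Vperp))   # True
```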

Regarding the theory, I don't know why the number of eigenvalues that come close to zero is $n-2$ in general but not more. I'll try to come back to this if I have some time later.

I don't think an analytical approach would lead to a direct result, since the rank depends very sensitively on the matrix entries, and it would be really tough (and great, of course) to obtain anything other than approximations.

Lastly, I did not understand what you meant by the $1$-parametric family, so I don't have any comments on that :).