I have a complex symmetric matrix $A$ (i.e. non-Hermitian and satisfying $A=A^T$) which is positive definite in the sense that $\Re({z^HAz}) > 0$ for every nonzero $z$. I can verify this numerically by checking that the Hermitian part of this matrix, $(A + A^H)/2$, has all-positive eigenvalues (or that any negative values are small enough to be attributable to rounding error and hopefully negligible).
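A minimal sketch of this check in NumPy (the matrix below is a random stand-in, not my actual $A$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex symmetric stand-in for A, shifted so that its
# Hermitian part is comfortably positive definite.
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.T                                 # complex symmetric: A == A.T
A = A + 2 * np.linalg.norm(A) * np.eye(n)   # shift the Hermitian part into PD territory

# Re(z^H A z) > 0 for all nonzero z  <=>  the Hermitian part (A + A^H)/2
# has strictly positive eigenvalues.
H = (A + A.conj().T) / 2
eigs = np.linalg.eigvalsh(H)                # eigvalsh: H is Hermitian by construction
print("smallest eigenvalue of Hermitian part:", eigs.min())   # positive up to rounding
```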

I know that for my problem only a few eigenvalues of this matrix contribute to the solution, so I have performed a spectral decomposition of $A$,
$$A = U \operatorname{diag}(\lambda)\, V,$$
with $V$ and $U = V^{-1}$ containing the (non-orthogonal) left and right eigenvectors respectively, and $\lambda$ the eigenvalues. This gives me a representation of $A$ in terms of rank-one "partial matrices", $A = \sum_i \lambda_i\, U_i\otimes V_i$, where $U_i$ is the $i$-th column of $U$ and $V_i$ the $i$-th row of $V$.

However, these partial matrices are no longer positive definite (i.e. some of them have quite significantly negative eigenvalues in their Hermitian parts), so they cannot yield a meaningful decomposition of the solution to my problem.
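Here is a sketch of the decomposition and of the problem, again on a random stand-in matrix built as above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.T
A = A + 2 * np.linalg.norm(A) * np.eye(n)   # Hermitian part positive definite

# Spectral decomposition A = U diag(lam) V with V = U^{-1}.
lam, U = np.linalg.eig(A)
V = np.linalg.inv(U)

# Rank-one "partial matrices" lam_i * (column i of U)(row i of V); they sum to A.
partials = [lam[i] * np.outer(U[:, i], V[i, :]) for i in range(n)]
assert np.allclose(sum(partials), A)

# The Hermitian part of each partial matrix is typically indefinite,
# even though the Hermitian part of the full sum is positive definite.
for i, Ai in enumerate(partials):
    w = np.linalg.eigvalsh((Ai + Ai.conj().T) / 2)
    print(f"partial {i}: Hermitian-part eigenvalues in [{w.min():.2f}, {w.max():.2f}]")
```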

  • Is there some explanation as to why these partial matrices are not positive definite?
  • Is there an alternative decomposition which would preserve this property?

Edit:

As this is a complex symmetric system, there is the additional relationship $V=U^{T}$ (i.e. the eigenvector matrix can be chosen complex orthogonal, $U^TU = I$).
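Numerically, this can be seen by rescaling the eigenvector columns so that $U^TU=I$; a sketch that assumes distinct eigenvalues and that no eigenvector is quasi-null ($U_i^TU_i \neq 0$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.T   # complex symmetric

lam, U = np.linalg.eig(A)
# numpy returns unit-2-norm columns; rescale so that U^T U = I (complex orthogonal),
# which is possible when the eigenvalues are distinct and no column is quasi-null.
U = U / np.sqrt(np.sum(U * U, axis=0))
V = np.linalg.inv(U)
print(np.allclose(V, U.T))   # True: the left eigenvectors are the transposed right ones
```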

1 Answer

In the answer to another question, Qiaochu Yuan wrote:

It's not misleading as long as you change your notion of equivalence. When a matrix represents a linear transformation $V \to V$, the correct notion of equivalence is similarity: $M \simeq B^{-1} MB$ where $B$ is invertible. When a matrix represents a bilinear form $V \times V \to \mathbb{R}$, the correct notion of equivalence is congruence: $M \simeq B^TMB$ where $B$ is invertible. As long as you keep this distinction in mind, you're fine.

Since you are dealing with $z^HAz$, $A$ should be viewed as a bilinear form rather than a linear transformation. Therefore, if you want to find an equivalent matrix to $A$ via a matrix decomposition, what you should use is not a similarity transform but a matrix congruence. Actually, even in the real case a similarity transform would in general not help. You may think of it this way: positive definiteness is a basis-independent property. If $x^TAx>0$ for all $x$ in the standard basis, then under a change of basis $x\mapsto Sy$ (with $S$ an invertible matrix), $x^TAx=y^T(S^TAS)y$ is also positive for all $y$. Therefore, what preserves positive definiteness is the congruence $A\mapsto S^TAS$, not the similarity transform $A\mapsto S^{-1}AS$.
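To illustrate with a small made-up real example (the matrices here are random, purely for demonstration): the symmetric part stays positive definite under congruence, but generally not under similarity:

```python
import numpy as np

rng = np.random.default_rng(1)

# A real symmetric positive definite A and a random invertible change of basis S.
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
S = rng.standard_normal((n, n))

def min_sym_eig(X):
    """Smallest eigenvalue of the symmetric part of X."""
    return np.linalg.eigvalsh((X + X.T) / 2).min()

print("A                  :", min_sym_eig(A))                          # > 0
print("congruence S^T A S :", min_sym_eig(S.T @ A @ S))                # still > 0
print("similarity S^-1 A S:", min_sym_eig(np.linalg.inv(S) @ A @ S))   # typically < 0
```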

Of course, when $A$ is real symmetric or Hermitian, $A$ is orthogonally/unitarily similar to a diagonal matrix, so in that case we get congruence and similarity in one shot. For a general $A$, however, we are not that lucky. I am not sure what the best way to go is; it varies from scenario to scenario. Yet, if preserving positive definiteness is of utmost importance, I think the best one can do is perhaps just a Schur decomposition $A=UTU^H$, where $U$ is a unitary matrix and $T$ is upper triangular. You may still read off the eigenvalues from the diagonal of $T$, but $T$ is not diagonal, so in general you can only write $A$ as a linear combination of $n(n+1)/2$ tensor products.
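For instance, with SciPy's `schur` (using `output='complex'` to get the complex Schur form; the matrix below is again just a random complex symmetric stand-in):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.T
A = A + 2 * np.linalg.norm(A) * np.eye(n)   # Hermitian part positive definite

# Schur decomposition A = U T U^H with U unitary and T upper triangular.
T, U = schur(A, output='complex')
assert np.allclose(A, U @ T @ U.conj().T)

print("eigenvalues:", np.diag(T))

# Since U is unitary, T = U^H A U is a *-congruence of A, so the Hermitian
# part of T is positive definite whenever that of A is.
print("min eigenvalue of Hermitian part of T:",
      np.linalg.eigvalsh((T + T.conj().T) / 2).min())
```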

  • $A\mapsto S^HAS$ is called (by some authors) a $\ast$-congruence ("star-congruence"). Again, which of congruence or $\ast$-congruence is more appropriate depends on your needs. Star-congruence preserves positive definiteness, but congruence can bring you a nicely structured matrix; see the [Takagi factorization](http://en.wikipedia.org/wiki/Matrix_decomposition#Takagi.27s_factorization), for example. (2012-12-13)
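For reference, when $A$ is complex symmetric a Takagi factorization $A = U\Sigma U^T$ (with $U$ unitary and $\Sigma$ the nonnegative diagonal of singular values) can be read off from an ordinary SVD. A minimal sketch, assuming distinct singular values:

```python
import numpy as np

def takagi(A):
    """Takagi factorization A = Ut @ diag(s) @ Ut.T of a complex symmetric A.
    Built from the SVD A = W diag(s) V^H; assumes distinct singular values."""
    W, s, Vh = np.linalg.svd(A)
    D = W.conj().T @ Vh.T               # (approximately) diagonal unitary when A == A.T
    Ut = W @ np.diag(np.sqrt(np.diag(D)))
    return Ut, s

# Usage on a random complex symmetric matrix.
rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.T
Ut, s = takagi(A)
print("reconstruction error:", np.max(np.abs(A - Ut @ np.diag(s) @ Ut.T)))
```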