
I'm interested in the properties of zero-diagonal symmetric (or hermitian) matrices, also known as hollow symmetric (or hermitian) matrices.

The only thing I can come up with is that such a matrix cannot be positive definite (unless it is the zero matrix, which is not positive definite anyway): for a positive definite matrix $A$, the Cholesky decomposition provides a lower triangular $L$ with $A=LL^*$. The diagonal of $LL^*$ consists of the inner products of the rows of $L$ with themselves, so if the diagonal of $A$ is zero, every row of $L$ must be zero, and hence $L$ (and thus $A$) is the zero matrix.
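A quick numerical illustration of this, as a sketch using NumPy (the $2\times 2$ matrix is just a hypothetical example):

```python
import numpy as np

# A nonzero hollow symmetric matrix; its eigenvalues are +1 and -1,
# so it is indefinite and Cholesky factorization must fail.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

try:
    np.linalg.cholesky(A)
except np.linalg.LinAlgError:
    print("not positive definite")
```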

The sorts of questions that interest me are:

  • Which symmetric matrices can be transformed orthogonally into a zero-diagonal matrix?
  • What can we say about the eigenvalues of a zero-diagonal symmetric matrix?
  • Any other interesting known properties?
  • The eigenvalues sum to zero. (2011-11-07)
  • Nice, thanks (: Since $\operatorname{tr}(AB)=\operatorname{tr}(BA)$, if $U$ is an orthonormal basis of eigenvectors of $A$, then $\operatorname{tr}(A)=\operatorname{tr}(U^*AU)$, which is the sum of the eigenvalues of $A$. (2011-11-07)
  • If the matrix is not the zero matrix, it has positive as well as negative eigenvalues, i.e., it is an indefinite matrix. (2011-11-07)
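The observations in the comments are easy to check numerically; here is a sketch using NumPy with a random hollow symmetric matrix (the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random hollow (zero-diagonal) symmetric matrix.
A = rng.standard_normal((5, 5))
A = A + A.T
np.fill_diagonal(A, 0.0)

ev = np.linalg.eigvalsh(A)
print(abs(ev.sum()))            # ~0: the eigenvalues sum to the trace, which is zero
print(ev.min() < 0 < ev.max())  # True: a nonzero hollow symmetric matrix is indefinite
```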

4 Answers

Answer (score 9)

I'll consider the special case of symmetric tridiagonal matrices with zero diagonal for this answer.

I prefer calling the even-order tridiagonal ones Golub-Kahan matrices. These matrices turn up in deriving the variant of the QR algorithm used for computing the singular value decomposition (SVD). More precisely, given an $n\times n$ upper bidiagonal matrix such as ($n=4$)

$$\mathbf B=\begin{pmatrix}d_1&e_1&&\\&d_2&e_2&\\&&d_3&e_3\\&&&d_4\end{pmatrix}$$

the $2n\times 2n$ block matrix $\mathbf K=\left(\begin{array}{c|c}\mathbf 0&\mathbf B^\top \\\hline \mathbf B&\mathbf 0\end{array}\right)$ is similar to the Golub-Kahan tridiagonal

$$\mathbf P\mathbf K\mathbf P^\top=\begin{pmatrix}& d_1 & & & & & & \\d_1 & & e_1 & & & & & \\& e_1 & & d_2 & & & & \\& & d_2 & & e_2 & & & \\& & & e_2 & & d_3 & & \\& & & & d_3 & & e_3 & \\& & & & & e_3 & & d_4 \\& & & & & & d_4 & \end{pmatrix}$$

where $\mathbf P$ is a permutation matrix. This similarity transformation is referred to as the "perfect shuffle".

The importance of this is that the eigenvalues of the Golub-Kahan matrices always come in $\pm$ pairs; more precisely, if $\mathbf B$ has the singular values $\sigma_1,\sigma_2,\dots,\sigma_n$, then the eigenvalues of the Golub-Kahan tridiagonal are $\pm\sigma_1,\pm\sigma_2,\dots,\pm\sigma_n$.
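This pairing is easy to verify numerically. Below is a NumPy sketch (random bidiagonal entries, arbitrary seed) that builds $\mathbf K$, applies the perfect shuffle as an index permutation, and compares the eigenvalues of the resulting tridiagonal with the singular values of $\mathbf B$:

```python
import numpy as np

n = 4
rng = np.random.default_rng(1)
d = rng.standard_normal(n)        # diagonal of B
e = rng.standard_normal(n - 1)    # superdiagonal of B

# Upper-bidiagonal B and the block matrix K = [[0, B^T], [B, 0]].
B = np.diag(d) + np.diag(e, 1)
K = np.block([[np.zeros((n, n)), B.T],
              [B, np.zeros((n, n))]])

# Perfect shuffle: interleave the indices 0..n-1 with n..2n-1.
p = np.arange(2 * n).reshape(2, n).T.ravel()   # [0, n, 1, n+1, ...]
T = K[np.ix_(p, p)]                            # P K P^T

# T is tridiagonal with zero diagonal, and its eigenvalues are
# plus/minus the singular values of B.
sv = np.linalg.svd(B, compute_uv=False)
ev = np.sort(np.linalg.eigvalsh(T))
print(np.allclose(ev, np.sort(np.concatenate([sv, -sv]))))   # True
```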

Odd-order zero-diagonal tridiagonals can be treated similarly, as they have a zero eigenvalue in addition to the $\pm$ pairs of eigenvalues. The treatment given above for Golub-Kahan tridiagonals becomes applicable after deflating out the zero eigenvalue: compute the QR decomposition $\mathbf T=\mathbf Q\mathbf R$, form the product $\mathbf R\mathbf Q$, and delete its last row and column, which yields a Golub-Kahan tridiagonal.
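Here is a sketch of that deflation in NumPy (the off-diagonal entries are a hypothetical example); one unshifted QR step pushes the zero eigenvalue into the last row and column:

```python
import numpy as np

# An odd-order zero-diagonal symmetric tridiagonal; its determinant
# vanishes for odd order, so it has a zero eigenvalue.
off = np.array([1.0, 2.0, 3.0, 4.0])
T = np.diag(off, 1) + np.diag(off, -1)    # 5 x 5, zero diagonal

Q, R = np.linalg.qr(T)
A = R @ Q                                 # similar to T: A = Q^T T Q

# The last row and column of A are numerically zero, so the zero
# eigenvalue deflates; the leading 4 x 4 block is a Golub-Kahan
# tridiagonal carrying the +/- pairs.
G = (A[:-1, :-1] + A[:-1, :-1].T) / 2     # symmetrize away rounding
print(np.max(np.abs(np.diag(G))))         # ~0: the zero diagonal survives
ev = np.sort(np.linalg.eigvalsh(G))
print(np.allclose(ev, -ev[::-1]))         # True: eigenvalues in +/- pairs
```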

See Ward and Gray's paper (along with the associated FORTRAN code) and the beautiful survey by David Watkins.

  • Can you say a little more on transforming to a tridiagonal matrix? Is it along the lines of $PAP^\top=LTL^\top$, with $A$ our symmetric matrix, $P$ a permutation, $L$ unit lower triangular, $T$ tridiagonal (found on the Wikipedia "Symmetric matrix" page, from Golub & Van Loan, "Matrix Computations", 1996)? (2011-11-07)
  • I said "similar", not "congruent". For symmetric $\mathbf A$, one can always find an orthogonal matrix $\mathbf W$ such that $\mathbf A=\mathbf W\mathbf T\mathbf W^\top$ and $\mathbf T$ is tridiagonal. What you are describing is Aasen's decomposition, which is not a similarity transformation. (2011-11-07)
  • Huh, scratch that; the similarity doesn't preserve the zero diagonal in general. At least you have a substantial simplification in the tridiagonal case. (2011-11-07)
  • @J.M. A symmetric tridiagonal matrix cannot have repeated eigenvalues, so this $\mathbf A = \mathbf W\mathbf T\mathbf W^\top$ doesn't seem to hold for all symmetric matrices. (2016-01-27)
  • @Keivan, you mean a symmetric **unreduced** tridiagonal matrix. If $\mathbf A$ has repeated eigenvalues, the resulting matrix is a direct sum of tridiagonal matrices (alternatively, it has zeroes on the off-diagonals). (2016-02-11)
  • @J.M. You're right. (2016-02-11)
Answer (score 6)

Regarding your first two questions, the matrices that can be orthogonally transformed into a zero-diagonal symmetric matrix are exactly those symmetric matrices such that the sum of their eigenvalues is zero.

Indeed, since the trace of a symmetric matrix is the sum of its eigenvalues and is invariant under orthogonal similarity, necessity follows. Sufficiency follows from the Schur-Horn theorem, which says that the possible diagonals of a Hermitian operator with given eigenvalues are exactly the vectors majorized by the eigenvalue vector: if the eigenvalues $\lambda_1,\ldots,\lambda_n$ sum to zero, then the zero vector is majorized by $\lambda$, so there is an orthonormal basis in which the operator has zero diagonal.
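The sufficiency argument can be made constructive: since the trace is zero, there is always a pair of diagonal entries of opposite sign, and a Givens rotation in that plane can zero one of them. The sketch below iterates this in NumPy (`hollow_basis` is my own hypothetical helper, not a library routine; the test matrix is random with its trace subtracted off):

```python
import numpy as np

def hollow_basis(A, tol=1e-12):
    """Orthogonal U with diag(U.T @ A @ U) == 0 for a symmetric A of
    zero trace: Givens rotations zero one diagonal entry at a time."""
    A = A.copy()
    n = A.shape[0]
    U = np.eye(n)
    for _ in range(n - 1):
        dg = np.diag(A)
        i, j = int(np.argmax(dg)), int(np.argmin(dg))
        if dg[i] < tol:                  # diagonal already (numerically) zero
            break
        # Choose the rotation so the new (i,i) entry vanishes:
        # dg[i]*c^2 + 2*A[i,j]*c*s + dg[j]*s^2 = 0, with t = s/c.
        q = A[i, j]
        t = (-q + np.sqrt(q * q - dg[i] * dg[j])) / dg[j]
        c, s = 1 / np.hypot(1, t), t / np.hypot(1, t)
        G = np.eye(n)
        G[i, i] = G[j, j] = c
        G[j, i], G[i, j] = s, -s
        A = G.T @ A @ G
        U = U @ G
    return U

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
A = A + A.T
A -= np.trace(A) / 4 * np.eye(4)         # force zero trace

U = hollow_basis(A)
print(np.max(np.abs(np.diag(U.T @ A @ U))))   # ~0: zero diagonal achieved
```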

As for further properties of these matrices, I don't think much can be said: take any $n\times n$ symmetric matrix $A$ and extend it to $A\oplus(-\operatorname{Tr}(A))$, a direct sum with a single $1\times 1$ block; the result has zero trace and is therefore orthogonally similar to a zero-diagonal matrix.

Answer (score 0)

From an application standpoint: if the rows and columns of a matrix $M$ represent data objects and the entries of $M$ are distances between them, then the Hermitian matrix

$$H = \tfrac{1}{2}(M+M^\top) + \tfrac{i}{2}(M-M^\top),$$

where $^\top$ denotes the transpose, represents an asymmetric distance, i.e., one violating the metric-space axiom $d(A,B)=d(B,A)$ for a distance function $d$. The zero diagonal means that the distance from an object to itself is zero, preserving the axiom $d(A,A)=0$.
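A small NumPy sketch of this construction (random asymmetric "distances"; size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = rng.random((n, n))        # hypothetical asymmetric "distance" matrix
np.fill_diagonal(M, 0.0)      # d(x, x) = 0

# Symmetric part in the real component, antisymmetric part in the
# imaginary component.
H = 0.5 * (M + M.T) + 0.5j * (M - M.T)

print(np.allclose(H, H.conj().T))    # True: H is Hermitian
print(np.allclose(np.diag(H), 0))    # True: H is hollow
```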

Answer (score -6)

1) The trace is zero. 2) They are unitary matrices, i.e., the determinant is zero.

E.g.: a real skew-symmetric matrix.

  • It's rather rare for a skew-symmetric matrix to be symmetric... for a unitary matrix to have zero determinant, even more so. (2012-09-25)