
Let $p(x)=x^n+a_{n-2}x^{n-2}+a_{n-3}x^{n-3}+\cdots+a_1x+a_0=(x-\lambda_1)\cdots(x-\lambda_n)$ be a polynomial with real coefficients such that every $\lambda_i$ is real.

Is there always a real symmetric $n\times n$ matrix $M$ with only zeros on its main diagonal whose characteristic polynomial is $p$?

  • Are the $\lambda_i$ known? Or is a better solution one that uses only the $a_i$? (2012-12-22)

2 Answers


The answer is affirmative, and the construction of such a zero-diagonal real symmetric matrix is conceptually easy. Begin with $D=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$, and note that $\operatorname{tr}D=\sum_i\lambda_i=0$ because the coefficient of $x^{n-1}$ in $p$ is zero. Let $Q$ be a real orthogonal matrix whose last column is $u=\frac{1}{\sqrt{n}}(1,\ldots,1)^T$ (e.g. the Householder reflection $Q=I-2vv^T/\|v\|^2$, where $v^T=u^T-(0,\ldots,0,1)$). Then $Q^TDQ$ is a real symmetric matrix whose $(n,n)$-th entry equals $u^TDu=\frac1n\sum_i\lambda_i=0$. Now apply the same procedure recursively to the leading principal submatrix: it is again symmetric with zero trace, so first orthogonally diagonalize it and then conjugate as above; this leaves the already-zeroed diagonal entries untouched. After $n-1$ steps we obtain the desired matrix.
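The recursion can be sketched numerically; this is only a sketch of the construction above, and the function name is ours. At each step the leading traceless symmetric block is orthogonally diagonalized, then conjugated by a Householder reflection whose last column averages its eigenvalues:

```python
import numpy as np

def zero_diag_symmetric(eigs):
    """Sketch: build a real symmetric matrix with zero diagonal whose
    eigenvalues are `eigs` (they must sum to zero)."""
    eigs = np.asarray(eigs, dtype=float)
    assert abs(eigs.sum()) < 1e-9, "trace must vanish"
    n = len(eigs)
    M = np.diag(eigs)
    for k in range(n, 1, -1):
        A = M[:k, :k]                      # symmetric, trace ~ 0
        w, P = np.linalg.eigh(A)           # A = P diag(w) P^T
        q = P @ (np.ones(k) / np.sqrt(k))  # unit vector with q^T A q = mean(w) = 0
        e = np.zeros(k); e[-1] = 1.0
        v = q - e                          # Householder direction sending e to q
        if v @ v > 1e-24:
            Q = np.eye(k) - 2.0 * np.outer(v, v) / (v @ v)
        else:
            Q = np.eye(k)
        # similarity by diag(Q, I): acts on the first k rows and columns only,
        # so diagonal entries at positions k..n-1 (already zero) are untouched
        M[:k, :] = Q.T @ M[:k, :]
        M[:, :k] = M[:, :k] @ Q
    return (M + M.T) / 2                   # symmetrize away roundoff
```

Each pass zeroes one more diagonal entry while preserving the spectrum, since every update is an orthogonal similarity.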

  • @adamW I misspoke. The Householder trick is applicable to traceless *normal* matrices only; it doesn't work for general traceless matrices. I should have known this well, because in [a previous problem](http://math.stackexchange.com/questions/95537/does-the-set-of-matrix-commutators-form-a-subspace/252324#252324) I was forced to reduce a general traceless matrix to a zero-diagonal matrix using $2\times2$ matrices instead of Householder reflections. Stupid me. Absentminded. :-D (2012-12-29)

It suffices to treat the $2 \times 2$ case: since the $\lambda_i$ are known, start with the diagonal matrix and operate on $2 \times 2$ submatrices. Pick two diagonal entries that are opposite in sign (this is always possible while nonzero entries remain, since the trace is zero).

An orthogonal matrix $\pmatrix{c & s \\ -s & c}$ is desired such that:

$\pmatrix{c & s \\ -s & c}\pmatrix{\lambda_0 & 0 \\ 0 & \lambda_1}\pmatrix{c & s \\ -s & c}^{-1}=\pmatrix{0 & * \\ * & \lambda_0 + \lambda_1}$

where $c^2 + s^2 =1$. This is the same as: \begin{align} \pmatrix{c & s \\ -s & c}\pmatrix{\lambda_0 & 0 \\ 0 & \lambda_1}\pmatrix{c & -s \\ s & c}&=\pmatrix{0 & * \\ * & \lambda_0 + \lambda_1} \\ \pmatrix{\lambda_0 c & \lambda_1 s \\ -\lambda_0 s & \lambda_1 c}\pmatrix{c & -s \\ s & c}&=\pmatrix{0 & * \\ * & \lambda_0 + \lambda_1} \\ \pmatrix{\lambda_0 c^2 + \lambda_1 s^2 & (\lambda_1 -\lambda_0) cs \\ (\lambda_1 -\lambda_0) cs & \lambda_1 c^2 + \lambda_0 s^2}&=\pmatrix{0 & * \\ * & \lambda_0 + \lambda_1} \\ \end{align}

Since it is a similarity transformation, only the equation $\lambda_0 c^2 + \lambda_1 s^2=0$ needs to be solved; the trace value $\lambda_0 + \lambda_1$ follows automatically. Together with $c^2+s^2=1$ this gives $c^2=\frac{\lambda_1}{\lambda_1-\lambda_0}$ and $s^2=\frac{\lambda_0}{\lambda_0-\lambda_1}$. Since $\lambda_0$ and $\lambda_1$ are opposite in sign, both right-hand sides are nonnegative, so $c$ and $s$ can be taken real.
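A quick numerical check of this $2\times2$ step (the variable names are ours, with $c^2=\lambda_1/(\lambda_1-\lambda_0)$ and $s^2=-\lambda_0/(\lambda_1-\lambda_0)$, valid when the two entries have opposite signs):

```python
import numpy as np

l0, l1 = 4.0, -1.0                  # opposite signs, as required
c = np.sqrt(l1 / (l1 - l0))         # c^2 = 0.2
s = np.sqrt(-l0 / (l1 - l0))        # s^2 = 0.8
Q = np.array([[c, s], [-s, c]])     # orthogonal, since c^2 + s^2 = 1
D = np.diag([l0, l1])
R = Q @ D @ Q.T                     # similarity; R[0,0] = l0*c^2 + l1*s^2 = 0
```

Here `R` comes out with a zero in the $(1,1)$ slot and $\lambda_0+\lambda_1=3$ in the $(2,2)$ slot, as the derivation predicts.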

Repeat the procedure on $2 \times 2$ submatrices, pairing up the remaining nonzero diagonal entries; whenever two diagonal entries are paired, their common off-diagonal entry is still $0$, so the computation above applies verbatim. The end result has zeros all along the diagonal: each step zeroes one more diagonal entry, and the last one vanishes because the trace is zero.
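The full sweep can be sketched as follows (a sketch only; the function names and the sign convention for picking pairs are ours):

```python
import numpy as np

def rotate_pair(M, i, j):
    """Zero M[i, i] by a plane rotation on indices i, j, where M[i, i]
    and M[j, j] have opposite signs and M[i, j] == 0."""
    l0, l1 = M[i, i], M[j, j]
    c = np.sqrt(l1 / (l1 - l0))
    s = np.sqrt(-l0 / (l1 - l0))
    G = np.eye(len(M))
    G[i, i], G[i, j], G[j, i], G[j, j] = c, s, -s, c
    return G @ M @ G.T                 # orthogonal similarity

def zero_diagonal(eigs):
    """Start from diag(eigs) (eigenvalues summing to zero) and rotate
    until every diagonal entry is zero."""
    M = np.diag(np.asarray(eigs, dtype=float))
    while True:
        d = np.diag(M)
        pos = np.flatnonzero(d > 1e-12)
        neg = np.flatnonzero(d < -1e-12)
        if len(pos) == 0 or len(neg) == 0:
            break                      # zero trace forces all-zero diagonal
        M = rotate_pair(M, pos[0], neg[0])
    return M
```

Each rotation removes one nonzero diagonal entry, so the loop finishes in at most $n-1$ steps.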

A similar technique is possible using only the $a_i$: begin with the companion matrix of $p$ and find a similarity taking it to a symmetric matrix. This is doable but more involved, and the intermediate submatrices would no longer have zero off-diagonal entries; handling those nonzero entries is still possible, just slightly more complicated.
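For reference, the companion-matrix starting point mentioned above can be written down directly from the $a_i$; this is the standard construction only, not the symmetrizing similarity, and the function name is ours:

```python
import numpy as np

def companion(a):
    """Companion matrix of x^n + a[n-1] x^{n-1} + ... + a[1] x + a[0];
    for the polynomial in the question, a[n-1] = 0."""
    n = len(a)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)              # subdiagonal of ones
    C[:, -1] = -np.asarray(a, dtype=float)  # last column carries the coefficients
    return C
```

For example, $p(x)=x^3-x$ has coefficient list $a=(0,-1,0)$, and `companion([0.0, -1.0, 0.0])` has eigenvalues $-1,0,1$, the roots of $p$.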

  • @Dominik Oops, I missed that. Thank you! (2012-12-22)