
I expect this probably has an easy answer but I'm feeling stupid and I'm stumped.

A book I'm reading, if I've understood it correctly, includes the following assertion. Let $M$ be a $3\times3$ symmetric matrix of real numbers with the following form:

$M \equiv \left(\begin{array}{ccc} 1 & a & b \\ a & 1 & c \\ b & c & 1 \end{array}\right)$

Then $M$ is positive semidefinite iff $a^2 + b^2 + c^2 - 2 a b c \leq 1$.

The only approach I can think of is to look at the characteristic polynomial of $M$, which (if I didn't mess up) turns out to be

$(1 - \lambda)^3 - (a^2 + b^2 + c^2)(1 - \lambda) + 2 a b c$

So it seems that I need to show that, if $\alpha \ge 0$ and $\beta \in \mathbb{R}$, then $\alpha - \beta \leq 1$ iff all roots of $x^3 - \alpha x + \beta$ satisfy $x \leq 1$. But I can't see how to prove that. Can anyone help?
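For what it's worth, a numerical spot-check of this reduction (just a sketch, assuming NumPy is available), verifying that the roots of $x^3-\alpha x+\beta$ are exactly $1-\lambda_i$ for the eigenvalues $\lambda_i$ of $M$:

```python
import numpy as np

# Spot-check the reduction: the roots of x^3 - alpha*x + beta, with
# alpha = a^2 + b^2 + c^2 and beta = 2abc, should be 1 - lambda_i for
# the eigenvalues lambda_i of M.
a, b, c = 0.3, -0.5, 0.7
M = np.array([[1, a, b],
              [a, 1, c],
              [b, c, 1]], dtype=float)
eigs = np.linalg.eigvalsh(M)            # eigenvalues of M, ascending

alpha, beta = a*a + b*b + c*c, 2*a*b*c
roots = np.roots([1, 0, -alpha, beta])  # roots of x^3 - alpha*x + beta

print(np.allclose(np.sort(1 - eigs), np.sort(roots.real)))  # True
```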

4 Answers


Another way of looking at it: a real symmetric $n \times n$ matrix is positive semidefinite iff all of its principal minors are nonnegative. Here the $1\times 1$ principal minors are the diagonal entries, all equal to $1$, so the conditions are $1 - a^2 \ge 0$, $1 - b^2 \ge 0$, $1-c^2 \ge 0$, and $\det M = 1 - a^2 - b^2 - c^2 + 2 a b c \ge 0$.
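A quick numerical cross-check of the principal-minor criterion against a direct eigenvalue computation (a sketch, assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(0)

def psd_by_minors(a, b, c):
    # 1x1 principal minors are the diagonal entries (all 1), the 2x2 principal
    # minors are 1-a^2, 1-b^2, 1-c^2, and the 3x3 principal minor is det M.
    det_M = 1 - a*a - b*b - c*c + 2*a*b*c
    return 1 - a*a >= 0 and 1 - b*b >= 0 and 1 - c*c >= 0 and det_M >= 0

def psd_by_eigenvalues(a, b, c, tol=1e-9):
    M = np.array([[1, a, b], [a, 1, c], [b, c, 1]], dtype=float)
    return np.linalg.eigvalsh(M).min() >= -tol

mismatches = 0
for _ in range(100_000):
    a, b, c = rng.uniform(-2, 2, size=3)
    if psd_by_minors(a, b, c) != psd_by_eigenvalues(a, b, c):
        mismatches += 1
print(mismatches)  # expect 0, up to borderline rounding cases
```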

  • Nice, thanks. (padding for min comment length) (2012-04-18)

$A$ is positive semidefinite $\iff$ all eigenvalues of $A$ are greater than or equal to $0$. To show semidefiniteness, we look at the quadratic form $x^{T}Ax$.

If this quantity is $\geq0$ for every $x$, then $A$ is positive semidefinite. Note that we can use the spectral theorem, which states that every $n\times n$ real symmetric matrix has $n$ orthonormal eigenvectors. This means that we can use a change of basis to rewrite $A$ in terms of its eigenvalues and eigenvectors (equivalently, $A=Q\Lambda Q^{-1}=Q\Lambda Q^{T}$ with the eigenvectors chosen to be orthonormal). We can rewrite any vector $x\in\mathbb{R}^{n}$ in terms of these eigenvectors; in particular, when $x$ is an eigenvector with eigenvalue $\lambda$, $x^{T}Ax=x^{T}\lambda x=\lambda\left\langle x,x\right\rangle \geq0\iff\lambda\geq0.$

Since the inner product $\left\langle x,x\right\rangle$ is always nonnegative (and positive for $x\neq0$), the condition reduces to $\lambda\geq0$. Expanding a general $x$ in the eigenvector basis and using linearity, it suffices to check this for each eigenvalue separately.

But note that $\det A=\lambda_{1}\cdots\lambda_{n}$. However, for the case $n=3$, $\det A\geq0$ only implies that either all of the $\lambda_{j}\geq0$ or exactly $2$ of the $\lambda_{j}<0$. You just need one more step showing that this latter case does not happen, or something equivalent, to conclude that $\det A\geq0\implies\lambda_{j}\geq0$ for all $j$. Once you have done this, you can see that all you need to show is that $\det A\geq0$, which is exactly $1-(a^{2}+b^{2}+c^{2})+2abc\geq0$, which is equivalent to the assertion. Hope this helps!
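For what it's worth, here is a concrete numerical illustration of that last case (a sketch using NumPy): with $a=b=c=2$ the determinant is positive, yet two eigenvalues are negative, so $\det A\geq0$ alone is not enough.

```python
import numpy as np

# With a = b = c = t = 2, det M = (1+2t)(1-t)^2 = 5 >= 0, but two
# eigenvalues are negative, so M is not positive semidefinite.
t = 2.0
M = np.array([[1, t, t],
              [t, 1, t],
              [t, t, 1]])
print(np.linalg.det(M))       # ~5.0
print(np.linalg.eigvalsh(M))  # [-1., -1., 5.]
```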

  • Thanks. This works if I make the additional assumption that $a$, $b$ and $c$ are $\le 1$, which is perhaps what the author intended. Sorry that I can only mark one of the replies as the answer. (2012-04-15)

Let the zeroes of $x^3-\alpha x+\beta$ be $r,s,t$. Then $x^3-\alpha x+\beta=(x-r)(x-s)(x-t)\;,$ so $r+s+t=0$, $rs+rt+st=-\alpha$, and $rst=-\beta$. Replacing $t$ by $-r-s$, we have $-\alpha=rs+r(-r-s)+s(-r-s)=-r^2-s^2-rs\;,$ or $\alpha=r^2+s^2+rs$. Thus,

$\begin{align*} \alpha-\beta&=r^2+s^2+rs+rst\\ &=r^2+s^2+rs-r^2s-rs^2\\ &=(r+s)^2-rs(1+r+s)\tag{1} \end{align*}$

It’s not hard to check that $r,s,t\le 1$ iff $\langle r,s\rangle$ lies in the triangular region $T$ bounded by $r=1,s=1$, and $r+s=-1$. For a fixed value $c$ of $r+s$, $(1)$ is maximized when $rs$ is minimized, and since $1+r+s\ge 0$, this occurs when $r=1$ or $s=1$, at the extreme ends of the intersection of $r+s=c$ with $T$. Thus, for $r+s=c$ we have $\alpha-\beta\le c^2-(c-1)(1+c)=1\;,$ and it follows that $\alpha-\beta\le 1$ whenever $r,s,t\le 1$.
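A quick symbolic check of identity $(1)$ and the substitution $t=-r-s$ (a sketch, assuming SymPy is available):

```python
import sympy as sp

# With t = -r - s, alpha - beta = r^2 + s^2 + rs + rst should equal
# (r+s)^2 - rs(1 + r + s), i.e. identity (1).
r, s = sp.symbols('r s')
t = -r - s
alpha = r**2 + s**2 + r*s
beta = -r*s*t                        # rst = -beta
lhs = sp.expand(alpha - beta)
rhs = sp.expand((r + s)**2 - r*s*(1 + r + s))
print(sp.simplify(lhs - rhs) == 0)   # True
```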

The converse is false:

$x^3-7x+6=(x-1)(x-2)(x+3)$

has a zero at $x=2>1$, but $\alpha-\beta=7-6=1$.
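A quick numerical confirmation of this counterexample (a sketch with NumPy):

```python
import numpy as np

# The roots of x^3 - 7x + 6 are 1, 2 and -3, so one root exceeds 1
# even though alpha - beta = 7 - 6 = 1 satisfies alpha - beta <= 1.
print(np.roots([1, 0, -7, 6]))  # approximately -3, 2, 1
```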

  • Thanks very much for your help. It appears that the original assertion is wrong (e.g. take $a = b = c = \sqrt{4/3}$). (2012-04-15)

I believe the assertion is true with the additional presumption that $a^2 + b^2 + c^2 \leq 3$.

Let $\phi(x) = x^3 - (a^2 + b^2 + c^2)x + 2 a b c$. Then $M \geq 0$ iff $\phi(x) = 0$ implies $x \leq 1$. Note that $\phi$ is a monic polynomial of degree $3$, and that $x$ is a zero of $\phi$ iff $1-x$ is an eigenvalue of $M$.

Suppose $M\geq 0$. Then $\phi(1)\geq 0$: otherwise, since $\phi(x)\to\infty$ as $x\to\infty$, $\phi$ would have a zero at some $x>1$, i.e. $M$ would have a negative eigenvalue $1-x$. It is easy to see that $\phi(1)\geq 0$ is equivalent to the condition $a^2 + b^2 + c^2 - 2 a b c \leq 1$.

Now suppose $a^2 + b^2 + c^2 - 2 a b c \leq 1$. Then we have $\phi(1)\geq 0$. We need to show that $\phi(x)>0$ whenever $x>1$. Since $\phi$ is a monic cubic polynomial, it is strictly increasing to the right of its local minimum. We can find the local minimum by looking at $\phi'(x) = 3x^2-(a^2+b^2+c^2)$: it occurs at $\hat x =\sqrt{\frac{a^2+b^2+c^2}{3}}$. By assumption, $\hat x \leq 1$, and since $\phi$ is strictly increasing on $[\hat x, \infty)$ and $\phi(1)\geq 0$, we have $\phi(x)>0$ when $x>1$, which finishes the proof.
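A random numerical test of this implication (a sketch, assuming NumPy): whenever $a^2+b^2+c^2\leq 3$ and $a^2+b^2+c^2-2abc\leq 1$, the smallest eigenvalue of $M$ should be nonnegative up to rounding.

```python
import numpy as np

rng = np.random.default_rng(0)

worst = np.inf
for _ in range(100_000):
    a, b, c = rng.uniform(-2, 2, size=3)
    s = a*a + b*b + c*c
    if s <= 3 and s - 2*a*b*c <= 1:
        M = np.array([[1, a, b], [a, 1, c], [b, c, 1]])
        worst = min(worst, np.linalg.eigvalsh(M).min())
print(worst)  # should not be meaningfully below 0
```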

Addendum: To show the 'iff' mentioned in the comment below, I just need to show that $M\geq 0$ implies $a^2 + b^2 + c^2 \leq 3$. Suppose this is not true; then, following the reasoning above, we have $\hat x > 1$. Substituting the value of $\hat x$, we get $\phi(\hat x) = 2(a b c - {\hat x}^3)= 2\left(abc - \left(\frac{a^2+b^2+c^2}{3}\right)^{\frac{3}{2}}\right)$. If I can show that $\phi(\hat x) \leq 0$, this finishes the proof: $\phi$ is monic, so it must then have a zero at some $x\geq\hat x>1$, which means $M$ has a negative eigenvalue. Clearly if $abc\leq 0$, then $\phi(\hat x) \leq 0$, so assume that $abc>0$. To conclude I need a small result which is similar to the fact that the geometric mean is no bigger than the arithmetic mean:

Suppose $m,n$ are positive integers, and $x_1,\dots,x_n$ are positive. Then we have $\sqrt[n]{x_1\cdots x_n} \leq \sqrt[m]{\frac{x_1^m+\dots+x_n^m}{n}}.$ Taking the $\log$ of the right-hand side and using concavity of $\log$, we have: $\frac{1}{m} \log \frac{x_1^m+\dots+x_n^m}{n} \geq \frac{1}{m} \sum_{i=1}^n \frac{1}{n} \log x_i^m = \frac{1}{n} \sum_{i=1}^n \log x_i = \log \sqrt[n]{x_1\cdots x_n}$

Taking the exponential of both sides produces the required inequality. Choosing $n=3$, $m=2$, $x_1=|a|$, $x_2=|b|$, $x_3=|c|$, we obtain $|abc| \leq \left(\frac{a^2+b^2+c^2}{3}\right)^{\frac{3}{2}}$, hence $\phi(\hat x)\leq 0$, which is the required result.
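A numerical spot-check of the $n=3$, $m=2$ case (a sketch using NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check (x1*x2*x3)^(1/3) <= sqrt((x1^2 + x2^2 + x3^2)/3) for positive xi;
# with xi = |a|, |b|, |c| this gives abc <= ((a^2+b^2+c^2)/3)^(3/2).
for _ in range(100_000):
    x1, x2, x3 = rng.uniform(0.01, 10.0, size=3)
    gm = (x1 * x2 * x3) ** (1.0 / 3.0)
    qm = np.sqrt((x1*x1 + x2*x2 + x3*x3) / 3.0)
    assert gm <= qm + 1e-12
```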

  • No, I don't think you're missing anything - the stronger condition I mentioned implies the sum-of-squares condition, and the context in the book I'm reading makes it likely that the author is implicitly assuming that $a$, $b$ and $c$ are $\le 1$. When I wrote that "the theorem is true when..." I was only talking about a sufficient condition, sorry if that was unclear. (2012-04-18)