
Let $x_1,x_2,\dots,x_n$ be positive numbers. Consider the matrix $C$ whose $(i,j)$-th entry is

$$\min\left\{\frac{x_i}{x_j},\frac{x_j}{x_i}\right\}$$

Show that $C$ is non-negative definite (or positive semidefinite, meaning $z^t C z\geq 0$ for all $z\in \mathbb{R}^n$). When is $C$ positive definite?
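To get a feel for the claim before the proofs, here is a minimal numerical check, assuming NumPy is available (the names below are illustrative, not from the thread):

```python
# Minimal sanity check: build C for random positive x and inspect
# its eigenvalues.  C[i, j] = min(x_i/x_j, x_j/x_i).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 10.0, size=6)      # positive numbers x_1, ..., x_n

ratio = np.outer(x, 1.0 / x)            # ratio[i, j] = x_i / x_j
C = np.minimum(ratio, ratio.T)          # entrywise min of ratio and its transpose

print(np.linalg.eigvalsh(C))            # C is symmetric; all eigenvalues >= 0

# With a repeated value, the smallest eigenvalue drops to ~0,
# so C is singular and not positive definite:
x[1] = x[0]
ratio = np.outer(x, 1.0 / x)
C = np.minimum(ratio, ratio.T)
print(np.linalg.eigvalsh(C).min())      # ~0 up to round-off
```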

  • 2
    Non-negative definite? Is that another way of saying "positive semidefinite"? (2012-09-17)
  • 0
    Where is the problem from? (2012-09-17)

3 Answers

2

We can suppose that $x_i\leq x_j$ for $i<j$ (reordering the $x_i$'s only permutes rows and columns of $C$). Write $C_k$ for the top left corner of $C$ with $k$ rows and columns (so $C_n=C$); it is the matrix of the same form built from $x_1,\dots,x_k$. We will show below that $$\det C_n=\frac{x_1^2(x_2^2-x_1^2)(x_3^2-x_2^2)\cdots(x_n^2-x_{n-1}^2)}{x_1^2x_2^2\cdots x_n^2},\qquad(*)$$ so that $\det C_k\geq0$ for every $k$, and $\det C_k>0$ iff $x_1,\dots,x_k$ are all different. When all the $x_i$'s are different, all the leading principal minors are positive, so $C$ is positive definite by Sylvester's criterion; in general $C$ is a limit of such matrices (perturb the $x_i$'s slightly to make them distinct), hence positive semidefinite. Your matrix is therefore positive semidefinite, and it is positive definite iff all the $x_i$'s are different.

Now we need to prove $(*)$. Let $D_n$ be $C_n$ with the $(i,j)$-th element multiplied by $x_ix_j$, so that $\det D_n=\det C_n\times\prod_i x_i^2$. We want to show

$$\det D_n=x_1^2(x_2^2-x_1^2)\dots(x_n^2-x_{n-1}^2).$$

$D_n$ looks like
$$
\begin{pmatrix}
x_1^2 & x_1^2 & x_1^2 & x_1^2\\
x_1^2 & x_2^2 & x_2^2 & x_2^2\\
x_1^2 & x_2^2 & x_3^2 & x_3^2\\
x_1^2 & x_2^2 & x_3^2 & x_4^2
\end{pmatrix}
$$
(for $n=4$ - I hope the pattern is clear), i.e.
$$
\begin{pmatrix}
a & a & a & a\\
a & b & b & b\\
a & b & c & c\\
a & b & c & d
\end{pmatrix}
$$
($a=x_1^2,\dots,d=x_4^2$). If we now subtract the first row from the others and then the first column from the others, we get
$$
\begin{pmatrix}
a & 0 & 0 & 0\\
0 & b-a & b-a & b-a\\
0 & b-a & c-a & c-a\\
0 & b-a & c-a & d-a
\end{pmatrix},
$$
i.e.
$$
\det\begin{pmatrix}
a & a & a & a\\
a & b & b & b\\
a & b & c & c\\
a & b & c & d
\end{pmatrix}
=
a\det\begin{pmatrix}
b-a & b-a & b-a\\
b-a & c-a & c-a\\
b-a & c-a & d-a
\end{pmatrix}.
$$
Repeating this identity, we get
$$
\det D_n
=
a\det\begin{pmatrix}
b-a & b-a & b-a\\
b-a & c-a & c-a\\
b-a & c-a & d-a
\end{pmatrix}
=
a(b-a)\det\begin{pmatrix}
c-b & c-b\\
c-b & d-b
\end{pmatrix}
=
a(b-a)(c-b)(d-c),
$$
as we wanted to show.

There must be a more intelligent solution, but for the moment this one should do.
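The formula for $\det D_n$ is easy to spot-check numerically; a minimal sketch, again assuming NumPy (names illustrative):

```python
# Check det D_n = x_1^2 (x_2^2 - x_1^2) ... (x_n^2 - x_{n-1}^2)
# where D[i, j] = C[i, j] * x_i * x_j = min(x_i^2, x_j^2).
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.1, 10.0, size=5))   # x_1 <= ... <= x_n

ratio = np.outer(x, 1.0 / x)
C = np.minimum(ratio, ratio.T)
D = C * np.outer(x, x)                        # D[i, j] = C[i, j] * x_i * x_j

s = x**2
predicted = s[0] * np.prod(s[1:] - s[:-1])    # x_1^2 * prod of consecutive gaps
print(np.linalg.det(D), predicted)            # agree up to round-off
```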

  • 0
    Just because the determinant is positive, it doesn't mean that the matrix is PSD. (2012-09-17)
  • 0
    @S.B. positive determinant for all top left corners; see http://en.wikipedia.org/wiki/Sylvester%27s_criterion (2012-09-17)
  • 0
    I see, I wasn't aware of this theorem. (2012-09-17)
  • 0
    Thanks guys. Both proofs are elegant. The BM (Brownian motion) proof is more revealing... the sort of thing I was looking for. But the determinant proofs have their own charm too. (2012-09-18)
1

First, note that the matrix is symmetric: if $x_j > x_i$, then $C_{ij} = \frac{x_i}{x_j}$. Then, $C_{ji} = \min \{ \frac{x_i}{x_j}, \frac{x_j}{x_i} \} = \frac{x_i}{x_j} = C_{ij}$.

Since it is real and symmetric, it is orthogonally diagonalizable: $C = V^T D V$ with $D$ diagonal. Let $y = Vz$. Then the problem reduces to showing that $y^T D y \geq 0$. This should be easy to finish from here.

  • 0
    Thanks, but it is not! One must prove that the eigenvalues (the diagonal entries of $D$) are non-negative. Finding eigenvalues is a much harder problem. (2012-09-17)
  • 0
    There is a relationship between eigenvalues and determinants. (2012-09-17)
  • 0
    Product of eigenvalues equals determinant. It's not a homework problem; it came up while proving something else. (2012-09-17)
  • 0
    @user41518 Why do you think it is positive semidefinite? (2012-09-17)
  • 0
    I have made numerical experiments! Besides, it's equivalent to another matrix $C'$ having the same property; see the next comment. (2012-09-17)
  • 0
    Besides, it's equivalent to another matrix $C'$ having the same property. $C'$ has $(i,j)$-th entry $\exp(-\theta|y_i - y_j|)$, $\theta>0$, where $y_1,y_2,\dots,y_n$ are some real numbers. To see the equivalence, assume without loss of generality that the $x_i$ are increasing, and put $x_i = e^{\theta y_i}$ (a numerical check of this equivalence is sketched right after these comments). (2012-09-17)
  • 0
    I had actually misread your problem at first; I thought you knew it was positive semi-definite, and wanted to know under which conditions it was PD! Nevertheless, it should still be possible to manipulate the symmetry argument. (2012-09-17)
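The equivalence sketched in the last comments (setting $x_i = e^{\theta y_i}$ turns $C$ into the kernel matrix with entries $\exp(-\theta|y_i-y_j|)$) can be verified numerically; a minimal sketch assuming NumPy (names illustrative):

```python
# With x_i = exp(theta * y_i):
#   min(x_i/x_j, x_j/x_i) = exp(-theta * |y_i - y_j|).
import numpy as np

rng = np.random.default_rng(2)
theta = 0.7
y = rng.normal(size=5)
x = np.exp(theta * y)

ratio = np.outer(x, 1.0 / x)
C = np.minimum(ratio, ratio.T)
C_prime = np.exp(-theta * np.abs(y[:, None] - y[None, :]))

print(np.allclose(C, C_prime))   # True: the two matrices coincide
```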
1

Consider positive real numbers $t_i > 0$ and real numbers $\alpha_j \in \mathbb{R}$.

  • The matrix $M_{ij}=\min(t_i, t_j)$ is positive semi-definite since $\min(t_i,t_j)=\int_0^{\infty} 1_{t<t_i}\,1_{t<t_j}\,dt$, hence $\sum_{i,j} z_i M_{ij} z_j = \int_0^{\infty} \big( \sum_i z_i 1_{t<t_i} \big)^2 \, dt \geq 0$.
  • If $M=(M_{ij})_{ij}$ is positive semi-definite, so is $N_{ij} = \alpha_i \alpha_j M_{ij}$ since $\sum z_i z_j N_{ij} = \sum y_i y_j M_{ij} \geq 0$ where $y_i = \alpha_i z_i$.

The choice $t_i = x_i^2$ and $\alpha_j = \frac{1}{x_j}$ solves the exercise.
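A minimal numerical sketch of this construction, assuming NumPy (names illustrative), checking that $N_{ij}=\alpha_i\alpha_j\min(t_i,t_j)$ with $t_i=x_i^2$ and $\alpha_j=1/x_j$ reproduces $C$:

```python
# N[i, j] = alpha_i * alpha_j * min(t_i, t_j) with t_i = x_i^2,
# alpha_j = 1/x_j, which equals min(x_i/x_j, x_j/x_i).
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.1, 10.0, size=5)

t = x**2
M = np.minimum(t[:, None], t[None, :])   # M[i, j] = min(t_i, t_j)
alpha = 1.0 / x
N = np.outer(alpha, alpha) * M           # rank-one scaling keeps PSD

ratio = np.outer(x, 1.0 / x)
C = np.minimum(ratio, ratio.T)
print(np.allclose(N, C))                 # True: N coincides with C
```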

  • 0
    Beautiful; while I was proving that $M$ (in my answer denoted by $D$) is positive semidefinite via determinants, you gave a direct half-line argument. (2012-09-17)
  • 0
    Yes, it is beautiful! (2012-09-18)