8

I'm stuck trying to prove the following exercise: given real numbers $x_1,\ldots,x_n$ and $y_1,\ldots,y_n$, show that $$ \det(e^{\large{x_iy_j}})_{i,j=1}^n>0 $$ provided that $x_1<\cdots<x_n$ and $y_1<\cdots<y_n$.

Any ideas?

  • 0
    I assume that the index of $y$ is $j$.2012-05-26
  • 0
    @Phira : You're right. Let me edit that.2012-05-26
  • 2
    If one of the sets consists of integers, this follows from the Vandermonde determinant and the definition of a Schur function.2012-05-26
  • 0
    Can you give some context for the exercise?2012-05-26
  • 0
    In approximation theory, this amounts to showing that the family of functions $x\mapsto e^{xy_i}$, $i=1,\ldots,n$, forms a Chebyshev system.2012-05-26
  • 1
    Positivity of this matrix is equivalent to positivity of the matrix $(e^{-(x_i-x_j)^2})$ (expand the square and one gets the $x_iy_j$ terms). But the latter is well known to be positive definite, for example in the area of Gaussian processes (it is a covariance matrix). There is actually a very neat lower bound for the determinant of the matrix you give, proved by Drury-Marshall (1987).2012-05-30
  • 1
    This has been proved in Theorem 6.5.2, pp.297-299 of Kung *et al.*, [Combinatorics: The Rota Way](http://www.math.tamu.edu/~cyan/book.html), Cambridge University Press.2013-05-09

3 Answers

6

In the following I give a partial answer: I prove the strict inequality for rational $y_j$, which only implies the weak inequality for real $y_j$.

First suppose that $y_j$ are integers. Set $\lambda_j=y_j-j$ which is a partition.

Set $a_i=e^{x_i}$. Your determinant becomes $\det(a_i^{y_j})=\det(a_i^{j-1})\, s_{\lambda}(a_1,\dots,a_n)$: the product of a Vandermonde determinant, which is positive because the $a_i$ are increasing (exponentiation preserves the order of the $x_i$), and a Schur function evaluated at positive arguments (since $\exp$ is always positive), which is positive because a Schur function is a sum of monomials in its variables.
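This factorization can be sanity-checked numerically (a sketch using NumPy; the particular increasing $x_i$ and integer exponents $y_j$ below are arbitrary choices, not taken from the question): the ratio $\det(a_i^{y_j})/\det(a_i^{j-1})$ is the Schur value and comes out positive.

```python
import numpy as np

# Arbitrary increasing x_i and distinct increasing integer exponents y_j.
x = np.array([-0.5, 0.0, 0.4, 1.0])
y = np.array([0, 1, 3, 4])
n = len(x)

a = np.exp(x)                            # a_i = e^{x_i}, positive and increasing

M = a[:, None] ** y[None, :]             # M[i, j] = a_i^{y_j}
V = a[:, None] ** np.arange(n)[None, :]  # Vandermonde matrix a_i^{j-1}

det_M = np.linalg.det(M)
det_V = np.linalg.det(V)
schur_value = det_M / det_V              # s_lambda evaluated at (a_1, ..., a_n)

assert det_V > 0          # Vandermonde of increasing positive numbers
assert schur_value > 0    # Schur value at positive arguments
assert det_M > 0          # hence the original determinant is positive
```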

Now suppose that the $y_j$ are rational. Let $D$ be their common denominator and change $x_i$ and $y_j$ to $x_i/D$ and $y_jD$, which brings us back to the integer case.

Finally, for real numbers you can take a limit, which unfortunately only gives $\det \ge 0$; but maybe you have another way of knowing that your matrix is not singular.
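The rescaling step for rational exponents is easy to verify numerically (a sketch; the $x_i$ and rational $y_j$ below are made up for illustration): replacing $(x_i, y_j)$ by $(x_i/D, Dy_j)$ leaves every entry $e^{x_iy_j}$ unchanged while making the exponents integral.

```python
import numpy as np
from fractions import Fraction

x = np.array([-1.3, 0.2, 0.9])                        # x_1 < x_2 < x_3
y = [Fraction(1, 6), Fraction(1, 2), Fraction(2, 3)]  # rational y_1 < y_2 < y_3

D = np.lcm.reduce([f.denominator for f in y])         # common denominator D = 6
y_int = np.array([int(f * D) for f in y])             # integer exponents D*y_j

M = np.exp(np.outer(x, [float(f) for f in y]))        # original matrix (e^{x_i y_j})
M_rescaled = np.exp(np.outer(x / D, y_int))           # after the substitution

assert np.allclose(M, M_rescaled)                     # entries are unchanged
assert np.linalg.det(M) > 0
```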

4

It suffices to prove that, given $x_1<\cdots<x_n$ and $y_1<\cdots<y_n$, the determinant $D$ of the matrix $M=(e^{x_iy_j})$ is nonzero.

Indeed, $D$ is a continuous real function on the open connected subset $$ U=\{(\bar{x},\bar{y})\in\mathbb{R}^n\times\mathbb{R}^n : x_1<\cdots<x_n,\ y_1<\cdots<y_n\}, $$ so if it never vanishes there it has constant sign, and checking a single convenient point (for instance $\bar{y}=\bar{x}$, where $M$ is a Gram matrix of the kernel $e^{xy}$ and hence has positive determinant) shows that this sign is positive.

To prove the nonvanishing of the determinant, we use induction on $n$: fix the values of $x_1,\ldots,x_{n-1}$ and of $y_1,\ldots,y_n$, and let $x_n=z$ vary over $\mathbb{R}$. Consider the determinant $D(z)$ of $M$ as a function of $z$. Clearly $D(x_i)=0$ for $i=1,\ldots,n-1$, and we expect (if the claim is true) that $D(z)$ vanishes nowhere else.

Using the Laplace expansion along the last row, we see that $D(z)$ has the form $$ D(z)= A_1 (e^{y_1})^z+\cdots+A_n (e^{y_n})^z $$ with coefficients $A_j\neq 0$: each $A_j$ is, up to sign, an $(n-1)\times(n-1)$ determinant of the same type, hence nonzero by the induction hypothesis.
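This expansion can be illustrated numerically (a sketch with $n=3$; all values below are made up): the coefficients $A_j$ are the signed $2\times 2$ minors obtained by deleting the last row and the $j$-th column, and $D(z)$ indeed vanishes at $z=x_1,x_2$.

```python
import numpy as np

x_fixed = np.array([-1.0, 0.5])     # x_1, x_2 are fixed; x_3 = z varies
y = np.array([-0.7, 0.1, 1.2])      # y_1 < y_2 < y_3

def D(z):
    """Determinant of the 3x3 matrix whose last row is (e^{z y_j})_j."""
    rows = np.vstack([np.exp(np.outer(x_fixed, y)), np.exp(z * y)])
    return np.linalg.det(rows)

# Cofactors of the last row: A_j = (-1)^j * (minor deleting column j),
# each minor being a 2x2 determinant of the same exponential form.
A = [(-1) ** j * np.linalg.det(np.exp(np.outer(x_fixed, np.delete(y, j))))
     for j in range(3)]

z = 0.8
assert abs(D(z) - sum(A[j] * np.exp(z * y[j]) for j in range(3))) < 1e-12
assert all(abs(a) > 0 for a in A)                      # nonzero coefficients
assert abs(D(x_fixed[0])) < 1e-12 and abs(D(x_fixed[1])) < 1e-12
```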

To conclude, it is natural to recall a well-known fact about exponential polynomials.

Lemma 1. Let $\rho_1,\ldots,\rho_n$ be distinct positive real numbers and let $c_1,\ldots,c_n\in\mathbb{R}$ be nonzero. Then the equation $\displaystyle\sum_{i=1}^n c_i\rho_i^z=0$ has at most $n-1$ solutions $z\in\mathbb{R}$.

Lemma 1 settles the induction step: $D(z)$ has at most $n-1$ real zeros, and $x_1,\ldots,x_{n-1}$ already account for all of them, so $D(z)\neq 0$ for every other $z$, in particular for $z=x_n$. Lemma 1 is a direct corollary of the following more general statement.

Lemma 2. Let $\rho_1,\ldots,\rho_n$ be distinct positive real numbers and let $c_1,\ldots,c_n\in\mathbb{R}[x]$ be nontrivial polynomials with real coefficients. Then the equation $\displaystyle\sum_{i=1}^n c_i(z)\rho_i^z=0$ has at most $\displaystyle\sum_{i=1}^n (\deg(c_i)+1)-1$ real solutions.

Lemma 2 is easily proven using only induction and Rolle's theorem.
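One can watch Lemma 1 at work numerically (a sketch; the bases $\rho_i$ and coefficients $c_i$ below are arbitrary): sampling an exponential polynomial with $n$ terms on a fine grid shows at most $n-1$ sign changes.

```python
import numpy as np

rho = np.array([0.3, 1.0, 2.5, 4.0])   # distinct positive bases (n = 4)
c = np.array([1.0, -3.0, 2.0, -0.5])   # nonzero coefficients

z = np.linspace(-10.0, 10.0, 200_001)
f = (c[None, :] * rho[None, :] ** z[:, None]).sum(axis=1)

# Count sign changes on the grid: each one witnesses a real zero.
sign_changes = np.count_nonzero(np.diff(np.sign(f)))
assert 1 <= sign_changes <= len(rho) - 1   # Lemma 1: at most n - 1 real zeros
```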

1

Some ideas:

1) What about induction on $n$? For $\,n=1\,$ the claim is trivial. For $\,n=2\,$ we have to prove that $\,\displaystyle{e^{x_1y_1+x_2y_2}>e^{x_1y_2+x_2y_1}\Longleftrightarrow (x_1-x_2)(y_1-y_2)>0}\,$, check... etc. (this looks really awful)

2) For any nonzero $\,X:=(v_1,v_2,\ldots,v_n)\in\mathbb{R}^n\,$, show that $\,X^TAX>0\,$, with $\,A:=\left(e^{x_iy_j}\right)\,$; this would make the matrix positive definite and thus its determinant positive (this looks slightly better...)
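A caveat with approach 2): $A=(e^{x_iy_j})$ is not symmetric in general, and positive definiteness is usually defined for symmetric matrices. In the symmetric special case $\bar y=\bar x$, however, $e^{xy}=\sum_k (xy)^k/k!$ is a positive-definite kernel, so $(e^{x_ix_j})$ is a Gram matrix with positive determinant. A quick check (a sketch; the distinct $x_i$ below are arbitrary):

```python
import numpy as np

x = np.array([-0.8, 0.1, 0.7, 1.5])   # distinct x_i, with y_i = x_i
A = np.exp(np.outer(x, x))            # symmetric: A[i, j] = e^{x_i x_j}

eigvals = np.linalg.eigvalsh(A)       # real eigenvalues of a symmetric matrix
assert np.all(eigvals > 0)            # positive definite (Gram matrix)
assert np.linalg.det(A) > 0
```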

  • 0
    Thanks to Davide, I already edited my answer and changed the notation. @Anon, I don't quite understand what you mean by $X$ independent of $A$...? One is a vector, the other one is a matrix.2012-05-26
  • 0
    @Davide, I'm not sure. I think we get something like $$X^TAX=\sum_{k=1}^n v_k\sum_{i=1}^n v_ie^{x_ky_i}$$2012-05-26
  • 0
    Re: your comment to me, the matrix $(\exp x_iy_j)_{ij}$ is a function of the vector $(x_i)$; they are not independent of each other and so saying that $X^TA(X)X>0$ would not necessarily mean $A(X)$ was pos-def for a specific $X$. Of course you wanted $X$ independent of $A$, and so changed this in light of Davide's comment.2012-05-26