
Let $X$ be a positive random variable. I would like to prove that the function $\varphi:\mathbb{N}\to [0,+\infty]$ defined by $\varphi(n)=-\log \mathbb{E}[\exp(-nX)]$ satisfies $\varphi(m+n)\leq \varphi(n)+\varphi(m)$.

I tried the Hölder and Jensen inequalities, but I only obtained the weaker result that $\varphi(n)\leq n\varphi(1)$.
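For example, if $X$ is exponential with rate $1$ (a concrete case I use only as a sanity check), then
$$\mathbb{E}[\exp(-nX)]=\int_0^\infty \mathrm e^{-nx}\,\mathrm e^{-x}\,\mathrm dx=\frac{1}{1+n},\qquad \varphi(n)=\log(1+n),$$
and the inequality indeed holds in this case since $(1+m)(1+n)=1+m+n+mn\geq 1+m+n$.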

1 Answer


This is a consequence of the following:

Coupling inequality: Let $u$ and $v$ be nonincreasing (nonnegative) functions. Then, $ \mathrm E(u(X)v(X))\geqslant \mathrm E(u(X))\mathrm E(v(X)). $

Proof: Let $w(x,y)=(u(x)-u(y))(v(x)-v(y))$. Then, $w(x,y)\geqslant0$ for every $x$ and $y$. Let $Y$ denote a random variable independent of $X$ and with the same distribution as $X$. Expanding $w$ in the inequality $\mathrm E(w(X,Y))\geqslant0$, one gets $ \mathrm E(u(X)v(X))-\mathrm E(u(X)v(Y))-\mathrm E(u(Y)v(X))+\mathrm E(u(Y)v(Y))\geqslant0. $ The common value of the first and fourth terms is $\mathrm E(u(X)v(X))$, since $Y$ is distributed like $X$. The common value of the second and third terms is $\mathrm E(u(X))\mathrm E(v(X))$, by independence. End of the proof.
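As a quick numerical sanity check (not part of the proof; the exponential distribution and the exponents below are arbitrary illustrative choices), one can estimate both sides of the coupling inequality by Monte Carlo:

```python
import numpy as np

# Monte Carlo check of E[u(X)v(X)] >= E[u(X)] E[v(X)] for nonincreasing u, v.
# The distribution of X and the exponents n, m are illustrative choices only.
rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=10**6)

n, m = 2, 3
u = np.exp(-n * X)  # nonincreasing in X
v = np.exp(-m * X)  # nonincreasing in X

lhs = np.mean(u * v)           # estimates E[u(X) v(X)] = E[exp(-(n+m) X)]
rhs = np.mean(u) * np.mean(v)  # estimates E[u(X)] E[v(X)]
print(lhs, rhs, lhs >= rhs)    # lhs should dominate rhs, up to sampling noise
```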

Application: Consider the functions $u:x\mapsto\mathrm e^{-nx}$ and $v:x\mapsto\mathrm e^{-mx}$. Both are nonnegative and nonincreasing, so the coupling inequality applies; the chain is spelled out below.
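Explicitly,
$$\mathrm E\left(\mathrm e^{-(n+m)X}\right)=\mathrm E(u(X)v(X))\geqslant \mathrm E(u(X))\,\mathrm E(v(X))=\mathrm E\left(\mathrm e^{-nX}\right)\mathrm E\left(\mathrm e^{-mX}\right),$$
and taking $-\log$ of both sides (which reverses the inequality) yields $\varphi(n+m)\leqslant\varphi(n)+\varphi(m)$.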
