
We have $2$ groups of random variables, $X_1, X_2, \ldots, X_T$ and $Y_1, Y_2, \ldots, Y_T$. The variables in the first group are NOT independent, but those in the second group are.

We know that

$ P \{X_1 > a \} < P \{Y_1 > a\} $

and

$ P \{X_{t+1} > b | X_{t} = c \} < P \{c \times Y_{t+1} > b\} $

Then is this inequality

$ P\{X_t > a\} < P\left\{\prod_{i=1}^tY_i > a\right\} $

correct?

Intuitively, this seems correct to me: in a certain sense, $X_t$ is smaller than $\prod_{i=1}^tY_i$.

But can we prove it?


1 Answer


All kinds of order reversals can occur when products of negative real numbers enter the picture, so let us assume that $X_t$ and $Y_t$ are nonnegative with full probability, for every $t$.

A key fact in this context is the following coupling result:

Assume that the random variables $\xi$ and $\eta$ are such that $\mathrm P(\xi\geqslant x)\leqslant\mathrm P(\eta\geqslant x)$ for every $x$. Then there exist random variables $\xi'$ and $\eta'$, defined on a common probability space, such that $\xi'$ is distributed like $\xi$, $\eta'$ is distributed like $\eta$, and $\xi'\leqslant\eta'$ with full probability.
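The coupling can be made concrete by driving both quantile functions with a single uniform draw. Below is a minimal sketch (not from the answer itself) using a hypothetical example, $\xi\sim\mathrm{Exp}(2)$ and $\eta\sim\mathrm{Exp}(1)$, which satisfy $\mathrm P(\xi\geqslant x)=e^{-2x}\leqslant e^{-x}=\mathrm P(\eta\geqslant x)$:

```python
import math
import random

def coupling_sample(rng):
    """Quantile coupling: one uniform draw zeta drives both variables.

    Example (an assumption for illustration): xi ~ Exp(rate 2) and
    eta ~ Exp(rate 1), so P(xi >= x) <= P(eta >= x) for every x >= 0.
    """
    z = rng.random()                      # common uniform seed zeta
    xi_prime = -math.log(1.0 - z) / 2.0   # inverse CDF of Exp(2)
    eta_prime = -math.log(1.0 - z) / 1.0  # inverse CDF of Exp(1)
    return xi_prime, eta_prime

rng = random.Random(0)
samples = [coupling_sample(rng) for _ in range(10_000)]
# The coupled pair satisfies xi' <= eta' pointwise, not just in distribution.
assert all(x <= y for x, y in samples)
```

Each marginal has the right distribution, yet the inequality now holds sample by sample, which is exactly what the proof below exploits.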

Of course, the converse holds since, if such random variables $\xi'$ and $\eta'$ exist, then, for every $x$, $[\xi'\geqslant x]\subseteq[\eta'\geqslant x]$, hence $\mathrm P(\xi\geqslant x)=\mathrm P(\xi'\geqslant x)\leqslant\mathrm P(\eta'\geqslant x)=\mathrm P(\eta\geqslant x)$.

Let us apply this key fact to show that $\mathrm P(X_t\geqslant x)\leqslant\mathrm P(U_t\geqslant x)$ for every $x\geqslant0$, recursively over $t\geqslant1$, where $U_t=Y_1Y_2\cdots Y_t$.

The case $t=1$ is part of the hypothesis.

Assume the result holds for some $t\geqslant1$. Then, introducing the distribution $\mu_t$ of $X_t$, $ \mathrm P(X_{t+1}\geqslant x)=\mathrm E(\mathrm P(X_{t+1}\geqslant x\mid X_t))\leqslant\int\mathrm P(zY_{t+1}\geqslant x)\mu_t(\mathrm dz)=(*). $ Let $X$ denote any random variable independent of $Y_{t+1}$ and distributed like $X_t$. Then, $ (*)=\mathrm P(XY_{t+1}\geqslant x). $ The key fact applied to the recursion hypothesis shows that there exist some random variables $X'$ and $U'$ such that $X'$ is distributed like $X$, $U'$ is distributed like $U_t$, and $U'\geqslant X'$ almost surely. Let $Y'$ denote any random variable independent of $(X',U')$ and distributed like $Y_{t+1}$. Then, $(*)=\mathrm P(X'Y'\geqslant x)$.

Since $Y'\geqslant0$ almost surely, $U'Y'\geqslant X'Y'$ almost surely and $[X'Y'\geqslant x]\subseteq[U'Y'\geqslant x]$. Thus, $(*)\leqslant\mathrm P(U'Y'\geqslant x)$. Note that $(U',Y')$ is distributed like $(U_t,Y_{t+1})$ since $U'$ is independent of $Y'$, $U'$ is distributed like $U_t$ and $Y'$ is distributed like $Y_{t+1}$. Hence, $\mathrm P(U'Y'\geqslant x)=\mathrm P(U_tY_{t+1}\geqslant x)=\mathrm P(U_{t+1}\geqslant x)$, which concludes the proof that $\mathrm P(X_{t+1}\geqslant x)\leqslant\mathrm P(U_{t+1}\geqslant x)$.
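The conclusion can be sanity-checked numerically on a concrete chain satisfying both hypotheses. The example below is an assumption for illustration: $X_1\sim U(0,\tfrac12)$ and, given $X_t=c$, $X_{t+1}=cW$ with $W\sim U(0,\tfrac12)$, while the $Y_i$ are i.i.d. $U(0,1)$; since $U(0,\tfrac12)$ is stochastically dominated by $U(0,1)$, both hypotheses of the question hold.

```python
import random

def simulate(t, n, rng):
    """Monte Carlo sketch of P(X_t >= x) <= P(Y_1...Y_t >= x).

    Hypothetical chain: X_1 ~ U(0, 1/2) and, given X_t = c,
    X_{t+1} = c * W with W ~ U(0, 1/2); the Y_i are i.i.d. U(0, 1).
    """
    xs, us = [], []
    for _ in range(n):
        x = rng.uniform(0.0, 0.5)        # X_1
        u = rng.uniform(0.0, 1.0)        # U_1 = Y_1
        for _ in range(t - 1):
            x *= rng.uniform(0.0, 0.5)   # dependent chain step
            u *= rng.uniform(0.0, 1.0)   # independent product step
        xs.append(x)
        us.append(u)
    return xs, us

rng = random.Random(1)
xs, us = simulate(t=4, n=20_000, rng=rng)
for x0 in (0.001, 0.01, 0.05):
    p_x = sum(x >= x0 for x in xs) / len(xs)
    p_u = sum(u >= x0 for u in us) / len(us)
    assert p_x <= p_u + 0.01  # dominance, up to Monte Carlo noise
```

This is only a plausibility check for one particular chain, of course, not a substitute for the coupling argument above.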

Edit 1: The OP asks for some explanations about the relation $(*)=\mathrm P(XY_{t+1}\geqslant x)$. This follows from the definitions and from the independence property. To see this, introduce the distribution $\nu_{t+1}$ of $Y_{t+1}$ and note that, by definition, $ (*)=\int\mathrm P(zY_{t+1}\geqslant x)\mu_t(\mathrm dz)=\iint [zy\geqslant x]\mu_t(\mathrm dz)\nu_{t+1}(\mathrm dy), $ that is, $ (*)=\iint u(z,y)\mu_t(\mathrm dz)\nu_{t+1}(\mathrm dy),\quad\text{with}\ u:(z,y)\mapsto[zy\geqslant x]. $ Let $(X'',Y'')$ denote any couple of random variables with distribution $\mu_t\otimes\nu_{t+1}$. In other words, assume that the distribution of $X''$ is $\mu_t$, the distribution of $Y''$ is $\nu_{t+1}$, and $X''$ and $Y''$ are independent. Then, $(*)=\mathrm E(u(X'',Y''))=\mathrm P(X''Y''\geqslant x)$. To conclude, note that $(X,Y_{t+1})$ is an example of such a couple $(X'',Y'')$.
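The two ways of computing $(*)$ can be compared numerically. The sketch below (an illustration under assumed distributions, with $X\sim U(0,1)$ playing the role of $\mu_t$ and $Y\sim U(0,1)$ the role of $\nu_{t+1}$) estimates $\mathrm P(XY\geqslant x)$ once directly and once as the mixture $\int\mathrm P(zY\geqslant x)\,\mu(\mathrm dz)$:

```python
import random

rng = random.Random(2)
n = 100_000
x0 = 0.25

# Direct estimate of P(X*Y >= x0) with independent X, Y ~ U(0,1).
direct = sum(rng.random() * rng.random() >= x0 for _ in range(n)) / n

# Mixture estimate: average, over draws z ~ mu, of the exact inner
# probability P(z*Y >= x0) = max(0, 1 - x0/z) when Y ~ U(0,1).
mixed = sum(max(0.0, 1.0 - x0 / rng.random()) for _ in range(n)) / n

# Both estimates approximate the same double integral (Fubini),
# so they agree up to Monte Carlo error.
assert abs(direct - mixed) < 0.01
```

The exact value here is $1-x_0+x_0\ln x_0\approx0.403$ for $x_0=0.25$; the agreement of the two estimators is precisely the identity explained in Edit 1.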

Edit 2: The coupling result mentioned at the beginning of this post is a consequence of the following fact, often called Skorokhod representation theorem (see the first chapter of The coupling method by T. Lindvall):

For every random variables $\xi$ and $\eta$, there exists a random variable $\zeta$, uniform on $(0,1)$, and some nondecreasing functions $u$ and $v$ such that $u(\zeta)$ is distributed like $\xi$ and $v(\zeta)$ is distributed like $\eta$.

Since, in our case, $\mathrm P(\xi\geqslant x)\leqslant\mathrm P(\eta\geqslant x)$ for every $x$, one can choose $u$ and $v$ such that $u\leqslant v$. Hence, a solution (a so-called coupling) is $\xi'=u(\zeta)$ and $\eta'=v(\zeta)$.

The basic idea of this version of Skorokhod representation theorem is to pick for $u$ the inverse of the CDF $F_\xi:x\mapsto\mathrm P(\xi\leqslant x)$ of $\xi$ and for $v$ the inverse of the CDF $F_\eta:x\mapsto\mathrm P(\eta\leqslant x)$ of $\eta$. Beware however that, to be fully rigorous, one must define these so-called inverses carefully since, in the general case, $F_\xi$ and $F_\eta$ need not be continuous or strictly increasing (hence one relies on formulas like $u(z)=\inf\{x\mid F_\xi(x)\geqslant z\}$ and $v(z)=\inf\{x\mid F_\eta(x)\geqslant z\}$). Nevertheless, omitting the technical details of the representation, one can guess that $F_\xi\geqslant F_\eta$ implies $u\leqslant v$, which proves the result.
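The generalized inverse $u(z)=\inf\{x\mid F(x)\geqslant z\}$ is easy to implement for a discrete distribution, where the CDF is a step function (so neither continuous nor strictly increasing). A minimal sketch, with a hypothetical helper not taken from the answer:

```python
def quantile(cdf_points, z):
    """Generalized inverse u(z) = inf{x : F(x) >= z}.

    cdf_points: list of (x, F(x)) pairs, sorted by x, for a
    discrete distribution (hypothetical representation).
    """
    for x, f in cdf_points:
        if f >= z:
            return x
    raise ValueError("z exceeds the range of the CDF")

# A step CDF that is neither continuous nor strictly increasing:
F = [(0, 0.3), (1, 0.3), (2, 0.9), (3, 1.0)]
assert quantile(F, 0.3) == 0   # the infimum is attained at the jump
assert quantile(F, 0.5) == 2   # the flat piece at level 0.3 is skipped
assert quantile(F, 1.0) == 3
```

Pointwise, a smaller CDF yields a larger quantile function and vice versa, which is the monotonicity behind "$F_\xi\geqslant F_\eta$ implies $u\leqslant v$".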
