10

I am trying to compute the conditional expectation $E[\max(X,Y) | \min(X,Y)]$ where $X$ and $Y$ are two iid random variables with $X,Y \sim \exp(1)$.

I already calculated the densities of $\min(X,Y)$ and $\max(X,Y)$, but I failed to calculate the joint density. Is this the right approach? If so, how can I compute the joint density? Or do I need a different ansatz?

  • Right. I guess your $\exp(\lambda)$ should be $\exp(-\lambda)$, but let us leave this for the moment. You might try to compute the probability of $[x\leqslant\min\leqslant\max\leqslant y]$. What is the result? And how does this give you the distribution of $(\min,\max)$? (2011-10-25)

6 Answers

8

As indicated in the comments, a useful idea when maxima and minima are involved is to consider well adapted events. Here, introducing $Z=\min\{X,Y\}$ and $W=\max\{X,Y\}$, one sees that $[z\leqslant Z,W\leqslant w]$ is $[z\leqslant X\leqslant w]\cap[z\leqslant Y\leqslant w]$ for every nonnegative $z$ and $w$ such that $z\leqslant w$.

Here is a computation: since the probability that a standard exponential random variable is $\geqslant x$ is $\mathrm e^{-x}$ for every nonnegative $x$, the events $[z\leqslant X\leqslant w]$ and $[z\leqslant Y\leqslant w]$ both have probability $\mathrm e^{-z}-\mathrm e^{-w}$. Hence
$$\mathrm P(z\leqslant Z,W\leqslant w)=(\mathrm e^{-z}-\mathrm e^{-w})^2.$$
Differentiating this with respect to $z$ and $w$ (with a change of sign for $z$, since $z$ enters as a lower bound) yields the density of $(Z,W)$ as
$$2\mathrm e^{-z-w}\cdot[0\leqslant z\leqslant w].$$
This formula is all right but, because of the indicator function in it, I am afraid to make mistakes when using it, so I try to simplify it. Let $V=W-Z$; then $Z\geqslant0$, $V\geqslant 0$, and, using $v=w-z$, the density becomes
$$2\mathrm e^{-z-(v+z)}\cdot[0\leqslant z\leqslant v+z]=2\mathrm e^{-2z}\cdot[z\geqslant 0]\cdot\mathrm e^{-v}\cdot[v\geqslant0].$$
This proves that $Z$ and $V$ are independent, with $Z$ exponential with parameter $2$ and $V$ exponential with parameter $1$, and yields at last the answer to the initial question:
$$\mathrm E(W\mid Z)=\mathrm E(V+Z\mid Z)=\mathrm E(V)+Z=1+Z.$$

The same technique yields that the vector of order statistics $(X^{(k)})_{1\leqslant k\leqslant n}$ of an i.i.d. sample $(X_k)_{1\leqslant k\leqslant n}$ of standard exponential random variables, defined by the conditions that $\{X^{(1)},X^{(2)},\ldots,X^{(n)}\}=\{X_1,X_2,\ldots,X_n\}$ and that $X^{(1)}\leqslant X^{(2)}\leqslant\cdots\leqslant X^{(n)}$, is distributed like $(Z_1,Z_1+Z_2,\ldots,Z_1+Z_2+\cdots+Z_n)$ for independent exponential random variables $(Z_k)_{1\leqslant k\leqslant n}$ such that the distribution of $Z_k$ is exponential with parameter $n-k+1$. A consequence is that, for every $1\leqslant k\leqslant\ell\leqslant n$,
$$\mathrm E(X^{(\ell)}\mid X^{(k)})=X^{(k)}+\sum_{i=n-\ell+1}^{n-k}\frac1i.$$
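For readers who like to sanity-check such identities numerically, here is a minimal Monte Carlo sketch (NumPy; the script and its variable names are mine, not part of the answer above). It tests both $\mathrm E(W\mid Z)=1+Z$ and the independence of $Z$ and $V=W-Z$:

```python
import numpy as np

# Simulate X, Y iid Exp(1) and form Z = min(X, Y), W = max(X, Y).
rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
z = np.minimum(x, y)
w = np.maximum(x, y)

# E(W | Z = z0) should be close to 1 + z0: condition on a narrow bin around z0.
for z0 in (0.2, 0.5, 1.0):
    sel = np.abs(z - z0) < 0.01
    print(z0, w[sel].mean())

# Z and V = W - Z should be independent, Exp(2) and Exp(1) respectively.
v = w - z
print(z.mean(), v.mean())        # approximately 0.5 and 1.0
print(np.corrcoef(z, v)[0, 1])   # approximately 0
```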

7

For two independent exponentially distributed variables $(X,Y)$, the joint distribution is
$$\mathbb{P}(x,y) = \mathrm{e}^{-x-y} \mathbf{1}_{x >0 } \mathbf{1}_{y >0 } \, \mathrm{d} x\, \mathrm{d} y.$$
Since $x+y = \min(x,y) + \max(x,y)$ and $\min(x,y) \le \max(x,y)$, the joint distribution of $(U,V) = (\min(X,Y), \max(X,Y))$ is
$$\mathbb{P}(u,v) = \mathcal{N} \mathrm{e}^{-u-v} \mathbf{1}_{v \ge u >0 } \, \mathrm{d} u\, \mathrm{d} v.$$
The normalization constant is easy to find:
$$\int_0^\infty \mathrm{d} v \int_0^v \mathrm{d} u \,\, \mathrm{e}^{-u-v} = \int_0^\infty \mathrm{d} v \,\, \mathrm{e}^{-v} ( 1 - \mathrm{e}^{-v} ) = 1 - \frac{1}{2} = \frac{1}{2} = \frac{1}{\mathcal{N}},$$
so $\mathcal{N} = 2$. Thus the conditional expectation we seek is, for $u>0$,
$$\mathbb{E}(\max(X,Y) \mid \min(X,Y) = u) = \frac{\int_u^\infty v\, \mathrm{d} P(u,v)}{\int_u^\infty \mathrm{d} P(u,v)} = \frac{\int_u^\infty \mathcal{N} v\, \mathrm{e}^{-u-v}\, \mathrm{d} v}{\int_u^\infty \mathcal{N}\, \mathrm{e}^{-u-v}\, \mathrm{d} v} = 1 + u.$$
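The normalization constant and the final ratio of integrals are easy to confirm symbolically; here is a small SymPy sketch (my own check, not part of the answer):

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)

# Normalization: the integral of e^{-u-v} over 0 < u < v should equal 1/2.
norm = sp.integrate(sp.exp(-u - v), (u, 0, v), (v, 0, sp.oo))
print(norm)  # 1/2, hence N = 2

# Conditional expectation E[max | min = u] as the ratio of two integrals.
num = sp.integrate(v * sp.exp(-u - v), (v, u, sp.oo))
den = sp.integrate(sp.exp(-u - v), (v, u, sp.oo))
print(sp.simplify(num / den))  # u + 1
```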

  • @DidierPiau I have removed my wrong answer and replaced it with a correct one. (2011-10-25)
7

If $Z = \min(X,Y)$ and $W = \max(X,Y)$, then for $w > z$,
$$\begin{align*} F_{Z,W}(z,w) &= P\{Z \leq z, W \leq w\}\\ &= P\left[\{X \leq z, Y \leq w\} \cup \{X \leq w, Y \leq z\}\right]\\ &= P\{X \leq z, Y \leq w\} + P\{X \leq w, Y \leq z\} - P\{X \leq z, Y \leq z\}\\ &= F_{X,Y}(z, w) + F_{X, Y}(w,z) - F_{X,Y}(z,z), \end{align*}$$
while for $w < z$,
$$\begin{align*} F_{Z,W}(z,w) &= P\{Z \leq z, W \leq w\} = P\{Z \leq w, W \leq w\}\\ &= P\{X \leq w, Y \leq w\}\\ &= F_{X,Y}(w,w). \end{align*}$$
Consequently, if $X$ and $Y$ are jointly continuous random variables, then
$$f_{Z,W}(z,w) = \frac{\partial^2}{\partial z \partial w}F_{Z,W}(z,w) = \begin{cases} f_{X,Y}(z,w) + f_{X,Y}(w,z), & \text{if}~w > z,\\ 0, & \text{if}~w < z. \end{cases}$$
The conditional density of $W$ given $Z = z$ is
$$f_{W \mid Z}(w \mid z) = \frac{f_{Z,W}(z,w)}{f_Z(z)} = \begin{cases} \dfrac{f_{X,Y}(z,w) + f_{X,Y}(w,z)}{\int_z^{\infty} \left[f_{X,Y}(z,w) + f_{X,Y}(w,z)\right]\ \mathrm dw}, & w > z,\\ 0, & w < z, \end{cases}$$
and so, with $f_{X,Y}(x,y) = e^{-x-y}$ for $x, y \geq 0$,
$$\begin{align*}E[W \mid Z = z] &= \frac{\int_z^\infty w\left[f_{X,Y}(z,w) + f_{X,Y}(w,z)\right]\ \mathrm dw}{ \int_z^\infty \left[f_{X,Y}(z,w) + f_{X,Y}(w,z)\right]\ \mathrm dw}\\ &= \frac{\int_z^\infty \left(w\, e^{-w-z} + w\, e^{-w-z}\right)\ \mathrm dw}{ \int_z^\infty \left(e^{-w-z} + e^{-w-z}\right)\ \mathrm dw}\\ &= \frac{2e^{-z}\int_z^\infty w\, e^{-w}\ \mathrm dw}{ 2e^{-2z}} = \frac{2e^{-z}\left[\left . (-we^{-w})\right\vert_z^{\infty} + \int_z^{\infty}e^{-w}\ \mathrm dw\right]}{2e^{-2z}}\\ &= 1 + z. \end{align*}$$
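The inclusion-exclusion step and the mixed partial derivative can also be verified symbolically; a short SymPy sketch (mine, for checking only):

```python
import sympy as sp

z, w = sp.symbols('z w', positive=True)

# Joint CDF of iid Exp(1) variables: F_{X,Y}(a, b) = (1 - e^{-a})(1 - e^{-b}).
F = lambda a, b: (1 - sp.exp(-a)) * (1 - sp.exp(-b))

# Inclusion-exclusion expression for F_{Z,W}(z, w) on the region w > z.
F_ZW = F(z, w) + F(w, z) - F(z, z)

# The mixed partial should recover the joint density 2*exp(-z-w) for w > z.
print(sp.simplify(sp.diff(F_ZW, z, w)))  # 2*exp(-w - z)
```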

5

You can presumably find $\Pr(X\le a)$ and $\Pr(b \lt X \le a) $ and so $\Pr(b \lt X \le a) \Pr(b \lt Y \le a)$.

Take the derivatives of this with respect to $a$ and then $b$ (changing the sign, as $b$ is a lower limit) and add an indicator such as $I_{b\le a}$, and you have the joint density, which we might call $p(a,b)$, with $a$ playing the role of $\max(X,Y)$ and $b$ the role of $\min(X,Y)$.

You can then work out the conditional density $p(a|b) =\dfrac{p(a,b)}{\int_{a=b}^\infty p(a,b) \; da}$ and the conditional mean $E[a|b] = \int_{a=b}^\infty a\; p(a|b) \; da$, which will be a function of $b$.

To check, remember that the exponential distribution is memoryless so if $X$ and $Y$ have mean $\mu$ then you will have $E[\max(X,Y) | \min(X,Y)] = \min(X,Y) +\mu$.
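That memorylessness check is easy to run numerically. A small sketch (NumPy; the names and the choice $\mu = 2$ are mine, picked to exercise the general claim rather than only the unit-mean case):

```python
import numpy as np

# Check E[max(X,Y) | min(X,Y) = m] ~ m + mu for iid exponentials with mean mu.
rng = np.random.default_rng(1)
mu = 2.0
x = rng.exponential(mu, 10**6)
y = rng.exponential(mu, 10**6)
mn, mx = np.minimum(x, y), np.maximum(x, y)

for m in (0.5, 1.0, 3.0):
    sel = np.abs(mn - m) < 0.02   # condition on min(X,Y) falling near m
    print(m, mx[sel].mean())      # should be close to m + mu
```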

  • @Didier: At the time I thought $[b\le X\lt a]\cap[b\le Y\lt a]$ being equivalent to $[b\le \min(X,Y) \le \max(X,Y) \lt a]$ was obvious. (2012-02-28)
1

Observe that $\max\left(X,Y\right)=X+Y-\min\left(X,Y\right)$ so that:

$\begin{aligned}\mathsf{E}\left[\max\left(X,Y\right)\mid\min\left(X,Y\right)\right] & =\mathsf{E}\left[X+Y-\min\left(X,Y\right)\mid\min\left(X,Y\right)\right]\\ & =2\mathsf{E}\left[X\mid\min\left(X,Y\right)\right]-\min\left(X,Y\right) \end{aligned} \tag1$ Here the second equality is based on symmetry.

For a fixed $m>0$ we find:

$\begin{aligned}\mathsf{E}\left[X\mid\min\left(X,Y\right)=m\right] & =\frac{1}{2}m+\frac{1}{2}\mathsf{E}\left[X\mid X>m\right]\\ & =\frac{1}{2}m+\frac{1}{2}\left(m+\mathsf{E}X\right)\\ & =m+\frac{1}{2}\mathsf EX\\ & =m+\frac{1}{2} \end{aligned} \tag2$

Here the first equality is based on symmetry and the second on the fact that the exponential distribution has no memory.

Then $(2)$ leads us to the conclusion:

$\mathsf{E}\left[X\mid\min\left(X,Y\right)\right]=\min\left(X,Y\right)+\frac{1}{2}$ and substitution in $(1)$ results in:

$\mathsf{E}\left[\max\left(X,Y\right)\mid\min\left(X,Y\right)\right]=\min\left(X,Y\right)+1$
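The intermediate identity $(2)$, i.e. $\mathsf{E}\left[X\mid\min(X,Y)=m\right]=m+\frac12$, can be spot-checked by simulation as well; a minimal sketch (NumPy, my own):

```python
import numpy as np

# Check E[X | min(X,Y) = m] ~ m + 1/2 for X, Y iid Exp(1).
rng = np.random.default_rng(2)
x = rng.exponential(1.0, 10**6)
y = rng.exponential(1.0, 10**6)
mn = np.minimum(x, y)

for m in (0.3, 1.0, 2.0):
    sel = np.abs(mn - m) < 0.02   # condition on min(X,Y) falling near m
    print(m, x[sel].mean())       # should be close to m + 0.5
```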

0

Another way is to observe that $\min(X,Y)$ and $\max(X,Y)-\min(X,Y)$ are independent, by the memoryless property of the exponential distribution. Find the joint density of $\min(X,Y)$ and $\max(X,Y)-\min(X,Y)$, and then apply a transformation of random variables (via the Jacobian).