
This is a problem from a book; I am working on it to prepare for an exam.

Let $X$ and $Y$ be independent random variables with exponential distributions with parameters $u$ and $D$ respectively. Define

$$U=\min\{X,Y\},\quad V=\max\{X,Y\},\quad W=V-U.$$

Prove that $U$ and $W$ are independent.

I am not even able to proceed on this one. The hint given is to eliminate $\max$ and $\min$ by conditioning on which variable is smaller: if $$E_1 = \{X\leq Y\}\quad \text{and}\quad E_2=\{Y\leq X\},$$ then $$P(A) = P(A \cap E_1) + P(A \cap E_2).$$

  • Your formula doesn't make sense. $U$ and $V$ are random variables, not events, so there's no such thing as $P(A \cap U)$ or $P(A \cap V)$. I suspect what is meant is $P(A) = P(A \cap (U=X)) + P(A \cap (V=X))$. (2012-12-09)
  • Robert - corrected the hint. (2012-12-09)
  • Also the exponential distribution has only one parameter, not two. (2012-12-10)
  • There are two random variables, each with its own parameter. (2012-12-10)

2 Answers


Suppose $X$ and $Y$ have different parameters, say rate parameters $\alpha$ and $\beta$ respectively (so expected values $1/\alpha$ and $1/\beta$). Consider any bounded Borel functions $g(U)$ and $h(W)$. When $X \le Y$ we have $U = X$ and $W = Y - X$; otherwise $U = Y$ and $W = X - Y$. Hence
$$\eqalign{E[g(U) h(W)] &= \int_0^\infty \int_0^\infty \alpha \beta e^{-\alpha x} e^{-\beta y} g(\min(x,y)) h(\max(x,y) - \min(x,y))\; dx\; dy\cr &= \int_0^\infty \int_0^x \alpha \beta e^{-\alpha x-\beta y} g(y) h(x-y)\; dy\; dx + \int_0^\infty \int_0^y \alpha \beta e^{-\alpha x-\beta y} g(x) h(y-x)\; dx\; dy\cr}$$

Interchange the variable names $x$ and $y$ in the first integral:
$$E[g(U) h(W)] = \int_0^\infty \int_0^y \alpha \beta \left(e^{-\beta x - \alpha y} + e^{-\alpha x - \beta y}\right) g(x) h(y-x)\; dx \;dy$$

Then with $z = y - x$ we have $dx\; dy = dz\; dx$, with $0 \le x < \infty$ and $0 \le z < \infty$ corresponding to $0 \le x \le y < \infty$:
$$\eqalign{E[g(U) h(W)] &= \int_0^\infty \int_0^\infty \alpha \beta \left( e^{-(\alpha + \beta) x - \beta z} + e^{-(\alpha + \beta) x - \alpha z}\right) g(x) h(z)\; dx\; dz\cr &= \alpha \beta \left(\int_0^\infty e^{-(\alpha+\beta) x} g(x)\; dx\right)\left(\int_0^\infty \left(e^{-\alpha z} + e^{-\beta z}\right) h(z)\; dz\right)\cr}$$

Similarly
$$\eqalign{E[g(U)] &= \alpha \beta \left(\int_0^\infty e^{-(\alpha+\beta) x} g(x)\; dx\right)\left(\int_0^\infty \left(e^{-\alpha z} + e^{-\beta z}\right) \; dz\right)\cr E[h(W)] &= \alpha \beta \left(\int_0^\infty e^{-(\alpha+\beta) x} \; dx\right)\left(\int_0^\infty \left(e^{-\alpha z} + e^{-\beta z}\right) h(z)\; dz\right)\cr 1 = E[1] &= \alpha \beta \left(\int_0^\infty e^{-(\alpha+\beta) x} \; dx\right)\left(\int_0^\infty \left(e^{-\alpha z} + e^{-\beta z}\right) \; dz\right)\cr}$$

so that $E[g(U) h(W)] = E[g(U)] E[h(W)]$. Since this holds for all bounded Borel functions $g$ and $h$, $U$ and $W$ are independent.
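Not part of the proof, but a quick Monte Carlo sanity check of the factorization $E[g(U)h(W)] = E[g(U)]\,E[h(W)]$ is easy to run. The rates $\alpha = 2$, $\beta = 3$ and the two indicator test functions below are arbitrary illustrative choices:

```python
import random

random.seed(0)
N = 200_000
alpha, beta = 2.0, 3.0   # hypothetical rate parameters; any positive values work

# Estimate E[g(U)h(W)], E[g(U)] and E[h(W)] with indicator test functions.
sum_g = sum_h = sum_gh = 0.0
for _ in range(N):
    x = random.expovariate(alpha)
    y = random.expovariate(beta)
    u = min(x, y)                  # U = min(X, Y)
    w = abs(x - y)                 # W = max(X, Y) - min(X, Y)
    g = 1.0 if u <= 0.1 else 0.0   # g(U) = 1{U <= 0.1}
    h = 1.0 if w <= 0.5 else 0.0   # h(W) = 1{W <= 0.5}
    sum_g += g
    sum_h += h
    sum_gh += g * h

Eg, Eh, Egh = sum_g / N, sum_h / N, sum_gh / N
print(Egh, Eg * Eh)   # the two values should agree up to Monte Carlo error
```

With $N = 200{,}000$ samples the two printed values typically agree to two or three decimal places, consistent with independence.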


As usual, the functional approach is straightforward: for every bounded measurable function $\varphi$, $$ \mathbb E(\varphi(U,W))=\mathbb E(\varphi(X,Y-X)\mathbf 1_{Y\geqslant X})+\mathbb E(\varphi(Y,X-Y)\mathbf 1_{X\geqslant Y}). $$ Since $X$ and $Y$ are i.i.d., both terms in the RHS coincide, hence $$ \mathbb E(\varphi(U,W))=2\int_0^{+\infty}\!\!\!\int_x^{+\infty}\varphi(x,y-x)\mathrm e^{-y}\mathrm dy\,\mathrm e^{-x}\mathrm dx. $$ The change of variable $(z,t)=(x,y-x)$ has Jacobian $1$ and yields $$ \mathbb E(\varphi(U,W))=2\int_0^{+\infty}\!\!\!\int_0^{+\infty}\varphi(z,t)\mathrm e^{-2z-t}\mathrm dz\,\mathrm dt=\iint\varphi(z,t)f_U(z)\mathrm dz\,f_W(t)\mathrm dt, $$ for some density functions $f_U$ and $f_W$ which I will let you discover. Since this identity holds for every $\varphi$, it proves that $U$ and $W$ are independent with densities $f_U$ and $f_W$ respectively.
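A simulation sketch of the $u = D = 1$ case treated above (zero covariance between $U$ and $W$ is only a necessary condition for independence, so this is a sanity check, not a proof, and it does not give away $f_U$ and $f_W$):

```python
import random

random.seed(1)
N = 200_000

# Simulate i.i.d. Exp(1) pairs and estimate Cov(U, W);
# independence predicts a covariance of zero.
su = sw = suw = 0.0
for _ in range(N):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    u = min(x, y)      # U = min(X, Y)
    w = abs(x - y)     # W = max(X, Y) - min(X, Y)
    su += u
    sw += w
    suw += u * w

mean_u, mean_w = su / N, sw / N
cov = suw / N - mean_u * mean_w
print(mean_u, mean_w, cov)   # covariance should be near 0
```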

  • Dude, you are amazing... thanks! (2012-12-09)
  • I got $$f_U(z) = D\exp(-Dx)\,(1-\exp(-ux)),$$ but how do I get the density function for $W$? (2012-12-10)
  • Please straighten your notation a bit: $z$ is not $x$ and there is no $D$ and no $u$, so no, this is not (and cannot be) $f_U(z)$. Furthermore, since the method in the post yields $f_U(z)$ and $f_W(t)$ **at the same time**, I do not even know what you did to reach the formula in your comment. (2012-12-10)
  • OK. I calculated $F(z) = P(X \le Y) = P(X \le z \mid Y = z) = \int F_X(x)\,f_Y(x)\,dx$. By differentiating I got the above result. I agree it should be $z$ instead of $x$ above. Can you tell me where I am wrong, and how to calculate these terms? (2012-12-10)
  • See my answer for the more general case where $X$ and $Y$ have different parameter values. (2012-12-10)
  • Sorry, I missed the mention of parameters $u$ and $D$ in your question. My answer solves the case $u=D=1$ (and is easily adaptable to the general case). (2012-12-10)