Conditional Expectation of an Exponential and a Uniform Random Variable

Let's say $X$ is an exponential random variable with parameter $\theta$ and $Y$ is a uniform random variable over $[0, T]$. How does one calculate $E[X-Y \mid X>Y]$?
2 Answers
Let $X$ denote an exponential random variable and $Y$ any nonnegative random variable independent of $X$. Then $\mathbb E(X-Y\mid X\gt Y)=\mathbb E(X)$, irrespective of the distribution of $Y$.
To show this, call $\theta$ the parameter of the exponential distribution of $X$ and note that $\mathbb E(X-Y\mid X\gt Y)=\frac{N}{D}$ with $N=\mathbb E(X-Y;X\gt Y)$ and $D=\mathbb P(X\gt Y)$. The independence of $X$ and $Y$ yields
$$D=\int_0^{+\infty}\int_y^{+\infty}\mathrm d\mathbb P_X(x)\,\mathrm d\mathbb P_Y(y)=\int_0^{+\infty}\mathbb P(X\gt y)\,\mathrm d\mathbb P_Y(y)=\int_0^{+\infty}\mathrm e^{-\theta y}\,\mathrm d\mathbb P_Y(y),$$
hence
$$D=\mathbb P(X\gt Y)=\mathbb E(\mathrm e^{-\theta Y}).$$
Likewise,
$$N=\int_0^{+\infty}\int_y^{+\infty}(x-y)\,\mathrm d\mathbb P_X(x)\,\mathrm d\mathbb P_Y(y)=\int_0^{+\infty}\mathbb E(X-y;X\gt y)\,\mathrm d\mathbb P_Y(y).$$
For every fixed $y\geqslant0$,
$$\mathbb E(X-y;X\gt y)=\int_y^{+\infty}(x-y)\,\mathrm d\mathbb P_X(x)=\int_y^{+\infty}\int_y^x\mathrm dz\,\mathrm d\mathbb P_X(x),$$
hence, interchanging the order of integration by Fubini's theorem,
$$\mathbb E(X-y;X\gt y)=\int_y^{+\infty}\int_z^{+\infty}\mathrm d\mathbb P_X(x)\,\mathrm dz=\int_y^{+\infty}\mathbb P(X\geqslant z)\,\mathrm dz,$$
that is,
$$\mathbb E(X-y;X\gt y)=\int_y^{+\infty}\mathrm e^{-\theta z}\,\mathrm dz=\theta^{-1}\mathrm e^{-\theta y}.$$
Hence
$$N=\mathbb E(X-Y;X\gt Y)=\theta^{-1}\mathbb E(\mathrm e^{-\theta Y}).$$
Finally,
$$\mathbb E(X-Y\mid X\gt Y)=\frac{\theta^{-1}\mathbb E(\mathrm e^{-\theta Y})}{\mathbb E(\mathrm e^{-\theta Y})}=\theta^{-1}=\mathbb E(X).$$
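As a quick numerical sanity check (not part of the proof), here is a Monte Carlo sketch in Python with illustrative values $\theta = 2$ and $T = 5$; the conditional mean of $X-Y$ on the event $\{X\gt Y\}$ should come out close to $1/\theta = 0.5$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta, T = 2.0, 5.0  # illustrative values: rate of X, endpoint of Y's support
n = 10_000_000

x = rng.exponential(scale=1.0 / theta, size=n)  # numpy parametrizes by the mean 1/theta
y = rng.uniform(0.0, T, size=n)

mask = x > y
print("E[X - Y | X > Y] ~", (x[mask] - y[mask]).mean())  # should be near 1/theta = 0.5
print("E[X]             =", 1.0 / theta)
```

Changing $T$, or swapping in any other nonnegative distribution for $Y$, leaves the estimate near $1/\theta$, in line with the claim above.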
Thanks! One would expect the conditional expectation to be a function of $y$, but I guess the memoryless property of the exponential distribution eliminates that dependence? – 2012-10-26
Hint: I presume $X$ and $Y$ are supposed to be independent. Use the "lack of memory" property of the exponential distribution.
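A small simulation makes the lack-of-memory property concrete; this is only an illustrative check with assumed values $\theta = 2$, $s = 1$, $t = 0.7$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, s, t = 2.0, 1.0, 0.7  # illustrative rate and thresholds
x = rng.exponential(scale=1.0 / theta, size=10_000_000)  # numpy uses the mean 1/theta

# Lack of memory: P(X > s + t | X > s) = P(X > t) = exp(-theta * t).
cond = (x > s + t).sum() / (x > s).sum()
print(cond, np.exp(-theta * t))  # the two numbers should agree to ~3 decimals
```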
It's slightly subtle that the property works for a random variable $Y$ independent of $X$ as well as for a constant. You can prove that by approximating $Y$ by discrete random variables. – 2012-10-24
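To spell out the discrete step this comment suggests: if $Y$ takes values $y_1,\dots,y_n$ with probabilities $p_1,\dots,p_n$, then memorylessness gives $\mathbb E(X-y_k;X\gt y_k)=\mathbb E(X)\,\mathbb P(X\gt y_k)$ for each atom, so
$$\mathbb E(X-Y;X\gt Y)=\sum_k p_k\,\mathbb E(X-y_k;X\gt y_k)=\mathbb E(X)\sum_k p_k\,\mathbb P(X\gt y_k)=\mathbb E(X)\,\mathbb P(X\gt Y),$$
and dividing by $\mathbb P(X\gt Y)$ yields $\mathbb E(X-Y\mid X\gt Y)=\mathbb E(X)$. A general nonnegative $Y$ then follows by a limiting argument, as the comment indicates.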