
I am having trouble understanding the solution to the following task:

Let $X,Y$ be two independent exponential r.v.s with parameters $\lambda,\mu>0$ and let $T=\min(X,Y)$. Compute $\mathbb{E}(T\mid X)$.


Solution: For $f:\mathbb{R}\to \mathbb{R}$ measurable and bounded we have $$\mathbb{E}(Tf(X))=\mathbb{E}(Xf(X)\mathbb{1}_{\{Y>X\}}+Yf(X)\mathbb{1}_{\{Y\leq X\}})$$ And because of independence: $$\stackrel{(*)}{=}\int\limits_0^\infty(xP(Y>x)+\mathbb{E}(Y\mathbb{1}_{\{Y\leq x\}}))f(x)P_X(\mathrm{d}x)$$ $$=\int\limits_0^\infty\left(xe^{-\mu x}+\frac{1}{\mu}(1-e^{-\mu x}(1+\mu x))\right)f(x)P_X(\mathrm{d}x)=\mathbb{E}(\frac{1}{\mu}(1-e^{-\mu X})f(X))$$ hence $$\mathbb{E}(T\mid X)=\frac{1-e^{-\mu X}}{\mu}\quad a.s.$$
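(Side note, for reference: the middle expectation in the second display is a standard integration by parts against the density $\mu e^{-\mu y}$:
$$\mathbb{E}(Y\mathbb{1}_{\{Y\leq x\}})=\int_0^x y\,\mu e^{-\mu y}\,\mathrm{d}y=\Big[-ye^{-\mu y}\Big]_0^x+\int_0^x e^{-\mu y}\,\mathrm{d}y=-xe^{-\mu x}+\frac{1-e^{-\mu x}}{\mu}=\frac{1}{\mu}\big(1-e^{-\mu x}(1+\mu x)\big)$$
and then $xe^{-\mu x}+\mathbb{E}(Y\mathbb{1}_{\{Y\leq x\}})=\frac{1-e^{-\mu x}}{\mu}$, which gives the integrand in the last display.)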

I have two small questions:

  1. For me, the calculation steps to get $(*)$ look like this: $$\mathbb{E}(Xf(X)\mathbb{1}_{\{Y>X\}}+Yf(X)\mathbb{1}_{\{Y\leq X\}})=\mathbb{E}(\mathbb{1}_{\{Y>X\}})\mathbb{E}(Xf(X))+\mathbb{E}(Y\mathbb{1}_{\{Y\leq X\}})\mathbb{E}(f(X))$$ $$=P(Y>X)\mathbb{E}(Xf(X))+\mathbb{E}\left(\mathbb{E}(Y\mathbb{1}_{\{Y\leq X\}})f(X)\right)$$ $$=\mathbb{E}\left((P(Y>X)X+\mathbb{E}(Y\mathbb{1}_{\{Y\leq X\}}))f(X)\right)$$

Is this correct?

  2. If the above calculations are correct, then why is $$\mathbb{E}(Xf(X)\mathbb{1}_{\{Y>X\}})=\mathbb{E}(\mathbb{1}_{\{Y>X\}})\mathbb{E}(Xf(X))$$ allowed? Wouldn't that imply that $\mathbb{1}_{\{Y>X\}}$ and $X$ are independent?
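(To make question 2 concrete, here is a small simulation sketch I tried, comparing the two sides; the rates $\lambda=1.5$, $\mu=0.7$ are arbitrary choices, not part of the problem.)

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu = 1.5, 0.7   # arbitrary example rates, not from the problem
n = 2_000_000

# numpy's exponential takes the scale parameter 1/rate
X = rng.exponential(1 / lam, n)
Y = rng.exponential(1 / mu, n)
f = np.cos(X)        # some bounded measurable f

lhs = np.mean(X * f * (Y > X))                 # E(X f(X) 1_{Y>X})
factored = np.mean(Y > X) * np.mean(X * f)     # P(Y>X) E(X f(X))
unfactored = np.mean(X * f * np.exp(-mu * X))  # E(X f(X) P(Y>x)|_{x=X})

print(lhs, factored, unfactored)
# lhs matches `unfactored` up to Monte Carlo error, not `factored`
```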

Thank you very much!

  • I think $f$ has to do with a non-rigorous way of dealing with conditional expectation. Idk. My experience with rigorous conditional expectation is that $E[T\mid X]$ is any random variable $Z$ s.t. $Z$ is $\sigma(X)$-measurable and $E[T\mathbb{1}_A] = E[Z\mathbb{1}_A]$ for $A \in \sigma(X)$, so my approach would be to show that $E[\min(X,Y)\mathbb{1}_A] = E[\frac{1-e^{-\mu X}}{\mu}\mathbb{1}_A]$ for all $A \in \sigma(X)$. (2017-01-07)
  • Matriz: 1. I think the $f$ is somehow related to the $\mathbb{1}_A$. What's your definition of conditional expectation with respect to a random variable, exactly? 2. I notice you have "measurable" there. Is this Borel-measurable? 3. As for 1 and 2, I'm not sure that that's allowed. (2017-01-07)
  • That's Stochastic II? In what context? My Stochastic II class had Feynman-Kac, Itô isometry, martingale representation, etc. (2017-01-07)
  • @BCLC So did you mean by "rigorous" that they used a general (not specific) $\sigma(X)$-measurable function $f$, so that they can use $\mathbb{1}_A$, which is $\sigma(X)$-measurable, to compute the expectation? (That is what I meant with my question.) EDIT: the context is martingales & Markov chains. (2017-01-07)
  • *Of course* the approach is perfectly rigorous: showing that $E(T\mid X)=g(X)$ is equivalent to showing that $E(Tf(X))=E(g(X)f(X))$ for every, say, bounded measurable $f$. // No, the identity $$E(Xf(X)\mathbb{1}_{\{Y>X\}})=E(\mathbb{1}_{\{Y>X\}})E(Xf(X))$$ is not correct at all, since $Xf(X)$ and $\mathbb{1}_{\{Y>X\}}$ are not independent. And this is not what the proof uses... Note that $$E(Xf(X)\mathbb{1}_{\{Y>X\}})=E(E(Xf(X)\mathbb{1}_{\{Y>X\}}\mid X))=E(Xf(X)(1-F_Y(X)))$$ and compare to what is actually written in the proof. (2017-01-07)
  • @Did Thank you for the answer. What does the notation $F_Y(X)$ mean? The distribution function of $Y$ evaluated at $X$? (2017-01-07)
  • The notation $F_Y$ stands for the CDF of $Y$ (this is a nearly universal convention, isn't it?), that is, $F_Y(y)=P(Y\leqslant y)$ for every real $y$. (2017-01-07)
  • @Did I see. Thanks a lot for your answer! (2017-01-07)
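For completeness, the characterization discussed in the comments ($E(Tf(X))=E(g(X)f(X))$ for all bounded measurable $f$, with $g(x)=\frac{1-e^{-\mu x}}{\mu}$) can also be sanity-checked by simulation. This is only a Monte Carlo sketch with arbitrarily chosen rates, not a proof:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu = 1.5, 0.7   # arbitrary example rates
n = 2_000_000

X = rng.exponential(1 / lam, n)  # numpy uses scale = 1/rate
Y = rng.exponential(1 / mu, n)
T = np.minimum(X, Y)

# Candidate for E(T | X): g(X) with g(x) = (1 - exp(-mu*x)) / mu
g = (1 - np.exp(-mu * X)) / mu

# Check E(T f(X)) = E(g(X) f(X)) for an indicator f = 1_{X <= 1}
# (i.e. a 1_A with A in sigma(X)) and for f = cos
for f in ((X <= 1.0).astype(float), np.cos(X)):
    print(np.mean(T * f), np.mean(g * f))  # agree up to Monte Carlo error
```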

0 Answers