
I'm finding it difficult to understand the explanation for my problem's solution, so I would like to post it here in the hope that one of you can enlighten me.

Just to clarify: I'm studying for a probability test and I'm stuck on this exercise (and I'm pretty bad with functions of random variables), so the homework tag is just to make it clear that I'm looking for an explanation for dummies :/

The problem is: given two i.i.d. exponential random variables $X$ and $Y$ with rate $\lambda = 4$, find the p.d.f. of $Z = \frac{X}{X+Y}$.

Thanks in advance.

  • 0
    First you know the joint density is $f_{XY}(x,y) = 16\,e^{-4(x+y)}$ for $x, y \ge 0$; this is so because the RVs are independent. Let $z > 0$. Now integrate this over the region $\{(x,y) \mid x > z(x + y)\}$ to get $1 - F_Z(z)$. You can recover the density of $Z$ via differentiation. (2012-07-03)
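
A small symbolic sketch of this approach (assuming SymPy is available; the substitution $c = z/(1-z)$ is only there to keep the integration limits simple): integrate the joint density over $\{x > z(x+y)\}$, which for $0 < z < 1$ is the same region as $\{x > c\,y\}$, then differentiate.

```python
import sympy as sp

x, y, c, z = sp.symbols('x y c z', positive=True)  # c plays the role of z/(1-z)
lam = 4
joint = lam**2 * sp.exp(-lam*(x + y))  # joint density 16*exp(-4(x+y)) on x, y >= 0

# Survival function P(X > c*Y): integrate x from c*y to infinity, then y over (0, oo)
survival_c = sp.integrate(joint, (x, c*y, sp.oo), (y, 0, sp.oo))   # 1/(1 + c)
survival_z = sp.simplify(survival_c.subs(c, z/(1 - z)))            # 1 - z
pdf = sp.simplify(sp.diff(1 - survival_z, z))                      # 1

print(survival_z, pdf)  # expected: 1 - z and 1, so Z is uniform on (0, 1)
```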

2 Answers

1

One first finds the survival function $S_Z(z) = \mathbb{P}\left( \frac{X}{X+Y} > z \right) = 1 - F_Z(z)$. Clearly $S_Z(z) = 0$ for all $z \geqslant 1$ and $S_Z(z) = 1$ for $z \leqslant 0$, hence assume $0 < z < 1$. Then:
$$
\begin{aligned}
S_Z(z) &= \mathbb{P}\left( X > z X + z Y \right) = \mathbb{P} \left( (1-z) X > z Y \right) = \mathbb{P} \left( X > \frac{z}{1-z} Y \right) \\
&= \mathbb{E}\left( \mathbb{P} \left( X > \frac{z}{1-z} Y \,\middle|\, Y \right) \right) = \mathbb{E}\left( \exp\left(- \lambda \frac{z}{1-z} Y\right) \right) \\
&= \int_0^\infty \lambda \exp\left(- \lambda \frac{z}{1-z} y\right) \exp(-\lambda y) \,\mathrm{d} y = \frac{1}{1+ \frac{z}{1-z}} = 1 - z.
\end{aligned}
$$
Hence $F_Z(z) = z$ on $(0,1)$, which proves that $Z$ is uniform on $(0,1)$.
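
A minimal Monte Carlo sketch (assuming NumPy is available) that checks this conclusion numerically: the simulated values of $Z$ should fill each decile of $(0,1)$ roughly equally and have mean $\approx 1/2$ and variance $\approx 1/12$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 4.0, 1_000_000

# X, Y independent Exp(lambda = 4); NumPy parameterizes by scale = 1/lambda
x = rng.exponential(scale=1/lam, size=n)
y = rng.exponential(scale=1/lam, size=n)
z = x / (x + y)

counts, _ = np.histogram(z, bins=10, range=(0.0, 1.0))
print(counts / n)          # each decile should hold roughly 0.1 of the samples
print(z.mean(), z.var())   # roughly 0.5 and 1/12 ≈ 0.0833
```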

  • 0
    The step from the first to the second line, and the first step after that, are not clear... (2012-07-04)
  • 0
    @Giuliano This is [the law of total expectation](http://en.wikipedia.org/wiki/Law_of_total_expectation). It says that you can treat $Y$ as constant while computing the expectation with respect to $X$, due to independence of $X$ and $Y$. (2012-07-04)
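
A quick numerical illustration of that conditioning step (a sketch; the value of $c = z/(1-z)$ below is an arbitrary choice): averaging the conditional probability $\exp(-\lambda c Y)$ over draws of $Y$ agrees with the direct estimate of $\mathbb{P}(X > cY)$ and with $\frac{1}{1+c}$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, c, n = 4.0, 0.7, 1_000_000   # c = z/(1-z); c = 0.7 corresponds to z ≈ 0.41

x = rng.exponential(scale=1/lam, size=n)
y = rng.exponential(scale=1/lam, size=n)

direct = np.mean(x > c*y)                # P(X > c*Y) estimated directly
conditioned = np.mean(np.exp(-lam*c*y))  # E[ P(X > c*Y | Y) ] = E[ exp(-lam*c*Y) ]
print(direct, conditioned, 1/(1 + c))    # all three should be close
```
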
2

Here is a systematic way.

  1. By definition, for every measurable bounded function $u$, $$ \mathrm E(u(X,Y))=\iint16\mathrm e^{-4x-4y}u(x,y)\,[x\geqslant0,y\geqslant0]\,\mathrm dx\mathrm dy. $$
  2. In particular, $$ \mathrm E(u(Z))=\iint16\mathrm e^{-4(x+y)}u\left(\frac{x}{x+y}\right)\,[x\geqslant0,y\geqslant0]\,\mathrm dx\mathrm dy. $$
  3. Consider the change of variable $x=zs$, $y=(1-z)s$. Thus, $\mathrm dx\mathrm dy=s\mathrm ds\mathrm dz$ with $s\geqslant0$ and $0\leqslant z\leqslant1$, and $$ \mathrm E(u(Z))=\iint16\mathrm e^{-4s}u(z)\,[s\geqslant0,0\leqslant z\leqslant1]\,s\mathrm ds\mathrm dz=\int u(z)f(z)\,\mathrm dz, $$ with $$ f(z)=\int16\mathrm e^{-4s}\,[s\geqslant0,0\leqslant z\leqslant1]\,s\mathrm ds=[0\leqslant z\leqslant1]\,\int_0^{+\infty}16\mathrm e^{-4s}\,s\mathrm ds=[0\leqslant z\leqslant1]. $$
  4. This holds for every bounded measurable function $u$ hence $Z$ is uniform on $(0,1)$.
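
A short symbolic sketch of steps 3 and 4 (assuming SymPy is available): compute the Jacobian determinant of $(z,s)\mapsto(zs,(1-z)s)$ and the integral that defines $f(z)$.

```python
import sympy as sp

s, z = sp.symbols('s z', positive=True)
x, y = z*s, (1 - z)*s                # the change of variables from step 3

# Jacobian determinant of (x, y) with respect to (z, s); its absolute value is s
J = sp.Matrix([x, y]).jacobian([z, s])
print(sp.simplify(J.det()))          # s (the sign depends on the ordering of the variables)

# Density of Z on 0 <= z <= 1: integrate out s
f = sp.integrate(16*sp.exp(-4*s)*s, (s, 0, sp.oo))
print(f)                             # 1, so Z is uniform on (0, 1)
```
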
  • 0
    My Jacobian insists on telling me that the change of variables is missing a minus sign, or that the differentials are in the wrong order; am I right? (2012-07-04)
  • 0
    Surely you are aware that the correct factor in a change of variables is *the absolute value* of the determinant of the Jacobian matrix. (2012-07-04)
  • 0
    +1 I wonder why the $f$ found this way is the density of $Z$, given that $u$ is an arbitrary bounded measurable function? (2012-07-05)
  • 1
    I voted for Sasha's answer because I found it closer to what I am expected to do, given the previous tests. I can't upvote yours, though. Thanks anyway :) (2012-07-05)