
I recently learned about finding density functions of functions of random variables using the cumulative distribution function. For example, computing the density function for the difference of two uniform random variables on $[0,1]$ (as in Grinstead & Snell's free online book). I gave myself the following problem.

Say $X$ and $Y$ are independent random variables with exponential density $f(t) = a e^{-at}$. I'm trying to compute the density function for $Z = X-Y$, but I'm getting nonsense. Since $z<0$ is allowed and my density behaves like $e^{-az}$ there, when I try to check that $\int_{-\infty}^\infty f_{X-Y}(z)\, dz = 1$, the integral diverges! What is going wrong?

EDIT: The calculations are as follows. If $z > 0$, then $$\begin{split} P(Z < z) &= \int_0^\infty dy \int_0^{y+z} dx\, a^2 e^{-a(x+y)} \\ &= 1 - \frac 1 2 e^{-a z}. \end{split}$$ I found this by thinking about the $(x,y)$ points satisfying $Y > X - z$, giving the region $y > 0$, $y+z > x > 0$. If $z < 0$, $$\begin{split} P(Z < z) &= \int_z^\infty dy \int_0^{y+z} dx\, a^2 e^{-a(x+y)} \\ &= -\frac 1 2 e^{-3 a z } + e^{-a z}.\end{split}$$
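For reference, here is a quick Monte Carlo sanity check in plain Python (the function names are mine). It compares empirical values of $P(Z < z)$ against the closed forms $1 - \tfrac12 e^{-az}$ for $z \ge 0$ and $\tfrac12 e^{az}$ for $z < 0$ (the latter comes from using $-z$, not $z$, as the lower limit; see the comment below):

```python
import math
import random

def cdf_empirical(a, z, n=100_000, seed=0):
    """Monte Carlo estimate of P(X - Y < z) for independent Exp(a) X, Y."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(a) - rng.expovariate(a) < z for _ in range(n))
    return hits / n

def cdf_exact(a, z):
    """Closed-form CDF of Z = X - Y: 1 - e^{-a z}/2 for z >= 0, e^{a z}/2 for z < 0."""
    return 1 - 0.5 * math.exp(-a * z) if z >= 0 else 0.5 * math.exp(a * z)

a = 2.0
for z in (-1.0, -0.3, 0.0, 0.5, 1.5):
    print(f"z = {z:+.1f}: MC = {cdf_empirical(a, z):.4f}, exact = {cdf_exact(a, z):.4f}")
```

With $10^5$ samples the two columns agree to within a percent or so at every $z$.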

  • Ahh!! I'm foolish. My integral for $z < 0$ should be $\displaystyle\int_{-z}^\infty$. Thanks Nate. (2010-12-06)

3 Answers


It is a Laplace distribution.

http://en.wikipedia.org/wiki/Laplace_distribution#Relation_to_the_exponential_distribution


In general, what you are interested in here is the cross-correlation.

Let's consider your problem in the setting of cross-correlation. It suffices to find the distribution function of $X-Y$ for $t>0$, since $X-Y$ is a symmetric random variable and hence has a symmetric density function. By the law of total probability (conditioning on $Y$), we have
$$ {\rm P}(X - Y \le t) = \int_0^\infty {{\rm P}(X - Y \le t\mid Y = \tau )f(\tau )\,{\rm d}\tau } = \int_0^\infty {{\rm P}(X \le t + \tau )f(\tau )\,{\rm d}\tau }. $$
Of course, you can now calculate the right-hand side integral directly and then differentiate with respect to $t$, but it is instructive to differentiate first. Thus, we get
$$ f_{X - Y} (t) = \frac{{\rm d}}{{{\rm d}t}}\int_0^\infty {F(t + \tau )f(\tau )\,{\rm d}\tau } = \int_0^\infty {f(t + \tau )f(\tau )\,{\rm d}\tau } , $$
where $F$ denotes the distribution function of $X$ (and $Y$). In the notation of the Wikipedia link given above, this reads $f_{X-Y}(t) = (f \star f)(t)$: "$f_{X-Y}$ is the cross-correlation of $f$ with itself". Finally, from $f(u)= a {\rm e}^{-a u}$, $u > 0$, we find $f_{X-Y}(t) = (a/2){\rm e}^{-at}$ for $t > 0$; then, by the aforementioned symmetry, $f_{X-Y}(t) = (a/2){\rm e}^{ - a|t|}$ for all $t \in \mathbb{R}$.
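If you want to verify the cross-correlation formula numerically, here is a small sketch in plain Python (a simple midpoint Riemann sum; the function names are mine) comparing $\int_0^\infty f(t+\tau)f(\tau)\,{\rm d}\tau$ with $(a/2){\rm e}^{-a|t|}$:

```python
import math

def f(a, u):
    """Exponential density: a * e^{-a u} for u > 0, else 0."""
    return a * math.exp(-a * u) if u > 0 else 0.0

def cross_corr(a, t, upper=40.0, n=100_000):
    """Midpoint Riemann sum for (f * f)(t) = integral over tau > 0 of f(t + tau) f(tau)."""
    h = upper / n
    return h * sum(f(a, t + (k + 0.5) * h) * f(a, (k + 0.5) * h) for k in range(n))

a = 1.5
for t in (-1.0, 0.2, 1.0, 2.5):
    print(f"t = {t:+.1f}: numeric = {cross_corr(a, t):.6f}, "
          f"(a/2)e^(-a|t|) = {0.5 * a * math.exp(-a * abs(t)):.6f}")
```

Note that the same integral handles $t < 0$ automatically, since $f(t+\tau)$ vanishes for $\tau < -t$, and the symmetry $f_{X-Y}(-t) = f_{X-Y}(t)$ shows up in the output.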


You could also use the transformation theorem for linear transformations, e.g.

$V = X - Y$, $U = X$.

Then find the joint law (here, the joint density) and derive the marginal through

$ f_{V}(v)=\displaystyle\int_{-\infty}^{\infty} f_{U,V}(u,v)\, du. $
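A numerical sketch of this marginalization, taking $V = X - Y$ (the difference asked about) and $U = X$, so the Jacobian is $1$ and $f_{U,V}(u,v) = a^2 e^{-a(2u - v)}$ on $u > 0$, $u > v$ (function names are mine):

```python
import math

def joint(a, u, v):
    """Joint density of (U, V) = (X, X - Y): a^2 e^{-a(2u - v)} on u > 0, u > v."""
    return a * a * math.exp(-a * (2 * u - v)) if (u > 0 and u > v) else 0.0

def marginal_v(a, v, upper=30.0, n=100_000):
    """f_V(v) = integral of f_{U,V}(u, v) du, via a midpoint Riemann sum."""
    lo = max(0.0, v)  # the integrand vanishes for u below max(0, v)
    h = (upper - lo) / n
    return h * sum(joint(a, lo + (k + 0.5) * h, v) for k in range(n))

a = 1.0
for v in (-2.0, -0.5, 0.5, 2.0):
    print(f"v = {v:+.1f}: numeric = {marginal_v(a, v):.6f}, "
          f"(a/2)e^(-a|v|) = {0.5 * a * math.exp(-a * abs(v)):.6f}")
```

The numeric marginal matches the Laplace density $(a/2)e^{-a|v|}$ from the other answers.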

  • Yeah... just noticed. (2015-10-27)