
I tried finding the following density function:

Let $X$ and $Y$ be independent random variables, each having the exponential distribution with parameter $\lambda$. Find the joint density function of $X$ and $Z=X+Y$.

$\begin{aligned}f_{X,X+Y}=&\frac{\partial^2}{\partial x\,\partial z}\mathbb P (X\leq x,Y\leq z-x)\\=&\frac{\partial^2}{\partial x\,\partial z}\int_{-\infty}^x\int_{-\infty}^{z-x}f(u,v)\,\mathrm dv\,\mathrm du=\frac{\partial}{\partial z}\left[\frac{\partial}{\partial x}\int_0^x\int_0^{z-x}\lambda^2e^{-\lambda u-\lambda v}\,\mathrm dv\,\mathrm d u\right]\\ =&\frac{\partial}{\partial z}\int_0^{z-x}\lambda^2e^{-\lambda x-\lambda v}\,\mathrm dv=\frac{\partial}{\partial z}\int_x^{z}\lambda^2e^{-\lambda x-\lambda(v-x)}\,\mathrm dv\\ =&\lambda^2e^{-\lambda x-\lambda(z-x)}=\lambda^2e^{-\lambda z} \end{aligned}$

But I don't think this is correct, because $\mathbb P(X\leq x,X+Y\leq z)$ isn't equal to $\mathbb P(X\leq x,Y\leq z-x)$. But I can't think of anything else.

Could someone help me a little bit with this exercise?

I'm going to try the following:

$\mathbb P(X\leq x,Z\leq z)=\int_{-\infty}^x\int_{-\infty}^zf_X(u)f_Z(v)\,\mathrm dv\,\mathrm du$,

where I use the convolution formula. If I made no mistake, this gives

$f_{X+Y}(z)=z\lambda^2e^{-\lambda z}$, if $z>0$.
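The convolution result can be double-checked numerically. Below is a small sketch of my own (not part of the original question), using NumPy and SciPy with an arbitrary rate: the sum of two independent Exponential($\lambda$) variables should follow a Gamma distribution with shape $2$ and scale $1/\lambda$, whose density is exactly $\lambda^2 z e^{-\lambda z}$.

```python
import numpy as np
from scipy import stats

lam = 1.5                       # arbitrary rate parameter
rng = np.random.default_rng(0)
n = 100_000

# Sample X + Y with X, Y i.i.d. Exponential(lam)
# (NumPy parametrizes by scale = 1/rate).
z = rng.exponential(1/lam, n) + rng.exponential(1/lam, n)

# Gamma(shape=2, scale=1/lam) has density lam^2 * z * exp(-lam*z);
# a Kolmogorov-Smirnov test compares the sample to its CDF.
stat, pvalue = stats.kstest(z, stats.gamma(a=2, scale=1/lam).cdf)
print(stat)  # a small statistic indicates a good fit
```

With $10^5$ samples the KS statistic should be on the order of a few thousandths if the convolution formula is right.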

Then calculating the joint density function:

$\begin{aligned}f_{X,X+Y}(x,z)=\frac{\partial}{\partial z}\left[\int_{-\infty}^z\lambda^3ve^{-\lambda x-\lambda v}\,\mathrm dv\right]. \end{aligned}$

Oh wait, this is also incorrect, because $X$ and $X+Y$ are dependent.

I really don't know how to proceed; could someone help?

I've also looked at this post, but they mention the Jacobian, and my teacher has chosen to skip the paragraph about the Jacobian, so it wouldn't make sense if I had to use that.

2 Answers


Directly, by the Jacobian change of variables, we have:

$$\begin{align}f_{X,X+Y}(x,z) ~&=~ f_{X,Y}(x,z-x)~\lVert\tfrac{\partial(x,z-x)}{\partial (x,z)}\rVert\\[1ex]& =~ \lambda^2\mathsf e^{-\lambda(x+z-x)}\mathbf 1_{(x,z)\in\Bbb R^2\wedge0\leqslant x\leqslant z}\\[1ex] &=~ \lambda^2\mathsf e^{-\lambda z}\mathbf 1_{(x,z)\in\Bbb R^2\wedge 0\leqslant x\leqslant z}\end{align}$$

(The Jacobian determinant is, conveniently, $1$ in this case).
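As a numerical cross-check of this answer (my own sketch, assuming SciPy; the cut-off points `x0`, `z0` are arbitrary), integrating the claimed density $\lambda^2 e^{-\lambda z}$ over the wedge $\{0\leq x'\leq x_0,\ x'\leq z'\leq z_0\}$ should match a Monte Carlo estimate of $\mathsf P(X\leq x_0,\ X+Y\leq z_0)$:

```python
import numpy as np
from scipy.integrate import dblquad

lam, x0, z0 = 1.0, 1.0, 2.0   # arbitrary rate and cut-off points

# P(X <= x0, X+Y <= z0) from the claimed density lam^2 * exp(-lam*z),
# integrated over {0 <= x <= x0, x <= z <= z0}.
# dblquad integrates f(z, x): x is the outer variable, z the inner one.
prob, _ = dblquad(lambda z, x: lam**2 * np.exp(-lam * z),
                  0, x0, lambda x: x, lambda x: z0)

# Monte Carlo estimate of the same probability.
rng = np.random.default_rng(1)
X = rng.exponential(1/lam, 200_000)
Y = rng.exponential(1/lam, 200_000)
mc = np.mean((X <= x0) & (X + Y <= z0))
print(prob, mc)  # the two estimates should agree closely
```

For $\lambda=1$, $x_0=1$, $z_0=2$ the exact value is $(1-e^{-1})-e^{-2}\approx 0.4968$, and the simulation should land within Monte Carlo error of it.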


By your method (using the Fundamental Theorem of Calculus):

$$\begin{align}f_{X,X+Y}(x,z) ~&=~ \dfrac{\partial^2\qquad }{\partial~x~\partial~z}\mathsf P(X\leq x,X+Y\leq z) \\[1ex] &=~ \dfrac{\partial^2\qquad }{\partial~x~\partial~z} \int_0^x f_X(s)\mathsf P(Y\leq z-X\mid X=s)\operatorname d s \\[1ex] &=~ f_X(x)\,\dfrac{\partial\quad}{\partial~ z~}\mathsf P(Y\leq z-x) \\[1ex] &= f_X(x)\,\dfrac{\partial\quad}{\partial~ z~}\int_0^{z-x}f_Y(t)\operatorname d t \\[1ex] &=~ f_X(x)\,f_Y(z-x) \\[1ex] &=~ \text{(see above)}\end{align} $$
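The last simplification, $f_X(x)\,f_Y(z-x)=\lambda^2 e^{-\lambda z}$, can also be verified symbolically. A small SymPy sketch of my own (variable names are mine):

```python
import sympy as sp

x, z, lam = sp.symbols('x z lam', positive=True)
f_X = lam * sp.exp(-lam * x)          # Exponential(lam) density at x
f_Y = lam * sp.exp(-lam * (z - x))    # Exponential(lam) density at z - x

# The product should collapse to lam^2 * exp(-lam*z): the x-terms cancel.
product = sp.simplify(f_X * f_Y)
print(product)
```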

  • I don't understand how you came up with $\begin{aligned}\dfrac{\partial^2}{\partial~x~\partial~z} \int_0^x f_X(s)\mathbb P(Y\leq z-X\mid X=s)\,\mathrm d s.\end{aligned}$ Is this a theorem? I am familiar with conditional density/distribution functions, but I haven't seen this before. I can follow the rest of the steps. (2017-01-27)
  • @ShaVulklia It is just the *Law of Total Probability*, as applied to continuous random variables. $$\mathsf P(X{\leqslant} x, Z{\leqslant} z) = \int_{s \leqslant x} f_X(s)\,\mathsf P(Z{\leqslant} z\mid X{=}s)\operatorname d s$$ It is analogous to the application to discrete random variables $$\mathsf P(U\leq u, V\leq v) = \sum_{k:k\leq u} \mathsf P(U=k)\,\mathsf P(V\leq v\mid U=k)$$ (2017-01-28)
  • Wait, but why do we get $f_X(x)\frac{\partial}{\partial z}\mathbb P(Y\leq z-x)$? I would have thought that we'd get $f_X(x)\frac{\partial}{\partial z}\mathbb P(Y\leq z-x\mid X=x)$. (2017-01-29)
  • $Y$ is *independent* of $X$, so $\mathsf P(Y\leq z-X\mid X=x) =\mathsf P(Y\leq z-x)$. (2017-01-29)
  • Ohhhhh, of course! Thank you. (2017-01-29)

In this solution I try to do everything step by step, without referring to more advanced knowledge. The most honorable OP has to accept that this problem is not that simple. (Elementary but not simple; not tricky, but it needs work.)

Let's concentrate on the joint distribution function:

$$F_{X,X+Y}(x,y)=P(X<x,\ X+Y<y)=\int_0^{\infty}P(X<x,\ X+Y<y\mid X=z)\,\lambda e^{-\lambda z}\ dz.$$

After substituting $z$ for $X$ we get $$P(X<x,\ X+Y<y\mid X=z)=P(z<x,\ Y<y-z)=\mathbb I_{z<x}\,P(Y<y-z).$$

Since $\mathbb I_{z<x}=0$ for $z\geq x$, and $P(Y<y-z)=0$ for $z\geq y$, the integral above equals

$$\lambda\int_0^{\min(x,y)}P(Y<y-z)\,e^{-\lambda z}\ dz=$$

$$=\lambda\int_0^{\min(x,y)}\left(1-e^{-\lambda(y-z)}\right)e^{-\lambda z}\ dz=$$ $$=\lambda\int_0^{\min(x,y)}e^{-\lambda z}\ dz-\lambda e^{-\lambda y}\int_0^{\min(x,y)}\ dz=$$ $$=1-e^{-\lambda \min(x,y)}-\lambda e^{-\lambda y}\min(x,y).$$
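This closed form can be sanity-checked against simulation. A sketch of my own with NumPy (rate and test points arbitrary), comparing $F_{X,X+Y}$ to an empirical estimate of $P(X<x,\ X+Y<y)$:

```python
import numpy as np

lam = 1.0
rng = np.random.default_rng(2)
X = rng.exponential(1/lam, 500_000)
Y = rng.exponential(1/lam, 500_000)

def F(x, y):
    # Closed form: 1 - e^{-lam*min(x,y)} - lam*min(x,y)*e^{-lam*y}
    m = min(x, y)
    return 1 - np.exp(-lam * m) - lam * m * np.exp(-lam * y)

# Compare at a few points, including one with y < x
# (where min(x, y) = y) and one on the diagonal.
diffs = []
for x0, y0 in [(1.0, 2.0), (3.0, 1.5), (0.5, 0.5)]:
    mc = np.mean((X <= x0) & (X + Y <= y0))
    diffs.append(abs(F(x0, y0) - mc))
    print(x0, y0, F(x0, y0), mc)
```

With half a million samples the empirical and closed-form values should agree to about three decimal places at every test point.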

Now, let's compute the joint density. For $x<y$ we have

$$f_{X,X+Y}(x,y)=\frac{\partial^2 F_{X,X+Y}}{\partial y \partial x}=\frac{\partial}{\partial y}\frac{\partial}{\partial x}\left[1-e^{-\lambda x}-\lambda e^{-\lambda y}x\right]=$$ $$=\lambda\frac{\partial}{\partial y}\left(e^{-\lambda x}-e^{-\lambda y}\right)=\lambda^2e^{-\lambda y}.$$

But if $y\leq x$ then the operation above results in $0$.

As a result we have

$$f_{X,X+Y}(x,y)=\begin{cases} \lambda^2e^{-\lambda y}&\text{ if } 0\leq x\leq y\\ 0& \text{ otherwise}. \end{cases}$$
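As a final consistency check (a SciPy sketch of my own, with an arbitrary rate), this density should integrate to $1$ over the region $0\leq x\leq y$:

```python
import numpy as np
from scipy.integrate import dblquad

lam = 2.0   # arbitrary rate parameter

# Integrate lam^2 * exp(-lam*y) over {0 <= x < inf, x <= y < inf}.
# The inner integral over y gives lam*exp(-lam*x), an Exponential(lam)
# density in x, which integrates to 1.
total, err = dblquad(lambda y, x: lam**2 * np.exp(-lam * y),
                     0, np.inf, lambda x: x, lambda x: np.inf)
print(total)
```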

  • I don't understand how you came up with $\int_0^{\infty}P(X<x,\ X+Y<y\mid X=z)\,\lambda e^{-\lambda z}\ dz$. (2017-01-27)
  • Instead of wasting another hour trying to solve this, I'm just going to read through the paragraph about the Jacobian, and then solve it quickly. (2017-01-27)
  • Using the independence of $X$ and $Y$: $P(X<x,\ X+Y<y\mid X=z)=\mathbb I_{z<x}\,P(Y<y-z)$. (2017-01-27)
  • @ShaVuklia: I've redone everything for your convenience. (2017-01-28)