
Let $X$ and $Y$ have joint pdf $f(x,y) = 4e^{-2(x+y)}$ for $0 < x < \infty$, $0 < y < \infty$, and zero otherwise.

(a) Find the CDF of $W = X + Y$

(b) Find the joint pdf of $U = X/Y$ and $V=X$

(c) Find the marginal pdf of $U$

Could someone show me the statistics behind setting up the integration? I can do the computation myself. For instance, for (b), I will at least need the Jacobian, $\begin{vmatrix} U_x & U_y\\ V_x & V_y \end{vmatrix} = \dfrac{-X}{Y^2}$

Then subbing, I get $f(u,v)=4e^{2(\frac{v}{u}+v)}$

And for the marginal, I am not continuing until I am sure (b) is right; otherwise I will waste time on unnecessary computation.
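For reference, here is a sketch of one standard way to set up (b): work with the inverse map $x = v$, $y = v/u$ and the absolute Jacobian of that inverse (equivalently, the reciprocal of the absolute value of the forward Jacobian above), keeping track of the support $u > 0$, $v > 0$:

$$
\begin{aligned}
\left|\frac{\partial(x,y)}{\partial(u,v)}\right|
  &= \left|\det\begin{pmatrix} 0 & 1 \\[4pt] -\dfrac{v}{u^{2}} & \dfrac{1}{u} \end{pmatrix}\right|
   = \frac{v}{u^{2}}, \\[6pt]
f_{U,V}(u,v) &= f_{X,Y}\!\left(v,\frac{v}{u}\right)\frac{v}{u^{2}}
   = \frac{4v}{u^{2}}\,e^{-2\left(v+\frac{v}{u}\right)}, \qquad u > 0,\ v > 0.
\end{aligned}
$$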

(a) $\int_{0}^{w} \int_{0}^{w-x} 4e^{-2(x+y)}dydx$
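For completeness, a sketch of evaluating this setup (for $w > 0$; $F_W(w) = 0$ otherwise):

$$
\begin{aligned}
F_W(w) &= \int_{0}^{w}\int_{0}^{w-x} 4e^{-2(x+y)}\,dy\,dx
        = \int_{0}^{w}\bigl(2e^{-2x} - 2e^{-2w}\bigr)\,dx \\[4pt]
       &= 1 - e^{-2w} - 2we^{-2w},
\end{aligned}
$$

so differentiating gives the pdf $f_W(w) = 4we^{-2w}$ for $w > 0$.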

Thanks

  • No, it is **not** always a given that the new bivariate random variable $(U,V)$ has the same range as $(X,Y)$, though it does happen to be true in this instance. For example, $(X+Y,X-Y)$ has a different range than $(X,Y)$. So it is important to figure out what the range is in each case, and at least mention what it is. You cannot simply say $f(u,v)=4e^{2(v/u+v)}$ without any qualifications, because most people would assume that the range is the entire plane in such cases. Note, by the way, that you do not have a valid density $f(u,v)$ _even_ if you say that $u, v > 0$. (2012-11-27)
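To illustrate that last point: with the Jacobian factor and the support $u, v > 0$ included, as in the sketch above, integrating out $v$ does give a proper marginal for (c):

$$
f_U(u) = \int_{0}^{\infty} \frac{4v}{u^{2}}\,e^{-2v\left(1+\frac{1}{u}\right)}\,dv
       = \frac{4}{u^{2}}\cdot\frac{1}{\bigl(2\left(1+\frac{1}{u}\right)\bigr)^{2}}
       = \frac{1}{(1+u)^{2}}, \qquad u > 0,
$$

which integrates to $1$ over $(0,\infty)$, whereas the unqualified expression $4e^{2(v/u+v)}$ is not even integrable over $u, v > 0$.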

1 Answer


You may also notice that the joint pdf splits into a product of the form $f(x)g(y)$, meaning that $X$ and $Y$ are two independent gaussian random variables. Then everything becomes much easier with convolutions and related tools...

  • **gaussian** random variables? (2012-11-27)
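For what it is worth, here is a sketch of the convolution route the answer suggests. The joint pdf factors as $4e^{-2(x+y)} = \bigl(2e^{-2x}\bigr)\bigl(2e^{-2y}\bigr)$, so $X$ and $Y$ are independent exponential random variables with rate $2$ (exponential rather than Gaussian, as the comment notes), and for $w > 0$

$$
f_W(w) = \int_{0}^{w} f_X(x)\,f_Y(w-x)\,dx
       = \int_{0}^{w} 2e^{-2x}\cdot 2e^{-2(w-x)}\,dx
       = 4we^{-2w},
$$

which agrees with differentiating the CDF obtained in (a).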