
I am preparing for an upcoming exam and trying to work through all the examples in the book, so I suppose this counts as homework. I am stuck on one particular example: I have already done most of the work, but I hit a wall and cannot find a way forward.

We are given two independent random variables $X$ and $Y$ with the same law: \begin{align*} P_X(dx)= P_Y(dx) = e^{-x} \mathbb{1}_{\mathbb{R}_+}dx \end{align*} Define $U=\min\{X,Y\}$ and $V=|X-Y|$.

I have to prove that $U$ and $V$ are independent random variables.

I use the criterion that two random variables $X$ and $Y$ are independent iff $\mathbb{E}(\gamma(X)\psi(Y)) = \mathbb{E}(\gamma(X)) \mathbb{E}(\psi(Y))$ for all bounded measurable $\gamma$ and $\psi$.

I started by calculating the densities. I got: \begin{align*} P_U(dx)&=2e^{-2x}\mathbb{1}_{\mathbb{R}_+}dx\\ P_V(dx)&=e^{-x}\mathbb{1}_{\mathbb{R}_+}dx \end{align*}

Then I got stuck on the integral. \begin{align*} \mathbb{E}(\gamma(U)\psi(V))=\int^{\infty}_{0} \left[ \int^{\infty}_{0} \gamma(\min\{x,y\})\psi(|x-y|)e^{-x}dx\right]e^{-y}dy \end{align*} In a previous example with a function of two variables, the professor used a change of variables. I tried to rewrite the integral in terms of $u$ and $v$ only, but since $\min\{x,y\}$ and $|x-y|$ are not differentiable everywhere, I don't think that trick can be used here. Maybe someone has encountered such problems before and knows a way out.
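As a numerical sanity check of the densities claimed above (a simulation sketch, not part of any proof; the sample size is arbitrary), one can verify that $U$ behaves like a rate-$2$ exponential (mean $1/2$) and $V$ like a rate-$1$ exponential (mean $1$):

```python
import random

random.seed(0)
n = 200_000

# Draw iid standard exponential pairs (X, Y) and form U = min(X, Y), V = |X - Y|.
us, vs = [], []
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    us.append(min(x, y))
    vs.append(abs(x - y))

mean_u = sum(us) / n  # should be close to 1/2 (rate-2 exponential)
mean_v = sum(vs) / n  # should be close to 1   (rate-1 exponential)
```

With $n = 200{,}000$ samples, both empirical means should land within a few thousandths of the theoretical values.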

I'd appreciate your help.

  • There is a formula for the joint pdf of $U$ and $V$ if you know the joint pdf of $X$ and $Y$. I don't know it off the top of my head, but it is probably in your book. Find it, and check that it factors into a function of $u$ times a function of $v$. (2012-05-20)
  • @user20520 The standard "formula" for the joint pdf requires differentiable functions (the Jacobian involves derivatives, remember?), but here the functions are not differentiable. (2012-05-20)
  • Maybe this question should also be posted on stats.SE? (2012-05-20)

1 Answer


By definition and symmetry, for every bounded measurable $\varphi$, $\mathrm E(\varphi(U,V))=(\ast)$ with $$ (\ast)=2\mathrm E(\varphi(X,Y-X):X\lt Y)=2\iint [0\lt x\lt y]\,\varphi(x,y-x)\mathrm e^{-x-y}\mathrm dx\mathrm dy. $$ The change of variables $u=x$, $v=y-x$ yields $x=u$, $y=u+v$, hence the Jacobian is $1$ and $$ (\ast)=2\iint [0\lt u,0\lt v]\,\varphi(u,v)\mathrm e^{-2u-v}\mathrm du\mathrm dv=\mathrm E(\varphi(\tfrac12X,Z)), $$ where $Z$ is standard exponential and independent of $X$. Since this holds for every $\varphi$, the random variables $U$ and $V$ are independent and exponentially distributed, with $2U$ and $V$ standard exponential; that is, $U$ is distributed like $\frac12X$ and $V$ like $X$.
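To illustrate the factorization numerically (a Monte Carlo sketch, not part of the argument; the thresholds $a$, $b$ and the sample size are arbitrary choices), one can check that the empirical joint CDF of $(U,V)$ factors into the product of the claimed marginals $1-e^{-2a}$ and $1-e^{-b}$:

```python
import math
import random

random.seed(1)
n = 200_000
a, b = 0.3, 0.7  # arbitrary test thresholds

count_joint = count_u = count_v = 0
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    u, v = min(x, y), abs(x - y)
    in_u = u <= a
    in_v = v <= b
    count_u += in_u
    count_v += in_v
    count_joint += in_u and in_v

p_u = count_u / n          # empirical P(U <= a), compare 1 - exp(-2a)
p_v = count_v / n          # empirical P(V <= b), compare 1 - exp(-b)
p_joint = count_joint / n  # empirical P(U <= a, V <= b)
# Independence: P(U <= a, V <= b) should be close to P(U <= a) * P(V <= b).
```

Repeating this for several $(a,b)$ pairs gives the same agreement, consistent with $U$ and $V$ being independent with $2U$ and $V$ standard exponential.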

  • Compare with [this](http://math.stackexchange.com/a/30966/6179). (2012-05-20)
  • So, in comparison to my example, you use $\varphi(U,V)=\gamma(U)\psi(V)$? I didn't really understand the introduction of the variable $Z$. Why did we have to introduce it? Why is it independent of $X$? Does $Z$ equal $V$? (2012-05-21)
  • As written in the post, $Z$ can be any standard exponential random variable independent of $X$. Hence, no, $Z$ is not $V$. (2012-05-21)
  • Can you be more specific about why we can introduce such a variable $Z$? (2012-05-21)
  • The idea is that $(\ast)=\mathrm E(\varphi(R,S))$ for **any** independent random variables such that $R$ is exponential with parameter $2$ and $S$ is exponential with parameter $1$. Hence $(U,V)$ is distributed like any such $(R,S)$, for example like $(\frac12X,Z)$ as in the post. (2012-05-21)
  • Also, a question about the $2$ in front of the integral: is it there because of the symmetry, or because the two variables have the same distribution? Can you explain more about the symmetry? (2012-05-21)
  • $(\ast)=E(\varphi(X,Y-X);X\lt Y)+E(\varphi(Y,X-Y);Y\lt X)$, and both terms in the RHS are equal since the distributions of $(X,Y)$ and $(Y,X)$ are equal. (2012-05-22)