
Let $X_1$ and $X_2$ be iid random variables. The question is to find $P(X_1 < X_2)$ using transformations.

So, what I tried:

Let $Y_1 = X_1 - X_2$ and $Y_2 = X_2$, i.e. $u = x - y$, $v = y$, so the inverse map is $x = u + v$, $y = v$ and $|J| = 1$. Then $f_{uv}(u,v) = f_{xy}(u+v,\,v)$, and

$P(Y_1 < 0,\, Y_2 < \infty) = \int_{-\infty}^{0} \int_{-\infty}^{\infty} f_{xy}(u+v,\,v)\,\mathrm dv\,\mathrm du.$

Now, I'm stuck.
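
For a quick numerical sense of the integral above, here is a minimal sketch that evaluates it on a truncated grid, assuming for illustration only that the common density $f$ is standard normal (any continuous density would do); it is not part of the original question.

```python
import numpy as np

# A numerical sketch of the double integral above,
#   P(Y1 < 0) = ∫_{-∞}^{0} ∫_{-∞}^{∞} f(u+v) f(v) dv du,
# assuming (for illustration only) the common density f is standard normal.

def f(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

u = np.linspace(-8.0, 0.0, 801)    # outer variable: (-inf, 0), truncated at -8
v = np.linspace(-8.0, 8.0, 1601)   # inner variable: (-inf, inf), truncated at +/-8
U, V = np.meshgrid(u, v, indexing="ij")

inner = np.trapz(f(U + V) * f(V), v, axis=1)  # inner integral over v, for each u
print(np.trapz(inner, u))                     # outer integral over u; close to 0.5
```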

  • By symmetry in the roles of $X_1$ and $X_2$, we have $\Bbb{P}(X_1 < X_2) = \Bbb{P}(X_2 < X_1)$. This implies that $2\,\Bbb{P}(X_1 < X_2) = 1 - \Bbb{P}(X_1 = X_2)$. Assuming that $X_1$ has a continuous distribution, what can you say about the probability on the right-hand side? (2012-10-01)

2 Answers


Here is another way of approaching the problem via transformations, one that fleshes out the ideas behind the comments of sos440 and myself.

Let $f(\cdot)$ denote the common density of the continuous iid random variables $X_1$ and $X_2$, so that $f_{X_1,X_2}(x,y) = f(x)f(y)$. Let $Y_1 = X_2$ and $Y_2 = X_1$. It should be obvious that $f_{Y_1,Y_2}(u,v) = f(u)f(v)$, but if not, this can be readily verified using Jacobians and the like. From the definitions, it follows that $P\{Y_1 > Y_2\} = P\{X_2 > X_1\} = P\{X_1 < X_2\}$. But

$\begin{align} P\{X_1 > X_2\} &= \int_{-\infty}^\infty \int_{-\infty}^x f(x)f(y)\,\mathrm dy\,\mathrm dx\\ &= \int_{-\infty}^\infty \int_{-\infty}^u f(u)f(v)\,\mathrm dv\,\mathrm du &&\text{(names of variables don't matter)}\\ &= P\{Y_1 > Y_2\}\\ &= P\{X_1 < X_2\}. \end{align}$

Since $P\{X_1 > X_2\} + P\{X_1 < X_2\} + P\{X_1 = X_2\} = 1$, and $P\{X_1 = X_2\} = 0$ for continuous iid random variables, it follows that

$P\{X_1 > X_2\} = P\{X_1 < X_2\} = \frac{1}{2}.$
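
As an empirical cross-check of this conclusion, here is a short Monte Carlo sketch; the exponential distribution is an arbitrary illustrative choice of the common continuous distribution, not something specified in the question.

```python
import numpy as np

# Monte Carlo check: for iid continuous X1, X2, P(X1 < X2) should be 1/2.
# The exponential distribution is an arbitrary illustrative choice.
rng = np.random.default_rng(seed=0)
n = 1_000_000
x1 = rng.exponential(size=n)
x2 = rng.exponential(size=n)

print(np.mean(x1 < x2))  # expected to be close to 0.5
```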


Following your work, you have

$f_{u v}(u,v) = f_{x,y}(u+v,v) = f_x(u+v) f_y(v)$

where the last equality uses the independence of $X_1$ and $X_2$. So, marginalizing over $v$,

$f_u(u) = \int_{-\infty}^{\infty} f_x(v+u) f_y(v) dv$

But

$f_u(-u) = \int_{-\infty}^{\infty} f_x(v-u) f_y(v) dv$

which, after the change of variables $v' = v - u$, becomes $f_u(-u) = \int_{-\infty}^{\infty} f_x(v') f_y(v'+u)\, dv'$. Since $f_x = f_y$ (the variables are identically distributed), this is the same integral as $f_u(u)$, so $f_u(-u) = f_u(u)$. Hence (as intuition suggests) $f_u$ is symmetric about zero, and, assuming the common distribution is continuous so that $P(X_1 = X_2) = 0$, the probability that $Y_1 = X_1 - X_2$ is negative is $1/2$, i.e. $P(X_1 < X_2) = \tfrac{1}{2}$.
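
To see this symmetry numerically, here is a small sketch that evaluates the convolution integral $f_u(u) = \int f_x(v+u) f_y(v)\,dv$ for an illustrative choice of $f_x = f_y$, the Exponential(1) density (an assumption, not part of the question); in that case the exact answer is the Laplace density $\tfrac12 e^{-|u|}$.

```python
import numpy as np

# Numerically evaluate f_u(u) = ∫ f_x(v+u) f_y(v) dv with f_x = f_y = f,
# taking f to be the Exponential(1) density (an illustrative assumption).
# For this f the exact result is the Laplace density 0.5 * exp(-|u|).

def f(x):
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    pos = x >= 0.0
    out[pos] = np.exp(-x[pos])
    return out

v = np.linspace(0.0, 40.0, 400_001)  # f(v) vanishes for v < 0

def f_u(u):
    return np.trapz(f(v + u) * f(v), v)

for u in (0.5, 1.0, 2.0):
    # f_u(u), f_u(-u) and the exact Laplace value should agree closely
    print(u, f_u(u), f_u(-u), 0.5 * np.exp(-abs(u)))
```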