
Let $X$ and $Y$ be two i.i.d. random variables. I am trying to prove that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$.

I haven't studied joint distributions yet. Here is my attempt for the proof.

Let $\Omega$ be the sample space of interest. Then we have the following.

$\mathbb{P}(X<Y)=\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)$

$\mathbb{P}(Y<X)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)$

One way to show that $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$ is to show that the following equation holds:

$$\mathbb{P}\left(\{\omega\in\Omega:X(\omega)<Y(\omega)\}\right)=\mathbb{P}\left(\{\omega\in\Omega:Y(\omega)<X(\omega)\}\right)\tag{1}$$

However, I couldn't continue further using the above approach. Instead, I am using the following method, but it works only if $X$ and $Y$ are discrete random variables.

Let $A:=\{x\in\mathbb{R}:\mathbb{P}(X=x)>0\}$. Then $A$ is the support of $X$, and also of $Y$, since $X$ and $Y$ are i.i.d.

Let $B:=\{(x,y)\in A\times A:x<y\}$.

We then have

$\begin{equation*}\begin{split}\mathbb{P}(X<Y)&=\sum_{(x,y)\in B}\mathbb{P}(X=x)\,\mathbb{P}(Y=y)\\&=\sum_{(x,y)\in B}\mathbb{P}(Y=x)\,\mathbb{P}(X=y)\\&=\mathbb{P}(Y<X),\end{split}\end{equation*}$ where the first equality uses independence, the second uses that $X$ and $Y$ are identically distributed, and the last follows by relabeling each pair $(x,y)$ as $(y,x)$.
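The finite-sum argument above is easy to sanity-check numerically. Below is a minimal sketch using an arbitrary discrete distribution on $\{1,2,3,4\}$ (the particular probabilities are chosen purely for illustration):

```python
# Sanity check of the discrete argument: for a small i.i.d. pair,
# compute P(X < Y) and P(Y < X) by summing p(x) * p(y) over the
# relevant pairs, as in the displayed sum over B.
support = [1, 2, 3, 4]
probs = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}  # arbitrary illustrative pmf

p_x_lt_y = sum(probs[x] * probs[y] for x in support for y in support if x < y)
p_y_lt_x = sum(probs[x] * probs[y] for x in support for y in support if y < x)

print(p_x_lt_y, p_y_lt_x)  # the two sums agree
```

Note that the two probabilities are equal but not $1/2$ each, since $\mathbb{P}(X=Y)>0$ in the discrete case.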

The above proof holds when $X$ and $Y$ are discrete. But how do I approach this problem when $X$ and $Y$ are continuous random variables? Also, is it possible to prove the above fact by directly showing that equation $(1)$ holds?

2 Answers


$P(X<Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{y} f(x)\, f(y) \;dx \;dy$, where $f$ is the common density of $X$ and $Y$.

Now relabel the dummy variables of integration (swap the names $x$ and $y$; this is allowed because the integrand $f(x)f(y)$ is symmetric in $x$ and $y$):

$\int_{-\infty}^{\infty}\int_{-\infty}^x f(x) f(y) \;dy \;dx = P(Y<X)$, so $P(X<Y)=P(Y<X)$.
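A quick Monte Carlo simulation illustrates the continuous case; the standard normal density here is an arbitrary choice:

```python
# Empirical check of the continuous case: draw i.i.d. pairs and
# compare the frequencies of {X < Y} and {Y < X}.
import random

random.seed(0)
n = 200_000
x_lt_y = y_lt_x = 0
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    if x < y:
        x_lt_y += 1
    elif y < x:
        y_lt_x += 1

print(x_lt_y / n, y_lt_x / n)  # both ≈ 0.5
```

In the continuous case $\mathbb{P}(X=Y)=0$, so both frequencies also approach $1/2$, not merely each other.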


Since $X$ and $Y$ are independent and have the same distribution, the characteristic function satisfies $$\varphi_{X-Y}(t)=\varphi_X(t)\varphi_{-Y}(t)=\varphi_X(t)\varphi_{-X}(t)$$ and $$\varphi_{Y-X}(t)=\varphi_Y(t)\varphi_{-X}(t)=\varphi_X(t)\varphi_{-X}(t).$$ Hence $\varphi_{X-Y}(t)=\varphi_{Y-X}(t)$, so $X-Y$ and $Y-X$ have the same distribution. In particular, $\mathbb{P}(X-Y<0)=\mathbb{P}(Y-X<0)$, that is, $\mathbb{P}(X<Y)=\mathbb{P}(Y<X)$.
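The conclusion that $X-Y$ and $Y-X$ are equal in distribution can be checked empirically by comparing sample quantiles; the exponential distribution below is an arbitrary choice (the difference of two i.i.d. exponentials is Laplace, which is symmetric about $0$):

```python
# Empirical check that X - Y and Y - X share a distribution:
# draw i.i.d. exponential pairs and compare a few empirical
# quantiles of X - Y against the same quantiles of Y - X.
import random

random.seed(1)
pairs = [(random.expovariate(1.0), random.expovariate(1.0)) for _ in range(100_000)]
d = sorted(x - y for x, y in pairs)
neg_d = sorted(y - x for x, y in pairs)

for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    i = int(q * len(d))
    # the two quantile estimates agree up to sampling error
    print(q, d[i], neg_d[i])
```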