
So far I have the following, but I am not sure whether this approach is allowed or whether it is correct:

I said that since $X$ and $Y$ are independent and follow the same distribution, they must have the same conditional expectation given $S$, so:

$E(X|S=s)=E(X|X+Y=s)=E(Y|X+Y=s)=E(Y|S=s)$ since $X$ and $Y$ are i.i.d. Then, can I say the following?

$E(X|S=s)+E(Y|S=s)=E(X+Y|S=s)=s,$

and since $E(X|S=s)=E(Y|S=s)$, this gives $E(X|S=s)+E(Y|S=s)=2E(X|S=s)=s,$

so $E(X|S=s)=\frac{s}{2}$, i.e. $E(X|S)=\frac{S}{2}$.

These are my steps and logic, but I am not sure whether they are correct, so any feedback on my work would be greatly appreciated. Thanks!
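As a concrete check (taking, for instance, $X$ and $Y$ i.i.d. Bernoulli($p$)), the formula seems to come out right:

$E(X|S=1)=\frac{P(X=1,\,Y=0)}{P(S=1)}=\frac{p(1-p)}{2p(1-p)}=\frac{1}{2}=\frac{s}{2},$

and similarly $E(X|S=0)=0$ and $E(X|S=2)=1$, which also equal $s/2$.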

1 Answer


Yes, the steps are correct; just be careful with the notation (condition on $S=s$ consistently rather than mixing $s$ with $x+y$).

Since $X$ and $Y$ are independent and identically distributed, you have $\mathbb{E}(X|S)=\mathbb{E}(Y|S)$. To prove this properly, you can use the transformation theorem:

$\mathbb{E}(X \cdot 1_B(S)) = \int x \cdot 1_B(x+y) \, \underbrace{d\mathbb{P}_{X,Y}(x,y)}_{=\,d\mathbb{P}_X(x) \, d\mathbb{P}_Y(y)\,=\,d\mathbb{P}_Y(x) \, d\mathbb{P}_X(y)} = \int y \cdot 1_B(x+y) \, d\mathbb{P}_{X,Y}(x,y) = \mathbb{E}(Y \cdot 1_B(S))$

for all $B \in \mathcal{B}(\mathbb{R})$. This shows that $\int_A X \, d\mathbb{P} = \int_A Y \, d\mathbb{P}$ for every $A = S^{-1}(B) \in \sigma(S)$, which is exactly the defining property of conditional expectation, so $\mathbb{E}(X|S)=\mathbb{E}(Y|S)$ almost surely.
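If you prefer densities (assuming, for concreteness, that $X$ and $Y$ have a common density $f$), the same symmetry argument reads

$\mathbb{E}(X \cdot 1_B(S)) = \iint x \, 1_B(x+y)\, f(x)\, f(y) \, dx \, dy = \iint y \, 1_B(x+y)\, f(y)\, f(x) \, dx \, dy = \mathbb{E}(Y \cdot 1_B(S)),$

where the middle equality just relabels the integration variables $x \leftrightarrow y$.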

The rest is fine: since $S=X+Y$ is $\sigma(S)$-measurable, you have

$S=\mathbb{E}(S|S)=\mathbb{E}(X|S)+\mathbb{E}(Y|S)=2 \mathbb{E}(X|S)=2\mathbb{E}(Y|S) \\ \Rightarrow \mathbb{E}(X|S)=\mathbb{E}(Y|S)=\frac{S}{2}$
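As an illustration (a concrete instance, taking $X$ and $Y$ i.i.d. $\mathrm{Exp}(1)$): the conditional density of $X$ given $S=s$ is

$f_{X|S}(x|s)=\frac{f_X(x)\, f_Y(s-x)}{f_S(s)}=\frac{e^{-x}\, e^{-(s-x)}}{s\, e^{-s}}=\frac{1}{s}, \qquad 0<x<s,$

i.e. uniform on $(0,s)$, so $\mathbb{E}(X|S=s)=\int_0^s \frac{x}{s}\, dx=\frac{s}{2}$, in agreement with the general result.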

A similar argument shows that

$\mathbb{E}(X_j|S_n)=\frac{S_n}{n}$

where $(X_j)_j$ are i.i.d. random variables and $S_n := \sum_{j=1}^n X_j$.
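If you want a quick numerical sanity check of $\mathbb{E}(X_j|S_n)=\frac{S_n}{n}$, here is a small simulation sketch (the choice of $\mathrm{Exp}(1)$ samples, $n=3$, the target value $s$, and the bin width are arbitrary):

```python
import numpy as np

# Monte Carlo sketch: check E(X_1 | S_n) ~ S_n / n by conditioning on S_n
# falling into a narrow bin around a target value s.
rng = np.random.default_rng(0)
n, m = 3, 1_000_000
X = rng.exponential(scale=1.0, size=(m, n))  # i.i.d. Exp(1) samples
S = X.sum(axis=1)

s, eps = 3.0, 0.05                           # target value and bin half-width
mask = np.abs(S - s) < eps
print("empirical E(X_1 | S_n ~ s):", X[mask, 0].mean())
print("predicted  s / n:          ", s / n)
```

The two printed values should agree up to Monte Carlo and binning error.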