
Let $X_1, X_2, \ldots$ be independent, identically distributed random variables with $E\left|X_i\right| < \infty$. Show that

$E(X_1\mid S_n,S_{n+1},\ldots)=\dfrac{S_n}{n}$ (a.s.),

where $S_n = X_1 + \cdots + X_n$.

  • @NateEldredge: The fact that this conditional expectation depends on $S_n,S_{n+1},S_{n+2},\ldots$ only through $S_n$ may seem to have a trivial proof, which you just gave, but does that mean it's a trivial fact? Perhaps it has non-trivial consequences. And that fact is not mentioned in that other question. (2012-05-06)

1 Answer


Observe that the information given by $S_n,S_{n+1},S_{n+2},S_{n+3},\ldots$ is the same as that given by $S_n,X_{n+1},X_{n+2},X_{n+3},\ldots$, i.e., if you know either, you can compute the other. Or, if you like, both sequences generate the same sigma-algebra. So you're looking for $ \mathbb{E}(X_1\mid S_n,S_{n+1},S_{n+2},S_{n+3},\ldots) = \mathbb{E}(X_1\mid S_n,X_{n+1},X_{n+2},X_{n+3},\ldots), $ and the $X$s on which you're conditioning in the second expression are independent of $X_1$ and so can be dropped.
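To spell both steps out (routine, but written here for completeness): for $k \ge 1$,

$$X_{n+k} = S_{n+k} - S_{n+k-1} \qquad\text{and}\qquad S_{n+k} = S_n + X_{n+1} + \cdots + X_{n+k},$$

so each collection is a function of the other, and the two generate the same sigma-algebra. Since $(X_{n+1}, X_{n+2}, \ldots)$ is independent of the pair $(X_1, S_n)$,

$$\mathbb{E}(X_1 \mid S_n, X_{n+1}, X_{n+2}, \ldots) = \mathbb{E}(X_1 \mid S_n) \quad \text{a.s.}$$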

The rest is just the symmetry argument cited in the comments above by Nate Eldredge: since the $X_i$ are i.i.d., the pairs $(X_k, S_n)$ all have the same joint distribution, so this expectation is the same as $\mathbb{E}(X_k\mid S_n)$ for $k=1,\ldots,n$, and it's easy to find the sum of those $n$ expectations.
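Concretely, the symmetry computation is one line:

$$S_n = \mathbb{E}(S_n \mid S_n) = \sum_{k=1}^{n} \mathbb{E}(X_k \mid S_n) = n\,\mathbb{E}(X_1 \mid S_n),$$

and dividing by $n$ gives $\mathbb{E}(X_1 \mid S_n) = S_n/n$ a.s.

Not part of the proof, but as a quick numerical illustration, here is a minimal Monte Carlo sketch (assuming Python with NumPy; the Exponential(1) choice of distribution is arbitrary) that approximates $\mathbb{E}(X_1 \mid S_n)$ by bucketing samples on $S_n$ and compares it with $S_n/n$:

```python
import numpy as np

# Sanity check: with i.i.d. Exponential(1) variables, the conditional
# mean of X_1 given S_n should track S_n / n.
rng = np.random.default_rng(0)
n, trials = 5, 200_000

X = rng.exponential(scale=1.0, size=(trials, n))
S = X.sum(axis=1)                      # S_n for each trial

# "Condition" on S_n by bucketing it into narrow bins and averaging
# X_1 within each bin -- a crude nonparametric estimate of E(X_1 | S_n).
edges = np.linspace(S.min(), S.max(), 40)
idx = np.digitize(S, edges)
for b in range(10, 15):                # a few well-populated interior bins
    mask = idx == b
    if mask.sum() > 1000:
        print(f"E[X_1 | S_n ~= {S[mask].mean():.2f}]: "
              f"{X[mask, 0].mean():.3f}   vs   S_n/n: {S[mask].mean()/n:.3f}")
```

Within each narrow bin the empirical mean of $X_1$ should agree with the bin's mean of $S_n/n$ up to sampling noise, which is exactly the identity being proved.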