
Let $X_1, X_2, \ldots $ be independent identically distributed random variables with $ E\left | X_i \right |<\infty$. Show that

$E(X_1\mid S_n,S_{n+1},\ldots)=\dfrac{S_n}{n}$ (a.s.),

where $S_n=X_1+\cdots+X_n$.
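Before proving the identity, one can sanity-check it numerically. A minimal simulation sketch (the choice of $n=5$, the exponential distribution, and the sample size are arbitrary): since $E(X_1\mid S_n)=S_n/n$ is linear in $S_n$, an ordinary least-squares fit of $X_1$ against $S_n$ should recover slope $\approx 1/n$ and intercept $\approx 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
trials = 200_000

# Each row is one realization of (X_1, ..., X_n), iid exponential with mean 1.
X = rng.exponential(1.0, size=(trials, n))
S_n = X.sum(axis=1)

# If E[X_1 | S_n] = S_n / n, regressing X_1 on S_n should give slope ~ 1/n
# and intercept ~ 0 (the best linear predictor coincides here).
slope, intercept = np.polyfit(S_n, X[:, 0], 1)
print(slope, intercept)
```

With these settings the fitted slope comes out close to $1/5 = 0.2$, consistent with the claim.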

  • possible duplicate of [Help with conditional expectation question](http://math.stackexchange.com/questions/78546/help-with-conditional-expectation-question) (2012-05-06)
  • This doesn't seem to be an exact duplicate, as Nate Eldredge suggests, since it involves the whole infinite sequence $S_n,S_{n+1},S_{n+2},\ldots$, and it seems to be part of the problem to show that the whole thing is the same as $\mathbb{E}(X_1\mid S_n)$. (2012-05-06)
  • It's not an *exact* duplicate, but the idea is the same: use symmetry to show that $E[X_k \mid S_n, S_{n+1}, \dots]$ is the same for every $1 \le k \le n$. Then add them all together and you get $S_n$. (2012-05-06)
  • @NateEldredge : The fact that this conditional expectation depends on $S_n,S_{n+1},S_{n+2},\ldots$ only through $S_n$ may seem to have a trivial proof, which you just gave, but does that mean it's a trivial fact? Perhaps it has non-trivial consequences. And that fact is not mentioned in that other question. (2012-05-06)
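The symmetry argument outlined in the comments can be written out as follows (a sketch, not the posted answer):

```latex
Let $\mathcal{G} = \sigma(S_n, S_{n+1}, S_{n+2}, \ldots)
               = \sigma(S_n, X_{n+1}, X_{n+2}, \ldots)$.
Since $(X_1,\ldots,X_n)$ is exchangeable and independent of
$(X_{n+1}, X_{n+2}, \ldots)$, the joint law of $(X_k, \mathcal{G})$
is the same for every $1 \le k \le n$, hence
\[
  E(X_k \mid \mathcal{G}) = E(X_1 \mid \mathcal{G}) \quad \text{a.s.},
  \qquad 1 \le k \le n.
\]
Summing over $k$ and using that $S_n$ is $\mathcal{G}$-measurable,
\[
  n \, E(X_1 \mid \mathcal{G})
  = \sum_{k=1}^{n} E(X_k \mid \mathcal{G})
  = E(S_n \mid \mathcal{G})
  = S_n ,
\]
so $E(X_1 \mid S_n, S_{n+1}, \ldots) = S_n / n$ a.s.
```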

1 Answer