Let $X_1, X_2, \ldots$ be independent, identically distributed random variables with $E\left|X_i\right|<\infty$. Show that
$E(X_1\mid S_n,S_{n+1},\ldots)=\dfrac{S_n}{n}$ (a.s.),
where $S_n=X_1+\cdots+X_n$.
Observe that the information given by $S_n,S_{n+1},S_{n+2},S_{n+3},\ldots$ is the same as that given by $S_n,X_{n+1},X_{n+2},X_{n+3},\ldots$, i.e., if you know either, you can compute the other. Or, if you like, both sequences generate the same sigma-algebra. So you're looking for $ \mathbb{E}(X_1\mid S_n,S_{n+1},S_{n+2},S_{n+3},\ldots) = \mathbb{E}(X_1\mid S_n,X_{n+1},X_{n+2},X_{n+3},\ldots), $ and the $X$s on which you're conditioning in the second expression are independent of $X_1$ and so can be dropped.
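Spelling that reduction out as a chain of equalities (the point being that $(X_{n+1},X_{n+2},\ldots)$ is independent of the pair $(X_1,S_n)$):
$$\mathbb{E}(X_1\mid S_n,S_{n+1},S_{n+2},\ldots) = \mathbb{E}(X_1\mid S_n,X_{n+1},X_{n+2},\ldots) = \mathbb{E}(X_1\mid S_n).$$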
The rest is just the symmetry argument cited in the comments above by Nate Eldredge: this expectation is the same as $\mathbb{E}(X_k\mid S_n)$ for each $k=1,\ldots,n$, and it's easy to find the sum of those $n$ expectations.
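Concretely, by symmetry $\mathbb{E}(X_k\mid S_n)$ takes the same value for every $k\le n$, so
$$S_n = \mathbb{E}(S_n\mid S_n) = \sum_{k=1}^{n}\mathbb{E}(X_k\mid S_n) = n\,\mathbb{E}(X_1\mid S_n),$$
which gives $\mathbb{E}(X_1\mid S_n)=\dfrac{S_n}{n}$, and hence $\mathbb{E}(X_1\mid S_n,S_{n+1},\ldots)=\dfrac{S_n}{n}$ a.s.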