Suppose the $X_{k}$ are random variables that are not necessarily independent. I wonder whether $E\left(\sum_{k=1}^{\infty}X_{k}\right)=\sum_{k=1}^{\infty}E X_{k}$. If the equation does not hold in general, what conditions are required to make it true?
A question about $E\left(\sum_{k=1}^{\infty}X_{k}\right)=\sum_{k=1}^{\infty}E X_{k}$
4 Answers
If the $X_i$ are nonnegative, it is always true, by either monotone convergence or Tonelli's theorem (thinking of the sum as an integral over $\mathbb{N}$ with respect to counting measure). If we have $\sum_k E|X_k| < \infty$ or equivalently $E \sum_k |X_k| < \infty$, then it is also true, by either dominated convergence or Fubini's theorem.
Otherwise it can fail: let $U \sim U(0,1)$, $Y_k = k 1_{\{U \le 1/k\}}$, so that $Y_k \to 0$ a.s. but $E Y_k = 1$ for each $k$. Let $Y_0 = 0$ and $X_k = Y_k - Y_{k-1}$ for $k \ge 1$. Then you can easily check that $E \sum_{k=1}^\infty X_k = 0 \ne 1 = \sum_{k=1}^\infty E X_k.$
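This counterexample is easy to check numerically. A minimal Monte Carlo sketch (the truncation level $N$ and sample size are my own choices; the partial sum $\sum_{k=1}^N X_k$ telescopes to $Y_N$):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                # truncation level: sum_{k=1}^N X_k = Y_N by telescoping
u = rng.random(10**6)   # samples of U ~ Uniform(0, 1)

# Y_N = N * 1{U <= 1/N}.  For each fixed N, E[Y_N] = N * P(U <= 1/N) = 1,
# yet for each fixed sample point, Y_N = 0 once N > 1/U, so Y_N -> 0 a.s.
Y_N = N * (u <= 1 / N)

print(Y_N.mean())          # Monte Carlo estimate of E[Y_N]; close to 1
print((Y_N == 0).mean())   # fraction of paths with Y_N = 0; close to 1 - 1/N
```

So $\sum_k E X_k = \lim_N E[Y_N] = 1$, while $E\sum_k X_k = E[\lim_N Y_N] = E[0] = 0$, matching the computation above.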
-
You're right! I was (in the LHS) taking the limit and expectation in the wrong order. – 2011-06-10
You can conclude this if the $X_i$ are nonnegative. The proof follows immediately from the monotone convergence theorem.
-
The dominated convergence theorem yields the following: if $\sum E|X_k| \lt \infty$, then $\sum X_k$ is integrable and finite a.e., and $E\sum X_k = \sum EX_k$. This seems to me at least as useful. – 2011-06-10
Elaborating on Sivaram's answer.
Suppose that the $X_i$ are nonnegative random variables on a probability space $(\Omega,\mathcal{F},P)$, and let $S_n = \sum\nolimits_{i = 1}^n X_i$. Since the sequence $(S_n)$ is monotone increasing, it converges pointwise to some $S \in [0,\infty]$; that is, for each $\omega \in \Omega$, $S(\omega) := \lim_{n \to \infty} S_n(\omega) \in [0,\infty]$.

By the monotone convergence theorem, since $0 \leq S_1 \leq S_2 \leq \cdots$ is a monotone increasing sequence of nonnegative random variables (measurable functions), the pointwise limit $S$ is also measurable and
$$ \lim_{n \to \infty} {\rm E}(S_n) := \lim_{n \to \infty} \int_\Omega S_n \, dP = \int_\Omega \Big(\lim_{n \to \infty} S_n\Big) dP =: {\rm E}\Big(\lim_{n \to \infty} S_n\Big) \in [0,\infty]. $$
Hence
$$ \sum_{n = 1}^\infty {\rm E}(X_n) = \lim_{n \to \infty} {\rm E}(S_n) = {\rm E}\Big(\lim_{n \to \infty} S_n\Big) = {\rm E}\bigg(\sum_{n = 1}^\infty X_n \bigg), $$
where the first equality follows from the linearity of expectation.

Note that if the left-hand side is finite, that is $\sum\nolimits_{n = 1}^\infty {\rm E}(X_n) < \infty$, then ${\rm E}(\sum\nolimits_{n = 1}^\infty X_n) < \infty$, implying that the pointwise limit $\sum\nolimits_{n = 1}^\infty X_n \,(= S)$ is almost surely finite (indeed, $\int_\Omega S \, dP < \infty$ implies that $S$ is almost surely finite).
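As a sanity check on the nonnegative case, here is a small numerical comparison. The specific choice $X_k = \mathbf{1}\{U \le 2^{-k}\}$ is my own, picked so that both sides equal $1$; the truncation at $K = 60$ terms is harmless since the remaining terms are numerically zero:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.random(10**6)   # samples of U ~ Uniform(0, 1)
K = 60                  # truncation level; 2^{-k} underflows past this anyway

# Nonnegative summands X_k = 1{U <= 2^{-k}}, so E[X_k] = 2^{-k}.
lhs = sum(2.0 ** -k for k in range(1, K + 1))   # sum_k E[X_k], equals 1 - 2^{-K}

# S = sum_k X_k counts how many k satisfy U <= 2^{-k}.
S = sum((u <= 2.0 ** -k).astype(float) for k in range(1, K + 1))
rhs = S.mean()                                  # Monte Carlo estimate of E[sum_k X_k]

print(lhs, rhs)   # both close to 1, as the MCT argument predicts
```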
If the $X_k$ are highly correlated, with expectations converging to $0$ fairly fast ($O(k^{-1.001})$ would be enough), then the sum on the LHS could easily be almost surely infinite while the RHS is well defined. A simple case is $X_k = X$ for all $k$, where $X$ is $-1$ or $1$ with equal probability. It isn't obvious how to construct an optimal function of the correlations so that this condition holds if and only if the function satisfies a certain bound. I doubt that such a function exists; for a start, it would have to be unaffected by changes to any finite number of the $X_k$.
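The simple case above can be made concrete with a short check (sample size and the value of $n$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
# One draw of X per sample path; the summands are X_k = X for every k.
X = rng.choice([-1.0, 1.0], size=10**5)

# Each term has E[X_k] = 0, so the RHS series sums to 0.  The partial sums
# on the LHS are S_n = n * X, so |S_n| = n on every sample path, and the
# LHS is almost surely infinite in absolute value.
n = 1000
S_n = n * X
print(X.mean())                   # near 0: each E[X_k] = 0
print((np.abs(S_n) == n).all())   # True: |S_n| = n on every path
```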
-
Yes (assuming "infinite" means "infinite in absolute value"). – 2011-06-12