
Let $(c_n)$ be a sequence of positive real numbers such that $$\sum^{\infty}_{n=1}c_n = \infty, \qquad \sum^{\infty}_{n=1}c_n^2 < \infty.$$ Let $(X_n)$ be a sequence of i.i.d. random variables with $\mathbb{E}(X_n) = 0$ and $\sigma^2(X_n) = 1$ for each $n$, and define the random variable $X = \sum^{\infty}_{n=1}c_n X_n$. Is it true that $$\mathbb{E}(X) = \sum^{\infty}_{n=1}\mathbb{E}(c_n X_n) = 0 \qquad\text{and}\qquad \sigma^2(X) = \sum^{\infty}_{n=1}\sigma^2(c_n X_n) = \sum^{\infty}_{n=1}c_n^2\,?$$ If so, how does one prove this (and is there a source, if possible)? If not, what counterexamples exist?
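For intuition, here is a minimal simulation sketch (an editorial illustration, not part of the question; the choices $c_n = 1/n$, standard normal $X_n$, the truncation level, and the sample size are all assumptions):

```python
import numpy as np

# Illustrative (assumed) choices: c_n = 1/n, so that sum c_n diverges
# (harmonic series) while sum c_n^2 = pi^2/6 converges, and X_n ~ N(0, 1).
rng = np.random.default_rng(0)
N = 1_000         # truncation level of the series (arbitrary)
samples = 20_000  # number of Monte Carlo replicates (arbitrary)

c = 1.0 / np.arange(1, N + 1)
S = rng.standard_normal((samples, N)) @ c  # each entry ~ sum_{n<=N} c_n X_n

print("sample mean:    ", S.mean())      # expect: close to 0
print("sample variance:", S.var())       # expect: close to sum c_n^2
print("sum of c_n^2:   ", (c**2).sum())  # ~ pi^2/6 ≈ 1.645
```

With these choices the sample mean should come out near $0$ and the sample variance near $\pi^2/6$, matching the conjectured formulas.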

2 Answers


It is easy to see that $\sum c_i X_i$ converges in $L^2$. (For any $n \le m$, we have $E\left(\sum_{i=n}^m c_i X_i\right)^2 = \sum_{i=n}^m c_i^2$ by independence. As $n,m \to \infty$ this goes to 0, since $\sum c_i^2$ converges, so $\sum c_i X_i$ is $L^2$-Cauchy.)
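Spelled out (a routine expansion, added for completeness): by independence, $E(X_iX_j)=E(X_i)E(X_j)=0$ for $i\ne j$, so the cross terms vanish and
$$E\Bigl(\sum_{i=n}^m c_i X_i\Bigr)^2 = \sum_{i=n}^m c_i^2\,E(X_i^2) + 2\sum_{n\le i<j\le m} c_ic_j\,E(X_iX_j) = \sum_{i=n}^m c_i^2.$$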

It follows that the mean and variance of $X$, the $L^2$ limit of the sum, are as desired (0 and $\sum c_i^2$ respectively), since mean and variance are continuous with respect to $L^2$.
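Explicitly (standard bounds, added for completeness), if $S_n := \sum_{i=1}^n c_iX_i \to X$ in $L^2$, then
$$|E(S_n) - E(X)| \le E|S_n - X| \le \|S_n - X\|_2 \to 0, \qquad \bigl|\,\|S_n\|_2 - \|X\|_2\,\bigr| \le \|S_n - X\|_2 \to 0,$$
so $E(X) = \lim_n E(S_n) = 0$ and $E(X^2) = \lim_n E(S_n^2) = \sum_i c_i^2$, whence ${\rm Var}(X) = \sum_i c_i^2$.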

Edit: Thanks to the commenters for pointing out why almost sure convergence also follows (Kolmogorov's one-series theorem applies, since the $c_nX_n$ are independent with mean zero and $\sum_n {\rm Var}(c_nX_n) = \sum_n c_n^2 < \infty$).

  • I found a reference with this approach! It's Theorem 5.1 (p. 286) and Exercise 5.2 (p. 294) of Allan Gut's "Probability: A Graduate Course". There they show that if $X_n$ are i.i.d. with mean zero and finite variance, and $\sum c_n^2 < \infty$, then $\sum c_n X_n$ converges in $L^2$ (the Theorem) and in fact converges a.s. (the Exercise). (2011-08-04)

Lévy's continuity theorem is useful in this case. Let $\varphi(t) = E(e^{itX_j})$ and $g(t) = \log\varphi(t)$. We first show that $\widetilde{\varphi}(t) := \prod_j \varphi(c_jt) = \prod_j E(e^{itc_jX_j})$ converges pointwise. Since $E(X_j)=0$ and ${\rm Var}(X_j)=1$, we have
$$(*)\qquad g(t)=-\tfrac12t^2 + o(t^2).$$
Therefore, for any fixed $t$ and any $\epsilon>0$, we have
$$\left|g(c_jt)+\tfrac12c_j^2t^2\right| < \epsilon c_j^2t^2$$
whenever $c_j$ is small enough. Now, as $A:=\sum_j c_j^2<\infty$, for $N$ large we get $\sum_{j=N}^\infty c_j^2<\epsilon$, and every $c_j$ with $j\ge N$ is small. So, for $N\le m\le n$,
$$\left|\sum_{j=m}^n \left(g(c_j t) + \tfrac12 c_j^2t^2\right)\right| \le\sum_{j=m}^n\epsilon c_j^2t^2 <\epsilon At^2.$$
Thus $\left\{\sum_{j=1}^n \left(g(c_j t) + \tfrac12 c_j^2t^2\right)\right\}_n$ is a Cauchy sequence, and we conclude that $\sum_j g(c_jt)$, and hence $\prod_j \varphi(c_jt)$, converges pointwise.
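For completeness, $(*)$ is the standard second-order expansion of the characteristic function: since $E(X_j)=0$ and $E(X_j^2)=1$,
$$\varphi(t) = 1 + itE(X_j) - \tfrac12 t^2E(X_j^2) + o(t^2) = 1 - \tfrac12 t^2 + o(t^2),$$
and $\log(1+z) = z + O(z^2)$ as $z\to0$ gives $g(t) = \log\varphi(t) = -\tfrac12 t^2 + o(t^2)$.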

Next we show that $\widetilde{\varphi}$ is continuous at zero. Consider $\widetilde{g}(t) := \sum_j g(c_j t)$. Applying $(*)$ again, for any $\epsilon>0$ we have $\left|g(s)+\tfrac12s^2\right| < \epsilon s^2$ whenever $s$ is small; since $\sup_j c_j < \infty$ (the $c_j$ tend to $0$), all the $c_jt$ are simultaneously small once $t$ is small enough, and then
$$\left|\widetilde{g}(t)+\tfrac12At^2\right| = \Bigl|\sum_jg(c_jt)+\sum_j\tfrac12c_j^2t^2\Bigr| \le \sum_j\epsilon c_j^2t^2 = \epsilon At^2.$$
Hence $\widetilde{g}(t) = -\tfrac12At^2+o(t^2)$. In particular, $\widetilde{g}$ is continuous at zero, and so $\widetilde{\varphi}(t) = e^{\widetilde{g}(t)}$ is also continuous at $0$.

Therefore, by Lévy's continuity theorem, the partial sums of $X=\sum_j c_jX_j$ converge in distribution, and $\widetilde{\varphi}$ and $\widetilde{g}$ are, respectively, the characteristic function and the cumulant generating function of $X$. Since $\widetilde{g}(t) = -\frac12 At^2 + o(t^2)$, we conclude that $E(X)=0$ and ${\rm Var}(X)=A$.
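A quick numerical illustration of the expansion $\widetilde{g}(t) = -\frac12At^2 + o(t^2)$ (an editorial sketch; the Rademacher choice $X_j = \pm1$, for which $\varphi(t) = \cos t$, and the weights $c_j = 1/j$ are assumptions for the example):

```python
import numpy as np

# Check g~(t) = sum_j g(c_j t) ≈ -A t^2 / 2 for small t, with the (assumed)
# example X_j = ±1 (phi(t) = cos t) and c_j = 1/j (so A = sum c_j^2 ≈ pi^2/6).
J = 100_000  # truncation of the sum over j (arbitrary)
c = 1.0 / np.arange(1, J + 1)
A = (c**2).sum()

for t in (1.0, 0.5, 0.1, 0.01):
    g_tilde = np.log(np.cos(c * t)).sum()  # sum_j log phi(c_j t)
    print(f"t = {t:<5}  g~(t) = {g_tilde:+.8f}   -A t^2/2 = {-0.5 * A * t**2:+.8f}")
```

As $t$ decreases, the two columns should agree to more and more digits, consistent with $\widetilde{g}(t)/t^2 \to -\tfrac12A$.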

  • I think it's a bit easier to show that $\widetilde{\varphi}(t)$ converges pointwise. You just use the dominated convergence theorem (together with the a.s. convergence of the partial sums) to note that $\widetilde{\varphi}(t) = \mathbb{E}\left(\lim_{n \to \infty} \exp\left(it \sum^{n}_{k = 1}c_k X_k\right)\right) = \lim_{n \to \infty} \mathbb{E}\left(\exp\left(it \sum^{n}_{k = 1}c_k X_k\right)\right)$. By independence, this equals $\lim_{n \to \infty} \prod^{n}_{k = 1}\mathbb{E}(e^{i t c_k X_k}) = \prod^{\infty}_{k = 1}\mathbb{E}(e^{i t c_k X_k})$. (2011-08-04)