
Let $(X_n)_{n\ge1}$ be an infinite sequence of (not necessarily independent) random variables defined on the same probability space. Suppose that $P(X_n = 0\text{ eventually}) = 1$.

Consider, for every $n\ge1$, the sum $S_n=\sum\limits_{k=1}^nX_k^{(n)}$ of $n$ i.i.d. random variables $X_k^{(n)}$, each distributed as $X_n$.

May we conclude that $P(S_n = 0\text{ eventually}) = 1$?

Please note that $S_n$ is not defined as the partial sum $X_1+X_2+\cdots+X_n$.
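To make the setup concrete, here is a small Monte Carlo sketch under an illustrative assumption (not part of the question): take $X_n$ Bernoulli with $x_n = P(X_n = 1) = 1/n^2$, as in the Bernoulli example discussed in the comments. Then $S_n$ is a sum of $n$ i.i.d. copies of $X_n$, so $P(S_n = 0) = (1 - x_n)^n$ exactly, and the simulation should reproduce that value.

```python
import random

# Hypothetical example: X_n ~ Bernoulli(x_n) with x_n = 1/n**2.
# S_n is the sum of n i.i.d. copies of X_n, so S_n = 0 exactly when
# all n copies are 0, giving P(S_n = 0) = (1 - x_n)**n.
random.seed(0)

def estimate_p_sn_zero(n, trials=20000):
    """Monte Carlo estimate of P(S_n = 0)."""
    x_n = 1.0 / n**2
    zero = 0
    for _ in range(trials):
        # draw n i.i.d. Bernoulli(x_n) variables; S_n = 0 iff all are 0
        if all(random.random() >= x_n for _ in range(n)):
            zero += 1
    return zero / trials

n = 50
exact = (1.0 - 1.0 / n**2) ** n   # closed form, roughly e^{-1/n}
approx = estimate_p_sn_zero(n)    # should be close to `exact`
```

Note that $P(S_n = 0) \approx e^{-1/n} \to 1$, yet this does not by itself force $S_n = 0$ eventually, which is the crux of the question.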

  • Assume that every $X_n$ is Bernoulli with $P(X_n=1)=x_n=1-P(X_n=0)$ and that the sequence $(X_n)$ is independent. Then, by the Borel–Cantelli lemmas, $P(X_n=0\ \text{eventually})=1$ if and only if $\sum x_n$ converges, and $P(S_n=0)=(1-x_n)^n$. Assuming furthermore that the sequence $(S_n)$ is independent, one sees that $P(S_n=0\ \text{eventually})=1$ if and only if $\sum \bigl(1-(1-x_n)^n\bigr)$ converges. If $n^2x_n=\Theta(1)$, the former series converges but the latter diverges, hence the answer to your question is "No". (2017-01-11)
  • @Did: Come to think of it, why does $\sum\bigl(1 - (1-\frac{1}{n^2})^n\bigr)$ diverge? (2017-01-11)
  • Roughly speaking, because $(1-1/n^2)^n\approx e^{-1/n}$, at least in the sense that $1-(1-1/n^2)^n\sim1/n$, so the series compares to the harmonic series. (2017-01-11)
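The asymptotics claimed in the last comment can be checked numerically: with $x_n = 1/n^2$ (the $n^2x_n=\Theta(1)$ case), the terms $1-(1-1/n^2)^n$ should behave like $1/n$, and the partial sums should grow without bound, roughly like $\log N$. A minimal sketch:

```python
# Check that 1 - (1 - 1/n^2)^n ~ 1/n, so the series diverges like
# the harmonic series.
def term(n):
    return 1.0 - (1.0 - 1.0 / n**2) ** n

# Ratio of term(n) to 1/n should approach 1 as n grows.
ratios = [term(n) * n for n in (10, 100, 1000, 10**6)]

# Partial sums should keep growing, roughly like log(N):
# the increment from N = 1000 to N = 10**5 should be about log(100).
partial = 0.0
sums = {}
for n in range(1, 10**5 + 1):
    partial += term(n)
    if n in (1000, 10**5):
        sums[n] = partial
```

The ratios climb toward 1, and the partial-sum increment between $N=10^3$ and $N=10^5$ is close to $\log 100 \approx 4.6$, consistent with divergence.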

0 Answers