
Let $X_1,X_2,\dots$ be independent, symmetric random variables with characteristic functions $\varphi_{1},\varphi_{2},\dots$

Prove:

If $\varphi_{1},\varphi_{2},\dots$ are differentiable at zero, then $\frac{X_1+\dots +X_n}{n}$ converges in probability to zero.


My attempt:

Use the Weak Law of Large Numbers for non-identically distributed variables. For this I need:

  • the mean of $X_i$ is zero,
  • independence of the $X_i$,
  • finite variance.

The mean is zero because the variables are symmetric, and independence is given by assumption. So the finite variance should follow from the differentiability of $\varphi_n$. For the derivatives it holds that $ \varphi_n^{(k)}(t)=i^k\mathbb E\left(X_n^ke^{itX_n}\right) $, so I know that $ \varphi_n^{(1)}(0)=i\mathbb E(X_n) $ exists for every $n = 1,2,\dots$ But I don't get any information about the variance or the second moment.
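As a quick sanity check on the symmetry argument (my own illustration, not part of the thread): the characteristic function of a symmetric variable is real-valued and even, so $\varphi'(0)=i\,\mathbb E(X)=0$ whenever the derivative exists. For a Rademacher variable, $\varphi(t)=\cos t$:

```python
import cmath

def cf_rademacher(t):
    # phi(t) = E[exp(itX)] for X uniform on {-1, +1}
    return 0.5 * (cmath.exp(1j * t) + cmath.exp(-1j * t))

# symmetry => phi is real-valued (here phi(t) = cos t)
assert abs(cf_rademacher(0.7).imag) < 1e-12

# central-difference approximation of phi'(0) vanishes, so E X = 0
h = 1e-6
deriv0 = (cf_rademacher(h) - cf_rademacher(-h)) / (2 * h)
assert abs(deriv0) < 1e-6
```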

Hints? Thoughts?

  • *No, I'm just assuming independence and symmetry*... Then the result does not hold, for the reason previously explained. (Unrelated: please use @ to signal comments.) (2012-02-25)

1 Answer


If the only assumptions are independence and symmetry, the result is not true. For a counterexample, take $X_n=a_nY_n$ for your favorite sequence $(a_n)$ of real numbers such that $a_n\to\infty$ quickly, and your favorite i.i.d. sequence $(Y_n)$ of well-behaved random variables, say centered Bernoulli or standard normal.
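A numerical illustration of this counterexample (my concrete choices, not from the answer): take $a_n=3^n$ and $Y_n$ Rademacher. Then $|S_n|\ge 3^n-\sum_{k<n}3^k=(3^n+3)/2$ whatever the signs, so $|S_n/n|$ grows without bound deterministically:

```python
import random

def sn_over_n(n, seed=0):
    """Simulate S_n / n for X_k = 3**k * Y_k with Y_k uniform on {-1, +1}."""
    rng = random.Random(seed)
    s = sum((3 ** k) * rng.choice([-1, 1]) for k in range(1, n + 1))
    return s / n

# the last term dominates: |S_n| >= 3**n - (3**n - 3)/2 = (3**n + 3)/2,
# so |S_n / n| >= (3**n + 3) / (2 * n), which tends to infinity
```

For instance, for $n=10$ the bound is $(3^{10}+3)/20\approx 2952$, regardless of the realized signs, so $S_n/n$ cannot converge in probability to zero.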

  • To sum up, the edit history of the question shows that a perfectly valid question (every i.i.d. sequence with this hypothesis on the common characteristic function satisfies a weak law of large numbers) was later transformed into an incorrect statement: Edit #2 dropped the equidistribution hypothesis, without which the result does not hold. The OP might have been interested in restoring the original (correct) version of the question, but this never happened. (2012-04-28)
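For reference, a sketch (my summary, not from the thread) of the i.i.d. version mentioned in this comment, assuming a common characteristic function $\varphi$ differentiable at $0$: writing $S_n=X_1+\dots+X_n$,
$$\varphi_{S_n/n}(t)=\varphi\!\left(\tfrac tn\right)^{\!n},\qquad \varphi\!\left(\tfrac tn\right)=1+\varphi'(0)\,\tfrac tn+o\!\left(\tfrac 1n\right)\quad(n\to\infty).$$
By symmetry $\varphi$ is real and even, hence $\varphi'(0)=0$, so $\varphi(t/n)^n\to 1$ for every fixed $t$. Since $1$ is the characteristic function of the constant $0$, Lévy's continuity theorem gives $S_n/n\to 0$ in probability.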