Let $X_1,X_2,\dots$ be independent, symmetric random variables with characteristic functions $\varphi_{1},\varphi_{2},\dots$
Prove:
If $\varphi_{1},\varphi_{2},\dots$ are differentiable at zero, then $\frac{X_1+\dots +X_n}{n}$ converges in probability to zero.
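As a quick numerical sanity check (illustrative only, not part of any proof): the standard normal has characteristic function $e^{-t^2/2}$, which is differentiable at zero, while the standard Cauchy has $e^{-|t|}$, which is not — and the running averages behave accordingly. The distribution choices are mine, just to see the statement in action:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Normal: characteristic function exp(-t^2/2), differentiable at 0.
x = rng.standard_normal(n)
# Cauchy: characteristic function exp(-|t|), NOT differentiable at 0.
y = rng.standard_cauchy(n)

k = np.arange(1, n + 1)
avg_normal = np.cumsum(x) / k  # running averages (X_1 + ... + X_n) / n
avg_cauchy = np.cumsum(y) / k

print(abs(avg_normal[-1]))  # small: averages settle near 0
print(abs(avg_cauchy[-1]))  # typically erratic: no convergence in probability
```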
My attempt:
Use the Weak Law of Large Numbers for non-identically distributed variables. For this I need:
- the mean of $X_i$ is zero,
- independence of the $X_i$,
- finite variance.
The mean is zero whenever it exists, because the variables are symmetric. Independence is given by assumption. So the finite variance would have to come from the differentiability of $\varphi_n$. For the derivatives it holds that $ \varphi_n^{(k)}(t)=i^k\mathbb E\left(X_n^ke^{itX_n}\right), $ but this formula is only valid when $\mathbb E|X_n|^k<\infty$; differentiability of $\varphi_n$ at zero alone does not even guarantee that $\mathbb E(X_n)$ exists. In any case, I get no information about the variance or the second moment.
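To see why the variance route cannot work, here is a small simulation (the distribution is my choice, purely illustrative): a symmetrized Pareto-type variable with tail index $\alpha = 1.5$ has a finite mean, hence an everywhere-differentiable characteristic function, yet infinite variance. Its running averages still settle near zero, while the empirical variance never stabilizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
alpha = 1.5  # tail index: E|X| < infinity, but E[X^2] = infinity

# Symmetric heavy-tailed variable: random sign times a Pareto-type sample.
signs = rng.choice([-1.0, 1.0], size=n)
x = signs * rng.pareto(alpha, size=n)

k = np.arange(1, n + 1)
running_mean = np.cumsum(x) / k
running_var = np.cumsum(x**2) / k - running_mean**2  # empirical variance

print(running_mean[-1])                 # close to 0: WLLN still holds
print(running_var[n // 100], running_var[-1])  # variance estimate keeps drifting
```

So even if differentiability of $\varphi_n$ at zero gave a first moment (it does not in general), a Chebyshev/variance-based WLLN would still be out of reach; a proof has to work with $\varphi_n$ directly.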
Hints? Thoughts?