
Let $X_1,X_2,\dots$ be independent, symmetric random variables with characteristic functions $\varphi_{1},\varphi_{2},\dots$

Prove:

If $\varphi_{1},\varphi_{2},\dots$ are differentiable at zero, then $\frac{X_1+\dots +X_n}{n}$ converges in probability to zero.


My attempt:

Use the Weak Law of Large Numbers for independent, non-identically distributed variables (a precise statement follows the list). For this I need

  • the mean of $X_i$ is zero,
  • independence of the $X_i$,
  • finite variance.
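
To be precise, the version of the WLLN I have in mind is the Chebyshev-type one for independent, not necessarily identically distributed variables: if $\mathbb E(X_i)=0$ for all $i$ and $\frac{1}{n^2}\sum_{i=1}^n\operatorname{Var}(X_i)\to 0$, then Chebyshev's inequality gives, for every $\varepsilon>0$, $$ \mathbb P\left(\left|\frac{X_1+\dots+X_n}{n}\right|>\varepsilon\right)\le\frac{1}{n^2\varepsilon^2}\sum_{i=1}^n\operatorname{Var}(X_i)\longrightarrow 0. $$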

The mean is zero because the variables are symmetric, and independence is given. So I expect the finite variance to follow from the differentiability of $\varphi_n$. For the derivatives it holds that $$ \varphi_n^{(k)}(t)=i^k\mathbb E\left(X_n^ke^{itX_n}\right), $$ so I know that $$ \varphi_n^{(1)}(0)=i\,\mathbb E(X_n) $$ exists for every $n = 1,2,\dots$ But this gives me no information about the variance or the second moment.
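
One consequence of the hypotheses alone, for what it's worth (a short sketch using no moment assumptions): since $X_n$ is symmetric, $\varphi_n(t)=\mathbb E\left(\cos(tX_n)\right)$ is real-valued and even. So if $\varphi_n$ is differentiable at $0$, then $$ \varphi_n'(0)=\lim_{h\to 0}\frac{\varphi_n(-h)-\varphi_n(0)}{-h}=\lim_{h\to 0}\frac{\varphi_n(h)-\varphi_n(0)}{-h}=-\varphi_n'(0), $$ hence $\varphi_n'(0)=0$, and this needs no assumption that $\mathbb E(X_n)$ exists.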

Hints? Thoughts?

  • What is $\varphi$? (2012-02-21)
  • $\varphi$ is the characteristic function. So the precondition is that all characteristic functions of $X_1,\dots$ are differentiable at $0$. (2012-02-21)
  • The characteristic function of what? I mean, $\varphi_{1}$ is the characteristic function of $X_1$, $\varphi_{2}$ is the characteristic function of $X_2$, etc. But $\varphi$ is the characteristic function of nothing. (2012-02-21)
  • Be careful. Having a symmetric distribution implies that *if the mean exists* it must be 0. There are symmetric distributions with no mean, such as the [standard Cauchy distribution](http://en.wikipedia.org/wiki/Cauchy_distribution). And you are going to have to do without the condition of finite variance; it won't follow from the conditions you have. (2012-02-21)
  • I hope I fixed the notation with $\varphi_n$ - please take a look and check that it is what you wanted to say. (2012-02-21)
  • So I can't conclude $E(X_n)=0$ directly from the symmetry. But I can then conclude it from $E(X_n)=i^{-1}\varphi^{(1)}_n(0)$ and the symmetry, can't I? Because $|\varphi^{(1)}_n(0)|<\infty$. (2012-02-21)
  • Yes, but you should look carefully at your proof that $E(X_n)=i^{-1} \varphi^{(1)}_n(0)$ to make sure that it doesn't *assume* that $E(X_n)$ exists. (2012-02-21)
  • Hmmm... Is this result true? Assume $X_n=a_nY_n$ for your favorite sequence $(a_n)$ of real numbers with $a_n\to\infty$ extremely fast, and your favorite sequence $(Y_n)$ of i.i.d. well-behaved random variables, say Bernoulli or Gaussian. (A concrete instance is worked out after this thread.) (2012-02-21)
  • Hmmm (bis)... I just checked the previous version(s) of the question and it seems that some later interventions might have modified the text too drastically: are you in fact assuming that the random variables $X_n$ are i.i.d.? (2012-02-21)
  • No, I'm just assuming independence and symmetry. (2012-02-21)
  • Were there interventions putting forward an i.i.d. condition? (2012-02-22)
  • I found a paper considering the relation between the mean and the derivative of the characteristic function: http://projecteuclid.org/DPubS/Repository/1.0/Disseminate?view=body&id=pdf_1&handle=euclid.aoms/1177730443 ... But there should be a simpler solution to the problem. (2012-02-23)
  • *No, I'm just assuming independence and symmetry*... Then the result does not hold, for the reason previously explained. (Unrelated: please use @ to signal comments.) (2012-02-25)
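
To make the counterexample sketched in the comments concrete (one possible choice; any sequence $(a_n)$ growing fast enough would do): take $Y_1,Y_2,\dots$ i.i.d. standard normal and $X_n=n^2Y_n$. Each $X_n$ is symmetric with characteristic function $$ \varphi_n(t)=\exp\left(-\tfrac{1}{2}n^4t^2\right), $$ which is differentiable (indeed smooth) at $0$. Yet $\frac{X_1+\dots+X_n}{n}\sim N\left(0,\frac{1}{n^2}\sum_{k=1}^n k^4\right)$ with $\frac{1}{n^2}\sum_{k=1}^n k^4\to\infty$, so $\mathbb P\left(\left|\frac{X_1+\dots+X_n}{n}\right|>\varepsilon\right)\to 1$ for every $\varepsilon>0$. This confirms the last comment: without an i.i.d. (or comparable) assumption the claim fails.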

1 Answer