
If $X_1,X_2,...$ are i.i.d. random variables such that $\phi (t)=E(e^{itX_{1}})=e^{-c|t|^{\alpha }}$, where $c>0$ and $\alpha\in(0,2]$, then show:

  1. For $1<\alpha\leq2$, show that $\frac{X_1+...+X_n}{n}$ converges to $0$ in probability.
  2. For $\alpha=1$, show that $\frac{X_1+...+X_n}{n}$ does not converge to a constant in probability.
  3. For $0<\alpha<1$, show that the distribution of $\frac{X_1+...+X_n}{n}$ does not converge weakly to any probability measure.
  4. For any $\alpha\in(0,2]$, show that there is some $\beta$ such that the distribution of $\frac{X_1+...+X_n}{n^{\beta}}$ does converge to some non-degenerate probability measure.
  • The first three parts were listed as properties in a theorem, but a proof was not given. I've been able to prove 1, and have some idea for 2, but am unsure of how to prove 3. Part 4 was given as an exercise after the theorem, and I think $\beta=1/2$ works, but am not positive about that.

1 Answer


Fix $\alpha$ in $(0,2]$. The $\alpha$-stability and symmetry of $X_1, X_2, \ldots$ imply that $S_n:=X_1 + \ldots +X_n\stackrel{d}{=} n^{1/\alpha}X$ for every $n\geq 1$, where $X$ has the same distribution as $X_1$ and $\stackrel{d}{=}$ denotes equality in distribution. Therefore $S_n/n\stackrel{d}{=}n^{1/\alpha -1}X=n^{\gamma}X$ with $\gamma := (1-\alpha)/\alpha$.
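For completeness, here is the short characteristic-function computation behind this identity: since the $X_i$ are i.i.d.,
$$\mathbf{E}\,e^{itS_n}=\phi(t)^n=e^{-cn|t|^{\alpha}}=e^{-c\,|n^{1/\alpha}t|^{\alpha}}=\mathbf{E}\,e^{it\,n^{1/\alpha}X},\qquad t\in\mathbb{R},$$
so $S_n$ and $n^{1/\alpha}X$ have the same characteristic function and hence the same distribution.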

Further, recall that for any $\alpha$-stable random variable $X$ we have $\mathbf{E}|X|^p<\infty$ for every $0<p<\alpha$ (and, in the Gaussian case $\alpha=2$, for every $p>0$); in particular, for $\alpha>1$ the mean $\mathbf{E}X$ exists and equals $0$ by symmetry.

Now we can solve the four exercises.

1) Here $\gamma \in [-1/2, 0)$ and $\mathbf{E}X=0$. For convergence in probability to $0$ we have to show that for every $\varepsilon>0$, $\lim_{n\to\infty}\mathbf{P}\{|n^{\gamma}X|>\varepsilon\}=0$. Suppose this failed for some $\varepsilon>0$; then $\limsup_{n\to\infty}\mathbf{P}\{|X|>\varepsilon n^{-\gamma}\}>0$. Since $\gamma<0$, the threshold $\varepsilon n^{-\gamma}$ eventually exceeds any fixed natural number as $n$ grows. But no probability measure on the real line can satisfy $\limsup_{m\to\infty}\mathbf{P}\{|X|>m\}>0$, since $\mathbf{P}\{|X|>m\}\to 0$ as $m\to\infty$. A quantitative variant of this argument is sketched below.
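If one prefers an explicit bound, here is a sketch using the moment fact recalled above (any fixed choice of $p\in(0,\alpha)$ works): by Markov's inequality,
$$\mathbf{P}\{|n^{\gamma}X|>\varepsilon\}\;\le\;\frac{\mathbf{E}|n^{\gamma}X|^{p}}{\varepsilon^{p}}\;=\;\frac{n^{\gamma p}\,\mathbf{E}|X|^{p}}{\varepsilon^{p}}\;\xrightarrow[n\to\infty]{}\;0,$$
since $\gamma<0$ and $\mathbf{E}|X|^{p}<\infty$.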

2) $\gamma = 0$, so $S_n/n\stackrel{d}{=}X$, and since $X$ is not constant (because $c>0$), neither is $S_n/n$. The key point is that the distribution of $S_n/n$ is the distribution of $X$ for every $n$: if $S_n/n$ converged to a constant $a$ in probability, its laws would converge weakly to the point mass at $a$, contradicting the fact that they all equal the non-degenerate law of $X$. This is a statistician's nightmare in a sense, since taking the mean is of no use.
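For a concrete feel for this, here is a small simulation sketch (not part of the argument above): for $\alpha=1$ and $c=1$ the $X_i$ are standard Cauchy, and the running means keep fluctuating instead of settling on a constant. The sample size and seed below are arbitrary choices.

```python
import numpy as np

# For alpha = 1, c = 1 the characteristic function e^{-|t|} is that of the
# standard Cauchy distribution, so the X_i can be simulated directly.
rng = np.random.default_rng(0)
x = rng.standard_cauchy(100_000)

# Running means (X_1 + ... + X_n) / n.
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

# Unlike the alpha > 1 case, these values do not stabilise as n grows.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, running_mean[n - 1])
```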

3) $\gamma>0$, and the law of $S_n/n$ is the law of $n^{\gamma}X$, which does not converge weakly as $n$ grows: for every fixed $M>0$, $\mathbf{P}\{|n^{\gamma}X|\leq M\}=\mathbf{P}\{|X|\leq Mn^{-\gamma}\}\to\mathbf{P}\{X=0\}=0$ as $n\to\infty$, so the sequence of laws is not tight, whereas any weakly convergent sequence of probability measures must be tight.

4) Here $n^{-\beta}S_n\stackrel{d}{=}n^{1/\alpha-\beta}X$. Choose $\beta = 1/\alpha$, so that $n^{-\beta}S_n\stackrel{d}{=}X$. The distribution of $n^{-\beta}S_n$ is then the non-degenerate distribution of $X$ for every $n$, so the sequence of laws is constant and certainly converges weakly. In particular, for $\alpha=2$ this gives $\beta=1/2$, the familiar central limit theorem scaling.

  • How would your answer to the 4th part change if the characteristic function were $e^{-t^2 - \sqrt{|t|}}$?