
I'm trying to prove the following result but am stuck on one of the steps.

Assume $X_1,\ldots,X_n,\ldots$ is a sequence of Gaussian random variables converging almost surely to $X$. Prove that $X$ is Gaussian.

We use characteristic functions here. Write $X_n\sim\mathcal N(\mu_n,\sigma_n^2)$, so that $\phi_{X_n}(t)=e^{it\mu_n-t^2\sigma_n^2/2}$. Since $|e^{itX_n}|\leq 1$, the dominated convergence theorem gives, for any $t$,

$$ \lim_{n\rightarrow\infty}e^{it\mu_n-t^2\sigma_n^2/2}=\lim_{n\rightarrow \infty}\phi_{X_n}(t) = \lim_{n\rightarrow \infty}\mathbb{E}\left[e^{itX_n}\right] = \mathbb{E}\left[e^{itX}\right] = \phi_X(t) $$
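(As a sanity check, not a proof: here is a minimal Python sketch of this convergence. On a common sample space take $X_n=\mu_n+\sigma_n Z$ with a single $Z\sim\mathcal N(0,1)$, so $X_n\to X$ almost surely; all parameter choices below are arbitrary illustrations.)

```python
# Minimal numerical sketch (not part of the proof): on a common sample space,
# take X_n = mu_n + sigma_n * Z with Z ~ N(0,1), so X_n -> X := mu + sigma*Z
# almost surely, and compare empirical characteristic functions.
# All names (mu_n, sigma_n, t_grid, ...) are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal(100_000)          # one draw, shared by every X_n

def emp_cf(samples, t):
    """Empirical characteristic function E[exp(itX)] estimated by a sample mean."""
    return np.mean(np.exp(1j * t * samples[:, None]), axis=0)

t_grid = np.linspace(-3, 3, 7)
mu, sigma = 1.0, 2.0
X = mu + sigma * Z
for n in (1, 10, 100):
    mu_n, sigma_n = mu + 1.0 / n, sigma + 1.0 / n   # mu_n -> mu, sigma_n -> sigma
    X_n = mu_n + sigma_n * Z                        # X_n -> X pointwise in omega
    err = np.max(np.abs(emp_cf(X_n, t_grid) - emp_cf(X, t_grid)))
    print(f"n={n:4d}  max_t |phi_Xn(t) - phi_X(t)| ~ {err:.4f}")
```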

This is the step that I cannot figure out: $e^{it\mu_n-t^2\sigma_n^2/2}$ converges for every $t$ if and only if $\mu_n$ and $\sigma_n$ converge.

Let $\mu=\lim_n \mu_n$ and $\sigma=\lim_n\sigma_n$; then $\phi_X(t)=e^{it\mu-t^2\sigma^2/2}$, which proves that $X$ is a Gaussian random variable.

Why does it follow that $\mu_n$ and $\sigma_n$ converge? This looks intuitive to me, but I cannot make the proof rigorous.

  • Try using the continuous mapping theorem, which states that if $x_n \to x$ and $g$ is continuous, then $g(x_n) \to g(x)$. (2012-11-08)
  • @jay, thanks for your reply. With the continuous mapping property I can show that if $\mu_n\rightarrow \mu$ and $\sigma_n\rightarrow \sigma$, then $\exp(it\mu_n-t^2\sigma_n^2/2)\rightarrow \exp(it\mu-t^2\sigma^2/2)$. But I don't know how to get the converse. (2012-11-08)
  • Apply this to all possible limits of subsequences. Also, try to make heavy use of the fact that the exponent $it\mu_n-t^2\sigma_n^2/2$ converges *for all* $t$. (2012-11-08)
  • @Berci, can you give a little more of a hint? I'm stuck on what convergence **for all $t$** actually gives us. (2012-11-08)

2 Answers

  • First, we note that the sequences $\{\sigma_n\}$ and $\{\mu_n\}$ have to be bounded. This is a consequence of what was done in this thread, since in particular we have convergence in law. What we use is the following:

If $(X_n)_n$ is a sequence of random variables converging in distribution to $X$, then for each $\varepsilon>0$ there is an $R$ such that for each $n$, $\mathbb P(|X_n|\geqslant R)\lt \varepsilon$ (tightness).

To see this, we may assume that $X_n$ and $X$ are non-negative (otherwise consider their absolute values). Let $F_n$ and $F$ be the cumulative distribution functions of $X_n$ and $X$. Take $t$ such that $F(t)\gt 1-\varepsilon$ and $t$ is a continuity point of $F$. Then $F_n(t)\gt 1-\varepsilon$ for $n\geqslant N$, for some $N$, and a finite collection of random variables is always tight.
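For intuition only (not part of the argument), a minimal Python sketch of the tightness statement in the Gaussian case: with $X_n\sim\mathcal N(\mu_n,\sigma_n^2)$ and bounded parameter sequences, a single $R$ works for every $n$ at once. The parameter sequences and tolerance below are arbitrary illustrative choices.

```python
# Minimal numerical sketch of the tightness claim, assuming X_n ~ N(mu_n, sigma_n^2)
# with bounded parameters: one R bounds all the tails simultaneously.
# The parameter sequences below are illustrative, not from the thread.
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def tail_prob(mu, sigma, R):
    """P(|X| >= R) for X ~ N(mu, sigma^2)."""
    return 1.0 - (Phi((R - mu) / sigma) - Phi((-R - mu) / sigma))

eps = 1e-3
mu_seq = [1.0 + 1.0 / n for n in range(1, 1001)]    # bounded: mu_n -> 1
sig_seq = [2.0 + 1.0 / n for n in range(1, 1001)]   # bounded: sigma_n -> 2

R = 1.0
while max(tail_prob(m, s, R) for m, s in zip(mu_seq, sig_seq)) >= eps:
    R += 0.5
print(f"R = {R} gives sup_n P(|X_n| >= R) < {eps}")
```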

  • Now, fix an arbitrary strictly increasing sequence $\{n_k\}$. Since $\{\sigma_{n_k}\}$ and $\{\mu_{n_k}\}$ are bounded, by Bolzano–Weierstrass we can extract further subsequences which converge, say to $\sigma$ and $\mu$ respectively. Taking the modulus, we see that $e^{-\sigma^2/2}=|\varphi_X(1)|$, so $\sigma$ is uniquely determined, independently of the chosen subsequence.
  • We then have $e^{it\mu}=\varphi_X(t)e^{t^2\sigma^2/2}$ for all $t\in\Bbb R$, so $\mu$ is also completely determined. Since every subsequence of $\{(\mu_n,\sigma_n)\}$ thus has a further subsequence converging to the same limit $(\mu,\sigma)$, the full sequences converge; a numerical sketch of this recovery follows.
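Again for intuition only, a minimal Python sketch of these last two steps: given the limiting characteristic function $\varphi_X$, the modulus at $t=1$ pins down $\sigma$, and the phase of $\varphi_X(t)e^{t^2\sigma^2/2}$ at a small $t$ pins down $\mu$. The "true" values below are arbitrary test inputs, not anything from the thread.

```python
# Minimal sketch of the last two bullets (illustration, not a proof): recover
# sigma from |phi_X(1)| and mu from the phase of phi_X(t) * exp(t^2 sigma^2 / 2).
import numpy as np

mu_true, sigma_true = 1.5, 0.7   # arbitrary test inputs

def phi_X(t):
    """Characteristic function of N(mu_true, sigma_true^2)."""
    return np.exp(1j * t * mu_true - t**2 * sigma_true**2 / 2)

# sigma from the modulus at t = 1: |phi_X(1)| = exp(-sigma^2 / 2)
sigma = np.sqrt(-2.0 * np.log(np.abs(phi_X(1.0))))

# mu from the phase: e^{it mu} = phi_X(t) e^{t^2 sigma^2 / 2}; take t small
# enough that t*mu lies in (-pi, pi), so the principal argument recovers it.
t = 0.1
mu = np.angle(phi_X(t) * np.exp(t**2 * sigma**2 / 2)) / t

print(sigma, mu)   # ~0.7, ~1.5
```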
  • Thanks Davide, that clarified all my puzzles! (2012-11-09)
  • Why does convergence in law imply tightness? (2014-01-15)
  • @AndyTeich See edit. (2014-01-15)
  • @René If the sequence $(\sigma_n)$ does not converge to $\sigma$, then for some $\delta>0$ we have $|\sigma_n-\sigma|\gt\delta$ for infinitely many $n$. This gives a subsequence no further subsequence of which converges to $\sigma$. (2015-11-06)
  • Yes, I see it now. Sorry for deleting my question (I had gotten there at the exact same time your comment appeared -- let me put the question back in). (2015-11-06)
  • [I remarked that Davide's answer seemed to prove only that every convergent *subsequence* of $\{ \sigma_n \}$ converges to the same $\sigma$, so it implicitly uses the fact that if all convergent subsequences of a bounded sequence tend to the same value, then the sequence itself converges. But this is quite clearly the case.] (2015-11-06)