
Let $X_1,\dots,X_n$ be independent random variables, each normally distributed as $X_k\sim N(m_k;\sigma^2_k)$. Let $S_n = \sum_{k=1}^n (X_k - m_k)$ and $T_n = \frac{S_n}{\sqrt{\operatorname{Var}(S_n)}}$.

We wish to show that $T_n$ converges in distribution to $N(0;1)$ as $n\to\infty$.

I've laboriously proven that $T_n \sim N(m; \sigma^2)$ where $m=\sum_k m_k$ and $\sigma^2=\sum_k \sigma^2_k$. So now I'm wondering if I'm supposed to assume that $m\to 0$ and similarly $\sigma^2\to 1$ as $n\to\infty$, which I don't really see a strong justification for.

Is this what I'm supposed to do? What's the justification for it?

(Note: this is exercise 14.31.7 in Apostol's Calculus II. Looking online, it seems some other sources don't state the CLT as convergence to the standard normal, just to a normal distribution with some mean and variance. So this could be an unusual formulation he's using.)

  • @SamL: That is indeed what I did, thanks! (2012-05-26)

2 Answers

1

You don't need these assumptions. In fact, $E[S_n]=0$ and $\operatorname{Var}(T_n)=1$. Furthermore, we know that if $Y_1$ and $Y_2$ are independent and normally distributed, say $Y_1\sim \mathcal N(m_1,\sigma_1^2)$ and $Y_2\sim \mathcal N(m_2,\sigma_2^2)$, then $Y_1+Y_2\sim\mathcal N(m_1+m_2,\sigma_1^2+\sigma_2^2)$. So by induction we deduce that $T_n\sim\mathcal N(0,1)$ for every $n$. Convergence in law now follows from the definition.
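
One way to spell out the computation behind these claims, using only the definitions in the question and the independence of the $X_k$:
$$
E[S_n]=\sum_{k=1}^n E[X_k-m_k]=0,
\qquad
\operatorname{Var}(S_n)=\sum_{k=1}^n \operatorname{Var}(X_k)=\sum_{k=1}^n \sigma_k^2,
$$
so $S_n\sim\mathcal N\!\left(0,\sum_{k=1}^n\sigma_k^2\right)$ by the induction above, and dividing a centered normal by its standard deviation gives $T_n=S_n/\sqrt{\operatorname{Var}(S_n)}\sim\mathcal N(0,1)$ exactly, for every $n$.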

0

Each $X_k-m_k$ is normal with mean $0$, so $S_n\sim N(0, \operatorname{Var}(S_n))$, and therefore $\frac{S_n}{\sqrt{\operatorname{Var}(S_n)}}\sim N(0,1)$.
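
As a quick sanity check, here is a minimal simulation sketch of this fact; the particular values of $m_k$ and $\sigma_k$ below are arbitrary choices for illustration, not taken from the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50                                  # number of summands X_1, ..., X_n
m = rng.uniform(-5, 5, size=n)          # hypothetical means m_k (arbitrary)
sigma = rng.uniform(0.5, 3.0, size=n)   # hypothetical std devs sigma_k (arbitrary)

trials = 100_000
X = rng.normal(loc=m, scale=sigma, size=(trials, n))  # each row is one draw of (X_1, ..., X_n)
S = (X - m).sum(axis=1)                               # S_n = sum_k (X_k - m_k)
T = S / np.sqrt(np.sum(sigma**2))                     # T_n = S_n / sqrt(Var(S_n))

print(T.mean(), T.std())  # should come out close to 0 and 1, consistent with N(0, 1)
```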