We prove the central limit theorem using characteristic functions. If the $X_i$ are independent but not identically distributed, is there a weaker condition that still yields convergence to the normal distribution?
About the central limit theorem
2 Answers
For example, suppose the $X_i$ are independent with $E X_i = 0$, $\text{Var}(X_i) = \sigma_i^2$, and $\lim_{n \to \infty} \frac{1}{\sigma(n)^3} \sum_{i=1}^n E[|X_i|^3] = 0$, where $\sigma(n)^2 = \text{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \sigma_i^2$. Then $\displaystyle \frac{1}{\sigma(n)} \sum_{i=1}^n X_i$ converges in distribution to a standard normal random variable.
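As a quick numerical sanity check (not part of the answer itself), here is a minimal simulation sketch using a hypothetical family $X_i \sim \text{Uniform}(-c_i, c_i)$ with $c_i = i^{1/4}$. For this family $\sigma_i^2 = c_i^2/3$ and $E|X_i|^3 = c_i^3/4$, so the third-moment ratio above is of order $n^{-1/2}$ and the condition holds.

```python
import numpy as np
from scipy import stats

# Hypothetical illustration: X_i ~ Uniform(-c_i, c_i) with c_i = i**0.25,
# so the X_i are independent but not identically distributed,
# Var(X_i) = c_i**2 / 3 and E|X_i|**3 = c_i**3 / 4.
rng = np.random.default_rng(0)
n, reps = 2000, 2000
c = np.arange(1, n + 1) ** 0.25
sigma_n = np.sqrt(np.sum(c ** 2 / 3))      # sigma(n)

# Third-moment (Lyapunov, delta = 1) ratio; it decays like n**(-1/2).
ratio = np.sum(c ** 3 / 4) / sigma_n ** 3
print("third-moment ratio:", ratio)

# Simulate the normalized sums and compare them with N(0, 1).
sums = rng.uniform(-c, c, size=(reps, n)).sum(axis=1) / sigma_n
print("KS distance to N(0,1):", stats.kstest(sums, "norm").statistic)
```

With these (arbitrary) parameters the printed ratio is small and the empirical distribution of the normalized sums is close to standard normal, in line with the statement above.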
Following Robert Israel's notation elsewhere in this thread, we assume $\mathbb{E}(X_i) = 0$ and set $\sigma_i^2 = \text{Var}(X_i)$ and $\sigma(n)^2 = \sum_{i=1}^n \sigma_i^2$.
Lindeberg's condition: For every $\epsilon > 0$, $\lim_{n\to\infty} \frac{1}{\sigma(n)^2} \sum_{i=1}^n \mathbb{E}\left(X_i^2 \, 1_{\{ |X_i| \ge \epsilon \sigma(n)\}}\right) = 0$.
Lyapunov's condition: There exists $\delta > 0$ such that $\lim_{n\to \infty}\frac{1}{\sigma(n)^{2 + \delta}} \sum_{i=1}^n \mathbb{E} \left( |X_i|^{2 + \delta}\right) = 0$.
The Lyapunov condition implies the Lindeberg condition, and the Lindeberg condition implies the conclusion of the central limit theorem, i.e., that $\frac{\sum_{i=1}^n X_i}{\sigma(n)} \Rightarrow N(0,1)$, where $N(0,1)$ is the standard normal distribution. (Robert Israel's answer presents the Lyapunov condition with $\delta = 1$.)
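As a supplementary remark (not part of the original answer): the standard argument for the first implication is the pointwise bound $X_i^2 \le |X_i|^{2+\delta}/(\epsilon \sigma(n))^{\delta}$ on the event $\{|X_i| \ge \epsilon \sigma(n)\}$, which gives

$$\frac{1}{\sigma(n)^2}\sum_{i=1}^n \mathbb{E}\left(X_i^2 \, 1_{\{|X_i| \ge \epsilon \sigma(n)\}}\right) \;\le\; \frac{1}{\epsilon^{\delta}\,\sigma(n)^{2+\delta}}\sum_{i=1}^n \mathbb{E}\left(|X_i|^{2+\delta}\right),$$

so the right-hand side tends to $0$ under Lyapunov's condition, and hence the Lindeberg sum does as well.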