
I have a sequence of $n$ random variables, each drawn from a different distribution: $X_1\sim A_1, X_2\sim A_2, \ldots, X_n\sim A_n$. The distributions $A_1, A_2, \ldots, A_n$ have nice properties: finite mean and variance, as well as finite third and fourth moments, and each is supported on all of $\mathbb{R}$. Unfortunately, besides the moments and the support, I am not given any more information about $A_1,A_2,\ldots,A_n$.

I am interested in studying the convergence of the normalized sum of these random variables,

$\frac{1}{s_n}\sum_{i=1}^n(X_i-\mu_i),$

where $\mu_i$ is the mean of $X_i$ and $s_n$ is defined below;

in particular, I would like to know whether knowledge of the moments of the $X_i$'s is sufficient to show that the sequence satisfies Lindeberg's condition. If it satisfies the condition, then the CLT applies to the summation. I think that I understand the formula for the condition:

$\lim_{n\rightarrow\infty}\frac{1}{s_n^2}\sum_{k=1}^n\int_{\{|x-\mu_k|>\epsilon s_n\}}(x-\mu_k)^2f_k(x)\,dx=0$ for every $\epsilon>0$,

where $s_n^2=\sum_{k=1}^n\sigma_k^2$ and $f_k$ is the density of $X_k$.

(I wrote the formula as a Riemann integral; please let me know if I made a mistake. The Wikipedia article uses a Lebesgue integral, but I am not as comfortable with those.)
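
For concreteness, here is the one special case I think I can work out myself (please correct me if this is wrong): if the $X_i$ are i.i.d. with common density $f$, mean $\mu$, and variance $\sigma^2$, then $s_n^2=n\sigma^2$ and every summand is identical, so the whole expression reduces to

$\frac{1}{\sigma^2}\int_{\{|x-\mu|>\epsilon\sigma\sqrt{n}\}}(x-\mu)^2f(x)\,dx,$

and since the full integral $\int_{\mathbb{R}}(x-\mu)^2f(x)\,dx=\sigma^2$ is finite, this tail integral must vanish as $n\rightarrow\infty$.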

If moments are not sufficient, then what additional information about the $X_i$'s do I need? If they are sufficient, how do I use them to prove that the condition is satisfied? I think I understand the intuition behind the condition: no single random variable in the sequence can "overwhelm" the variance of the sum -- is that the correct intuition?
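
To make sure I am reading the formula correctly, I also wrote a small numerical sanity check (a sketch in Python, assuming normal $A_k$ purely for illustration; `lindeberg_ratio` is a hypothetical helper of mine, not a library function):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def lindeberg_ratio(mus, sigmas, eps):
    """(1/s_n^2) * sum over k of the integral of (x - mu_k)^2 f_k(x) dx
    over the tail region |x - mu_k| > eps * s_n, with each f_k taken to
    be a normal density just for this illustration."""
    s_n = np.sqrt(np.sum(np.square(sigmas)))
    total = 0.0
    for mu, sigma in zip(mus, sigmas):
        pdf = norm(loc=mu, scale=sigma).pdf
        integrand = lambda x, m=mu, f=pdf: (x - m) ** 2 * f(x)
        left, _ = quad(integrand, -np.inf, mu - eps * s_n)   # left tail
        right, _ = quad(integrand, mu + eps * s_n, np.inf)   # right tail
        total += left + right
    return total / s_n**2

# Variances kept in [1, 4], so no single sigma_k^2 can dominate s_n^2.
rng = np.random.default_rng(seed=0)
for n in (10, 100, 1000):
    sigmas = rng.uniform(1.0, 2.0, size=n)
    print(n, lindeberg_ratio(mus=np.zeros(n), sigmas=sigmas, eps=0.1))
```

With the variances bounded away from $0$ and from above like this, the printed ratio should shrink toward $0$ as $n$ grows, which seems to match the "no single variable overwhelms the variance" intuition above.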

1 Answer


If I understand your question correctly, then Lyapounov's condition might help. Take a look at pages 359-362 of Probability and Measure by Patrick Billingsley. If your random variables are independent and, for some $\delta>0$,

$\lim_{n\rightarrow\infty}\frac{1}{s_n^{2+\delta}}\sum_{k=1}^nE\left[|X_k-\mu_k|^{2+\delta}\right]=0,$

then the Lindeberg condition is implied. Note that this is exactly a condition on the moments of the $X_k$'s: it involves only the absolute central moments of order $2+\delta$, which you have available for $\delta\le2$.
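
To spell out why Lyapounov's condition implies Lindeberg's (this is essentially how Billingsley derives it): on the region $|x-\mu_k|>\epsilon s_n$ we have $(x-\mu_k)^2\le|x-\mu_k|^{2+\delta}/(\epsilon s_n)^{\delta}$, hence

$\frac{1}{s_n^2}\sum_{k=1}^n\int_{\{|x-\mu_k|>\epsilon s_n\}}(x-\mu_k)^2f_k(x)\,dx\le\frac{1}{\epsilon^{\delta}s_n^{2+\delta}}\sum_{k=1}^nE\left[|X_k-\mu_k|^{2+\delta}\right],$

and the right-hand side tends to $0$ by Lyapounov's condition. Since you assume finite fourth moments, you can take $\delta=2$: if $\frac{1}{s_n^4}\sum_{k=1}^nE\left[(X_k-\mu_k)^4\right]\rightarrow0$ (for instance, when the variances are bounded away from $0$ and the fourth central moments are bounded above, so that $s_n^4$ grows like $n^2$ while the sum grows like $n$), then the Lindeberg condition holds and the CLT follows.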