This question is related to my previous question, but I think it's sufficiently different to warrant a separate one.
Suppose I have a sequence of independent positive random variables $X_1\sim A_1, X_2\sim A_2, \ldots, X_n\sim A_n$, where each $A_i$ has support on $[0,\infty)$.
Furthermore, suppose that each $X_i$ has variance $\sigma_i^2$ satisfying $\sigma_{lb}^2<\sigma_i^2<\sigma_{ub}^2$, i.e., the variances of the $X_i$'s are bounded from below and above.
Does Lindeberg's condition
$$\lim_{n\rightarrow\infty}\frac{1}{s_n^2}\sum_{i=1}^n\int_{\{|x-\mu_i|>\epsilon s_n\}}(x-\mu_i)^2 f_i(x)\,dx=0 \quad\text{for every }\epsilon>0$$
hold in this case? Here $\mu_i=\mathbb{E}[X_i]$, $f_i$ is the density of $A_i$, $s_n^2=\sum_{i=1}^n\sigma_i^2$, and $s_n=\sqrt{s_n^2}$.
My intuition tells me that it does, for the following reason: suppose it doesn't hold. Then there is some random variable $X_i$ in the sequence that keeps a non-negligible amount of its mass outside the interval $[\mu_i-\epsilon s_n,\mu_i+\epsilon s_n]$ around its mean. However, this interval gets wider as we add more random variables to the sequence, since their variances are bounded from below (so $s_n\rightarrow\infty$). And since the variance of $X_i$ is bounded from above, at some $n$ the interval $[\mu_i-\epsilon s_n,\mu_i+\epsilon s_n]$ should "consume" most of $X_i$'s mass.
However, I am not sure how to prove (or disprove) this. One problem may arise if the interval $[\mu_i-\epsilon s_n,\mu_i+\epsilon s_n]$ does not grow fast enough. I am also not sure whether I need additional conditions on higher moments of the $A_i$'s.
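To get a feel for it, here is a minimal numerical sketch of the kind of check I have been doing. The family of distributions, the constant $c$, and the value of $\epsilon$ are all my own toy choices (two-point laws rather than densities, chosen so the variances stay within fixed bounds as above); the script simply prints the Lindeberg ratio for increasing $n$.

```python
import numpy as np

# Toy two-point family (my own choice, NOT implied by the question):
# X_i takes the value b_i = c*sqrt(i) with probability p_i = 1/i and 0 otherwise.
# Then E[X_i] = c/sqrt(i) and Var[X_i] = c^2*(1 - 1/i), so the variances stay
# bounded above by c^2 and (for i >= 2) away from zero, as in the question.

def lindeberg_ratio(n, eps, c=1.0):
    i = np.arange(1, n + 1, dtype=float)
    b = c * np.sqrt(i)            # location of the far atom
    p = 1.0 / i                   # its probability
    mu = b * p                    # E[X_i]
    var = b ** 2 * p - mu ** 2    # Var[X_i]
    s_n = np.sqrt(var.sum())

    # E[(X_i - mu_i)^2 ; |X_i - mu_i| > eps * s_n], evaluated exactly
    # over the two atoms of each X_i, then summed and normalised by s_n^2.
    far_atom = np.where(np.abs(b - mu) > eps * s_n, (b - mu) ** 2 * p, 0.0)
    zero_atom = np.where(mu > eps * s_n, mu ** 2 * (1 - p), 0.0)
    return (far_atom + zero_atom).sum() / s_n ** 2

for n in [10, 100, 1_000, 10_000, 100_000]:
    print(n, lindeberg_ratio(n, eps=0.5))
```

If the printed ratio tends to $0$ as $n$ grows, the condition looks plausible for that particular family; if it levels off, that family illustrates exactly the "interval does not grow fast enough" worry above.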