
Say $(X_i)$ is a sequence of $n$ independent Bernoulli random variables with parameters $p_i$, $i=1,\dots,n$. How do I prove that the random variables $$ S_{n} =\frac{1}{n} \sum _{i=1}^{n}a_iX_i $$ converge in distribution to a normal as $n\to \infty$, where the $a_i$ are real constants with $0 < a_i < 1$? Note: the classic CLT requires a simple sum of i.i.d. random variables... I think we need the Lyapunov version, but I'm not sure how to check the conditions.
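For intuition (not a proof), here is a quick simulation sketch; the particular choices of $a_i$ and $p_i$ below are purely illustrative, with the weights and the parameters bounded away from the endpoints:

```python
import numpy as np

# Illustrative check (not a proof): simulate the standardized weighted sum
# L_n = (S_n - E S_n) / sd(S_n) and compare it with a standard normal.
# The specific a_i and p_i below are arbitrary example choices.
rng = np.random.default_rng(0)
n, trials = 1000, 10000
a = rng.uniform(0.2, 1.0, size=n)   # weights with 0 < eps <= a_i < 1
p = rng.uniform(0.3, 0.7, size=n)   # Bernoulli parameters bounded away from 0 and 1

X = rng.random((trials, n)) < p               # trials x n independent Bernoulli(p_i) draws
S = (X * a).sum(axis=1) / n                   # S_n = (1/n) * sum_i a_i X_i
mean = (a * p).sum() / n                      # E(S_n)
sd = np.sqrt((a**2 * p * (1 - p)).sum()) / n  # sd(S_n)
L = (S - mean) / sd                           # standardized sum

# Empirical sanity checks: mean ~ 0, variance ~ 1, 97.5% quantile ~ 1.96
print(L.mean(), L.var(), np.quantile(L, 0.975))
```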

  • This cannot be true for every $(a_n)$. Once you have the conditions on $(a_n)$ sorted out, try characteristic functions. (2012-07-30)
  • Say we have $0 < a_i < 1$. How do we prove the convergence to normal? (2012-07-30)
  • Obviously, this cannot suffice. And I already answered the question in your comment. (2012-07-30)
  • To show that a r.v. converges in distribution to a normal r.v. you can check that its characteristic function converges to the normal one. The classical CLT is an application of that using the expansion $\exp(x) = 1 + x + \frac{1}{2} x^2 + o(x^2)$. (2012-07-30)
  • @did: what do you mean "this cannot suffice"? That $S_n$ will not *necessarily* converge in distribution to a normal given $0 < a_i < 1$? (2012-07-30)
  • I'd still appreciate it if someone could prove or else disprove the claim in the OP. (2012-07-30)
  • *This* = the condition that $0\lt a_i\lt1$. *Suffice* = guarantee that a CLT holds. (And once again: I answered the question in your post, which you repeated in a comment.) (2012-07-30)
  • The condition $0 < a_i < 1$ does not suffice (as did says), see an example: http://math.stackexchange.com/questions/164456/some-case-when-the-central-limit-theorem-fails/164582#164582 . On the other hand, the condition $0 < \epsilon \le a_i < 1$ would suffice. (2012-07-30)
  • Thanks leonbloy. Do you mean for any $\epsilon>0$? How difficult would this be to prove? (2012-07-30)

1 Answer


Let $e_n=E(S_n)=\frac{1}{n}\sum_{i=1}^n a_i p_i$

and let $\sigma_n=\sigma(S_n)=\frac{1}{n}\sqrt{\sum_{i=1}^n a_i^2 p_i(1-p_i)}$, and define

$$L_n=\frac{S_n-e_n}{\sigma_n}$$

(We suppose $0<p_i<1$ to obtain $\sigma_n>0$.)

Hence, $E(L_n)=0$ and $V(L_n)=1$. We compute the characteristic function of $L_n$:

$$\phi_{L_n}(t)=e^{-i\frac{e_n}{\sigma_n}t}\prod_{i=1}^n\left(1-p_i+p_ie^{i\frac{a_i}{n\sigma_n}t}\right) $$

We use the fact that $e^{\epsilon}\approx 1+\epsilon+\frac{\epsilon^2}{2}$ for small $\epsilon$, and we suppose that $\lim_{n\to\infty} n\sigma_n=\infty$ (we NEED this assumption!).

$$\lim\phi_{L_n}(t)\approx e^{-i\frac{e_n}{\sigma_n}t}\prod_{i=1}^n\left(1+ip_i\frac{a_i}{n\sigma_n}t-\frac{p_i}{2}\left(\frac{a_i}{n\sigma_n}t\right)^2\right)\approx_* e^{-i\frac{e_n}{\sigma_n}t}\, e^{i\frac{e_n}{\sigma_n}t}\,e^{-\frac{t^2}{2}}=e^{-\frac{t^2}{2}} $$

Hence $L_n$ converges in distribution to a standard normal law.

For (*) you also need $\ln(1+\epsilon)\approx \epsilon-\frac{\epsilon^2}{2}$.
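Spelling out (*) a bit more (a sketch in the same notation; the grouping into $\epsilon_i$ is mine): write $\epsilon_i = ip_i\frac{a_i}{n\sigma_n}t-\frac{p_i}{2}\left(\frac{a_i}{n\sigma_n}t\right)^2$ and take logarithms of the product,
$$\sum_{i=1}^n\ln(1+\epsilon_i)\approx\sum_{i=1}^n\left(\epsilon_i-\frac{\epsilon_i^2}{2}\right)\approx i\frac{e_n}{\sigma_n}t-\frac{t^2}{2(n\sigma_n)^2}\sum_{i=1}^n a_i^2p_i(1-p_i)=i\frac{e_n}{\sigma_n}t-\frac{t^2}{2},$$
because $\epsilon_i^2\approx-p_i^2\left(\frac{a_i}{n\sigma_n}t\right)^2$ and $(n\sigma_n)^2=\sum_{i=1}^n a_i^2p_i(1-p_i)$; the neglected terms vanish in the limit precisely because each $\frac{a_i}{n\sigma_n}\to 0$.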

More generally, you can see from the proof that you need $$\lim_{n\to\infty}\ \max_{1\le i\le n}\frac{a_i}{n\sigma_n}=0.$$
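For instance (an illustration under the sufficient conditions discussed in the comments, i.e. assuming $0<\epsilon\le a_i<1$ and $0<\delta\le p_i\le 1-\delta$ for fixed $\epsilon,\delta$), both requirements hold:
$$n\sigma_n=\sqrt{\sum_{i=1}^n a_i^2p_i(1-p_i)}\ \ge\ \epsilon\sqrt{n\,\delta(1-\delta)}\ \longrightarrow\ \infty,\qquad\max_{1\le i\le n}\frac{a_i}{n\sigma_n}\ \le\ \frac{1}{\epsilon\sqrt{n\,\delta(1-\delta)}}\ \longrightarrow\ 0.$$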

  • Thanks @Xoff. So from your condition that $\lim n\sigma_n=\infty$, and from the expression for $\sigma_n$ above, we see that it is not enough to require that $0<\epsilon\le a_i$; we also need $0<\delta<p_i$ and $0<\delta<1-p_i$. Right? (2012-07-31)
  • I agree, and it should be obvious that if the $p_i$ are zero you do not get convergence, so it's not surprising that $p_i$ and $1-p_i$ should be large enough. However, $p_i$ can have limit $0$, provided this limit is reached slowly enough that $n\sigma_n$ still goes to $\infty$. So your conditions are sufficient, but not necessary... (2012-07-31)