
I am having trouble finding this limit: $\lim_{n\to\infty}p^{n}\sum_{k \geq n(p^{-1}-1)}\binom{n+k-1}{n-1}(1-p)^{k}.$ I know that it is possible to find it with the help of the Central Limit Theorem. Can anyone help me? Thanks in advance!

  • Hint: If you can identify the probability mass function as that of a sum of $n$ i.i.d. random variables with mean $p^{-1}$, the question comes down to asking for the probability that the sum random variable exceeds its mean. (2012-04-10)

1 Answer


Imagine an infinite sequence of independent Bernoulli trials with probability $p$ of success on each trial. Let $X$ be the number of failures before the $n$th success. Then $ \Pr(X=k) = \binom{n+k-1}{n-1} p^n(1-p)^k. $ The expected number of failures before the $n$th success is $\operatorname{E}(X)= n(1-p)/p=n(p^{-1}-1)$. The variance is $\operatorname{var}(X)=n(1-p)/p^2$, so the standard deviation is the square root of that.
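As a quick sanity check, here is a small Python sketch (assuming scipy's `nbinom`, which also counts failures before the $n$th success with this parameterization) comparing the formulas above to the library values:

```python
# Sketch: check the pmf, mean, and variance formulas against scipy's nbinom,
# which counts failures before the n-th success.
from math import comb, isclose
from scipy.stats import nbinom

n, p = 7, 0.3
for k in range(6):
    manual = comb(n + k - 1, n - 1) * p**n * (1 - p)**k
    assert isclose(manual, nbinom.pmf(k, n, p), rel_tol=1e-12)

assert isclose(nbinom.mean(n, p), n * (1 - p) / p)
assert isclose(nbinom.var(n, p), n * (1 - p) / p**2)
print("pmf, mean, and variance formulas agree")
```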

$X$ is the sum of $n$ independent, identically distributed random variables, each of which is the number of failures before the first success (a geometric count with mean $(1-p)/p$ and variance $(1-p)/p^2$).
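A minimal simulation sketch of that decomposition (numpy's `geometric` counts trials up to and including the first success, so subtracting 1 gives the failure count):

```python
# Sketch: build X as a sum of n i.i.d. "failures before the first success"
# counts and compare its sample moments to n(1-p)/p and n(1-p)/p^2.
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 7, 0.3, 200_000
X = (rng.geometric(p, size=(reps, n)) - 1).sum(axis=1)  # failures = trials - 1

print(X.mean(), n * (1 - p) / p)      # sample mean vs. n(1-p)/p
print(X.var(), n * (1 - p) / p**2)    # sample variance vs. n(1-p)/p^2
```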

The central limit theorem therefore says $ W=\frac{X- \frac{n(1-p)}{p}}{\sqrt{\frac{n(1-p)}{p^2}}} \text{ converges in distribution to } N(0,1)\text{ as }n\to\infty. $
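You can see that convergence numerically; this sketch evaluates $\Pr(W\le t)$ exactly through the negative binomial CDF and compares it with $\Phi(t)$ for a few thresholds (again assuming scipy's `nbinom` and `norm`):

```python
# Sketch: P(W <= t) = P(X <= mu + t*sigma) should approach the standard
# normal CDF Phi(t) as n grows.
import numpy as np
from scipy.stats import nbinom, norm

p = 0.3
for n in (10, 100, 1000, 10000):
    mu = n * (1 - p) / p
    sigma = np.sqrt(n * (1 - p)) / p
    for t in (-1.0, 0.0, 1.0):
        exact = nbinom.cdf(np.floor(mu + t * sigma), n, p)
        print(f"n={n:>5}  t={t:+.1f}  P(W<=t)={exact:.4f}  Phi(t)={norm.cdf(t):.4f}")
```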

What you're looking for, then, is $p^n\sum_{k\ge n(p^{-1}-1)}\binom{n+k-1}{n-1}(1-p)^k = \Pr\big(X \ge \operatorname{E}(X)\big)$, which is the same event as $W\ge 0$.

In the limit, that is $\Pr(Z\ge 0) =1/2$ (where, as usual, $Z\sim N(0,1)$).
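Numerically, the quantity in the question is exactly the upper tail $\Pr\big(X\ge n(p^{-1}-1)\big)$, which you can evaluate with the survival function and watch approach $1/2$ (a sketch under the same scipy assumption):

```python
# Sketch: the quantity in the question is P(X >= n(1-p)/p); it tends to 1/2.
import math
from scipy.stats import nbinom

p = 0.3
for n in (10, 100, 1000, 10000, 100000):
    m = math.ceil(n * (1 - p) / p)    # smallest integer k with k >= n(1/p - 1)
    print(n, nbinom.sf(m - 1, n, p))  # P(X >= m), approaches 0.5
```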

  • Thank you very much, your answer is very helpful! (2012-04-11)