
Today at school we discussed probability distributions and as usual my mind wandered off and I started thinking:

Normally when we have a die, you can make a binomial distribution. So I thought: if you have a die with an infinite number of sides instead of 6, wouldn't the binomial distribution become a normal distribution?

If yes, can you also say that when a normal distribution is 'simplified' (by which I mean, for example, going from an n-sided die to a 6-sided die) it always turns into a binomial distribution? Or is that just the case for some examples (like this die example, which is what I assume is true)?

  • p.s. This was our first lesson on probability, so what I say might sound ridiculous. At the beginning of each chapter I always wander off like this; sometimes I get nice results and sometimes I fail epically.
  • Aren't a dice and a die the same? – 2012-12-19

1 Answer


In the 18th century, Abraham de Moivre introduced the normal distribution in connection with coin-tossing. If you toss a coin $1800$ times (if I recall correctly, that's the number he used) what's the probability that the number of heads is between one number and another, for example, at least $880$ but not more than $910$? Finding an exact answer requires summing binomial probabilities from $880$ through $910$, and each involves lots of rather large factorials, so it's extremely computation-intensive.
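To get a feel for the arithmetic de Moivre faced, here is a small sketch (in modern Python, obviously not something he had) that computes that exact binomial sum for $1800$ fair-coin tosses:

```python
from math import comb

# Exact probability that 1800 fair-coin tosses yield
# at least 880 and at most 910 heads:
#   P(880 <= X <= 910) = sum_{k=880}^{910} C(1800, k) / 2^1800
n = 1800
total = sum(comb(n, k) for k in range(880, 911))
prob = total / 2**n
print(prob)
```

With exact integer arithmetic this is easy today, but each term involves enormous binomial coefficients, which is exactly why an analytic approximation was so valuable in the 18th century.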

It's easy to show that the number of heads you get when you toss one coin has expected value $1/2$ and standard deviation $1/2$. So when you toss a coin $1800$ times, the number of heads has expected value $900$ and standard deviation $ \sqrt{\frac14+\cdots+\frac14} = \sqrt{\frac{1800}{4}}. $

Since $X\ge880$ is the same as $X>879$, and $X\le910$ is the same as $X<911$, when you approximate this discrete random variable with a continuous one, you use the event $879.5<X<910.5$ (the "continuity correction").

What de Moivre did was treat it as a normally distributed random variable with the same expected value and the same standard deviation. For (fair) coin tosses this is surprisingly accurate even for relatively small numbers of trials.
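The normal approximation can be sketched as follows, using the standard normal CDF (expressed via the error function) on the continuity-corrected interval $879.5<X<910.5$:

```python
from math import sqrt, erf

def phi(z):
    # Standard normal CDF, written in terms of the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

n = 1800
mu = n * 0.5               # expected number of heads: 900
sigma = sqrt(n * 0.25)     # standard deviation: sqrt(1800/4)

# Approximate P(880 <= X <= 910) by P(879.5 < Y < 910.5)
# for a normal Y with the same mean and standard deviation.
approx = phi((910.5 - mu) / sigma) - phi((879.5 - mu) / sigma)
print(approx)
```

Comparing this with the exact binomial sum shows the two agree to about three decimal places, which is the accuracy de Moivre was exploiting.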

De Moivre fled the persecution of Protestants in France and settled in England, where he met James Stirling. He wrote a book on probability in English, titled The Doctrine of Chances, one chapter of which is devoted to this topic. De Moivre found the constant in the probability density function $ \text{constant}\cdot e^{-x^2/2} $ numerically, and Stirling showed that it is exactly $1/\sqrt{2\pi}$.

Why does one use root-mean-square deviations rather than the average distance of observations from the mean? The answer is that variances of independent random variables add, as we used above. That's one lesson to draw from all this.
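A quick simulation (a sketch, with an arbitrary trial count chosen for illustration) shows the additivity in action: the variance of the number of heads in $1800$ tosses should come out near $1800\cdot\frac14 = 450$.

```python
import random

random.seed(0)

# Each toss has variance 1/4, so the sum of 1800 independent
# tosses should have variance close to 1800/4 = 450.
n, trials = 1800, 2000
sums = [sum(random.randint(0, 1) for _ in range(n)) for _ in range(trials)]
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
print(var)  # should be in the neighborhood of 450
```

No such additivity holds for the mean absolute deviation, which is the practical reason the root-mean-square measure won out.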

Not until the 1930s was it shown that this idea that works for coin tosses works for all random variables with finite variance. That's the Lindeberg–Lévy central limit theorem.