Buffon needle experiment for $\pi$ approximation

What are the "best" values for the length of the needle $(l)$ and the distance between parallels $(d)$ for an accurate approximation of $\pi$? Does it have to be $l=d=1.0$, or can $l\lt d$ or $l\gt d$ do better?
-
Lazzarini's experiment is one where we want to derive $355/113$ by giving $l$ and $d$ a good ratio, then we continue the experiment until we have reached $355/113$ as the exact ratio. – 2012-08-16
3 Answers
Let's recapitulate the method for estimating $\pi$ and then discuss the best value of $l$. Without loss of generality, take $d=1$. For each toss of the needle onto the field of parallel lines, the component of the needle's length perpendicular to the lines is $l\sin\theta$, where $\theta$ is uniformly random over $[0,\pi/2)$. Write $l \sin\theta=n+\alpha$, where $n$ is a nonnegative integer and $\alpha \in [0, 1)$. Conditioned on this value of $\theta$, the needle crosses $n$ lines with probability $1-\alpha$ and $n+1$ lines with probability $\alpha$. Letting $X$ denote the number of lines crossed, we have
$$E\left[{X \,\big\vert\, \theta}\right]=(1-\alpha)n+\alpha(n+1)=n+\alpha=l\sin\theta$$
and
$$E\left[{X^2\,\big\vert\,\theta}\right]=(1-\alpha)n^2 + \alpha(n+1)^2=(2n+1)(l\sin\theta)-n(n+1).$$
Averaging the first of these over $\theta$ gives
$$E[X]=\frac{2l}{\pi}.$$
This gives us a method for estimating $2/\pi$: throw the needle a large number of times, calculate the average number of intersections per throw, and divide by $l$. The quality of this estimate (for a given number of throws) improves as the relevant variance shrinks, namely
$$\mathrm{Var}\left[\frac{X}{l}\right]=E\left[\frac{X^2}{l^2}\right]-E\left[\frac{X}{l}\right]^2=-\frac{4}{\pi^2}+\frac{2}{\pi l} + \frac{2}{l}E[n\sin\theta]-\frac{1}{l^2}E[n(n+1)].$$
Note that for $l\le 1$ we have $n=0$ almost surely; the last two terms then vanish, and the variance is minimized at $l=1$. What about for $l>1$?
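The estimator described so far is easy to sketch in a few lines of Python (the function name `estimate_pi` and the drop model — one needle end placed uniformly within a strip — are my own choices, not part of the answer):

```python
import math
import random

def estimate_pi(l, n_throws, seed=0):
    """Estimate pi from Buffon throws with line spacing d = 1 and needle length l.

    Each throw samples theta uniform on [0, pi/2) and the position y of one
    needle end uniform on [0, 1); with lines at integer heights, the number
    of lines crossed is the number of integers in [y, y + l*sin(theta)].
    """
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n_throws):
        theta = rng.random() * math.pi / 2   # angle between needle and lines
        y = rng.random()                     # offset of one end within a strip
        # floor(y) = 0, so the crossing count is just floor(y + l*sin(theta))
        crossings += math.floor(y + l * math.sin(theta))
    mean_crossings = crossings / n_throws    # estimates E[X] = 2*l/pi
    return 2 * l / mean_crossings

print(estimate_pi(1.0, 200_000))
```

With $l\sin\theta=n+\alpha$ and $y$ uniform, $\lfloor y+l\sin\theta\rfloor$ equals $n$ with probability $1-\alpha$ and $n+1$ with probability $\alpha$, matching the conditional distribution above.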
In the general case, we can define $0=\theta_0<\theta_1<\theta_2<\cdots<\theta_{N}<\theta_{N+1}=\pi/2$ such that $\lfloor {l\sin\theta} \rfloor = n$ for $\theta\in[\theta_{n},\theta_{n+1})$; in particular, $\theta_{n}=\sin^{-1}(n/l)$ for $n\le N$, where $N=\lfloor l \rfloor$. Then
$$\begin{eqnarray} E[n\sin\theta]&=&\frac{2}{\pi}\sum_{n=0}^{N}\int_{\theta_{n}}^{\theta_{n+1}}n\sin\theta \,d\theta \\ &=&\frac{2}{\pi}\sum_{n=0}^{N}n\left(\cos\theta_{n} - \cos\theta_{n+1}\right) \\ &=&\frac{2}{\pi}\sum_{n=1}^{N}\cos\theta_{n} \\ &=&\frac{2}{\pi}\sum_{n=1}^{N}\sqrt{1-\frac{n^2}{l^2}} \end{eqnarray}$$
and
$$\begin{eqnarray} E[n(n+1)] &=& \frac{2}{\pi}\sum_{n=0}^{N}\int_{\theta_{n}}^{\theta_{n+1}}n(n+1)\,d\theta \\ &=&\frac{2}{\pi}\sum_{n=0}^{N}n(n+1)\left(\theta_{n+1}-\theta_{n}\right) \\ &=&\frac{2}{\pi}\left((N+1)^2\theta_{N+1}-\sum_{n=1}^{N+1}n\theta_{n}-\sum_{n=0}^{N}n\theta_{n}\right) \\ &=& N(N+1)-\frac{4}{\pi}\sum_{n=1}^{N} n \sin^{-1}\left(\frac{n}{l}\right). \end{eqnarray}$$
Putting everything together, we have
$$\mathrm{Var}\left[\frac{X}{l}\right]=-\frac{4}{\pi^2}+\frac{2}{\pi l} -\frac{N(N+1)}{l^2} + \frac{4}{\pi l}\sum_{n=1}^{N}\left(\frac{n}{l} \sin^{-1}\left(\frac{n}{l}\right) + \sqrt{1-\frac{n^2}{l^2}}\right).$$
This turns out to be a continuous function of $l$, despite its piecewise definition, and it decreases monotonically to its limiting value of $\frac{1}{2}-\frac{4}{\pi^2}$. The larger $l$ is relative to $d$, the fewer throws are required. However, the variance approaches its limiting value quickly: at $l=3$, the standard deviation of $X/l$ is already within $10\%$ of its value at $l=\infty$.
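For the record, a small script evaluating this closed-form variance (the helper name `var_X_over_l` is mine) makes the continuity, the monotone decrease, and the $10\%$ claim at $l=3$ easy to check numerically:

```python
import math

def var_X_over_l(l):
    """Closed-form Var[X/l] for Buffon's needle with line spacing d = 1,
    using the piecewise formula derived above, with N = floor(l)."""
    N = math.floor(l)
    s = sum((n / l) * math.asin(n / l) + math.sqrt(1 - (n / l) ** 2)
            for n in range(1, N + 1))
    return (-4 / math.pi ** 2 + 2 / (math.pi * l)
            - N * (N + 1) / l ** 2 + 4 / (math.pi * l) * s)

limit = 0.5 - 4 / math.pi ** 2  # limiting variance as l -> infinity
print(var_X_over_l(1.0), var_X_over_l(3.0), limit)
```

At $l=1$ the formula reduces to $2/\pi-4/\pi^2$, agreeing with the short-needle case, and evaluating on a fine grid of $l$ values shows the monotone decrease toward $\frac12-\frac4{\pi^2}$.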
-
You might want to add some explanations about the way the limit $\frac12-\frac4{\pi^2}$ appears and about the monotonicity of $\mathrm{Var}(X/l)$. – 2012-09-15
Well, both cases (i.e. $l\gt d$ and $l\lt d$) give fairly accurate results, but it is easier to estimate the value of $\pi$ when $l\lt d$, that is, when the needle is shorter than the spacing between the lines. (It should make sense as to why, too, right? If it doesn't, feel free to ask :))
By the way, if you'd like to know: when $l\lt d$, the probability that the needle will fall on a line is $\frac{2l}{\pi d}$. The probability when $l\gt d$ is a little more complicated, and I don't remember it either!
Hope it helps!
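For what it's worth, the short-needle probability $\frac{2l}{\pi d}$ is easy to check by simulation; a minimal sketch (the drop model and the helper name `crossing_probability` are my own):

```python
import math
import random

def crossing_probability(l, d, n_throws, seed=1):
    """Monte Carlo estimate of the chance a needle of length l < d hits a line.

    Drop model: the needle's center sits at distance x, uniform on [0, d/2],
    from the nearest line, at an angle theta uniform on [0, pi/2]; it crosses
    the line iff x <= (l/2) * sin(theta).
    """
    rng = random.Random(seed)
    hits = sum(
        1
        for _ in range(n_throws)
        if rng.random() * d / 2 <= (l / 2) * math.sin(rng.random() * math.pi / 2)
    )
    return hits / n_throws

print(crossing_probability(1.0, 2.0, 200_000))  # should be near 2/(2*pi)
```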
Let $\varrho=\ell/d$, and suppose we throw the needle $n$ times. The number of intersection points of a single needle with the parallels is distributed like some random variable $N_\varrho$ with values in $\{0,1,\ldots,\lceil \varrho\rceil\}$, hence the total number of intersection points divided by the number of throws $n$ is a random variable $R_\varrho$ with mean $m_\varrho=\mathrm E(N_\varrho)$ and variance $\frac1n v_\varrho$, where $v_\varrho=\mathrm{var}(N_\varrho)$, and the central limit theorem indicates that $R_\varrho\approx m_\varrho+\frac1{\sqrt{n}}\sqrt{v_\varrho}Z$ where $Z$ is approximately standard normal. By additivity, $m_\varrho=\frac{2\varrho}{\pi}$ for every $\varrho$ and, to deduce a value of $\pi$, one should solve $$ \pi\approx\frac{2\varrho}{R_\varrho}+\frac{2}{\sqrt{n}}\sqrt{\frac{\varrho^2v_\varrho}{m_\varrho^4}}Z. $$ The absolute precision of this estimation procedure is measured by the coefficient of $Z$, thus one wants to minimize $\varrho^2v_\varrho/m_\varrho^4$ or, equivalently (since $m_\varrho/\varrho=2/\pi$ is constant), $\kappa(\varrho)=\mathrm E(N_{\varrho}^2)/\varrho^2$.
If $\varrho\lt1$, $N_\varrho$ is a Bernoulli $0-1$ random variable with parameter $\frac{2\varrho}\pi$, hence $\kappa(\varrho)=\frac{2}{\pi\varrho}$, which decreases as $\varrho$ grows. Hence:
The optimal choice of $\ell$ between (i) $\ell\lt d$, and (ii) $\ell=d$, is (ii) $\ell=d$.
An exact computation in the general case is beyond my current courage, but since $N_\varrho/\varrho\to\sin\theta$ when $\varrho\to\infty$, one would guess that $\kappa(\varrho)\to\mathrm E(\sin^2\theta)=\frac12\lt\frac2\pi=\kappa(1)$. Finally:
The optimal choice of $\ell$ between (i)-(ii) $\ell\leqslant d$, and (iii) $\ell\gt d$, is probably (iii) $\ell\gt d$, in the limit $\ell\gg d$.
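A quick experiment supports this conclusion: comparing the empirical spread of the resulting $\pi$ estimates for $\varrho=1$ and $\varrho=5$, with spacing $d=1$ (a rough sketch; the helper names and sample sizes are arbitrary choices of mine):

```python
import math
import random

def pi_estimate(rho, n_throws, rng):
    """One Buffon experiment: needle length rho, line spacing 1; return a pi estimate."""
    crossings = 0
    for _ in range(n_throws):
        theta = rng.random() * math.pi / 2   # needle angle
        y = rng.random()                     # offset of one end within a strip
        crossings += math.floor(y + rho * math.sin(theta))
    return 2 * rho * n_throws / crossings    # inverts E[crossings] = 2*rho/pi per throw

def spread(rho, trials=200, n_throws=2000, seed=2):
    """Empirical standard deviation of the pi estimate over repeated experiments."""
    rng = random.Random(seed)
    estimates = [pi_estimate(rho, n_throws, rng) for _ in range(trials)]
    mean = sum(estimates) / trials
    return math.sqrt(sum((e - mean) ** 2 for e in estimates) / trials)

# a long needle (rho = 5) should give a noticeably tighter estimate than rho = 1
print(spread(1.0), spread(5.0))
```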