1

We ought not to be able to predict a value in a random sequence, yet it seems that in this case (with the line spacing equal to the needle length, say) we can predict that a rational expression involving pi will occur.

Added by OP Mike Jones on 8.May.2011 (Beijing time):

I believe the answer is affirmative. Below is my proposed proof, which, even if it turns out to be defective, will at least clarify what I meant by the question. :)

Theorem: pi is irrational. Proof: By the well-known solution to Buffon’s Needle Problem, the sequence f(n)/n converges to 1/pi, where n is the number of tosses of the needle, and f(n) is the number of line-crossings of the needle, where the needle has unit length, and the parallel lines are unit distance apart. Notice that it should be impossible to predict with certainty any value of the ratio f(n)/n. However, because of the discrete (as opposed to continuous), and physical, nature of this process, if this sequence of rational numbers converges to a rational number, it is certain that it will sooner or later take on this rational number as one of its terms. This contradiction establishes that 1/pi is irrational. Therefore pi is irrational. Q.E.D.
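As a side note, for a needle whose length equals the line spacing the classical crossing probability is 2/pi rather than 1/pi, though this does not affect the argument being made. Below is a minimal Monte Carlo sketch of the experiment the proof describes, assuming unit needle length and unit line spacing; the function name is illustrative, not from any particular source.

```python
import math
import random

def buffon_crossing_fraction(n_tosses, needle_len=1.0, line_gap=1.0, seed=0):
    """Simulate n_tosses of Buffon's needle and return f(n)/n, the observed
    fraction of tosses in which the needle crosses a line."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n_tosses):
        # Distance from the needle's midpoint to the nearest line, uniform on [0, gap/2].
        d = rng.uniform(0.0, line_gap / 2.0)
        # Acute angle between the needle and the lines, uniform on [0, pi/2].
        theta = rng.uniform(0.0, math.pi / 2.0)
        # The needle crosses a line iff its half-projection across the lines exceeds d.
        if d <= (needle_len / 2.0) * math.sin(theta):
            crossings += 1
    return crossings / n_tosses

for n in (10**3, 10**5, 10**6):
    print(n, buffon_crossing_fraction(n), "limit:", 2 / math.pi)
```

Every printed value of f(n)/n is of course a rational number; the question is whether the limiting behaviour forces the sequence to hit its limit.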

  • 0
    I vote against closing. (2011-05-09)

3 Answers

13

> Notice that it should be impossible to predict with certainty any value of the ratio f(n)/n. However, because of the discrete (as opposed to continuous), and physical, nature of this process, if this sequence of rational numbers converges to a rational number, it is certain that it will sooner or later take on this rational number as one of its terms.

This is not a mathematical argument. For example, the sequence $\frac{2^n - 1}{2^n}$ is a sequence of rational numbers which never takes on the value of its rational limit $1$. If you want a "random" sequence, take $\frac{2^n - 1}{2^n} + X_n$ where $X_n$ is a sequence of random rational numbers of size at most $\frac{1}{2^{n+1}}$.
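A quick numeric check of this counterexample, using exact rational arithmetic (a sketch only; the perturbation $X_n$ is drawn with $|X_n| \le \frac{1}{2^{n+1}}$ as described above):

```python
from fractions import Fraction
import random

rng = random.Random(1)
for n in range(1, 11):
    base = Fraction(2**n - 1, 2**n)                  # (2^n - 1) / 2^n, rational, < 1
    # Random rational X_n with |X_n| <= 1 / 2^(n+1),
    # so base + X_n is still strictly less than the limit 1.
    noise = Fraction(rng.randint(-2**n, 2**n), 2**(2 * n + 1))
    term = base + noise
    print(n, term, term == 1)
```

Every term is rational, the sequence converges to the rational number 1, and the final comparison prints False on every line: the sequence never takes on its limit.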

7

Your proof is unconvincing. The following replaces some of your words without affecting the argument, and if it were a legitimate proof then two would be irrational.

Theorem: two is irrational. Proof: By the well-known solution to tossing the Fair Coin Problem, the sequence f(n)/n converges to 1/two, where n is the number of tosses of the coin, and f(n) is the number of heads shown, where the coin is equally likely to show each of its two faces heads or tails. Notice that it should be impossible to predict with certainty any value of the ratio f(n)/n. However, because of the discrete (as opposed to continuous), and physical, nature of this process, if this sequence of rational numbers converges to a rational number, it is certain that it will sooner or later take on this rational number as one of its terms. This contradiction establishes that 1/two is irrational. Therefore two is irrational. Q.E.D.
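For what it is worth, the coin-toss analogue is trivial to simulate, and f(n)/n behaves exactly like the needle's crossing frequency: it drifts toward its limit 1/2 without that convergence saying anything about the irrationality of the limit. A minimal sketch (function name illustrative):

```python
import random

def heads_fraction(n_tosses, seed=0):
    """Toss a fair coin n_tosses times and return f(n)/n, the fraction of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

for n in (10, 1_000, 100_000):
    print(n, heads_fraction(n))
```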

  • 2
    "…it will sooner or later take on this rational number as one of its terms. This contradiction…" — actually I don't understand what the contradiction is (in the original question).2011-05-09
3

In short, I would say no. One can set up Buffon's Needle situations that converge to more or less any number, rational or not, that you want. Pi happens to emerge here simply because of the relationship between the radius of a circle and the area of that circle.

In fact, if one were to throw darts randomly at a square with an inscribed circle, one could obtain an expression for pi from the probability that a dart lands inside the circle. This relies on the same relationship between pi and the area of a circle.
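A minimal sketch of that dart experiment, assuming darts land uniformly on the square $[-1,1]^2$ with the unit circle inscribed (the function name is illustrative):

```python
import random

def estimate_pi(n_darts, seed=0):
    """Throw n_darts uniformly at the square [-1, 1]^2 and estimate pi
    from the fraction that lands inside the inscribed unit circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_darts):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            inside += 1
    # P(inside) = area(circle) / area(square) = pi / 4
    return 4.0 * inside / n_darts

for n in (10**3, 10**5, 10**6):
    print(n, estimate_pi(n))
```

The estimates are always rational (four times a ratio of integers), yet they converge to pi; again, nothing about the sampling process itself decides whether the limit is rational.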

  • 3
    @Fabian: BTW, there's a truly remarkable proof of Buffon's needle problem that does not involve integration (which with the marvels of hypertext I *can* fit into this margin): see [here](http://gilkalai.wordpress.com/2009/08/03/buffon-needle-and-the-perimeter-of-planar-sets-of-constant-width/) or [here](http://en.wikipedia.org/wiki/Buffon's_noodle). (2011-05-09)