43

Choose a random number between $0$ and $1$ and record its value. Do this again and add the second number to the first number. Keep doing this until the sum of the numbers exceeds $1$. What's the expected value of the number of random numbers needed to accomplish this?

  • 0
    I am flummoxed, based on the comments of the OP to all the attempted solutions. How can you even talk about this without the idea of expected value? How can you discuss a uniform distribution without first knowing integration? (2018-08-11)

3 Answers

27

Here is a (perhaps) more elementary method. Let $X$ be the number of summands you need until the sum exceeds $1$. Then (by the tail-sum formula $\mathbb{E}[X] = \sum_{k \geq 0} \Pr[X > k]$, together with $\Pr[X > 0] = 1$):

$ \mathbb{E}[X] = 1 + \sum_{k \geq 1} \Pr[X > k] $

Now $X > k$ if the sum of the first $k$ numbers $x_1,\ldots,x_k$ is smaller than $1$. This is exactly equal to the volume of the $k$-dimensional set:

$ \left\{(x_1,\ldots,x_k) : \sum_{i=1}^k x_i \leq 1, \, x_1,\ldots,x_k \geq 0\right\}$

This is known as the $k$-dimensional simplex. When $k = 1$, we get a line segment of length $1$. When $k = 2$, we get a right isosceles triangle with legs of length $1$, so the area is $1/2$. When $k=3$, we get a triangular pyramid (tetrahedron) whose three mutually perpendicular edges at the origin have length $1$, so the volume is $1/6$. In general, the volume is $1/k!$, and so

$ \mathbb{E}[X] = 1 + \sum_{k \geq 1} \frac{1}{k!} = e. $
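As a sanity check (my addition, not part of the answer), the process itself is easy to simulate; the sample mean of the number of draws should land near $e \approx 2.71828$. A minimal sketch:

```python
import random

def draws_to_exceed_one():
    """Count Uniform(0,1) draws needed for the running sum to exceed 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

random.seed(0)
trials = 200_000
estimate = sum(draws_to_exceed_one() for _ in range(trials)) / trials
print(estimate)  # should be close to e ≈ 2.71828
```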

  • 0
    yeah it sounds absurd but we haven't...but this is a "bonus" problem... (2012-02-20)
16

Here is another method. It is exactly the same technique Yuval has given above, but with an added step: computing the pmf first, then the expectation. Hopefully this will help you see the problem from a slightly different angle.

Let us denote by $N$ the number of random variables we need to add for the sum to exceed $1$. We first find the distribution (probability mass function) of $N$. The easiest way to do this is by computing $P(N > n)$ for $n=1,2,3,\dots$. Once we know this, we can compute $P(N = n) = P(N > n-1) - P(N > n)$.

The event $(N > n)$ says, in plain English, that the sum of the first $n$ uniform random variables did not exceed $1$. The probability of this event is therefore $P(N > n) = P(U_1 + U_2 + \dots + U_n < 1)$. This can be calculated by a standard multi-dimensional integral to be $\frac{1}{n!}$. If you don't want to use calculus, one can justify this result geometrically, but integration is perhaps the most natural approach to evaluating this probability. Therefore, we have $P(N = n) = \frac{1}{(n-1)!} - \frac{1}{n!} = \frac{n-1}{n!}$ for $n = 1, 2, 3, \dots$ (note this gives $P(N = 1) = 0$, as it should: a single uniform draw is always less than $1$).
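As an empirical check on that $\frac{1}{n!}$ claim (my addition, not part of the answer), one can estimate $P(U_1 + \dots + U_n < 1)$ by simulation and compare against the exact value:

```python
import math
import random

def prob_sum_below_one(n, trials=100_000):
    """Monte Carlo estimate of P(U_1 + ... + U_n < 1) for i.i.d. Uniform(0,1)."""
    hits = sum(1 for _ in range(trials)
               if sum(random.random() for _ in range(n)) < 1.0)
    return hits / trials

random.seed(1)
for n in (1, 2, 3, 4):
    # estimate vs. exact 1/n!
    print(n, round(prob_sum_below_one(n), 4), round(1 / math.factorial(n), 4))
```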

Once we know the pmf of $N$, we calculate

$E(N) = \sum_{n=1}^{\infty} n\, P(N=n) = \sum_{n=1}^{\infty} \frac{n(n-1)}{n!} = \sum_{n=2}^{\infty} \frac{1}{(n-2)!} = \sum_{n=0}^{\infty} \frac{1}{n!} = e$.
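A short script (my addition, using the pmf just derived) confirms that the probabilities sum to $1$ and that the truncated expectation converges to $e$:

```python
import math

def pmf(n):
    """P(N = n) = (n - 1)/n! for n = 1, 2, 3, ..."""
    return (n - 1) / math.factorial(n)

# Truncate the series far enough out that the tail is negligible.
total = sum(pmf(n) for n in range(1, 40))
expectation = sum(n * pmf(n) for n in range(1, 40))
print(total)        # sums to 1, confirming a valid pmf
print(expectation)  # ≈ e ≈ 2.71828
```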

15

Assuming the numbers come from a uniform distribution over $[0,1]$ and that the trials are independent, here is an outline (this is example 7.4 4h. in Sheldon Ross' A First Course in Probability, sixth edition):

  1. Let $X_i$ be the number obtained on the $i$'th trial.

  2. For $0\le x\le1$, let $Y(x)$ be the minimum number of trials needed so that the sum of the $X_i$ exceeds $x$. Set $e(x)=\Bbb E [Y(x)]$.

  3. Compute $e(x)$ by conditioning on the value of $X_1$: $\tag{1} e(x)=\int_0^1 \Bbb E [ Y(x) | X_1=y]\, dy. $

Here, use the fact that $\tag{2}\Bbb E [ Y(x) | X_1=y] = \cases{1,& $y>x$\cr 1+e(x-y),& $y\le x $}.$

Substitution of $(2)$ into $(1)$ will give $\tag{3} e(x)=1+\int_0^x e(u)\,du. $

  4. Solve equation $(3)$ (by differentiating both sides with respect to $x$ first) for $e(x)$.

  5. You wish to find $e(1)$.
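Carrying the outline to its end (this part is my addition, not Ross's text): differentiating $(3)$ gives $e'(x) = e(x)$ with $e(0) = 1$, hence $e(x) = e^x$ and $e(1) = e$. One can also solve the integral equation $(3)$ numerically by Picard iteration on a grid, as a sketch:

```python
import math

# Numerically solve e(x) = 1 + ∫_0^x e(u) du on [0, 1] by fixed-point
# (Picard) iteration; the iterates should converge to exp(x).
steps = 1000
e_vals = [1.0] * (steps + 1)  # initial guess e(x) ≡ 1 on the grid

for _ in range(60):  # iterate e ← 1 + ∫ e until convergence
    integral = 0.0
    new_vals = [1.0]
    for i in range(1, steps + 1):
        # trapezoid-rule step for the running integral of the old iterate
        integral += (e_vals[i - 1] + e_vals[i]) / (2 * steps)
        new_vals.append(1.0 + integral)
    e_vals = new_vals

print(e_vals[-1], math.exp(1))  # e(1) should be close to e
```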

  • 0
    No, you didn't misinterpret my question. But I haven't seen these terms before; I think we are expected to solve it in another way, I guess. But thank you so much for providing such detailed answers to my question. Thank you! (2012-02-20)