11

This problem is from the book Probability with Martingales by Williams. It's numbered as Exercise 12.3 on page 236. It can be stated as follows:

The control system on the starship has gone wonky. All that one can do is to set a distance to be traveled. The spaceship will then move that distance in a randomly chosen direction and then stop. The object is to get into the Solar System, a ball of radius $r$. Initially, the starship is at a distance $R>r$ from the sun.

The next hop-length is automatically set to be the current distance to the Sun ("next" and "current" being updated in the obvious way). Let $R_n$ be the distance from the Sun to the starship after $n$ space-hops. Prove that $\sum_{n=1}^\infty \frac{1}{R_n^2}<\infty$ holds almost surely.

It has puzzled me for a long time. I tried to prove that the series has finite expectation, but in fact its expectation is infinite. Does anyone have a solution?
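For intuition (not a proof), here is a minimal simulation sketch of the hop process; the helper names, the starting distance and the hop count are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vector():
    """A uniformly random direction on the unit sphere in R^3."""
    v = rng.normal(size=3)
    return v / np.linalg.norm(v)

def partial_sums(R=10.0, r=1.0, n_hops=200):
    """One trajectory of the hop process: each hop length equals the current
    distance to the Sun.  Returns the partial sums of 1/R_n^2, stopping the
    bookkeeping if the ship lands inside the ball of radius r."""
    pos = np.array([R, 0.0, 0.0])
    sums, total = [], 0.0
    for _ in range(n_hops):
        d = np.linalg.norm(pos)                   # current distance R_n
        if d < r:                                 # inside the Solar System
            break
        pos = pos + d * random_unit_vector()      # hop of length R_n
        total += 1.0 / np.linalg.norm(pos) ** 2   # add 1/R_{n+1}^2
        sums.append(total)
    return sums

print(partial_sums()[-1])
```

In repeated runs the partial sums settle down after a handful of hops, which is consistent with the claimed almost sure convergence.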

  • 0
There was a common Star Trek computer game in the seventies that could have implemented this principle (not sure). I found [this reference](http://en.wikipedia.org/wiki/Star_Trek_(text_game)#Super_Star_Trek) with a picture at the right. All this to say that it could be a 2D problem and thus convergent! (2012-07-26)

4 Answers

7

$R^2_{n+1}/R^2_n$ has the same distribution as $|u+V|^2$ where $u$ is a fixed unit vector in ${\mathbb R}^3$ and $V$ is a random unit vector (uniform on the unit sphere). I would guess (although I haven't computed it) that $E [\log (|u+V|^2)] > 0$, and the Central Limit Theorem will tell you that almost surely $R_n > c^n$ for some $c > 1$. (EDIT: for sufficiently large $n$; Strong Law of Large Numbers is actually more relevant).

EDIT: Yes, $E[\log(|u+V|^2)] = \int_0^{\pi} \sin(\theta) \log(2 - 2 \cos(\theta))\ d\theta/2 = 2 \log(2) - 1 > 0$.
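A quick Monte Carlo sanity check of that integral (just a sketch; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
V = rng.normal(size=(n, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)    # uniform points on the unit sphere
u = np.array([1.0, 0.0, 0.0])                    # fixed unit vector
est = np.log(np.sum((u + V) ** 2, axis=1)).mean()
print(est, 2 * np.log(2) - 1)                    # both should be close to 0.386
```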

In the two-dimensional version, however, $E[\log(|u+V|^2)] = \int_0^\pi \log(2 - 2 \cos(\theta))\ d\theta/\pi = 0$, so the Law of Large Numbers doesn't tell you whether $R_n$ will go to $0$ or $\infty$. However, the Central Limit Theorem will say $P(R_n < 1) \to 1/2$ as $n \to \infty$. Thus it's certainly not the case in the two-dimensional version that $R_n \to 0$ with probability $1$, nor does $R_n \to \infty$ with probability $1$.
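The same kind of check in the plane (again just a sketch; names, sample sizes and the starting distance are mine): the log-expectation comes out near $0$, and the fraction of trajectories ending inside the unit disc creeps towards $1/2$ as the number of hops grows.

```python
import numpy as np

rng = np.random.default_rng(2)

# E[log|u+V|^2] for a uniformly random unit vector V in R^2
n = 10**6
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
V = np.column_stack([np.cos(theta), np.sin(theta)])
u = np.array([1.0, 0.0])
print(np.log(np.sum((u + V) ** 2, axis=1)).mean())   # close to 0

# Empirical P(R_n < 1) for the planar hop process started at distance 10
def final_distance(R0=10.0, hops=500):
    pos = np.array([R0, 0.0])
    for _ in range(hops):
        a = rng.uniform(0.0, 2.0 * np.pi)
        pos = pos + np.linalg.norm(pos) * np.array([np.cos(a), np.sin(a)])
    return np.linalg.norm(pos)

print(np.mean([final_distance() < 1.0 for _ in range(2000)]))  # drifts towards 1/2
```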

  • 0
Hi, I've posted my query as a new question: http://math.stackexchange.com/questions/1723113/question-on-part-3-of-the-star-trek-problem-in-williams-probability-with-martin (2016-04-01)
3

I think this approach basically works, but there are some details to be completed.

Let $A_k$ be the event that the starship gets back to the Solar System just after the $k$-th space jump. Then $\{A_k\}_{k=1}^\infty$ is a sequence of disjoint events. Obviously $A_k\in\mathcal{F}_k$, where $\mathcal{F}_n$ is the $\sigma$-field generated by the first $n$ jumps.

Let $p_{n+1}=P(A_{n+1}\mid\mathcal{F}_{n})$; then
$$p_{n+1}= \frac{R_n-\sqrt{R_n^2-r^2}}{2R_n},$$
which follows from the area formula for a spherical cap.

(Here I am not sure: $\frac{R_n-\sqrt{R_n^2-r^2}}{2R_n}$ is the conditional probability that, given the ship is outside the Solar System after $n$ jumps, it gets back home on the next (the $(n+1)$-th) jump. Is this the same as $P(A_{n+1}\mid\mathcal{F}_{n})$?)
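As a sanity check on that formula (under the interpretation that "gets back" means the straight-line hop passes through the ball at some point; the function name and the test values are mine), a small Monte Carlo comparison:

```python
import numpy as np

rng = np.random.default_rng(3)

def hit_probability(d, r, n_trials=10**6):
    """Empirical probability that a straight hop of length d, started at distance d
    from the Sun in a uniformly random direction, passes through the ball of radius r."""
    P = np.array([d, 0.0, 0.0])                    # current position, |P| = d
    V = rng.normal(size=(n_trials, 3))
    V /= np.linalg.norm(V, axis=1, keepdims=True)  # uniform random directions
    t = np.clip(-(V @ P), 0.0, d)                  # closest approach within the segment
    closest = P[None, :] + t[:, None] * V
    return np.mean(np.linalg.norm(closest, axis=1) <= r)

d, r = 10.0, 1.0
print(hit_probability(d, r), (d - np.sqrt(d**2 - r**2)) / (2 * d))  # the two values agree
```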

Suppose it is. Then we use the inequality $1-\sqrt{1-x}\geq \frac{x}{2}$ for $0\le x\leq 1$ to derive $p_{n+1}\geq \frac{r^2}{4R_n^2}$, so if $\sum_{n}p_n$ converges, so does $\sum_{n}\frac{1}{R_n^2}$.
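Spelling out that bound, for the record (it is just the same inequality with $x=r^2/R_n^2\in(0,1]$ substituted):

$$p_{n+1}=\frac{R_n-\sqrt{R_n^2-r^2}}{2R_n}=\frac{1-\sqrt{1-x}}{2}\;\ge\;\frac{x/2}{2}=\frac{r^2}{4R_n^2}.$$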

Lévy's extension of the Borel–Cantelli lemma says (see page 124 in Williams's book):

on the set $S=\{\omega:\sum_{n}p_n=\infty\}$, we have $\frac{\sum_{k=1}^n \mathbf{1}_{A_k}}{\sum_{k=1}^n p_k}\rightarrow 1$ almost surely.

But the numerator can only take the values $0$ and $1$ (note the $A_k$'s are disjoint), so the set $S$ must have measure $0$. Off $S$ we have $\sum_{n}p_n<\infty$, and hence $\sum_{n}\frac{1}{R_n^2}<\infty$ almost surely.

  • 0
How can we show that $R_n \to \infty$ a.s.? The problem states that the ship never stops, so we must prove that $R_n \to \infty$, otherwise this proof falls apart. (2013-04-13)
1

By a simple geometrical property: ${\mathbb P}(R_{n+1}^2>2R_n^2)=\frac{1}{2}$.

In fact, we have ($0\le\theta\le\pi$): ${\mathbb P}(R_{n+1}<2\sin(\frac{\theta}{2})R_n)=\frac{1-\cos\theta}{2}$

Let $L_n=\ln(R_n)$ and let $I$ be a random variable such that ${\mathbb P}(I<\ln(2\sin(\frac{\theta}{2})))=\frac{1-\cos\theta}{2}$ for $0\le\theta\le\pi$.

$E(I)>0$ and $V(I)$ is finite. As $L_n$ is the sum of $(n-1)$ independent copies of $I$ (plus the starting term), you can use the central limit theorem: $L_n$ is approximately normal with mean $n\,E(I)$ and variance $n\,V(I)$.

Hence ${\mathbb P}(L_n>2\ln(n)E(I))\rightarrow 1$, hence ${\mathbb P}(R_n>k\,n^2)\rightarrow 1$, so the series is convergent...
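A small numerical check of the two geometric facts used above (just a sketch; the sample size and test angles are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10**6
V = rng.normal(size=(n, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)      # uniform random unit vectors
u = np.array([1.0, 0.0, 0.0])
ratio = np.linalg.norm(u + V, axis=1)              # distribution of R_{n+1}/R_n

print(np.mean(ratio**2 > 2.0))                     # should be close to 1/2
for theta in (np.pi / 3, np.pi / 2, 2 * np.pi / 3):
    print(np.mean(ratio < 2 * np.sin(theta / 2)),  # empirical P(R_{n+1} < 2 sin(theta/2) R_n)
          (1 - np.cos(theta)) / 2)                 # claimed value (1 - cos theta)/2
```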

  • 0
Hi, I've posted my query as a new question: http://math.stackexchange.com/questions/1723113/question-on-part-3-of-the-star-trek-problem-in-williams-probability-with-martin (2016-04-01)
1

In the book it seems to me that the sum terminates when they get home, so if they get home the sum is finite. Williams also hints that it should be clear which theorem to use here; I am sure he is referring to Lévy's extension of the Borel–Cantelli lemma. Up to a constant, $\frac{1}{R_n^2}$ is the conditional probability of getting home at time $n+1$ given that your position at time $n$ is $R_n$: it is the proportion of the area of the hop sphere of radius $R_n$ that meets the Solar System.

  • 0
Would you please check my answer? (2012-07-28)