11

This problem is from the book *Probability with Martingales* by Williams (Exercise 12.3, page 236). It can be stated as follows:

The control system on the starship has gone wonky. All that one can do is to set a distance to be traveled. The spaceship will then move that distance in a randomly chosen direction and then stop. The object is to get into the Solar System, a ball of radius $r$. Initially, the starship is at a distance $R>r$ from the Sun.

Suppose the next hop-length is automatically set to be the current distance to the Sun ("next" and "current" being updated in the obvious way). Let $R_n$ be the distance from the Sun to the starship after $n$ space-hops. Prove that, almost surely, $$\sum_{n=1}^\infty \frac{1}{R_n^2}<\infty.$$

This has puzzled me for a long time. I tried to prove that the series has finite expectation, but in fact its expectation is infinite. Does anyone have a solution?
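
Not part of the book, but a minimal Python sketch of the hop process (my own; the function names, the starting distance $R=10$ and the radius $r=1$ are arbitrary choices for illustration) makes the claim concrete: in sample runs the partial sums of $\sum 1/R_n^2$ settle down quickly.

```python
import numpy as np

def random_direction(rng):
    """Uniform random unit vector in R^3 (the direction of the next hop)."""
    v = rng.normal(size=3)
    return v / np.linalg.norm(v)

def simulate(R=10.0, r=1.0, hops=200, seed=0):
    """One trajectory: return the distances R_n and the partial sums of 1/R_n^2."""
    rng = np.random.default_rng(seed)
    x = np.array([R, 0.0, 0.0])            # start at distance R from the Sun (the origin)
    dists, partial, total = [], [], 0.0
    for _ in range(hops):
        d = np.linalg.norm(x)
        if d <= r:                          # inside the Solar System: the ship would stop here
            break
        x = x + d * random_direction(rng)   # hop of length d in a uniform random direction
        d = np.linalg.norm(x)
        dists.append(d)
        total += 1.0 / d**2
        partial.append(total)
    return np.array(dists), np.array(partial)

for seed in range(3):
    dists, partial = simulate(seed=seed)
    print(f"seed {seed}: hops = {len(dists)}, got home = {dists.min() <= 1.0}, "
          f"partial sum of 1/R_n^2 = {partial[-1]:.4f}")
```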

  • Would love to see why the problem is called as such. (2012-07-26)
  • Is it William T. Riker? (2012-07-26)
  • Is the fact that it's 3-dimensional relevant? Do you know anything about what happens in other dimensions? (2012-07-26)
  • Why not try the problem in the one-dimensional setting and see what happens? (2012-07-26)
  • @echoone Then it's a 1-d discrete random walk, isn't it? And there you should have a Gaussian distribution... (2012-07-26)
  • No, it's David Williams. (2012-07-26)
  • @zemora You never watched The Next Generation... (2012-07-26)
  • In 2 dimensions $R_n$ will converge to zero with probability 1, but in 3 dimensions $R_n$ will diverge to infinity with probability 1. (2012-07-26)
  • I can't think of any specific Star Trek episode that featured random jumps of this sort. It might be just as appropriate to call it the "Lost in Space" problem. (2012-07-26)
  • Will they ever come home? @RobertIsrael Maybe it's referring to Star Trek: Voyager... (2012-07-26)
  • There was a common Star Trek computer game in the seventies that could have implemented this principle (not sure). I found [this reference](http://en.wikipedia.org/wiki/Star_Trek_(text_game)#Super_Star_Trek) with a picture at the right. All this to say that it could be a 2D problem and thus convergent! (2012-07-26)

4 Answers

7

$R^2_{n+1}/R^2_n$ has the same distribution as $|u+V|^2$ where $u$ is a fixed unit vector in ${\mathbb R}^3$ and $V$ is a random unit vector (uniform on the unit sphere). I would guess (although I haven't computed it) that $E [\log (|u+V|^2)] > 0$, and the Central Limit Theorem will tell you that almost surely $R_n > c^n$ for some $c > 1$. (EDIT: for sufficiently large $n$; Strong Law of Large Numbers is actually more relevant).

EDIT: Yes, $E[\log(|u+V|^2)] = \int_0^{\pi} \sin(\theta) \log(2 - 2 \cos(\theta))\ d\theta/2 = 2 \log(2) - 1 > 0$.
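
For what it's worth, here is a quick numerical sanity check of that value (my own sketch, not part of the answer); the Monte Carlo mean of $\log|u+V|^2$ should land near $2\log 2-1\approx 0.386$.

```python
import numpy as np

rng = np.random.default_rng(1)
u = np.array([1.0, 0.0, 0.0])                   # fixed unit vector

# V uniform on the unit sphere in R^3, via normalized Gaussians
V = rng.normal(size=(10**6, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)

mc = np.log(((u + V)**2).sum(axis=1)).mean()    # Monte Carlo estimate of E[log|u+V|^2]
print(f"Monte Carlo: {mc:.4f},  2*log(2) - 1 = {2*np.log(2) - 1:.4f}")
```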

In the two-dimensional version, however, $E[\log(|u+V|^2)] = \int_0^\pi \log(2 - 2 \cos(\theta))\ d\theta/\pi = 0$, so the Law of Large Numbers doesn't tell you whether $R_n$ will go to $0$ or $\infty$. However, the Central Limit Theorem will say $P(R_n < 1) \to 1/2$ as $n \to \infty$. Thus it's certainly not the case in the two-dimensional version that $R_n \to 0$ with probability $1$, nor does $R_n \to \infty$ with probability $1$.
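
Again just as a sanity check (mine; the parameters $R=10$, $r=1$ and the hop counts are arbitrary), the two-dimensional quantities behave as described: the mean of $\log|u+V|^2$ is near $0$, and the fraction of runs ending inside the unit disc drifts towards $1/2$, though slowly, since $\log R_n$ is a mean-zero random walk. Here the walk is not stopped when it enters the disc, matching the multiplicative-walk description above.

```python
import numpy as np

rng = np.random.default_rng(2)

# 2-D: V uniform on the unit circle, u = (1, 0); then |u+V|^2 = 2 + 2*cos(theta)
theta = rng.uniform(0.0, 2.0 * np.pi, size=10**6)
print("2-D mean of log|u+V|^2:", np.log(2.0 + 2.0 * np.cos(theta)).mean())

def final_distance(R=10.0, hops=400):
    """Distance to the origin after `hops` two-dimensional space-hops."""
    x = np.array([R, 0.0])
    for _ in range(hops):
        ang = rng.uniform(0.0, 2.0 * np.pi)
        x = x + np.linalg.norm(x) * np.array([np.cos(ang), np.sin(ang)])
    return np.linalg.norm(x)

frac = np.mean([final_distance() < 1.0 for _ in range(1000)])
print("fraction of 2-D runs with R_n < 1:", frac)   # drifts towards 1/2 as hops grows
```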

  • How do you derive that $R^2_{n+1}/R^2_n$ has a distribution like $|u+V|^2$? (2012-07-26)
  • Let me suggest that the central limit theorem is not the relevant tool here, rather the law of large numbers. (2012-07-26)
  • @zemora: $X_{n+1} = X_n + R_n V$ where $V$ is uniform on the sphere and independent of $X_n$, so $R_{n+1} = |X_n + R_n V| = R_n |U + V|$ where $U = X_n/R_n$. Now $U = M u$ for some orthogonal matrix $M$, so $|U+V| = |u + M^{-1} V|$. But $M^{-1} V$ has the same distribution as $V$, namely uniform on the unit sphere and independent of $X_n$. (2012-07-26)
  • @RobertIsrael: would you please check my answer? It's a different way posted by mike. (2012-07-28)
  • Doesn't this prove the stronger result that $\sum R^{-1}_n<\infty$ almost surely? (2016-04-01)
  • Hi, I've posted my query as a new question: http://math.stackexchange.com/questions/1723113/question-on-part-3-of-the-star-trek-problem-in-williams-probability-with-martin (2016-04-01)
3

I think this approach basically works, but there are some details to be completed.

Let $A_k$ be the event that the starship gets back into the Solar System just after the $k$-th space jump. Then $\{A_k\}_{k=1}^\infty$ is a sequence of disjoint events. Clearly $A_k$ is $\mathcal{F}_k$-measurable, where $\mathcal{F}_n$ is the $\sigma$-field generated by the first $n$ jumps.

Let $p_{n+1}=P(A_{n+1}\mid\mathcal{F}_{n})$; then $$p_{n+1}= \frac{R_n-\sqrt{R^2_n-r^2}}{2R_n}.$$

This follows from the area formula for a spherical cap.

(Here I am not sure: $\frac{R_n-\sqrt{R^2_n-r^2}}{2R_n}$ is the conditional probability that, given the ship is outside the Solar System after $n$ jumps, it goes back home on the next (the $(n+1)$-th) jump. Is that the same as $P(A_{n+1}\mid\mathcal{F}_{n})$?)

Suppose it is. Then we use the inequality $$1-\sqrt{1-x}\geq \frac{x}{2}\quad (0\le x\le 1)$$ with $x=r^2/R_n^2$ to derive $$p_{n+1}\geq \frac{r^2}{4R^2_n},$$ so if $\sum_{n}p_n$ converges, then so does $\sum_{n}\frac{1}{R^2_n}$.

Lévy's extension of the Borel–Cantelli lemma (see page 124 of Williams's book) says that on the set $S=\{\omega:\sum_{n}p_n=\infty\}$ we have, almost surely, $$\frac{\sum_{k=1}^n \mathbf{1}_{A_k}}{\sum_{k=1}^n p_k}\rightarrow 1.$$

But the numerator can only take the values $0$ and $1$ (note that the $A_k$ are disjoint), so the set $S$ must have measure $0$.
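
As a small numerical aside (my addition, not part of the argument above), one can check the two ingredients of the bound directly: the elementary inequality $1-\sqrt{1-x}\ge x/2$ on $[0,1]$, and the resulting estimate $p_{n+1}\ge r^2/(4R_n^2)$ for the cap-probability formula, here with $r=1$ and a few arbitrary values of $R_n$.

```python
import numpy as np

# the elementary inequality 1 - sqrt(1 - x) >= x/2 on [0, 1]
x = np.linspace(0.0, 1.0, 1001)
assert np.all(1.0 - np.sqrt(1.0 - x) >= x / 2.0)

# the resulting bound on the cap probability, with r = 1 and a few values of R_n
r = 1.0
for R in [2.0, 5.0, 10.0, 50.0]:
    p = (R - np.sqrt(R**2 - r**2)) / (2.0 * R)   # formula used in the answer
    bound = r**2 / (4.0 * R**2)                  # the lower bound derived from it
    print(f"R = {R:5.1f}:  p = {p:.6f},  r^2/(4R^2) = {bound:.6f}")
```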

  • That issue you are worried about isn't a problem if $A_n$ is the event of being inside, instead of the first time you are inside. Then you have $\sum \frac 1 {R_n} < \infty$ for 2 different reasons: 1) because they get home and the process stops, and 2) because they don't get home, in which case $R_n \rightarrow \infty$, as the same sort of Borel–Cantelli argument shows. (2012-07-30)
  • How can we show that $R_n \to \infty$ a.s.? The problem states that the ship never stops, so we must prove that $R_n \to \infty$, otherwise this proof falls apart. (2013-04-13)
1

By a simple geometric property, ${\mathbb P}(R_{n+1}^2>2R_n^2)=\frac{1}{2}$.

In fact, for $0\le\theta\le\pi$ we have $${\mathbb P}\left(R_{n+1}<2\sin(\tfrac{\theta}{2})R_n\right)=\frac{1-\cos\theta}{2}.$$

Let $L_n=\ln(R_n)$, and let $I$ be a random variable with ${\mathbb P}(I<\ln(2\sin(\tfrac{\theta}{2})))=\frac{1-\cos\theta}{2}$, i.e. $I$ is distributed like $\ln(R_{n+1}/R_n)$.

Then $E(I)>0$ and $V(I)$ is finite. Since $L_n$ is (up to the constant $\ln R$) a sum of independent copies of $I$, you can use the central limit theorem: $L_n$ is approximately normal with mean $n\,E(I)$ and variance $n\,V(I)$.

Hence ${\mathbb P}(L_n>2\ln(n))\rightarrow 1$ (since $2\ln n=o(n\,E(I))$), hence ${\mathbb P}(R_n>n^2)\rightarrow 1$, so the series converges...
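A short simulation (my own sketch; the seed, starting distance and hop counts are arbitrary) of $L_n=\ln R_n$ shows the linear growth this argument relies on, with empirical slope close to $E(I)=\tfrac12\,E[\log|u+V|^2]=\tfrac{2\ln 2-1}{2}\approx 0.193$.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_distance_path(R=10.0, hops=1000):
    """One trajectory of L_n = ln(R_n) for the 3-D hop process."""
    x = np.array([R, 0.0, 0.0])
    L = []
    for _ in range(hops):
        d = np.linalg.norm(x)
        v = rng.normal(size=3)
        x = x + d * v / np.linalg.norm(v)      # hop of length d in a uniform direction
        L.append(np.log(np.linalg.norm(x)))
    return np.array(L)

# average the empirical growth rate (L_n - ln R)/n over a few independent paths
slopes = [(log_distance_path()[-1] - np.log(10.0)) / 1000 for _ in range(20)]
print("empirical slope of L_n :", np.mean(slopes))
print("(2*log(2) - 1)/2       :", (2.0 * np.log(2.0) - 1.0) / 2.0)
```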

  • In fact $E(I)=\frac{2\ln(2)-1}{2}>0$. (2012-07-26)
  • Let me suggest that the central limit theorem is not the relevant tool here, rather the law of large numbers: (1) $L_n/n\to E(I)$ almost surely, hence for every $x\lt E(I)$, $P(L_n\gt nx)\to1$; (2) since $E(I)\gt0$, this holds for some $x\gt0$. Done. (2012-07-26)
  • How do you conclude that the series converges? (2016-04-01)
  • Doesn't this prove the stronger result that $\sum R^{-1}_n<\infty$ almost surely? You seem to get an exponential bound on $R_n$. (2016-04-01)
  • Hi, I've posted my query as a new question: http://math.stackexchange.com/questions/1723113/question-on-part-3-of-the-star-trek-problem-in-williams-probability-with-martin (2016-04-01)
1

In the book it seems to me that the sum terminates when they get home, so if they get home the sum is finite. Williams also hints that it should be clear what theorem to use here; I am sure he is referring to Lévy's extension of the Borel–Cantelli lemma. Up to a constant factor, $\frac 1 {R_n^2}$ is the conditional probability of getting home at time $n+1$ given that your position at time $n$ is $R_n$: it is the proportion of the area of the sphere of radius $R_n$ that falls inside the ball of radius $r$ around the Sun.
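
To back up the "proportion of the sphere's surface area" heuristic, here is a sketch of mine (the values of $R$ and $r=1$ are arbitrary) that estimates the one-hop probability of landing inside the ball by Monte Carlo; the product $p\cdot R^2$ stays roughly constant across $R$.

```python
import numpy as np

rng = np.random.default_rng(4)

def one_hop_home_prob(R, r=1.0, samples=10**6):
    """Estimate P(|x + R*V| <= r) with |x| = R and V uniform on the unit sphere."""
    x = np.array([R, 0.0, 0.0])
    V = rng.normal(size=(samples, 3))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    return np.mean(np.linalg.norm(x + R * V, axis=1) <= r)

for R in [3.0, 6.0, 12.0, 24.0]:
    p = one_hop_home_prob(R)
    print(f"R = {R:5.1f}:  p = {p:.6f},  p * R^2 = {p * R**2:.4f}")  # p * R^2 roughly constant
```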

  • How did you get the probability $1/R^2_n$? (2012-07-26)
  • You have a sphere of radius $R_n$ touching the origin, and you're going to move somewhere on its surface; the probability that you go home is the probability that you move to the part that intersects the ball of radius $r$ around the origin. You could probably compute that exactly, but is it clear that, since the surface area is proportional to $R_n^2$, the probability of a fixed patch on it is of order $\frac 1 {R_n^2}$? (2012-07-26)
  • The probability is $\frac{R-\sqrt{R^2-r^2}}{2R}$, not of order $1/R^2$. (2012-07-27)
  • Your expression is of order $\frac 1 {R^2}$, viz. $R - \sqrt{R^2 - r^2} = R - R \sqrt{1-\frac {r^2}{R^2}} \approx R \left( 1-\left(1-\tfrac 12 \left(\tfrac rR\right)^2\right)\right) = \frac{r^2}{2R}$, so $\frac{R-\sqrt{R^2-r^2}}{2R}\approx\frac{r^2}{4R^2}$. (2012-07-27)
  • Would you please check my answer? (2012-07-28)