
While reading Steven Finch's amazing book Mathematical Constants I once encountered Grossman's constant. This is an interesting constant $c$ defined as the unique $x_1\in\mathbb{R}$ such that the sequence $\{x_n\}_{n=0}^\infty$ defined by the recurrence:

$$x_{n}=\frac{x_{n-2}}{1+x_{n-1}}$$

for $n\ge2$, with $x_0=1$, converges; numerically $c\approx 0.73733830336929\ldots$. This seems like quite a remarkable theorem, and I have no idea how to go about proving that a recurrence of this form converges for a single value, although it seems to have something to do with the limiting behaviour of the odd and even terms. I do not have access to the paper referenced by Finch and MathWorld in which the proof is apparently given, so I am wondering, at the very least, what techniques were used to prove it.

My question is: Does anyone know of (or can come up with) a proof (or even the idea of a proof) that this sequence converges for a unique $x_1$? Also, is any closed form for $c$ yet known?
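A quick numerical experiment (a sketch only, nothing rigorous; the value of $c$ used below is just the truncated approximation quoted above) illustrates both the slow decay when $x_1$ is near $c$ and the even/odd dichotomy away from it:

```python
# Numerical experiment (not a proof): iterate x_{n+2} = x_n / (1 + x_{n+1})
# and watch the two subsequences for different choices of x_1.

def iterate(x1, n, x0=1.0):
    """Return the first n+1 terms x_0, ..., x_n of the sequence."""
    xs = [x0, x1]
    while len(xs) <= n:
        xs.append(xs[-2] / (1 + xs[-1]))
    return xs

c = 0.73733830336929  # truncated approximation of Grossman's constant

# With x1 = c, both subsequences decay slowly toward 0 together
# (the whole sequence stays decreasing)...
xs = iterate(c, 100)
print(xs[99], xs[100])

# ...while for x1 = 1 the even-indexed terms rush to 0 and the
# odd-indexed terms settle near a positive limit.
ys = iterate(1.0, 101)
print(ys[100], ys[101])
```

Since a finite-precision $x_1$ only approximates $c$, the first sequence will also eventually stop decreasing if iterated long enough; the truncation error just takes many more steps to show.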

  • So is it a $+$ or a $-$? (2017-02-03)
  • Wolfram states it with a $+$. I corrected the body of the message, but I forgot to change the sign in the title too. Thanks for noticing it! (2017-02-03)
  • @mau Thanks for noticing that, I don't know how that got through. (2017-02-05)
  • As for the problem, I think a starting point for proving it is to show that the subsequences of even and odd terms are (eventually?) monotone, so that they have a limit, and to study the behaviour of these limits as $x_1$ varies. (2017-02-06)
  • @mau I thought about that, but I'm pretty stumped when it comes to actually putting it into practice. They don't seem very nice subsequences. (2017-02-08)
  • Well, I made some numerical experiments and I noticed that if $x_1$ is "large" ($\ge 1$) then $x_{2n}$ goes to $0$, while if $x_1$ is "small" ($\le 0.5$) then $x_{2n-1}$ goes to $0$. My guess is that this is true for each $x_1$ different from Grossman's constant. (2017-02-08)
  • The link you posted, http://mathworld.wolfram.com/GrossmansConstant.html, states that $x_1=c$ and that "no analytic form is known for this constant, either as the root of a function or as a combination of other constants". (2017-02-08)
  • @rtybase I was aware of that; I just added that in case one had been found recently (since MathWorld isn't always very up-to-date, especially on obscure topics). (2017-02-09)

1 Answer


This is not an answer, but here is a collection of facts about the sequence:

If $x_0, x_1 \ge 0$ then $x_n \ge 0$ for all $n$, and $x_{n+2} = \frac{x_n}{1+x_{n+1}} \le x_n$, so the two subsequences
$(x_{2n})$ and $(x_{2n+1})$ are decreasing and therefore have limits $l_0$ and $l_1$.

If one of the limits is nonzero, then the other subsequence converges to $0$ exponentially (if, say, $l_1 > 0$, then $x_{2n+2} \le x_{2n}/(1+l_1)$), so at least one of the two limits has to be $0$. Then we have to prove that for all $x_0 \ge 0$ there is a unique $x_1 \ge 0$ such that the sequence converges to $0$.

A long computation shows that
$(x_{n+3} - x_{n+2}) - (x_{n+1} - x_n) = \frac {x_n^2 x_{n+1}}{(1+x_{n+1})(1+x_n+x_{n+1})} \ge 0$,

and so the sequences $(x_{2n+1}-x_{2n})$ and $(x_{2n+2}-x_{2n+1})$ are increasing. In particular, as soon as one of them becomes positive, we know that the sequence will not converge. Conversely, if $(x_{2n})$ doesn't converge to $0$ then $(x_{2n+1})$ converges to $0$, and so we must have $x_{2n+1} - x_{2n} > 0$ at some point, and similarly in the other case.
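For readers who want to reproduce the "long computation": substituting the recurrence twice gives

$$1+x_{n+2} = \frac{1+x_n+x_{n+1}}{1+x_{n+1}}, \qquad x_{n+3} = \frac{x_{n+1}}{1+x_{n+2}} = \frac{x_{n+1}(1+x_{n+1})}{1+x_n+x_{n+1}},$$

hence

$$x_{n+3}-x_{n+1} = -\frac{x_n x_{n+1}}{1+x_n+x_{n+1}}, \qquad x_{n+2}-x_n = -\frac{x_n x_{n+1}}{1+x_{n+1}},$$

and subtracting,

$$(x_{n+3}-x_{n+2})-(x_{n+1}-x_n) = (x_{n+3}-x_{n+1})-(x_{n+2}-x_n) = x_n x_{n+1}\left(\frac{1}{1+x_{n+1}}-\frac{1}{1+x_n+x_{n+1}}\right) = \frac{x_n^2\,x_{n+1}}{(1+x_{n+1})(1+x_n+x_{n+1})}.$$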

This means that $(x_n)$ converges to $0$ if and only if it stays decreasing forever, and we can decide if a particular sequence doesn't converge to $0$ by computing the sequence until it stops decreasing.
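This decision procedure suggests a bisection sketch (the bracket $[0.5, 1]$ and the step bound are ad hoc assumptions taken from the comments, not part of the answer): when the sequence first stops decreasing, the parity of the failing index tells us whether the surviving subsequence is the odd or the even one, i.e. whether $x_1$ was too large or too small.

```python
# Bisect on x1: the sequence converges to 0 iff it stays decreasing,
# and the parity of the first index where it stops decreasing tells us
# on which side of the critical x1 we are.

def violation_parity(x1, x0=1.0, max_steps=4000):
    """Iterate until x_idx >= x_{idx-1}; return idx % 2, or None if the
    sequence is still decreasing after max_steps terms."""
    a, b = x0, x1           # a = x_{idx-1}, b = x_idx
    idx = 1
    while idx < max_steps:
        if b >= a:          # sequence stopped decreasing here
            return idx % 2  # odd: x1 too large; even: x1 too small
        a, b = b, a / (1 + b)
        idx += 1
    return None

lo, hi = 0.5, 1.0           # bracket suggested by the comments above
for _ in range(50):
    mid = (lo + hi) / 2
    if violation_parity(mid) == 1:
        hi = mid            # failed at an odd index: x1 too large
    else:
        lo = mid            # failed at an even index (or undecided)
approx = (lo + hi) / 2
print(approx)               # approaches 0.7373383...
```

The achievable accuracy is limited by `max_steps`, not by the number of bisection rounds: close to the critical value the sequence keeps decreasing for a long time before the violation shows up.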

It also follows that the set $\{(x_0,x_1) \in\Bbb R_+^2\mid \lim x_n = 0\}$ is a closed subset of $\Bbb R_+^2$.

  • This is certainly useful. +1. I'll have a closer look at it later. (2017-02-08)