
I have had big problems finding the limit of the sequence $x_1=0$, $x_{n+1}=\frac{1}{1+x_n}$. So far I've only succeeded in proving that for $n\geq2$: $x_n>0\Rightarrow x_{n+1}>0$.

(Hopefully that much is correct: it is true for $n=2$, and $x_{n+1}>0$ exactly when $\frac{1}{1+x_n}>0$, which leads to the inequality $x_n>-1$; that holds by the induction hypothesis $x_n>0$.)

On everything else I failed to come up with answers that make sense (such as proving that $x_{n+1}>x_n$ for all $n\geq1$). I'm new to recursive sequences, so it all seems like a world behind the mirror right now. I'd appreciate any help, thanks!
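For concreteness, here is a quick numerical look at the first terms (a sketch I wrote in Python, not part of the original question; the helper names are mine). The terms turn out to be ratios of consecutive Fibonacci numbers and settle near $0.618$:

```python
# Iterate x_{n+1} = 1/(1+x_n) starting from x_1 = 0.
# The terms are ratios of consecutive Fibonacci numbers, x_n = F_{n-1}/F_n
# (with F_1 = F_2 = 1), and they approach (sqrt(5)-1)/2 ~ 0.618.

def sequence(n_terms):
    """Return [x_1, ..., x_n] for x_1 = 0, x_{k+1} = 1/(1+x_k)."""
    xs = [0.0]
    for _ in range(n_terms - 1):
        xs.append(1.0 / (1.0 + xs[-1]))
    return xs

xs = sequence(12)       # x_2 = 1, x_3 = 1/2, x_4 = 2/3, x_5 = 3/5, ...

fib = [1, 1]            # Fibonacci numbers F_1, F_2, F_3, ...
for _ in range(12):
    fib.append(fib[-1] + fib[-2])

limit = (5 ** 0.5 - 1) / 2
```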

  • Have you tried writing down the first 5 or 10 terms of the sequence? You know $x_1=0$, so $x_2=1$, $x_3=1/2$, and so on. This usually helps get me going. (2012-11-12)
  • I did; it's basically fibonacci(n)/fibonacci(n+1), or that much I could tell. But I'm not sure how to use that knowledge. (2012-11-12)
  • Try solving $x=\dfrac{1}{1+x}$. It does not prove convergence, but one of the solutions is the limit if such a limit exists. (2012-11-12)
  • @DahnJahn: or you can use the closed-form formula for the Fibonacci numbers. (2012-11-12)
  • Couldn't I maybe prove that it is bounded and increasing without proving its similarity to Fibonacci? (2012-11-12)
  • It's bounded, but not increasing. You will see from my answer that it oscillates around the limit as it approaches. (2012-11-12)
  • A different viewpoint is that this sequence is the sequence of convergents to the continued fraction $\cfrac{1}{1 + \cfrac{1}{1 + \ddots}}$. If we've already proved some theorems about convergence of continued fractions, we can invoke them. (2012-11-12)
  • http://math.stackexchange.com/questions/38739/convergence-of-a-n1-frac11a-n (2013-04-20)

5 Answers


It is obvious that $f:x\mapsto\frac1{1+x}$ is a monotonically decreasing continuous function $\mathbf R_{\geq0}\to\mathbf R_{\geq0}$, and it is easily computed that $\alpha=\frac{-1+\sqrt5}2\approx0.618$ is its only fixed point (solution of $f(x)=x$). So $f^2:x\mapsto f(f(x))$ is a monotonically increasing function that maps the interval $[0,\alpha)$ into itself.

Since $x_3=f^2(x_1)=\frac12>0=x_1$, one now sees by induction that $(x_1,x_3,x_5,\ldots)$ is an increasing sequence bounded by $\alpha$. It then has a limit, which must be a fixed point of $f^2$ (the function mapping each term of the sequence to the next term). One checks that on $\mathbf R_{\geq0}$ the function $f^2$ has no fixed point other than the one of $f$, which is $\alpha$, so that must be the value of the limit.

The sequence $(x_2,x_4,x_6,\ldots)$ is obtained by applying $f$ to $(x_1,x_3,x_5,\ldots)$, so by continuity of $f$ it is also convergent, with limit $f(\alpha)=\alpha$. Then $\lim_{n\to\infty}x_n=\alpha$.
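The structure this answer describes can be sanity-checked numerically (a sketch I added, assuming nothing beyond the recursion itself): the odd-indexed terms increase toward $\alpha$ and the even-indexed terms decrease toward it.

```python
# Check that with f(x) = 1/(1+x), the odd-indexed terms increase toward
# alpha = (sqrt(5)-1)/2 while the even-indexed terms decrease toward it.

def f(x):
    return 1.0 / (1.0 + x)

alpha = (5 ** 0.5 - 1) / 2
xs = [0.0]
for _ in range(30):
    xs.append(f(xs[-1]))

odd = xs[0::2]   # x_1, x_3, x_5, ...
even = xs[1::2]  # x_2, x_4, x_6, ...

assert all(a < b for a, b in zip(odd, odd[1:]))    # strictly increasing
assert all(a > b for a, b in zip(even, even[1:]))  # strictly decreasing
assert all(x < alpha for x in odd)                 # odd terms stay below alpha
assert all(x > alpha for x in even)                # even terms stay above alpha
```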

  • I like this solution. You just prove $(x_1,x_3,\ldots)$ is increasing (and hence converges), then use continuity of $f$ to prove $(x_2,x_4,\ldots)$ converges to the same limit. (2012-11-12)
  • Thanks! Your answer in particular got me to the final idea I used to prove this, only I used the sort of math I know how to use :) (2012-11-13)

Here is a way to show that the sequence converges to the unique positive solution to $a=1/(1+a)$: Define $f(x)=1/(1+x)$. Then $f'(x)=-1/(1+x)^2$. For every $n$, the mean value theorem gives $$x_{n+1}-a=f(x_n)-f(a)=(x_n-a)f'(c_n)$$ for some $c_n$ between $x_n$ and $a$. Since $-1<f'(x)<0$ for all $x>0$, this shows that $\lvert x_n-a\rvert$ decreases with $n$. Moreover, we already have $f'(1/2)=-4/9$, and $f'$ is increasing, so $-4/9\leq f'(c_n)<0$ for all $n\ge2$. Thus $\lvert x_n-a\rvert$ decreases at least as fast as $(4/9)^n$, i.e., geometrically. In particular, $x_n-a\to0$.

Addendum: To do this without differentiation, just rely on the computation $$f(x)-f(y)=\frac1{1+x}-\frac1{1+y}=\frac{y-x}{(1+x)(1+y)}$$ instead.
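The contraction estimate can be illustrated numerically (a sketch I added, not part of the answer): once $n\ge2$ we have $x_n\ge1/2$ and $a\ge1/2$, so the factor $\frac1{(1+x_n)(1+a)}$ from the addendum's identity is at most $4/9$, and the error shrinks at least that fast per step.

```python
# Illustrate the derivative-free contraction estimate: from
# f(x) - f(y) = (y - x)/((1+x)(1+y)), with x_n >= 1/2 for n >= 2 and
# a = (sqrt(5)-1)/2 >= 1/2, each step shrinks |x_n - a| by a factor <= 4/9.

def f(x):
    return 1.0 / (1.0 + x)

a = (5 ** 0.5 - 1) / 2
x = 0.0
errors = []
for _ in range(25):
    x = f(x)
    errors.append(abs(x - a))   # errors[k] = |x_{k+2} - a|

# from the step x_3 -> x_4 onward, each error is at most 4/9 of the last
ratios = [e2 / e1 for e1, e2 in zip(errors[1:], errors[2:]) if e1 > 0]
assert all(r <= 4/9 + 1e-12 for r in ratios)
```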

  • Isn't this sequence a Cauchy one? (2012-11-12)
  • It converges, therefore it is Cauchy. But yes, you can use this sort of argument to show the Cauchy property directly; I wouldn't know if the OP is expected to know about Cauchy sequences. (2012-11-12)
  • Exactly! In fact, I wish the OP would consider why (and how) you took $f(x)$ as defined above. Yours here is perfect, Harald. +1 (2012-11-12)
  • The problem would be that I'm not even supposed to "know" of differentiation and the mean value theorem yet (first uni semester...). I feel bad for putting such restraints on the method of solution, and although I shall definitely take a look at this method, I can't use it for my proof. But thanks for broadening my horizons :) (2012-11-12)
  • @DahnJahn: You can do without the derivative too. I added something to get you going. (2012-11-12)
  • Thanks. I do understand your approach, although at this point I'd never come up with it myself. Still, isn't there a more 'elementary' way of proving this? Or rather, there has to be one! (2012-11-12)

Firstly, I'd like to thank everybody who contributed! Your insight helped me greatly in understanding the problem. Here is my solution, one that I feel is more down to earth in terms of the mathematical skills used.

It is easy enough to prove by induction that $x_n\in[0,1]$ for all $n$. Now we definitely have a bounded sequence.

However, it is not monotonic, and after some investigation one can see that it seems to oscillate around its limit $\frac{\sqrt{5}-1}{2}$ (that is, if it actually converges!).

Now the crucial step. I used the expression $x_{n+2}=\cfrac{1}{1+\cfrac{1}{1+x_n}}$ to check for which values it is monotonic, which led to this:

$x_{n+2}>x_n$ for $x_n\in(0,\frac{\sqrt{5}-1}{2})$

$x_{n+2}<x_n$ for $x_n>\frac{\sqrt{5}-1}{2}$

Thus we have two subsequences, both of which are bounded and monotonic, converging to the same limit from two different sides.

This answer probably isn't complete, but the additional proofs needed are trivial enough; this was the most important step I needed to take to get to the answer.
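The crucial comparison can be sanity-checked numerically (my own sketch; the function name is mine). Simplifying, $x_{n+2}=\frac{1+x_n}{2+x_n}$, and the two-step map moves points toward $\frac{\sqrt5-1}{2}$ from either side:

```python
# Check that the two-step map x -> 1/(1 + 1/(1+x)) = (1+x)/(2+x) moves
# points toward alpha = (sqrt(5)-1)/2 from below and from above.

def two_step(x):
    return 1.0 / (1.0 + 1.0 / (1.0 + x))

alpha = (5 ** 0.5 - 1) / 2

for x in [0.0, 0.1, 0.3, 0.5]:   # starting points below alpha
    assert x < two_step(x) < alpha
for x in [0.7, 0.9, 1.0]:        # starting points above alpha
    assert alpha < two_step(x) < x
```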

  • I wonder which *mathematical skills* my answer is using that are not *down to earth*. (2012-11-13)
  • Your answer seems to be mathematically beautiful, but even after some working with pen and paper, I wasn't able to understand what is going on there. I wish I did! (2012-11-13)
  • Did you try to compute $x_{n+1}-\omega$ in terms of $x_n$? Once again, I wonder what could prevent you from reaching the first formula stated in my answer (especially since you have some pen and paper at your disposal...), but surely I am missing something. (2012-11-13)
  • I like that approach. +1 (2012-12-30)

Let $\omega\gt0$ be such that $\omega^2+\omega=1$. Then $x_{n+1}-\omega=-\omega\cdot\frac1{1+x_n}\cdot(x_n-\omega)$, hence $$ |x_n-\omega|\leqslant\omega^n\cdot|x_0-\omega|, $$ and the rest is easy since $\omega\lt1$.
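This bound can be verified numerically (a sketch I added; like the answer, it indexes the sequence from $x_0=0$):

```python
# Verify |x_n - omega| <= omega**n * |x_0 - omega| along the iteration,
# where omega = (sqrt(5)-1)/2 is the positive root of w**2 + w = 1.

omega = (5 ** 0.5 - 1) / 2
x = 0.0                      # x_0
x0_err = abs(x - omega)
for n in range(1, 30):
    x = 1.0 / (1.0 + x)      # x_n
    # small tolerance absorbs floating-point rounding; the first step
    # attains the bound with equality in exact arithmetic
    assert abs(x - omega) <= omega ** n * x0_err + 1e-12
```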


If you're willing to assume the sequence converges to a number $x$, then you can see that $x = \frac{1}{1 + x}$, and then you can solve for $x$.
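In code (a sketch of this "assume the limit exists" computation; the variable names are mine): $x=\frac1{1+x}$ rearranges to $x^2+x-1=0$, and only the positive root can be the limit of a sequence of nonnegative terms.

```python
# Under the assumption that the limit x exists, x = 1/(1+x) gives
# x**2 + x - 1 = 0; the quadratic formula yields two candidates, and only
# the positive one is compatible with x_n >= 0.
import math

roots = [(-1 + math.sqrt(5)) / 2, (-1 - math.sqrt(5)) / 2]
candidate = max(roots)   # the positive root, approximately 0.618

assert abs(candidate * (1 + candidate) - 1) < 1e-12   # satisfies x(1+x) = 1
```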

  • It is a very dangerous practice to be willing to assume a limit exists and then compute with it as if it really exists to find out what it would be if it were to exist. What if it does not exist? The question was about finding the limit, and not about finding the value of the limit, should that exist. (2012-11-12)
  • Then I'd get $x=\frac{-1\pm\sqrt5}{2}$, of which the positive one will be the right candidate. How to prove that it really does converge to that, though? edit: oh, and sorry, I actually do want to find the value of the limit! (2012-11-12)
  • @IttayWeiss: Yes and no. You shouldn't stop there, but it makes very good sense to find out first what the limit must be if it exists, and then try to show that the sequence does (or does not) converge to that limit. (2012-11-12)
  • I agree that it's important to be careful about issues of convergence, but a different viewpoint is "too much rigor teaches rigor mortis". There's some value in doing "Eulerian math" as well. (2012-11-12)
  • @Harald: Yes and no :) It is a good technique since it would either show the limit does not exist or at least give you (several) candidates. But the question posted was about finding the limit, and thus the solution offered, I find, reinforces dangerous tendencies of not worrying about existence proofs. (2012-11-12)
  • I wouldn't worry, because I am definitely worried about the existence proofs, haha :) (2012-11-12)
  • @littleO: I'm afraid I must disagree with you, at least for the amount of rigor required for this question. Euler, upon wrongly analyzing problems requiring about the same amount of rigor, concluded that $-1>\infty$ and proceeded to conclude that God exists. (2012-11-12)
  • @IttayWeiss: I suspect we are in substantial agreement. Still, I thought littleO's point was a good one. We sure do know that Euler did some hairy things by today's standards, but he seemed to have a knack for getting it right anyhow. I hadn't heard the anecdote you refer to, though. If it is true, I suspect it was tongue in cheek. (2012-11-12)
  • The apocryphal anecdote about Euler (taken from a footnote of "Adventures in Formalism", Smorynski): ... him [Euler] being called upon to silence Denis Diderot, whose atheistic utterances ... Euler stood up and said, "Monsieur! $(a+b^n)/n=x$, therefore God exists; respond!". Other mathematicians of the time found religious truth in nonsense formulas derived formally (e.g., Bolzano, Grandi, Leibniz...). (2012-11-12)
  • @HaraldHanche-Olsen See [The so-called Euler-Diderot incident](http://www.fen.bilkent.edu.tr/~franz/M300/bell2.pdf). (2012-11-12)
  • @did: Oh, that one is quite well known … but that is a very different story. However, thanks for the link to the *Monthly* article about it. It was new to me. (2012-11-12)