34

Thomson et al. provide a proof that $\lim_{n\rightarrow \infty} \sqrt[n]{n}=1$ in this book. It relies on an inequality derived from the binomial theorem. I tried an alternative proof, now that I know (from elsewhere) the following:

\begin{align} \lim_{n\rightarrow \infty} \frac{ \log n}{n} = 0 \end{align}
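(For completeness, here is one standard way to see this fact, sketched using the elementary bound $\log x \le x - 1$: since $\log n = 2\log\sqrt{n} \le 2(\sqrt{n}-1) < 2\sqrt{n}$, we get $$0 \le \frac{\log n}{n} < \frac{2}{\sqrt{n}} \to 0.)$$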

Then, using this, I can instead prove: \begin{align} \lim_{n\rightarrow \infty} \sqrt[n]{n} &= \lim_{n\rightarrow \infty} \exp\left(\frac{\log n}{n}\right) \\ &= \exp(0) \\ &= 1 \end{align}

On the one hand, it seems like a valid proof to me. On the other hand, I know I should be careful with infinite sequences. The step I'm most unsure of is: \begin{align} \lim_{n\rightarrow \infty} \sqrt[n]{n} = \lim_{n\rightarrow \infty} \exp\left(\frac{\log n}{n}\right) \end{align}

I know the identity $\sqrt[n]{n} = \exp\left(\frac{\log n}{n}\right)$ holds for each fixed $n$, but I'm not sure it survives the passage to the limit $n\rightarrow \infty$.

If this is correct, are there cases where such a step would fail? Specifically, given any sequence $x_n$, can I always assume: \begin{align} \lim_{n\rightarrow \infty} x_n = \lim_{n\rightarrow \infty} \exp(\log x_n) \end{align} Or are there sequences that invalidate that identity?

(Edited to expand the last question.) Given any sequence $x_n$, can I always assume: \begin{align} \lim_{n\rightarrow \infty} x_n &= \exp\left(\log \lim_{n\rightarrow \infty} x_n\right) \\ &= \exp\left(\lim_{n\rightarrow \infty} \log x_n\right) \\ &= \lim_{n\rightarrow \infty} \exp(\log x_n) \end{align} Or are there sequences that invalidate any of the above identities?

(Edited to repurpose this question.) Please also feel free to add different proofs of $\lim_{n\rightarrow \infty} \sqrt[n]{n}=1$.

  • 6
    Did you mean \begin{align} \lim_{n\rightarrow \infty} x_n = \exp\left(\lim_{n\rightarrow \infty}\log x_n\right)? \end{align} (2011-03-21)
  • 0
    @Rasmus No, I didn't. But now that you mention it, I think that might be more appropriate for the question I have. (2011-03-21)
  • 2
    We have $a = \exp(\log a)$ for $a \gt 0$, so what you have is basically asking whether $\lim x_n = \lim x_n$... (2011-03-21)
  • 3
    The only thing that can spoil such an identity is that $x_n$ might leave the set of definition of the function $\log$. (2011-03-21)
  • 0
    @AD Good point. I should have thought of that. (2011-03-22)
  • 0
    Your link is broken. (2012-11-25)

6 Answers

3

Let $n>2$ be an integer and let $x>0$ be real. The binomial theorem says $$ (1+x)^n>1+nx+\frac{n(n-1)}{2}x^2 $$ Let $N(x)=\max\left(2,1+\frac{2}{x^2}\right)$. For $n>N(x)$, we get that $\frac{n(n-1)}{2}x^2>n$. Thus, for any $x>0$, we get that for $n>N(x)$ $$ 1<\sqrt[n]{n}<1+x $$ Thus, we have $$ 1\le\liminf_{n\to\infty}\sqrt[n]{n}\le\limsup_{n\to\infty}\sqrt[n]{n}\le 1+x $$ Since this is true for any $x>0$, we must have $$ \lim_{n\to\infty}\sqrt[n]{n}=1 $$
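To spell out the step this answer compresses: for $n>1+\frac{2}{x^2}$ we have $n-1>\frac{2}{x^2}$, hence $$ \frac{n(n-1)}{2}x^2 > \frac{n}{2}\cdot\frac{2}{x^2}\cdot x^2 = n, $$ and therefore $(1+x)^n > \frac{n(n-1)}{2}x^2 > n$, which is exactly $\sqrt[n]{n}<1+x$.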

  • 1
    Nitpick: It is not exactly the sandwich theorem, as the limits are different ($1$ and $1+x$). You also cannot assume existence of the limit $\lim n^{1/n}$. This is easily corrected, though: $\liminf n^{1/n} \ge 1$ and $\limsup n^{1/n} \le 1+x$ for arbitrary $x$, and thus $\lim n^{1/n} = 1$. +1. (2011-08-08)
  • 0
    @Aryabhata: good point. I had a more complicated proof that did properly use the Sandwich Theorem. I altered it for simplicity, but did so carelessly. I have removed the reference to the Sandwich Theorem and used $\liminf$ and $\limsup$. Thanks. (2011-08-08)
39

Here is one using $\mathrm{AM} \ge \mathrm{GM}$ applied to $1$ appearing $n-2$ times and $\sqrt{n}$ appearing twice.

$$\frac{1 + 1 + \dots + 1 + \sqrt{n} + \sqrt{n}}{n} \ge n^{1/n}$$

i.e.

$$\frac{n - 2 + 2 \sqrt{n}}{n} \ge n^{1/n}$$

i.e.

$$ 1 - \frac{2}{n} + \frac{2}{\sqrt{n}} \ge n^{1/n} \ge 1$$

That the limit is $1$ follows.
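Explicitly, $1 - \frac{2}{n} + \frac{2}{\sqrt{n}} \to 1$ as $n \to \infty$, so the squeeze theorem applied to $$ 1 \le n^{1/n} \le 1 - \frac{2}{n} + \frac{2}{\sqrt{n}} $$ yields $\lim_{n\to\infty} n^{1/n} = 1$.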

  • 1
    This "trick" is just amazing! Thanks for sharing :) (2015-11-20)
  • 0
    @GniruT: Thanks! Glad you liked it. (2015-11-21)
23

Since $x \mapsto \log x$ is a continuous function, and since continuous functions respect limits, i.e. $$ \lim_{n \to \infty} f(g(n)) = f\left( \lim_{n \to \infty} g(n) \right) $$ for continuous $f$ (given that $\displaystyle\lim_{n \to \infty} g(n)$ exists), your proof is entirely correct. Specifically, $$ \log \left( \lim_{n \to \infty} \sqrt[n]{n} \right) = \lim_{n \to \infty} \frac{\log n}{n}, $$

and hence

$$ \lim_{n \to \infty} \sqrt[n]{n} = \exp \left[\log \left( \lim_{n \to \infty} \sqrt[n]{n} \right) \right] = \exp\left(\lim_{n \to \infty} \frac{\log n}{n} \right) = \exp(0) = 1. $$

  • 5
    given that the limits exist (2011-03-21)
  • 2
    Thanks! (I never can tell whether I'm overthinking a math problem or not) (2011-03-22)
21

Here's a two-line, completely elementary proof that uses only Bernoulli's inequality:

$$(1+n^{-1/2})^n \ge 1+n^{1/2} > n^{1/2}$$ so, raising to the $2/n$ power, $$ n^{1/n} < (1+n^{-1/2})^2 = 1 + 2 n^{-1/2} + 1/n < 1 + 3 n^{-1/2}.$$
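In detail, raising to the $2/n$ power is justified because $t \mapsto t^{2/n}$ is increasing on $(0,\infty)$: $$ n^{1/n} = \left(n^{1/2}\right)^{2/n} < \left(\left(1+n^{-1/2}\right)^{n}\right)^{2/n} = \left(1+n^{-1/2}\right)^{2}. $$ Combined with $n^{1/n} \ge 1$, the bound $1 \le n^{1/n} < 1 + 3n^{-1/2}$ squeezes the limit to $1$.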

I discovered this independently, and then found a very similar proof in Courant and Robbins' "What Is Mathematics?".

  • 0
    It's worth noting that the Bernoulli inequality comes from the binomial theorem. If $x>0$, then $(1+x)^n>1+nx$, and $n(n^{-1/2})=n^{1/2}$. (2011-08-08)
  • 3
    No. It is independent and can be easily proved by induction: true for $n=1$ since $(1+x) \ge (1+x)$. If true for $n$, then $(1+x)^{n+1} = (1+x)(1+x)^n \ge (1+x)(1+nx) = 1+(n+1)x+nx^2 \ge 1+(n+1)x$. (2011-08-08)
  • 5
    Apologies. By "comes from" I meant "follows easily from", not "can only be proved using." $1+nx$ is literally the first two terms of the binomial theorem. However, having never heard of the Bernoulli inequality, I mistakenly thought that your first inequality was the Bernoulli inequality, not an application of it, which is why I didn't write a fuller explanation. You are right that it follows easily from induction, but generally speaking it is easier to remember one big theorem and its consequences than many smaller theorems. (2011-08-08)
  • 2
    @Aaron: But Bernoulli's inequality holds for all $x\geq-1$. There's a reason it has its own name instead of just referring to it as the first 2 terms of the binomial expansion. (2014-03-06)
6

$\sqrt[n]{n}=\sqrt[n]{1\cdot\frac{2}{1}\cdot\frac{3}{2}\cdots\frac{n-1}{n-2}\cdot\frac{n}{n-1}}$, so you have the sequence of geometric means of the sequence $a_{n}=\frac{n}{n-1}$. By the theorem that the geometric means of a convergent sequence of positive terms converge to the same limit, its limit is equal to $\lim_{n\to\infty}a_{n}=\lim_{n\to\infty}\frac{n}{n-1}=1$ (see the sketch below).
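For reference, the geometric-mean fact reduces to the Cesàro-mean theorem after taking logarithms: with $a_1 = 1$ and $a_k = \frac{k}{k-1}$ for $k \ge 2$, the product telescopes to $n$, so $$ \log\sqrt[n]{n} = \frac{1}{n}\sum_{k=1}^{n}\log a_{k} \longrightarrow 0, $$ since $\log a_{k} \to \log 1 = 0$ and the Cesàro means of a convergent sequence converge to the same limit; exponentiating gives $\sqrt[n]{n} \to 1$.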

4

Let $n > 1$, so that $n^{1/n} > 1$, and put $n^{1/n} = 1 + h$, so that $h > 0$ depends on $n$ (we don't write the dependence explicitly as $h_{n}$, to simplify typing). Our job is done if we show that $h \to 0$ as $n \to \infty$.

We have $$n = (1 + h)^{n} = 1 + nh + \frac{n(n - 1)}{2}h^{2} + \cdots$$ and hence $$\frac{n(n - 1)}{2}h^{2} < n$$ or $$0 < h^{2} < \frac{2}{n - 1}$$ It follows that $h^{2} \to 0$ as $n \to \infty$ and hence $h \to 0$ as $n \to \infty$.
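Concretely, this gives the explicit bound $1 < n^{1/n} < 1 + \sqrt{\frac{2}{n-1}}$. As a quick numeric sanity check: for $n = 100$ the bound allows $h < \sqrt{2/99} \approx 0.142$, while in fact $100^{1/100} \approx 1.047$, comfortably inside it.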