86

I'm trying to find $$\lim_{n\to\infty}\frac{n}{\sqrt[n]{n!}} .$$

I tried a couple of methods: Stolz, the squeeze theorem, and d'Alembert.

Thanks!

Edit: I can't use Stirling.

  • 4
    Hint: Stirling.2011-03-22
  • 14
    @Didier: Thank you for the comment, but unless you meant the city in Scotland, I haven't studied the Stirling method yet.2011-03-22
  • 5
    Try taking the natural log and finding the limit of that.2011-03-22
  • 14
    Second try: Stirling formula.2011-03-22
  • 0
    Becca, do you mean the Cauchy condensation test? If so, I cannot use that.2011-03-22
  • 0
    @Didier: How is the Stirling formula different from your first suggestion?2011-03-22
  • 1
    I remember back in the 90's, in a course about the use of Mathematica, I asked the lecturer to try this limit, knowing that it converges really slowly. Mathematica couldn't handle it back then. I suppose this has been corrected by now though. I was also surprised the lecturer didn't know that limit. He guessed 1 for the result. Maybe he had been working so long in algebraic geometry that he had forgotten about Stirling?2011-03-22
  • 0
    Didier was referring to what googling these two words would have led you to: http://en.wikipedia.org/wiki/Stirling%27s_approximation2011-03-22
  • 0
    @Anthony: I can't use this formula to solve this limit; not because I don't feel like it, but just because I'm not allowed to.2011-03-22
  • 0
    @Raskolnikov: Why did you delete your answer?2011-03-22
  • 0
    @Theo Buehler: Because I realized my bound was not tight enough. I could have tried to fix that, but I didn't immediately see how to do that in a way that wouldn't involve things that Nir can't use. And after seeing your answer in particular, I thought it was not worth trying to work it out any further.2011-03-22
  • 0
    @Raskolnikov: Ok, thanks for the clarification.2011-03-22
  • 2
    This is (after a slight modification) Problem 1.2.2 from Radulescu, Radulescu, Andreescu: Problems from Real Analysis, [p.8](http://books.google.com/books?id=hGYficzfWyQC&pg=PA8).2011-10-19
  • 1
    http://math.stackexchange.com/questions/201906/showing-that-frac-sqrtnnn-rightarrow-frac1e2014-02-21
  • 0
    Possibly related: https://math.stackexchange.com/questions/1904113/limit-cn-n-nn-as-n-goes-to-infinity2016-10-31

7 Answers

71

Let $\displaystyle{a_n=\frac{n^n}{n!}}$. Then the power series $\displaystyle{\sum_{n=1}^\infty a_n x^n}$ has radius of convergence $R$ satisfying $\displaystyle{\frac{1}{R}=\lim_{n\to \infty} \sqrt[n]{a_n}=\lim_{n\to\infty}\frac{a_{n+1}}{a_n}}$, provided these limits exist. The first limit is what you're looking for, and the second limit is $\displaystyle{\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n}$.
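
As a quick numerical sanity check (not part of the argument), here is a short Python sketch comparing the two limits, working in logarithms to avoid overflow ($\log n!$ is computed with `math.lgamma`):

```
import math

def log_a(n):
    """log(a_n) for a_n = n^n / n!, using log(n!) = lgamma(n + 1)."""
    return n * math.log(n) - math.lgamma(n + 1)

for n in (10, 100, 1000, 10000):
    nth_root = math.exp(log_a(n) / n)          # (a_n)^(1/n) = n / (n!)^(1/n)
    ratio = math.exp(log_a(n + 1) - log_a(n))  # a_{n+1}/a_n = (1 + 1/n)^n
    print(n, nth_root, ratio)

# Both columns approach e ≈ 2.71828, the ratio much faster than the root.
```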

Added: I just happened upon a good reference for the equality of limits above; it gives a more general result that is proved directly, without reference to power series. Theorem 3.37 of Rudin's Principles of Mathematical Analysis, 3rd ed., says:

For any sequence $\{c_n\}$ of positive numbers, $$\liminf_{n\to\infty}\frac{c_{n+1}}{c_n}\leq\liminf_{n\to\infty}\sqrt[n]{c_n},$$ $$\limsup_{n\to\infty}\sqrt[n]{c_n}\leq\limsup_{n\to\infty}\frac{c_{n+1}}{c_n}.$$

In the present context, this shows that $$\liminf_{n\to\infty}\left(1+\frac{1}{n}\right)^n\leq\liminf_{n\to\infty}\frac{n}{\sqrt[n]{n!}}\leq\limsup_{n\to\infty}\frac{n}{\sqrt[n]{n!}}\leq\limsup_{n\to\infty}\left(1+\frac{1}{n}\right)^n.$$ Assuming you know what $\displaystyle{\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n}$ is, this shows both that the limit in question exists (in case you didn't already know by other means) and what it is.


From the comments: User9176 has pointed out that the case of the theorem above where $\displaystyle{\lim_{n\to\infty}\frac{c_{n+1}}{c_n}}$ exists follows from the Stolz–Cesàro theorem applied to finding the limit of $\displaystyle{\frac{\ln(c_n)}{n}}$. Explicitly, $$\lim_{n\to\infty}\ln(\sqrt[n]{c_n})=\lim_{n\to\infty}\frac{\ln(c_n)}{n}=\lim_{n\to\infty}\frac{\ln(c_{n+1})-\ln(c_n)}{(n+1)-n}=\lim_{n\to\infty}\ln\left(\frac{c_{n+1}}{c_n}\right),$$ provided the latter limit exists, where the second equality is by the Stolz–Cesàro theorem.

  • 0
    I tried to use that but I wasn't smart enough to change the original series. Thanks, Jonas.2011-03-22
  • 0
    @Theo: Thanks. @Nir: I know that this can be stated without explicit reference to power series and radii of convergence, but this answer reveals my bias toward thinking of power series. Is this what you meant by "delambre"? Do you have a reference for Delambre's theorem? (My internet searching isn't successful.)2011-03-22
  • 0
    @Jonas: Yeah, this is what I meant. I really tried to look for a reference for that in English, but I couldn't find anything that relates to this specifically; I'm sorry.2011-03-22
  • 0
    @Nir: A reference not in English would be good, too. (With probability approximately one I cannot read the reference you have, but I could use online translators.) I'm really curious because my searches for delambre and limits don't turn up anything relevant.2011-03-22
  • 0
    I believe it was originally stated in the context of convergence tests: http://en.wikipedia.org/wiki/Convergence_tests but in the English version there's no mention of the name Delambre.2011-03-22
  • 0
    Heh. Nice proof.2011-03-22
  • 0
    Delambre = D'Alembert ?2011-03-23
  • 1
    @lhf: Of course! Thanks. I didn't think of that in part because searching for a mathematician named Delambre turned up this guy: http://www-history.mcs.st-and.ac.uk/Mathematicians/Delambre.html D'Alembert is credited with the ratio test. I'd still be interested in a reference to this particular "trick" that doesn't use power series as an intermediary (not that I have anything against power series).2011-03-23
  • 0
    +1. Nice cool trick ! $$\lim_{n \rightarrow \infty} \sqrt[n]{a_n} = \lim_{n \rightarrow \infty} \frac{a_{n+1}}{a_n}$$2011-03-23
  • 2
    Just a short comment: the mentioned theorem is just the Stolz–Cesàro theorem applied to $\frac{\ln (a_n)}{n}$.2011-04-10
  • 0
    @user9176: Thanks very much for the comment, I didn't know that.2011-04-10
  • 0
    This trick is nice once you know that the radius of convergence of the Taylor series is > 0. However, how do you know this in this case? Or how can this be known in more general cases? In this specific case one knows it exists because clearly lim f(n)/g(n) is always >= 1 if f(n) >= g(n) (*). I guess this is how it works for the more general case (*) and it cannot be improved (apart from switching f and g, of course). Should this be added to the answer? Edit: I now understand this trick can be generalized :) I wonder if I will ever use it.2012-09-07
  • 0
    After I read all of your answer, I couldn't find what the $\large ANSWER$ was.2014-05-25
  • 0
    @Felix: What is the answer?2014-06-27
  • 0
    $\displaystyle{\large{\rm e} \approx 2.718281828\ldots}$.2014-06-27
  • 0
    @Felix: Thank you. I see $e$ in the answer.2014-06-27
37

This is going to be a bit difficult (since apparently lots of things aren't allowed). Here's how I would do it (this is far from a complete solution but just a couple of hints):

I hope you know that $e = \lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^{n}$ (this is often taken as the definition of $e$).

You can easily show that the sequence $c_{k} = \left(1 + \frac{1}{k}\right)^k$ is monotonically increasing and that the sequence $d_{k} = \left(1 + \frac{1}{k}\right)^{k+1}$ is monotonically decreasing. This gives the squeeze $$\displaystyle \left(1 + \frac{1}{k}\right)^k = c_k \lt e \lt d_k = \left(1 + \frac{1}{k}\right)^{k+1}.$$

By taking the products $c_{1} c_{2} \cdots c_{n}$ and $d_{1} d_{2} \cdots d_{n}$ you can then show $$\displaystyle \frac{(n+1)^n}{n!} \lt e^n \lt \frac{(n+1)^{n+1}}{n!} $$ using a few manipulations.
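
In case it helps, one way to carry out those manipulations is to notice that both products telescope:

$$\prod_{k=1}^{n} c_k=\prod_{k=1}^{n}\frac{(k+1)^k}{k^k}=\frac{2^1\cdot 3^2\cdots (n+1)^n}{1^1\cdot 2^2\cdots n^n}=\frac{(n+1)^n}{n!},\qquad \prod_{k=1}^{n} d_k=\prod_{k=1}^{n}\frac{(k+1)^{k+1}}{k^{k+1}}=\frac{(n+1)^{n+1}}{n!},$$

and multiplying the inequalities $c_k \lt e \lt d_k$ over $k=1,\dots,n$ gives exactly the displayed bounds.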

Now extract roots on both sides of the last inequalities and you're there.

26

By applying the Cauchy–d'Alembert criterion we get that:

$$\lim_{n\to\infty} \frac{n}{n!^{\frac{1}{n}}}=\lim_{n\to\infty}\left(\frac{n^n}{n!}\right)^{\frac{1}{n}} = \lim_{n\to\infty} \frac{(n+1)^{(n+1)}}{(n+1)!}\cdot \frac{n!}{n^n} = \lim_{n\to\infty} \frac{(n+1)^n}{n^n} =\lim_{n\to\infty} {\left(1+\frac{1}{n}\right)^{n}}=e. $$

Q.E.D.

  • 0
    I'll just add that the result used here is shown in this post: http://math.stackexchange.com/questions/287932/convergence-of-ratio-test-implies-convergence-of-the-root-test (And many other posts on this site.) This is pointed out more explicitly in Jonas Meyer's answer.2014-07-16
19

If $f(n)=\frac{n}{\sqrt[n]{n!}}$ and $g(n) = f(n)^n$ then

$$g(n) = \frac{n^n}{n!}$$

and taking the ratio of terms, removing the factorials and using $\frac{n+1}{n} = 1+\frac{1}{n}$,

$$ \frac{g(n+1)}{g(n)} = \left(1 + \frac{1}{n}\right)^n $$

You may recognise this as having a limit of $e$. It implies

$$\lim_{n \to \infty} \frac{g(n+1)}{g(n)} \frac{1}{e} = 1$$

and so multiplying a string of these together

$$\lim_{n \to \infty} \frac{g(n)}{e^n h(n)} = 1$$

for some function $h(n)$ whose growth or decay is subexponential, i.e. $h(n)^{1/n} \to 1$ [not that it matters, but $h(n)$ is about $1/\sqrt{2 \pi n}$], so taking the $n$-th root

$$\lim_{n \to \infty} \frac{f(n)}{e} = \lim_{n \to \infty} h(n)^{1/n} = 1$$

and so $\lim_{n \to \infty} f(n) = e$.
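
If you want to see the size of $h(n)$ numerically, here is a small Python check in log scale (illustration only; the argument above does not depend on it):

```
import math

# Compare log(g(n)/e^n) with log(1/sqrt(2*pi*n)), where g(n) = n^n / n!.
for n in (10, 100, 1000, 10000):
    log_g = n * math.log(n) - math.lgamma(n + 1)   # log(n^n / n!)
    lhs = log_g - n                                # log(g(n) / e^n)
    rhs = -0.5 * math.log(2 * math.pi * n)         # log(1 / sqrt(2*pi*n))
    print(n, lhs, rhs, lhs - rhs)

# The gap lhs - rhs shrinks to 0, so h(n) is indeed about 1/sqrt(2*pi*n),
# and in particular h(n)**(1/n) -> 1.
```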

  • 0
    +1. Any fine answer should end with a statement that gives the answer to the OP's question, as you did!2014-05-25
4

If you take the log of the reciprocal, $\sqrt[n]{n!}/n$, it is:

$$\frac{1}{n}\sum_{k=1}^n \log\left(\frac{k}{n}\right),$$

which is a Riemann sum for $\int_{0}^1 \log x \, dx$.

An antiderivative is $F(x)=x\log x-x$; we have $\lim_{x\to 0^+} (x\log x - x) = 0$ and $F(1)=-1$, so the integral equals $-1$.

You have to deal with the fact that this is an improper integral, but it "just works": the log tends to $-1$, so $\sqrt[n]{n!}/n \to e^{-1}$ and the limit in question is $e$.
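
As a rough numerical illustration (not a proof), the Riemann sums do creep toward $-1$, though slowly:

```
import math

# Riemann sum (1/n) * sum_{k=1..n} log(k/n); it should approach the
# improper integral of log x over (0, 1], which equals -1.
for n in (10, 100, 1000, 100000):
    s = sum(math.log(k / n) for k in range(1, n + 1)) / n
    print(n, s)

# The values approach -1; the error is roughly (log n)/(2n), so the
# convergence is slow, which reflects the improper-integral caveat above.
```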

3

What's wrong with just logging the expression? $$ \varphi (n) = \frac{n}{n!^{\frac{1}{n}}},\\ \log \varphi(n) = \log n - \frac{\log n!}{n} = \log n -\sum_{k=1}^{n}\frac{\log k}{n} \\ \sim \log n -\frac{n \log n -n + 1 }{n} = \log n - \log n + 1 - \frac{1}{n}= 1 + o(1). $$ Hence $\lim_{n \to \infty} \varphi(n) =e^1 = e$.

EDIT: To make things sharper, here's the approximation via the Euler–Maclaurin formula: $\sum_{k=1}^{n} \log k = \int_{1}^{n}\log x \,dx + O(\log n) = n \log n -n +1 +O(\log n).$ Since $\lim_{n \to \infty} \frac{\log n }{n} = 0$, the statement above holds: $$ \frac{\log n!}{n} = \frac{n \log n - n + 1 + O(\log n)}{n} = \log n - 1 + o(1), $$ and the result follows because $e^{o(1)} \to 1$.
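
For what it's worth, a quick numerical check of that approximation ($\log n!$ computed with Python's `math.lgamma`):

```
import math

# Gap between log(n!) and the integral approximation n*log(n) - n + 1.
# Euler-Maclaurin says the gap is O(log n), so it vanishes after dividing by n.
for n in (10, 100, 1000, 10000):
    gap = math.lgamma(n + 1) - (n * math.log(n) - n + 1)
    print(n, gap, gap / math.log(n), gap / n)

# gap/log(n) stays near 1/2 while gap/n -> 0, which is all the argument needs.
```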

  • 0
    Thanks, I fixed the first part. I guess the Euler–Maclaurin approximation should be enough to show that the remainder terms $\to 0$, and hence when exponentiated I get $e^{1+o(1)} \to e^1$.2014-06-09
  • 0
    Euler-Maclaurin is more than enough. You get the necessary bounds by more elementary means already (but of course, if you have Euler-Maclaurin, why not use it?).2014-06-09
  • 0
    please see the edit.2014-06-09
2

Let $[x]$ denote the largest integer not exceeding $x.$ For $n\geq 1$ we have $$\log n! =\int_1^{n+1}\log [x]\; dx<\int_1^{n+1}\log x\; dx=-n+(n+1)\log (n+1)$$ and $$\log n!=\int_1^n \log (1+[x]) \;dx\geq \int_1^n\log x \;dx=1-n+n\log n.$$ So $$1/n\leq 1+\log (n!^{1/n}/n)<(1+1/n)\log (n+1)-\log n=\log (1+1/n)+(1/n)\log (n+1).$$ Since $(1/n)\log (n+1)\to 0$ as $n\to \infty$, we have $$\lim_{n\to \infty}\log (n!^{1/n}/n)=-1,$$ and therefore $n/\sqrt[n]{n!}\to e.$
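
A quick numerical check of the sandwich (illustration only; $\log n!$ computed with Python's `math.lgamma`):

```
import math

# 1 + log(n!^(1/n) / n) should sit between 1/n and log(1 + 1/n) + log(n + 1)/n,
# and all three quantities should shrink to 0.
for n in (10, 100, 1000, 10000):
    middle = 1 + (math.lgamma(n + 1) / n - math.log(n))
    lower = 1 / n
    upper = math.log(1 + 1 / n) + math.log(n + 1) / n
    print(n, lower, middle, upper)

# lower <= middle < upper in every row and all three columns tend to 0,
# consistent with log(n!^(1/n)/n) -> -1, i.e. n / n!^(1/n) -> e.
```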