
The sum is:

$S = 1 + \frac{1}{2} + \frac{(n-1)(n-2)}{3n^2} + \frac{(n-1)(n-2)(n-3)}{4n^3} + \cdots + \frac{(n-1)!}{n \cdot n^{n-1}} = \frac{3}{2} + \sum_{k=3}^{n} \frac{n!}{k(n-k)!\,n^k}$

Can we get an asymptotic lower bound of $S$?

My guess is that it's $\Omega(\frac 1 2 \log n)$, but I'm not sure how to prove it. The index $k$ starts from 3 because I'm actually computing the expected number of cycles in a graph, and at least 3 nodes are needed to form a cycle. But since all I need is an asymptotic lower bound, I guess where $k$ starts doesn't really matter.
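(For what it's worth, the sum can be evaluated directly to see how it grows; here's a quick numerical sketch in Python, where the running product avoids computing huge factorials. The helper name `S` is just for illustration.)

```python
import math

def S(n):
    """S(n) = 3/2 + sum_{k=3}^{n} n!/(k (n-k)! n^k), via a running product."""
    total = 1.5
    prod = (1 - 1 / n) * (1 - 2 / n)  # prod_{j=1}^{k-1} (1 - j/n) at k = 3
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n  # extend the product for the next k
    return total

for n in (100, 1000, 10000):
    print(n, S(n), 0.5 * math.log(n))
```

The printed values suggest $S(n)$ does grow like a multiple of $\log n$.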

  • 0
The 2 $(n-2)$ is a typo, I just fixed it. 2011-10-10

2 Answers


The term $a_k(n) = \frac{1}{k} \prod_{j=1}^{k-1} (1 - j/n)$ satisfies $a_k(n) < 1/k$, so $S(n) < 3/2 + \sum_{k=3}^n 1/k \approx \ln(n)$. On the other hand, take any $p$ with $0 < p < 1/2$. For $k < n^p$ we have $\prod_{j=1}^{k-1} (1 - j/n) > (1 - n^{p-1})^{n^{p}}$, which has limit 1 as $n \to \infty$: its logarithm is $n^p \log(1 - n^{p-1}) \sim -n^{2p-1} \to 0$ since $2p - 1 < 0$. So for any $\epsilon > 0$, if $n$ is sufficiently large we have $S(n) > \sum_{k=3}^{n^{p}} \frac{1-\epsilon}{k} \approx (1-\epsilon) \ln(n^p) = (1-\epsilon) p \ln(n).$
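Both bounds are easy to check numerically (a sketch; the choices $p = 0.4$ and $\epsilon = 0.01$ are arbitrary values consistent with the argument):

```python
import math

def S(n):
    """S(n) = 3/2 + sum_{k=3}^{n} (1/k) prod_{j=1}^{k-1} (1 - j/n)."""
    total, prod = 1.5, (1 - 1 / n) * (1 - 2 / n)
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n
    return total

n, p, eps = 10**5, 0.4, 0.01
upper = 1.5 + sum(1 / k for k in range(3, n + 1))                # S(n) < 3/2 + sum 1/k
lower = (1 - eps) * sum(1 / k for k in range(3, int(n**p) + 1))  # ~ (1 - eps) p ln n
print(lower, S(n), upper)
```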

  • 0
That's really impressive. 2011-10-10

For a complete asymptotic up to $o(1)$ I get $S(n) = \frac{1}{2} \log n + \frac{\gamma + \log 2}{2} + o(1).$

Your sum $S(n)$ is very close to the sum $Q(n) = 1 + \frac{n-1}{n} + \frac{(n-1)(n-2)}{n^2} + \cdots = \sum_{k \geq 1} \frac{n^{\underline{k}}}{n^k}$ considered in Problem 9.56 in Concrete Mathematics and elsewhere in Knuth's work, so I've adapted some of the arguments I found there.

Let's consider the sum $S'(n) = \sum_{k \geq 1} \frac{n^{\underline{k}}}{k n^k} = S(n) - \frac{1}{2n} = S(n) + o(1)$. In the answer to Problem 9.56 in Concrete Mathematics the authors indicate that Stirling's approximation can be used to show that if $k \leq n^{1/2+\epsilon}$ then

$\frac{n^{\underline{k}}}{k n^k} = e^{-k^2/2n} \left(\frac{1}{k} + \frac{1}{2n} - \frac{2}{3} \frac{k^2}{(2n)^2} + O(n^{-1+4 \epsilon})\right).$ Then, Knuth and Pittel, in "A Recurrence Related to Trees" (Proceedings of the AMS 105(2), 1989, pp. 335–349), indicate that this means $\frac{n^{\underline{k}}}{k \, n^k}$ is exponentially small when $k \geq n^{1/2+\epsilon}$ and so can be replaced with other exponentially small terms to get $S'(n) = T_{2n}(-1) + \left(\frac{1}{2n} + O(n^{-1 + 4 \epsilon})\right) T_{2n}(0) - \frac{1}{6n^2} T_{2n}(2),$ where $T_n(x) = \sum_{k \geq 1} k^x e^{-k^2/n}$.
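One can spot-check that approximation numerically for a $k$ near $\sqrt{n}$ (a sketch; the function names are just for illustration, and the tolerance in mind is the $O(n^{-1+4\epsilon})$ error being dropped):

```python
import math

def exact_term(n, k):
    """n^{underline k} / (k n^k) = (1/k) prod_{j=1}^{k-1} (1 - j/n)."""
    prod = 1.0
    for j in range(1, k):
        prod *= 1 - j / n
    return prod / k

def approx_term(n, k):
    """e^{-k^2/2n} (1/k + 1/(2n) - (2/3) k^2/(2n)^2), ignoring the error term."""
    return math.exp(-k * k / (2 * n)) * (1 / k + 1 / (2 * n) - (2 / 3) * k * k / (2 * n) ** 2)

n, k = 10**4, 100
print(exact_term(n, k), approx_term(n, k))  # the two values agree closely
```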

Lemma 1 in the Knuth and Pittel paper then states that if $x > -1$ then $T_n(x) = \frac{1}{2} \Gamma\left(\frac{x+1}{2}\right) n^{(x+1)/2} + O(1).$ They also mention that a derivation of $T_n(-1) = \frac{1}{2} \log n + \frac{\gamma}{2} + O(n^{-1})$ is in Knuth's Art of Computer Programming, Vol. 3, Exercise 5.2.2-4, as part of the analysis of bubblesort.

Putting this all together gives us $S(n) = \frac{1}{2} \log (2n) + \frac{\gamma}{2} + o(1) = \frac{1}{2} \log n + \frac{\gamma + \log 2}{2} + o(1).$
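A quick numerical check of the final constant (a sketch; the value of $\gamma$ is hard-coded, and $S$ is evaluated by a running product):

```python
import math

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def S(n):
    total, prod = 1.5, (1 - 1 / n) * (1 - 2 / n)  # running product at k = 3
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n
    return total

const = (GAMMA + math.log(2)) / 2
for n in (100, 1000, 10000, 100000):
    print(n, S(n) - 0.5 * math.log(n), const)  # the difference should approach const
```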

For more on $Q(n)$ and related functions and their asymptotics, see The Art of Computer Programming, Vol. 1 (3rd ed.), Section 1.2.11.3.

  • 0
@ablmf: Unfortunately, I don't know of a simpler one. I see that you have asked for a simpler proof as a separate question, though, so maybe somebody else on the site can think of one. However, if Knuth and Pittel felt the need to go through work like this to solve a similar problem, my guess is that finding a simpler proof won't be easy. 2011-10-13