7

The question comes from a statement in Concrete Mathematics by Graham, Knuth, and Patashnik on page 465.

$\sum_{k \geq n} \frac{(\log k)^2}{k^2} = O \left(\frac{(\log n)^2}{n} \right).$

How is this calculated?

  • 0
    You can guess the result by replacing the sum with an integral and integrating by parts. – 2011-07-19

3 Answers

5

Note that the function $f$ defined by $f(x) = \frac{(\log x)^2}{x^2}$ is decreasing for $x > x_0$ (for some $x_0 > 0$), and that

$$ \frac{\frac{d}{dx}\int_x^\infty \frac{(\log t)^2}{t^2}\,dt}{\frac{d}{dx}\frac{(\log x)^2}{x}} = \frac{-\frac{(\log x)^2}{x^2}}{\frac{2\log x - (\log x)^2}{x^2}} = \frac{-\log x}{2 - \log x} \to 1 \quad \text{as } x \to \infty, $$

hence also, by L'Hôpital's rule,

$$ \frac{\int_x^\infty \frac{(\log t)^2}{t^2}\,dt}{\frac{(\log x)^2}{x}} \to 1 \quad \text{as } x \to \infty. $$
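As a quick numerical sanity check (not part of the argument), one can watch this ratio drift toward 1, using the closed form $\int_x^\infty \frac{(\log t)^2}{t^2}\,dt = \frac{(\log x)^2 + 2\log x + 2}{x}$ obtained by integrating by parts twice:

```python
import math

def tail_integral(x):
    # Closed form for the tail integral ∫_x^∞ (log t)^2 / t^2 dt,
    # obtained by integrating by parts twice.
    L = math.log(x)
    return (L * L + 2 * L + 2) / x

def ratio(x):
    # Ratio of the tail integral to (log x)^2 / x; should tend to 1.
    return tail_integral(x) / (math.log(x) ** 2 / x)

print(ratio(10.0), ratio(1e6), ratio(1e12))
```

The ratio is exactly $1 + \frac{2}{\log x} + \frac{2}{(\log x)^2}$, so the convergence is real but logarithmically slow.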

EDIT (elaborating; what follows also completes mixedmath's answer):

With $f$ as above,

$$ f'(x) = \frac{2\log x\,(1 - \log x)}{x^3}, $$

implying that $f$ is decreasing on $[e,\infty)$. It follows that for any $n \geq 3$,

$$ \int_n^\infty f(x)\,dx \leq \sum_{k=n}^\infty f(k) \leq f(n) + \int_n^\infty f(x)\,dx. $$

Hence

$$ \frac{\int_n^\infty f(x)\,dx}{\frac{(\log n)^2}{n}} \le \frac{\sum_{k=n}^\infty f(k)}{\frac{(\log n)^2}{n}} \le \frac{f(n)}{\frac{(\log n)^2}{n}} + \frac{\int_n^\infty f(x)\,dx}{\frac{(\log n)^2}{n}}. $$

So from

$$ \frac{f(n)}{\frac{(\log n)^2}{n}} = \frac{1}{n} \to 0 \quad \text{and} \quad \frac{\int_n^\infty \frac{(\log x)^2}{x^2}\,dx}{\frac{(\log n)^2}{n}} \to 1 $$

as $n \to \infty$, it follows that

$$ \frac{\sum_{k=n}^\infty f(k)}{\frac{(\log n)^2}{n}} \to 1. $$
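The integral-test sandwich can be checked numerically (a sketch only: the infinite sum is estimated by a partial sum plus the closed-form tail integral $\frac{(\log x)^2 + 2\log x + 2}{x}$, and the cutoff $10^6$ is an arbitrary choice):

```python
import math

def f(x):
    # The summand f(x) = (log x)^2 / x^2.
    return math.log(x) ** 2 / x ** 2

def tail_integral(x):
    # ∫_x^∞ f(t) dt = ((log x)^2 + 2 log x + 2) / x, via parts twice.
    L = math.log(x)
    return (L * L + 2 * L + 2) / x

def tail_sum(n, cutoff=10 ** 6):
    # Estimate Σ_{k≥n} f(k): partial sum up to the cutoff, plus the
    # integral as a stand-in for the remaining (tiny) tail.
    return sum(f(k) for k in range(n, cutoff)) + tail_integral(cutoff)

n = 100
S = tail_sum(n)
# The sandwich ∫_n^∞ f ≤ Σ_{k=n}^∞ f(k) ≤ f(n) + ∫_n^∞ f:
print(tail_integral(n) <= S <= f(n) + tail_integral(n))  # → True
```

At $n = 100$ the ratio $S \big/ \frac{(\log n)^2}{n}$ is still around $1.5$, consistent with the slow $1 + 2/\log n + \cdots$ convergence noted above.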

  • 0
    That's pretty cool. I like that. – 2011-07-19
10

I think the easiest way here is to simply compute $\displaystyle \int_k^{\infty} \frac{(\log x)^2}{x^2}\,dx$. After some work (two integrations by parts), it turns out to be $\dfrac{\log k(\log k + 2) + 2}{k}$. Oh - but I'm also somewhat confident that Qiaochu's suggestion to sum by parts would work as well (and give almost the same answer).
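One quick way to double-check that antiderivative (a numerical sketch, not part of the answer): the candidate $F(x) = -\frac{(\log x)^2 + 2\log x + 2}{x}$ vanishes as $x \to \infty$, so the tail integral is $-F(k) = \frac{\log k(\log k + 2) + 2}{k}$; a finite difference of $F$ should reproduce the integrand:

```python
import math

def integrand(x):
    return math.log(x) ** 2 / x ** 2

def F(x):
    # Candidate antiderivative of (log x)^2 / x^2 (two integrations by
    # parts); F(x) → 0 as x → ∞, so ∫_k^∞ equals -F(k).
    L = math.log(x)
    return -(L * L + 2 * L + 2) / x

# A central finite difference of F should match the integrand closely.
x, h = 7.0, 1e-6
approx = (F(x + h) - F(x - h)) / (2 * h)
print(abs(approx - integrand(x)) < 1e-8)  # → True
```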

  • 1
    @acetv: for non-negative monotonically decreasing functions, yes. See http://en.wikipedia.org/wiki/Integral_test_for_convergence. – 2011-07-19
9

Here's a pretty straightforward solution. The idea is to split the sum into a main part and an error term; this approach generalizes to many sums where the integral test won't work because the corresponding integral is difficult.

First recall that we have $\sum_{k \ge n} \frac{1}{k^s} = O \left( \frac{1}{n^{s-1}} \right)$ for fixed $s > 1$ (for example by the integral test, although for integer $s$ there is a more elementary way to see this). Split the sum as

$\sum_{k=n}^{n^2} \frac{(\log k)^2}{k^2} + \sum_{k \ge n^2+1} \frac{(\log k)^2}{k^2}.$

Since $(\log k)^2$ is eventually bounded by $k^{\epsilon}$ for all $\epsilon > 0$, the second term is $O \left( \frac{1}{n^{2-2\epsilon}} \right)$ for all $\epsilon > 0$, which is negligible compared to $\frac{(\log n)^2}{n}$ once $\epsilon < \frac{1}{2}$. On the other hand, the first term is at most

$\sum_{k=n}^{n^2} \frac{(\log n^2)^2}{k^2} \le 4 (\log n)^2 \sum_{k=n}^{\infty} \frac{1}{k^2} = O \left( \frac{(\log n)^2}{n} \right).$

(Note that unlike the argument using the integral test, this argument doesn't optimize the constant as it stands. But one can actually replace $n^2$ with $n^{1+\epsilon}$ for any $\epsilon > 0$ and the argument carries through, and taking the limit as $\epsilon \to 0$ gives the correct constant.)
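A small numerical illustration of the split (a sketch with arbitrarily chosen parameters: $n = 100$, and the infinite $k > n^2$ piece truncated at $10^6$): the main part sits under the $4 (\log n)^2 \sum_{k \ge n} \frac{1}{k^2} \le \frac{4(\log n)^2}{n-1}$ bound, while the error piece is comparatively tiny.

```python
import math

def f(k):
    return math.log(k) ** 2 / k ** 2

n = 100
# Main part: k = n, ..., n^2.
main = sum(f(k) for k in range(n, n * n + 1))
# Error part: truncated stand-in for the sum over k > n^2.
tail = sum(f(k) for k in range(n * n + 1, 10 ** 6))
# 4 (log n)^2 Σ_{k≥n} 1/k^2 ≤ 4 (log n)^2 / (n - 1) by the integral test.
bound = 4 * math.log(n) ** 2 / (n - 1)

print(main <= bound, tail < 0.05 * main)
```

Both checks come out true at these parameters: the main part carries essentially all of the mass, which is exactly why the crude $k^\epsilon$ bound suffices for the error term.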


For a fun exercise in splitting sums, try getting an optimal bound for the sum described in this blog post.

  • 0
    @Antonio: very nice! – 2011-07-26