23

I don't understand why ${ \displaystyle \sum\limits_{n=1}^{\infty} \frac{1}{n} }$ is divergent, while ${ \displaystyle \sum\limits_{n=1}^{\infty} \frac{1}{n^2} }$ is convergent with sum ${ \displaystyle\frac{\pi^2}{6} }$. In both cases the $n$th term tends to zero, so what makes these series different?

${ \displaystyle \sum\limits_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \frac{1}{6} + }$ ...

${ \displaystyle \sum\limits_{n=1}^{\infty} \frac{1}{n^2} =1 + \frac{1}{4} + \frac{1}{9} + \frac{1}{16} + \frac{1}{25} + \frac{1}{36} + }$ ...

  • 1
    The fact that the terms tend towards zero, yet one series diverges and the other converges, causes quite a bit of confusion. Think of it this way: given any positive number $M$, the partial sums of the first series eventually exceed $M$. The partial sums of the second series, however, never get past its limit value, no matter how far out you go. 2014-02-24

4 Answers

55

The following example may be easier to grasp. Consider the series $1+\frac{1}{2}+\frac{1}{2}+\frac{1}{4}+\frac{1}{4}+\frac{1}{4}+\frac{1}{4}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\cdots. $ So we have $2$ copies of $\frac{1}{2}$, $4$ copies of $\frac{1}{4}$, $8$ copies of $\frac{1}{8}$, $16$ copies of $\frac{1}{16}$, and so on like that forever.

The $2^k$ copies of $\frac{1}{2^k}$ add up to $1$, so partial sums get arbitrarily large, and therefore our series diverges.

Now look at the series of the squares of the above numbers. We get $1+\frac{1}{4}+\frac{1}{4}+\frac{1}{16}+\frac{1}{16}+\frac{1}{16}+\frac{1}{16}+\frac{1}{64}+\frac{1}{64}+\frac{1}{64}+\frac{1}{64}+\frac{1}{64}+\cdots $ The first entry is $1$. The next $2$ entries add up to $\frac{1}{2}$. The next $4$ entries add up to $\frac{1}{4}$. The next $8$ entries add up to $\frac{1}{8}$. And so on forever. So the full sum is $1+\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\cdots$ This geometric series has sum $2$.
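To make the block-by-block bookkeeping concrete, here is a small Python sketch (an illustration of my own, not part of the original argument) that totals each block of $2^k$ equal terms for both series:

```python
# Block k of the first series holds 2**k copies of 1/2**k; the second series
# squares each term.  Compute the total contributed by each block.
def block_sums(power, num_blocks):
    return [(2 ** k) * (1.0 / 2 ** k) ** power for k in range(num_blocks)]

print(block_sums(1, 6))        # [1.0, 1.0, ...]: every block adds 1, so partial sums are unbounded
print(block_sums(2, 6))        # [1.0, 0.5, 0.25, ...]: the blocks form a geometric series
print(sum(block_sums(2, 50)))  # approximately 2, matching the geometric sum above
```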

15

As littleO's comment says, the difference is in how fast the terms go to zero; terms going to zero isn't enough, and the rate at which they decay also makes a difference. In your example, consider $\sum_{n=1}^{\infty}\frac{1}{n^p}$, where $p$ can take any (let's stay with real) value. For $p=1$, as you already say, the series diverges, but for $p=2$ we do have convergence. It turns out that $p=1$ is the "boundary" between convergence and divergence: as long as $p$ is bigger than one, no matter how close to one, the series will converge, and as long as $p\leq1$ the series will diverge.
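As a rough numerical illustration (a sketch of my own; numerics only suggest what the arguments in the other answers actually prove), here are partial sums for $p=1$ and $p=2$:

```python
import math

# Partial sums of 1/1**p + 1/2**p + ... + 1/terms**p.
def partial_sum(p, terms):
    return sum(1.0 / n ** p for n in range(1, terms + 1))

for terms in (10**2, 10**4, 10**6):
    print(terms, partial_sum(1, terms), partial_sum(2, terms))

# The p = 1 column keeps growing (roughly like ln(terms)), while the
# p = 2 column settles near pi**2 / 6.
print(math.pi ** 2 / 6)
```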

Every calculus student (myself included), by the way, goes through these two stages: first I couldn't believe that adding up infinitely many terms (no matter how small) could give a finite number; then I thought that as long as the terms go to zero we have convergence. It's tricky stuff, but a lot of fun.

  • 0
    Thank you for your answer! 2012-11-24
11

We can see that the harmonic series, $\displaystyle\sum_{k=1}^\infty\frac1k$, diverges using the classical observation: $ \frac11+\frac12+\underbrace{\frac13+\frac14}_{\mbox{$2$ terms}}+\underbrace{\frac15+\frac16+\frac17+\frac18}_{\mbox{$4$ terms}}+\dots+\underbrace{\frac1{2^n+1}+\dots+\frac1{2^{n+1}}}_{\mbox{$2^n$ terms}}+\dots $ where each grouping of terms totals at least $\frac12$.
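Spelling out the consequence of that grouping (a step added here for completeness): keeping the first $n$ groups gives a concrete lower bound on the partial sums, $ \sum_{k=1}^{2^{n+1}}\frac1k \ge 1+\frac12+\underbrace{\frac12+\dots+\frac12}_{\mbox{$n$ groups}} = 1+\frac{n+1}{2}, $ which grows without bound as $n\to\infty$.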


The series $\displaystyle\sum_{k=1}^\infty\frac1{k^2}$ converges to a value $\le2$ by comparison: $ \frac1{1^2}+\underbrace{\frac1{2^2}}_{\Large\lt\frac1{1\cdot2}}+\underbrace{\frac1{3^2}}_{\Large\lt\frac1{2\cdot3}}+\underbrace{\frac1{4^2}}_{\Large\lt\frac1{3\cdot4}}+\dots+\underbrace{\frac1{n^2}}_{\Large\lt\frac1{(n-1)n}}+\dots $ and $ \begin{align} &\frac1{1\cdot2}+\frac1{2\cdot3}+\frac1{3\cdot4}+\dots+\frac1{(n-1)n}+\dots\\ &=\left(\frac11-\frac12\right)+\left(\frac12-\frac13\right)+\left(\frac13-\frac14\right)+\dots+\left(\frac1{(n-1)}-\frac1n\right)+\dots\\ &=1 \end{align} $ This last series is called a "telescoping sum" since the last part of each term is cancelled by the first part of the next term, leaving only the first part of the first term and the last part of the last term. Since the last part of the last term vanishes, this series converges to the first part of the first term.
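To make the telescoping explicit (again a step added here, not part of the original answer): the $n$-th partial sum of the comparison series is $ \sum_{k=2}^{n}\frac1{(k-1)k}=\sum_{k=2}^{n}\left(\frac1{k-1}-\frac1k\right)=1-\frac1n, $ which tends to $1$ as $n\to\infty$; together with the leading term $\frac1{1^2}=1$ this gives the bound $\sum_{k=1}^\infty\frac1{k^2}\le2$.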

  • 0
    Thank you again, I think I understood. :) 2012-11-24
10

Summation is closely related to integration. For a non-negative, non-increasing function $f$ we have $\sum_{n=t+1}^\infty f(n) \leq \int_t^\infty f(x)\,\mathrm{d}x \leq \sum_{n=t}^\infty f(n)$ or, expressed differently, $-f(t) + \sum_{n=t}^\infty f(n) \leq \int_t^\infty f(x)\,\mathrm{d}x \leq \sum_{n=t}^\infty f(n)$ or, equivalently, $\int_t^\infty f(x)\,\mathrm{d}x \leq \sum_{n=t}^\infty f(n) \leq f(t) + \int_t^\infty f(x)\,\mathrm{d}x$

[Illustration of the correspondence between summation and integration.]

So the sum diverges if and only if the integral is $\infty$, since the sum bounds the integral and vice versa.
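As a quick sanity check of the sandwich inequality above (a sketch of my own, using $f(x)=1/x^2$, for which $\int_t^\infty x^{-2}\,\mathrm{d}x = 1/t$):

```python
# Check numerically that  integral <= tail sum <= f(t) + integral  for f(x) = 1/x**2,
# where the integral from t to infinity equals 1/t.  The infinite tail sum is
# approximated by truncating at a large cutoff index.
def tail_sum(t, cutoff=10**6):
    return sum(1.0 / n ** 2 for n in range(t, cutoff))

def tail_integral(t):
    return 1.0 / t

for t in (1, 2, 5, 10):
    s = tail_sum(t)
    print(t, tail_integral(t) <= s <= 1.0 / t ** 2 + tail_integral(t))  # True for each t
```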


Let's consider $p>1$: \begin{align} \int_1^{\infty} \frac{1}{x^p} \mathrm{d}x &= \int_1^{\infty} x^{-p} \mathrm{d}x \\ &= \frac{1}{1-p}(\lim_{x\rightarrow\infty} x^{1-p} - 1^{1-p}) \\ &= \frac{1}{p-1} \end{align}

which means that the corresponding sum converges. We even get some bounds, in particular $ \frac{1}{p-1} \leq \sum_{n=1}^\infty n^{-p} \leq \frac{1}{p-1} + 1$
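For instance (a check added here), taking $p=2$ these bounds give $1 \leq \sum_{n=1}^\infty n^{-2} \leq 2$, consistent with the exact value $\frac{\pi^2}{6}\approx 1.645$ mentioned in the question.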

If $p = 1$ then

\begin{align} \int_1^{\infty} \frac{1}{x} \mathrm{d}x &= \lim_{x\rightarrow\infty} (\ln x) - \ln 1 \\ &= \infty \end{align}

which means that the corresponding sum diverges.

So we can conclude that the sum $\sum_{n=1}^\infty n^{-p}$ converges iff $p>1$. (For $0<p\leq1$ the integral $\int_1^\infty x^{-p}\,\mathrm{d}x$ is likewise infinite, and for $p\leq0$ the terms do not even tend to zero.)

  • 1
    Thank you for this approach, it's always nice when something can be proved in different ways. 2012-11-24