
I've been using the sentence:

If a series converges, then the limit of its sequence of terms is zero

as a criterion for proving that a series diverges (when $\lim \neq 0$). I understand the rationale behind it, but I can't find a formal proof.

Can you help me?

  • I find it a little strange that you had trouble finding a formal proof. For instance, every calculus textbook I have ever seen has a proof, as do many elementary analysis textbooks. Also see http://en.wikipedia.org/wiki/Nth_term_test. (2010-09-13)

2 Answers


Yes.

$\lim_{n \to \infty} \left( \sum_{k = 1}^{n + 1} a_k - \sum_{k = 1}^{n} a_k \right) = \lim_{n \to \infty} a_{n + 1}$. Both partial sums converge to the same number (the sum of the series), so the left-hand side is zero, and hence $\lim_{n \to \infty} a_n = 0$. This is by far the easiest proof I know.

This is the Cauchy criterion in disguise by the way, so you could use that too.
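Spelled out with partial sums (a short sketch; the notation $s_n$ for the $n$-th partial sum is introduced here for illustration, it is not used above): let $s_n = \sum_{k=1}^{n} a_k$ and suppose $s_n \to s$. Then $s_{n+1} \to s$ as well, so by the algebra of limits
$$\lim_{n\to\infty} a_{n+1} = \lim_{n\to\infty} \left( s_{n+1} - s_n \right) = s - s = 0,$$
which is the same as $\lim_{n\to\infty} a_n = 0$. For the Cauchy-criterion version: given $\varepsilon > 0$ there is an $N$ such that $|s_m - s_n| < \varepsilon$ for all $m > n \ge N$; taking $m = n+1$ gives $|a_{n+1}| < \varepsilon$ for all $n \ge N$.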

  • Also, "$\lim_{n \to \infty}$" needs to be added to the RHS of the equation. (2010-09-13)

If we know that the sequence converges and merely wish to show it converges to zero, then a proof by contradiction gives a little more intuition here (although the direct proofs are simple and beautiful). Assume $a_n \to a$ with $a > 0$; then for some large enough $N$ we have $a_n > a/2$ for all $n > N$ (take $\varepsilon = a/2$ in the definition of the limit). Now the sum diverges: $\sum_{n > N} a_n > \sum_{n > N} a/2 = \infty$. A similar argument works when $a < 0$.
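To make the last step precise with partial sums (a small addition, not part of the original answer): for any $n > N$,
$$\sum_{k=N+1}^{n} a_k > \sum_{k=N+1}^{n} \frac{a}{2} = (n - N)\,\frac{a}{2} \longrightarrow \infty \quad \text{as } n \to \infty,$$
so the partial sums of $\sum a_n$ are unbounded and the series diverges, contradicting the assumption that it converges.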

  • Indeed, I thought that this was the question: given that the sequence converges, the limit is zero and not something else. I'll add this to my answer. (2010-09-13)