
I'm trying to solve this exercise:

Let $(a_n)_{n\in\mathbb{N}}\subseteq\mathbb{R}$ and $p,q\in\mathbb{R}$ such that $p<q$. Prove that if the series $\sum \frac{a_n}{n^p}$ converges, then the series $\sum \frac{a_n}{n^q}$ also does.

The comparison test sounds useful, but since we don't know that $a_n$ is positive it doesn't apply. Any pointers?

  • As a result of summation by parts $$ \sum_{k=1}^{n} a_k b_k = s_n b_k - \sum_{k=1}^{n-1} s_{k} (b_{k+1} - b_{k}) $$ for $s_n = a_1 + \cdots + a_n$, we have the following theorem: **(Abel's test)** If $s_n$ converges and $b_n$ is monotone bounded, then $\sum a_k b_k$ converges.
  • @sos440: Good eye. My answer is way too technical compared to your comment... I didn't give it a closer look until I finished writing my answer. Perhaps you should post yours too.
  • @sos440: I am not used to manipulating the summation by parts formula, and I am trying to establish a formula similar to yours but that works (the indices are obviously wrong since you put $b_k$ outside a sum). I would really appreciate an answer; that summation formula I keep hearing about really intrigues me.
  • @PatrickDaSilva, typing with an iPad 2 is much harder than one can imagine... Consequently typos become more frequent. Anyway, summation by parts is a standard technique analogous to integration by parts. You will obtain a correct formula if you replace $s_n b_k$ by $s_n b_n$. More can be found if you google it... -o-
  • Lollll, have you tried on your cellphone? It's a pain. Actually the summation by parts formula I know is something like $$ \sum_{k=0}^n a_k(b_{k+1}- b_k) = a_{n+1} b_{n+1} - a_0 b_0 - \sum_{k=0}^{n} (a_{k+1}- a_k)b_{k+1} $$ and never involves $s_n = a_1 + \dots + a_n$, which is why I'm confused. I'm actually worried about your idea since I don't trust this identity just yet.
  • @PatrickDaSilva: $$\sum_{k=1}^n a_kb_k = \sum_{k=1}^n b_k(s_k-s_{k-1}) =\sum_{k=1}^n b_ks_k-\sum_{k=1}^n b_ks_{k-1} = s_nb_n+\sum_{k=1}^{n-1} b_ks_k - \sum_{k=0}^{n-1} b_{k+1}s_k = s_nb_n-\sum_{k=1}^{n-1} s_k(b_{k+1}-b_k),$$ using $s_0 = 0$.
  • @sos440: Would you care to write your comment up as an answer? It was definitely what I was looking for, thanks!

2 Answers


If $\displaystyle\sum_{n=1}^\infty\frac{a_n}{n^p}$ converges, then its partial sums $\displaystyle\sum_{n=1}^m\frac{a_n}{n^p}$ are bounded. Furthermore, since $q>p$, $\dfrac{1}{n^{q-p}}$ decreases monotonically to $0$. Thus, Dirichlet's Test says that $$ \sum_{n=1}^\infty\frac{a_n}{n^q}=\sum_{n=1}^\infty\frac{a_n}{n^p}\frac{1}{n^{q-p}} $$ converges.
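To make the estimate behind Dirichlet's Test explicit in this setting, here is a sketch; the names $A_m$, $b_n$, and $M$ are mine and not part of the original answer. Write $A_m=\displaystyle\sum_{n=1}^m\frac{a_n}{n^p}$, so that $|A_m|\le M$ for some $M$, and $b_n=\dfrac{1}{n^{q-p}}$, which decreases to $0$. Summation by parts (as in the comments above) gives, for $m<N$, $$ \sum_{n=m+1}^{N}\frac{a_n}{n^q}=\sum_{n=m+1}^{N}(A_n-A_{n-1})\,b_n=A_Nb_N-A_mb_{m+1}+\sum_{n=m+1}^{N-1}A_n\,(b_n-b_{n+1}), $$ hence $$ \left|\sum_{n=m+1}^{N}\frac{a_n}{n^q}\right|\le Mb_N+Mb_{m+1}+M\sum_{n=m+1}^{N-1}(b_n-b_{n+1})=2Mb_{m+1}\to 0 \quad (m\to\infty), $$ so the partial sums of $\sum\frac{a_n}{n^q}$ are Cauchy.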

  • Excellent. +1!!

If all the terms are positive (or all negative), it is easy to use the comparison test. The same applies if, after some index $N$, all the terms of the sequence $a_n$ have the same sign; this case is spelled out just below.
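Spelling that out (nothing beyond the comparison test here): if, say, $a_n \ge 0$ for all $n \ge N$, then for those $n$ $$ 0 \le \frac{a_n}{n^q} = \frac{a_n}{n^p}\cdot\frac{1}{n^{q-p}} \le \frac{a_n}{n^p} $$ because $n^{q-p} \ge 1$, so the tail of $\sum \frac{a_n}{n^q}$ converges by comparison with the convergent tail of $\sum \frac{a_n}{n^p}$; the eventually nonpositive case follows by multiplying everything by $-1$.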

Now suppose we are not in this case, and suppose (without loss of generality) that the first term is positive (otherwise multiply the whole thing by $-1$). Therefore there exists an integer $N_1$ such that for $1 \le n < N_1$, $a_n \ge 0$ and $a_{N_1} < 0$. Since we assume the terms of the sequence are not all negative afterwards, there exists an $N_2$ such that for all $n$ with $N_1 \le n < N_2$, we have $a_n \le 0$ and $a_{N_2} > 0$.
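To illustrate the indices with a hypothetical sign pattern (my example, not part of the argument): if the signs of $(a_n)$ begin $+,+,-,-,-,+,\dots$, then $N_1 = 3$ and $N_2 = 6$, and, taking $N_0 = 1$, the first two block sums $\alpha_i$ defined below are $\alpha_1 = \frac{a_1}{1^p} + \frac{a_2}{2^p}$ and $\alpha_2 = \frac{a_3}{3^p} + \frac{a_4}{4^p} + \frac{a_5}{5^p}$.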

Continuing this pattern by induction, there exists $N_{2k+1}$ such that for $N_{2k} \le n < N_{2k+1}$, $a_n \ge 0$ and $a_{N_{2k+1}} < 0$, and then another integer $N_{2k+2}$ such that for $N_{2k+1} \le n < N_{2k+2}$, $a_n \le 0$, with $a_{N_{2k+2}} > 0$. Therefore you can define a sequence like this: $$ \alpha_i = \sum_{n=N_{i-1}}^{N_i - 1} \frac{a_n}{n^p}. $$ Note that the series $$ \sum_{i=1}^{\infty} \alpha_i $$ converges as well, since $$ \sum_{n=1}^{\infty} \frac {a_n}{n^p} = \sum_{i=1}^{\infty} \sum_{n=N_{i-1}}^{N_i - 1} \frac{a_n}{n^p} = \sum_{i=1}^{\infty} \alpha_i. $$

Now one can rewrite this series as $$ \sum_{i=1}^{\infty} \alpha_i = \sum_{i=1}^{\infty} (-1)^{i+1} |\alpha_i|. $$ Since this series converges, its terms must tend to $0$, that is, $|\alpha_i| \to 0$ as $i \to \infty$.

Now define $$ \beta_i = \sum_{n = N_{i-1}}^{N_i - 1} \frac{a_n}{n^q}. $$ In the same fashion one can write $$ \sum_{n=1}^{\infty} \frac {a_n}{n^q} = \sum_{i=1}^{\infty} (-1)^{i+1} |\beta_i|, $$ and since $|\beta_i| \le |\alpha_i|$ (the terms in each block share a sign and $n^q \ge n^p$), we get $|\beta_i| \to 0$ as $i \to \infty$; by Leibniz's criterion the series on the RHS converges.

Note that I am not done yet. To show that this equality actually holds, up to now I have only shown that the sequence $$ S_N = \sum_{n=1}^N \frac{a_n}{n^q} $$ has a convergent subsequence, namely $S_{N_i}$. But one easily sees that for $N > N_1$, one can bound $S_N$ from above with a term of the sequence $S_{N_k}$, and similarly from below.

More explicitly, write $S_{N_i} \to L$. Since $\left|\frac{a_{N_j}}{N_j^{\,q}}\right| \le |\beta_{j+1}| \to 0$, we also have $S_{N_j - 1} = S_{N_j} - \frac{a_{N_j}}{N_j^{\,q}} \to L$. Now, if $N_{2k} \le n < N_{2k+1}$, the summands $\frac{a_m}{m^q}$ with $N_{2k} < m < N_{2k+1}$ are all $\ge 0$, so $S_{N_{2k}} \le S_n \le S_{N_{2k+1}-1}$; if $N_{2k+1} \le n < N_{2k+2}$, the corresponding summands are $\le 0$, so $S_{N_{2k+2}-1} \le S_n \le S_{N_{2k+1}}$. In both cases $S_n$ is squeezed between two quantities that converge to $L$, hence $$ \liminf_{n \to \infty} S_n \ge L \quad\text{and}\quad \limsup_{n \to \infty} S_n \le L, $$ which means $\lim_{n \to \infty} S_n = L$.
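If it helps to see the block decomposition in action, here is a small numerical sketch (my own illustration with made-up data; the exponents, the sign pattern, and the magnitudes $1/(1+\log n)$ are all hypothetical). It only checks the mechanical inequality $|\beta_i| \le |\alpha_i|$ block by block; it proves nothing about convergence.

```python
# Hypothetical illustration: check |beta_i| <= |alpha_i| block by block
# for a made-up sequence a_n with runs of constant sign and p < q.
import itertools
import math
import random

random.seed(0)
p, q = 0.5, 1.5            # any exponents with p < q
N = 10_000

# a_n: runs of constant sign, magnitude 1/(1 + log n)
a = []
sign = 1.0
for n in range(1, N + 1):
    if random.random() < 0.3:      # occasionally start a new sign block
        sign = -sign
    a.append(sign / (1.0 + math.log(n)))

# maximal blocks of constant sign (the N_{i-1} <= n < N_i ranges above)
blocks = [list(grp) for _, grp in
          itertools.groupby(range(1, N + 1), key=lambda n: a[n - 1] > 0)]

alphas = [sum(a[n - 1] / n**p for n in blk) for blk in blocks]
betas = [sum(a[n - 1] / n**q for n in blk) for blk in blocks]

# each block has constant sign and n**q >= n**p, hence |beta_i| <= |alpha_i|
assert all(abs(b) <= abs(al) + 1e-12 for al, b in zip(alphas, betas))
print(len(blocks), "blocks, worst ratio |beta_i|/|alpha_i| =",
      max(abs(b) / abs(al) for al, b in zip(alphas, betas)))
```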

Hope that helps,

EDIT: I just noticed that Leibniz's criterion requires the sequence to be not just positive and convergent to $0$, but also decreasing. This is kind of pissing me off. I guess there's a way around it, but I don't want to find it. I'll just leave the answer as it is and say that "in the case where $|a_n|$ is decreasing, our intuition works right".

  • Perhaps not the simplest argument, but probably the most intuitive one; since "the terms get smaller in size", one series must in some sense "converge faster" than the other. The rigor behind this intuition is the proof I built.
  • I don't think you can associate terms in the partial sums. [Grandi's series](http://en.wikipedia.org/wiki/Grandi_series) is a good example of this.
  • I can in this case, but not in general: when a series is convergent, the sequence $S_n$ of partial sums converges; therefore any subsequence $S_{N_k}$ converges (that is the $\alpha$ case). This is precisely why I can associate terms. It is also why I am not done when I show that $S_{N_k}$ converges in the $\beta$ case: I still need to show that $S_n$ converges, and I use the convergence of $S_{N_k}$ to do so. My only problem here is the use of Leibniz's criterion, where I need the $|\alpha_i|$'s and $|\beta_i|$'s to be decreasing.
  • The reason it doesn't work in general is that the sequence $S_n$ need not converge (i.e. the series can diverge). Obviously, if the series diverges you cannot associate terms, but if it converges, you can. You can associate terms, but you can't "switch" them; to commute terms in the sum you need absolute convergence in general.