
In the theory of divergent series regularization I've come to the following conclusion, and I would like to know whether my considerations are right or not. First I'll recall some definitions:

A summation method $S$ is called regular when it associates the same sum to "standard" convergent series:

$$\lim_{k\to+\infty}\sum_{n=0}^{k}a_n=s\implies S(\sum_{n=0}^{+\infty}a_n)=s$$

A summation method $S$ is called linear if:

$$S(\sum_{n=0}^{+\infty}a_n)=s;\:\:\:S(\sum_{n=0}^{+\infty}b_n)=t;\:\:\:S(\sum_{n=0}^{+\infty}\alpha a_n+\beta b_n)=u\implies u=\alpha s+\beta t$$

A summation method $S$ is called stable if:

$$S(\sum_{n=0}^{+\infty}a_n)=s\implies S(\sum_{n=1}^{+\infty}a_n)=s-a_0$$

Now my claim is that every summation method that is both regular and linear is also stable and my proof is:

let $S(\sum_{n=0}^{+\infty}a_n)=s$ and define $b_0=a_0$ and $b_n=0$ for $n\ge 1$. Of course $\sum_{n=0}^{+\infty}b_n=a_0$, and by regularity $S(\sum_{n=0}^{+\infty}b_n)=a_0$. Now by linearity $S(\sum_{n=1}^{+\infty}a_n)=S(\sum_{n=0}^{+\infty}a_n)-S(\sum_{n=0}^{+\infty}b_n)=s-a_0$.

Is this proof correct? Is it sufficient? My doubt is that when dealing with divergent sums one cannot always "get rid" of zeroes, and so what I have shown is simply that:

$$S(\sum_{n=0}^{+\infty}c_n)=s-a_0$$

where $c_0=0$ and $c_{n\ge 1}=a_n$, and this is not necessarily the same as $S(\sum_{n=1}^{+\infty}a_n)$ (I guess), unless the method is what I would call dilutable (I actually don't know whether anyone in the literature has ever given such methods this name): a method that assigns the same result to two series if they differ only by the presence of some zeroes. To clarify this with an example, such a method would assign the same value to:

$$S(1+0+1+0+1+\dots)=S(1+0+1+1+1+\dots)=S(1+1+0+1+1+0+1+\dots)$$

Would this hypothesis be sufficient to end my proof? Can we get rid of this hypothesis with some other manipulations?
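To make the worry concrete, here is a quick Python sketch (an illustration only, using Cesàro summation, i.e. the limit of the averages of the partial sums, which is a standard regular and linear method). It shows that periodically inserting zeroes into $1-1+1-\dots$ changes the assigned value, even though prepending a single zero does not, so dilution-invariance really is an extra hypothesis:

```python
from fractions import Fraction

def cesaro_mean(terms):
    """Average of the first len(terms) partial sums (exact rationals)."""
    partial = Fraction(0)
    acc = Fraction(0)
    for t in terms:
        partial += t
        acc += partial
    return acc / len(terms)

N = 6000  # divisible by 6, so the periodic averages below are exact

grandi  = [(-1) ** n for n in range(N)]  # 1 - 1 + 1 - 1 + ...
shifted = [0] + grandi[:-1]              # 0 + 1 - 1 + 1 - ...
diluted = [1, 0, -1] * (N // 3)          # 1 + 0 - 1 + 1 + 0 - 1 + ...

print(cesaro_mean(grandi))   # 1/2
print(cesaro_mean(shifted))  # 1/2  -- prepending one zero changes nothing
print(cesaro_mean(diluted))  # 2/3  -- periodic dilution changes the value
```

Of course these finite averages only coincide with the Cesàro limits because the partial sums here are periodic and $N$ is a multiple of the period.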

  • It seems like the following covers it, which is similar to what you already have: $$ S(0 + a_1 + a_2 + ...) + S(a_0 + 0 + 0 + ...) = S(a_0+a_1+a_2+...)$$ Yet, to be precise, it would help to know how $S$ is defined: is it a map from nonnegative infinite sequences to real numbers, or to extended real numbers (so the answer can be infinity), or is it possibly not defined for all sequences? This would help clarify your definitions of "stable", "regular", and "linear", since we don't know whether $s$ and $t$ are nonnegative reals, or can be infinite, or whether there are restrictions on the sequences. (2017-02-18)
  • @Michael Sorry that I haven't specified this; in full generality $S$ is a (partial) map from the set of all infinite sequences to the extended real numbers, or even the complex numbers, but I guess there isn't a great difference between the cases. (2017-02-18)
  • I don't know what a "partial" map is. (2017-02-18)
  • @Michael It may not be defined for some elements of the domain. (2017-02-18)
  • In that case definitions like "linear" are ambiguous, as we do not know that linear combinations of sequences are in the domain allowed by $S$. I would expect some structure on the domain of $S$ to be assumed. (2017-02-18)
  • @Michael This is a nice observation: in the linearity condition, both sides of the implication could be ill-defined. However, I think the other conditions are consistent. (2017-02-18)
  • There is also ambiguity in what is meant by a "standard" convergent series. Is $\sum_{n=0}^{\infty} (-1)^n\frac{1}{n+1}$ a "standard convergent series" or not? The limit of partial sums converges, but the sum is not independent of order. There is no ambiguity if we restrict to nonnegative sequences. (2017-02-18)

1 Answer


The notation you used to explicate stability is prone to misunderstanding, and I think that happened to you.

Let $S := \mathbb{C}^{\mathbb{N}}$ be the space of all complex sequences. For some things, it will be more convenient to explicitly write/view a sequence as a map $\mathbb{N} \to \mathbb{C}$, rather than to use index notation, so I will use function notation for sequences. $S$ contains a lot of famous subspaces, but the one we are primarily interested in here is

$$c := \left\{f \in S : \lim_{n\to \infty} f(n) \text{ exists}\right\}.$$

Before we come to the summation methods, we define a few useful linear operators on $S$. We need the partial-sums operator and its inverse, the difference operator,

$$\Psi f \colon n \mapsto \sum_{k = 0}^n f(k)\quad\text{and}\quad \Delta f\colon n \mapsto \begin{cases}\qquad f(0) &, n = 0 \\ f(n) - f(n-1) &, n > 0.\end{cases}$$

Further, we need the left and right shift operators,

$$L f \colon n \mapsto f(n+1),\qquad R f \colon n \mapsto \begin{cases}\quad 0 &, n = 0 \\ f(n-1) &, n > 0 \end{cases}$$

and the injection $I \colon \mathbb{C} \to S$ given by

$$I a \colon n \mapsto \begin{cases} a &, n = 0 \\ 0 &, n > 0. \end{cases}$$
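These operators can be sanity-checked mechanically. The following Python sketch (an illustration only) represents sequences as functions $\mathbb{N} \to \mathbb{C}$ and verifies the identities used below, namely $\Delta(\Psi f) = f$, $\Psi(\Delta f) = f$, $L(Rf) = f$, and $RLf = f - I\bigl(f(0)\bigr)$:

```python
def Psi(f):    # partial-sums operator
    return lambda n: sum(f(k) for k in range(n + 1))

def Delta(f):  # difference operator, the inverse of Psi
    return lambda n: f(0) if n == 0 else f(n) - f(n - 1)

def L(f):      # left shift: drop the first term
    return lambda n: f(n + 1)

def R(f):      # right shift: prepend a zero
    return lambda n: 0 if n == 0 else f(n - 1)

def I(a):      # inject a scalar as the sequence (a, 0, 0, ...)
    return lambda n: a if n == 0 else 0

f = lambda n: (-1) ** n * (n + 1)  # sample sequence 1, -2, 3, -4, ...

for n in range(20):
    assert Delta(Psi(f))(n) == f(n)         # Delta inverts Psi ...
    assert Psi(Delta(f))(n) == f(n)         # ... and vice versa (telescoping)
    assert L(R(f))(n) == f(n)               # LR = id
    assert R(L(f))(n) == f(n) - I(f(0))(n)  # RL f = f - I(f(0))
```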

Now we define

  • a limiting method is a map $\Lambda \colon \mathcal{D}(\Lambda) \to \mathbb{C}$, where $\mathcal{D}(\Lambda)$ is a subset of $S$, and
  • a summation method is a map $\Sigma \colon \mathcal{D}(\Sigma) \to \mathbb{C}$, where $\mathcal{D}(\Sigma)$ is a subset of $S$.

So far, limiting methods and summation methods are exactly the same thing - a rather uninteresting thing, by the way - but we view them differently, and they will start to differ and become more interesting when we consider methods with some useful properties.

We note that every limiting method induces a summation method and vice versa via

$$\Sigma_{\Lambda} f := \Lambda (\Psi f)\quad\text{and}\quad \Lambda_{\Sigma} f := \Sigma(\Delta f).$$

Thus $\mathcal{D}(\Sigma_{\Lambda}) = \Delta(\mathcal{D}(\Lambda))$ and $\mathcal{D}(\Lambda_{\Sigma}) = \Psi(\mathcal{D}(\Sigma))$.

Now a limiting or summation method $M$ is (unsurprisingly) linear if $\mathcal{D}(M)$ is a linear subspace of $S$, and $M \colon \mathcal{D}(M) \to \mathbb{C}$ is linear. A limiting method $\Lambda$ is regular if $c\subset \mathcal{D}(\Lambda)$ and $\Lambda f = \lim\limits_{n \to \infty} f(n)$ for all $f\in c$. A summation method $\Sigma$ is regular if the associated limiting method $\Lambda_{\Sigma}$ is regular, i.e. if $\Delta(c) \subset \mathcal{D}(\Sigma)$ and

$$\Sigma f = \lim_{n \to \infty} \sum_{k = 0}^n f(k)$$

for all $f \in \Delta(c)$. We note that a limiting method is linear (resp. regular) if and only if the associated summation method has that property.

This undoubtedly has all been rather cumbersome and boring, but now we can define stability in a concise and unambiguous way. We say a summation method $\Sigma$ is strongly stable (s-stable for short) if

$$f \in \mathcal{D}(\Sigma) \iff Lf \in \mathcal{D}(\Sigma),\quad\text{and}\quad \Sigma f = f(0) + \Sigma(Lf).$$

A limiting method $\Lambda$ is called s-stable if the associated summation method $\Sigma_{\Lambda}$ is s-stable. This is what Wikipedia calls stable. And a summation method $\Sigma'$ is weakly stable (w-stable) if

$$f \in \mathcal{D}(\Sigma') \iff RLf \in \mathcal{D}(\Sigma'), \quad\text{and}\quad \Sigma' f = f(0) + \Sigma'(RL f).$$

A limiting method is called w-stable if the associated summation method is w-stable. It seems that this is the notion you used as stability. We have the

Lemma: Every regular and linear summation method is w-stable.

Proof: Clearly $I(\mathbb{C}) \subset c$, and since $RL f = f - I\bigl(f(0)\bigr)$ we have $f \in \mathcal{D}(\Sigma) \iff RL f \in \mathcal{D}(\Sigma)$ for every regular and linear summation method $\Sigma$. The regularity yields $\Sigma \bigl(I\bigl(f(0)\bigr)\bigr) = f(0)$, and then linearity yields

$$\Sigma f = \Sigma\bigl(I\bigl(f(0)\bigr) + RLf\bigr) = \Sigma\bigl(I\bigl(f(0)\bigr)\bigr) + \Sigma(RL f) = f(0) + \Sigma(RL f). \hspace{3em}\square$$
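As a concrete instance of the lemma, consider Abel summation, $\Sigma f := \lim_{x \to 1^-} \sum_n f(n) x^n$, which is regular and linear. For Grandi's sequence $f(n) = (-1)^n$ one has $\sum_n f(n)x^n = \frac{1}{1+x}$ and $\sum_n (RLf)(n)x^n = \frac{-x}{1+x}$, so $\Sigma f = f(0) + \Sigma(RLf)$ holds already before taking the limit. A quick numerical sketch (an illustration only, using truncated power series):

```python
def abel_partial(f, x, terms=30000):
    """Truncated Abel sum: sum of f(n) * x**n for n < terms."""
    return sum(f(n) * x ** n for n in range(terms))

f = lambda n: (-1) ** n                     # Grandi's sequence 1, -1, 1, ...
rlf = lambda n: 0 if n == 0 else (-1) ** n  # RL f: first term replaced by 0

for x in (0.9, 0.99, 0.999):
    lhs = abel_partial(f, x)
    rhs = f(0) + abel_partial(rlf, x)
    assert abs(lhs - rhs) < 1e-9            # Sigma f = f(0) + Sigma(RL f)
    assert abs(lhs - 1 / (1 + x)) < 1e-9    # closed form 1/(1+x), tends to 1/2
```

The truncation at 30000 terms is harmless here since $x^{30000}$ is negligible even for $x = 0.999$.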

Things are different for s-stability, however. Before we give a few examples of regular and linear summation methods which aren't s-stable, we prove the

Lemma: A regular and linear summation method $\Sigma$ is s-stable if and only if

$$f \in \mathcal{D}(\Sigma) \iff Rf \in \mathcal{D}(\Sigma), \quad\text{and}\quad \Sigma f = \Sigma (Rf).$$

Proof: Since $f = LR f$ and $(Rf)(0) = 0$, if $\Sigma$ is s-stable the definition directly yields

$$(Rf) \in \mathcal{D}(\Sigma) \iff L(Rf) = f \in \mathcal{D}(\Sigma),$$

and

$$\Sigma (Rf) = (Rf)(0) + \Sigma(LRf) = \Sigma(LRf) = \Sigma f.$$

Conversely, if $\Sigma$ is a regular and linear summation method with the property of the lemma, then $I\bigl(f(0)\bigr) \in c \subset \mathcal{D}(\Sigma)$ and $f = I\bigl(f(0)\bigr) + RLf$ shows

$$f \in \mathcal{D}(\Sigma) \iff RL f \in \mathcal{D}(\Sigma) \iff Lf \in \mathcal{D}(\Sigma)$$

and

$$\Sigma f = \Sigma \bigl(I\bigl(f(0)\bigr) + RL f\bigr) = \Sigma\bigl(I\bigl(f(0)\bigr)\bigr) + \Sigma(RL f) = f(0) + \Sigma(RL f) = f(0) + \Sigma(Lf). \hspace{3em}\square$$

For our first example of a regular and linear but not strongly stable method, it is more convenient to use limiting methods. Let $\mathscr{U}$ be a free ultrafilter on $\mathbb{N}$. Define

$$\Lambda_{\mathscr{U}} f = \lim f(\mathscr{U})$$

whenever that limit exists. Thus $\mathcal{D}(\Lambda_{\mathscr{U}}) \supsetneq \ell^{\infty}(\mathbb{N},\mathbb{C}) \supset c$, and since $\mathscr{U}$ is free, $\Lambda_{\mathscr{U}}$ is regular. But $\Lambda_{\mathscr{U}}$ is not strongly stable. As an ultrafilter, $\mathscr{U}$ contains either the set of even natural numbers or the set of odd natural numbers, but not both. The sequence $f \colon n \mapsto (-1)^n$ belongs to $\ell^{\infty}(\mathbb{N},\mathbb{C}) \subset \mathcal{D}(\Lambda_{\mathscr{U}})$, and we have $\Lambda_{\mathscr{U}} f = \pm 1$, the sign depending on whether the set of even naturals or the set of odd naturals belongs to $\mathscr{U}$ and

$$\Lambda_{\mathscr{U}}(Rf) = - \Lambda_{\mathscr{U}} f \neq \Lambda_{\mathscr{U}} f.$$

Such a summation/limiting method is of course a) not explicit, and b) not used in practice.

But analytic continuation of Dirichlet series is a widely used summation method, and a slight modification of that yields a regular and linear summation method that is not strongly stable. We let $\mathcal{D}(\Sigma)$ be the subspace of $S$ such that

  • the Dirichlet series $D_f(s) :=\sum_{n = 0}^{\infty} \frac{f(n)}{(n+1)^s}$ converges in some half-plane $\operatorname{Re} s > \alpha$,
  • the function $D_f$ has a meromorphic continuation $M_f$ to at least the half-plane $\operatorname{Re} s > 0$, and
  • $\lim\limits_{t \to 0^+} M_f(t)$ exists,

and define

$$\Sigma f := \lim_{t \to 0^+} M_f(t)$$

for $f \in \mathcal{D}(\Sigma)$.

This method differs from analytic continuation of the Dirichlet series in that a) we don't require that the continuation is holomorphic on a neighbourhood of $0$; in that respect $\Sigma$ is more general, and b) we require the existence of a meromorphic continuation to the whole half-plane $\operatorname{Re} s > 0$; in that respect $\Sigma$ is less general.

We omit the trivial proof that $\Sigma$ is linear and the easy proof that $\Sigma$ is regular, and skip forward to showing that $\Sigma$ is not s-stable.

Since the Riemann $\zeta$-function is meromorphic on the whole plane with its only pole at $s = 1$, the two sequences $a \colon n \mapsto 1$ and $b \colon n \mapsto n + 1$ belong to $\mathcal{D}(\Sigma)$ (their Dirichlet series are $\zeta(s)$ and $\zeta(s-1)$, respectively), and we have

$$\Sigma a = \zeta(0) = -\frac{1}{2} \quad\text{and}\quad \Sigma b = \zeta(-1) = -\frac{1}{12}.$$

But $Rb = b - a$, and hence $Rb \in \mathcal{D}(\Sigma)$ with

$$\Sigma(Rb) = \Sigma(b - a) = -\frac{1}{12} + \frac{1}{2} = \frac{5}{12} \neq \Sigma b,$$

so $\Sigma$ is not s-stable.
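The $\zeta$-values used above can be double-checked without any special-function library: one has $\zeta(s) = \eta(s)/(1 - 2^{1-s})$ with $\eta(s) = \sum_{n \ge 0} (-1)^n/(n+1)^s$, an identity that persists under analytic continuation, and the Euler transform of the alternating $\eta$-series terminates after finitely many exact steps for $s = 0$ and $s = -1$. A sketch in Python (an illustration only; the classical computation goes back to Euler):

```python
from fractions import Fraction

def euler_sum(a, terms=12):
    """Euler transform of the alternating series sum (-1)^n a(n):
    sum_k (-1)^k (D^k a)(0) / 2^(k+1), with D the forward difference."""
    row = [Fraction(a(n)) for n in range(terms)]
    total = Fraction(0)
    for k in range(terms - 1):
        total += Fraction((-1) ** k, 2 ** (k + 1)) * row[0]
        row = [row[i + 1] - row[i] for i in range(len(row) - 1)]
    return total

# eta(0): a(n) = 1;  eta(-1): a(n) = n + 1  (both transforms terminate exactly)
eta0  = euler_sum(lambda n: 1)      # 1/2
eta_1 = euler_sum(lambda n: n + 1)  # 1/4

zeta0  = eta0  / (1 - Fraction(2) ** 1)  # zeta(0)  = -1/2
zeta_1 = eta_1 / (1 - Fraction(2) ** 2)  # zeta(-1) = -1/12

assert zeta_1 - zeta0 == Fraction(5, 12)  # Sigma(Rb) = zeta(-1) - zeta(0)
```

In the notation above, $\Sigma a = \zeta(0)$, $\Sigma b = \zeta(-1)$, and $\Sigma(Rb) = \zeta(-1) - \zeta(0) = \frac{5}{12}$.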

  • Very interesting and complete answer; a lot of things seem clearer to me now! Could you be so kind as to recommend some textbooks from which you learned these concepts? (2017-03-09)
  • I haven't learned these concepts from textbooks; the only one I know of is Hardy's "Divergent Series", but I haven't got round to reading it yet. It has a good reputation, though, and as far as I know it is freely (and legally) available as a PDF (I don't have a URL, sorry). (2017-03-09)
  • Actually, I have a physical copy of Hardy's book; it is a milestone in the field, but it is also almost a hundred years old, and I would like to learn more about recent developments. Every reference (books, articles, pages that are not Wikipedia) that pops into your mind is well received. Thanks again for the effort. (2017-03-09)