7

We often ignore convergence when manipulating formal power series. We can add, subtract, multiply, divide, differentiate, and compose such formal sums without worrying about convergence, and the resulting equation is formally true. But when we try to plug an arbitrary number into these equations, problems can arise.

Take an example from Concrete Mathematics EXERCISE 7.32:

An arithmetic progression is an infinite set of integers \[\{an+b\} = \{b,a+b,2a+b,3a+b,\ldots\}\] A set of arithmetic progressions $\{a_1n+b_1\},\ldots,\{a_mn+b_m\}$ is called an exact cover if every nonnegative integer occurs in one and only one of the progressions. For example, the three progressions $\{2n\},\{4n+1\},\{4n+3\}$ constitute an exact cover. Show that if $\{a_1n+b_1\},\ldots,\{a_mn+b_m\}$ is an exact cover such that $2 \le a_1 \le \cdots \le a_m$, then $a_{m-1} = a_m$.

Well, we can use generating functions. First assume that $b_k \ge 0$. To a set $A$ of nonnegative integers we associate the power series $\sum_{n \in A} z^n$; since the progressions cover every nonnegative integer exactly once, \[ \sum_{n \ge 0} z^n = \sum_{k=1}^m \sum_{n \ge 0} z^{a_kn+b_k} \tag{1} \] Formally, we derive that \[ \frac 1 {1-z} = \sum_{k=1}^m \frac {z^{b_k}} {1-z^{a_k}} \tag{2} \] We want to plug $\omega = e^{2\pi\imath/a_m}$ into equation (2), but wait a minute: $\sum_{n \ge 0} \omega^n$ diverges, so it is illegal to plug $z = \omega$ into equation (1), and we must argue differently. Well, it's easy: equation (1) holds whenever $|z| < 1$, and so does equation (2). Both sides of equation (2) are rational functions that agree for all $|z| < 1$, so they agree for every $z \in \mathbb{C}$ away from the poles. Now we can conclude that $a_{m-1} = a_m$: otherwise, letting $z \to \omega$ from inside the unit disk, the left side of equation (2) stays finite while the right side tends to infinity.
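(As a quick sanity check, here is a minimal Python sketch of equation (2); the exact cover $\{2n\},\{4n+1\},\{4n+3\}$ is the one from the exercise, and the evaluation point is an arbitrary choice with $|z|<1$.)

```python
# Numerical sanity check of equation (2) for the exact cover {2n}, {4n+1}, {4n+3};
# z = 0.3 + 0.4j is an arbitrary point with |z| < 1 chosen for illustration.
z = 0.3 + 0.4j

a = [2, 4, 4]
b = [0, 1, 3]

lhs = 1 / (1 - z)
rhs = sum(z**bk / (1 - z**ak) for ak, bk in zip(a, b))

print(abs(lhs - rhs))  # ~1e-16: the two sides of (2) agree inside the unit disk
```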

Since many readers may not know what I mean, let me give another example:

Let $a_0 = 1, a_1, a_2, \ldots$ be a real sequence, and let $A(z) = \sum_{n \ge 0} a_nz^n$ be its generating function. Now let's find its reciprocal.

Writing $A(z) = 1 + \bar A(z)$, we get

\begin{align*} \dfrac 1 {A(z)} &= 1 + \sum_{m > 0} (-1)^m \bar A(z)^m \\ &= 1 + \sum_{m > 0} (-1)^m \sum_{k_1, \ldots, k_m > 0} a_{k_1} \cdots a_{k_m} z^{k_1 + \cdots + k_m} \\ &= 1 + \sum_{n > 0} z^n \sum_{m > 0} (-1)^m \sum_{\scriptstyle k_1, \ldots, k_m > 0 \atop \scriptstyle k_1 + \cdots + k_m = n} a_{k_1} \cdots a_{k_m} = B(z) \end{align*}

I have interchanged the order of summation without any consideration of absolute convergence, so $1/A(z) = B(z)$ is only formally true.

From such a proof alone, can we say that whenever $A(z_0)$ and $B(z_0)$ both converge to non-zero values, $1 / A(z_0) = B(z_0)$?
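(To convince myself, here is a minimal numerical sketch; the test sequence $a$ and the truncation order are my own arbitrary choices. It checks that the coefficients of $B$ obtained from the formal expansion agree with those of $1/A$ computed from the coefficient recursion $b_0 = 1$, $b_n = -\sum_{j=1}^n a_j b_{n-j}$ that follows from $A \cdot B = 1$.)

```python
# Check that the formally derived series B agrees with the reciprocal of A
# computed by the standard coefficient recursion.  The test sequence
# a = [1, 2, -1, 3, 0.5, -2] is an arbitrary choice with a_0 = 1.
a = [1, 2, -1, 3, 0.5, -2]
N = len(a)  # work with power series truncated at degree N-1

def mul(p, q):
    """Product of two truncated power series (lists of coefficients)."""
    r = [0.0] * N
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            if i + j < N:
                r[i + j] += pi * qj
    return r

# B(z) = 1 + sum_{m>0} (-1)^m Abar(z)^m with Abar = A - 1, truncated at degree N-1
abar = [0.0] + a[1:]
b_formal = [1.0] + [0.0] * (N - 1)
power = [1.0] + [0.0] * (N - 1)          # Abar^0
for m in range(1, N):                    # Abar^m has no terms below degree m
    power = mul(power, abar)
    sign = -1 if m % 2 else 1
    b_formal = [x + sign * y for x, y in zip(b_formal, power)]

# reciprocal of A by the recursion b_0 = 1, b_n = -sum_{j=1}^n a_j b_{n-j}
b_rec = [1.0]
for n in range(1, N):
    b_rec.append(-sum(a[j] * b_rec[n - j] for j in range(1, n + 1)))

print(max(abs(x - y) for x, y in zip(b_formal, b_rec)))  # ~0: the coefficients agree
```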

Now let's consider a more general problem. It's what I want to ask.

If we obtain $A(z) = B(z)$ by manipulating generating functions (usually formal power series), where manipulation includes adding, subtracting, multiplying, dividing, differentiating, integrating, and changing variables (functional composition), and $A(z_0)$, $B(z_0)$ are convergent (if the expression is an explicit power series) or nondegenerate (if it is not), can we conclude that $A(z_0) = B(z_0)$? If not, could you give some useful sufficient conditions?

In the first example, $A(z) = 1/(1-z)$ and $B(z) = \sum_{k=1}^m z^{b_k}/(1-z^{a_k})$. We get $A(z) = B(z)$ by manipulating formal power series, and it extends to an identity of rational functions valid for all $z$ (away from the poles).

Thanks for any help.

  • 0
@uncookedfalcon I've edited a lot. Is it easier to understand? (2012-05-27)

3 Answers

4

$\def\RR{\mathbb{R}}$The main things we need here are the following facts:

  • There is a subring $R \subseteq \RR[[x]]$ consisting of those power series that converge near $a$.
  • The "evaluation at $a$" map $R \to \RR$ is a ring homomorphism.

With these facts, if $f,g \in R$ and $fg=1$ in $\RR[[x]]$, then $fg=1$ in $R$, and applying the evaluation homomorphism gives $f(a)g(a) = 1$.
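As a toy illustration (my own example, with $f = 1-x$ and $g = \sum_n x^n$, which satisfy $fg = 1$ in $\RR[[x]]$, and an arbitrary evaluation point $a = 0.5$ inside both radii of convergence):

```python
# Evaluating f = 1 - x and g = sum_n x^n at a = 0.5; both lie in the subring R
# of series converging near a, and the evaluation homomorphism gives f(a)g(a) = 1.
a = 0.5
f_at_a = 1 - a
g_at_a = sum(a**n for n in range(200))  # partial sum; the tail is ~2^-200
print(f_at_a * g_at_a)                  # ≈ 1.0
```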

  • 1
@Marc: Thanks: I was wondering about that, but in the end I went with the statement that was in my memory (after confirming it with a sloppy heuristic argument). (2012-06-03)
3

I admit I don't really understand the relevance of the general question to the specific example. Convergence is a property of a formal power series at a point. If $A(z) = B(z)$ as formal power series, then $A$ converges iff $B$ converges and by definition we have $A(z_0) = B(z_0)$ whenever we have convergence.

As for the specific example, the problem is not that we have an equality $A(z) = B(z)$ of formal power series but that we have e.g. an equality $A(z) = B(z) + C(z)$ where $A$ may converge at $z = z_0$ while $B$ and $C$ might not. But this isn't a big issue in the example; both sides are rational functions, so you can just think about poles.

  • 0
I've edited a lot and figured out the relationship between the example and the general problem. Sorry for my bad English! (2012-05-27)
2

Working with formal power series is actually quite easy if you think of them as formal objects, and of all the operations on them as operations on formal series. Part of your confusion seems to be that you don't really think of them that way; the notation $A(z)$ encourages such confusion, so I'll avoid it and assume the indeterminate is called $X$ rather than $z$ (this is mainly psychological, but for me it helps).

If $A=B$ as formal power series, they are really just the same, and everything you can do with $A$ you can do with $B$; in particular, for any fixed $z\in\mathbf C$ the substitution of $z$ for $X$ into $A$ gives a converging series if and only if the same is true for $B$, and in that case the two series have the same sum.

Here is this applied to your first example. For $1\leq k\leq m$ put $A_k=\sum_{n\geq0}X^{a_kn+b_k}$, and assume that $\sum_{k=1}^mA_k=\sum_{n\geq0}X^n=A_0$ (my definition, to facilitate the sequel). One has $A_k=\frac{X^{b_k}}{1-X^{a_k}}$ and $A_0=\frac1{1-X}$; these are identities of formal power series, meaning that the denominators are invertible formal power series, and multiplying the inverse of each denominator by the corresponding numerator gives the series that the fraction is equated to.

Now take $\omega$ a primitive $a_m$-th root of unity, as you do in the question. Since the powers of $\omega$ do not tend to $0$ and all of the series $A_k$ have coefficients of arbitrarily large powers of $X$ equal to $1$, none of these formal series gives a converging series when $\omega$ is substituted for $X$. This does not help you get to your conclusion. Instead you reason by substituting values near $\omega$ but strictly inside the unit circle, for which all these formal series do give converging numeric series. You can reason by comparing the functions defined by these series on the interior of the disk with their limit behaviour near $\omega$; that limit exists for those $A_k$ with $a_k < a_m$ (actually $a_k$ not being a multiple of $a_m$ is what counts; the limit exists for $A_0$ since $a_0=1$). Now $A_m=A_0-A_1-A_2-\cdots-A_{m-1}$, and one must have $a_{m-1}=a_m$, or else the limit would exist for the right hand side (finite sums commute with limits) but not for the left hand side.

Personally I would prefer another approach, which avoids two levels of limits (one to get a converging series and another for the limit of their limits). Instead of looking at the series obtained by substitution, look at the sequence of partial sums $(\sum_{l=0}^nc_l\omega^l)_{n=0}^\infty$ associated to a formal series $\sum_lc_lX^l$. While these sequences never converge for the series $A_k$, they behave differently according as $a_k < a_m$ or $a_k=a_m$: the sequence is bounded in the former case (because $1+\omega^{a_k}+\omega^{2a_k}+\cdots+\omega^{(a_m-1)a_k}=\frac{1-\omega^{a_ma_k}}{1-\omega^{a_k}}=0$ when $a_k < a_m$, so the partial sums are periodic), and unbounded in the latter case. This allows concluding along the same lines (using that a sum of finitely many bounded sequences is bounded).
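Here is a small numerical illustration of that boundedness dichotomy, a sketch using the cover $\{2n\},\{4n+1\},\{4n+3\}$ from the question (so $a_m=4$); the number of terms is an arbitrary choice.

```python
# With omega a primitive 4th root of unity, the partial sums of
# sum_n omega^(a_k n + b_k) stay bounded when a_k < a_m = 4 (periodic partial
# sums) and grow without bound when a_k = a_m.
import cmath

omega = cmath.exp(2j * cmath.pi / 4)  # primitive a_m-th root of unity, a_m = 4

def max_partial_sum(a_k, b_k, terms=100):
    s, biggest = 0, 0.0
    for n in range(terms):
        s += omega ** (a_k * n + b_k)
        biggest = max(biggest, abs(s))
    return biggest

print(max_partial_sum(2, 0))  # ~1: bounded, since 2 < 4
print(max_partial_sum(4, 1))  # ~100: unbounded growth, since a_k = a_m
print(max_partial_sum(4, 3))  # ~100: unbounded growth
```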

(Added) Your being cautious about the validity of substituting concrete (complex) values is quite justified. If one has $AB=C$ as formal power series (as in your second example, where $C=1$), then just knowing that for some given $z_0\in\mathbb C$ the substitutions of $z_0$ for $X$ into $A$ and into $B$ both give convergent series does not allow you to conclude that the same is true for the substitution of $z_0$ for $X$ into $C$. See my comment to the (inexact) answer by Hurkyl for an example where this fails.

However not all is lost. First of all, if the substitutions into $A$ and $B$ both give absolutely convergent series, then the same is true for the substitution into $C$, and moreover the limit for $C$ is the product of those for $A$ and $B$ in this case. Also, if the substitution of some nonzero $z_0$ gives a (just) convergent series, then substitution of any $z$ with $|z|<|z_0|$ gives an absolutely convergent series. I believe that in this case it is guaranteed that the limit of the series at $z_0$ is also the limit as $z\to z_0$ of the limits of the (absolutely convergent) series at $z$ with $|z|<|z_0|$; in other words, that the function defined by a series on the set of all points where it converges (if it has positive radius of convergence, this is the interior of a disk together with some subset of its boundary) is continuous. If this is so (I have no easy reference or proof at hand), then at least we will have that, assuming $AB=C$ and that all three series converge at $z_0$, one can conclude $A(z_0)B(z_0)=C(z_0)$.
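For what it is worth, here is a minimal numerical sketch of the first statement, with $A=\sum_n X^n/n!$ and $B=\sum_n(-X)^n/n!$ (so $AB=C=1$ as formal series); the point $z_0$ and the truncation order are arbitrary choices for the illustration.

```python
# Both A and B converge absolutely at z0, and the Cauchy-product series C
# converges there to A(z0) * B(z0), which here equals 1.
from math import factorial

z0 = 0.7 + 0.2j
N = 30  # truncation order for the illustration

a_terms = [z0**n / factorial(n) for n in range(N)]       # terms of A at z0
b_terms = [(-z0)**n / factorial(n) for n in range(N)]    # terms of B at z0
c_terms = [sum(a_terms[j] * b_terms[n - j] for j in range(n + 1))
           for n in range(N)]                            # Cauchy-product terms of C at z0

print(abs(sum(a_terms) * sum(b_terms) - sum(c_terms)))   # ~0: C(z0) = A(z0) B(z0)
print(abs(sum(c_terms) - 1))                             # ~0: and indeed C = 1
```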

  • 0
The case of rational functions is not too difficult. I'm afraid I'll have to manipulate more complicated ones, for example involving $e^z$ or something else. (2012-06-03)