
MathWorld gives the root linear coefficient theorem as

The sum of the reciprocals of roots of an equation equals the negative coefficient of the linear term in the Maclaurin series.

The theorem appears to me to be false as stated. For example, the equation $e^x = 0$ has no roots, yet (taking the Maclaurin series of $e^x$) the root linear coefficient theorem claims that the sum of the reciprocals of these nonexistent roots would be $-1$.
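
Just to make the counterexample concrete, here is a quick symbolic check (a minimal sketch using sympy; the snippet is mine, not MathWorld's):

```python
import sympy as sp

x = sp.symbols('x')

# Maclaurin series of exp(x): the linear coefficient is 1, so the theorem
# as stated would claim the sum of the reciprocal roots of exp(x) = 0 is -1.
print(sp.series(sp.exp(x), x, 0, 4))   # 1 + x + x**2/2 + x**3/6 + O(x**4)

# ...but exp(x) = 0 has no solutions, even over the complex numbers
# (expected output: EmptySet).
print(sp.solveset(sp.exp(x), x, domain=sp.S.Complexes))
```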

Is MathWorld missing some hypotheses? Or is there something happening in the complex plane that I'm not aware of? The MathWorld entry also says to see Vieta's formulas, but those are for polynomials and not Maclaurin series.

The only real information I could find from a Google search on "root linear coefficient theorem" was this statement (from Robert Israel of UBC):

This won't work in general for non-polynomials (e.g. try it for $p(x)\exp(x)$). For a rational function such that $0$ is neither a root nor a pole, you want to take the sum of the reciprocals of the roots minus the sum of the reciprocals of the poles (again counting multiplicity).

O.K., so it won't work for non-polynomials, and for rational functions you have to include the poles.
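
In fact the rational-function version is easy to check numerically. Here is a sketch with numpy (the two polynomials are arbitrary examples of mine; the claim being tested is that $-f'(0)/f(0)$, the negative linear Maclaurin coefficient of $f(x)/f(0)$, equals the sum of reciprocal roots minus the sum of reciprocal poles):

```python
import numpy as np

# A rational function f = p/q with neither a root nor a pole at 0.
p = np.array([1.0, -3.0, 2.0, 5.0])   # p(x) = x^3 - 3x^2 + 2x + 5
q = np.array([2.0, 1.0, -7.0])        # q(x) = 2x^2 + x - 7

# Linear Maclaurin coefficient of f(x)/f(0), which is
# f'(0)/f(0) = p'(0)/p(0) - q'(0)/q(0).
lin_coeff = np.polyval(np.polyder(p), 0) / np.polyval(p, 0) \
          - np.polyval(np.polyder(q), 0) / np.polyval(q, 0)

# Sum of reciprocal roots minus sum of reciprocal poles
# (complex roots come in conjugate pairs, so imaginary parts cancel).
rhs = sum(1 / r for r in np.roots(p)) - sum(1 / r for r in np.roots(q))

print(-lin_coeff, rhs.real)   # the two values agree
```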

But then why is MathWorld applying it to $\sin z/z$ in this "proof" that $\zeta(2) = \sum_{n=1}^{\infty} \frac{1}{n^2} = \pi^2/6$? (Start near eq. (18).)

The value $\zeta(2)$ can also be found simply using the root linear coefficient theorem. Consider the equation $\sin z = 0$ and expand $\sin$ in a Maclaurin series:
$$\sin z = z - \frac{z^3}{3!} + \frac{z^5}{5!} - \cdots = 0,$$
so, dividing through by $z$,
$$0 = 1 - \frac{z^2}{3!} + \frac{z^4}{5!} - \cdots = 1 - \frac{w}{3!} + \frac{w^2}{5!} - \cdots,$$
where $w = z^2$. But the zeros of $\sin z$ occur at $z = \pi, 2\pi, 3\pi, \ldots$, or $w = \pi^2, (2\pi)^2, (3\pi)^2, \ldots$. Therefore, the sum of the [reciprocals of the] roots equals the [negative of the] coefficient of the [linear] term:
$$\frac{1}{\pi^2} + \frac{1}{(2\pi)^2} + \frac{1}{(3\pi)^2} + \cdots = \frac{1}{3!} = \frac{1}{6},$$
which can be rearranged to yield
$$\zeta(2) = \frac{\pi^2}{6}.$$

(This is where I ran across the root linear coefficient theorem in the first place.)
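
(Whatever the status of the theorem, the numerical conclusion itself is easy to confirm; here is a quick partial-sum sanity check in plain Python:)

```python
import math

# Partial sum of 1/pi^2 + 1/(2 pi)^2 + 1/(3 pi)^2 + ... should approach 1/6.
s = sum(1 / (n * math.pi) ** 2 for n in range(1, 1_000_001))
print(s, 1 / 6)                            # ~0.1666666 vs 0.1666667

# Equivalently, zeta(2) should approach pi^2/6.
print(s * math.pi ** 2, math.pi ** 2 / 6)  # ~1.6449331 vs 1.6449341
```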

Could someone enlighten me with respect to these questions:

  1. Is MathWorld just wrong?

  2. Am I missing something here?

  3. What is the correct statement of the root linear coefficient theorem?

2 Answers

This is something between an answer and a comment:

I believe that the root linear coefficient theorem only holds for polynomials (and can possibly be extended to rational functions as explained by the OP). The proof is simple (it's in Robert Israel's post and is basically just Vieta's formulas). I believe that MathWorld uses this theorem to obtain $\zeta(2)$ because this is the line of argument Euler used to solve the Basel problem. However, Euler's proof is not rigorous (as the root linear coefficient theorem cannot be applied to $\sin(x)/x$, which is not a polynomial).

So to the answers of your question:

  1. Yes.

  2. No.

  3. Let $p(x)$ be a polynomial with $p(0) \neq 0$; then the sum of the reciprocals of the roots of $p(x)$ equals the negative of the coefficient of the linear term in the Maclaurin series of $p(x)/p(0)$.
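
For what it's worth, this corrected statement is easy to verify numerically; a minimal sketch with numpy (the polynomial is an arbitrary example):

```python
import numpy as np

# Arbitrary polynomial with p(0) != 0; coefficients, highest degree first:
# p(x) = 3x^4 - 2x^3 + 4x^2 + x - 6.
p = np.array([3.0, -2.0, 4.0, 1.0, -6.0])

# Negative of the linear coefficient in the Maclaurin series of p(x)/p(0):
# the linear coefficient of p is p[-2], and dividing by p(0) = p[-1]
# gives p[-2]/p[-1].
lhs = -p[-2] / p[-1]

# Sum of the reciprocals of the roots (imaginary parts cancel in
# conjugate pairs).
rhs = sum(1 / r for r in np.roots(p))

print(lhs, rhs.real)   # should agree up to rounding
```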

  • @Mike & Moron: of course Weierstrass applies to $e^z$. Take a look at http://en.wikipedia.org/wiki/Weierstrass_factorization_theorem#The_Weierstrass_factorization_theorem and set $m=0$, $g(z)=z$, and take the sequence $p_n$ to be the empty sequence ;-)

Root linear coefficient theorem is only true for polynomials. It follows trivially from the Vieta formulae. Now let us never mention it again.

The MathWorld proof you cite has a massive hole. Namely, as mentioned in Fabian's comment, you need to know the product expansion (and then you're OK: compare coefficients of $z^2$)
$$\sin(\pi z) = \pi z \prod_{n=1}^\infty \left(1 - \frac{z^2}{n^2}\right).$$

This identity is not obvious, but is routine once you know enough complex analysis. Here is one approach (keyword: "order of growth of an entire function" may give you a theorem to invoke to avoid some of this work).

Let $f(z)$ be the quotient of $\sin(\pi z)$ by that infinite product. First we show $f$ is an entire holomorphic function. Then show that there exists an entire $g(z)$ with $f(z) = \exp(g(z))$ (this needs a proof, as $\log$ is multivalued).

Now the idea is to bound $f(z)$. Any bound of the form $|f(z)| \leq A\exp(C|z|)$ should do the trick. (For this, first showing $f$ is periodic is useful; then you only have to find a bound for $z$ near the imaginary axis.)

This gives a bound of the form $\operatorname{Re}(g(z)) \leq C'|z|$ for sufficiently large $|z|$. Now use the following lemma, which is better than Liouville's theorem.

Lemma: If $h(z)$ is entire and $\operatorname{Re}(h(z)) \leq C|z|^k$ for sufficiently large $|z|$, then $h$ is a polynomial of degree at most $k$.

Proof: Consider the Taylor series $h(z) = \sum_{j=0}^\infty a_j e^{i\phi_j} z^j$ with $a_j$ nonnegative real and $\phi_j$ real, and write $z = re^{i\theta}$. We know $\sum_{j=0}^\infty a_j r^j \cos(\phi_j + j\theta) \leq C r^k$ for $r$ sufficiently large. Now, for each fixed $m > k$, multiply by $1 + \cos(\phi_m + m\theta)$, integrate over $\theta$ from $0$ to $2\pi$, and let $r$ tend to infinity; the integral picks out $\pi a_m r^m$ plus a bounded term, which forces $a_m = 0$.

Upshot: now you know $g(z) = A + Bz$. To find $A$, set $z = 0$. To find $B$, differentiate and set $z = 0$.
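
(If you just want numerical evidence for the product expansion, and for the coefficient-of-$z^2$ comparison that yields $\zeta(2)$, before wading through the complex analysis, a quick truncated-product check in plain Python is enough; the truncation point below is arbitrary:)

```python
import math

def sin_product(z, terms=100_000):
    """Truncated product pi*z * prod_{n=1}^{terms} (1 - z^2/n^2)."""
    out = math.pi * z
    for n in range(1, terms + 1):
        out *= 1 - z * z / (n * n)
    return out

z = 0.3
print(math.sin(math.pi * z), sin_product(z))   # should nearly agree

# Comparing coefficients of z^2 in sin(pi z)/(pi z) = prod (1 - z^2/n^2)
# gives -pi^2/6 = -sum 1/n^2, i.e. zeta(2) = pi^2/6.
print(sum(1 / n**2 for n in range(1, 100_001)), math.pi**2 / 6)
```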