3
$\begingroup$

My question is whether induction (and backward induction) can be applied as $n \rightarrow \infty$.

To explain what I mean, I will show inductively that $a_1=\ldots=a_{n}=0$ is the only solution to $$ 0 = a_1x + (a_1+a_{2})x^{2}+ \ldots+(a_1+\ldots+a_{n})x^{n}, \qquad \forall \ x\in (0,1] $$ For $x$ small enough, we must have that $a_1=0$. Assume that $a_1=\ldots=a_{i-1}=0$, then
$$ 0 = a_ix^i + (a_{i}+a_{i+1})x^{i+1}+ \ldots+(a_i+\ldots+a_{n})x^{n}, \qquad \forall \ x\in (0,1] $$ so that for $x$ small enough we must have $a_i=0$. By induction, $a_1=\ldots=a_n=0$.

My question is whether, and where, this argument breaks down as $n \rightarrow \infty$. Specifically, does the same inductive reasoning work to show that $0=a_1=a_2=\ldots$ when $$ 0 = \sum_{i=1}^\infty x^i \left(\sum_{j=1}^{i} a_{j} \right), \qquad \forall \ x\in (0,1]? $$

What about backward induction? For example, if $$ 0 = \sum_{i=1}^n x^i \left(\sum_{j=i}^{n} a_{j} \right), \qquad \forall \ x\in \mathbb{R} $$ then by backward induction I can show that $0=a_1=\ldots=a_n$ (i.e., for $x$ large enough we must have $a_n=0$; now assume that $a_n=\ldots=a_{i+1}=0$, then for $x$ large enough we must have $a_i=0$). Does the same reasoning apply to show that $0=a_1=a_2=\ldots$ when $$ 0 = \sum_{i=1}^{\infty} x^i \left(\sum_{j=i}^{\infty} a_{j} \right), \qquad \forall \ x\in \mathbb{R}? $$
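In the finite case, the "for $x$ large enough" step works because a polynomial has a leading term that dominates all others; an infinite series has no leading term to grab onto. A minimal numeric sketch of the finite mechanism (the coefficient values below are arbitrary examples):

```python
# For a finite polynomial in the "backward" form, the top term dominates
# for large x, so poly(x) / x^n -> a_n; this is what backward induction uses.
coeffs = [2.0, -3.0, 0.0, 5.0]  # hypothetical a_1..a_4 (example values)

def poly(x, a):
    # evaluates sum_{i=1}^{n} x^i * (sum_{j=i}^{n} a_j)
    n = len(a)
    return sum(x**i * sum(a[i - 1:]) for i in range(1, n + 1))

x = 1e6
ratio = poly(x, coeffs) / x**len(coeffs)
assert abs(ratio - coeffs[-1]) < 1e-3  # ratio is close to a_n = 5
```

For an infinite series there is no index $n$ with which to form this ratio, so the base case of the backward induction never gets started.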

Edit: I have added a bounty because I would like to get more intuition for why the induction argument fails. Specifically,

  • Assuming convergence of the function, say the function is analytic, can you provide an example where the induction argument holds for finite $n$ and then falls apart when $n \rightarrow \infty$?

  • What is the intuition for why the argument breaks down?

  • The proof that an analytic function which equals zero on an open ball is identically zero uses the maximum modulus principle (https://en.wikipedia.org/wiki/Maximum_modulus_principle) rather than first writing down a series expansion and then showing that all the coefficients are zero. Is there a proof that first uses a series expansion?

  • Is this related to the fundamental theorem of algebra (https://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra) and why it cannot be expanded to an infinite dimensional polynomial?

3 Answers

6

No. In general, proving a statement for each positive integer $n$ by induction does not prove it "as $n \to \infty$". Induction only proves the statement for each positive integer.

In fact, for many statements, it does not even make sense to take $n \to \infty$. Even when it does, there is always an extra element that has to be introduced. For example, in your case, your infinite sums have to be defined by some notion of convergence, which wasn't necessary for finite sums. Your proof by induction (for finite $n$) never made use of this notion of convergence, so it couldn't possibly be right to just take the limit as $n \to \infty$ and apply the same proof.
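A minimal illustration (not specific to power series): the strict inequality $\sum_{k=1}^{n} 2^{-k} < 1$ holds for every finite $n$, and is easily proved by induction, yet it fails for the infinite sum, which equals exactly $1$. A numeric sketch:

```python
# Each finite partial sum of 2^{-k} is strictly below 1 (inductively:
# s_{n+1} = s_n + 2^{-(n+1)} = 1 - 2^{-(n+1)} < 1), but the limit equals 1.
s = 0.0
for k in range(1, 50):
    s += 2.0**-k
    assert s < 1.0          # true at every finite stage
assert abs(s - 1.0) < 1e-12  # ...yet the sum is converging to exactly 1
```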

  • 0
    Thanks for your answer. I've been reading about transfinite induction (https://en.wikipedia.org/wiki/Transfinite_induction) and I thought it might apply here. Is the reason that it wouldn't because it doesn't apply in the limit case?2017-01-30
  • 1
    Transfinite induction is induction on *all ordinals*, which is not particularly relevant here. You only have a single infinite case.2017-01-30
  • 0
    Assuming that all the convergence conditions on $a_i$ are satisfied (e.g. $g(x)=\sum b_i x^i$ is analytic and if $g(x) \equiv 0$ then $b_i=0$) then is there an intuitive reason that the induction argument fails as $n \rightarrow \infty$ (or is it just not intuitive what happens as $n \rightarrow \infty$)?2017-01-31
1

I'll answer the questions in your Edit here: your reasoning worked for a fixed $n$, so although in this particular case the claim for the infinite series does hold, some extra work would be needed to establish it.

There is a proof using the series expansion. If you have, say around $0$, $f(x) = \displaystyle\sum_{n=0}^{\infty} a_n x^n$ (assuming of course that the radius of convergence is $>0$), then $f$ is $C^{\infty}$, one can differentiate term by term, and one gets $\frac{f^{(n)}(0)}{n!} = a_n$. Now if $f$ equals zero on an open ball around zero, then its derivatives are $0$ too, so the $a_n$'s are zero too, and so $f=0$.
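The coefficient identity $\frac{f^{(n)}(0)}{n!} = a_n$ can be checked formally: differentiating term by term $n$ times and evaluating at $0$ leaves $n!\,a_n$. A small sketch with hypothetical example coefficients, using formal differentiation of a coefficient list:

```python
from math import factorial

def deriv(coeffs):
    # formal derivative of sum coeffs[k] * x^k: shift and scale
    return [k * c for k, c in enumerate(coeffs)][1:]

a = [3, -1, 4, -1, 5]  # hypothetical a_0..a_4 (example values)
for n in range(len(a)):
    d = a
    for _ in range(n):
        d = deriv(d)
    # d[0] is f^(n)(0); dividing by n! recovers a_n
    assert d[0] / factorial(n) == a[n]
```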

It is in fact related to the fundamental theorem of algebra, which can be proved using analytic functions (through Liouville's theorem). And I think the proof could be adjusted to fit some analytic functions (not all of them, though, I suspect; I would have to look into the details of the proof).

  • 0
    So what you are saying is that with some convergence conditions, induction can be done for infinite $n$?2017-02-09
  • 0
    It makes no sense to ask "for infinite $n$", because here you're obviously asking about integers, and integers are finite. What I'm saying is that in your proof, you took a fixed $n$, and showed by induction on $i\in \{1,...,n\}$ that $a_i =0$, which doesn't let you generalize "for all $i\in \mathbb{N}, a_i =0$", which is a claim that would make very little sense if the only sentence you write is the first one (because $a_i$ is only defined for $i\in \{1,...,n\}$).2017-02-09
  • 0
    Moreover, if you want to prove your second claim about the infinite series, then your induction argument isn't correct (although here your argument can be modified to fit the situation)2017-02-09
  • 0
    Thanks. Can you explain how the second claim can be modified?... also do you know a "simple" counterexample where some induction argument holds for finite $n$ but fails the moment $n \rightarrow \infty$?2017-02-11
  • 1
    Well using what I've said, you have that for all $i\in \mathbb{N}$, $\displaystyle\sum_{j=i}^{\infty} a_j = 0$. Then you can show, by induction, from this information, that every $a_i= 0$ (you can easily do it without induction; but if you really want, you can do it by induction).2017-02-11
  • 0
    And there are counterexamples as soon as the induction property stops making sense as $n\to \infty$, such as in your example2017-02-11
1

Theorem: If $(a_{k})_{k=0}^{\infty}$ is a sequence of complex numbers, and if $$ \sum_{k=0}^{\infty} a_{k} x^{k} = 0\quad\text{for all real $x$ in some neighborhood of $0$,} $$ then $a_{k} = 0$ for all $k$.

The simplest proof I know uses nothing but continuity of a convergent power series, i.e., the fact that if $\sum_{k} a_{k} x^{k}$ converges in some neighborhood $I$ of $0$, and if $f(x)$ denotes the sum for $x$ in $I$, then $f$ is continuous in $I$. (The details involve estimating the tail of a convergent power series by a convergent geometric series. Nothing as fancy as Taylor's theorem or the maximum modulus theorem is needed.)

The argument is contrapositive. Assume $a_{n} \neq 0$ is the first non-zero coefficient, i.e., $a_{k} = 0$ for $0 \leq k < n$. Writing $$ f(x) = \sum_{k=0}^{\infty} a_{k} x^{k} = \sum_{k=0}^{\infty} a_{n+k} x^{n+k} = x^{n} \sum_{k=0}^{\infty} a_{n+k} x^{k} = x^{n} g_{n}(x), $$ the function $g_{n}$ is continuous in some neighborhood of $0$, and $g_{n}(0) = a_{n} \neq 0$, so that $g_{n}$ is non-vanishing in some neighborhood of $0$. It follows that $f$ is non-vanishing in some deleted neighborhood of $0$. Contrapositively, if $f$ vanishes identically in some neighborhood of $0$, there is no non-zero coefficient of smallest index, i.e., $a_{k} = 0$ for all $k$.
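A numeric sketch of the key step, with hypothetical example coefficients whose first non-zero entry is $a_2$: near $0$, the ratio $f(x)/x^{2} = g_{2}(x)$ stays close to $a_{2}$, so $f$ cannot vanish on a deleted neighborhood of $0$.

```python
# f(x) = 7x^2 - 2x^3 + x^4, so g_2(x) = f(x)/x^2 = 7 - 2x + x^2 -> 7 as x -> 0
a = [0.0, 0.0, 7.0, -2.0, 1.0]  # example coefficients; a_2 = 7 is first non-zero

def f(x):
    return sum(c * x**k for k, c in enumerate(a))

for x in (0.1, 0.01, 0.001):
    g = f(x) / x**2
    assert abs(g - 7.0) < 1.0   # g_2 stays near a_2 = 7, hence f(x) != 0
```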


If I understand your motivating questions, the answer to each is yes. Specifically,

  1. If $$ \sum_{k=1}^{\infty} x^{k} \left(\sum_{j=1}^{k} a_{j}\right) = 0\quad\text{for all $x$ in $(0, 1]$,} $$ then $a_{j} = 0$ for all $j$.

  2. If $$ \sum_{k=1}^{\infty} x^{k} \left(\sum_{j=k}^{\infty} a_{j}\right) = 0\quad\text{for all real $x$,} $$ then $a_{j} = 0$ for all $j$.

In each case, the theorem guarantees you're "really" assuming the coefficients in parentheses vanish for all $k$ (and, in the second, that the indicated series converge for each $k$). The fact that $a_{j} = 0$ for all $j$ follows by elementary algebra.
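The "elementary algebra" in the first case is a telescoping difference: if $b_k := \sum_{j=1}^{k} a_j = 0$ for every $k$, then $a_k = b_k - b_{k-1} = 0$. A sketch with hypothetical example values, showing that differencing inverts partial summation:

```python
from itertools import accumulate

a = [2, -5, 3, 0, 1]                 # hypothetical example values
b = list(accumulate(a))              # partial sums b_k = a_1 + ... + a_k
recovered = [b[0]] + [b[k] - b[k - 1] for k in range(1, len(b))]
assert recovered == a                # differencing inverts summation
# In particular, if every b_k is 0, then every a_k is 0.
```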

I'm not sure what you mean by the fourth question. As a general matter, I all but forbid my students from using pronouns; it's important to know (for oneself) exactly what noun is being discussed. If you can't decide what noun to use, there's a detail you don't understand. To annotate my uncertainties:

Is this[1] related to the fundamental theorem of algebra and why it[2] cannot be expanded[3] to an infinite dimensional[4] polynomial?

  1. "This" means "this phenomenon"? And does that mean "the fact that finite induction does not always extend to the limit as $n \to \infty$", some property of the series you asked about, something else?

  2. The fundamental theorem of algebra? Inductive proof?

  3. Extended?

  4. Degree (i.e., to an entire function)?

Anyway, I think the answer is "no", there's no connection to the fundamental theorem of algebra. You're really asking local questions about convergent power series, not global questions about polynomials and other entire functions.

  • 1
    Thanks for the insightful answer... I meant "Can the fundamental theorem of algebra be extended to an infinite degree polynomial using induction (or otherwise)?" i.e. does an infinite degree polynomial always have a root.2017-02-12
  • 2
    Thank you for the clarification. Briefly, no: The exponential function $$\exp(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{k!}$$ has no complex roots. If $f$ is an arbitrary entire function, then $g(z) = \exp(f(z))$ is also entire (representable globally by a power series), but has no roots. It turns out, conversely, that every non-vanishing entire function $g$ can be written in the form $\exp \circ f$ for some entire function $f$.2017-02-12