
Note that I am specifically looking at this version of Taylor's theorem:

Let $f: \mathbb{R}\to\mathbb{R}$ be $n$-times differentiable at $x$. Then $\exists\,g : \mathbb{R} \to \mathbb{R}$ where $\lim\limits_{h\to 0}g(x + h) = 0$ and $$f(x + h) = h^n\, g(x + h) + \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)$$

My question is, what makes this a nontrivial theorem? Can't we just solve for $g$ in a single step?
Why is the observation that $g$ exists useful at all?

  • 0
    You could solve for $g$ in a single step, but it's not trivial that $\lim_{h\to 0} g(x + h) = 0$. (2017-02-06)
  • 0
    @littleO: Mhm... in fact if you scroll down, you'll see I actually said the exact same thing as you when I self-answered my question. :) (I only posted it since it was something that had confused me before.) I deleted it though, since the accepted answer explained it much better. But thanks anyhow haha. (2017-02-06)
  • 0
    Ah, I see that you did! (2017-02-06)
  • 1
    @littleO: actually I un-deleted it now since today I feel it's clearer in some respects than the other answer... go figure! (2017-02-06)

3 Answers

0

This is the so-called Taylor–Young theorem: it is the key result that guarantees, a priori, the existence of Taylor expansions!

For example, consider the map $f:\mathbb{R}\to\mathbb{R},\,x\mapsto x+\exp(x)$. It is easy to see that $f$ is a smooth bijection and that $f'$ never vanishes. Therefore its inverse is also smooth, and in particular $n$-times differentiable at $0$ for every $n$. Applying the Taylor–Young theorem, we know a priori that $f^{-1}$ has an $n$th-order expansion at $0$. The coefficients can then be obtained by identification, using the fact that $\forall x\in\mathbb{R},\,f^{-1}(f(x))=x$.
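A small sketch of this "identification" step (not from the answer itself, and assuming SymPy is available; for convenience the expansion is taken around $y = f(0) = 1$ rather than $0$, and the coefficient names `a1..a4` and the ansatz degree are illustrative choices):

```python
import sympy as sp

# Sketch: expansion of f^{-1} for f(x) = x + exp(x), by identification.
# Since f(0) = 1, we expand f^{-1} around y = 1, writing t = y - 1.
t = sp.symbols('t')
n = 4
a = sp.symbols('a1:5')                        # unknown coefficients a1..a4
g = sum(a[k] * t**(k + 1) for k in range(n))  # ansatz: f^{-1}(1+t) = a1*t + ... + a4*t^4

# Identification: f(f^{-1}(1+t)) = 1 + t, i.e. g + exp(g) - (1 + t) = O(t^5).
expr = sp.expand(sp.series(g + sp.exp(g) - (1 + t), t, 0, n + 1).removeO())
poly = sp.Poly(expr, t)

# The system is triangular: the t^k coefficient is linear in a_k
# once a_1, ..., a_{k-1} are known, so we solve one equation at a time.
coeffs = {}
for k in range(1, n + 1):
    eq = poly.coeff_monomial(t**k).subs(coeffs)
    coeffs[a[k - 1]] = sp.solve(eq, a[k - 1])[0]

print(coeffs)   # e.g. a1 = 1/2, a2 = -1/16
```

The point of the theorem is that this computation is legitimate: Taylor–Young guarantees the expansion exists before we start hunting for its coefficients.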

1

It's 'surprising' that the error term can be written as $h^n g(x+h)$ because, if you solve for $g$, you get

$$ \frac{f(x + h) - \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)}{h^n} = g(x + h) $$

and because of the division, the solution only works for $h \neq 0$. A priori, it's not at all obvious that you can even pick a value for $g(x)$ so that $g$ is continuous at $x$, let alone that $g(x) = 0$ is the value that does so.

The usefulness is that it shows the error made in approximating $f$ by its Taylor polynomial is negligible compared with $h^n$ as $h \to 0$.
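A quick numerical illustration (a sketch, not from the answer; $f = \sin$, $x = 0$, $n = 3$ are arbitrary choices): the quotient $g(x+h) = \bigl(f(x+h) - T_n(h)\bigr)/h^n$ should visibly shrink as $h \to 0$.

```python
import math

def peano_quotient(h, n=3):
    """g(0 + h) = (sin(h) - T_n(h)) / h**n for f = sin expanded at x = 0.
    Taylor-Young says this tends to 0 as h -> 0."""
    taylor = h - h**3 / 6          # T_3(h) = h - h^3/6 for sin at 0
    return (math.sin(h) - taylor) / h**n

for h in (1e-1, 1e-2, 1e-3):
    print(h, peano_quotient(h))   # quotient is roughly h**2 / 120, shrinking to 0
```

Here the quotient behaves like $h^2/120$ (the next term of the sine series), which is exactly the "less significant than $h^n$" claim in action.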

0

(Answering my own question, since this is something that had confused me some time ago; I posted this question for others' reference.)

The nontriviality is in the condition

$$\lim_{h\to 0}g(x + h) = 0$$

The theorem would be trivial if this condition did not need to hold, since we could already solve for a $g$ that satisfies the equation simply by dividing by $h^n$; but it is far from obvious that such a $g$ can necessarily satisfy this limit condition as well.

Clarification: note that the $h^n$ factor does not pose a problem for $g$, despite the division by zero that occurs when solving for $g$. This is because $g$ can be defined piecewise, with the value $g(x) = 0$ set directly, avoiding the division by zero. The real difficulty is the limit (continuity) requirement on $g$ at $x$, not the mere existence of a function $g$ satisfying the equation.
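Concretely, the piecewise definition (restating the "solve for $g$" step, with the value at $h = 0$ filled in by hand) reads:

$$g(x + h) = \begin{cases} \dfrac{f(x + h) - \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)}{h^n} & h \neq 0 \\ 0 & h = 0 \end{cases}$$

With this definition the equation holds for every $h$; the entire content of the theorem is the limit claim.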