11

It's true that I'm not familiar with too many exotic functions, but I don't understand why there exist functions that cannot be described by a Taylor series. What makes it okay to describe any particular function with such a series? Does the answer differ between number sets, say in the case of the complex numbers? Could somebody provide an example?

  • 2
    A Taylor series exists if and only if the function is infinitely differentiable at some point $a$.2017-02-02
  • 9
    And even then, the Taylor series doesn't always converge to the function in any neighborhood. @DougM2017-02-02
  • 1
    @ThomasAndrews Indeed, I thought about going there, but then decided to keep it to the point. Radius of convergence is then a whole 'nuther thing.2017-02-02
  • 3
    Related questions: [Motivating infinite series](http://math.stackexchange.com/questions/9524/motivating-infinite-series) ; [Why doesn't a Taylor series converge always?](http://math.stackexchange.com/questions/1308992/why-doesnt-a-taylor-series-converge-always) ; [Is it possible for a function to be smooth everywhere, analytic nowhere, yet Taylor series at any point converges in a nonzero radius?](http://math.stackexchange.com/questions/620290/is-it-possible-for-a-function-to-be-smooth-everywhere-analytic-nowhere-yet-tay)2017-02-02
  • 2
    The Taylor series represents the function when the Taylor remainder tends to zero; otherwise the Taylor series doesn't represent the function.2017-02-02
  • 0
    If you want a set of functions whose Taylor series exist and converge everywhere to the function in question, then you want to consider [analytic (entire) functions](https://en.wikipedia.org/wiki/Entire_function). This includes polynomials, $\sin$, $\cos$, the exponential, and many more; however, it is a very restrictive property that rules out most "exotic" functions.2017-02-02
  • 0
    Okay, thanks. What about the sine integral function? What is the deal with that? I assume it cannot be described by a series; how else could one go about it?2017-02-02
  • 0
    @smaude Do you mean the sine integral, $\text{Si}(x) = \int_0^x \frac{\sin(y)}{y}\,{\rm d}y$? That function is entire, so its Taylor series converges everywhere to the correct function. For example, expanding about $x=0$, the series starts out like $x - \frac{x^3}{18} + \frac{x^5}{600} - \ldots$.2017-02-02
  • 0
    I don't understand why there are close votes. Could someone enlighten me?2017-02-02

5 Answers

23

We have the somewhat famous function

$$f(x)=\begin{cases}e^{-1/x^2}&x\neq 0\\ 0&x=0 \end{cases}$$

which is infinitely differentiable at $0$ with $f^{(n)}(0)=0$ for all $n$. So, even though the function is infinitely differentiable, its Taylor series around $0$ does not converge to the value of the function for any $x\neq 0$.
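A quick numerical check makes this concrete (a Python sketch of my own, not from the original answer): every Taylor coefficient of $f$ at $0$ vanishes, so the Taylor series is identically $0$, yet $f(x)>0$ whenever $x\neq 0$:

```python
import math

def f(x):
    # the flat function: e^(-1/x^2) for x != 0, and 0 at x = 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The Taylor series of f about 0 is identically zero, so its "prediction"
# at every x is 0 -- but f itself is strictly positive away from 0.
for x in [1.0, 0.5, 0.1]:
    print(x, f(x))  # f(x) > 0, while the Taylor series gives 0
```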


Technically, any function that is infinitely differentiable at $a$ has a Taylor series at $a$. Whether you find that Taylor series useful depends on what you want the series to do.

For example, given a $g$ infinitely differentiable at $0$, we know that for each $n$ there exist $C,\epsilon>0$ such that:

$$\left|g(x)-\sum_{k=0}^{n} \frac{g^{(k)}(0)}{k!}x^k\right|\leq C|x|^{n+1}$$

for all $|x|<\epsilon$.

So the finite partial sums of the Taylor series are, in some sense, always the "best" polynomials for agreeing with the function.

So what happens with our function $f$ above is that $f(x)$ tends to $0$ faster than any power $x^n$ as $x \to 0$.
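To see this decay numerically, here is a small sketch (my own, not part of the answer) comparing $f(x)=e^{-1/x^2}$ against a high power of $x$ near $0$:

```python
import math

def f(x):
    # e^(-1/x^2) for x != 0, and 0 at x = 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# f(x)/x^n -> 0 as x -> 0 for *every* fixed n, which is exactly why all
# Taylor coefficients of f at 0 vanish.
n = 10
for x in [0.3, 0.2, 0.1]:
    print(x, f(x) / x**n)  # ratios shrink rapidly toward 0
```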

What we don't always get, for real functions, is a Taylor series that converges to the function on an interval around the point.


In complex numbers, things become intriguing. It turns out, if you define differentiation on complex functions in a relatively simple way, then any function which is differentiable at a point is infinitely differentiable at that point, and the Taylor series converges in some "ball" centered on that point.

  • 5
    I've never understood why so many find this surprising; I guess it is because many are taught (incorrectly) that a Taylor series *must* converge to the function. I was taught from the beginning that a Taylor series simply provides an approximation in terms of derivatives around a point and might happen to converge. In the neighborhood of $0$, the function $e^{-1/x^2}$ is approximated *quite well* by $0$.2017-02-02
  • 11
    @BrevanEllefsen: That's not really why. It's because you would intuitively expect a *smooth* function to be *predictable*, and it seems weird that the derivatives of a function do not contain enough information to predict it at a nearby point.2017-02-02
  • 1
    Yes, but that function *does* have a Taylor series, it just doesn't agree with the function anywhere but at 0.2017-02-02
  • 1
    @BrevanEllefsen But any infinitely differentiable function in complex numbers does have a converging Taylor series, so the question for reals is interesting. I wouldn't say it is surprising that there are counter-examples in the reals, but it is interesting.2017-02-02
  • 0
    @ThomasAndrews good points, I see what you're saying. I suppose that, for me at least, the really surprising thing is that there *aren't* counter-examples of the same kind among functions in complex numbers. For me the really surprising part about Taylor Series is that they don't have to converge everywhere or just at a single point - they can often converge in a small radius (or disk) of convergence. Nevertheless, have a good day sir!2017-02-02
  • 7
    One of my favorite "proofs" that complex numbers "exist" is that the Taylor series at the real number $a$ for the function $\frac{1}{1+x^2}$ has radius of convergence $\sqrt{1+a^2}$. This is entirely a statement about real numbers, but it suggests there is some root of $1+x^2$ that is $\sqrt{1+a^2}$ distance away from each real number $a$. (If $p(x)$ has all real roots, then the radius of convergence at $a$ of $\frac{1}{p(x)}$ is always the distance to the nearest root to $a$.) @BrevanEllefsen2017-02-02
  • 0
    I would recommend replacing ""ball" centered around that point" with something like "disk in the complex plane centered at that point". I think it will be understandable to strictly more people that way.2017-06-09
  • 0
    @Mehrdad : The best way then to dispel this is to show you can modify a smooth function "locally" (i.e. within some interval _only_) while keeping smoothness and having the same values elsewhere. Thus it is a much more restrictive and profound property if the function is not only smooth but can be holographically reconstructed from an arbitrarily small piece, which is what "analytic" means, and the _most_ profound bit of all is that when you go to the _complex_ plane, existence of even a _single_ derivative is sufficient (if differentiable everywhere) to guarantee this.2017-07-17
  • 0
    @ThomasAndrews, me too! That was the example that suddenly made me go - hang on... Maybe these complex number things aren't just pointless abstraction :)2018-11-02
9

If the Lagrange error term does not tend to zero as $n \to \infty$, then the function will not be equal to its Taylor series.

You can also read more on this in Appendix $1$ of *Introduction to Calculus and Analysis I* by Courant and John. Hope it helps.
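As an illustration (my own sketch, not from Courant and John): for $e^x$ about $0$, the Lagrange remainder is bounded by $e^x x^{n+1}/(n+1)!$, which tends to $0$, so here the series does converge to the function:

```python
import math

# Lagrange remainder for e^x about 0: R_n(x) = e^c * x^(n+1) / (n+1)!
# for some c between 0 and x. Bounding e^c by e^x (for x > 0) gives a
# bound that tends to 0 as n grows, so the series converges to e^x.
x = 2.0
for n in [5, 10, 20]:
    bound = math.exp(x) * x**(n + 1) / math.factorial(n + 1)
    partial = sum(x**k / math.factorial(k) for k in range(n + 1))
    print(n, bound, abs(math.exp(x) - partial))  # actual error stays below the bound
```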

6

I think the intuition you want is the fact that functions that are not complex-differentiable* (also known as holomorphic) are not described by a Taylor series.

And to give another example that is perhaps even more unexpected than the one given by Thomas Andrews:

$$f(z) = \begin{cases} e^{-\frac{1}{z}} & \text{if } z > 0 \\ 0 & \text{otherwise}\end{cases}$$

This function is smooth and vanishes on the entire infinite interval $z \le 0$, yet it is not identically zero; that behaviour is only possible because it is not holomorphic.


*If you're not familiar with complex differentiation: it's like real differentiation, but with $h$ allowed to be complex:

$$f'(z) = \lim_{h \to 0} \frac{f(z + h) - f(z)}{h}$$

For details, see here.
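A quick numeric sketch (mine, not part of the answer) shows what this definition buys you: for a holomorphic function like $z^2$ the difference quotient is the same from every direction of $h$, while for $\bar z$ it is not:

```python
def quotient(f, z, h):
    # complex difference quotient (f(z+h) - f(z)) / h
    return (f(z + h) - f(z)) / h

z = 1 + 1j
for h in [1e-6, 1e-6j]:  # approach 0 along the real axis, then the imaginary axis
    # z^2 is holomorphic: both quotients land near f'(z) = 2z = 2+2j
    print(quotient(lambda w: w * w, z, h))
    # conjugation is NOT complex-differentiable: the quotient is 1 along
    # the real axis but -1 along the imaginary axis
    print(quotient(lambda w: w.conjugate(), z, h))
```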

5

The existence of functions that cannot be described by Taylor series is actually completely intuitive; take the indicator function of the rational numbers viewed as a subset of the reals, for example. Try to keep in mind that functions can be really... arbitrary.

Much more subtle is the existence of smooth functions that aren't analytic; Thomas Andrews gives the standard example of such a beast. FWIW, my understanding of why this is possible is that, okay, there are functions that change behaviour suddenly at a point, BUT the change in behaviour at that point is so gradual, so gentle, so smooth, that none of the function's derivatives can see the change happening; therefore, the Taylor series can't, either.

2

In addition to all the comments here, I would like to add the curious Weierstrass function, known for being nowhere differentiable despite being continuous everywhere:

$$ W(x) = \sum_{n=0}^\infty a^n\cos(b^n\pi x), $$

where $0<a<1$, $b$ is a positive odd integer, and $ab>1+\frac{3\pi}{2}$.

Being nowhere differentiable, it does not have a Taylor series at any point.

You can find a visualization of $W$ here.
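For a numerical feel, here is a sketch of my own (with the classical parameters $a=\tfrac12$, $b=13$): the partial sums converge uniformly, since the tail is dominated by a geometric series, so $W$ is continuous; but differentiating term by term produces terms of size $(ab)^n\pi$, which blow up:

```python
import math

a, b = 0.5, 13  # 0 < a < 1 and ab > 1 + 3*pi/2: Weierstrass's conditions

def W(x, terms):
    # partial sum of the Weierstrass series
    return sum(a**n * math.cos(b**n * math.pi * x) for n in range(terms))

# Uniform convergence: the tail beyond N is at most a^N / (1 - a),
# so W is a uniform limit of continuous functions, hence continuous.
print(abs(W(0.3, 30) - W(0.3, 20)))  # tiny

# Term-by-term differentiation yields terms of size (a*b)^n * pi, which
# grow without bound -- a hint at why W is nowhere differentiable.
print((a * b) ** 10 * math.pi)  # already huge
```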

  • 0
    This is a nice fractal! Are all fractals non-differentiable?2018-08-23
  • 3
    @dmtri While I'm unsure of a formal definition of a fractal, differentiable functions must look "linear" if you "zoom in" enough. A fractal's self-similarity on zooming in seems to rule this out.2018-08-23