
The question is the following:

Is there any proof that the Taylor series of an analytic function is the series that converges fastest to that function?

The motivation for this question comes from numerically calculating $\exp(x)$ to arbitrary precision. Suppose one can only use simple multiplication, division, addition, and subtraction.

One approach would be to calculate the Taylor series centered on a particular known value (for instance, for $\exp$, centered at $0$), and stop when the next term of the series is below the desired precision. That is, considering

$$y_n = \sum_{i=0}^n \frac{x^i}{i!}$$

we can define the error of the approximation $y_n$ as

$$\epsilon_n = |y_n - e^x|\simeq \frac{|x|^{n+1}}{(n+1)!}$$
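The stopping rule described above can be sketched as follows (a minimal illustration, not a robust arbitrary-precision algorithm; the function name `taylor_exp` is hypothetical):

```python
import math

def taylor_exp(x, eps=1e-12):
    """Sum the Taylor series of exp(x) centered at 0, stopping
    when the magnitude of the next term falls below eps."""
    term = 1.0   # x^0 / 0! = 1
    total = 0.0
    n = 0
    while abs(term) >= eps:
        total += term
        n += 1
        term *= x / n   # update x^(n-1)/(n-1)! -> x^n/n!
    return total, n     # approximation and number of terms summed

approx, terms = taylor_exp(1.0)
print(terms, abs(approx - math.e))
```

For $x=1$ and $\epsilon = 10^{-12}$ this sums about 15 terms; the question is whether some other series (or rational scheme) could reach the same $\epsilon$ with fewer.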

It is not obvious to me that the Taylor series is the fastest way of approaching $\exp(x)$, in the sense that the Taylor series minimizes the $n$ required to achieve a given precision.

I think the problem can also be stated in the following way: of all series that converge to $e^x$, which converges fastest, in the sense that it requires the minimum number of terms (and only requires $+,-,\cdot,/$)?

Generically, I would like to extend these results to less trivial functions, like $\cos, \arcsin, \log$, etc. So, first I would like to understand which series (or other objects, like the Padé approximants Cocopuffs pointed out) I should use...

  • 0
The Padé approximant is a faster way to calculate $\exp(x)$, although not a series per se. – 2012-06-14
  • 0
That depends upon the setting of the problem. On what set is the approximation considered? If on a segment, then the corresponding polynomials are called polynomials of best approximation. They do not generally coincide with Taylor polynomials. – 2012-06-14
  • 0
@Andrew Ok, so the set is the real numbers. Which polynomials are you talking about? – 2012-06-14
  • 0
@J.C.Leitão I don't quite understand the formulation. At one point one series may converge faster than another series, and at another point the opposite. Consider Taylor series for $\exp(x)$ centered at different points $x_0$. – 2012-06-15
  • 0
@Andrew I've added some more information to the question. – 2012-06-15
  • 0
    "Is there any proof that shows that the Taylor series of an analytical function is the series with the fastest convergence to that function?" - in a nutshell: it depends on the function.2012-06-15
  • 0
@J.M. Ok. That already helps a lot! – 2012-06-15
  • 0
Fastest convergence in what sense? – 2012-06-15
  • 0
@QiaochuYuan: In the sense that the desired error is reached in the minimum number of terms possible. – 2012-06-15
  • 0
@J. C.: What kind of error? Converges to what? – 2012-06-15
  • 0
Consider the Taylor series of $\exp(x)$ centered at $0$. An approximation to calculate $\exp(x)$ with an error $\epsilon$ (i.e. to know the value of $\exp(x)\pm\epsilon$) would be to sum that Taylor series until the next term of the series is less than $\epsilon$. That $\epsilon$ is what I define as the error. I will edit the post to try to be more explicit. – 2012-06-15
  • 0
Hmm, I'm looking for the same thing, and the lack of answers here is disconcerting. By the way, "stop when the next term of the series has the desired precision" is not likely to work. In a monotonic sequence, the infinite number of small terms can add up to any error; you could be off by infinity. For an alternating sequence, the error is typically about equal to the last term, which is still something of a worst case. – 2014-07-22
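As a concrete illustration of the Padé suggestion in the comments, here is a sketch comparing the degree-4 Taylor polynomial of $\exp$ at $0$ with the $[2/2]$ Padé approximant $(1 + x/2 + x^2/12)/(1 - x/2 + x^2/12)$; both agree with $\exp$ through the $x^4$ term, yet away from $0$ their errors differ (values are an assumption of this sketch, checked at $x=1$):

```python
import math

x = 1.0

# Degree-4 Taylor polynomial of exp at 0 (five terms of the series).
taylor = sum(x**k / math.factorial(k) for k in range(5))

# [2/2] Pade approximant of exp at 0; its own Taylor expansion
# also matches exp(x) through the x^4 term.
pade = (1 + x/2 + x**2/12) / (1 - x/2 + x**2/12)

print(abs(taylor - math.e))  # roughly 1e-2
print(abs(pade - math.e))    # roughly 4e-3, a few times smaller
```

This does not settle the question of an optimal scheme, but it shows that with the same number of series coefficients a rational approximant can beat the truncated Taylor series at a given point.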

0 Answers