
I have the two equations (Taylor's theorem):

$$f(x+h) = f(x) + f'(x)\cdot h + \dfrac{1}{2}h^{2}f''(x)+\dots+\dfrac{1}{n!}h^{n}f^{(n)}(x)+R_{n+1}$$

and

$$f(x-h) = f(x) - f'(x)\cdot h + \dfrac{1}{2}h^{2}f''(x)+\dots+\dfrac{1}{n!}(-h)^{n}f^{(n)}(x)+R_{n+1}$$

Are these two remainder terms equal?


1 Answer


I assume the variable $h$ is supposed to represent a non-negative value, correct?

Then, if you define a remainder $R_{n+1}(x, h)$ by
$$f(x+h) = f(x) + f'(x)\cdot h + \dfrac{1}{2}h^{2}f''(x)+\dots+\dfrac{1}{n!}h^{n}f^{(n)}(x)+R_{n+1}(x, h)$$
then the domain of $R_{n+1}(u,v)$ only includes non-negative values for $v$.

If you define another remainder term $S_{n+1}(x,h)$ by
$$f(x-h) = f(x) - f'(x)\cdot h + \dfrac{1}{2}h^{2}f''(x)+\dots+\dfrac{1}{n!}(-h)^{n}f^{(n)}(x)+S_{n+1}(x, -h)$$

then the domain of $S_{n+1}(u,v)$ only includes non-positive values for $v$.

Their domains aren't the same, so they can't be the same function. Of course, they agree where their domains overlap ($R_{n+1}(u,0) = S_{n+1}(u,0)$), so I can define a new function $T_{n+1}(u,v)$ by
$$T_{n+1}(u,v) = \begin{cases} R_{n+1}(u,v) & \text{when defined} \\ S_{n+1}(u,v) & \text{when defined} \end{cases}$$

and have an equation
$$f(x+\epsilon) = f(x) + f'(x)\cdot \epsilon + \dfrac{1}{2}\epsilon^{2}f''(x)+\dots+\dfrac{1}{n!}\epsilon^{n}f^{(n)}(x)+T_{n+1}(x, \epsilon)$$
valid for all $\epsilon$: positive, negative, and zero.
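For a concrete check, here is a minimal numerical sketch (my addition, not part of the original answer), using $f = \sin$ and $n = 3$. The helper name `taylor_remainder` and the hand-coded derivative list are mine: one function computes $R_{n+1}(x, v)$ for $v \ge 0$ and $S_{n+1}(x, v)$ for $v \le 0$, which is exactly the glued $T_{n+1}$.

```python
import math

def taylor_remainder(f, derivs, x, v, n):
    """T_{n+1}(x, v): f(x+v) minus the degree-n Taylor polynomial
    of f about x; works for v of either sign."""
    poly = sum(derivs[k](x) * v**k / math.factorial(k) for k in range(n + 1))
    return f(x + v) - poly

# f = sin with its first few derivatives written out by hand
f = math.sin
derivs = [math.sin, math.cos, lambda t: -math.sin(t), lambda t: -math.cos(t)]

x, h, n = 0.5, 0.1, 3
R = taylor_remainder(f, derivs, x, +h, n)   # R_{n+1}(x, h), v >= 0
S = taylor_remainder(f, derivs, x, -h, n)   # S_{n+1}(x, -h), v <= 0
print(R, S)   # both of size ~ h^4/4!, but not equal to each other
```

Both values come out of order $h^{4}/4!$, yet they are not equal, which matches the point above: the two remainders agree only at $v = 0$.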

Beyond that, the values of $R_{n+1}(x, h)$ and $S_{n+1}(x, -h)$ don't have much to do with each other: each is determined by the behavior of $f$ on its own side of $x$, so in that sense they are completely different.

However, if $f(x)$ is an analytic function, then $R_{n+1}(u,v)$ and $S_{n+1}(u,v)$ are analytic continuations of each other, so in that sense they are the "same" function.

The formulas for estimating the remainder are also symmetric, of course, so the two are similar in that fashion. On the other hand, $R$ depends on the values of $f$ for $h > 0$, and $S$ depends on the values of $f$ for $h < 0$, so again they are fairly independent.
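To make that concrete: one standard estimate (my addition; the answer above does not write it out) is the Lagrange form of the remainder,
$$R_{n+1}(x, h) = \frac{h^{n+1}}{(n+1)!}\,f^{(n+1)}(\xi) \quad\text{for some } \xi \in (x, x+h),$$
and symmetrically
$$S_{n+1}(x, -h) = \frac{(-h)^{n+1}}{(n+1)!}\,f^{(n+1)}(\eta) \quad\text{for some } \eta \in (x-h, x).$$
The bounds look mirror-symmetric, but $\xi$ and $\eta$ sample $f^{(n+1)}$ on opposite sides of $x$, which is exactly the independence just described.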

One of the most famous counterexamples for dealing with Taylor series is the function
$$f(x) = \begin{cases} 0 & x \leq 0 \\ e^{-1/x^2} & x > 0 \end{cases}$$
Every derivative of this function at zero is zero, so the Taylor polynomial about $x=0$ vanishes identically, and the formulas for the two remainders at $x=0$ are
$$R_{n}(0, h) = e^{-1/h^2}, \qquad S_{n}(0, -h) = 0.$$
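A quick numerical illustration of this counterexample (again my addition): since every Taylor coefficient of $f$ at $0$ is zero, the degree-$n$ Taylor polynomial is identically zero, and the remainder on each side is just $f$ itself.

```python
import math

def f(x):
    """The classic flat function: every derivative at 0 is 0."""
    return math.exp(-1.0 / x**2) if x > 0 else 0.0

# The Taylor polynomial about 0 is identically 0, so the remainders
# are f(+h) and f(-h) themselves:
for h in (0.5, 0.2, 0.1):
    print(h, f(h), f(-h))   # R_n(0, h) = e^{-1/h^2},  S_n(0, -h) = 0
# h = 0.1 gives f(h) = e^{-100}, about 3.7e-44: tiny but nonzero,
# while the left-hand remainder is exactly 0.
```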

  • I think for the function $f(x) = \begin{cases} 0 & x \leq 0 \\ e^{-1/x^2} & x > 0 \end{cases}$, $R_{n}(0, h) = p(h)e^{-1/h^2}$, where $p$ is a polynomial. 2012-03-04