
Unfortunately, I don't have much detail to give here. But is the general idea to cancel out the constants that appear when you take repeated derivatives?

For instance, say my function was $f(x)=f_0+f_1x+f_2x^2+...$

Then $f'(x)=f_1+2f_2x+...$

And if the expansion is centered around $x=0$...
$f'(0)=f_1$,
$f''(0)=2f_2$,
$f'''(0)=3\cdot 2f_3$.

Therefore
$f_0=f(0)$
$f_1=f'(0)/1$
$f_2=f''(0)/2$

And so forth. Is that where the factorial comes from?
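As a quick sanity check on this idea, here is a short Python sketch (my own, not from the post) that recovers each coefficient of an arbitrary polynomial as $f^{(k)}(0)/k!$ by differentiating the coefficient list repeatedly; the polynomial chosen is just an example:

```python
import math

def deriv(c):
    # Term-by-term derivative of a polynomial stored as [f0, f1, f2, ...]:
    # the x^k term f_k * x^k becomes k * f_k * x^(k-1).
    return [k * c[k] for k in range(1, len(c))]

f = [3.0, -1.0, 4.0, 1.5, -2.0]   # f(x) = 3 - x + 4x^2 + 1.5x^3 - 2x^4

recovered = []
c = f[:]
for k in range(len(f)):
    # c now holds the k-th derivative; its constant term is f^(k)(0)
    recovered.append(c[0] / math.factorial(k))
    c = deriv(c)

print(recovered)   # → [3.0, -1.0, 4.0, 1.5, -2.0]
```

The division by $k!$ exactly undoes the constants accumulated by differentiating $x^k$ a total of $k$ times, which is where the factorial comes from.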

It is quite clear for a polynomial, but what about a trig function such as $\sin(x)$, other than using Taylor's formula?

  • Ignoring differentiability issues and rigor, you can obtain the coefficients in a purely algebraic manner by following the method I used in my answer at [power series expansion](http://math.stackexchange.com/questions/178704/power-series-expansion). – 2012-10-08

3 Answers


Start with the fundamental theorem of calculus:
$$ f(x) = f(x_0) + \int_{x_0}^x f^\prime(y) \,\mathrm{d} y $$
and reapply it to $f^\prime(y)$:
$$ f(x) = f(x_0) + \int_{x_0}^x \left( f^\prime(x_0) + \int_{x_0}^y f^{\prime\prime}(z) \,\mathrm{d} z \right) \mathrm{d} y = f(x_0) + f^\prime(x_0) \int_{x_0}^x \mathrm{d} y + \underbrace{\int_{x_0}^x \left( \int_{x_0}^y f^{\prime\prime}(z) \,\mathrm{d} z\right)\mathrm{d} y}_{\mathcal{R}_2(x)} $$
Repeating this with $f^{\prime\prime}(z)$:
$$ f(x) = f(x_0) + f^\prime(x_0) \underbrace{\int_{x_0}^x \mathrm{d} y}_{I_1(x)} + f^{\prime\prime}(x_0) \underbrace{\int_{x_0}^x \int_{x_0}^y \mathrm{d}z\, \mathrm{d} y}_{I_2(x)} + \underbrace{\int_{x_0}^x \int_{x_0}^y \int_{x_0}^z f^{\prime\prime\prime}(w) \,\mathrm{d} w\, \mathrm{d} z\, \mathrm{d} y}_{\mathcal{R}_3(x)} $$
and keeping going we get:
$$ f(x) = f(x_0) + f^\prime(x_0) \int_{x_0}^x \mathrm{d} y + \cdots + f^{(k)}(x_0) \underbrace{\int_{x_0}^{x} \int_{x_0}^{y_1} \int_{x_0}^{y_2} \cdots \int_{x_0}^{y_{k-1}} \mathrm{d} y_{k} \cdots\mathrm{d} y_3\, \mathrm{d} y_2\, \mathrm{d} y_1}_{I_k(x)} + \mathcal{R}_{k+1}(x) $$
The iterated integrals $I_k(x)$ are easy to evaluate. They can be defined recursively:
$$ I_0(x) = 1, \quad I_k(x) = \int_{x_0}^x I_{k-1}(y) \,\mathrm{d} y, $$
giving $I_k(x) = \frac{1}{k!} (x-x_0)^k$.
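The recursion for $I_k$ can also be checked numerically. The sketch below (my own illustration, not part of the answer) builds each iterated integral from the previous one with a plain trapezoid rule and compares the result with the closed form $(x-x_0)^k/k!$ at $x=1$, $x_0=0$:

```python
import math

def iterated(k, x, x0=0.0, n=100):
    """I_k(x): k-fold iterated integral of 1, via the recursion
    I_0 = 1, I_k(x) = integral of I_{k-1} from x0 to x (trapezoid rule)."""
    if k == 0:
        return 1.0
    h = (x - x0) / n
    total = 0.5 * (iterated(k - 1, x0, x0, n) + iterated(k - 1, x, x0, n))
    for i in range(1, n):
        total += iterated(k - 1, x0 + i * h, x0, n)
    return total * h

# Compare with the closed form (x - x0)^k / k! at x = 1, x0 = 0
for k in range(4):
    closed = 1.0 / math.factorial(k)
    assert abs(iterated(k, 1.0) - closed) < 1e-3
```

Each level of integration divides by one more power of $k$, which is exactly how the $1/k!$ accumulates.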


It sounds like you already accept that the $n!$ terms make sense when you're talking about polynomials. For other functions like $\sin{x}$, the whole motivation for Taylor series is to approximate those functions by polynomials. So in my opinion, the $n!$ terms appear because that is precisely the property mathematicians wanted out of Taylor series when they were first invented: so that an arbitrary function, $\sin{x}$, $\ln{x}$, etc., could look like a polynomial.

Alternatively, maybe this can help you see: if we have the Taylor series for $f(x)$ at $0$, $ f(0) + f'(0)x + \frac{1}{2} f''(0) x^2 + \frac{1}{3!} f'''(0) x^3 + \ldots$, then if we differentiate this series once, we get $ f'(0) + f''(0) x + \frac{1}{2} f'''(0) x^2 + \ldots $, which gives us the Taylor series for $f'(x)$ at $0$! Notice that all the terms "shifted" downwards, allowing us to recover the familiar form of the Taylor series.
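The "shift" can be seen concretely with $\sin x$: differentiating its series term by term reproduces the Taylor coefficients of $\cos x$. A small Python sketch (my own, not from the answer):

```python
import math

# sin's derivatives at 0 cycle sin, cos, -sin, -cos -> 0, 1, 0, -1
derivs = [0.0, 1.0, 0.0, -1.0]

# Taylor coefficients of sin at 0: c_k = sin^(k)(0) / k!
c = [derivs[k % 4] / math.factorial(k) for k in range(8)]

# Differentiate the series term by term: the x^k coefficient becomes (k+1)*c_{k+1}
d = [(k + 1) * c[k + 1] for k in range(7)]

# These match the Taylor coefficients of cos = sin', i.e. the series "shifts down"
cos_c = [derivs[(k + 1) % 4] / math.factorial(k) for k in range(7)]
assert all(abs(a - b) < 1e-12 for a, b in zip(d, cos_c))
```

The factor $(k+1)$ from differentiating $x^{k+1}$ is exactly what turns $1/(k+1)!$ into $1/k!$, so the factorial pattern is preserved under differentiation.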

  • @Tom, yeah, that's my opinion. But others may have a different interpretation, so you should stick around and see what other people have to say. – 2012-10-08

Watch this video to see why there are factorials: http://www.youtube.com/watch?v=QMJvRNFhEGc This guy is a simple but effective teacher.

In short, the Taylor series expansion is derived from the power series form.

The power series form is: f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + ...

For example, take the single term f(x) = a x^5 and differentiate repeatedly:

f(x) = a x^5

f '(x) = 5a x^4 then...

f ''(x) = 20a x^3 but wait!! Instead, write the second derivative as 5*4 a x^3

f '''(x) = 5*4*3 a x^2

f ''''(x) = 5*4*3*2 a x

f '''''(x) = 5*4*3*2*1 a

Evaluating at x = 0 and solving for 'a' yields a = f '''''(0)/5! Now, insert this 'a' value into the power series term a x^5. So, [f '''''(0)/5!] x^5
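If it helps, the repeated-derivative bookkeeping above can be mimicked in a few lines of Python (my own sketch, with a = 2 chosen arbitrarily):

```python
import math

def deriv(c):
    # Term-by-term derivative of a polynomial stored as [a0, a1, a2, ...]
    return [k * c[k] for k in range(1, len(c))]

a = 2.0                  # arbitrary example coefficient
c = [0.0] * 5 + [a]      # f(x) = a * x^5
for _ in range(5):
    c = deriv(c)

# Five derivatives leave the constant 5*4*3*2*1 * a = 5! * a,
# so a = f'''''(0) / 5!
assert c == [math.factorial(5) * a]
```

Each differentiation pulls down one factor of the current exponent, and the product of those factors is exactly 5!.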

Not easy to edit with this but hopefully the video will help.