I am thinking either it does not have a Taylor series, since there does not seem to be a way to define the $n^{\rm th}$ derivative of $x^2$ for every $n$, or its Taylor series is just $x^2$ and the radius of convergence is infinite. I don't know which argument is correct. Any help is appreciated. Thanks!
Find a Taylor Series and its radius of convergence for $f(x) = x^2.$
A consequence of Antonio's point is that the terms of order greater than $2$ are $0$, not that the whole Taylor series is $0$. – 2012-11-26
2 Answers
Look at the derivatives of the function. $ f(x) = x^2\\ f'(x) = 2x\\ f''(x) = 2\\ f'''(x) = 0 $ Any subsequent derivatives will also be zero, so you only need to compute the first three terms of the Taylor series; the rest of the terms are zero. Because all of the other terms are equal to zero, the series always converges, as you're only adding a finite number of nonzero terms, so the radius of convergence is infinite. Now, for the actual series centered at $x=a$, $ \sum_{i=0}^\infty \frac{f^{(i)}(a)}{i!}(x-a)^i = \frac{f(a)}{0!}+\frac{f'(a)}{1!}(x-a)+\frac{f''(a)}{2!}(x-a)^2+\sum_{i=3}^\infty \frac{f^{(i)}(a)}{i!}(x-a)^i\\= a^2+2a(x-a)+(x-a)^2+\sum_{i=3}^\infty \frac{0}{i!}(x-a)^i\\= a^2+2a(x-a)+(x^2-2ax+a^2)\\ =x^2 $ So as it turns out, the Taylor series for this, and in fact for any polynomial, is just the original function. Polynomial functions are special in this respect: after you differentiate them a finite number of times, you get zero and can ignore the rest of the series.
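If you want to sanity-check this computation symbolically, here is a quick sketch using SymPy (assuming it is installed); it builds the Taylor polynomial about an arbitrary center $a$ directly from the derivatives and confirms it simplifies back to $x^2$:

```python
import sympy as sp

x, a = sp.symbols('x a')
f = x**2

# Derivatives f, f', f'', f''', f'''' -- everything past f'' is zero
derivs = [sp.diff(f, x, n) for n in range(5)]

# Taylor polynomial of degree 4 centered at a:
#   sum_n f^(n)(a)/n! * (x - a)^n
taylor = sum(d.subs(x, a) / sp.factorial(n) * (x - a)**n
             for n, d in enumerate(derivs))

print(sp.expand(taylor))  # expands back to x**2
```

The terms with $n \ge 3$ contribute nothing, so the degree-2 Taylor polynomial already reproduces the function exactly for every $x$ and every center $a$.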
Given
$f(x)=x^2 \implies f'(x)=2x \implies f''(x) = 2 \implies f^{(n)}(x)=0\,\, \forall n\geq 3. $
Now, construct the Taylor series at $x=a$:
$ x^2 = f(a)+f'(a)(x-a)+\frac{f''(a)}{2!}(x-a)^2 $
$ x^2 = a^2 + 2 a (x-a) + \frac{2}{2!}(x-a)^2 = a^2+2a(x-a)+(x-a)^2, $
which expands back to $x^2$. Since the series has only finitely many nonzero terms, it converges for every $x$, and the radius of convergence is infinite.