
Using Lagrange's remainder, I have to prove that:

$\log(1+x) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n}, \; \forall |x| < 1$

I am not quite sure how to do this. I started with the Taylor series for $x_0 = 0$:

$f(x_0) = \sum\limits_{n=0}^\infty \frac{f^{(n)}(x_0)}{n!} \cdot x^n + r_n$, where $r_n$ is the remainder. Then, I used induction to prove that the n-th derivative of $\log(1+x)$ can be written as:

$f^{(n)}(x) = (-1)^{n+1} \cdot \frac{(n-1)!}{(1+x)^n}, \; \forall n \in \mathbb{N}$

I plugged this formula into the Taylor series for $\log(1+x)$ and ended up with:

$f(x_0) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n} + r_n$, which already looked quite promising.

As the formula which I have to prove doesn't have that remainder $r_n$, I tried to show that $\lim_{n \to \infty} r_n = 0$, using Lagrange's remainder formula (for $x_0 = 0$ and $|x| < 1$).

So now I basically showed that the formula was valid for $x \to x_0 = 0$. I also showed that the radius of convergence of this power series is $r = 1$, that is to say the power series converges $\forall |x| < 1$.

What is bugging me is that, in my opinion, the formula is only valid for $x \to 0$. Sure, the radius of convergence is $1$, but does that actually tell me that the formula is valid on $(-1,1)$? I've never done something like this before, hence the insecurity. I'd be delighted if someone could help me out and tell me whether the things I've shown are already sufficient or whether I still need to prove something.
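Not a proof, but as a numerical sanity check I also compared the partial sums with $\log(1+x)$ at several points of $(-1,1)$ (Python; the helper name is my own):

```python
import math

def log1p_partial_sum(x, k):
    """Partial sum S_k(x) = sum_{n=1}^{k} (-1)^(n+1) * x^n / n."""
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, k + 1))

# The series should approach log(1+x) for every |x| < 1,
# though convergence is slower near the endpoints.
for x in (-0.9, -0.5, 0.5, 0.9):
    err = abs(log1p_partial_sum(x, 200) - math.log(1 + x))
    print(f"x = {x:+.1f}: |S_200(x) - log(1+x)| = {err:.2e}")
```

The errors shrink everywhere in $(-1,1)$, but more slowly as $|x|$ approaches $1$, which matches the radius of convergence being $1$.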

  • (But I have now added a sketch and reference for a correct method.) – 2010-12-10

2 Answers


I think there is a problem with the above solution. In the estimate $|f^{(k+1)}(x)| \leq \left(\frac{1}{1-r}\right)^{k+1}$, there is a dropped $k!$. Indeed, it should read $|f^{(k+1)}(x)| \leq \frac{k!}{(1-r)^{k+1}},$ and thus $ |r_k(x)| \leq \left( \frac{r}{1-r} \right)^{k+1} \cdot \frac{k!}{(k+1)!} = \left( \frac{r}{1-r} \right)^{k+1} \cdot \frac{1}{k+1}. $ Unfortunately this expression won't go to $0$ if $r > \frac{1}{2}$, since then the exponential term $\left(\frac{r}{1-r}\right)^{k+1}$ dominates $\frac{1}{k+1}$.
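To see this concretely, here is a small Python sketch (the function name is mine) tabulating the corrected bound $\left(\frac{r}{1-r}\right)^{k+1}\cdot\frac{1}{k+1}$: it still tends to $0$ for $r<\frac12$, but blows up once $r>\frac12$:

```python
def corrected_bound(r, k):
    """The corrected bound (r/(1-r))^(k+1) * 1/(k+1) on |r_k(x)| for |x| <= r."""
    return (r / (1 - r)) ** (k + 1) / (k + 1)

for r in (0.3, 0.49, 0.51, 0.7):
    print(f"r = {r}: k=20 -> {corrected_bound(r, 20):.3e}, "
          f"k=100 -> {corrected_bound(r, 100):.3e}")
```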

The above solution does work for $x \in \left(-\frac{1}{2},1\right)$. Here's a way to handle the remaining cases; in fact, let's just take $x \in (-1,0)$. The integral form of the remainder gives: $ r_k(x) = \int_0^x \frac{f^{(k+1)}(t)}{k!} (x-t)^{k}\, dt = \int_0^x \frac{(-1)^k}{(1+t)^{k+1}} (x-t)^{k}\, dt. $ Note that for $x<0$, the integrand has the same sign for every $t$. In particular, $ |r_k(x)| = \int_x^0 \frac{1}{(1+t)^{k+1}} (t-x)^{k}\, dt. $ Consider $\frac{t-x}{1+t}$ as a function of $t$ with $x$ fixed. It is increasing on $[x,0]$, with maximal value $-x$ at $t=0$. Thus, $ |r_k(x)| \leq \int_x^0 (-x)^k \frac{1}{1+t}\, dt \leq \int_x^0 (-x)^k \frac{1}{1+x}\, dt = \frac{(-x)^{k+1}}{1+x}. $ As desired, this last expression goes to $0$ as $k \to \infty$, since $-1 < x < 0$ means $0 < -x < 1$.
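As a sanity check of the final bound (not part of the proof), the following Python sketch (helper name mine) verifies numerically that $|r_k(x)| \leq \frac{(-x)^{k+1}}{1+x}$ for a few $x \in (-1,0)$, computing $r_k(x)$ directly as $\log(1+x)$ minus the $k$-th partial sum:

```python
import math

def partial_sum(x, k):
    """Partial sum sum_{n=1}^{k} (-1)^(n+1) * x^n / n of the log(1+x) series."""
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, k + 1))

for x in (-0.3, -0.5, -0.8):
    for k in (5, 10, 20):
        remainder = abs(math.log(1 + x) - partial_sum(x, k))
        bound = (-x) ** (k + 1) / (1 + x)
        assert remainder <= bound
        print(f"x = {x}, k = {k}: |r_k| = {remainder:.3e} <= {bound:.3e}")
```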

Note that in Spivak's Calculus, the stronger bound $ |r_k(x)| \leq \frac{(-x)^{k+1}}{(1+x)(1+k)} $ is left as Exercise 16 in Chapter 20. I don't see how to get this, and I'm worried that I'm just making some mistake in the computations above.
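I can't derive it, but a quick numerical check (Python; the helper name is mine) at least suggests Spivak's stronger bound is not at fault, since the directly computed remainder stays below it:

```python
import math

def partial_sum(x, k):
    """Partial sum sum_{n=1}^{k} (-1)^(n+1) * x^n / n of the log(1+x) series."""
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, k + 1))

for x in (-0.3, -0.5, -0.8):
    for k in (5, 10, 20):
        remainder = abs(math.log(1 + x) - partial_sum(x, k))
        spivak_bound = (-x) ** (k + 1) / ((1 + x) * (1 + k))
        assert remainder <= spivak_bound
        print(f"x = {x}, k = {k}: |r_k| = {remainder:.3e} <= {spivak_bound:.3e}")
```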

  • @Jonas: I just put up an alternative solution. I couldn't figure out how to cross things out, so I just rewrote it all. – 2010-12-10

$f(x_0) = \sum\limits_{n=1}^\infty (-1)^{n+1} \cdot \frac{x^n}{n} + r_n$

That should say

$f(x)=\sum_{n=1}^k (-1)^{n+1} \cdot \frac{x^n}{n} + r_k(x),$

where $r_k$ is the error term of the $k^\text{th}$ partial sum. You want to use estimates to show that the error term goes to $0$ as $k$ goes to $\infty$, which will justify convergence of the series to $f(x)=\log(1+x)$.


Edit: I've struck through part of my answer that relied on a wrong estimate of the derivatives, as pointed out by Robert Pollack. With the missing $k!$ term, the estimate only works on $[-\frac{1}{2},1)$.

Added: To make this answer a little more useful, I decided to look up a correct method. Spivak, in his book Calculus (3rd Edition, page 423), uses the formula $\frac{1}{1+t}=1-t+t^2-\cdots+(-1)^{n-1}t^{n-1}+\frac{(-1)^nt^n}{1+t}$ in order to write the remainder as $r_n(x)=(-1)^n\int_0^x\frac{t^n}{1+t}\,dt$. The estimate $\int_0^x\frac{t^n}{1+t}\,dt\leq\int_0^x t^n\,dt=\frac{x^{n+1}}{n+1}$ holds when $x\geq0$, and the harder estimate $\left|\int_0^x\frac{t^n}{1+t}\,dt\right|\leq\frac{|x|^{n+1}}{(1+x)(n+1)}$, when $-1\lt x\leq0$, is given as Problem 11 on page 430. Combining these, you can show that the sequence of remainders converges uniformly to $0$ on $[-r,1]$ for each $r\in(0,1)$.
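If it helps, Spivak's integral form of the remainder can be spot-checked numerically. The Python sketch below (the names and the simple composite-Simpson quadrature are mine) compares $(-1)^n\int_0^x\frac{t^n}{1+t}\,dt$ with $\log(1+x)$ minus the $n$-th partial sum:

```python
import math

def partial_sum(x, n):
    """Partial sum sum_{m=1}^{n} (-1)^(m+1) * x^m / m of the log(1+x) series."""
    return sum((-1) ** (m + 1) * x ** m / m for m in range(1, n + 1))

def simpson(f, a, b, panels=2000):
    """Composite Simpson's rule for the signed integral of f over [a, b]."""
    h = (b - a) / (2 * panels)
    total = f(a) + f(b)
    for i in range(1, 2 * panels):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

for x in (-0.5, 0.5, 0.9):
    for n in (3, 8):
        integral = (-1) ** n * simpson(lambda t: t ** n / (1 + t), 0, x)
        direct = math.log(1 + x) - partial_sum(x, n)
        print(f"x = {x}, n = {n}: integral form = {integral:+.10f}, "
              f"direct remainder = {direct:+.10f}")
```

The two columns agree to quadrature accuracy, for positive and negative $x$ alike.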

Lagrange's form of the error term can be used to do this. The estimates, which follow from Taylor's theorem, are also found on Wikipedia. In this case, if $0\lt r\lt 1$, then $|f^{(k+1)}(x)|\leq \frac{1}{(1-r)^{k+1}}$ whenever $x\geq-r$, so you have the estimate $|r_k(x)|\leq \frac{r^{k+1}}{(1-r)^{k+1}}\cdot\frac{1}{(k+1)!}$ for all $x$ in $(-r,r)$, which you can show goes to $0$ (because $(k+1)!$ grows faster than the exponential term $\left(\frac{r}{1-r}\right)^{k+1}$), thus showing that the series converges uniformly to $\log(1+x)$ on $(-r,r)$. Since $r$ was arbitrary, this shows that the series converges on $(-1,1)$, and the convergence is uniform on compact subintervals.

  • @Huy: By the way, thanks for putting work into making a clear question. If I hadn't used up my 30-vote limit for the day, I would vote it up. – 2010-11-27