Could you help me prove that the series $\sum\frac{x^2}{(1+x^2)^n}$ is not uniformly convergent on $[0,b]$ for any $b>0$?
Why is this series not uniformly convergent?
A good start would probably be to find the pointwise sum, even if only to get an overview of the problem. Can you do that? – 2011-10-08
1 Answer
Why this is true: For $n$ large enough, the maximum of the function $x\mapsto x^2/(1+x^2)^n$ on the interval $(0,b)$ is attained at $x=1/\sqrt{n-1}\sim1/\sqrt{n}$ (set the derivative to zero) and equals $(n-1)^{n-1}/n^n\sim1/(n\mathrm e)$. In particular $\sum_n 1/(n\mathrm e)$ diverges, so the Weierstrass $M$-test cannot apply on $(0,b)$.
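A quick numerical sanity check of this claim (the grid size and the value of $n$ below are arbitrary choices, not part of the answer): the argmax of the $n$th term sits near $1/\sqrt n$ and the peak value near $1/(n\mathrm e)$.

```python
import math

# f_n(x) = x^2 / (1 + x^2)^n, the n-th term of the series
def f(x, n):
    return x**2 / (1 + x**2)**n

n, b = 1000, 1.0
xs = [i * b / 10**5 for i in range(1, 10**5)]   # grid on (0, b)
x_max = max(xs, key=lambda x: f(x, n))           # grid argmax of f_n
peak = f(x_max, n)                               # grid maximum of f_n

# Compare with the asymptotic predictions 1/sqrt(n) and 1/(n*e):
print(x_max, 1 / math.sqrt(n))
print(peak, 1 / (n * math.e))
```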
How to prove this is true: The sum of the full series is $1+x^2$ for every $x\ne0$, and its $n$th remainder is $ R_n(x)=\sum\limits_{k\geqslant n+1}\frac{x^2}{(1+x^2)^k}=\frac1{(1+x^2)^n}, $ hence $R_n(x)\to0$ as $n\to\infty$ for every fixed positive $x$, but $\sup\{|R_n(x)|\,;\,x\in(0,b)\}=1$ for every $n$, since $R_n(x)\to1$ as $x\to0^+$. In particular this supremum does not converge to $0$ when $n\to\infty$, which is exactly the failure of uniform convergence.
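The pointwise/uniform dichotomy can be illustrated numerically; this is a sketch with arbitrary grid and sample values of $n$, not part of the proof. For any fixed $x>0$ the remainder $R_n(x)=1/(1+x^2)^n$ collapses to $0$, yet its supremum over a fine grid in $(0,b)$ stays pinned near $1$.

```python
def R(x, n):
    # closed form of the n-th remainder: R_n(x) = 1/(1+x^2)^n
    return 1.0 / (1 + x**2)**n

b = 1.0
x_fixed = 0.5
sups = {}
for n in [1, 10, 100, 1000]:
    xs = [i * b / 10**5 for i in range(1, 10**5)]   # grid on (0, b)
    sups[n] = max(R(x, n) for x in xs)
    print(n, R(x_fixed, n), sups[n])  # pointwise value shrinks; sup stays near 1

# Cross-check the closed form against a truncated direct tail sum at x = 0.5, n = 5
direct = sum(0.5**2 / (1 + 0.5**2)**k for k in range(6, 200))
print(direct, R(0.5, 5))
```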
Hint for the proof that this is true: Think geometric series, that is, note that $r=\frac1{1+x^2}<1$ for every nonzero $x$ and note that, for every $a$, every $|r|<1$ and every nonnegative integer $n$, $\sum\limits_{k\geqslant0}ar^k=\frac{a}{1-r}$ and $\sum\limits_{k\geqslant n}ar^k=\frac{ar^n}{1-r}$.
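The two geometric-series identities in the hint are easy to verify numerically; $a$, $r$, $n$, and the truncation point $N$ below are arbitrary test values, not from the problem.

```python
a, r, n = 3.0, 0.7, 5
N = 200  # truncation point; r**N is negligible at this scale

full = sum(a * r**k for k in range(N))       # approximates sum_{k>=0} a r^k
tail = sum(a * r**k for k in range(n, N))    # approximates sum_{k>=n} a r^k

print(full, a / (1 - r))           # both close to a/(1-r)
print(tail, a * r**n / (1 - r))    # both close to a r^n/(1-r)
```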