
When we consider the function $f(x)=1+x+x^2+x^3+\cdots$ (the series $1+x+x^2+\cdots+x^n$ as $n$ tends to infinity), we can rewrite it as $f(x)=1+x(1+x+x^2+x^3+\cdots)=1+x\,f(x)\qquad (1)$ After some algebraic manipulation, we arrive at $f(x)=\frac{1}{1-x}\qquad (2)$ This expression is used to assign a "sum" to certain divergent series. If we take another look at (1), however, we can also write $f(x)=1+x(1+x\,f(x))$, from which we can deduce that $f(x)=\frac{1+x}{1-x^2}$. Of course, we can produce infinitely many expressions for $f(x)$ in this manner, but I only see (2) in the literature. Is there a good reason for this phenomenon? Thanks,
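Spelling out the algebraic manipulation (treating the series formally, or assuming $|x|<1$ so it converges):

```latex
f(x) = 1 + x\,f(x)
\;\Longrightarrow\; (1-x)\,f(x) = 1
\;\Longrightarrow\; f(x) = \frac{1}{1-x}
```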

Max

3 Answers


$1-x^2=(1-x)(1+x)$ so you did not find a new solution.

  • No, I just meant that a functional equation is not unique without thinking about the domain of the function. (2011-05-05)

Notice that

$f(x) = \frac{1+x}{1-x^2} = \frac{1+x}{(1+x)(1-x)} = \frac{1}{1-x}$

so you end up with the same function.
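A quick symbolic check makes this concrete (a sketch, assuming SymPy is available):

```python
# Verify that (1+x)/(1-x**2) and 1/(1-x) define the same rational function.
import sympy as sp

x = sp.symbols('x')
f1 = (1 + x) / (1 - x**2)
f2 = 1 / (1 - x)

# simplify(f1 - f2) reduces to 0, so the two expressions agree
# wherever both are defined (x != 1 and x != -1).
assert sp.simplify(f1 - f2) == 0
```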

  • That's [Cesàro summation](http://en.wikipedia.org/wiki/Cesàro_summation#Examples). (2011-05-05)

Let $f(x) = \sum x^k$ and let $g(x) = 1/(1-x)$. Then where $f$ makes sense (namely for $|x| < 1$), the functions $f$ and $g$ agree. This answers your last question, "Is there a good reason for this phenomenon?"
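A minimal numerical sketch of this agreement: for $|x| < 1$, partial sums of the series approach $1/(1-x)$.

```python
# Partial sums of the geometric series sum_{k=0}^{n-1} x**k
# converge to 1/(1-x) when |x| < 1.
def partial_sum(x, n):
    """Sum of x**k for k = 0..n-1."""
    return sum(x**k for k in range(n))

x = 0.5
approx = partial_sum(x, 50)
exact = 1 / (1 - x)          # = 2.0 for x = 0.5
assert abs(approx - exact) < 1e-9
```

For $|x| \ge 1$ the partial sums diverge, which is why the identification of the series with $1/(1-x)$ outside $|x|<1$ is a matter of summation convention rather than ordinary convergence.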