When we consider the function $f(x)=1+x+x^2+x^3+\cdots$ (that is, $1+x+\cdots+x^n$ with $n$ tending to infinity), we can rewrite it as
$$f(x)=1+x(1+x+x^2+x^3+\cdots)=1+x\,f(x).\qquad (1)$$
After some algebraic manipulation we arrive at
$$f(x)=\frac{1}{1-x}.\qquad (2)$$
This expression is used to assign a "sum" to certain divergent series. If we take another look at (1), however, we can also write $f(x)=1+x(1+xf(x))$, from which we can deduce that
$$f(x)=\frac{1+x}{1-x^2}.$$
Of course, we can produce infinitely many such expressions for $f(x)$ in this manner, but I only ever see (2) in the literature. Is there a good reason for this?
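For reference, here is the algebra I have in mind, treating the series purely formally (so not worrying about convergence):
$$f(x)=1+x\,f(x)\;\Longrightarrow\;(1-x)f(x)=1\;\Longrightarrow\;f(x)=\frac{1}{1-x},$$
$$f(x)=1+x(1+xf(x))=1+x+x^2f(x)\;\Longrightarrow\;(1-x^2)f(x)=1+x\;\Longrightarrow\;f(x)=\frac{1+x}{1-x^2}.$$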
Thanks,
Max