A theorem in my book states:
Let $g$ be a continuously differentiable function and $r$ a fixed point of $g$ (i.e. $g(r) = r$). If $|g'(r)| < 1$, then the fixed point iteration is locally convergent to $r$.
In other words, the sequence defined by $r_{i+1} = g(r_i)$ converges to $r$ whenever the starting value is sufficiently close to $r$.
Now take $g(x) = 0.5x^2 + 0.5x$.
First I try to find the fixed points, i.e. the values of $x$ such that:
$x = 0.5x^2 + 0.5x$
Simplifying to $0.5x^2 - 0.5x = 0$ and solving, I get $x = 0$ and $x = 1$.
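(The fixed points can be double-checked numerically; here is a minimal sketch using the quadratic formula, where the variable names are my own choices:)

```python
import math

# Fixed points of g satisfy 0.5x^2 + 0.5x = x, i.e. 0.5x^2 - 0.5x = 0.
a, b, c = 0.5, -0.5, 0.0
disc = math.sqrt(b**2 - 4 * a * c)
roots = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
print(roots)  # [0.0, 1.0]
```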
By testing it on a calculator, I find that both values appear as limits of the sequences I get by repeated iteration. However, when I try to apply the theorem, I find that they shouldn't be!
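(The calculator experiment can be reproduced with a short script; this is only a sketch, and the starting values and iteration count are my own choices:)

```python
def g(x):
    return 0.5 * x**2 + 0.5 * x

def iterate(g, x0, n=50):
    """Repeatedly apply g, starting from x0, and return the last value."""
    x = x0
    for _ in range(n):
        x = g(x)
    return x

# Starting exactly at a fixed point, the iteration stays there.
print(iterate(g, 1.0))
# Starting elsewhere, one can watch which fixed point (if any) attracts.
print(iterate(g, 0.3))
```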
By the theorem:
Since $g'(x)$ is continuous, the function $g(x)$ is continuously differentiable. Next, $g(0) = 0$ and $g(1) = 1$. BUT $|g'(0)| = 1$ and $|g'(1)| = 2$; neither is less than $1$, as the theorem requires.
What am I doing wrong?