
(I'm opening a new post because the claim has been found in another book and now I have an exact description)

I wish to prove this claim I found in a numerical analysis book :

Assume that $\varphi(x)$ is $p$ times continuously differentiable. Then the iteration method $x_{n+1} = \varphi(x_n)$ is of order $p$ for the root $\alpha$ if and only if $\varphi^{(j)}(\alpha) = 0$ for $j = 1, \dots, p-1$, and $\varphi^{(p)}(\alpha) \neq 0$.

I'm not sure how to prove this but I think Taylor might help (don't know how though...)


It is unfortunate that the word "root" is used here, as we are not finding a root of $\varphi$; we are finding a fixed point $\varphi(\alpha) = \alpha.$ Indeed, with the intermediate derivatives vanishing at $\alpha,$ we have the Taylor estimate $$\varphi(\alpha + h) = \alpha + C\, h^p + o(h^p), \qquad C = \frac{\varphi^{(p)}(\alpha)}{p!}.$$
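Writing $e_n = x_n - \alpha$ for the error, this estimate gives the forward direction of the claim directly (a sketch, using the notation above):

$$e_{n+1} = x_{n+1} - \alpha = \varphi(\alpha + e_n) - \alpha = \frac{\varphi^{(p)}(\alpha)}{p!}\, e_n^{p} + o\!\left(e_n^{p}\right), \qquad\text{so}\qquad \frac{|e_{n+1}|}{|e_n|^{p}} \longrightarrow \frac{\bigl|\varphi^{(p)}(\alpha)\bigr|}{p!} \neq 0$$

as $e_n \to 0,$ which is exactly the definition of convergence of order $p.$ The converse follows by noting that if some earlier derivative $\varphi^{(j)}(\alpha)$ were nonzero, the leading Taylor term would be $h^j$ with $j < p$ instead.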

Most likely what is being discussed are methods related to Newton's method. In this language, Newton's method takes the search for a root $\alpha$ of some function $f(x)$ and transforms it into the search for a fixed point (the same unknown $\alpha$) of a new function $\varphi(x) = x - f(x)/f'(x).$ Newton's method typically gives $p=2.$
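To see the quadratic convergence numerically, here is a small sketch (my own example, not from the book): Newton's method for $f(x) = x^2 - 2$ rewritten as a fixed-point iteration. At $\alpha = \sqrt 2$ one can check $\varphi'(\alpha) = 0$ and $\varphi''(\alpha) \neq 0,$ so the theorem predicts order $p = 2.$

```python
import math

def phi(x):
    # Newton fixed-point map for f(x) = x^2 - 2:
    # phi(x) = x - f(x)/f'(x) = x - (x^2 - 2)/(2x) = x/2 + 1/x
    return x / 2 + 1 / x

alpha = math.sqrt(2)
x = 1.5
errors = []
for _ in range(5):
    x = phi(x)
    errors.append(abs(x - alpha))

# Each error is roughly the square of the previous one:
# the number of correct digits about doubles per step.
print(errors)
```

Watching the printed errors, the exponents drop like $10^{-3}, 10^{-6}, 10^{-12}, \dots$, the signature of order $2.$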

I think you need to do some experiments with a calculator. Take $\varphi(x) = x^3,$ start with any seed value $x_0$ between $0.01$ and $0.99$, and carefully note the size of each $x_n.$ Then do the same experiment with $\varphi(x) = x^2,$ which resembles typical Newton behavior.
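The suggested experiment can be sketched in a few lines of Python (seed value $0.5$ chosen arbitrarily). Both maps have the fixed point $\alpha = 0$; for $x^3$ the first nonzero derivative there is the third ($p = 3$), for $x^2$ it is the second ($p = 2$), so the count of correct digits should roughly triple and double per step, respectively.

```python
def iterate(phi, x0, steps):
    # Run the fixed-point iteration x_{n+1} = phi(x_n), keeping all iterates.
    xs = [x0]
    for _ in range(steps):
        xs.append(phi(xs[-1]))
    return xs

cubes   = iterate(lambda x: x**3, 0.5, 4)  # order 3: |x_{n+1}| = |x_n|^3
squares = iterate(lambda x: x**2, 0.5, 4)  # order 2: |x_{n+1}| = |x_n|^2

print(cubes)    # exponents roughly triple each step
print(squares)  # exponents roughly double each step
```

Here the relation $|x_{n+1}| = |x_n|^p$ holds exactly, so after four steps the cubic map is already near $10^{-25}$ while the quadratic map is only near $10^{-5}.$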

See Taylor's theorem.