
On page 7 of the book 'An Introduction to Numerical Analysis' by Süli and Mayers, a proof of the Contraction Mapping Theorem is given. Hopefully somebody has the book or can follow the equations as I write them here, because I have left out a lot of the background (the proof draws on several earlier statements).

While proving that $(x_k)$ converges to the fixed point $\epsilon$, the book states the following:

1) $|x_k - \epsilon| = |g(x_{k-1}) - g(\epsilon)| \leq L|x_{k-1} -\epsilon|$ with $k \geq 1$

It then says 'we can deduce by induction' that

2) $|x_k - \epsilon| \leq L^k|x_0 - \epsilon|$ with $k \geq 1$

I can't see how to show by induction that 1) leads to 2). The base case $k=1$ is trivial, but I can't see how the inductive step is handled. Any ideas?

1 Answer


$$ |x_k - \epsilon| = |g(x_{k-1}) - g(\epsilon)| \leq L|x_{k-1} -\epsilon| \leq L^2 |x_{k-2}-\epsilon| \leq L^3 |x_{k-3}-\epsilon| \leq \ldots $$ I hope you can fill in the very easy details.
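To spell out the inductive step (a sketch in the notation of the question): suppose $|x_{k-1} - \epsilon| \leq L^{k-1}|x_0 - \epsilon|$ holds for some $k \geq 1$. Applying 1) and then this hypothesis,

$$ |x_k - \epsilon| = |g(x_{k-1}) - g(\epsilon)| \leq L\,|x_{k-1} - \epsilon| \leq L \cdot L^{k-1}\,|x_0 - \epsilon| = L^{k}\,|x_0 - \epsilon|, $$

which is exactly statement 2) for $k$. Together with the base case $k = 1$ (where 2) coincides with 1)), this completes the induction.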

  • Cheers, I'll try... I always have problems with induction. (2012-09-13)
  • How can you say $L|x_{k-1} -\epsilon| \leq L^2 |x_{k-2}-\epsilon|$? $L$ satisfies $0 < L < 1$, so it looks as if the terms on the right should get smaller (unless the values $|x_{k-j} - \epsilon|$ are diverging faster than the powers of $L$), which doesn't make sense, since $|x_{k-j} - \epsilon|$ is what is supposed to be converging...? (2012-09-13)
  • Because $|x_{k-1}-\epsilon| = |g(x_{k-2})-g(\epsilon)| \leq L |x_{k-2}-\epsilon|$. (2012-09-13)
  • OK thanks, I'll look at it again. (2012-09-14)