
Currently I am reading this page, which discusses the Newton–Raphson method of approximating the roots of an equation. It says: given a function $f$ over the reals $x$, and its derivative $f'$, we begin with a first guess $x_{0}$ for a root of the function $f$. Provided the function is reasonably well behaved, a better approximation $x_{1}$ is $$x_{1} = x_{0} - \frac{f(x_{0})}{f'(x_{0})}.$$

First question: What does it mean to say that the function is reasonably well behaved? What, according to them, is "reasonably well behaved"?

The process is repeated as $$x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})}$$ until a sufficiently accurate value is reached.

For example, I tried solving the equation $x^3 - 3x - 4 = 0$, where $f(1) < 0$ and $f(3) > 0$, so there is a solution between $1$ and $3$. By bisection we take the midpoint, so $x_{0} = 2$. A better approximation $x_{1}$ is then given by the formula above.
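Carrying out one step of the formula for this example (a sketch; the function names `f` and `fprime` are just illustrative):

```python
# One Newton step for f(x) = x^3 - 3x - 4, starting from x0 = 2.
def f(x):
    return x**3 - 3*x - 4

def fprime(x):
    return 3*x**2 - 3  # derivative of f

x0 = 2.0
# f(2) = -2 and f'(2) = 9, so x1 = 2 - (-2)/9 = 2 + 2/9
x1 = x0 - f(x0) / fprime(x0)
print(x1)  # 2.2222222222222223
```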

Second question: "until a sufficiently accurate value is reached." When do I reach a "sufficiently accurate" value? How do I know I've reached the point where I can stop applying the formula?
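The usual stopping rule is to iterate until two successive estimates agree to within a chosen tolerance. A minimal sketch of the full iteration with that rule (the function `newton` and its parameters are assumptions for illustration, not from the page being quoted):

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until successive
    estimates differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("Newton's method did not converge")

# Applied to the example x^3 - 3x - 4 = 0 with x0 = 2:
root = newton(lambda x: x**3 - 3*x - 4, lambda x: 3*x**2 - 3, 2.0)
print(root)
```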

  • 1
    Typically one stops when the difference between two successive estimates is less than your desired tolerance. Good computer implementations have more sophisticated stopping rules. 2011-12-20
  • 3
    If the function $f$ has a continuous second derivative, then the mapping $x_n \mapsto x_{n+1}$ can be shown to be a contraction mapping. Therefore, if $f(x)=0$ has a solution $x'$, there exists a neighborhood of $x'$ such that Newton's method will converge. The error can also be approximated. http://www.math.uconn.edu/~kconrad/blurbs/analysis/contractionshort.pdf 2011-12-20
  • 1
    In Andre Nicolas's comment, the statement *is less than your desired tolerance* might mean something to me if an example were shown, I guess. 2011-12-20
  • 1
    @alok: Suppose that you want the "answer" to be within $0.0001$ of the true root. Then you continue until $|x_{n+1}-x_n|<0.0001$. 2011-12-20
  • 2
    "Provided the function is reasonably well behaved" is sloppiness; what really matters is that $x_0$ be sufficiently close to the root; just how close depends on the particulars of $f$ in a way that is *not* captured by any reasonable interpretation of "sufficiently well behaved" **alone**. 2011-12-20

1 Answer


Suppose you want to compute $\sqrt{2}$, so you use $f(x) = x^2-2$ and start with $x_1=1$. Say you want 50 places. Here we go:

x[ 1] = 1.00000000000000000000000000000000000000000000000000
x[ 2] = 1.50000000000000000000000000000000000000000000000000
x[ 3] = 1.41666666666666666666666666666666666666666666666667
x[ 4] = 1.41421568627450980392156862745098039215686274509804
x[ 5] = 1.41421356237468991062629557889013491011655962211574
x[ 6] = 1.41421356237309504880168962350253024361498192577620
x[ 7] = 1.41421356237309504880168872420969807856967187537723
x[ 8] = 1.41421356237309504880168872420969807856967187537695
x[ 9] = 1.41421356237309504880168872420969807856967187537695
x[10] = 1.41421356237309504880168872420969807856967187537695
x[11] = 1.41421356237309504880168872420969807856967187537695
x[12] = 1.41421356237309504880168872420969807856967187537695
x[13] = 1.41421356237309504880168872420969807856967187537695
x[14] = 1.41421356237309504880168872420969807856967187537695
x[15] = 1.41421356237309504880168872420969807856967187537695

So presumably we can stop at $x_8$.
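(The listing above was produced with Maple; as a sketch, Python's `decimal` module can reproduce the same computation. The precision setting of 55 digits is an assumption chosen to leave a few guard digits beyond the 50 places shown.)

```python
from decimal import Decimal, getcontext

getcontext().prec = 55  # 50 places plus a few guard digits (assumption)
x = Decimal(1)          # x[1] = 1
for n in range(10):
    # Newton step for f(x) = x^2 - 2:  x <- x - (x^2 - 2) / (2x)
    x = x - (x * x - 2) / (2 * x)
print(x)  # agrees with sqrt(2) to the full working precision
```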

  • 1
    Abruptly stop at $x_{8}$ just because it "kind of" looks like a value close to $\sqrt{2}$? I think I need some insight on this. Your explanation looks good, though. 2011-12-21
  • 1
    When the change is beyond the 50th place, we don't see it, since I printed only 50 places. Generally, with Newton's method, the number of correct decimals doubles on each iteration. I would bet big money that this $x_8$ is correct within $10^{-49}$. And limitations on the amount I bet reflect not my confidence in the algorithm, but my confidence in Maple, which made the listing you see above. 2011-12-21
  • 1
    Oh, Maple? Then I trust your answer!! (Just kidding. I know what you're saying.) 2011-12-22