
I was reading the Wikipedia article on numerical methods for ODEs https://en.wikipedia.org/wiki/Numerical_methods_for_ordinary_differential_equations#Consistency_and_order

and I saw that when it discusses "consistency and order", the consistency is defined as $$\lim_{h\to 0} \frac{\delta^h_{n+k}}{h} = 0$$

where $\delta^h_{n+k}$ is the local truncation error with mesh size $h$.

Why is consistency defined this way? My guess for the definition of consistency would have been $\delta^h_{n+k} \to 0$ as $h\to 0$.

2 Answers


It depends on whether you define the local truncation error as $$ y(t+h)-y(t)-h\Phi_f(t,y(t),h) $$ or as the difference of the slopes $$ \frac{y(t+h)-y(t)}h-\Phi_f(t,y(t),h). $$ Wikipedia uses the first; you seem to expect the second.


With the Wikipedia definition of the error, $δ^h_{n+k}→0$ as $h→0$ is a property that holds for every continuous function, regardless of whether it is a solution or not. Consistency means that the method should be at least as good as the explicit Euler method.
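A minimal numerical sketch of this distinction, using the explicit Euler increment $\Phi_f(t,y,h)=f(t,y)$ on the test problem $y'=y$ with exact solution $y(t)=e^t$ (the problem and function names are my own choices, not from the answer):

```python
import math

# Compare the two truncation-error conventions for explicit Euler
# (increment function Φ_f(t, y, h) = f(t, y)) on y' = y, y(t) = e^t.

def f(t, y):
    return y

def exact(t):
    return math.exp(t)

t = 1.0
for h in [1e-1, 1e-2, 1e-3]:
    # First convention (Wikipedia): δ = y(t+h) - y(t) - h*Φ
    delta = exact(t + h) - exact(t) - h * f(t, exact(t))
    # Second convention (difference of slopes): δ / h
    slope_err = delta / h
    print(f"h={h:.0e}  delta={delta:.3e}  delta/h={slope_err:.3e}")

# For Euler, δ ≈ (h²/2)·y''(t), so δ/h ≈ (h/2)·y''(t) → 0: Euler is consistent.
# Contrast with the "do nothing" method Φ ≡ 0: there δ = y(t+h) - y(t) still
# tends to 0 for ANY continuous y, but δ/h → y'(t) ≠ 0, so δ → 0 by itself
# says nothing about consistency.
```

Each tenfold refinement of $h$ shrinks `delta` by roughly 100× and `delta/h` by roughly 10×, which is the $\delta = O(h^2)$, $\delta/h = O(h)$ behavior the two conventions predict.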

  • Isn't my definition using the first approach? I am taking the difference of the two solutions. (2017-01-12)
  • In that case, $δ^h_{n+k}→0$ as $h→0$ is a property that is true for all continuous functions, regardless of whether they are solutions or not. (2017-01-12)

Suppose you have $y(0)$ and your goal is to obtain $y(t)$ where $t$ is some small positive number. If $t$ is small enough and the method is consistent, you should be able to run your numerical method with $h=t/N$ for large $N$, and then as $N \to \infty$ you should get convergence to $y(t)$. This means that you will incur the local truncation error $N=O(1/h)$ times in the course of the calculation. Thus the local truncation error had better be $o(h)$ so that the overall error is $o(1)$.

(If $t$ is no longer required to be small then you need stability as well, as you probably are aware.)
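The accumulation argument above can be checked numerically. The sketch below (my own illustration, assuming explicit Euler on $y'=y$, $y(0)=1$, integrated to $t=1$) incurs an $O(h^2)$ local error $N = 1/h$ times, producing the expected $O(h)$ global error:

```python
import math

# Explicit Euler on y' = y, y(0) = 1, up to t_end. The local error per step
# is O(h²); it is committed N = t_end/h times, so the global error is O(h),
# i.e. o(1) as h → 0.

def euler(f, y0, t_end, n_steps):
    h = t_end / n_steps
    y, t = y0, 0.0
    for _ in range(n_steps):
        y += h * f(t, y)  # one Euler step: commits an O(h²) local error
        t += h
    return y

f = lambda t, y: y
for n in [10, 100, 1000]:
    err = abs(euler(f, 1.0, 1.0, n) - math.e)
    print(f"N={n:5d}  h={1/n:.0e}  global error={err:.3e}")

# The global error shrinks roughly 10× each time N grows 10×: first-order
# convergence, as the o(h)-local / O(1/h)-steps bookkeeping predicts.
```

If the local error were merely $O(h)$ rather than $o(h)$, the $N = O(1/h)$ repetitions would leave an $O(1)$ residual error that refinement could not remove.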