I am given a function describing a curve:
$$f(t) = \bigl(f_1(t), f_2(t)\bigr)',\quad t \in \mathbb{R},\quad f_1, f_2:\ \mathbb{R} \rightarrow \mathbb{R}\,.$$
How would I calculate the length of that curve corresponding to a given $t$-interval $[a, b]$?
Writing $\gamma$ for the curve parametrized by $f$, the required length is given by the formula $L(\gamma\,|\,a,b)\ =\ \int_a^b\sqrt{\dot f_1^2(t)+\dot f_2^2(t)}\ dt\,.$ There are two possibilities:
${\it Either}$ you find an elementary primitive $\Sigma(t)$ of the function $\sigma(t):=\sqrt{\dot f_1^2(t)+\dot f_2^2(t)}$. In this case the length is simply $\Sigma(b)-\Sigma(a)$. Examples of such $\gamma$ are parabolas, circles, and logarithmic spirals, but not ellipses (their arc length leads to a non-elementary elliptic integral).
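As a quick worked example of the first route (my own illustration, not part of the original answer): for the circle $f(t)=(r\cos t,\ r\sin t)'$ one has $\dot f_1(t)=-r\sin t$ and $\dot f_2(t)=r\cos t$, hence
$$\sigma(t)=\sqrt{r^2\sin^2 t+r^2\cos^2 t}=r,\qquad \Sigma(t)=rt,\qquad L(\gamma\,|\,a,b)=r(b-a)\,,$$
which for $[a,b]=[0,2\pi]$ recovers the familiar circumference $2\pi r$.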
${\it Or}$ you have to resort to a numerical procedure. The simplest is the trapezoidal rule: put $t_k:=a+k\,{b-a \over N}$ $\ (0\leq k\leq N)$ for some large $N$ and compute the following approximation: $L(\gamma\,|\,a,b)\ \doteq\ {b-a \over N}\Bigl({1\over 2}\bigl(\sigma(t_0)+\sigma(t_N)\bigr)+\sum_{k=1}^{N-1} \sigma(t_k)\Bigr)\,.$
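Here is a minimal Python sketch of that trapezoidal approximation; the function names `f1dot`, `f2dot`, and the ellipse test case are my own choices for illustration, not from the answer itself:

```python
import math

def curve_length(f1dot, f2dot, a, b, N=10_000):
    """Trapezoidal-rule approximation of the arc length
    L = integral_a^b sqrt(f1'(t)^2 + f2'(t)^2) dt."""
    sigma = lambda t: math.hypot(f1dot(t), f2dot(t))
    h = (b - a) / N
    # Endpoints carry weight 1/2, interior nodes weight 1.
    total = 0.5 * (sigma(a) + sigma(b))
    total += sum(sigma(a + k * h) for k in range(1, N))
    return h * total

# Example: quarter arc of the ellipse x = 2 cos t, y = sin t,
# where no elementary primitive exists, so we go numerical.
print(curve_length(lambda t: -2 * math.sin(t),
                   lambda t: math.cos(t),
                   0.0, math.pi / 2))
```

Doubling $N$ roughly quarters the error, since the trapezoidal rule converges at rate $O(N^{-2})$ for smooth integrands.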