I'm trying to find the "first" odd root of a function $f(t)$ (that is, a root across which the sign of the function changes) that is greater than some initial $t_0$ and less than some max bound $t_{\max}$, if there is one. I can evaluate derivatives of the function too, but they get much more expensive the further I go, so I don't want to go beyond the first or second derivative. (The function is a combination of polynomials and trig functions.) The idea is that this would run as part of another algorithm, so I can't visually inspect the function to try to bracket the roots, and the method I use has to be robust.
So far I figure I can walk "downhill" (or "uphill", depending on where we start) from $t_0$ toward $t_{\max}$ by using a Taylor expansion's bound on the maximum error, iterating by taking the largest "safe" step size that provably won't cross 0. Eventually, this should converge to the "first" zero of the function. It's not a great algorithm numerically, but let's say I use either this or some other method to find the first root.
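Here's a minimal sketch of that walk, assuming a caller-supplied constant $M$ with $|f'(t)| \le M$ on $[t_0, t_{\max}]$ (obtaining $M$ is the hard part; the tolerances are made-up placeholders):

```python
def first_root(f, t0, t_max, M, tol=1e-10, h_min=1e-12):
    """Walk from t0 toward t_max without ever stepping across a zero of f.

    Assumes |f'(t)| <= M on [t0, t_max]; then a step of |f(t)| / M can
    at worst land exactly on a root, never jump over one.
    """
    t, ft = t0, f(t0)
    sign0 = 1.0 if ft >= 0 else -1.0
    while t < t_max:
        if abs(ft) <= tol:
            return t                   # converged onto a root
        h = max(abs(ft) / M, h_min)    # h_min keeps the loop finite, at the
        t = min(t + h, t_max)          # cost of a possible tiny overshoot
        ft = f(t)
        if ft * sign0 < 0:
            return t                   # crossed within the h_min floor
    return None                        # no root in (t0, t_max]
```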
If the zero has multiplicity 1, the first derivative there will not be 0, I can test for that, and then I'm done. However, any root of higher multiplicity will also have a first derivative of 0, so the first derivative alone won't tell me whether the multiplicity is even or odd.
So I had the thought that if I could take some step size $h$ and evaluate $f(t+h)$, I could check its sign and see whether the function crossed the x-axis or just bounced off of it (i.e., whether the root is odd or even). If it's an even root, I can start hill-climbing away from 0 and continue with the algorithm. If it's an odd root, I'm done.
However, for this scheme to work, I need to be sure that there aren't any roots in the interval $(t, t+h]$. So I need some way to find a safe step size $h$. I could just use some small epsilon, but I was hoping there was a better symbolic way to approach the problem.
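Assuming for a moment that a safe $h$ is somehow in hand, the check itself is just a sign comparison. A sketch, where `sign_before` is the sign of $f$ just to the left of the root:

```python
def root_parity(f, t_star, h, sign_before):
    """Classify a root at t_star as 'odd' (crossing) or 'even' (bounce),
    assuming f has no other roots in (t_star, t_star + h]."""
    after = f(t_star + h)
    if after * sign_before < 0:
        return "odd"    # f crossed the axis
    return "even"       # f bounced off the axis (or `after` underflowed to 0)
```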
Or, better yet: if there's a way to determine whether a given root has even or odd multiplicity once you have it in hand, that would solve the problem directly.
Specifically, annoying test cases I have in mind:
$f(t) = \cos(t)+1$ <-- Infinite number of even roots
$f(t) = (t-3)^3$ <-- Odd root, but first derivative is 0
$f(t) = t^2-10^{-15}$ <-- Two simple (odd) roots so close together that they aren't necessarily distinguishable from one even root at machine precision. I wouldn't mind counting these cases as even, but I want to be careful throwing epsilons all over the place.
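For what it's worth, running the sign test above on these cases (with an eyeballed $h$, not a certified one) behaves the way I'd hope:

```python
import math

cases = [
    (lambda t: math.cos(t) + 1.0, math.pi),           # even root at pi
    (lambda t: (t - 3.0)**3,      3.0),               # odd triple root at 3
    (lambda t: t*t - 1e-15,       -math.sqrt(1e-15)), # first of two close simple roots
]
h = 1e-3  # eyeballed, NOT certified to be safe
for f, t_star in cases:
    sign_before = math.copysign(1.0, f(t_star - h))
    print(root_parity(f, t_star, h, sign_before))     # -> even, odd, even
```

The third case comes out "even" precisely because $h$ jumps over both nearby roots, which is the behavior I'd want, but only by luck of the chosen $h$.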
...
Specifically, the function of interest has this form:
$f(t) = \mu^T \Pi^T \upsilon$
where:
$\mu = [\theta_1(t)](r_2-r_1)$
$\upsilon = b(t) - [\theta_0(t)]r_0 + [\theta_1(t)]r_1$
$\Pi$ is the 2x2 rotation matrix that rotates a vector by 90 degrees.
$b(t)$ is a 2-element column-vector polynomial (that is, a polynomial whose coefficients are 2D vectors)
$\theta_0(t), \theta_1(t)$ are scalar polynomials
$r_0, r_1, r_2$ are constant vectors.
$[\theta(t)]$ is shorthand for the 2x2 rotation matrix by angle $\theta(t)$.
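For reference, this is roughly how I assemble $f$ and its first couple of derivatives symbolically (sympy, with made-up stand-ins for $b$, $\theta_0$, $\theta_1$, and the $r_i$):

```python
import sympy as sp

t = sp.symbols('t', real=True)

def rot(a):
    """2x2 rotation matrix by angle a (the [theta] shorthand above)."""
    return sp.Matrix([[sp.cos(a), -sp.sin(a)],
                      [sp.sin(a),  sp.cos(a)]])

# Made-up stand-ins for the actual data:
theta0, theta1 = t / 2, t**2 - 1
b = sp.Matrix([t**2 + 1, 2 * t])   # vector-valued polynomial
r0, r1, r2 = sp.Matrix([1, 0]), sp.Matrix([0, 1]), sp.Matrix([2, 1])

Pi = rot(sp.pi / 2)                # fixed 90-degree rotation
mu = rot(theta1) * (r2 - r1)
up = b - rot(theta0) * r0 + rot(theta1) * r1
f = (mu.T * Pi.T * up)[0, 0]       # scalar function of t

f1 = sp.diff(f, t)                 # first derivative (still affordable)
f2 = sp.diff(f1, t)                # second derivative
f_num = sp.lambdify(t, f, 'math')  # fast numeric evaluator for the walk
```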
...
So basically: is there a way to certify that some neighborhood around a given root contains no other roots?
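For concreteness, the kind of certificate I'm imagining, sketched with interval arithmetic (mpmath's `iv` context; the interval extension of $f'$ would have to be built from `iv.cos`, `iv.sin`, etc.): if an enclosure of $f'$ over $[t^*, t^*+h]$ excludes 0, then $f$ is strictly monotone there, so $(t^*, t^*+h]$ is root-free. The obvious catch is that at a multiple root $f'(t^*) = 0$, so this version can never fire there; the same idea one derivative up would be needed.

```python
from mpmath import iv

def interval_is_root_free(fprime_iv, t_star, h):
    """Certify that f has no roots in (t_star, t_star + h]: if f' has
    constant sign on [t_star, t_star + h], f is strictly monotone there,
    so it cannot return to zero after the root at t_star.

    fprime_iv must be an interval extension of f' built with iv.* functions.
    Note: at a multiple root f'(t_star) = 0, so the enclosure contains 0
    and this certificate (correctly) refuses to fire.
    """
    d = fprime_iv(iv.mpf([t_star, t_star + h]))
    return d.a > 0 or d.b < 0   # enclosure of f' bounded away from 0
```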
Or alternatively, I'd welcome any other ideas about how to approach this problem.