Take an analytic function, $f(x)$, defined on all of $(-\infty, \infty)$, with a finite number of critical points (points where $\frac{df}{dx}=0$). You can divide the $y$-axis into intervals, where the boundaries between intervals are the $y$-values at the critical points (see graph).
Within each interval, there is an odd number of real inverse branches, $g_n(y)$. Number them in ascending order. Then, for every function I've tried, $\sum_n (-1)^n g_n(y)=a\,g_m(y)+b$ for some $a$, $b$, and $m$ that are constant across the interval. Does anyone know why this would be the case?
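To make the observation concrete, here is a small numerical check for one test function of my own choosing (not from the question): $f(x)=x^3-3x$, whose critical values are $f(\pm 1)=\mp 2$, so for $y\in(-2,2)$ there are three real branches $g_1(y)<g_2(y)<g_3(y)$. By Vieta's formulas the roots of $x^3-3x-y=0$ sum to zero, so $-g_1+g_2-g_3 = 2g_2$, i.e. $a=2$, $b=0$, $m=2$ for this $f$:

```python
import numpy as np

def alternating_sum(y):
    """Return (sum_n (-1)^n g_n(y), g_2(y)) for f(x) = x^3 - 3x.

    This f is my own example; the claimed identity here is
    -g_1 + g_2 - g_3 = 2*g_2, which follows from Vieta
    (the three roots of x^3 - 3x - y sum to zero).
    """
    # Roots of x^3 - 3x - y = 0 (coefficients, highest degree first)
    roots = np.roots([1.0, 0.0, -3.0, -y])
    # Keep the real roots and sort ascending -> g_1 < g_2 < g_3
    real = np.sort(roots[np.abs(roots.imag) < 1e-9].real)
    assert len(real) == 3, "y must lie strictly inside (-2, 2)"
    # Alternating sum with n = 1, 2, 3
    return -real[0] + real[1] - real[2], real[1]

for y in np.linspace(-1.5, 1.5, 7):
    s, g2 = alternating_sum(y)
    print(f"y={y:+.2f}  alt sum={s:+.6f}  2*g_2(y)={2*g2:+.6f}")
```

For any cubic the alternating sum reduces to an affine function of the middle branch, which matches the pattern in the question; whether the same mechanism explains it for general analytic $f$ is exactly what is being asked.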