This should be easy, but I cannot figure out what I'm doing wrong, and it's killing me.
Let $\mathbf{f}(t)$ be a function from $\mathbb{R}$ to $\mathbb{R}^3$. I want to find the rate of change of the angle between $\mathbf{f}(t)$ and nearby vectors $\mathbf{f}(t+h)$ on the same curve.
I'm going to assume $|\mathbf{f}(t)|=1$ for all $t$, because it makes the writing easier and doesn't change the answer I get.
So we want to look at nearby vectors $\mathbf{f}(t+h)$, and the angle between that and $\mathbf{f}(t)$ is just $\mathbf{f}(t+h)\cdot\mathbf{f}(t)$. The rate of change is then
$$ \lim_{h\to0}\dfrac{\mathbf{f}(t+h)\cdot\mathbf{f}(t)-1}{h} $$
And this is just $\mathbf{f}'(t)\cdot\mathbf{f}(t)$, which is $0$ (because of the assumption above about the norm of $\mathbf{f}$).
But the right answer is (I think) $|\mathbf{f}'(t)|$. What am I doing wrong?
For context, this comes up in do Carmo's book on Differential Geometry, where it is claimed that for an arc-length parameterized curve $\alpha$, $|\alpha''(t)|$ measures the rate of change of the angle that neighboring tangents make with the tangent at $t$. Think of $\mathbf{f}$ above as $\alpha'$.
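To make the discrepancy concrete, here is a quick numerical sketch (my own toy example, not from do Carmo), taking $\mathbf{f}(t) = (\cos t, \sin t, 0)$ so that $|\mathbf{f}'(t)| = 1$. The difference quotient of the dot product in my limit above goes to $0$, while the difference quotient of the actual angle, $\arccos(\mathbf{f}(t+h)\cdot\mathbf{f}(t))$, goes to $1 = |\mathbf{f}'(t)|$:

```python
import numpy as np

# Toy unit-speed example: f(t) = (cos t, sin t, 0), so |f(t)| = 1 and |f'(t)| = 1.
def f(t):
    return np.array([np.cos(t), np.sin(t), 0.0])

t = 0.7
for h in [1e-2, 1e-3, 1e-4]:
    dot = f(t + h) @ f(t)
    # Difference quotient of the dot product: tends to 0 as h -> 0.
    rate_dot = (dot - 1.0) / h
    # Difference quotient of the actual angle arccos(dot): tends to |f'(t)| = 1.
    # (clip guards against dot creeping above 1 from floating-point roundoff)
    angle = np.arccos(np.clip(dot, -1.0, 1.0))
    rate_angle = angle / h
    print(f"h={h:.0e}  d(dot)/h={rate_dot:+.6f}  angle/h={rate_angle:.6f}")
```

So both computations are "right"; they just compute different things, and I'd like to understand why the dot-product version loses the information.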