I'm reading about time series and it got me wondering: can you differentiate a function that contains a random variable?
For example:
$f(t) = a t + b + \epsilon$
where $\epsilon \sim N(0,1)$. Then:
$\frac{df}{dt} = \lim\limits_{\delta t \to 0} \frac{f(t + \delta t) - f(t)}{\delta t} = \frac{a\,\delta t + \epsilon_2 - \epsilon_1}{\delta t} = a + \frac{\epsilon_2 - \epsilon_1}{\delta t}$,
where $\epsilon_1$ and $\epsilon_2$ are the (independent) noise values at $t$ and $t + \delta t$.
But:
$\epsilon_2 - \epsilon_1 = \xi$
where $\xi \sim N(0,2)$.
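As a quick sanity check on that step, here is a small simulation I might run (the use of `numpy`, the seed, and the sample size are just my choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent draws of the noise, one for each time point.
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

xi = eps2 - eps1
# The difference of two independent N(0,1) variables should have
# mean 0 and variance 1 + 1 = 2.
print(xi.mean(), xi.var())  # roughly 0.0 and 2.0
```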
But this means we are dividing a random variable by an infinitesimally small quantity, so $\xi/\delta t$ will blow up except in the case where $\xi$ happens to be exactly 0, which has probability zero. Am I doing something wrong?
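To illustrate what I mean, here is a rough sketch of the difference quotient under my interpretation that a fresh, independent $\epsilon$ is drawn every time $f$ is evaluated (the values of `a`, `b`, `t`, and the grid of step sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 1.0

def f(t):
    # A fresh, independent noise term on every evaluation (my interpretation).
    return a * t + b + rng.standard_normal()

t = 0.5
for dt in [1e-1, 1e-2, 1e-3, 1e-4, 1e-5]:
    quotient = (f(t + dt) - f(t)) / dt
    print(f"dt = {dt:.0e}: difference quotient = {quotient:.3e}")

# The 'a' part stays fixed, but the xi/dt part scales like 1/dt,
# so the quotient grows instead of converging as dt shrinks.
```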