Consider an independent-increment process $X_t$ such that $X_t$ follows a continuous distribution. Examples would be Brownian motion, a Gamma process, or a stable Lévy process.
When sampling such processes on a regular time grid, linear interpolation between grid points is sometimes used.
This got me wondering about the error this introduces into the statistics of $X_t$. That is, with $X_t^{(1)} = (1-\lambda_t) X_{t_i} +\lambda_t X_{t_{i+1}}$ for $t_i < t < t_{i+1}$, where $\lambda_t = \dfrac{t-t_i}{t_{i+1}-t_i}$, how far can the distribution function of $X_t^{(1)}$ deviate from that of $X_t$?
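To make the distortion concrete in the simplest case, here is a small sketch (my own illustration, not part of the question) for standard Brownian motion. There one can compute directly that $\operatorname{Var}\big(X_t^{(1)}\big) = t_i + \lambda_t^2\,(t_{i+1}-t_i)$, which falls short of the true $\operatorname{Var}(X_t) = t_i + \lambda_t\,(t_{i+1}-t_i)$ by $\lambda_t(1-\lambda_t)(t_{i+1}-t_i)$; the grid points $t_i = 1$, $t_{i+1} = 2$ and midpoint $\lambda = 1/2$ below are arbitrary choices for the demonstration:

```python
import numpy as np

# Sketch: compare the variance of the linearly interpolated value with the
# true variance of Brownian motion at an intermediate time.
rng = np.random.default_rng(0)
t_i, t_ip1 = 1.0, 2.0   # arbitrary grid points for illustration
lam = 0.5               # interpolate at the midpoint t = 1.5
n = 200_000             # number of Monte Carlo samples

X_ti = rng.normal(0.0, np.sqrt(t_i), n)            # X_{t_i} ~ N(0, t_i)
incr = rng.normal(0.0, np.sqrt(t_ip1 - t_i), n)    # independent increment
X_tip1 = X_ti + incr                               # X_{t_{i+1}}

# Linear interpolation: Var = t_i + lam^2 * (t_ip1 - t_i) = 1.25,
# whereas the true Var(X_t) at t = 1.5 would be 1.5.
X_interp = (1 - lam) * X_ti + lam * X_tip1
print(X_interp.var())
```

So even at the level of second moments, linear interpolation systematically understates the variance, with the worst case at the midpoint of each grid interval.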
I realize the question is vague, but I am trying to see whether it can be made more precise, perhaps by asking what choice of the function $\lambda_t$ would least distort the statistics.
Thanks for reading.