Suppose I have a diffusion $dX_t = a(X_t)\,dt + b(X_t)\,dW_t$. Is there a straightforward way of estimating the variance of $X_T$ for some time $T$, assuming $T$ is large enough that a simple single-step Euler approximation isn't accurate?
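To make precise what I mean by the simple approximation (assuming a fixed initial condition $X_0 = x_0$): a single Euler step over $[0, T]$ gives, roughly,

$$X_T \approx x_0 + a(x_0)\,T + b(x_0)\sqrt{T}\,Z, \qquad Z \sim \mathcal{N}(0,1),$$

so $\operatorname{Var}(X_T) \approx b(x_0)^2\,T$, which ignores how $a$ and $b$ vary along the path and clearly breaks down for large $T$.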
Clearly, Monte Carlo methods could be used here, but I'd like something more analytical.
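For concreteness, here is the kind of brute-force estimate I'm trying to avoid: a minimal Euler–Maruyama Monte Carlo sketch in Python, where the particular choices of $a$, $b$, step count, and path count are just illustrative placeholders.

```python
import numpy as np

# Placeholder coefficients -- substitute the drift/diffusion of interest.
def a(x):
    return -0.5 * x               # example: mean-reverting drift

def b(x):
    return 0.2 + 0.1 * np.abs(x)  # example: state-dependent volatility

def mc_variance(x0, T, n_steps=1_000, n_paths=100_000, seed=0):
    """Estimate Var(X_T) by simulating Euler-Maruyama paths of
    dX_t = a(X_t) dt + b(X_t) dW_t from X_0 = x0."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x += a(x) * dt + b(x) * dW
    return x.var(ddof=1)  # sample variance across terminal values

print(mc_variance(x0=1.0, T=5.0))
```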
Any ideas?
Many thanks.