Wiener Process $dB^2=dt$

Why is $dB^2=dt$? Every online source I've come across lists this as an exercise or just states it, but why isn't it ever explicitly proved? I know that $dB=\sqrt{dt}\,Z$, but I don't know what squaring a Gaussian random variable means.
-
I think the answer to my bounty clarification request is that a simple calculation shows that the standard deviation of $dB^2$ is of the order of $dt$ (its variance is $2\,dt^2$), while its expectation is exactly $dt$; summed over the $\sim 1/dt$ increments of a partition of $[0,t]$, the fluctuations are of order $\sqrt{dt}$ and so the randomness can be ignored. – 2013-09-26
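To spell out the calculation behind that comment (a sketch added for clarity): write $\Delta B = \sqrt{\Delta t}\, Z$ with $Z \sim \mathcal{N}(0,1)$, so that

$\mathbb{E}[\Delta B^2] = \Delta t \, \mathbb{E}[Z^2] = \Delta t, \qquad \mathrm{Var}(\Delta B^2) = \Delta t^2 \, \mathrm{Var}(Z^2) = 2\,\Delta t^2.$

Over a partition of $[0,t]$ into $n = t/\Delta t$ independent increments,

$\mathbb{E}\Big[\sum_{j=1}^n \Delta B_j^2\Big] = t, \qquad \mathrm{Var}\Big(\sum_{j=1}^n \Delta B_j^2\Big) = 2n\,\Delta t^2 = 2t\,\Delta t \xrightarrow{\Delta t \to 0} 0,$

so the sum of squared increments concentrates at the deterministic value $t$.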
4 Answers
$dB_t^2 = dt, \qquad (dt)^2 = 0, \qquad dB_t \, dt = 0 \tag{1}$ are basically rules to simplify the calculation of the quadratic (co)variation of Itô processes - and nothing more:
Let $(B_t)_{t \geq 0}$ be a one-dimensional Brownian motion and $(X_t)_{t \geq 0}$ an Itô process, i.e.
$dX_s = \sigma(s) \, dB_s + b(s) \, ds$
Then, by Itô's formula,
$f(X_t)-f(X_0) = \int_0^t f'(X_s) \, dX_s + \frac{1}{2} \int_0^t f''(X_s) \, \sigma^2(s) \, ds. \tag{2}$
The point is: If we simply apply the rules in $(1)$, we obtain
$dX_s^2 = (\sigma(s) \, dB_s + b(s) \, ds)^2 = \sigma^2(s) \underbrace{dB_s^2}_{ds} + 2 b(s) \sigma(s) \underbrace{dB_s \, ds}_{0} + b^2(s) \, \underbrace{ds^2}_{0} \\ = \sigma^2(s) \, ds.$
Therefore, we can rewrite $(2)$ in the following way:
$f(X_t)-f(X_0) = \int_0^t f'(X_s) \, dX_s + \frac{1}{2} \int_0^t f''(X_s) \, dX_s^2,$
i.e. Itô's formula justifies the calculation rules in $(1)$.
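As a quick illustration of how the rules in $(1)$ are used, take $f(x) = x^2$ and $X_t = B_t$ (so $\sigma \equiv 1$, $b \equiv 0$); then

$d(B_t^2) = 2 B_t \, dB_t + \frac{1}{2} \cdot 2 \, \underbrace{dB_t^2}_{dt} = 2 B_t \, dB_t + dt,$

i.e. $B_t^2 - t = 2 \int_0^t B_s \, dB_s$.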
The (mathematical) reason why this works can be seen in the proof of Itô's formula: one can show that
$\sum_{j=1}^n g(B_{t_{j-1}}) \cdot (B_{t_j}-B_{t_{j-1}})^2 \tag{3}$
converges to
$\int_0^t g(B_s) \, ds$
as the mesh size $|\Pi|$ of the partition $\Pi=\{0=t_0<t_1<\ldots<t_n=t\}$ tends to $0$. This motivates the definition
$\int_0^t g(B_s) \, dB_s^2 := \lim_{|\Pi| \to 0}\sum_{j=1}^n g(B_{t_{j-1}}) \cdot (B_{t_j}-B_{t_{j-1}})^2.$
Consequently,
$\int_0^t g(B_s) \, dB_s^2 = \int_0^t g(B_s) \, ds.$
Similar reasoning applies to Itô processes. Note that these integrals, i.e. integrals of the form

$\int_0^t g(X_s) \, dX_s^2,$

are exactly the integrals appearing in Itô's formula $(2)$ (with $g$ in the role of $f''$, up to the factor $\frac{1}{2}$).
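A minimal numerical sketch of the convergence in $(3)$ (added for illustration, not part of the original answer; assumes numpy and takes $g = \cos$ as an arbitrary continuous test function):

```python
import numpy as np

rng = np.random.default_rng(0)
t, n = 1.0, 1_000_000           # time horizon and number of partition points
dt = t / n

# Sample Brownian increments and build the path B_{t_0}, ..., B_{t_n}.
dB = rng.normal(0.0, np.sqrt(dt), size=n)
B = np.concatenate(([0.0], np.cumsum(dB)))

g = np.cos                      # any continuous test function g

# Left-hand side of (3): Riemann-type sum against the squared increments.
lhs = np.sum(g(B[:-1]) * dB**2)

# Right-hand side: ordinary Riemann sum for \int_0^t g(B_s) ds.
rhs = np.sum(g(B[:-1])) * dt

print(lhs, rhs)                 # the two values agree up to O(sqrt(dt))
```

Refining the partition shrinks the discrepancy between the two sums like $\sqrt{dt}$, consistent with the variance computation above.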
-
The only heuristic explanation I know is the following: by the scaling property, $B_t \sim \sqrt{t} B_1$. Thus (heuristically!) $dB_t^2 = B_1^2 \, dt$. Since $\mathbb{E}B_1^2 = 1$, we have $dB_t^2 = dt$. This might give you some intuition why this works, but as @Did mentioned several times, the formula is based on a much deeper result. – 2013-09-30
For independent random variables, the variance of a sum equals the sum of the variances. So $\mathbb{E}((\Delta B)^2)=\Delta t$: if you increment $t$ a little bit, then the variance of the value of $B$ before that increment plus the variance of the increment equals the variance of the value of $B$ after the increment, i.e. $t + \mathbb{E}((\Delta B)^2) = t + \Delta t$.
Or you could say $ \frac{\mathbb{E}((\Delta B)^2)}{\Delta t} = 1. $ That much follows easily from the first things you hear about the Wiener process. I could then say "take limits", but that might be sarcastic, so instead I'll say that for a fully rigorous answer, I'd have to do somewhat more work.
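A Monte Carlo check of this ratio (an added sketch; assumes numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1e-4

# Many independent increments Delta B ~ N(0, dt).
dB = rng.normal(0.0, np.sqrt(dt), size=1_000_000)

print(np.mean(dB**2) / dt)  # ~ 1, i.e. E[(Delta B)^2] = Delta t
print(np.std(dB**2) / dt)   # ~ sqrt(2): the sd of (Delta B)^2 is of order dt
```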
-
@Did: probably too late to re-do this as a new question at this point - it would only cause more confusion. Unless moderators agree and help. – 2013-09-22
Obviously $dB_t^2 \neq dt$, since $dB_t \sim \mathcal{N} (0, dt)$ is a random variable, while $dt$ is deterministic.
As Michael Hardy said, they really meant to say $\mathbb{E} \left[ dB_t^2 \right] = dt$. To convince yourself, compute $ \mathbb{E} \left[ dB_t^n \right] = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2 \pi dt}} \exp\left(-\frac{x^2}{2 dt}\right) x^n dx \, .$
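One way to carry out that computation symbolically (an added sketch; assumes sympy is available):

```python
import sympy as sp

x = sp.symbols("x", real=True)
dt = sp.symbols("dt", positive=True)

# Density of dB_t ~ N(0, dt).
density = sp.exp(-x**2 / (2 * dt)) / sp.sqrt(2 * sp.pi * dt)

for n in range(1, 5):
    moment = sp.integrate(x**n * density, (x, -sp.oo, sp.oo))
    print(n, sp.simplify(moment))
# 1 -> 0, 2 -> dt, 3 -> 0, 4 -> 3*dt**2
```

The odd moments vanish, and $\mathbb{E}[dB_t^n] = (n-1)!! \, dt^{n/2}$ for even $n$; in particular $\mathbb{E}[dB_t^2] = dt$, while all higher even moments are $o(dt)$.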
-
...Where the author takes care to repeat regularly that the derivation is "informal". – 2014-08-13