
Let $h:[0,\infty) \to \mathbb{R}$ be a measurable function that is square integrable on $[0,t]$ for every $t \geq 0$.

I want to show that if $H_t = \int_0^t h(u)\;dB_u$, where $(B_t)_{t\geq0}$ is a standard Brownian motion, then $$\operatorname{Cov}(H_s,H_t)=\int_0^s h^2(u)\;du, \quad \text{for } s \leq t.$$

Using the characteristic function of the normal distribution and Itô's lemma, I have shown that

$$H_t \sim \mathcal{N}\left(0,\int_0^t h^2(u)\;du\right).$$
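
(For reference, a sketch of one way this computation goes: apply Itô's lemma to $M_t = \exp\left(i\theta H_t + \frac{\theta^2}{2}\int_0^t h^2(u)\;du\right)$, which gives $dM_t = i\theta h(t) M_t\;dB_t$, so $M$ is a martingale and $$E\left[e^{i\theta H_t}\right] = e^{-\frac{\theta^2}{2}\int_0^t h^2(u)\;du},$$ the characteristic function of $\mathcal{N}\left(0,\int_0^t h^2(u)\;du\right)$.)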

So it only remains to show that $\int_s^t h(u)\;dB_u$ is independent of $\mathcal{F}_s$, where $(\mathcal{F}_t)_{t\geq 0}$ is the natural filtration of $B$.

The problem here is that $\int_s^t h(u)\;dB_u$ is a limit, in the $\displaystyle \sup_{t\geq 0}\|\cdot\|_2$ norm, of integrals of simple processes of the form $\int_0^{\infty}\displaystyle\sum_{k=1}^{N}a_k\mathbf{1}_{(s_k,t_k]}(u)\;dB_u$, where the $a_k$ are deterministic constants (in this case) and $s_1 > s$.

The stochastic integrals of simple processes (as above) are certainly independent of $\mathcal{F}_s$, but how can I deduce that this independence is preserved after passing to the limit in the $\displaystyle \sup_{t\geq 0}\|\cdot\|_2$ norm?

  • Notice the inequality $\| E[ X | \mathcal{G} ] \|_p \leq \| X \|_p$ for $p \geq 1$. Thus if $X_n, X \in L^p$ and $X_n \stackrel{L^p}{\longrightarrow} X$, then $E[X_n|\mathcal{G}] \stackrel{L^p}{\longrightarrow} E[X|\mathcal{G}]$. Hence if $(\phi_n)$ is a sequence of simple processes approximating $h$, then $$0 = E \left[ \int_{s}^{t} \phi_n \; dB \Bigg| \mathcal{F}_{s} \right] \stackrel{L^p}{\longrightarrow} E \left[ \int_{s}^{t} h \; dB \Bigg| \mathcal{F}_{s} \right].$$
  • This computes the covariance beautifully and shows that $E \left[ \int_{s}^{t} h \; dB \Bigg| \mathcal{F}_{s} \right]=0$ a.s. Thank you a lot. Still, I can't deduce the independence directly: I could now invoke that $\int_0^s h(u)\;dB_u$ and $\int_s^t h(u)\;dB_u$ are two jointly normal random variables with zero covariance, and so must be independent, but $\mathcal{F}_s$ could be bigger than $\sigma\left(\int_0^v h(u)\;dB_u,\; v \leq s\right)$. (One way to finish is sketched right after these comments.)
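
One standard way to close this remaining gap, sketched here, uses characteristic functions and only the fact that each simple-process integral is independent of $\mathcal{F}_s$: write $X_n = \int_s^t \phi_n \; dB$ and $X = \int_s^t h \; dB$, so that $X_n \to X$ in $L^2$. For every $A \in \mathcal{F}_s$ and $\theta \in \mathbb{R}$,

$$E\left[e^{i\theta X}\mathbf{1}_A\right] = \lim_{n\to\infty} E\left[e^{i\theta X_n}\mathbf{1}_A\right] = \lim_{n\to\infty} E\left[e^{i\theta X_n}\right]P(A) = E\left[e^{i\theta X}\right]P(A),$$

where the outer equalities hold because $e^{i\theta X_n} \to e^{i\theta X}$ in probability and boundedly, and the middle one uses the independence of $X_n$ from $\mathcal{F}_s$. If $P(A) > 0$, this says the law of $X$ under $P(\cdot \mid A)$ has the same characteristic function as under $P$, so $P(X \in C,\, A) = P(X \in C)\,P(A)$ for every Borel set $C$; that is, $X$ is independent of $\mathcal{F}_s$.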

1 Answer


Just use the integration by parts formula with some cleverly chosen processes. Integration by parts says $$ X_tY_t = X_0Y_0 + \int_0^t X_u\; dY_u + \int_0^t Y_u\; dX_u + [X,Y]_t. $$

For $s < t$ we define the processes $X_t = \int_0^t h(u)\; dB_u = H_t$ and $Y_t = \int_0^t \mathbf{1}_{[0,s]}(u)\, h(u)\; dB_u$ (so that $Y_t = H_s$ for $t \geq s$), and then apply integration by parts. Note that $dX_u = h(u)\; dB_u$ and $dY_u = h(u)\mathbf{1}_{[0,s]}(u)\; dB_u$.
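
To spell out the bracket term that appears below: since $d[B,B]_u = du$, $$[X,Y]_t = \int_0^t h(u)\cdot h(u)\mathbf{1}_{[0,s]}(u)\; d[B,B]_u = \int_0^t h(u)^2\mathbf{1}_{[0,s]}(u)\; du.$$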

$$ H_tH_s = 0 + \int_0^t \left[\int_0^u h(v)\; dB_v \right] h(u)\mathbf{1}_{[0,s]}(u)\; dB_u + \int_0^t \ldots \; dB_u + \int_0^t h(u)^2 \mathbf{1}_{[0,s]}(u)\; du $$

I didn't develop the integrands of the stochastic integrals since they won't matter afterwards. The function $h$ is square integrable, so the stochastic integrals are martingales started at $0$, and hence their expectations are equal to their expectation at time $0$, which is exactly zero. For the same reason, $E(H_t) = 0$ for all $t$.

And so the covariance is given by $$\operatorname{Cov}(H_s,H_t) = E(H_tH_s) - E(H_t)E(H_s) = E\left(\int_0^t h(u)^2 \mathbf{1}_{[0,s]}(u)\; du\right) - 0 = \int_0^s h(u)^2\; du,$$ the last expectation dropping because $h$ is deterministic.
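
As a quick numerical sanity check (not part of the proof), here is a small Monte Carlo sketch of the identity; the specific choices $h(u) = e^{-u}$, $s = 1$, $t = 2$, the step size and the number of paths are arbitrary and only for illustration.

    import numpy as np

    # Monte Carlo check of Cov(H_s, H_t) = int_0^s h(u)^2 du for
    # H_t = int_0^t h(u) dB_u, using left-point (Ito) Riemann sums.
    # h, s, t, the grid and the sample count are arbitrary choices.
    rng = np.random.default_rng(0)

    h = lambda u: np.exp(-u)        # example deterministic integrand
    s, t = 1.0, 2.0                 # fixed times with s < t
    n_steps, n_paths = 500, 20_000

    du = t / n_steps
    u = np.arange(n_steps) * du     # left endpoints of the time grid
    dB = rng.normal(0.0, np.sqrt(du), size=(n_paths, n_steps))  # Brownian increments

    increments = h(u) * dB          # h(u_k) * (B_{u_{k+1}} - B_{u_k})
    H_t = increments.sum(axis=1)                # Ito sum approximating H_t
    H_s = increments[:, u < s].sum(axis=1)      # Ito sum approximating H_s

    mc_cov = np.cov(H_s, H_t)[0, 1]
    exact = (1.0 - np.exp(-2.0 * s)) / 2.0      # int_0^s e^{-2u} du in closed form

    print(f"Monte Carlo Cov(H_s, H_t) ~ {mc_cov:.4f}")
    print(f"int_0^s h(u)^2 du         = {exact:.4f}")

The sample covariance should land within Monte Carlo and discretization error of $\int_0^s h(u)^2\; du \approx 0.432$.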