Let f be a deterministic function in $L^2[a,b]$ and $X_t=X_a+\int^t_af(s)dB(s)$.
Show that $\int^b_af(t)X_tdB(t)=\frac{1}{2}(X_b^2-X_a^2-\int^b_af(t)^2dt)$.
My attempt:
Since $X_b=X_a+\int^b_af(t)dB(t)$, expanding the square gives
$\frac{1}{2}(X_b^2-X_a^2)=X_a\int^b_af(t)dB(t)+\frac{1}{2}(\int^b_af(t)dB(t))^2$.
On the other hand:
$\int^b_af(t)X_tdB(t)=\int^b_af(t)X_adB(t)+\int^b_af(t)(\int^t_af(s)dB(s))dB(t)$.
So it remains to show that:
$\int^b_af(t)(\int^t_af(s)dB(s))dB(t)=\frac{1}{2}(\int^b_af(t)dB(t))^2-\frac{1}{2}\int^b_af(t)^2dt$
I have only learned the basic definition of the stochastic integral. Are there any suggestions or hints? Also, I am wondering why $f$ should be deterministic. What would go wrong if $f$ were adapted and in $L^2$? Thank you.
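Before proving the remaining identity, one can gain confidence in it with a discretized sanity check. The sketch below (my own, not from the question, with the hypothetical choices $f(t)=t$, $[a,b]=[0,1]$, $X_a=1$) compares both sides pathwise using left-endpoint Itô sums:

```python
import numpy as np

# Discretized check of the identity
#   int_a^b f X dB = (X_b^2 - X_a^2 - int_a^b f^2 dt) / 2
# for the hypothetical choice f(t) = t on [a, b] = [0, 1].
rng = np.random.default_rng(0)
a, b, n = 0.0, 1.0, 100_000
dt = (b - a) / n
t = a + dt * np.arange(n)           # left endpoints of the subintervals
f = t                               # deterministic, in L^2[0, 1]
dB = rng.normal(0.0, np.sqrt(dt), n)

X0 = 1.0                            # arbitrary starting value X_a
X = X0 + np.concatenate(([0.0], np.cumsum(f * dB)))  # X at the grid points
Xb = X[-1]

lhs = np.sum(f * X[:-1] * dB)       # Ito sum: evaluate X at left endpoints
rhs = 0.5 * (Xb**2 - X0**2 - np.sum(f**2) * dt)

print(abs(lhs - rhs))               # small; the gap is the fluctuation of
                                    # sum f^2 (dB^2 - dt), of order sqrt(dt)
```

The two sides agree up to the difference between the discrete quadratic variation $\sum f_k^2\,\Delta B_k^2$ and its mean $\sum f_k^2\,\Delta t$, which vanishes as the mesh goes to $0$.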
Write $f=\sum_n f_n$, where the $f_n$ are indicator functions of disjoint intervals; for such $f$ the identity holds. Then approximate a general $f$ by such simple functions in the $L^2$ sense and pass to the limit. – 2017-02-14
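To make the simple-function step concrete (my notation, not from the comment): for a single constant block $f=c\,\mathbf{1}_{[a,b)}$, the remaining identity reduces to the classical computation $\int_a^b B\,dB=\frac{1}{2}(B_b-B_a)^2-\frac{1}{2}(b-a)$:

$$\int^b_af(t)\Big(\int^t_af(s)dB(s)\Big)dB(t)=c^2\int^b_a(B_t-B_a)\,dB(t)=\tfrac{1}{2}c^2(B_b-B_a)^2-\tfrac{1}{2}c^2(b-a),$$

which equals $\frac{1}{2}(\int^b_af(t)dB(t))^2-\frac{1}{2}\int^b_af(t)^2dt$ for this $f$. For a sum over several disjoint intervals, expanding $(\sum_n c_n\Delta B_n)^2$ produces these per-interval terms plus cross terms $c_mc_n\Delta B_m\Delta B_n$, which are exactly what the iterated integral picks up from earlier intervals.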