
Let $X_t$ be a solution to the SDE $dX_t=-aX_t \; dt+\sigma \; dW_t$, with $a>0$, $\sigma>0$, and $X_0$ constant, where $W_t$ is a standard Brownian motion. What is the joint distribution of $(X_t, \int_0^t X_s \; ds)$?

I have calculated the solution to the SDE: $X_t=e^{-at}[X_0+\int_0^t \sigma e^{as} \; dW_s]$. I also need $\int_0^t X_s \; ds = \int_0^t e^{-as}[X_0+\int_0^s \sigma e^{au} \; dW_u] \; ds$, but I am not sure how to compute this integral, nor how to find the distribution of $(X_t, \int_0^t X_s \; ds)$.

Thanks for your help

I have also shown that $X_t$ is Gaussian, but I am not sure how to find the distribution of the second component of the vector.

  • We are doing some practice questions together. (2011-10-22)

2 Answers


Let $Y_t = \int_0^t X_s \mathrm{d} s$.

The process $Z_t = (X_t, Y_t)$ is also an Ito process, with $\mathrm{d} Z_t = X_t (-a, 1) \mathrm{d} t + (\sigma, 0) \mathrm{d} W_t$ and the initial condition $Z_0 = (x_0, 0)$.
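For a quick empirical look at this two-dimensional Itô process, here is a minimal Euler-Maruyama sketch in Python; the parameter values ($a$, $\sigma$, $x_0$, $t$), the discretization, and the use of numpy are arbitrary illustrative choices, not part of the answer.

```python
# Minimal Euler-Maruyama sketch for Z_t = (X_t, Y_t); all parameters are
# arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
a, sigma, x0 = 1.0, 0.5, 1.0
t, n_steps, n_paths = 2.0, 2000, 50_000
dt = t / n_steps

X = np.full(n_paths, x0)
Y = np.zeros(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    Y += X * dt                    # dY_t = X_t dt (uses X before its update)
    X += -a * X * dt + sigma * dW  # dX_t = -a X_t dt + sigma dW_t

print("sample mean of (X_t, Y_t):", X.mean(), Y.mean())
print("sample covariance matrix:\n", np.cov(X, Y))
```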

The process $Z_t$ is Gaussian: you determined $X_t$ to be Gaussian, and $Y_t$ is Gaussian as a linear functional of $X_t$. Thus, to determine the joint distribution at time $t$, one needs to compute $\mathbb{E}(Z_t)$ and $\mathbb{E}(Z_t \otimes Z_t)$.

This can be done using Itô's lemma. Let $\tilde{X}_t = \mathrm{e}^{a t} X_t$ and $\tilde{Z}_t = ( \tilde{X}_t, Y_t )$. Then $\mathrm{d} \tilde{Z}_t = (0, X_t) \mathrm{d} t + ( \sigma \mathrm{e}^{a t}, 0 ) \mathrm{d} W_t$, so that $\mathbb{E}(\tilde{Z}_t) = \mathbb{E}(\tilde{Z}_0) + \int_0^t (0, \mathbb{E}(X_s)) \mathrm{d} s$.

In particular, the process $\tilde{X}_t$ is a martingale, hence $\mathbb{E}( \tilde{X}_t ) = \mathbb{E}( \tilde{X}_0 ) = \mathbb{E}( X_0 ) = x_0$. Thus $\mathbb{E}(X_t) = x_0 \mathrm{e}^{-a t}$, and, integrating the drift of $Y_t$, $\mathbb{E}(Y_t) = \int_0^t x_0 \mathrm{e}^{-a s} \mathrm{d} s = \frac{x_0}{a} \left(1 - \mathrm{e}^{-a t} \right)$.

By Itô's lemma again, $ \mathrm{d}( \tilde{X}_t^2, \tilde{X}_t Y_t, Y_t^2 ) = ( \sigma^2 \mathrm{e}^{2 a t}, \mathrm{e}^{a t} X_t^2, 2 X_t Y_t ) \mathrm{d} t + \sigma ( 2 \mathrm{e}^{2 a t} X_t, \mathrm{e}^{a t} Y_t, 0 ) \mathrm{d} W_t. $

Thus $ ( \mathbb{E}(\tilde{X}_t^2), \mathbb{E}(\tilde{X}_t Y_t), \mathbb{E}(Y_t^2) ) = ( x_0^2, 0, 0 ) + \int_0^t ( \sigma^2 \mathrm{e}^{2 a s}, \mathrm{e}^{a s} \mathbb{E}(X_s^2), 2 \mathbb{E}( X_s Y_s) ) \mathrm{d} s. $

This results in $ \begin{eqnarray} \mathbb{E}\left(X_t^2\right) &=& x_0^2 \mathrm{e}^{-2 a t} + \frac{\sigma^2}{2} \frac{1 - \mathrm{e}^{-2 a t}}{a} \\ \mathbb{E}\left(X_t Y_t\right) &=& x_0^2 \mathrm{e}^{-a t} \frac{ 1-\mathrm{e}^{-a t}}{a} + \frac{\sigma^2}{2} \left( \frac{ 1 - \mathrm{e}^{-a t}}{a} \right)^2 \\ \mathbb{E}\left(Y_t^2\right) &=& x_0^2 \left( \frac{ 1 - \mathrm{e}^{-a t}}{a} \right)^2 + \frac{\sigma^2}{2 a^3} \left( 2 a t - 4 \left( 1 - \mathrm{e}^{-a t} \right) + \left( 1 - \mathrm{e}^{-2 a t} \right) \right) \end{eqnarray} $
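These moment integrals can be verified symbolically. The following sympy sketch (sympy is an outside tool, not something the answer relies on; the symbols mirror the answer's notation) checks the cross-moment $\mathbb{E}(X_t Y_t)$ against the stated formula:

```python
# Symbolic check of E[X_t Y_t] = e^{-a t} * int_0^t e^{a s} E[X_s^2] ds
# against the closed form stated above.
import sympy as sp

a, s, t, x0, sigma = sp.symbols('a s t x0 sigma', positive=True)

EX2 = x0**2 * sp.exp(-2*a*s) + sigma**2 * (1 - sp.exp(-2*a*s)) / (2*a)
EXY = sp.exp(-a*t) * sp.integrate(sp.exp(a*s) * EX2, (s, 0, t))
EXY_claim = (x0**2 * sp.exp(-a*t) * (1 - sp.exp(-a*t)) / a
             + sigma**2 / 2 * ((1 - sp.exp(-a*t)) / a)**2)
print(sp.simplify(EXY - EXY_claim))  # prints 0 if the formula matches
```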

Converting these second moments into central moments, we get that the joint distribution of $Z_t$ is the bivariate normal distribution with mean vector $ (m_1(t), m_2(t)) = x_0 \left( \mathrm{e}^{-a t}, \frac{1- \mathrm{e}^{-a t}}{a} \right) $ and covariance matrix:

$ \Sigma(t) = \sigma^2 \left(\begin{array}{cc} \frac{1-\mathrm{e}^{-2 a t}}{2 a} & \frac{1}{2} \left( \frac{1-\mathrm{e}^{-a t}}{a} \right)^2 \\ \frac{1}{2} \left( \frac{1-\mathrm{e}^{-a t}}{a} \right)^2 & \frac{4 \mathrm{e}^{-a t}+2 a t-3-\mathrm{e}^{-2 a t}}{2 a^3} \end{array} \right) $
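For comparison with the Monte Carlo sketch above, one can evaluate this closed form at the same illustrative parameters (again just a sketch, not part of the answer):

```python
# Closed-form mean vector and covariance matrix Sigma(t) at the same
# illustrative parameters as the simulation sketch above.
import numpy as np

a, sigma, x0, t = 1.0, 0.5, 1.0, 2.0
e1, e2 = np.exp(-a * t), np.exp(-2 * a * t)
mean = np.array([x0 * e1, x0 * (1 - e1) / a])
Sigma = sigma**2 * np.array([
    [(1 - e2) / (2 * a),       0.5 * ((1 - e1) / a)**2],
    [0.5 * ((1 - e1) / a)**2,  (4 * e1 + 2 * a * t - 3 - e2) / (2 * a**3)],
])
print("theoretical mean:", mean)
print("theoretical covariance:\n", Sigma)
```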


Added: Notice that as $a \to 0$ the covariance matrix approaches the result from this related question: $ \Sigma(t) \sim \sigma^2 \left( \begin{array}{cc} t(1 - a t + o(a)) & \frac{1}{2}t^2 \left(1-a t + o(a)\right) \\ \frac{1}{2}t^2 \left(1-a t + o(a)\right) & t^3 \left( \frac{1}{3} - \frac{a t}{4} + o(a) \right) \end{array} \right) $
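The small-$a$ expansion of the $Y$-variance entry can likewise be checked with a symbolic series (again an outside sympy sketch, not part of the answer):

```python
# Series expansion in a of Var(Y_t)/sigma^2; expect t**3/3 - a*t**4/4 + O(a**2).
import sympy as sp

a, t = sp.symbols('a t', positive=True)
V = (4*sp.exp(-a*t) + 2*a*t - 3 - sp.exp(-2*a*t)) / (2*a**3)
print(sp.series(V, a, 0, 2))
```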

  • What I've done is tedious; I am sure there is a better way, through either the forward or the backward Kolmogorov equation (FKE or BKE), and I would be interested in seeing it. (2011-10-22)

For every $t\geqslant0$, let $Y_t=\int\limits_0^tX_s\mathrm ds$. As explained by @Sasha, $Z_t=(X_t,Y_t)$ is Gaussian, hence the values $ x(t)=\mathrm E(X_t),\quad y(t)=\mathrm E(Y_t),\quad u(t)=\mathrm E(X_t^2),\quad v(t)=\mathrm E(Y_t^2),\quad w(t)=\mathrm E(X_tY_t) $ fully determine its distribution. To compute these, a convenient method is to consider directly the ordinary differential equations solved by the expectations of the solutions of stochastic differential equations, as follows.

First, $(X_t)$ and $(Y_t)$ solve the system $X_0=x_0$, $Y_0=0$, and $ \mathrm dX_t=-aX_t\mathrm dt+\sigma\mathrm dW_t,\quad \mathrm dY_t=X_t\mathrm dt. $ Since $(W_t)$ is a martingale, taking expectations yields $x'(t)=-ax(t)$ and $y'(t)=x(t)$. Since one knows that $x(0)=x_0$ and $y(0)=0$, this linear differential system determines $x(t)$ and $y(t)$ for every $t\geqslant0$.
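This mean system is simple enough to solve by hand, but it can also be solved symbolically as a hedge against slips; a short sympy sketch (an outside tool, using the answer's notation):

```python
# Solve x' = -a x, x(0) = x0; then y(t) = int_0^t x(s) ds since y' = x, y(0) = 0.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
a, x0 = sp.symbols('a x0', positive=True)
x = sp.Function('x')

xsol = sp.dsolve(sp.Eq(x(t).diff(t), -a*x(t)), x(t), ics={x(0): x0}).rhs
ysol = sp.integrate(xsol.subs(t, s), (s, 0, t))
print(xsol)               # x0*exp(-a*t)
print(sp.simplify(ysol))  # x0*(1 - exp(-a*t))/a
```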

Turning to the second-order quantities, let us first recall a general form of Itô's formula: for every $n\geqslant1$, every regular enough real-valued function $\varphi$ defined on (an open set of) $\mathbb R^n$, and every $n$-dimensional diffusion $(\xi_t)$, $ \mathrm d\varphi(\xi_t)=\mathrm{grad}\, \varphi(\xi_t)\cdot\mathrm d\xi_t+\frac12H(\varphi)(\xi_t)\cdot\mathrm d\langle\xi,\xi\rangle_t, $ where $\mathrm{grad}\, \varphi(\xi)$ is the gradient of $\varphi$ at $\xi=(\xi^i)_{1\leqslant i\leqslant n}$, hence $ \mathrm{grad}\, \varphi(\xi_t)\cdot\mathrm d\xi_t=\sum\limits_{i=1}^n\partial_i\varphi(\xi_t)\,\mathrm d\xi^i_t, $ and $H(\varphi)(\xi)$ is the Hessian matrix of $\varphi$ at $\xi$, hence $ H(\varphi)(\xi_t)\cdot\mathrm d\langle\xi,\xi\rangle_t=\sum\limits_{i=1}^n\sum\limits_{j=1}^n\partial^2_{ij}\varphi(\xi_t)\,\mathrm d\langle\xi^i,\xi^j\rangle_t. $

First application: for $\varphi(\xi)=\xi^2$ and $\xi_t=X_t$, $\mathrm d\langle X,X\rangle_t=\sigma^2\mathrm dt$ yields $ \mathrm d(X_t^2)=2X_t\mathrm dX_t+\mathrm d\langle X,X\rangle_t=-2aX_t^2\mathrm dt+\mathrm d\text{(MG)}_t+\sigma^2\mathrm dt, $ where $\mathrm d\text{(MG)}_t$ is a martingale term whose value is irrelevant here. Hence $u'(t)=-2au(t)+\sigma^2$.

Second application: for $\varphi(\xi)=\xi^2$ and $\xi_t=Y_t$, $\mathrm d\langle Y,Y\rangle_t=0$ yields $ \mathrm d(Y_t^2)=2Y_t\mathrm dY_t=2Y_tX_t\mathrm dt, $ hence $v'(t)=2w(t)$.

Third application: for $\varphi(\xi^1,\xi^2)=\xi^1\xi^2$ and $\xi_t=(X_t,Y_t)$, $\mathrm d\langle X,Y\rangle_t=0$ yields $ \mathrm d(X_tY_t)=X_t\mathrm dY_t+Y_t\mathrm dX_t=X_t^2\mathrm dt-aX_tY_t\mathrm dt+\mathrm d\text{(MG)}_t, $ hence $w'(t)=u(t)-aw(t)$.

The three differential equations above form a first-order linear system in $(u(t),v(t),w(t))$. Since $(u(0),v(0),w(0))=(x_0^2,0,0)$, the values of $u(t)=\mathrm E(X_t^2)$, $v(t)=\mathrm E(Y_t^2)$, and $w(t)=\mathrm E(X_tY_t)$ follow, for every $t\geqslant0$.
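As a numerical cross-check (an outside sketch, using scipy and illustrative parameter values), one can integrate this system and compare with the closed forms from the first answer:

```python
# Integrate the ODE system for (u, v, w) = (E[X^2], E[Y^2], E[XY]) and
# compare at t = T with the closed forms from the first answer.
import numpy as np
from scipy.integrate import solve_ivp

a, sigma, x0, T = 1.0, 0.5, 1.0, 2.0

def rhs(t, m):
    u, v, w = m
    return [-2*a*u + sigma**2, 2*w, u - a*w]

sol = solve_ivp(rhs, (0.0, T), [x0**2, 0.0, 0.0], rtol=1e-10, atol=1e-12)
u, v, w = sol.y[:, -1]

e1, e2 = np.exp(-a*T), np.exp(-2*a*T)
u_exact = x0**2*e2 + sigma**2*(1 - e2)/(2*a)
w_exact = x0**2*e1*(1 - e1)/a + 0.5*sigma**2*((1 - e1)/a)**2
v_exact = x0**2*((1 - e1)/a)**2 \
          + sigma**2/(2*a**3)*(2*a*T - 4*(1 - e1) + (1 - e2))
print(np.allclose([u, v, w], [u_exact, v_exact, w_exact]))  # True
```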

To answer the OP's question directly, one can also keep in mind the expectations $x(t)=\mathrm E(X_t)$ and $y(t)=\mathrm E(Y_t)$ and write the differential system satisfied by the elements of the covariance matrix $C(t)=\mathrm E((X_t,Y_t)^T(X_t,Y_t))-(\mathrm E(X_t),\mathrm E(Y_t))^T(\mathrm E(X_t),\mathrm E(Y_t))$, namely $ U(t)=\mathrm{var}(X_t)=u(t)-x(t)^2,\quad V(t)=\mathrm{var}(Y_t)=v(t)-y(t)^2, $ and $ W(t)=\mathrm{cov}(X_t,Y_t)=w(t)-x(t)y(t). $ One gets $U'(t)=-2aU(t)+\sigma^2$, $V'(t)=2W(t)$, and $W'(t)=U(t)-aW(t)$. Together with the initial condition $U(0)=V(0)=W(0)=0$, this fully determines the covariance matrix $C(t)=\begin{pmatrix}U(t) & W(t)\\ W(t) & V(t)\end{pmatrix}$ for every $t\geqslant0$.
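Since this system is triangular ($U$ closes on itself, then $W$ depends on $U$, then $V$ on $W$), it can also be solved sequentially; a sympy sketch of this (an outside tool, not part of the answer), recovering the entries of the matrix $\Sigma(t)$ displayed in the first answer:

```python
# Solve U' = -2aU + sigma^2, then W' = U - aW, then V' = 2W, all starting at 0;
# compare with the covariance entries of Sigma(t) from the first answer.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
a, sigma = sp.symbols('a sigma', positive=True)
U, W = sp.Function('U'), sp.Function('W')

Usol = sp.dsolve(sp.Eq(U(t).diff(t), -2*a*U(t) + sigma**2), U(t),
                 ics={U(0): 0}).rhs
Wsol = sp.dsolve(sp.Eq(W(t).diff(t), Usol - a*W(t)), W(t),
                 ics={W(0): 0}).rhs
Vsol = sp.integrate(2 * Wsol.subs(t, s), (s, 0, t))

print(sp.simplify(Usol - sigma**2*(1 - sp.exp(-2*a*t))/(2*a)))         # 0
print(sp.simplify(Wsol - sigma**2/2*((1 - sp.exp(-a*t))/a)**2))        # 0
print(sp.simplify(Vsol - sigma**2*(4*sp.exp(-a*t) + 2*a*t - 3
                                   - sp.exp(-2*a*t))/(2*a**3)))        # 0
```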