4

A mean-reverting geometric Brownian motion is defined by the system of equations

$dX_t = \mu(X_t, \overline{X}_t)\, X_t\, dt + \sigma X_t\, dW_t$ and

$d\overline{X}_t = \lambda(X_t − \overline{X}_t)\, dt \, ,$

so that $\overline{X}_t$ is an exponentially weighted running average of $X_t$.

Suppose we want to calculate $f(x,\overline{x},t) = \mathbb{E} \left[V(X_T) \middle\vert \{ X_t = x, \overline{X}_t = \overline{x} \} \right] \, .$

Write the partial differential equation satisfied by $f$.
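To make the setup concrete, here is a minimal Euler–Maruyama simulation of the pair $(X_t, \overline{X}_t)$, using the running-average convention $d\overline{X}_t = \lambda(X_t - \overline{X}_t)\,dt$. The particular drift function `mu` below is a hypothetical choice (the question leaves $\mu$ unspecified), picked so that $X$ reverts toward $\overline{X}$:

```python
import numpy as np

def simulate(x0, xbar0, sigma, lam, T, n_steps, rng, mu=None):
    """One Euler-Maruyama path of the pair; returns terminal (X_T, Xbar_T)."""
    if mu is None:
        # hypothetical example drift: pulls X back toward the reference Xbar
        mu = lambda x, xbar: 0.5 * (xbar - x) / x
    dt = T / n_steps
    x, xbar = x0, xbar0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment, var = dt
        # tuple assignment: both updates use the pre-step values of x, xbar
        x, xbar = (x + mu(x, xbar) * x * dt + sigma * x * dw,
                   xbar + lam * (x - xbar) * dt)
    return x, xbar

rng = np.random.default_rng(0)
xT, xbarT = simulate(1.0, 1.0, sigma=0.2, lam=1.0, T=1.0, n_steps=500, rng=rng)
```

A Monte Carlo average of $V(X_T)$ over many such paths is then an estimate of $f(x,\overline{x},t)$ at the starting point.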

  • 1
Thanks for down-voting my question :-) Don't worry -- I will answer my own question in a few days. 2012-12-11

2 Answers

5

First, let's work out a standard problem, where $ g(w, t) = \mathbb{E} \left[ V(W_T) \middle \vert W_t = w \right] \, .$ Expand $dg$ with Itô's lemma and integrate from $t$ to $T$: $ V(W_T) - g(W_t, t) = \int_t^T dg(W_s, s) = \int_t^T \left( \partial_s g \, ds + \partial_w g \, dW_s + \frac{1}{2} \partial_w^2 g \, ds \right) \, . $ Note that $\mathbb{E}\left[ \int_t^T \partial_w g \, dW_s \,\middle\vert\, \mathcal{F}_t \right] = 0$, since the Itô integral is a martingale: each increment $dW_s$ lies in the future of $\mathcal{F}_t$ and has mean zero. Next, take the expectation of both sides conditional on the filtration $\mathcal{F}_t$: $ \mathbb{E} \left[ V(W_T) - g(W_t, t) \middle \vert \mathcal{F}_t \right] = \mathbb{E} \left[ \int_t^T \left( \partial_s g + \frac{1}{2} \partial_w^2 g \right) ds \middle \vert \mathcal{F}_t \right] $

$ g(W_t, t) - g(W_t, t) = 0 = \mathbb{E} \left[ \int_t^T \left( \partial_s g + \frac{1}{2} \partial_w^2 g \right) ds \middle \vert \mathcal{F}_t \right] \, . $ Since this holds for every $t$ and every starting point $w$, the integrand must vanish identically; that is, $ \partial_t g + \frac{1}{2} \partial_w^2 g = 0 \, , $ which is the backward PDE for $g(w,t)$, with terminal condition $g(w,T) = V(w)$.
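A quick numeric sanity check of this backward PDE (not part of the original post): for the example payoff $V(w) = w^2$, the conditional expectation has the closed form $g(w,t) = w^2 + (T - t)$, which should satisfy $\partial_t g + \frac{1}{2}\partial_w^2 g = 0$ and agree with a Monte Carlo estimate:

```python
import numpy as np

T = 1.0

def g(w, t):
    # closed form for V(w) = w**2: E[W_T**2 | W_t = w] = w**2 + (T - t)
    return w**2 + (T - t)

# central finite differences of the PDE terms at an arbitrary point
w0, t0, h = 0.7, 0.3, 1e-4
d_t = (g(w0, t0 + h) - g(w0, t0 - h)) / (2 * h)
d_ww = (g(w0 + h, t0) - 2 * g(w0, t0) + g(w0 - h, t0)) / h**2
residual = d_t + 0.5 * d_ww  # should be ~0 up to floating-point error

# Monte Carlo cross-check: W_T | W_t = w0 is N(w0, T - t0)
rng = np.random.default_rng(0)
mc = np.mean((w0 + np.sqrt(T - t0) * rng.normal(size=400_000)) ** 2)
```

Here `residual` is zero to machine precision and `mc` matches `g(w0, t0)` to Monte Carlo accuracy.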

Next, let's use the same approach for $f(x,\overline{x},t)$. Again, expand $df$ with Itô's lemma and integrate from $t$ to $T$: $ V(X_T) - f(X_t, \overline{X}_t, t) = \int_t^T df(X_s, \overline{X}_s, s) = \int_t^T \left( \partial_s f \, ds + \partial_x f \, dX_s + \frac{1}{2} \partial_x^2 f \, dX_s^2 + \partial_{\overline{x}} f \, d\overline{X}_s + \frac{1}{2} \partial_{\overline{x}}^2 f \, d\overline{X}_s^2 + \partial_x \partial_{\overline{x}} f \, dX_s \, d\overline{X}_s \right) \, . $ Note that $ dX_s^2 = \sigma^2 X_s^2 \, ds + \mathcal{O}(ds^{3/2})$, while $d\overline{X}_s^2 = \lambda^2 \left(X_s - \overline{X}_s \right)^2 ds^2 = \mathcal{O}(ds^2) $ and the cross term $dX_s \, d\overline{X}_s = \mathcal{O}(ds^{3/2})$, so only the $dX_s^2$ contribution survives at order $ds$.
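These orders can be illustrated numerically (an addition, not in the original answer): along a simulated path, the realized quadratic variation of $X$ converges to $\int \sigma^2 X_s^2\, ds$, while that of $\overline{X}$ is $\mathcal{O}(ds)$ smaller and vanishes in the limit. A constant $\mu$ is assumed here purely for the illustration:

```python
import numpy as np

# One Euler path of the pair; constant mu is an assumption for this demo.
rng = np.random.default_rng(1)
sigma, lam, mu, T, n = 0.3, 2.0, 0.05, 1.0, 200_000
dt = T / n
x = np.empty(n + 1)
xbar = np.empty(n + 1)
x[0] = xbar[0] = 1.0
dw = rng.normal(0.0, np.sqrt(dt), size=n)
for i in range(n):
    x[i + 1] = x[i] + mu * x[i] * dt + sigma * x[i] * dw[i]
    xbar[i + 1] = xbar[i] + lam * (x[i] - xbar[i]) * dt

qv_x = np.sum(np.diff(x) ** 2)                  # realized QV of X
qv_xbar = np.sum(np.diff(xbar) ** 2)            # realized QV of Xbar
predicted = np.sum(sigma**2 * x[:-1]**2 * dt)   # int sigma^2 X^2 ds
```

With this step size, `qv_x` agrees with `predicted` to well under a percent, while `qv_xbar` is smaller by several orders of magnitude, consistent with $d\overline{X}_s^2 = \mathcal{O}(ds^2)$.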

Take the expectation of both sides conditional on the filtration $\mathcal{F}_t$: $ \mathbb{E} \left[ V(X_T) - f(X_t, \overline{X}_t, t) \middle \vert \mathcal{F}_t \right] = 0 = \mathbb{E} \left[ \int_t^T \left( \partial_t f + \partial_x f \cdot \mu(X_s, \overline{X}_s) X_s + \frac{1}{2} \partial_x^2 f \cdot \sigma^2 X_s^2 + \partial_{\overline{x}} f \cdot \lambda (X_s - \overline{X}_s) \right) ds \middle \vert \mathcal{F}_t \right] \, .$

Hence, writing the state variables as $x$ and $\overline{x}$, the backward PDE is $ \partial_t f + \mu(x, \overline{x})\, x \,\partial_x f + \frac{1}{2} \sigma^2 x^2 \,\partial_x^2 f + \lambda (x - \overline{x})\, \partial_{\overline{x}} f = 0 \, ,$ with terminal condition $f(x, \overline{x}, T) = V(x)$.
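As a sanity check (not in the original answer), pick a case with a known solution: for constant $\mu$ and payoff $V(X) = X$, the $X$-dynamics are plain geometric Brownian motion, so $f(x, \overline{x}, t) = x\, e^{\mu (T - t)}$, independent of $\overline{x}$. The constant $\mu$ and linear payoff are assumptions made so a closed form exists; finite differences then confirm the PDE residual is numerically zero:

```python
import numpy as np

mu, sigma, lam, T = 0.1, 0.25, 1.5, 1.0

def f(x, xbar, t):
    # closed form when mu is constant and V(X) = X (GBM martingale-style mean)
    return x * np.exp(mu * (T - t))

# central finite differences of each PDE term at an arbitrary point
x0, xb0, t0, h = 1.3, 0.9, 0.4, 1e-4
d_t = (f(x0, xb0, t0 + h) - f(x0, xb0, t0 - h)) / (2 * h)
d_x = (f(x0 + h, xb0, t0) - f(x0 - h, xb0, t0)) / (2 * h)
d_xx = (f(x0 + h, xb0, t0) - 2 * f(x0, xb0, t0) + f(x0 - h, xb0, t0)) / h**2
d_xb = (f(x0, xb0 + h, t0) - f(x0, xb0 - h, t0)) / (2 * h)

residual = (d_t + mu * x0 * d_x + 0.5 * sigma**2 * x0**2 * d_xx
            + lam * (x0 - xb0) * d_xb)  # should be ~0
```

The $-\mu x e^{\mu(T-t)}$ from the time derivative cancels the $+\mu x e^{\mu(T-t)}$ from the drift term, and the second-derivative and $\overline{x}$-terms vanish because $f$ is linear in $x$ and constant in $\overline{x}$.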

1

If $f(x,\bar{x},t) = \mathbb{E}[V(X_T) \,|\, \{ X_t = x , \bar{X}_t = \bar{x} \}]$, then let us apply Itô's lemma to $V(X_T)$ itself.

$ dV = \frac{\partial V}{\partial x}dx + \frac{\partial V}{\partial t}dt + \frac{1}{2} \frac{\partial^2 V}{\partial x^2}(dx)^2 $

We know that $f(x,\bar{x},t) = \mathbb{E}[V(X_T) \,|\, \{ X_t = x , \bar{X}_t = \bar{x} \}]$.

Therefore, $f(x,\bar{x},t)$ must satisfy the PDE $\mathbb{E}\left[\frac{\partial V}{\partial x}dx + \frac{\partial V}{\partial t}dt + \frac{1}{2} \frac{\partial^2 V}{\partial x^2}(dx)^2 \right] = 0$, which looks something like the Feynman–Kac formula:

http://en.wikipedia.org/wiki/Feynman%E2%80%93Kac_formula

  • 1
I don't think you can write a PDE on $V$, since $V = V(x)$ only. 2012-12-11