
How can I find the conditional distribution of $W(t/2)$ given $W(t)=x$, where $W(t)$ is a Wiener process?

This was a problem on my exam and I couldn't think of how to start.

Any help is appreciated. Thanks in advance.

  • What is stopping you in the usual approach? (2012-09-02)
  • $W(t)=W(t/2)+(W(t)-W(t/2))$, where the two summands are independent and normally distributed with mean $0$ and variance $t/2$. Does that help? (2012-09-02)
  • @HaraldHanche-Olsen Not sure this is the most direct way. (2012-09-03)
  • @did: I could have added that when $X$ and $Y$ are independent normally distributed variables with mean $0$ and the same variance, then $X+Y$ and $X-Y$ are normal and independent as well. If you assume this known, the desired result should be right around the corner. (2012-09-04)
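The sum/difference independence fact mentioned in the comments is easy to check numerically. A minimal sketch, assuming `numpy` is available (the sample size, seed, and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 200_000, 2.0

# X = W(t/2) and Y = W(t) - W(t/2) are i.i.d. N(0, t/2).
x = rng.normal(0.0, np.sqrt(t / 2), n)
y = rng.normal(0.0, np.sqrt(t / 2), n)

# X + Y and X - Y are jointly Gaussian with zero covariance,
# hence independent; each has variance t.
print(np.cov(x + y, x - y)[0, 1])    # ≈ 0
print(np.var(x + y), np.var(x - y))  # both ≈ t = 2
```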

2 Answers


The joint distribution of $(W(t/2), W(t))$ is bivariate normal with zero means and $$ \operatorname{Var}(W(t/2)) = \frac{t}{2}, \quad \operatorname{Var}(W(t)) = t, \quad \operatorname{Cov}(W(t/2),W(t)) = \frac{t}{2}, $$ which translates into $\rho = \frac{\sqrt{2}}{2}$, $\sigma_1^2 = \frac{t}{2}$, $\sigma_2^2 = t$. Now, the conditional distribution of a bivariate normal is well known: $$ W(t/2)\mid W(t)=x \;\sim\; \mathcal{N}\left(\frac{x}{2}, \frac{t}{4} \right), $$ where the second parameter is the variance. Alternatively, you could use the fact that the Wiener process conditioned on $W(t)=x$ is the Brownian bridge process.
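This conditional law can be sanity-checked by simulation: since $(W(t/2), W(t))$ is jointly Gaussian, the regression slope of $W(t/2)$ on $W(t)$ should be $1/2$ and the residual variance $t/4$. A sketch assuming `numpy` (the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 200_000, 2.0

# Build W(t/2) and W(t) from two independent N(0, t/2) increments.
w_half = rng.normal(0.0, np.sqrt(t / 2), n)
w_full = w_half + rng.normal(0.0, np.sqrt(t / 2), n)

# Regression slope of W(t/2) on W(t): Cov/Var = (t/2)/t = 1/2,
# so E[W(t/2) | W(t) = x] = x/2.
slope = np.cov(w_half, w_full)[0, 1] / np.var(w_full)

# The residual W(t/2) - W(t)/2 is independent of W(t), with variance t/4.
resid = w_half - 0.5 * w_full
print(slope)          # ≈ 0.5
print(np.var(resid))  # ≈ t/4 = 0.5
```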

  • Replace the variance $t/2$ by $t/4$ in the final formula. (2012-09-03)
  • As an aside, note that this is how Paul Lévy built Brownian motion on $[0,1]$, using successive piecewise-linear approximations $X_n$: first choose $W(1)$ centered normal with variance $1$; this yields $X_1(t)=tW(1)$ for every $t$ in $[0,1]$. Second, choose $W(1/2)-X_1(1/2)$ independently centered normal with variance $1/4$; this yields $X_2$ linear on $[0,1/2]$ and $[1/2,1]$ with $X_2(t)=W(t)$ for $t=0$, $t=1/2$, $t=1$. Repeat at $1/4$ and $3/4$ using variances $1/8$, and so on. Then $X_n(t)$ converges almost surely to some $W(t)$, and the path $t \mapsto W(t)$ is indeed a Brownian motion. (2012-09-03)
  • @did Thank you for catching the typo. Using $\mathcal{N}\left(\mu, \sigma\right)$ notation, I should have written $\sqrt{t/4}$. Incidentally, the book of Yuval and Peres, "Brownian Motion" (available from one of the authors' [homepage](http://research.microsoft.com/en-us/um/people/peres/)), constructs Brownian motion exactly this way. (2012-09-03)
  • Interesting. I probably knew it then forgot it. (But Yuval and Peres are one and the same individual...) (2012-09-03)
  • @did An embarrassing slip. I should have said the book of Mörters and Peres. Sorry about that. (2012-09-03)
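The Lévy construction described in the comments can be sketched in code: each midpoint is set to the average of its interval's endpoints plus an independent centered normal with variance $L/4$, where $L$ is the interval length ($1/4$ at the first level, $1/8$ at the second, and so on). A rough illustration, assuming `numpy` (grid depth and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Levy's construction of Brownian motion on [0, 1]: start from
# W(0) = 0 and W(1) ~ N(0, 1), then repeatedly set each midpoint to
# the average of its endpoints plus an independent N(0, L/4) term,
# where L is the length of the interval being split.
levels = 10
n = 2 ** levels            # subintervals at the finest level
w = np.zeros(n + 1)
w[n] = rng.normal(0.0, 1.0)

step = n
while step > 1:
    L = step / n           # current interval length
    half = step // 2
    for left in range(0, n, step):
        mid = left + half
        w[mid] = 0.5 * (w[left] + w[left + step]) \
                 + rng.normal(0.0, np.sqrt(L / 4))
    step = half

# Sanity check: on the dyadic grid the law is exact, so the n
# increments are i.i.d. N(0, 1/n).
inc = np.diff(w)
print(np.var(inc) * n)     # ≈ 1
```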

Here is the trick. Find $\alpha$ such that $W_{t/2}+\alpha W_t$ and $W_t$ are independent, i.e. (since $W$ is a Gaussian process with zero mean) such that $$ \mathbb{E} \left[( W_{t/2}+ \alpha W_t) W_t\right] = 0. $$ Using $\mathbb{E}\left[ W_a W_b \right] = a \wedge b$, you find $\alpha = -1/2$. Then

$$ \mathbb{E} \left[ W_{t/2} \,\big|\, W_t = x \right] = \mathbb{E} \left[ W_{t/2} - \frac{1}{2} W_t + \frac{1}{2} W_t \,\big|\, W_t = x \right] = \underbrace{\mathbb{E} \left[ W_{t/2} - \frac{1}{2} W_t \right] }_{=0}+ \frac{1}{2} x = \frac{1}{2} x$$

More generally, the same trick gives the answer to a famous interview question: for $t \le s$, $$ \mathbb{E} \left[ W_t \,\big|\, W_s \right] = \frac{t}{s} W_s. $$
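Both identities are easy to verify numerically: with $\alpha = -t/s$ the combination $W_t + \alpha W_s$ is uncorrelated with $W_s$, and the regression slope of $W_t$ on $W_s$ is $t/s$. A sketch assuming `numpy` (the particular $t$, $s$, seed, and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
t, s = 0.7, 2.0                            # t <= s

# Sample (W_t, W_s) via independent increments.
w_t = rng.normal(0.0, np.sqrt(t), n)
w_s = w_t + rng.normal(0.0, np.sqrt(s - t), n)

# E[(W_t + alpha W_s) W_s] = t + alpha*s, which vanishes at alpha = -t/s.
alpha = -t / s
print(np.mean((w_t + alpha * w_s) * w_s))  # ≈ 0

# Hence E[W_t | W_s] = (t/s) W_s: the regression slope is t/s.
slope = np.cov(w_t, w_s)[0, 1] / np.var(w_s)
print(slope)                               # ≈ t/s = 0.35
```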

  • Conditional expectation $\ne$ conditional distribution. You provide the former, which is not enough to deduce the latter. (2012-09-09)
  • Oh, misread, sorry. (2012-09-10)