
Recently I've been studying Brownian Motion, Martingales, and Stochastic Calculus by Jean-François Le Gall, but I am stuck on this exercise (1.16, p. 15):

Consider sequences of random variables $(X_n)$ and $(Y_n)$ defined recursively by $$X_{n+1}=a_nX_n+\epsilon_{n+1}$$ and $$Y_n=cX_n+\eta_n$$

where $a_n>0$, $c>0$, and $\epsilon_n\sim N(0,\sigma^2)$, $\eta_n\sim N(0,\delta^2)$ are i.i.d. Also, assume that $(\epsilon_n)$ and $(\eta_n)$ are independent. Now define $$\hat{X}_{n/m}=E[X_n|Y_0,\dots,Y_m].$$ Show that for $n\geq 1$, $$\hat{X}_{n/n}=\hat{X}_{n/n-1}+\frac{E[X_nZ_n]}{E[Z_n^2]}Z_n,$$ where $Z_n:=Y_n-c\hat{X}_{n/n-1}$.

I guess the solution involves some kind of inductive argument, but I have no idea how to start. It would be nice if someone could offer hints and ideas. Thanks!

  • What is $Z_n$ in the last line? (2017-01-28)
  • @spaceisdarkgreen I have added the definition, thanks for your comment. (2017-01-28)
  • This question has been asked here before. Can't find it, though... (2017-01-28)
  • @zhoraster Really?! Then I will try to find it... But could you give me some ideas for this problem? (2017-01-28)
  • @Fuxuan, I remember that it is simple algebra. It's strange that I can't find it, as I remember having responded to that question. Maybe it was deleted. (2017-01-28)
  • I suspect that I started writing an answer there with the same hint I posted here, but I was outpaced and abandoned the incomplete answer. (2017-01-28)

2 Answers


Here are some hints.

When the random variables in question are square integrable (Gaussianity is not even needed for this), you may understand conditional expectation as orthogonal projection. Let $V_m = \operatorname{span} (Y_0,Y_1,\dots,Y_m)$. Then $\hat X_{n/m}$ is the orthogonal projection of $X_n$ onto $V_m$. Your claim reads $$ \hat X_{n/n} = \hat X_{n/n-1} + a Z_n = \operatorname{pr}_{V_{n-1}} \hat X_{n/n} + a Z_n. $$ We know that $$ \hat X_{n/n} = \operatorname{pr}_{V_{n-1}^{\vphantom{\perp}}} \hat X_{n/n} + \operatorname{pr}_{V_{n-1}^{\perp}} \hat X_{n/n}, $$ where $V_{n-1}^{\perp}$ is the orthogonal complement of $V_{n-1}$. So you need to prove that $a Z_n = \operatorname{pr}_{V_{n-1}^{\perp}} \hat X_{n/n}$. Since $\hat X_{n/n}\in V_n\supset V_{n-1}$, this exactly amounts to proving that

  1. $V_n = \operatorname{span}(V_{n-1}, Z_n)$;

  2. $Z_n \perp V_{n-1}$;

  3. $aZ_n = \operatorname{pr}_{Z_n} \hat X_{n/n}$.

Write a comment if you have trouble with any of these three points.

  • Thanks for your hint! I will accept your answer and write my solution beneath yours. (2017-01-28)

First note that we are working in the space of centered Gaussian random variables. Since $\operatorname{span}(Y_0,\dots,Y_m)$ is a closed subspace, $$\hat{X}_{n/m}=P[X_n|Y_0,\dots,Y_m]$$ is the orthogonal projection of $X_n$ onto $\operatorname{span}(Y_0,\dots,Y_m)$.

  1. $V_n = \operatorname{span}(V_{n-1}, Z_n)$: just linear algebra, since $Z_n = Y_n - c\hat{X}_{n/n-1}$ with $\hat{X}_{n/n-1} \in V_{n-1}$.
  2. For $i=0,\dots,n-1$, $$E[Y_iZ_n]=E[Y_iY_n]-cE[Y_iE[X_n|Y_0,\dots, Y_{n-1}]]=E[Y_iY_n]-cE[Y_iX_n]=E[Y_i\eta_n]=0.$$
  3. By the note above and points 1. and 2., $\hat{X}_{n/n}=\hat{X}_{n/n-1}+aZ_n$ for some scalar $a$. To identify $a$, multiply this identity by $Z_n$ and take expectations. By 2., $Z_n \perp V_{n-1}$, so $E[\hat{X}_{n/n-1}Z_n]=0$. Also, since $Z_n \in V_n$, $$E[\hat{X}_{n/n}Z_n]=E[E[X_nZ_n|Y_0,\dots,Y_n]]=E[X_nZ_n].$$ Thus, $$a=\frac{E[X_nZ_n]}{E[Z_n^2]}.$$
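As a sanity check, the identity can be verified with exact linear algebra rather than simulation: every $X_n$ and $Y_n$ is a linear combination of the independent Gaussians $X_0, (\epsilon_n), (\eta_n)$, so all covariances, and hence all projections, can be computed exactly from the coefficient vectors. A minimal sketch with hypothetical parameter values (constant $a_n \equiv 0.9$, $c=1.5$, and an assumed $X_0\sim N(0,1)$, none of which are fixed by the exercise):

```python
import numpy as np

# Hypothetical parameters (not from the exercise; the identity holds for any choice).
N = 5                              # number of steps
a, c = 0.9, 1.5                    # a_n taken constant for simplicity
sigma, delta, s0 = 1.0, 0.5, 1.0   # sd of eps_n, eta_n, and of X_0 ~ N(0, s0^2)

# Represent each variable by its coefficient vector over the independent
# Gaussians (X_0, eps_1..eps_N, eta_0..eta_N); covariances are then
# variance-weighted dot products, so everything below is exact.
dim = 1 + N + (N + 1)
var = np.concatenate(([s0**2], np.full(N, sigma**2), np.full(N + 1, delta**2)))

X = np.zeros((N + 1, dim))
X[0, 0] = 1.0
for n in range(N):
    X[n + 1] = a * X[n]
    X[n + 1, 1 + n] = 1.0          # X_{n+1} = a X_n + eps_{n+1}
Y = c * X
for n in range(N + 1):
    Y[n, 1 + N + n] = 1.0          # Y_n = c X_n + eta_n

def cov(u, v):
    return float(np.sum(u * v * var))

def project(x, ys):
    """Orthogonal projection of x onto span(ys), as a coefficient vector."""
    G = np.array([[cov(yi, yj) for yj in ys] for yi in ys])   # Gram matrix
    b = np.array([cov(yi, x) for yi in ys])
    return np.linalg.solve(G, b) @ ys

n = 3
Xhat_prev = project(X[n], Y[:n])   # hat X_{n/n-1}
Z = Y[n] - c * Xhat_prev           # innovation Z_n

# Point 2: Z_n is orthogonal to V_{n-1} (all covariances vanish)
print([round(cov(Y[i], Z), 12) for i in range(n)])

# The claimed update: hat X_{n/n} = hat X_{n/n-1} + (E[X_n Z_n]/E[Z_n^2]) Z_n
Xhat = project(X[n], Y[:n + 1])
print(np.allclose(Xhat, Xhat_prev + (cov(X[n], Z) / cov(Z, Z)) * Z))  # True
```

The first print shows that $Z_n$ is uncorrelated with each $Y_0,\dots,Y_{n-1}$, and the second confirms the one-step update formula as an exact identity between coefficient vectors.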