
Suppose I have a family of i.i.d. standard normal random variables $Y_{n,k}$ and I define $X^N_t:=\sum_{n=0}^N\sum_{k=1}^{2^n}Y_{n,k}\,\phi_{n,k}(t)$ for $t\in [0,1]$, where the $\phi_{n,k}$ are the Schauder functions. Furthermore, I know that $(X_t^N)_N$ is a martingale bounded in $L^2$ and therefore converges a.s. and in $L^2$ to a random variable $X_t$.

Why am I allowed to interchange expectation and the sums in the following expression?

$$E[X_sX_t]=\sum_{n,m}\sum_{k,l} E[Y_{n,k}Y_{m,l}]\,\phi_{n,k}(s)\,\phi_{m,l}(t)$$

Note: both sums run over two indices each, the first over the levels $n,m$ and the second over the positions $k,l$.
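As an aside, the setup can be sketched numerically. Below is a minimal Python sketch of the partial sums $X^N_t$; the function names `schauder` and `sample_path` are mine, and the tent normalization (height $2^{-n/2-1}$ on $[(k-1)/2^n, k/2^n]$, i.e. the antiderivative of the Haar function $H_{n,k}$) is one common convention, an assumption since the question does not fix one:

```python
import numpy as np

def schauder(n, k, t):
    """Schauder function phi_{n,k}: the antiderivative of the Haar
    function H_{n,k}, i.e. a tent of height 2**(-n/2 - 1) supported
    on [(k-1)/2**n, k/2**n].  This normalization is one common
    convention (an assumption; the question does not fix one)."""
    u = 2.0**n * np.asarray(t, dtype=float) - (k - 1)  # rescale support to [0, 1]
    return 2.0 ** (-n / 2 - 1) * np.clip(1.0 - np.abs(2.0 * u - 1.0), 0.0, None)

def sample_path(N, t, rng):
    """One realization of the partial sum X^N_t."""
    x = np.zeros_like(np.asarray(t, dtype=float))
    for n in range(N + 1):
        Y = rng.standard_normal(2**n)  # i.i.d. N(0,1) coefficients Y_{n,k}
        for k in range(1, 2**n + 1):
            x += Y[k - 1] * schauder(n, k, t)
    return x
```

With this convention, $\sum_{n\le N}\sum_k \phi_{n,k}(s)\phi_{n,k}(t)$ converges to $\min(s,t)-st$ as $N\to\infty$, the covariance of a Brownian bridge; adding a linear term $t\,Y$ with a further independent standard normal $Y$ would give Brownian motion.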

  • 0
    If the martingale converges in $L^2$, it will converge in $L^1$.2012-07-05
  • 0
    $(L^2)^2 \ni (X,Y) \mapsto E(XY)$ is continuous ...2012-07-05
  • 0
    @Davide: How does $L^1$ convergence imply the interchange of sum and integral?2012-07-05
  • 0
    @procrastinator: Why can I apply Fubini? The integrand must be product-integrable; why is that the case here?2012-07-05
  • 0
    @procrastinator: In general, I have to show $\int_{X\times Y}|F(x,y)|\,d(x,y) < \infty$. If this is true, then $\int_{X\times Y}F(x,y)\,d(x,y) = \int_X\int_YF(x,y)\,dy\,dx$. The expression in the question is $\int_X\int_YF(x,y)\,dy\,dx$ and not $\int_{X\times Y}|F(x,y)|\,d(x,y) < \infty$. Or am I wrong? Actually, I'm not quite sure what this product measure should look like.2012-07-05

1 Answer


By the Cauchy-Schwarz inequality, the mapping \[ L^2(P)\times L^2(P) \ni (X,Y) \mapsto E(XY) \] is continuous. This can be seen as follows: let $(X_0, Y_0) \in L^2(P)^2$ and $\epsilon > 0$, and set \[ \delta = \min\Bigl\{\epsilon, \frac{\epsilon}{\max\{\|X_0\|_2, \|Y_0\|_2+\epsilon\}}\Bigr\}. \] Then for $(X,Y) \in L^2(P)^2$ with $\|X-X_0\|_2 + \|Y-Y_0\|_2 < \delta$ we have, using $\|Y\|_2 \le \|Y_0\|_2 + \|Y - Y_0\|_2 \le \|Y_0\|_2 + \epsilon$ (as $\delta \le \epsilon$), \begin{align*} \left|E(X_0Y_0) - E(XY)\right| &\le \left|E[X_0(Y_0 - Y)]\right| + \left|E[Y(X_0-X)]\right|\\ &\le \|X_0\|_2 \|Y_0 - Y\|_2 + \|Y\|_2 \|X-X_0\|_2\\ &\le \max\{\|X_0\|_2, \|Y_0\|_2 + \epsilon\} \bigl(\|X-X_0\|_2 + \|Y-Y_0\|_2\bigr)\\ &< \epsilon. \end{align*}

Now we can conclude: as $X^N_t \to X_t$ and $X^M_s \to X_s$ in $L^2(P)$, continuity gives $E(X_sX_t) = \lim_{N,M} E(X^N_t X^M_s)$. As expectation is linear, the last term equals \[ \lim_{N,M} \sum_{n\le N} \sum_{m\le M} \sum_{k=1}^{2^n}\sum_{l=1}^{2^m} E(Y_{n,k}Y_{m,l})\,\phi_{n,k}(t)\,\phi_{m,l}(s), \] which is exactly the interchange in question.
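To illustrate the identity this argument justifies, here is a small Monte Carlo sketch (function names are mine, and the tent normalization of the Schauder functions is an assumed convention). It compares a simulated $E[X^N_s X^N_t]$ against the interchanged double sum, which collapses to $\sum_{n\le N}\sum_k \phi_{n,k}(s)\phi_{n,k}(t)$ because $E[Y_{n,k}Y_{m,l}] = 1$ if $(n,k)=(m,l)$ and $0$ otherwise:

```python
import numpy as np

def schauder(n, k, t):
    """Schauder tent of height 2**(-n/2 - 1) on [(k-1)/2**n, k/2**n]
    (one common normalization -- an assumption)."""
    u = 2.0**n * t - (k - 1)
    return 2.0 ** (-n / 2 - 1) * max(0.0, 1.0 - abs(2.0 * u - 1.0))

def mc_cov(s, t, N, trials, rng):
    """Monte Carlo estimate of E[X^N_s X^N_t] from `trials` independent
    realizations of the coefficient family (Y_{n,k})."""
    xs = np.zeros(trials)
    xt = np.zeros(trials)
    for n in range(N + 1):
        Y = rng.standard_normal((trials, 2**n))
        for k in range(1, 2**n + 1):
            xs += Y[:, k - 1] * schauder(n, k, s)
            xt += Y[:, k - 1] * schauder(n, k, t)
    return float(np.mean(xs * xt))

def interchanged_sum(s, t, N):
    """The right-hand side after interchanging E and the sums: since
    E[Y_{n,k} Y_{m,l}] vanishes unless (n,k) == (m,l), the double sum
    collapses to sum_{n<=N, k} phi_{n,k}(s) phi_{n,k}(t)."""
    return sum(schauder(n, k, s) * schauder(n, k, t)
               for n in range(N + 1) for k in range(1, 2**n + 1))
```

For moderate `N` and enough trials, `mc_cov` agrees with `interchanged_sum` up to Monte Carlo error, which is the finite-$N$ content of the identity; the $L^2$ continuity argument above is what lets one pass to the limit $N\to\infty$.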

  • 0
    why do we need the continuity of this map?2012-07-05
  • 0
    We have $(X_t^N, X_s^M) \to (X_t, X_s)$ in $(L^2)^2$ and we want $E(X_t^N X_s^M) \to E(X_tX_s)$ ...2012-07-05
  • 0
    What norm do you use on $L^2\times L^2$, e.g. $\|(X,Y)\| = \|X\|_{L^2}+\|Y\|_{L^2}$? How exactly does the continuity follow from Cauchy-Schwarz?2012-07-05
  • 0
    For example ... yes.2012-07-05
  • 0
    Thank you for your patience. But sorry, I do not see how continuity should follow from Cauchy-Schwarz. It would be appreciated if you could add this little technical step.2012-07-05
  • 0
    @hulik Done, HTH, AB.2012-07-05
  • 0
    @martini: Thank you for your help. Can you just clarify one thing: I do not see why you have to choose $\|Y_0\|_2 + \epsilon$ in the max. So in fact, why is $ \|X_0\| \|Y_0-Y\|+\|Y\| \|X-X_0\|\le \max\{\cdots\}\bigl(\|X-X_0\|+\|Y-Y_0\|\bigr)$?2012-07-09
  • 0
    $\|X_0\|_2 \|Y_0-Y\|_2 \le \|Y_0 - Y\|_2\max\{\|X_0\|_2, \|Y_0\|_2 + \epsilon\}$ is obvious, for the other summand, we have $\|Y\|_2 \|X_0-X\|_2 \le (\|Y_0 - Y\|_2 + \|Y_0\|_2)\|X-X_0\|_2 \le (\|Y_0\|_2 + \epsilon)\|X_0 - X\|_2 \le \|X_0 - X\|_2\max\{\|X_0\|_2, \|Y_0\|_2 + \epsilon\}$.2012-07-09
  • 0
    Thank you so much for your patience. One last question: Why is $\|Y-Y_0\|\le \epsilon$ ?2012-07-09
  • 0
    Sorry ... my mistake; we can choose $\delta \le \epsilon$, then $\|Y_0 - Y\| < \delta \le \epsilon$. Will correct it above ...2012-07-09