
A paper I read states that a quasimartingale (a process $(X_t)_{t\in [0,T]}$ with $\mathbb E[|X_t|]<\infty$ for all $t\in [0,T]$ which satisfies $$\sup_{\pi\in\Delta} \sum^{n-1}_{j=0} \left\|\mathbb E\left[X_{t_{j+1}} - X_{t_j} \middle| \mathcal F_{t_j}^X\right] \right\|_1 < \infty $$ for its natural filtration $(\mathcal F^X_t)$, where $\Delta$ is the set of all partitions $\pi : 0=t_0 < t_1 <\ldots < t_n = T$ of $[0,T]$) is automatically an $(\mathcal F^X_t)$-quasi-Dirichlet process, meaning that $$ \sum^{n-1}_{j=0} \mathbb E \left[ \left|\mathbb E\left[ X_{t_{j+1}} - X_{t_j} \middle| \mathcal F_{t_j}^X\right] \right|^2 \right] \xrightarrow{|\pi|\to 0} 0 $$ holds along partitions $\pi : 0 = t_0 < t_1 < \ldots < t_n = T$ of $[0,T]$.

While this seems reasonable, I am having a hard time proving it and would appreciate any help.

1 Answer


I am not sure the implication holds: if $X_0=0$ and $X_t=1$ for every positive $t$, then both sums equal $1$ for every partition $\pi$; hence the first condition (boundedness) is met while the second (convergence to zero) is not.
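To see this concretely, here is a quick numerical sketch (not from the original answer) of both sums for this deterministic example over refining partitions; since the process is deterministic, each conditional expectation equals the increment itself.

```python
# Sketch: for the deterministic process X_0 = 0, X_t = 1 (t > 0),
# E[X_b - X_a | F_a] is just the increment X_b - X_a, so both sums
# from the question can be computed exactly.

def both_sums(partition):
    """Return (sum of L1 norms, sum of expected squares) for this X."""
    X = lambda t: 0.0 if t == 0 else 1.0
    s1 = s2 = 0.0
    for a, b in zip(partition, partition[1:]):
        inc = X(b) - X(a)      # = E[X_b - X_a | F_a] here (deterministic)
        s1 += abs(inc)         # term of the quasimartingale variation
        s2 += inc ** 2         # term of the quasi-Dirichlet sum
    return s1, s2

for n in (2, 10, 100, 1000):   # refining uniform partitions of [0, 1]
    pts = [k / n for k in range(n + 1)]
    print(n, both_sums(pts))   # both sums stay at 1.0 as the mesh shrinks
```

Only the first subinterval $[0, t_1]$ contributes, so refining the partition changes nothing: the variation stays bounded, but the second sum does not tend to zero.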

Edit: For a less degenerate Gaussian process $(X_t)_{t\geqslant0}$, try $X_0=0$ and $X_t=\xi$ for every positive $t$, where $\xi$ is a normal random variable with nonzero mean (note that the previous example, with $\xi\equiv1$, was a Gaussian process as well).
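A similar sketch (assumption: $\xi \sim N(m,1)$ with $m \neq 0$) for the Gaussian variant: only the first increment is nonzero, its conditional expectation given the trivial $\mathcal F_0$ is $\mathbb E[\xi] = m$, so the two sums are $|m|$ and $m^2$ for every partition, here estimated by Monte Carlo.

```python
import numpy as np

# Sketch: X_0 = 0, X_t = xi for t > 0, with xi ~ N(m, 1) and m != 0 (assumed).
# Only the first increment X_{t_1} - X_0 = xi is nonzero; its conditional
# expectation given the trivial sigma-algebra F_0 is E[xi] = m.
rng = np.random.default_rng(0)
m = 1.0
xi = rng.normal(m, 1.0, size=100_000)
cond_exp = xi.mean()           # Monte Carlo estimate of E[xi] = m

sum1 = abs(cond_exp)           # first sum:  |m|, for every partition
sum2 = cond_exp ** 2           # second sum: m^2, for every partition
print(sum1, sum2)              # bounded in the partition, but not -> 0
```

Both quantities are independent of the partition, so the quasimartingale bound holds while the quasi-Dirichlet limit fails, exactly as in the deterministic example.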

  • I think your argument is valid. Maybe the implication holds when $X$ is a Gaussian process; that would cover the case given in the paper but would exclude your example. (2012-12-16)
  • The first example IS a Gaussian process. (2012-12-16)
  • I do not understand how a deterministic constant function can follow a normal distribution. Also, I am not sure whether $(\xi, \xi)$ follows a two-dimensional Gaussian distribution, as it must for the process to be Gaussian. (2012-12-16)
  • All this suggests you should check the definition of a [normal vector](http://en.wikipedia.org/wiki/Multivariate_normal_distribution#Definition). You will see that Dirac distributions are normal. (2012-12-16)
  • Thank you very much. Especially that last remark helped my understanding a lot. (2012-12-16)
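On the degeneracy point raised in the comments, here is a small numerical sketch (not part of the original discussion; values assumed): NumPy's multivariate normal sampler accepts singular covariance matrices, covering both the Dirac case and the vector $(\xi, \xi)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Dirac at 1 as a (degenerate) normal distribution: zero covariance.
dirac = rng.multivariate_normal([1.0], [[0.0]], size=3)
assert np.allclose(dirac, 1.0)   # every sample equals the mean exactly

# 2) (xi, xi) with xi ~ N(0, 1): the singular covariance [[1, 1], [1, 1]]
#    makes it a valid two-dimensional normal vector.
pairs = rng.multivariate_normal([0.0, 0.0],
                                [[1.0, 1.0], [1.0, 1.0]], size=5)
assert np.allclose(pairs[:, 0], pairs[:, 1])  # coordinates coincide
print("both degenerate cases are admissible normal vectors")
```

This matches the linked definition: a random vector is normal iff every linear combination of its coordinates is (possibly degenerate) normal, which includes Dirac distributions.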