
Consider a homogeneous continuous-time Markov chain $X$ on a countable state space, i.e. a jump process. It is said to be regular (it does not explode) if, almost surely, it makes only finitely many jumps in every finite time interval, so that $X(t) < \infty$ for all $t > 0$.

What can be said about the expected number of jumps in a finite interval? Is it finite, too, for regular processes in general? Is there an $a > 0$ such that

$E[X(a)] < \infty$

holds?

Consider, for example, a Poisson process with parameter $\lambda$. It is regular, and the expected number of jumps in a unit time interval is $\lambda$.
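A quick sanity check by simulation (this snippet is an addition for illustration; the rate `lam` and the number of paths are arbitrary choices):

```python
# Simulate the number of jumps of a rate-lam Poisson process in [0, 1] via
# exponential inter-arrival times and compare the sample mean to lam.
import numpy as np

rng = np.random.default_rng(0)
lam, n_paths = 3.0, 20_000

def jumps_in_unit_interval(lam, rng):
    t, count = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)  # next inter-arrival time
        if t > 1.0:
            return count
        count += 1

counts = [jumps_in_unit_interval(lam, rng) for _ in range(n_paths)]
print(np.mean(counts))  # close to lam = 3.0
```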

In particular, I'd like to find out whether the expected number of jumps of a regular pure birth process is finite on some interval. That is a process that jumps from $k$ to $k+1$ with rate $\lambda_k$, where the rates satisfy $\sum_{k=1}^{\infty}\frac{1}{\lambda_k} = \infty$. By Reuter's criterion this is sufficient for the pure birth process to be regular.
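To make the setting concrete, here is a minimal simulation sketch (an addition, not part of the question): it simulates one path of a pure birth process by summing exponential holding times, under the assumed example rates $\lambda_k = k+1$, which satisfy $\sum_k 1/\lambda_k = \infty$ and hence Reuter's criterion.

```python
# Simulate X_t for a pure birth process started at x0: the holding time in
# state k is exponential with rate lam(k), after which the process jumps to k+1.
import numpy as np

def simulate_pure_birth(t, lam, rng, x0=0):
    state, clock = x0, 0.0
    while True:
        clock += rng.exponential(1.0 / lam(state))  # holding time in `state`
        if clock > t:
            return state
        state += 1

rng = np.random.default_rng(1)
lam = lambda k: k + 1.0               # example rates with sum 1/lam(k) = infinity
samples = [simulate_pure_birth(1.0, lam, rng) for _ in range(10_000)]
print(np.mean(samples))               # about e - 1 ≈ 1.72 for this choice of rates
```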

1 Answer


The answer is: not necessarily.


We begin with some partial positive results. Recall that for every pure birth-process $(X_t)_{t\geqslant0}$ with positive rates $(\lambda(k))_{k\geqslant0}$ and for every suitable function $u$,
$$\frac{\mathrm d}{\mathrm dt}\mathrm E(u(X_t))=\mathrm E\big((u(X_t+1)-u(X_t))\cdot\lambda(X_t)\big).$$
In particular, the expectation $\mathrm E(X_t)$, if it exists, solves the differential equation
$$\frac{\mathrm d}{\mathrm dt}\mathrm E(X_t)=\mathrm E(\lambda(X_t)).$$
If $\lambda(k)\leqslant\lambda_0(k)$ for every integer $k$, for some positive concave function $\lambda_0$, then Jensen's inequality yields $\mathrm E(\lambda(X_t))\leqslant\mathrm E(\lambda_0(X_t))\leqslant\lambda_0(\mathrm E(X_t))$. Integrating this differential inequality yields $t\geqslant M(\mathrm E(X_t))$, where
$$M(x)=\int_{\mathrm E(X_0)}^{x}\frac{\mathrm dz}{\lambda_0(z)}.$$
If the integral of the function $1/\lambda_0$ diverges at infinity, then $M$ is unbounded and this proves that $\mathrm E(X_t)\leqslant M^{-1}(t)$ is finite for every $t$.
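For a concrete illustration (this example is an addition, not part of the original argument): if $\lambda(k)\leqslant\lambda_0(k)=k+1$ for every $k$, a linear and hence concave majorant, then
$$M(x)=\int_{\mathrm E(X_0)}^{x}\frac{\mathrm dz}{z+1}=\log\frac{x+1}{\mathrm E(X_0)+1},$$
so $t\geqslant M(\mathrm E(X_t))$ rearranges to $\mathrm E(X_t)\leqslant(\mathrm E(X_0)+1)\,\mathrm e^{t}-1$, the familiar at-most-exponential growth of the mean of a Yule-type process.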

Here is another integrability result, which is always valid. Consider
$$\Lambda(k)=\sum_{i=0}^{k-1}\frac1{\lambda(i)}.$$
Applying the first identity above to $u=\Lambda$, and noting that $(\Lambda(k+1)-\Lambda(k))\cdot\lambda(k)=1$ for every $k$, the derivative of the function $t\mapsto\mathrm E(\Lambda(X_t))$ is $1$, hence $\mathrm E(\Lambda(X_t))=\Lambda(X_0)+t$ for every $t$; in particular $\mathrm E(\Lambda(X_t))$ is finite for every $t$.
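As a sanity check (again an added illustration): for constant rates $\lambda(k)=\lambda$ one gets $\Lambda(k)=k/\lambda$, and the identity reduces to $\mathrm E(X_t)=X_0+\lambda t$, the mean of a Poisson process started at $X_0$.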

But of course, nothing guarantees that such a concave function $\lambda_0$ exists nor that $\Lambda(k)$ would be equivalent to a multiple of $k$ when $k\to\infty$...


...Which brings us to the negative result. Consider an infinite increasing sequence of positive integers $(K(i))_{i\geqslant0}$, to be chosen later on. The sites in the set $\mathcal K=\{K(i)\,;\,i\geqslant0\}$ are slow sites and the other sites are fast sites, in the sense that one assumes that $\lambda(k)=1$ for every $k$ in $\mathcal K$ and that
$$\sum\limits_{k\notin \mathcal K}\frac1{\lambda(k)}\ \text{is finite}.$$
Then Reuter's criterion holds thanks to the infinitely many slow sites, hence $(X_t)_{t\geqslant0}$ is regular. However, the total time $T$ spent at fast sites is almost surely finite, hence, on the event $[T\leqslant t]$,
$$X_t\geqslant K(Y_{t-T(t)})\geqslant K(Y_{t-T}),$$
where $T(t)$ is the time spent at fast sites up to time $t$, and $(Y_t)_{t\geqslant0}$ is a pure birth-process with constant rate $1$, independent of $T$. For every $t\gt0$, $[T\leqslant t]$ has positive probability and
$$\mathrm E(X_{2t}:T\leqslant t)\geqslant\mathrm E(K(Y_{2t-T}):T\leqslant t)\geqslant\mathrm E(K(Y_{t}):T\leqslant t)=\mathrm E(K(Y_{t}))\cdot\mathrm P(T\leqslant t).$$
Since $Y_t$ is Poisson with parameter $t$, the choice $K(i)=(i!)^2$ yields
$$\mathrm E(K(Y_t))=\mathrm e^{-t}\sum_{i=0}^{+\infty}i!\,t^i,$$
which diverges for every positive $t$. Thus, $\mathrm E(X_{2t})\geqslant\mathrm E(X_{2t}:T\leqslant t)$ is infinite for every positive $t$.
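A quick numerical sketch of the last claim (this snippet is an addition, not part of the original answer; it only checks the divergence of the series, and the value of $t$ is an arbitrary choice):

```python
# Partial sums of E(K(Y_t)) = e^{-t} * sum_{i>=0} i! * t^i with K(i) = (i!)^2
# and Y_t ~ Poisson(t): the terms i! * t^i eventually grow without bound for
# any fixed t > 0, so the partial sums diverge and the expectation is infinite.
import math

t = 0.5  # any positive value works; 0.5 is an arbitrary choice
partial_sum = 0.0
for i in range(101):
    partial_sum += math.factorial(i) * t ** i
    if i % 20 == 0:
        print(f"i = {i:3d}, partial sum = {math.exp(-t) * partial_sum:.3e}")
```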

  • Ah, I see. Hm, I hoped that regularity would suffice for the process I'm dealing with in my work. I opened a [new question](http://math.stackexchange.com/questions/178150/expected-number-of-jumps-in-a-regular-pure-birth-process-with-malthusian-paramet) that includes all the assumptions that I have for $X$. – 2012-08-02