
Let $X$ be a discrete-time Markov process on some measurable space $(\mathscr X,\mathscr B)$. Let $B\in\mathscr B$ and let $\tau_B:=\inf\{n\geq 0:X_n\in B\}$ be the first hitting time of $B$.

  1. Let us assume that for any $x\in \mathscr X$, $\mathsf P_x\{\tau_B<\infty\} = 1$. Can there exist $x$ such that $\mathsf E_{x}[\tau_B] = \infty$?

  2. Let us assume that for some $x'\in \mathscr X$, $\mathsf P_{x'}\{\tau_B<\infty\} = 1$. Can it be that $\mathsf E_{x'}[\tau_B] = \infty$?

No assumptions on the irreducibility (or any other properties) are imposed. I've tried to find a counterexample but didn't manage to do it.

  • @D.Thomine: thanks, I guess I got it. Still, I would appreciate it if you could post it as an answer; I'll accept it. Sorry for the delay - the first time I forgot to ping you in a comment. (2012-02-17)

1 Answer


I'll explain my comment.

General stuff

For a Markov process on a countable state space and a given state $x$, let us denote by $\tau$ the first return time to $x$ starting from $x$. I also assume that the Markov process is irreducible, so that what I say holds for the whole space (and not only for the single point $x$); otherwise things become a bit more complicated. There are then three possibilities:

  • $\tau$ has finite expectation. The Markov chain is said to be positive recurrent, and there exists an invariant probability measure $\mu$. Asymptotically, the process spends a positive proportion $\mu (x)$ of the time at site $x$. Examples of such Markov chains include irreducible Markov chains on finite state spaces, and something I will present later.

  • $\tau$ is finite almost surely, but has infinite expectation. The Markov chain is said to be null recurrent. Asymptotically, the process spends a vanishing proportion of the time at site $x$. Examples of such Markov chains include "good" random walks on $\mathbb{Z}$ and $\mathbb{Z}^2$, and something I will present later (yes, the same something).

  • $\tau$ is infinite with positive probability. The Markov chain is said to be transient. Almost surely, at some time the process will leave $x$ and never return to $x$. Examples of such Markov chains include non-degenerate random walks on $\mathbb{Z}^d$ for $d \geq 3$.
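As a quick sanity check on the positive recurrent case, here is a tiny two-state example (my own illustration; the transition probabilities $a$, $b$ are arbitrary choices) verifying Kac's formula $\mu(x)\,\mathbb{E}_x[\tau] = 1$, which ties the invariant probability measure to the mean return time:

```python
from fractions import Fraction as F

# Two-state chain on {0, 1}: P(0 -> 1) = a, P(1 -> 0) = b.
# Any 0 < a, b <= 1 gives an irreducible, positive recurrent chain.
a, b = F(1, 3), F(1, 4)

# Stationary distribution: solving pi P = pi gives pi(0) = b / (a + b).
pi0 = b / (a + b)

# Mean return time to 0 by first-step analysis:
# E_0[tau] = 1 + a * E_1[hitting time of 0], and E_1[hit 0] = 1/b
# (geometric waiting time with success probability b).
mean_return = 1 + a * (1 / b)

print(pi0, mean_return, pi0 * mean_return)  # Kac's formula: the product is 1
```

Exact rational arithmetic makes the identity $\pi(0) \cdot \mathbb{E}_0[\tau] = 1$ visible without any floating-point noise.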

Now, I'll present two classes of examples. The first has more far-reaching applications, while the second is more flexible (and has its fair share of applications).

Random walks

Let us consider a random walk $(S_n)$ on $\mathbb{Z}^d$ starting from $0$. Let $(p_i)_{i \in \mathbb{Z}^d}$ be the transition kernel, i.e. $\mathbb{P} (S_1=i)=p_i$. I'll denote by $S$ its covariance matrix:

$(u,Sv) = \sum_{i \in \mathbb{Z}^d} p_i \, (i,u) (i,v).$

We shall assume that the random walk has no drift, finite variance and is non-degenerate, i.e. that the matrix $S$ is positive definite.

Then the random walk is irreducible on a $d$-dimensional sublattice of $\mathbb{Z}^d$. Note that the Markov process $(S_n)$ cannot be positive recurrent: asymptotically, the process spends roughly the same amount of time at each point of the lattice, and since there are infinitely many points, it cannot spend a positive proportion of the time at any given one. Hence, the process is either null recurrent or transient. We indeed have the following result. If $\tau$ denotes the first return time to $0$, then:

  • If $d=1$, then:

$\mathbb{P} (\tau > n) \sim \sqrt{\frac{2 \det S}{\pi}} \frac{1}{\sqrt{n}}.$

  • If $d=2$, then:

$\mathbb{P} (\tau > n) \sim \frac{2 \pi \sqrt{\det S}}{\ln (n)}.$

  • If $d \geq 3$, then the random walk is transient.

I am not totally sure I have the constants right, but at least the order of growth in $n$ is. A good reference is, for instance, Random walk in random and non-random environments by P. Révész. In the case of the simple random walk on $\mathbb{Z}$ (when $p_1=p_{-1}=1/2$), you can check the formula via Stirling's formula and a generous helping of renewal theory. By being cunning, you can deduce the $2$-dimensional case for the simple random walk from the $1$-dimensional one.

Anyway, since we have the formula:

$\mathbb{E} (\tau) = \sum_{n \in \mathbb{N}} \mathbb{P} (\tau > n),$

both the $1$-dimensional and the $2$-dimensional cases are counter-examples: the tails $\mathbb{P}(\tau > n)$ above tend to $0$, so $\tau$ is almost surely finite, but they are not summable, so $\mathbb{E}(\tau) = \infty$.
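To make the $d=1$ tail concrete, here is a small computation (my own sketch, not from the reference above): it evaluates $\mathbb{P}(\tau > n)$ for the simple random walk on $\mathbb{Z}$ by dynamic programming and compares it with the asymptotic $\sqrt{2/(\pi n)}$, i.e. the formula above with $\det S = 1$.

```python
from math import comb, pi, sqrt

def survival_prob(n):
    """P(tau > n) for the simple random walk on Z (p_1 = p_{-1} = 1/2),
    where tau is the first return time to 0. After the first step the walk
    sits at +-1; by symmetry we track a walk on {1, 2, ...} killed at 0
    and sum the surviving probability mass."""
    dist = [0.0, 1.0]                      # dist[k] = P(at k, not yet killed)
    for _ in range(n - 1):
        new = [0.0] * (len(dist) + 1)
        for k in range(1, len(dist)):
            p = dist[k]
            if p:
                if k > 1:                  # stepping from 1 to 0 kills the walk
                    new[k - 1] += 0.5 * p
                new[k + 1] += 0.5 * p
        dist = new
    return sum(dist)

for n in (100, 400, 1600):
    exact = survival_prob(n)
    asymptotic = sqrt(2 / (pi * n))        # the d = 1 formula with det S = 1
    print(n, exact, asymptotic, exact / asymptotic)
```

For even $n = 2m$ this agrees with the classical identity $\mathbb{P}(\tau > 2m) = \binom{2m}{m} 2^{-2m}$, and the ratio to the asymptotic tends to $1$ as $n$ grows.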

Now, I'll give a more general (and simpler) class of examples: the "something I will present later".

Towers

Let $\mu$ be a probability distribution over the positive integers. I claim that there exists a Markov process over a countable state space and a state $x$ such that, if $\tau$ is the first return time to $x$ starting from $x$, then the law of $\tau$ is $\mu$.

Indeed, let us consider a Markov process $(X_n)$ on $\mathbb{N}$ such that:

  • From any $k>0$, if $X_n=k$ then $X_{n+1} = k-1$;

  • $\mathbb{P} (X_{n+1}=k \mid X_n=0) = \mu (\{k+1\})$.

Then this Markov chain fulfills the conditions of my claim. You can see $\mathbb{N}$ as a "tower": the process comes down one floor at a time, and after it reaches $0$ it jumps up to floor $k$ with probability $\mu(\{k+1\})$, so that the excursion away from $0$ takes exactly $k+1$ steps. This construction is closely related to Rokhlin towers in ergodic theory.

This means that you can arrange for $\tau$ to have any specified distribution over the positive integers. Crafting examples for which $\tau$ is almost surely finite but has infinite expectation thus becomes easy: take for instance $\mu(\{k\}) = \frac{1}{k(k+1)}$, which sums to $1$ but has infinite mean.
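As a concrete sketch of this (my own illustration; the choice $\mu(\{k\}) = \frac{1}{k(k+1)}$ is one convenient heavy-tailed distribution, since it sums to $1$ but has infinite mean), the following simulates the tower chain and records return times to $0$:

```python
import random
from math import ceil

random.seed(42)

def sample_mu():
    """Sample k >= 1 with P(k) = 1/(k(k+1)). The CDF is k/(k+1),
    so inversion gives a closed form: k = ceil(u / (1 - u))."""
    u = random.random()
    return max(1, ceil(u / (1 - u)))

def return_time():
    """One excursion of the tower chain from 0: jump to floor k - 1
    (with probability mu({k})), then descend one floor per step, so
    the return to 0 takes exactly k steps."""
    floor = sample_mu() - 1    # the jump out of 0 counts as one step
    steps = 1
    while floor > 0:           # deterministic descent, one floor at a time
        floor -= 1
        steps += 1
    return steps

taus = [return_time() for _ in range(100_000)]
print("P(tau = 1) ~", taus.count(1) / len(taus))  # exact value: mu({1}) = 1/2
print("empirical mean:", sum(taus) / len(taus))   # unstable: E[tau] is infinite
```

Every simulated excursion terminates (each $\tau$ is finite), yet the empirical mean keeps drifting upward as the sample grows, which is exactly the null recurrent behaviour the question asks about.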

  • I can't upvote this answer more than once, so I guess a small bounty will do it for me :) Many thanks for your answer. (2012-02-20)