
Consider a positive recurrent Markov chain $X_n$, $n \geq 0$, on the set of strictly positive integers. Let the initial state be $X_0 = 1$, and let $T$ denote the first return time to state 1, i.e., $T = \inf\{n \geq 1 : X_n = 1\}$. Is $\mathbb{E}\left[\sum_{k=0}^{T} X_k\right] < \infty$?

  • Neat question, but what do you think about it?
  • I am able to prove this under the following conditions. Assume $\mathbb{E}[T^2] < \infty$, and assume that for each state $i$ the transition probability satisfies $p(i,j) = 0$ whenever $j > i + M$, for some fixed strictly positive integer $M$. Since $X_0 = 1$ and each step increases the state by at most $M$, we have $X_k \leq 1 + kM$, so $\mathbb{E}[\sum_{k=0}^{T}X_k] = \sum_{n=1}^{\infty}\mathbb{E}[\sum_{k=0}^{n}X_k \mid T=n]\,P(T=n) \leq \sum_{n=1}^{\infty}\left((n+1) + M\tfrac{n(n+1)}{2}\right)P(T=n) < \infty$.
  • That first requirement is natural enough, but the second one seems quite restrictive, much stronger than just positive recurrence. Can you determine how to remove that? (You certainly can in some cases, for example if the chain is stationary then this follows from Wald's identity.)
  • I assumed the Markov chain starts with the stationary distribution $\pi$, but I still do not know why $\mathbb{E}[\sum_{k=1}^{T}X_k]$ can be factored as $\mathbb{E}_{\pi}[X_1]\,\mathbb{E}[T]$.
  • I also tried considering the expectation $\mathbb{E}[\sum_{k=1}^{T-1}X_k]$ and conditioning each term of $\sum_{k=1}^{\infty}\mathbb{E}[X_k 1_{k \leq T-1}]$ on $X_T$, but I am still not getting anywhere.
  • Yes, if the initial distribution is stationary then you get a sufficient result out of Wald's identity; but the initial distribution is not stationary here (except in somewhat trivial cases). So you'll need to do something else. (Also, is it given in your source that this result is true exactly as stated?)
  • Actually, even when the Markov chain begins with the stationary distribution, I am unable to prove the result, as I do not know how to show that the event $\{T \geq k\}$ is independent of $X_k$. The event $\{T \geq k\}$ is determined by $X_0, X_1, \dots, X_{k-1}$, but why would it be independent of $X_k$?
  • Ah, I think I forgot that just because the process is stationary does not mean that you are merely taking iid draws from it; there can still be persistent covariance. Tricky...
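The bounded-jump argument from the comments can be sanity-checked numerically. The three-state chain below, its transition probabilities, and the estimate $(n+1) + M\,n(n+1)/2$ (which follows from $X_k \leq 1 + kM$ when $X_0 = 1$) are my own illustration, not part of the thread:

```python
from fractions import Fraction

# Hypothetical 3-state chain with bounded upward jumps, M = 1:
#   p(1,2) = 1,  p(2,1) = p(2,3) = 1/2,  p(3,1) = 1.
# From X_0 = 1 there are only two possible excursions back to state 1:
#   1 -> 2 -> 1       prob 1/2, T = 2, sum_{k=0}^{T} X_k = 1+2+1 = 4
#   1 -> 2 -> 3 -> 1  prob 1/2, T = 3, sum_{k=0}^{T} X_k = 1+2+3+1 = 7
half = Fraction(1, 2)
excursions = [(2, 4, half), (3, 7, half)]  # (T, sum of X_k, probability)

# Exact value of E[sum_{k=0}^{T} X_k].
exact = sum(p * s for _, s, p in excursions)

# The comment's estimate: since X_0 = 1 and each step goes up by at most M,
# X_k <= 1 + k*M, hence E[sum_{k=0}^{T} X_k | T = n] <= (n+1) + M*n*(n+1)/2.
M = 1
bound = sum(p * ((n + 1) + Fraction(M * n * (n + 1), 2))
            for n, _, p in excursions)
```

Here `exact` works out to $11/2$ while `bound` evaluates to $8$, consistent with the estimate.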

1 Answer


I believe that there is a counterexample. Let $\nu$ be a probability measure on $\Bbb N$ with infinite first moment (i.e. $\sum k\nu(k)=+\infty$), such that $\nu(k)>0$ for all $k \in \Bbb N$.

Define the transition kernel by $p(1,j) = \nu(j)$ for all $j$, and $p(i,j) = 1_{[j=1]}$ for $i \geq 2$.

This chain is positive recurrent: clearly $T \leq 2$ almost surely (from state 1 the chain either stays at 1 or jumps to some $j \geq 2$ and returns to 1 on the next step), and all states communicate with each other.

However, $X_1$ has law $\nu$ and thus $E[X_1] = \infty$. Since $\sum_{k=0}^{T} X_k \geq X_1$ almost surely, $E[\sum_{k=0}^{T} X_k] = \infty$, so the answer to the question is no.
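A quick Monte Carlo sketch of this counterexample. The specific choice $\nu(k) = \frac{1}{k(k+1)}$ and all names below are mine; this $\nu$ satisfies $\nu(k) > 0$ for every $k$ and has infinite mean, since $\sum_k k\,\nu(k) = \sum_k \frac{1}{k+1} = \infty$:

```python
import random

def sample_nu(rng):
    """Draw from nu(k) = 1/(k*(k+1)) on {1, 2, 3, ...}.

    If U is uniform on (0, 1], then floor(1/U) has exactly this law:
    P(floor(1/U) = k) = P(1/(k+1) < U <= 1/k) = 1/k - 1/(k+1).
    """
    u = 1.0 - rng.random()  # uniform on (0, 1], avoids division by zero
    return int(1.0 / u)

def excursion(rng):
    """One excursion from state 1 back to state 1 under the kernel
    p(1, j) = nu(j), p(i, 1) = 1 for i >= 2.

    Returns (T, S) with T the first return time and S = X_0 + ... + X_T.
    """
    x, t, s = 1, 0, 1  # X_0 = 1
    while True:
        x = sample_nu(rng) if x == 1 else 1  # apply the kernel
        t += 1
        s += x
        if x == 1:
            return t, s

rng = random.Random(0)
samples = [excursion(rng) for _ in range(10_000)]
```

Every simulated excursion ends by time 2, matching the positive recurrence argument, while the sample mean of $S$ is dragged upward by the heavy tail of $\nu$ and does not stabilize as the number of samples grows.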