I have the following homework problem:
Let $(X_n)_{n \geq 0}$ be a Markov chain on the state space $\lbrace 0, 1, \dots \rbrace$. Writing $p_i := p_{i,i+1}$ and $q_i := p_{i,i-1}$, the transition probabilities are
$$p_0 = 1, \qquad \text{and} \qquad p_i + q_i = 1, \quad \frac{q_i}{p_i} = \left(\frac{i}{i+1}\right)^2 \quad \text{for } i \geq 1.$$
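For intuition, the two constraints pin down $p_i$ explicitly as $p_i = (i+1)^2/(i^2+(i+1)^2)$. Here is a quick simulation sketch (the function names are my own, not from the problem):

```python
import random

def p_up(i):
    # Solving p_i + q_i = 1 together with q_i/p_i = (i/(i+1))^2 gives
    # p_i = (i+1)^2 / (i^2 + (i+1)^2); the boundary case is p_0 = 1.
    return (i + 1) ** 2 / (i ** 2 + (i + 1) ** 2) if i >= 1 else 1.0

def simulate(n_steps, x0=0, seed=0):
    # Run the chain for n_steps starting from x0; return the final state.
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        x = x + 1 if rng.random() < p_up(x) else x - 1
    return x
```

In simulations the chain drifts upward, consistent with $X_n \rightarrow \infty$; note the state never goes below $0$ since $p_0 = 1$.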
I am asked to show that almost surely we have $X_n \rightarrow \infty$ as $n \rightarrow \infty$.
[Incidentally, I'd like to check my definition of almost sure convergence in this case: I'd say that $X_n \rightarrow \infty$ almost surely just when for every $K \geq 0$ there is $N = N_K$ such that $\mathbb{P}(X_n \geq K \text{ for all } n \geq N) = 1$.]
It is easy to see that $p_i > 1/2$ for all $i$. The result is then, I think, intuitively clear by comparison with the asymmetric random walk on the same state space that has $p_i = p > 1/2$ for all $i$; in that case the limit follows from the strong law of large numbers.
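Spelling out the first claim: combining $p_i + q_i = 1$ with $q_i/p_i = (i/(i+1))^2$ gives, for $i \geq 1$,
$$p_i = \frac{(i+1)^2}{i^2 + (i+1)^2} > \frac{(i+1)^2}{(i+1)^2 + (i+1)^2} = \frac{1}{2},$$
and $p_0 = 1 > 1/2$ trivially.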
Now write the hitting probability $h_i^k := \mathbb{P}(X_n = k \textrm{ for some }n \geq 0 \; | \; X_0 = i)$. I believe the result would follow if I could show that
(a) whenever $i \leq k$ we have $h_i^k = 1$, and
(b) for any fixed $k$, we have $\lim_{i \rightarrow \infty} h_i^k = 0$.
Questions: (1) Does the result indeed follow from (a) and (b)?
(2) (b) is clear. How might one show (a)?
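As a numerical sanity check on (b) (not part of the problem statement): using the standard gambler's-ruin weights $\rho_j = \prod_{m=k+1}^{j} q_m/p_m$, which for this chain telescope to $((k+1)/(j+1))^2$, one can approximate $h_i^k$ for $i > k$ by the probability of hitting $k$ before a large level $M$:

```python
def hit_prob(i, k, M=100_000):
    # P(hit k before M | X_0 = i) for k < i < M, via the standard
    # birth-death / gambler's-ruin formula with weights
    # rho_j = prod_{m=k+1}^{j} q_m/p_m = ((k+1)/(j+1))^2.
    rho = [((k + 1) / (j + 1)) ** 2 for j in range(k, M)]
    return sum(rho[i - k:]) / sum(rho)
```

Since $\sum_j \rho_j$ converges, these values decay roughly like $(k+1)/i$ as $i \rightarrow \infty$, consistent with (b).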