See this page.
Edit: Since incorrect statements keep popping up on this page, let me explain the situation once again.
First, two notations (standard in the field): for every state $i$, $P_i$ denotes the distribution of the Markov chain $(X_n)_{n\ge0}$ conditionally on the event $[X_0=i]$, and $T_i=\inf\{n\ge1;X_n=i\}$ denotes the first hitting time of $i$ after time zero. Next, a tautology: the events $[\exists n\ge1,X_n=i]$ and $[T_i<+\infty]$ are equal.
Second, a statement of the question the OP is interested in. One considers two properties which may or may not be satisfied by a given Markov chain $(X_n)_{n\ge0}$ and a given state $i$:
(1) $P_i(\exists n\ge1,X_n=i)=1$.
(2) For every state $j$, either $P_i(T_j<+\infty)=0$ or $P_i(T_j<+\infty)=1$.
Property (1) is often taken as a definition of the fact that $i$ is a recurrent state of the Markov chain $(X_n)_{n\ge0}$. Property (2) is less standard.
Now, an answer to the question.
Property (1) implies property (2). Proof: see Byron's post. Note that even when (1) holds at $i$, both values $P_i(T_j<+\infty)=0$ and $P_i(T_j<+\infty)=1$ can occur, for different target states $j$ of the same Markov chain. Example: state space $\{0,1,2\}$, transitions $0\to1$, $0\to2$, $1\to1$, $1\to2$, $2\to1$ and $2\to2$ all with positive probabilities, every other transition with probability zero. Then $P_i(T_0<+\infty)=0$ for every $i$, while $P_i(T_j<+\infty)=1$ for every $i$ and every $j\ne0$.
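For readers who want to check this numerically, here is a minimal Monte Carlo sketch in Python. The uniform transition probabilities ($1/2$ each) are an arbitrary choice made for this sketch, since the example only requires them to be positive:

```python
import random

# Transition structure of the example: from every state, move to 1 or 2.
# The probabilities only need to be positive; 1/2 each is an arbitrary
# choice made for this sketch.
STEP = {0: [1, 2], 1: [1, 2], 2: [1, 2]}

def hits(i, j, horizon=1000):
    """One sample of the event [T_j <= horizon], chain started at i."""
    x = i
    for _ in range(horizon):          # checks the times n = 1, ..., horizon
        x = random.choice(STEP[x])
        if x == j:
            return True
    return False

def estimate(i, j, trials=10_000):
    """Monte Carlo estimate of P_i(T_j < +oo), truncated at the horizon."""
    return sum(hits(i, j) for _ in range(trials)) / trials

for i in range(3):
    for j in range(3):
        print(f"P_{i}(T_{j} < +oo) ~ {estimate(i, j):.3f}")
# Expected output: ~0.000 whenever j = 0 (state 0 is never entered after
# time 0) and ~1.000 whenever j is 1 or 2, from every starting state i.
```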
Property (2) does not imply property (1). Proof by example: state space $\mathbb{Z}$, transitions $i\to i+1$ with probability $1$. Then $P_i(T_j<+\infty)=0$ if $j\le i$ and $P_i(T_j<+\infty)=1$ if $j\ge i+1$, so (2) holds at every state, but $P_i(\exists n\ge1,X_n=i)=0$ for every $i$, so (1) holds at no state.
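The same kind of check is even simpler here, since the chain is deterministic; the finite horizon below is only a stand-in for $+\infty$:

```python
def hits(i, j, horizon=1000):
    """Whether the walk started at i is at j at some time 1 <= n <= horizon
    (the finite horizon stands in for +infinity)."""
    x = i
    for _ in range(horizon):
        x += 1            # the transition i -> i+1, taken with probability 1
        if x == j:
            return True
    return False

print(hits(0, 5), hits(0, 0), hits(3, 1))   # True False False
# P_i(T_j < +oo) = 1 if j >= i + 1 and 0 if j <= i; in particular
# P_i(T_i < +oo) = 0 for every i, so (2) holds everywhere, (1) nowhere.
```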
Finally (and once again), a congenial reference for this is the first chapter of the book Markov Chains by James Norris, which is freely available online.