Edit
(As Robert pointed out, what I was trying to prove is incorrect, so I now ask the right question here to avoid a duplicate question.)
For infinitely many independent Bernoulli trials with success probability $p$, define a random variable $N$ equal to the number of successful trials. Intuitively, if $p > 0$ then $\Pr \{ N < \infty \} = 0$; in other words, $N = \infty$ almost surely.
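A quick simulation supports this intuition (it is not a proof; the choice $p = 0.1$ and the trial counts below are arbitrary):

```python
import random

def success_count(m, p, seed=0):
    """Count the successes among the first m Bernoulli(p) trials."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(m))

# For any fixed p > 0 the running count keeps growing with m,
# matching the intuition that N = infinity on the infinite sequence.
for m in (10, 100, 1000, 10000):
    print(m, success_count(m, p=0.1))
```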
But I get stuck when I try to prove this mathematically:

$$\begin{aligned} \Pr \{ N < \infty \} & = \Pr \left\{ \bigcup_{n=1}^{\infty} [N \le n] \right\} \\ & = \lim_{n \rightarrow \infty} \Pr \{ N \le n \} \\ & = \lim_{n \rightarrow \infty} \sum_{i=0}^{n} b(i; \infty, p) \\ & = \sum_{i=0}^{\infty} b(i; \infty, p) \end{aligned}$$
I have no idea how to evaluate the last expression.
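One possible reading (I have not defined the symbol, so this is an assumption) is $b(i; \infty, p) := \lim_{m \rightarrow \infty} b(i; m, p)$. Under that reading each term vanishes for $0 < p < 1$, since the binomial coefficient grows only polynomially in $m$ while $(1-p)^{m-i}$ decays geometrically:

$$\begin{aligned} b(i; \infty, p) & = \lim_{m \rightarrow \infty} \binom{m}{i} p^i (1-p)^{m-i} \\ & \le \lim_{m \rightarrow \infty} \frac{m^i}{i!}\, p^i (1-p)^{m-i} = 0 \end{aligned}$$

Every term of the series would then be $0$, so the sum would be $0$, which is at least consistent with $\Pr \{ N < \infty \} = 0$.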
(Original Question)
For infinitely many independent Bernoulli trials with success probability $p$, define a random variable $N$ equal to the number of successful trials. Can we prove that $\Pr \{ N < \infty \} = 1$ as follows?
$$\begin{aligned} \Pr \{ N < \infty \} & = \Pr \left\{ \bigcup_{n=1}^{\infty} [N \le n] \right\} \\ & = \lim_{n \rightarrow \infty} \Pr \{ N \le n \} \\ & = \lim_{n \rightarrow \infty} \sum_{i=0}^{n} b(i; \infty, p) \\ & = \sum_{i=0}^{\infty} b(i; \infty, p) \\ & = \lim_{m \rightarrow \infty} \sum_{i=0}^{m} b(i; m, p) \\ & = \lim_{m \rightarrow \infty} [p + (1 - p)]^m \\ & = \lim_{m \rightarrow \infty} 1^m \\ & = 1 \end{aligned}$$
I know there must be some mistake in this derivation, because if $p = 1$ then $N$ must be infinite, so the conclusion can hold only when $p < 1$. Which step is wrong?
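For reference, here is a quick numerical check of the binomial identity used above, $\sum_{i=0}^{m} b(i; m, p) = [p + (1 - p)]^m = 1$ for every finite $m$, alongside the fact that each fixed term $b(i; m, p) \rightarrow 0$ as $m \rightarrow \infty$ when $0 < p < 1$ (the choices $p = 0.3$ and $i = 3$ are arbitrary):

```python
from math import comb

def b(i, m, p):
    """Binomial pmf: probability of exactly i successes in m trials."""
    return comb(m, i) * p**i * (1 - p) ** (m - i)

p = 0.3
for m in (10, 100, 1000):
    total = sum(b(i, m, p) for i in range(m + 1))
    print(m, round(total, 12), b(3, m, p))  # sum stays 1, b(3; m, p) -> 0
```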