In probability, one classic result is that the binomial distribution converges to the Poisson distribution as $n \to \infty$. That is, suppose $X \sim \text{Bin}(n, p)$ with moment-generating function $$M_{X}(t) = [pe^{t}+(1-p)]^{n}$$ and $Y \sim \text{Poi}(\lambda)$ with moment-generating function $$M_{Y}(t) = e^{\lambda(e^{t}-1)}\text{.}$$ Then, with $\lambda = np$, $\lim_{n \to \infty}M_X(t) = M_{Y}(t)$.
Showing this involves writing $$M_{X}(t) = \left(1+\dfrac{np(e^{t}-1)}{n} \right)^{n} = \left(1+\dfrac{\lambda(e^{t}-1)}{n} \right)^{n}\text{.}$$ The problem I have with this is the following: isn't the standard result that, for a constant $x$ - that is, one not depending on $n$ - $$\lim_{n \to \infty}\left(1+\dfrac{x}{n}\right)^{n} = e^{x}\text{?}$$ That result doesn't seem to apply here, since $\lambda = np$ depends on $n$. Or are there conditions imposed on $p$ or $\lambda$ (for instance, holding $\lambda$ fixed so that $p = \lambda/n$) that guarantee the limit?
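As a numerical sanity check (my own sketch, not part of the question), here is what happens under one common reading of the statement: $\lambda$ is held fixed and $p = \lambda/n$, so that $np = \lambda$ for every $n$. The binomial MGF evaluated at a fixed $t$ then approaches the Poisson MGF as $n$ grows:

```python
import math

# Hold lambda fixed and set p = lambda / n, then compare the
# binomial MGF [p*e^t + (1-p)]^n against the Poisson MGF e^{lambda(e^t - 1)}.
# The values of lam and t below are arbitrary choices for illustration.
lam = 2.0
t = 0.5

def binom_mgf(n, lam, t):
    """MGF of Bin(n, p) with p = lam / n, evaluated at t."""
    p = lam / n
    return (p * math.exp(t) + (1 - p)) ** n

# MGF of Poi(lam) evaluated at t.
poisson_mgf = math.exp(lam * (math.exp(t) - 1))

for n in [10, 100, 10_000]:
    print(f"n={n:>6}: binomial MGF = {binom_mgf(n, lam, t):.6f}, "
          f"Poisson MGF = {poisson_mgf:.6f}")
```

The printed gap shrinks as $n$ increases, which is consistent with $\left(1+\frac{\lambda(e^{t}-1)}{n}\right)^{n} \to e^{\lambda(e^{t}-1)}$ when the numerator $\lambda(e^{t}-1)$ is a constant not depending on $n$.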