When studying real sequences, I often come across exercises with the following structure: given two divergent sequences $(a_n)$ and $(b_n)$, determine whether $\left(\frac{a_n}{b_n}\right)_{n \in \mathbb{N}}$ converges and, if so, to what limit. I was wondering whether there are general conditions on the two sequences that help answer this question. For instance, it seems to me that if $(b_n)$ "grows faster" than $(a_n)$, then the ratio converges to $0$. But this is only intuition; is there a way of making it precise?
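To make the intuition concrete, here is the simplest example I can think of: take $a_n = n$ and $b_n = n^2$. Both sequences diverge, yet
$$\frac{a_n}{b_n} = \frac{n}{n^2} = \frac{1}{n} \longrightarrow 0,$$
so in this case "grows faster" clearly forces the ratio to $0$. What I am missing is a general criterion behind such examples.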
EDIT: Here's a more specific situation. Suppose $(a_n)$ and $(b_n)$ are given by the restriction to $\mathbb{N}$ of two differentiable functions $f, g$, respectively, i.e. $a_n = f(n)$ and $b_n = g(n)$ (say, $f(x) = c^x$ for some $c$ and $g(x) = x^k$ for some $k$). Is there any connection between the convergence of the sequence $(a_n/b_n)$ and the derivatives $f', g'$?
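For instance, taking $c = 2$ and $k = 3$ just to fix ideas, one has $f'(x) = 2^x \ln 2$ and $g'(x) = 3x^2$, and the ratio
$$\frac{a_n}{b_n} = \frac{2^n}{n^3}$$
diverges, whereas replacing $c = 2$ by $c = 1/2$ makes the ratio tend to $0$. I suspect the answer lies in comparing $f'$ and $g'$ somehow, but I cannot make this precise.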