How does $\lim\limits_{n \to \infty} \frac{f(n)}{g(n)} = 0$ imply that $f(n) \in \mathcal{O}(g(n))$? I'm having trouble understanding that.
Limit and asymptotics
2 Answers
By definition, $f(x)\in\mathcal{O}(g(x))$ if and only if there exists some $N>0$ such that $|f(x)| \le N|g(x)|$ for all $x$ greater than some $x_0$. Given the above limit, for any $\epsilon >0$ there is an $x_0$ such that $\left|\frac{f(x)}{g(x)}\right| < \epsilon$ for all $x > x_0$, i.e. $|f(x)| < \epsilon|g(x)|$. Taking $N=\epsilon$ (for any fixed $\epsilon$, say $\epsilon = 1$) gives the required constant, so $f(x)\in\mathcal{O}(g(x))$. In fact, we have $f(x)\in o(g(x))$, since the bound holds for every $\epsilon>0$.
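As a concrete sketch (an illustrative example, not part of the original answer): take $f(n) = 2n + 3$ and $g(n) = n^2$. Then
$$\lim_{n \to \infty} \frac{2n+3}{n^2} = \lim_{n \to \infty} \left(\frac{2}{n} + \frac{3}{n^2}\right) = 0,$$
so fixing, say, $\epsilon = 1$ gives an $x_0$ (here $x_0 = 3$ works) beyond which $2n + 3 \le n^2$, witnessing $2n+3 \in \mathcal{O}(n^2)$; and since the argument goes through for every $\epsilon > 0$, we also get $2n+3 \in o(n^2)$.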
Thanks! You are using the $\delta$-$\epsilon$ definition. I think other limits will give other bounds. – 2012-09-29
Hint: If the limit is $0$, then if $n$ is large enough, we have $\left|\frac{f(n)}{g(n)}\right|\lt 1,$ meaning that $|f(n)|\lt |g(n)|$.
If, as is common in Computer Science, you restrict "big $O$" statements to non-negative functions, you can drop the absolute value signs.
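For instance (an illustrative example, not part of the original answer): with $f(n) = \sqrt{n}$ and $g(n) = n$ we have $\lim\limits_{n \to \infty} \frac{\sqrt{n}}{n} = \lim\limits_{n \to \infty} \frac{1}{\sqrt{n}} = 0$, and indeed $\frac{\sqrt{n}}{n} \lt 1$ for all $n \ge 2$, i.e. $\sqrt{n} \lt n$. That is exactly the constant $N = 1$ from the definition, so $\sqrt{n} \in \mathcal{O}(n)$.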