
Say we have $f(n) \in O(g(n))$. By definition we need to show that $0 \le f(n) \le c\cdot g(n)$ for some $c>0$ and for all $n>n_0$.

This is usually not difficult when rational and polynomial functions are involved, but when the functions involve logarithms and square roots, I get confused and am not sure how to proceed. I know there must be a calculus-based proof, but I'm unaware of it.

Would it be enough to show that $c\cdot g(n) - f(n) \ge 0$?

And to do that, would we find the minimum point by differentiating (which we could use as $n_0$), and then show that the slope is always positive?

  • A function may have a positive slope on $I=[x_0,+\infty)$ without being positive at any point in $I$. (2012-05-13)

1 Answer


Your general idea is reasonable, but some details need work. Looking for a minimum of $cg-f$ will likely be a waste of time. The basic fact to use is:

If $h(x_0)>0$ and $h'(x)>0$ for all $x>x_0$, then $h(x)>0$ for all $x>x_0$.

A formal proof of the above uses the mean value theorem: $h(x)-h(x_0)=h'(\xi)(x-x_0)>0$ for some $\xi\in(x_0,x)$, since both factors on the right are positive.

As an example, let's show that $\ln x=O(x^\epsilon)$ for any (fixed) $\epsilon>0$. Introduce $h(x)=Cx^\epsilon-\ln x$ and calculate $h'(x)=C\epsilon x^{\epsilon-1}-x^{-1} = x^{-1}(C\epsilon x^\epsilon -1)$. It turns out it's not even necessary to take $x_0$ large: we can pick $x_0=1$ for simplicity. Choosing $C=1/\epsilon$ ensures that $h'(x)=x^{-1}(x^\epsilon-1)>0$ for all $x>1$. And since $h(1)=C>0$, we are done.
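As a numerical sanity check (not a proof), one can evaluate $h(x) = (1/\epsilon)\,x^\epsilon - \ln x$ at a few points and confirm it stays positive for $x \ge 1$; the function name `h` and the sampled values of `eps` and `x` below are just illustrative choices:

```python
import math

def h(x, eps):
    """h(x) = C*x**eps - ln(x) with C = 1/eps; should be positive for x >= 1."""
    c = 1.0 / eps
    return c * x**eps - math.log(x)

# Spot-check several epsilons over a wide range of x >= 1.
for eps in (0.5, 0.1, 0.01):
    for x in (1.0, 2.0, 10.0, 1e6, 1e12):
        assert h(x, eps) > 0, (x, eps)
```

Note that the smaller $\epsilon$ is, the larger the constant $C = 1/\epsilon$ must be, which matches the intuition that $x^\epsilon$ dominates $\ln x$ only eventually and more "slowly" for small $\epsilon$.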