7

I am having the hardest time with Big-O notation (I am using this Rosen book for the class I am in).

On the surface, Big-O reminds me of derivatives, rates of change, and so on; is that the right way to think about it? If $f(n)$ is $O(g(n))$, would the derivatives have any effect on this?

Essentially, is there a good resource for learning Big-O for the first time?

If I have misunderstood this forum and need to ask a specific question, then:

Prove that if $f(n)\le g(n)$ for all $n$, then $f(n) + g(n)$ is $O(g(n))$. (I'd rather gain an understanding of how to do this than simply be handed the answer.)

EDIT:

My attempt at the specific question above, using l'Hôpital:

$$\lim_{x\to\infty} \frac{f'(x)}{f'(x) + g'(x)} = \lim_{x\to\infty} \frac{1}{g'(x)}.$$

  • 2
    A cautionary example that may help in getting some intuition for Big-$O$ vs. derivatives (a runnable sketch of it follows this comment). Consider a function that (smoothly) connects the points $$(0,0), (1,1), (3,1), (4,4), (7,4), (8,8), \dots, (2^k, 2^k), (2^{k+1}-1, 2^k), (2^{k+1}, 2^{k+1}), \dots$$ In other words, the function stays flat for long stretches, then jumps up to meet $y=x$ again. Such a function satisfies $f(x)=O(x)$, but the derivative of $f$ can be arbitrarily large (even though the derivative of $g(x)=x$ isn't). In other words, Big-$O$ notation only says something about AVERAGE growth, not growth at all places. (2012-10-26)
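A minimal runnable sketch of the staircase from the comment above (my own illustration, following the $(2^k, 2^k)$ pattern and using straight ramps instead of smoothing):

```python
import math

# Piecewise-linear stand-in for the commenter's staircase (smoothing omitted):
# flat at 2^k on [2^k, 2^(k+1) - 1], then a straight ramp up to (2^(k+1), 2^(k+1)).
def staircase(x):
    k = int(math.floor(math.log2(x)))
    lo, hi = 2 ** k, 2 ** (k + 1)
    if x <= hi - 1:
        return lo                              # flat segment: f(x) = 2^k
    return lo + (x - (hi - 1)) * (hi - lo)     # ramp segment of slope 2^k

for x in (2.0, 3.0, 3.5, 4.0, 7.0, 7.5, 8.0):
    print(f"x = {x:4}   f(x) = {staircase(x)}")
```

Here $f(x) \le x$ everywhere (so $f(x) = O(x)$ with constant $1$), while the ramp between $2^{k+1}-1$ and $2^{k+1}$ has slope $2^k$, which grows without bound.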

5 Answers

0

Another easy way of looking at this type of problem is to notice that if $f(n) \leq g(n)$ (and both functions are nonnegative) then $$ f(n)+g(n) \leq g(n) +g(n)=2g(n)= O(g(n)). $$ In fact, it is easy to show that this sum is $\Theta(g(n))$.
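Spelled out against the definition (a sketch, assuming as above that $f$ and $g$ are nonnegative): the pair $C=2$, $n_0=1$ witnesses membership, since $$0 \le f(n)+g(n) \le g(n)+g(n) = 2g(n) \quad \text{for all } n,$$ and the matching lower bound $$f(n)+g(n) \ge g(n)$$ sandwiches the sum, $g(n) \le f(n)+g(n) \le 2g(n)$, which is exactly $f(n)+g(n) = \Theta(g(n))$.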

6

Big-Oh is not completely determined by derivatives. For example, $\sin(x^2)\in O(1)$, but the derivative $2x\cos(x^2)$ is unbounded.

The claim that $f(n)\le g(n)$ implies $f(n)\in O(g(n))$ is false: consider $g(n)=n$, $f(n)=-n^2$. But if you replace the condition with $|f(n)|\le g(n)$, then the claim is easy: that is almost the definition of $f(n)\in O(g(n))$. And of course, trivially, $g(n)\in O(g(n))$. Then, since Big-ohs are closed under addition, also $f(n)+g(n)\in O(g(n))$.
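A quick numeric check of the $\sin(x^2)$ example (my own sketch, not part of the answer): sampling at $x_k=\sqrt{2\pi k}$, where $\cos(x^2)=1$, shows the function pinned in $[-1,1]$ while the derivative grows like $2x$.

```python
import math

# Sample sin(x^2) and its derivative 2x*cos(x^2) at x_k = sqrt(2*pi*k),
# where x^2 = 2*pi*k, so cos(x^2) = 1 and the derivative equals 2x exactly.
for k in (1, 10, 100, 1000):
    x = math.sqrt(2 * math.pi * k)
    value = math.sin(x * x)            # always within [-1, 1]: sin(x^2) = O(1)
    slope = 2 * x * math.cos(x * x)    # about 2x here: unbounded as k grows
    print(f"x = {x:8.3f}   sin(x^2) = {value:+.4f}   d/dx sin(x^2) = {slope:9.3f}")
```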

  • 0
    I would upvote if I could. (2012-10-26)
  • 2
    @Jeff: I bet you can :) (2012-10-26)
2

I found this question (and the first answer) helpful: Big-O Notation and Asymptotics

For example, suppose $f(n)$ is $O(g(n))$. Then $f(n)$ may diverge (increase without bound). However, the ratio $f(n)/g(n)$ stays bounded, since $f(n) \le C\,g(n)$ for some constant $C$ beyond some number $N$.

So, really, it has more to do with the limiting behavior of the ratio of the two functions than with derivatives.
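For instance (a toy pair of functions of my own choosing, not from the linked question): with $f(n)=3n+10$ and $g(n)=n^2$, both diverge, but the ratio settles down to $0$, which in particular keeps it bounded, and that is the $O$ condition.

```python
# Both f and g diverge, but f(n)/g(n) -> 0, so f(n) = O(g(n)) (indeed o(g(n))).
def f(n):
    return 3 * n + 10   # grows without bound

def g(n):
    return n * n        # grows faster

for n in (10, 100, 1000, 10000):
    print(f"n = {n:6d}   f(n)/g(n) = {f(n) / g(n):.6f}")
```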

  • 0
    This is helpful, thank you. I found this: https://www.cs.unm.edu/~saia/362-s08/lec/lec2-2x2.pdf and l'Hôpital's rule comes up a lot. (2012-10-26)
  • 0
    @Jeff L'Hôpital's rule is very one-sided. If $f'(n)/g'(n)$ converges, then it informs you of the original limit. However, in many cases $f'(n)/g'(n)$ fails to converge, and then it tells you nothing about the original limit; a concrete example follows this comment. That's why comparing derivatives in this general setting is doomed to failure. (2012-10-27)
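To make that one-sidedness concrete (my example, not the commenter's): take $$f(x) = x + \sin x, \qquad g(x) = x.$$ Then $f(x)/g(x) = 1 + (\sin x)/x \to 1$, so $f(x) \in O(g(x))$ (indeed $\Theta(g(x))$), yet $$\frac{f'(x)}{g'(x)} = 1 + \cos x$$ oscillates forever and has no limit, so l'Hôpital's rule says nothing here.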
1

Concrete Mathematics is a good place to start with asymptotics and related ideas, particularly if you are in computer science (which your tags suggest).

  • 0
    Thank you for the reference. I found a PDF and will browse through this. (2012-10-26)
1

The definition of the derivative can be expressed using asymptotic notation.

We say $f$ has a derivative at $x$ if there exists $M$ such that

$$f(x+\epsilon) = f(x) + M\epsilon + o(\epsilon) \quad \text{as } \epsilon \to 0.$$

We denote this $M$ by $f'(x)$.

(edited as per Antonio's correction)
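As a concrete check of the definition (my illustration): take $f(x) = x^2$. Then $$f(x+\epsilon) = (x+\epsilon)^2 = x^2 + 2x\,\epsilon + \epsilon^2,$$ and since $\epsilon^2 = o(\epsilon)$ as $\epsilon \to 0$, the definition gives $M = f'(x) = 2x$, as expected.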

  • 0
    The error term isn't necessarily $O(\epsilon^2)$. For example, consider the derivative of $$f(x) = \int_0^x \sqrt{|t|}\,dt$$ at $x=0$. For this function we have $$f(0+\epsilon) = f(0) + f'(0)\epsilon + O(\epsilon^{3/2}).$$ For your answer, the correct expression would be $$f(x+\epsilon) = f(x) + M\epsilon + o(\epsilon).$$ (2018-05-10)
  • 0
    Yes, that's true. @Jeff Please note that in this context we take $\epsilon \to 0$. (2018-05-11)
  • 0
    Could've sworn I edited out that squared. Thanks. (2018-10-05)