I am having the hardest time with Big-O notation (I am using this Rosen book for the class I am in).
On the surface, Big-O reminds me of derivatives, rates of change, and so on; is that the right way to think about it? If $f(n)$ is $O(g(n))$, do the derivatives of $f$ and $g$ have any effect on this?
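For reference, here is the definition I am working from, as I understand it from Rosen: $f(x)$ is $O(g(x))$ if there are constants $C$ and $k$ (the "witnesses") such that
$$|f(x)| \le C|g(x)| \quad \text{whenever } x > k.$$
The reason I suspect derivatives come into it: if I am not mistaken, when $\lim_{x\to\infty} f(x)/g(x)$ exists and is finite, then $f(x)$ is $O(g(x))$, and l'Hôpital's rule is one way to evaluate such a limit.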
Essentially, is there a good resource for learning Big-O notation for the first time?
If I have misunderstood how this forum works and a specific question is needed, here is one:
Prove that if $f(n)\le g(n)$ for all $n$, then $f(n) + g(n)$ is $O(g(n))$. (I'd rather gain an understanding of how to approach this than simply be given the answer.)
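To make the statement concrete for myself (my own toy instance, not one from the book): with $f(n) = n$ and $g(n) = n^2$ we have $f(n) \le g(n)$ for all $n \ge 1$, and the claim is that $n + n^2$ is $O(n^2)$.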
EDIT:
My attempt at the specific question above, using l'Hôpital's rule:
$\lim_{x\to\infty} \frac{f'(x)}{f'(x) + g'(x)} = \lim_{x\to\infty} \frac{1}{g'(x)}.$
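Writing out how far I can get with just the definition (I am not sure this is the intended route, and I am assuming both functions are nonnegative, which I think the book intends): since $f(n) \le g(n)$ for all $n$,
$$f(n) + g(n) \le g(n) + g(n) = 2g(n),$$
so it looks like $C = 2$ (with any $k$) could serve as witnesses. What I do not see is where derivatives or l'Hôpital fit into this, or whether my limit computation above is even set up correctly.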