This is probably not the best-worded question, but here goes.
I've been reading a textbook, trying to get my head around time complexity.
I understand most of it, but this example has thrown me. Am I missing something, or is the textbook simply wrong?
It has the following table:
$g(n)$, where $f(n) = O(g(n))$
- $g(n) = 5 \to f(n) = O(1)$
- $g(n) = 20n + 17 \to f(n) = O(n)$
- $g(n) = 40n^2 + 3n - 10\to f(n) = O(n^2)$
- $g(n) = 10n^3 + 26n^2 + 220 \to f(n) = O(n^3)$
I understand the first two cases: if $g(n)$ is $5$, the time complexity is constant, and if $g(n)$ is $20n + 17$, the time complexity is $O(n)$, since constants are ignored.
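For instance, my working for the second case (assuming I have the definition of $O(n)$ right) is that for $n \ge 1$,
$$20n + 17 \le 20n + 17n = 37n,$$
so $f(n) \le c \cdot n$ with $c = 37$, which gives $f(n) = O(n)$.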
What I don't understand is why the last two cases are $O(n^2)$ and $O(n^3)$ respectively.
From my understanding of the maths, and ignoring constants, they should be $O(n^3)$ and $O(n^5)$ respectively, not what is in the textbook.
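(If it helps, I think what I'm effectively doing is combining the two leading terms, e.g. $40n^2 \cdot 3n$ giving $n^{2+1} = n^3$ and $10n^3 \cdot 26n^2$ giving $n^{3+2} = n^5$, but I suspect I'm misapplying a rule somewhere.)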
Some enlightenment would be great; I've searched all over for an answer. Thanks.