I know the definition of big $\mathcal O$ is:
$g(n) = \mathcal O(f(n))$ if and only if there exist constants $c$ and $n_0$ such that $|g(n)| \leq c\cdot|f(n)|$ for all $n\ge n_0$.
All I want to know is why this alternative definition is wrong:
$g(n) = \mathcal O(f(n))$ if and only if $|g(n)/f(n)|$ is bounded from above as $n \to \infty$.
I guess this is because $f(n)$ may approach $0$, and division by $0$ is undefined, but I would like to see a concrete example (I couldn't find one). Please tell me whether I'm on the right track.
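To make my guess concrete, here is a candidate pair I have been looking at (I am not sure it is the intended kind of counterexample):

$$f(n) = g(n) = \begin{cases} n & \text{if } n \text{ is even,} \\ 0 & \text{if } n \text{ is odd.} \end{cases}$$

Here $g(n) = \mathcal O(f(n))$ under the first definition (take $c = 1$, $n_0 = 1$, since $|g(n)| = |f(n)|$ for all $n$), but the ratio $g(n)/f(n)$ is undefined at every odd $n$, so the second definition cannot even be applied. Is this the sort of example that separates the two definitions?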
I hope you can help me.