In number theory, Big-O notation is defined as follows: $f=O(g)$ if there exist $C>0$ and $x_0$ such that $|f(x)| < Cg(x)$ for all $x>x_0$. Would proving $f=O(g)$ be enough to prove the inequality
$$g(x)>f(x)$$
for all $x>x_0$, for some $x_0$? (*)
If so: I've read about the "limit rule" for Big-O, which states that if $\lim_{x\to\infty}f(x)/g(x)=L$ exists, then
$$f=\left\{\begin{array}{ll}O(g)&\text{if }L=0,\\\Theta(g)&\text{if }0<L<\infty,\end{array}\right.$$
so if $L=0$ then $g(x)>f(x)$, assuming (*) is correct. My doubts about (*) stem from the fact that some definitions of Big-O use $|f(x)|\leq Cg(x)$, where $\leq$ replaces the $<$ in my original definition. But then surely we could just increase $C$ slightly to make the strict inequality work. Also, would we need to prove that $C=1$?
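For what it's worth, the $L=0$ case can be argued directly from the limit definition, without going through the Big-O constant $C$ at all (this sketch assumes $g(x)>0$ for all sufficiently large $x$, which is implicit whenever one writes $g(x)>f(x)$):

```latex
Suppose $\lim_{x\to\infty} f(x)/g(x) = 0$ and $g(x) > 0$ for all large $x$.
Taking $\varepsilon = 1$ in the definition of the limit gives an $x_0$ with
$$\left|\frac{f(x)}{g(x)}\right| < 1 \quad\text{for all } x > x_0,$$
and multiplying through by $g(x) > 0$ yields
$$|f(x)| < g(x), \quad\text{hence in particular}\quad f(x) < g(x),$$
for all $x > x_0$. No claim that $C = 1$ in the Big-O definition is needed,
since the limit statement is used directly.
```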