Let $f(n) = \omega(n)$.
Then for all constants $c > 0$ there exists a constant $n_0$ such that $f(n + 1) - f(n) > c$ for all $n > n_0$.
The intuition behind little-omega is that $f$ must grow asymptotically faster than the bounding function. So if the bounding function is linear, the difference between consecutive values, $f(n + 1) - f(n)$, should eventually exceed any fixed constant.
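As a quick numerical sanity check (my own example, not part of the claim itself), take $f(n) = n^2$, which is $\omega(n)$ since $n^2 / n \to \infty$. Here $f(n+1) - f(n) = 2n + 1$, which eventually exceeds any constant $c$:

```python
# Sanity check with f(n) = n^2, which is omega(n) since n^2 / n -> infinity.
def f(n):
    return n * n

def n0_for(c):
    """Smallest n0 such that f(n + 1) - f(n) > c for all n > n0.
    Since f(n + 1) - f(n) = 2n + 1 is increasing in n, it suffices
    to find the first n where the difference exceeds c."""
    n = 1
    while f(n + 1) - f(n) <= c:
        n += 1
    return n - 1  # the difference exceeds c for every n > n0

for c in [1, 10, 1000]:
    n0 = n0_for(c)
    # Spot-check the claim for a range of n beyond n0.
    assert all(f(n + 1) - f(n) > c for n in range(n0 + 1, n0 + 100))
    print(c, n0)
```

Of course, this only illustrates the statement for one particular $f$; it is not a proof.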
I understand conceptually why this is true, but is there a way to prove this mathematically?
(note: this is not homework)