I am being asked to order functions by growth rate, and I didn't think much of this case until I started considering it more carefully.
Let's postulate that we have some function
```java
public void doStuff(int N) {
    final int k = 1000000;
    int m = k / N;                   // integer division: m shrinks as N grows
    for (int i = 0; i < m; i++) {
        // Do stuff
    }
}
```
At first glance, it does seem to me that for larger and larger inputs $N$, this function will execute faster and faster. However, when it comes to growth of functions, aren't we concerned with what happens in the limit, i.e. $$\lim_{N\to\infty} \frac{1}{N} = 0\,?$$
The method I posted above does execute faster for larger inputs of $N$, but only up to a certain point: once $N > k$, the integer division gives $m = 0$, the loop body never runs, and the run time would be $0$.
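To check this intuition, I counted how many times the loop body runs for a few values of $N$ (a small sketch with a hypothetical helper `iterations`, using the same constant $k$ as above):

```java
public class GrowthCheck {
    static final int K = 1000000;   // same constant as k in doStuff

    // Counts how many times the loop body in doStuff would execute for a given N.
    static int iterations(int n) {
        int m = K / n;              // integer division, as in doStuff
        int count = 0;
        for (int i = 0; i < m; i++) {
            count++;                // stand-in for "Do stuff"
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(1));      // 1000000 iterations
        System.out.println(iterations(1000));   // 1000 iterations
        System.out.println(iterations(K + 1));  // 0 iterations: m = K / (K + 1) = 0
    }
}
```

So past $N = k$ the loop contributes nothing at all, even though the method call itself still happens.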
With a function whose running time grows like $\frac{1}{N}$, wouldn't that running time eventually be limited by that of a constant-time algorithm? It doesn't make sense to me that, for large enough values of $N$, a function could have a run time of $0$ while still doing something.