3

Hi, I have an algorithm for which I would like to determine the total runtime:

def foo(x):
    s = []
    if(len(x)%2 != 0):
        return False
    else:
        for i in range(len(x)//2):
            pass  # some more operations
        return True

The loop is in O(n/2), but what is the O() of the modulus operation? I guess it does not matter much for the overall runtime of the algorithm, but I would really like to know.

  • 0
    I was just pointing out that checking parity is dirt cheap. The general question about evaluating $x\%m$ efficiently for large $m$ is interesting, and much more difficult. – 2012-04-16

4 Answers

3

The run-time of the % operator would depend on the implementation. I don't expect python to have an optimized implementation for large values, so the runtime of m % n is probably something like O(log m log n).

It is somewhat more likely that python has special case code for n % 2, which would mean that specific calculation will actually run in constant time.

As for your specific usage of the % operator, if x is any ordinary sort of container at all, then len(x) is bounded above by, e.g., the amount of memory on your computer: the number simply can't get big enough for asymptotic analysis to matter.

Instead, len(x) is going to fit into a single precision integer, and python almost surely has special case code to compute the % of two single-precision integers by doing a single machine instruction (plus the checks to ensure you're in that case, data loads, et cetera), and this time is likely to be small in comparison with the actual time the interpreter spends working out what function to call and invoking it. So for all practical purposes, len(x) % y will run in a fixed constant amount of time.
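As a rough illustration of that last point (a sketch, not a proof; absolute timings depend on your machine and Python version), you can time the parity check from the question for lists of very different sizes and observe that the per-call cost does not grow with len(x):

import timeit

# Rough timing sketch: the cost of the parity check should stay roughly
# flat as the list grows, because len() is O(1) and % on a machine-sized
# integer is essentially one instruction plus interpreter overhead.
for size in (10, 10_000, 10_000_000):
    x = [0] * size
    t = timeit.timeit(lambda: len(x) % 2 != 0, number=1_000_000)
    print(f"len(x) = {size:>10,}: {t:.3f} s for 10^6 checks")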

2

The runtime of the modulus operation is O(1) (always the same, regardless of the "magnitude" of n). It is:

if(len(x)%2 != 0): #length is odd 

Which is really

if(len(x) & 1): #bitwise and 

I'm not sure if the python interpreter will optimize it exactly this way, but that is the most efficient way you could check whether a number is odd (because in binary, odd numbers always end in 1; e.g. 7 is 111 and 6 is 110).
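As a quick sanity check (my own sketch, not part of the original answer), the two forms agree because the last binary bit of a non-negative integer n is exactly n % 2:

# Sketch: for non-negative integers, n & 1 picks out the last binary bit,
# which is the same value as n % 2.
for n in range(8):
    assert (n % 2 != 0) == bool(n & 1)
    print(n, format(n, "b"), n % 2, n & 1)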

In the worst case, you're talking about a subtraction and a division, which is still O(1):

a % n = a - n * floor(a / n) 

eg

8 % 5 = 8 - 5 * floor(8/5) = 8 - 5 = 3 
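Here is a short sketch checking this identity in Python for a few positive operands (Python's % and math.floor agree with this definition for positive a and n):

import math

# Sketch: a % n == a - n * floor(a / n) for small positive operands.
for a, n in [(8, 5), (17, 3), (100, 7)]:
    assert a % n == a - n * math.floor(a / n)
    print(a, "%", n, "=", a % n)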

In any case, for machine-sized operands the runtime of modulus doesn't depend on the size of the inputs, so it's constant.

1

In if(len(x)%2 != 0) you're doing 4 operations: first you do len(x), which is $O(1)$; then %2, which is again $O(1)$; then !=0, again $O(1)$; and finally if(...), again $O(1)$.
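One way to see this as a short, fixed sequence of steps (a sketch; the exact opcode names vary between Python versions) is to disassemble the expression:

import dis

# Sketch: the check compiles to a handful of bytecode instructions
# (load x, call len, take % 2, compare != 0), each O(1) when the
# length is a machine-sized integer. Opcode names differ across versions.
dis.dis("len(x) % 2 != 0")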

Note that $O(\frac{n}{2}) = O(n)$.

The total running time of foo of course also depends on //some more operations. If you have a nested for loop in there your foo might be $O(n^2)$, for example.
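For instance, a hypothetical version of foo (not the asker's actual code) in which //some more operations hides a second loop over the input would be quadratic:

# Hypothetical sketch: if the loop body itself iterates over x, the total
# work is (n/2) * (n/2) steps, i.e. O(n^2) in len(x).
def foo_nested(x):
    if(len(x)%2 != 0):
        return False
    total = 0
    for i in range(len(x)//2):
        for j in range(len(x)//2):
            total += 1  # placeholder for some pairwise operation
    return True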

  • 0
    For mod 2 it seems particularly intuitive, but you make the same simplifying assumptions even if in reality it will be more than one op. For example, you assume the same for mod 3. – 2012-04-16
0

There are two meanings for "run time". Many people have pointed out that if we assume the numbers fit in one register, the mod operation is atomic and takes constant time.

If we want to look at arbitrary values of $x$, we need to look at the model of computation that is used. The standard way of measuring computational complexity for arbitrary inputs is with the Turing machine model using binary representation for numbers. In this model, the run time to compute $x \% 2$ is $O(\log x)$, which is also the number of bits in the binary representation of $x$. This is because $x \% 2$ is just the last bit of $x$ in binary notation.
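A tiny sketch of that last observation: the value of x % 2 matches the final character of x written in binary, even for numbers far larger than a machine word:

# Sketch: x % 2 is exactly the last bit of the binary representation of x.
for x in (5, 12, 1023, 2**100 + 1):
    assert format(x, "b")[-1] == str(x % 2)
    print(x % 2, format(x, "b")[-8:])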

Moreover, it is impossible to use this model and compute the function in time $f(x)$ for any $f$ with $\lim_{x \to \infty} f(x)/\log(x) < 1$. This is because with that sort of bound, there would be values of $x$ large enough that the machine could not even get to the last bit of $x$ before time expires. So, in this strong sense, the run time is "exactly" $\log x$ in the Turing machine model.

  • 0
    Yes, it would be easier if the numbers were stored in reverse order, with the least significant bit first. – 2017-03-14