3

Hi, I have an algorithm for which I would like to work out the total runtime:

def foo(x):
    s = []
    if len(x) % 2 != 0:
        return False
    else:
        for i in range(len(x) // 2):
            ...  # some more operations
        return True

The loop is in O(n/2), but what is the O() of the modulus operation? I guess it does not matter much for the overall runtime of the algorithm, but I would really like to know.

  • 1
    You seem to be only worried about the remainder on division by $2$. That costs nothing (OK, constant). (2012-04-16)
  • 0
    No, my algorithm shall only iterate over the first half of the input, so it should only accept inputs of even length. That's why I do the check with mod 2. (2012-04-16)
  • 0
    I was just pointing out that checking parity is dirt cheap. The general question about evaluating $x\%m$ efficiently for large $m$ is interesting, and much more difficult. (2012-04-16)

4 Answers

3

The run-time of the % operator depends on the implementation. I don't expect Python to have an optimized implementation for large values, so the runtime of m % n is probably something like $O(\log m \log n)$.

It is somewhat more likely that Python has special-case code for n % 2, which would mean that specific calculation actually runs in constant time.

As for your specific usage of the % operator, if x is any ordinary sort of container at all, then len(x) is bounded above by, e.g., the amount of memory on your computer: the number simply can't get big enough for asymptotic analysis to matter.

Instead, len(x) is going to fit into a single-precision integer, and Python almost surely has special-case code to compute the % of two single-precision integers with a single machine instruction (plus the checks to ensure you're in that case, data loads, et cetera), and this time is likely to be small in comparison with the time the interpreter spends working out what function to call and invoking it. So for all practical purposes, len(x) % y will run in a fixed constant amount of time.
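
If you want to convince yourself empirically, here is a rough sanity check (assuming CPython; this is not a rigorous benchmark, and the sizes and repeat count are arbitrary illustrative choices). len() is O(1) for lists, so any size dependence would have to come from % itself:

    import timeit

    # Time len(x) % 2 for lists of very different sizes; the timings
    # should come out essentially identical.
    for size in (10, 10_000, 10_000_000):
        x = [0] * size
        t = timeit.timeit(lambda: len(x) % 2, number=1_000_000)
        print(f"len(x) = {size:>10}: {t:.3f} s per 1e6 calls")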

2

The runtime of the modulus operation is O(1) (always the same, regardless of the "magnitude" of n). It is:

if(len(x)%2 != 0): #length is odd

Which is really

if(len(x) & 1): # bitwise AND with 1 tests the last bit

I'm not sure if the Python interpreter will optimize it exactly this way, but that is the most efficient way to implement a check for whether a number is odd (in binary, odd numbers always end in 1: e.g. 7 is 111, and 6 is 110).
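
Here is a minimal sketch of that equivalence in Python (the helper is_odd is just for illustration; both tests agree for Python integers because n & 1 extracts the last bit):

    def is_odd(n):
        return n & 1 == 1  # last binary bit set => odd

    # The bitwise test agrees with the modulus test:
    for n in range(-5, 6):
        assert is_odd(n) == (n % 2 != 0)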

In the worst case, you're talking about a subtract and a divide, which is still O(1):

$a \% n = a - n \lfloor a / n \rfloor$

e.g.

$8 \% 5 = 8 - 5 \lfloor 8/5 \rfloor = 8 - 5 = 3$

In any case, the runtime of modulus doesn't depend on the size of the inputs, so it's always constant.
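
This identity is easy to check in Python, where // is floor division (a quick sketch; the sample values are arbitrary):

    import math

    # a % n == a - n * floor(a / n); Python's // is floor division,
    # so the identity holds exactly for integers.
    for a, n in [(8, 5), (123, 7), (1_000_003, 97)]:
        assert a % n == a - n * (a // n)
        assert a % n == a - n * math.floor(a / n)
    print("identity verified")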

  • 1
    -1: he's asking for a measure of runtime, not a measure of the number of arbitrary-precision integer arithmetic operations involved. (2012-10-10)
1

In if(len(x)%2 != 0) you're doing 4 operations: first you do len(x), which is $O(1)$; then %2, which is again $O(1)$; then !=0, again $O(1)$; and finally if(...), again $O(1)$.

Note that $O(\frac{n}{2}) = O(n)$.

The total running time of foo of course also depends on # some more operations. If you have a nested for loop in there, your foo might be $O(n^2)$, for example.
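
For contrast, here is a hypothetical variant (the name foo_nested and its body are made up for illustration, not the question's actual code) where a second loop over the input makes the whole function quadratic:

    def foo_nested(x):
        count = 0
        for i in range(len(x) // 2):
            for j in range(len(x)):  # inner loop over all of x
                count += 1           # roughly (n/2) * n = O(n^2) steps
        return count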

  • 0
    Thanks Matt. I know about the cubic runtime and my loop body has no nested loop (that is why I didn't bother to write it out). I don't understand, however, why all the operations should be $O(1)$. I was expecting something depending on the size of the input. Note: we're talking bit-operations here. (2012-04-16)
  • 1
    @er4z0r Well, when you write "size of the input" here, you assume that it means the length of `x`. You usually make further assumptions (abstractions), such as that an integer fits into one register. Then the mod operation is one op, namely the logical `and` of the register containing the integer with the register looking like this: `0000...01`. One such integer operation counts as O(1) because it's a single op that does not depend on the size of the input. (2012-04-16)
  • 0
    For mod 2 it seems particularly intuitive, but you make the same simplifying assumptions even if in reality it will be more than one op. For example, you assume the same for mod 3. (2012-04-16)
0

There are two meanings for "run time". Many people have pointed out that if we assume the numbers fit in one register, the mod operation is atomic and takes constant time.

If we want to look at arbitrary values of $x$, we need to look at the model of computation that is used. The standard way of measuring computational complexity for arbitrary inputs is with the Turing machine model using binary representation for numbers. In this model, the run time to compute $x \% 2$ is $O(\log x)$, which is also the number of bits in the binary representation of $x$. This is because $x \% 2$ is just the last bit of $x$ in binary notation.

Moreover, it is impossible to use this model and compute the function in time $f(x)$ for any $f$ with $\lim_{x \to \infty} f(x)/\log(x) < 1$. This is because with that sort of bound, there would be values of $x$ large enough that the machine could not even get to the last bit of $x$ before time expires. So, in this strong sense, the run time is "exactly" $\log x$ in the Turing machine model.
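
Both facts used above are easy to see in Python (a small sketch for a nonnegative integer; the particular value of x is arbitrary):

    x = 1234567891011121314

    # x % 2 is just the last bit of the binary representation of x ...
    assert x % 2 == int(bin(x)[-1])

    # ... and the input length in the Turing machine model is the number
    # of bits, roughly log2(x).
    print(x.bit_length())  # 61 for this x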

  • 0
    Your answer contributes an important detail by looking at the underlying computational model. Just one question: why do we assume that looking up the last bit of a $b$-bit number requires $b$ steps? If we chose to store our input bits the other way round, wouldn't that only require one step of the Turing machine? (2017-03-14)
  • 0
    Yes, it would be easier if the numbers were stored in reverse order, with the least significant bit first. (2017-03-14)