Clearly the variable `sum` will be $\lfloor n/2 \rfloor$ after this code is run. But the OP's code will be less efficient than the code:

```c
int sum = n >> 1;
```
This is the right bit-shift of `n` by 1, i.e. it returns $\lfloor n/2 \rfloor$ for non-negative `n`. Note that many compilers would replace the OP's code with an equivalent of the above. The shift itself can be performed in $O(\log n)$ time (one pass over the bits of $n$).
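For concreteness, here is a minimal check of the equivalence (my own snippet, not part of the original answer; it assumes non-negative `n`, since for negative signed values `>>` and `/` may round differently):

```c
#include <stdio.h>

/* For non-negative n, n >> 1 and n / 2 both yield floor(n/2). */
int main(void) {
    for (int n = 0; n <= 8; n++)
        printf("n = %d   n >> 1 = %d   n / 2 = %d\n", n, n >> 1, n / 2);
    return 0;
}
```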
Assuming the OP's code is compiled naively, it performs the following steps:
- It computes the upper limit $\lfloor n/2 \rfloor$ (the compiler probably converts this to `n >> 1`). [time complexity: $O(\log n)$]
- It iterates from $i=1$ to $i=\lfloor n/2 \rfloor$ (requiring $\lfloor n/2 \rfloor-1$ "addition by one" operations on `i`).
- At each step, it increments `sum` (requiring $\lfloor n/2 \rfloor$ "addition by one" operations).
[In fact, the OP's code will compute $\lfloor n/2 \rfloor$ three times: once for `sum`, once for `i`, and once for `n/2`.]
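The OP's code itself is not reproduced here; based on the description above, it is presumably equivalent to the following sketch (variable names and the test value are my guesses):

```c
#include <stdio.h>

/* Presumed shape of the OP's code, reconstructed from the description
   above: sum is incremented once per iteration, so it ends up at floor(n/2). */
int main(void) {
    int n = 17;
    int sum = 0;
    for (int i = 1; i <= n / 2; i++)  /* i runs up to floor(n/2) */
        sum++;                        /* "addition by one", floor(n/2) times */
    printf("sum = %d, n >> 1 = %d\n", sum, n >> 1);  /* both print 8 */
    return 0;
}
```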
Now consider the operation "addition by one". Of the numbers $1,2,\ldots,\lfloor n/2 \rfloor$:
- There are at most $n/4$ numbers whose binary representation ends in 0 (for which "addition by one" takes 1 step),
- There are at most $n/8$ numbers whose binary representation ends in 01 (for which "addition by one" takes 2 steps),
- There are at most $n/16$ numbers whose binary representation ends in 011 (for which "addition by one" takes 3 steps),
- And so on; a short code sketch of this cost model follows this list.
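At the bit level, incrementing a number flips its trailing run of 1-bits to 0 and then one 0-bit to 1, so the step count is (number of trailing 1s) + 1. A small C illustration of this cost model (my own sketch, not from the original answer):

```c
#include <stdio.h>

/* Bit-level cost of "addition by one": incrementing x flips each trailing
   1-bit to 0 and then one 0-bit to 1, so the number of bit flips is
   (count of trailing 1-bits) + 1. */
int increment_steps(unsigned x) {
    int steps = 1;           /* the final 0 -> 1 flip */
    while (x & 1) {          /* one extra flip per trailing 1-bit */
        steps++;
        x >>= 1;
    }
    return steps;
}

int main(void) {
    /* ends in 0 -> 1 step; ends in 01 -> 2 steps; ends in 011 -> 3 steps */
    printf("%d %d %d\n", increment_steps(4), increment_steps(5), increment_steps(3));
    return 0;
}
```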
Hence, we can expect the average time complexity of the operation "addition by one" to be at most \[\frac{1}{\lfloor n/2 \rfloor} \sum_{k \geq 1} k \cdot \frac{n}{2^{k+1}} = \frac{n}{\lfloor n/2 \rfloor} \approx 2 = O(1),\] where we used $\sum_{k \geq 1} k/2^{k+1} = 1$.
So the loop performs $\lfloor n/2 \rfloor$ increments at an average cost of $O(1)$ each, and we can conclude that the average time complexity of the OP's code is $O(n)$.
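As an empirical sanity check (my own addition; it uses the GCC/Clang builtin `__builtin_popcount`, so it is not portable C), the program below sums the bit-flip cost of every increment up to $\lfloor n/2 \rfloor$ and prints the average, which settles near 2:

```c
#include <stdio.h>

/* The number of bit flips when going from x to x+1 equals
   popcount(x ^ (x + 1)) = (trailing 1-bits of x) + 1.
   Summing over x = 1 .. floor(n/2) shows the average stays O(1). */
int main(void) {
    for (unsigned n = 1u << 10; n <= 1u << 20; n <<= 5) {
        unsigned m = n / 2;
        long long total = 0;
        for (unsigned x = 1; x <= m; x++)
            total += __builtin_popcount(x ^ (x + 1));
        printf("n = %8u  total steps = %9lld  average = %.4f\n",
               n, total, (double)total / m);
    }
    return 0;
}
```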