
I am going through some CS basics and the document says in places that:

We need the run time to be at least logarithmic.

What does that mean?

By definition, the logarithm means:

The logarithm of a number is the exponent by which another fixed value, the base, has to be raised to produce that number.
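For example, $\log_{2}{8}=3$, since $2^{3}=8$.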

What does it truly mean when we say we want a run time to be logarithmic?

  • At least logarithmic means $f(n)\ge C\log n$ for some $C$. (2012-07-07)
  • This is also written in CS literature as $f(n)\in\mathcal{O}(\log(n))$. (2012-07-07)
  • anon's and Shaktal's comments disagree; $f(n) = O(\log n)$ would mean $f(n) \le C\log n$ for some $C$ (and all sufficiently large $n$). But I think Shaktal's is the more likely interpretation; it's rare to want an algorithm to be *slower* than something, but it's common to want it to be "at least *as good as*" something. (2012-07-07)
  • I would assume "at least" logarithmic means that a function grows equal to or faster than logarithms, but the others are right that the opposite interpretation makes more sense in the context of computer science (where generally we *want* programs to be as efficient as possible). Weird. (2012-07-07)

1 Answer


Your question has two answers (although one is considerably more likely).

Firstly, the author could mean that the running time must be at least as slow as some $C\log{n}$ (as pointed out by anon in the comments), i.e.:

$$f(n)\ge C\log{n}\implies f(n)\in\Omega(\log{n})$$

This means that, on an input of size $n$, the algorithm will complete no faster than $C\log{n}$ steps, for some constant $C\in\mathbb{R}^{+}$.
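For instance (an illustrative example, not from the original comments), $f(n)=n$ satisfies $f(n)\ge\log_{2}{n}$ for all $n\ge 1$, so a linear-time algorithm is "at least logarithmic" in this reading: $f(n)\in\Omega(\log{n})$ with $C=1$.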

However, since (as you have stated) this is a piece of CS literature, it is more likely that the author means the algorithm completes at least as fast as some $C\log{n}$, i.e.:

$$f(n)\le C\log{n} \implies f(n)\in\mathcal{O}(\log{n})$$

This means that, on an input of size $n$ as before, the algorithm will complete no slower than $C\log{n}$ steps, for some constant $C\in\mathbb{R}^{+}$.
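As a concrete illustration of this second reading (a minimal Python sketch, not part of the original answer; the function name is chosen only for illustration): binary search over a sorted list halves the remaining range on every comparison, so it performs at most roughly $\log_{2}{n}$ comparisons, giving a running time in $\mathcal{O}(\log{n})$.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the remaining search range, so the loop
    runs at most about log2(n) + 1 times, i.e. O(log n).
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


# Searching 1,000,000 sorted numbers takes at most ~20 comparisons,
# because 2^20 > 1,000,000.
print(binary_search(list(range(1_000_000)), 765_432))  # -> 765432
```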

Bear in mind that we often use asymptotic notation (e.g. $f(n)\in\Omega(\log{n})$, $f(n)\in\mathcal{O}(n^{2})$) in computer science to describe the growth of an algorithm's running time. Also note that $f(n)\in\mathcal{O}(n)$ can also be written as $f(n)=\mathcal{O}(n)$ (indeed this notation is more common in CS literature).

Hope this helps clear things up!