What would be a common algorithm with an order of Θ(2^n)? By "order" I mean time complexity. I was thinking of exponential growth in general, but are there any examples that are more computer-science oriented?
Common algorithm with an order of Θ(2^n)
algorithms
computer-science
examples-counterexamples
-
Testing the satisfiability of a Boolean formula by enumerating all possible assignments? – 2011-04-04
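That SAT brute force is a textbook Θ(2^n) pattern. A minimal sketch in Python (the function name and the encoding of a formula as a callable are my own choices, not from any library):

```python
from itertools import product

def brute_force_sat(formula, n):
    """Decide satisfiability of an n-variable Boolean formula by
    trying all 2^n truth assignments: Theta(2^n) evaluations."""
    for assignment in product([False, True], repeat=n):
        if formula(assignment):
            return assignment  # a satisfying assignment
    return None                # unsatisfiable

# Example formula: (x0 or x1) and (not x0 or x2)
f = lambda a: (a[0] or a[1]) and (not a[0] or a[2])
```

On an unsatisfiable input the loop always performs all 2^n evaluations, which is what makes the worst case Θ(2^n).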
-
Wouldn't a simple brute force comparison algorithm do the job? Say for example the problem would be to find the longest path on a weighted binary tree and you do so by checking out each path (starting from some fixed root vertex) of the tree, you would easily get $2^n$ operations. – 2011-04-04
-
@David: That's if $n$ is the depth of the tree -- but one would usually take $n$ to be the number of nodes -- then if you determine the weights in a single traversal (which is easy) it takes $O(n)$, and even if you do it inefficiently and add up the weights separately for each path, it would still take only $O(n\log n)$. – 2011-04-04
-
Do you mean to ask for algorithms which run in $\Theta(2^n)$? (Any polynomial algorithm is $O(2^n)$.) Are you looking for best-known algorithms that run in $\Theta(2^n)$, or does any problem with a natural $\Theta(2^n)$ algorithm suffice? Do you care about additional polynomial factors? For example, the natural dynamic programming solution to the Traveling Salesman Problem runs in $\Theta(2^n \cdot n^2)$. – 2011-04-04
-
Any algorithm that enumerates the subsets of a set to find a subset with a certain property. – 2011-04-04
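Subset enumeration is perhaps the cleanest Θ(2^n) pattern. A sketch (the function name and the bitmask encoding of subsets are my own choices), here checking which subsets hit a target sum:

```python
def subsets_with_property(items, has_property):
    """Enumerate all 2^n subsets of items and keep those that
    satisfy has_property -- a Theta(2^n) scan of the power set."""
    n = len(items)
    found = []
    for mask in range(1 << n):  # each mask in [0, 2^n) encodes one subset
        subset = [items[i] for i in range(n) if mask >> i & 1]
        if has_property(subset):
            found.append(subset)
    return found
```

For instance, `subsets_with_property([1, 2, 3, 4], lambda s: sum(s) == 5)` scans all 16 subsets and returns the two that sum to 5.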
-
@Zach: Anything with a natural $\Theta(2^n)$ suffices, but I was looking for a common one so I could view some pseudo-code to see how it functions methodically. The TSP is a good example because it does increase exponentially with the number of cities visited, but again, I would like to use an algorithm that has been documented in code. – 2011-04-04
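The $\Theta(2^n \cdot n^2)$ dynamic program mentioned above is the Held–Karp algorithm, and it is short enough to sketch. Here is a minimal Python version; the dictionary-based table is one possible encoding, not the canonical presentation:

```python
from itertools import combinations

def held_karp(dist):
    """Cost of the shortest traveling-salesman tour via Held-Karp
    dynamic programming: Theta(2^n * n^2) time on n cities."""
    n = len(dist)
    # best[(S, j)]: min cost of a path from city 0 that visits
    # exactly the cities in frozenset S and ends at j (0 not in S).
    best = {}
    for j in range(1, n):
        best[(frozenset([j]), j)] = dist[0][j]
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                best[(S, j)] = min(best[(S - {j}, k)] + dist[k][j]
                                   for k in S - {j})
    full = frozenset(range(1, n))
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))
```

The table has $\Theta(2^n \cdot n)$ entries and each is filled by a minimum over at most $n$ predecessors, which gives the $\Theta(2^n \cdot n^2)$ bound.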
-
@Trevor: Your question is ambiguous. You say $O(2^n)$. Even printf("hello world") is $O(2^n)$. Once someone points out a mistake (see Zach's comment), why don't you edit the question to correct it? Also, you are missing the model of computation. I presume you mean the word RAM model. – 2011-04-04
-
@Moron see edit. – 2011-04-04
-
@Trevor: Insufficient. It still means the same thing. – 2011-04-04
-
@Moron Then I don't understand what your definition of order is. If you look at the insertion sort algorithm, its worst case is of order $n^2$: it takes roughly $n^2$ executions of the basic step to finish for an input of size $n$. I need an algorithm that uses $2^n$. – 2011-04-04
-
@Trevor: There is a difference between Big-Oh, Omega and Theta. You are using Big-Oh, where I presume you mean Theta. – 2011-04-04
-
@Moron If a function is in both Big O and Omega, isn't it also in Theta? Big O is the most commonly used asymptotic notation for comparing functions, although in many cases Big O may be replaced with Big Theta $\Theta$ for asymptotically tighter bounds. – 2011-04-04
-
@Trevor: I would say Big-Oh is most commonly _misused_ in algorithm analysis. Take your question for instance. Were you looking for O(n) time algorithms? Probably not, but O(n) algorithms are also O(2^n)... – 2011-04-04
-
Fair enough. You win. – 2011-04-04
-
@Trevor: In the case of describing the running time of insertion sort, Big-Oh makes sense. Saying that insertion sort takes $O(n^2)$ time guarantees that the time taken is at most a constant factor times $n^2$. Note that sometimes insertion sort performs better than this; for sorted inputs, it runs in $\Theta(n)$. – 2011-04-04
-
In algorithm analysis, Big-Theta is usually reserved for instances where we know the algorithm is optimal. For example, addition of two $n$-digit numbers must take $\Omega(n)$ because it takes $\Omega(n)$ just to read and write the numbers. Grade-school addition of two $n$-digit numbers gives us an $O(n)$ algorithm, so addition can be performed in $\Theta(n)$. The same applies for comparison-based sorting algorithms; we can prove an $\Omega(n\log{n})$ lower bound, and so we say that merge sort runs in $\Theta(n\log{n})$. – 2011-04-04
-
Cool, thanks, Zach. My algorithms class is a pain because the book is terrible and all the teacher does is read directly from the book... – 2011-04-04
-
I found Anany Levitin's book "Introduction to the Design and Analysis of Algorithms" very good at covering both subjects. http://www.aw-bc.com/info/levitin/ – 2011-04-05
-
@Trevor: I have flagged your comment. In any case, I am confident I know what you wanted. It seemed like you had some misconceptions about Big-Oh (and I think you still do, based on your not editing the O to be Theta) and I was just trying to help you clarify it. I will try once more: Big-Oh = upper bound. Omega = lower bound. Theta = upper and lower. Good luck. – 2011-04-06
-
Not much of an example, but a naïve implementation of the simplex method for linear programming will require exponential time if given the Klee–Minty problem. – 2011-04-07
2 Answers
The naive algorithm for $3$-coloring takes time $2^n$, though this is not optimal (Wikipedia mentions a $1.3289^n$ algorithm). Lots of other NP-complete problems have $2^n$ algorithms (or in general $c^n$), and it is conjectured that some of them in fact require time $c^n$; if this is true, then randomness doesn't help for efficient computation (BPP = P).
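For illustration, the fully naive brute force tries every assignment of colors to vertices, which is $\Theta(c^n)$ evaluations in the worst case for $c$ colors ($3^n$ for $3$-coloring). A sketch, with the function name and edge-list encoding my own:

```python
from itertools import product

def is_colorable(edges, n, c=3):
    """Brute-force c-coloring of a graph on vertices 0..n-1:
    try all c^n color assignments (Theta(c^n) worst case) and
    accept one with no monochromatic edge."""
    for coloring in product(range(c), repeat=n):
        if all(coloring[u] != coloring[v] for u, v in edges):
            return True
    return False
```

A triangle is $3$-colorable while the complete graph $K_4$ is not, and on a non-colorable graph the loop necessarily examines all $c^n$ assignments.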
-
Just curious: Your last sentence means that if the exponential time hypothesis is true, then P=BPP? – 2011-04-04
-
See http://www.math.ias.edu/~avi/PUBLICATIONS/MYPAPERS/IW97/proc.pdf. – 2011-04-04
According to Wikipedia, solving the traveling salesman problem using dynamic programming takes exponential time. They write $2^{O(n)}$; I'm not sure whether that is the same as $O(2^n)$. Am I right in thinking that it is?
-
No. Try $2^{3n}$. – 2011-04-04
-
@Didier Piau: right, thanks! – 2011-04-04