
I am watching the Introduction to Algorithms video, and the professor talks about finding a Fibonacci number in $\Theta(n)$ time around the 23:30 mark.

  1. How is it $\Theta(n)$ time?

  2. Which case of the master theorem does it fall under?

  3. What would the recursion tree look like if we try to use a divide and conquer approach?

P.S.: I know pretty much what the $\Theta(n)$ notation means, so please write your answer accordingly.


2 Answers


What he's doing is using a simple example of what's known as dynamic programming, where one computes a function $f(n)$ by working up from the bottom to get to the desired result. Specifically, to compute $fib(n)$, the $n$-th Fibonacci number, he's doing this:

    fib(n) =
        if (n <= 1)
            return n
        else
            old = 0, prior = 1, next
            for i = 2 to n
                next = old + prior
                old = prior
                prior = next
            return next
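The pseudocode above translates directly into Python (my choice of language here, since the answer doesn't specify one); the simultaneous assignment does the "shift everything to the left" step in one line:

```python
def fib(n):
    """Iteratively compute the n-th Fibonacci number in Theta(n) time."""
    if n <= 1:
        return n
    old, prior = 0, 1
    for _ in range(2, n + 1):
        # next value is the sum of the previous two; then shift left
        old, prior = prior, old + prior
    return prior

print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Note that only the two most recent values are kept, so the algorithm uses constant extra space in addition to its linear time.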

To see this in action, we compute $fib(4)$, which we know is $3$:

    i  old  prior  next
    2   0     1      1    (compute next = old + prior)
    2   1     1      1    (shift everything to the left)
    3   1     1      2    (compute next again)
    3   1     2      2    (shift again)
    4   1     2      3    (and so on...)
    4   2     3      3

How long does this algorithm take? No need for the Master Theorem here: we have an algorithm that consists, essentially, of a loop, so assuming you can add in constant time, this will have running time $T(n) = \Theta(n)$.

(Actually, this won't work at all on real machines, since $fib(n)$ grows so fast that the numbers will quickly exceed the size of an integer. For example, $fib(2500)$ is 523 decimal digits long, so we'd need to implement some big integer package to do the addition and assignment statements, increasing the run time of this algorithm.)
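Python happens to be a language where this concern is already handled: its built-in integers are arbitrary precision, so the iterative algorithm works for large $n$ out of the box (though, as noted above, each addition is then no longer constant-time). A quick sketch checking the 523-digit claim:

```python
def fib(n):
    """Iterative Fibonacci; Python ints grow as needed, so no overflow."""
    if n <= 1:
        return n
    old, prior = 0, 1
    for _ in range(2, n + 1):
        old, prior = prior, old + prior
    return prior

# fib(2500) is far too large for a fixed-size integer type
print(len(str(fib(2500))))  # 523
```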


The recursion tree for the original (terrible) recursive algorithm looks like this, when computing $fib(4)$:

                    fib(4)
                   /      \
               fib(3)    fib(2)
               /    \     /    \
           fib(2) fib(1) fib(1) fib(0)
           /    \
       fib(1) fib(0)

For example, the first call, to $fib(4)$, requires two recursive calls, to $fib(3)$ and $fib(2)$. Notice how inefficient this is: just to compute $fib(4)$ we needed nine function calls. In fact, this is a classic example of when not to use recursion, especially since there are vastly more efficient ways to solve this problem.
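The nine-call count is easy to confirm by instrumenting the naive recursion with a counter (this `fib_naive` sketch and its counter argument are mine, not from the lecture):

```python
def fib_naive(n, counter):
    """Naive recursive Fibonacci, counting every call made."""
    counter[0] += 1  # count this call
    if n <= 1:
        return n
    return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

calls = [0]
print(fib_naive(4, calls), calls[0])  # 3 9
```

The call count grows exponentially in $n$ (it satisfies the same recurrence as the Fibonacci numbers themselves, plus one), which is exactly why the iterative $\Theta(n)$ version above is preferred.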


I have to say I know very little about algorithms, but for your second question I did a quick search to see if that theorem even applies to the Fibonacci sequence.

According to this link:

  • Master Theorem does NOT apply to Factorial or recursive Fibonacci, because their runtimes do not satisfy the appropriate type of recurrence.

(Added later, after the link died)

Here is the relevant snippet from the link, which was a set of notes in a computer science course. It can be seen with this Wayback link

Since then, it looks like the notes have been relocated to this location and are more grammatical now. I think the new version of what I was citing starts around page 81 of these notes. It's a little disturbing though that I can't find "Master theorem" referenced by name via a simple text search.
