3

Sorry if this question has been asked before, but I am trying to figure this out. I am using the CLRS text, Introduction to Algorithms. In the Recurrences chapter, in the Master Theorem section, the following example is given with the solution:

$$T(n) = 3T\left(\frac{n}4\right) + n\log n $$

Here,

$$ a = 3, b =4, f(n) = n\log n $$

Using the Master method, it says

$$f(n) = \Omega(n^{\log_4 3 + \epsilon}) $$

I don't understand how $f(n)$ was "assumed" to have this lower bound ($\Omega$). I would be extremely grateful to anyone who points out why this is so.

If more details are needed to understand the problem, please let me know. I will add more details to this question.

TIA,

Jake Clawson

EDIT: I thank all those who have responded below. I have replied to each of you individually. But my question still stands. The book directly goes on to say that $f(n)$ is $\Omega(n^{\log_4 3+\epsilon})$. I don't understand this, and it is the key to applying the Master Theorem, because once the behaviour of $f(n)$ is known, we know which case to apply.

  • 1
    Shouldn't your last displayed equation be about $T(n)$ rather than $f(n)$? (2012-02-12)
  • 0
    Does the text have a proof of the Master Theorem? (2012-02-12)
  • 0
    @HenningMakholm: $f(n)$ is correct. $T(n)$ for this problem is found (in the solution for this example) to be $\Theta(n\lg n)$. (2012-02-14)
  • 0
    @Aryabhata: Yes, the text does have a proof of the Master Theorem. But the text says that the proof (or an understanding thereof) is not needed to apply the Master Theorem in this case. (2012-02-14)
  • 0
    @JakeClawson: I have added an answer. Let me know if that helps. (2012-02-14)

4 Answers

2

So you want to understand why $n\log n=\Omega(n^{\log_4 3+\epsilon})$ for some $\epsilon>0$.

The easiest way is to look at the limit

$$\lim_{n\to\infty}\frac{n\log n}{n^c}, \qquad \text{where } c=\log_4 3+\epsilon.$$

By L'Hôpital's rule (taking $\log$ to be the natural logarithm), this equals

$$\lim_{n\to\infty}\frac{\log n+1}{c\,n^{c-1}}.$$

Since $\log_4 3\lt 1$, there exists $\epsilon\gt 0$ such that $c=\log_4 3+\epsilon\lt 1$. Then $n^{c-1}\to 0$ while $\log n+1\to\infty$, so the limit tends to $\infty$, and hence $n\log n=\Omega(n^{\log_4 3+\epsilon})$.
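
For what it's worth, the same conclusion also follows without L'Hôpital by simplifying the ratio directly (a small extra step of mine, with the same $c=\log_4 3+\epsilon$ and any $\epsilon$ small enough that $c\lt 1$):

$$\frac{n\log n}{n^{c}} = n^{1-c}\,\log n \;\longrightarrow\; \infty \quad (n\to\infty), \qquad\text{since } 1-c \gt 0,$$

so $n\log n$ grows strictly faster than $n^{c}$, which is even stronger than the $\Omega$ bound (namely $n\log n=\omega(n^{c})$).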

1

Well, actually, the fact you state is not quite so interesting.

  1. If $G(n)$ satisfies $G(n) = 3G(n/4)$, then clearly $G(n) < T(n)$ (for the same boundary condition). As you can tell from the Master Theorem, or can easily check yourself, $G(n) = A \, n^{\log_4 3} + B$ (see the short unrolling after this list).

  2. $T(n)$ is surely $> n\log n$, since the recurrence adds the $n\log n$ term to nonnegative subproblem costs. Because $\log_4 3 < 1$, $n^{\log_4 3} = o(n) = o(n\log n)$. So $T(n)$ is $\omega(n) = \omega(n^{\log_4 3+\epsilon})$ for some small positive $\epsilon$.
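
To make item 1 concrete, here is one way to unroll $G(n)=3G(n/4)$ (a sketch of my own, assuming $n$ is an exact power of $4$, so the constant $B$ does not appear):

$$G(n) = 3\,G\!\left(\frac{n}{4}\right) = 3^{2}\,G\!\left(\frac{n}{4^{2}}\right) = \cdots = 3^{\log_4 n}\,G(1) = n^{\log_4 3}\,G(1),$$

using the identity $3^{\log_4 n} = n^{\log_4 3}$. This is where the $n^{\log_4 3}$ term comes from; it is the benchmark against which the Master Theorem compares $f(n)$.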

  • 0
    Anybody who knows how to make the maths nicer, please edit. (2012-02-12)
  • 0
    @jpaleck: (1) I understand $G(n)$. But I don't understand the need for $G(n) = An^{\log_4 3} + B$. What does this accomplish? (2) When you say "surely", do you mean that by looking at $f(n) = n\log n$, and the fact that the recurrence input is divided by four each time $(n/4)$, you can conclude that $T(n) > n\log n$? Please verify if this is your thinking; if so, I don't dispute it. Your conclusion about $o$ and $\omega$ is probably correct, but the Master Theorem as stated in the book only gives $\Theta$ solutions. (2012-02-14)
  • 0
    @jpaleck: My question is concerning why the book "assumed" the $\Omega$ bound right away. Please enlighten me on this. (2012-02-14)
0

Master's theorem deals with recurrences of the form

$$T(n) = aT(n/b) + f(n)$$

The theorem is divided into three cases, depending on the behaviour of $f(n)$ "relative" to $n^{\log_b a}$:

Case 1) $f(n) = \mathcal{O}(n^{\log_b a - \epsilon})$

Case 2) $f(n) = \Theta(n^{\log_b a})$

Case 3) $f(n) = \Omega(n^{\log_b a + \epsilon})$, together with the regularity condition $af(n/b) \le cf(n)$ for some constant $c \lt 1$

In your case, $f(n) = n\log n$, $a = 3$, $b=4$.

In order to apply the Master Theorem, we need $f(n)$ to fit into one of those cases.

Now $\log_4 3 \lt \log_4 4 = 1$, so $n^{\log_4 3}$ grows strictly slower than $n$, and hence strictly slower than $n\log n$.

Thus we can eliminate Cases 1) and 2): $f(n) = n\log n$ is neither $\mathcal{O}(n^{\log_4 3 - \epsilon})$ nor $\Theta(n^{\log_4 3})$.

What remains is Case 3), and we can apply it by picking $\epsilon = 1 - \log_4 3$, because $n \log n = \Omega(n) = \Omega(n^{\log_4 3 + (1- \log_4 3)})$. (The regularity condition is verified below.)
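
For completeness, here is a quick check of the regularity condition $af(n/b) \le c f(n)$ that Case 3 also requires (a short verification I am adding; it works for $\log$ in any base greater than $1$):

$$3\,f\!\left(\frac{n}{4}\right) = 3\cdot\frac{n}{4}\log\frac{n}{4} = \frac{3}{4}\,n\left(\log n - \log 4\right) \le \frac{3}{4}\,n\log n = \frac{3}{4}\,f(n) \quad \text{for all sufficiently large } n,$$

so the condition holds with $c = 3/4$, and Case 3 gives $T(n) = \Theta(n\log n)$, matching the book's answer.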

-2

Take $c=\lg\left(n_{0}\right)$ (with $n_{0}\ge 2$, so that $c>0$) in the following definition of the $\Omega$-notation:

$$ \Omega\left(g(n)\right)=\left\{ f(n) : \exists\, c>0 \text{ and } n_{0}>0 \text{ such that } 0\leq c\,g(n)\leq f(n) \text{ for all } n\geq n_{0}\right\} $$

which appears on page 48 of the third edition of the book. Thus, with $g\left(n\right)=n$, for $n\geq n_{0}$ we have that

$$ n\lg\left(n_{0}\right)\leq n\lg\left(n\right) $$

This shows that $f\left(n\right)=\Omega\left(n\right)$, which answers the question once we relate $\Omega(n)$ to the exponent in the question, as spelled out below.
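
To connect this to the bound actually asked about, here is the remaining small step (my addition, reusing the choice of $\epsilon$ from the previous answer):

$$\log_4 3 \lt 1 \quad\Longrightarrow\quad \epsilon := 1-\log_4 3 \gt 0 \quad\text{and}\quad n^{\log_4 3+\epsilon} = n^1 = n,$$

so $f(n)=\Omega(n)$ is exactly the statement $f(n)=\Omega\left(n^{\log_4 3+\epsilon}\right)$ for this particular $\epsilon$.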

  • 0
    Where does $c$ come from in the Master Theorem? Are you implying the use of induction in figuring out what $f(n)$ is? But the book seems to imply that this is not needed (and hence the usefulness of the Master Theorem). Please clarify. (2012-02-14)
  • 0
    Which $c$? The one in the "regularity" condition $af(n/b) \le cf(n)$? Or are you implying the use of induction in figuring out what $f(n)$ is? But the book seems to imply that this is not needed (and hence the usefulness of the Master Theorem). Please clarify. As I noted in my EDIT above, my question was regarding how $f(n)$ was "assumed" to be in $\Omega$. (2012-02-14)
  • 0
    I have just edited my answer to take the last comment into account. (2012-02-20)