
I have a list of orders of magnitude (growth rates) that I want to compare.

My only idea is to use calculus methods (limits, integrals, etc.) to establish the relations between the functions.

I need your help with the following.

I need to determine how to order the following (again by order of magnitude) in ascending order:

$\log(n)/\log(\log(n))$

$\log(\log(n))$

$\log^3(n)$

$n/\log(n)$

$n^{\log(n)}$

Is it even possible to use limits on such functions?

I mean, if I take $f(n)$ to be $\log(n)/\log(\log(n))$ and $g(n)$ to be $n^{\log(n)}$, it seems very hard to find the limit.

Thanks in advance!

3 Answers

2

There are quite a number of comparisons to be made. In most cases it is relatively straightforward to decide the relative long-term sizes.

Let's start with your pair. We have $f(n)=\log(n)/\log(\log (n))$ and $g(n)=n^{\log(n)}$.

For large $n$ (and it doesn't have to be very large), we have $f(n)\lt \log(n)$.

Also, for large $n$, we have $\log(n)\gt 1$, and therefore $g(n)=n^{\log(n)}\gt n$.

So for large $n$, we have $$\frac{f(n)}{g(n)} \lt \frac{\log(n)}{n}.$$ But we know that $\lim_{n\to\infty}\dfrac{\log(n)}{n}=0$. This can be shown in various ways. For instance, we can consider $\dfrac{\log(x)}{x}$ and use L'Hospital's Rule to show this has limit $0$ as $x\to\infty$.
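The claim that $f(n)/g(n)\to 0$ can also be spot-checked numerically. A minimal sketch in Python (the helper names `f` and `g` are mine; $\log$ is taken as the natural log, which does not affect the conclusion):

```python
import math

# Numerical illustration (not a proof) that
# f(n) = log(n)/log(log(n)) divided by g(n) = n^{log n} tends to 0.

def f(n):
    return math.log(n) / math.log(math.log(n))

def g(n):
    # n^{log n} = e^{(log n)^2}; computed via exp to stay within float range
    return math.exp(math.log(n) ** 2)

ratios = [f(n) / g(n) for n in (10, 10**3, 10**6)]
print(ratios)  # shrinks rapidly toward 0
```

Of course, a few sample points only illustrate the limit; the inequality argument above is what proves it.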

It takes less time to deal with the pair $n/\log(n)$ and $n^{\log(n)}$. If $n$ is even modestly large, we have $n/\log(n)\lt n$. But after a while, $\log(n)\gt 2$, so $n^{\log(n)}\gt n^2$. It follows that in the long run, $n^{\log(n)}$ grows much faster than $n/\log(n)$.

As a last example, let us compare $\log^3(n)$ and $n/\log(n)$. Informally, $\log$ grows glacially slowly. More formally, look at the ratio $\dfrac{\log^3(n)}{n/\log(n)}$. This simplifies to $$\frac{\log^4 (n)}{n}.$$ We can use L'Hospital's Rule on $\log^4(x)/x$. Unfortunately we then need to use it several times. It is easier to examine $\dfrac{\log(x)}{x^{1/4}}$. Then a single application of L'Hospital's Rule does it. Or else we can let $x=e^y$. Then we are looking at $y^4/e^y$, and we can quote the standard result that the exponential function, in the long run, grows faster than any polynomial.
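The simplified ratio $\log^4(n)/n$ can likewise be checked at a few sample points (a sketch, assuming natural logs; note the ratio first rises before it falls, so the sample points start past the turning point $x=e^4$):

```python
import math

# Numerical illustration (not a proof) that log^4(x)/x -> 0 as x -> infinity.
vals = [math.log(x) ** 4 / x for x in (10**2, 10**6, 10**12)]
print(vals)  # decreasing toward 0
```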

Remark: The second function in your list is the slowest one. Apart from that, they are in order. So you only need to prove four facts to line them up. A fair part of the work has been done above.
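The full ascending order can be sanity-checked by evaluating all five functions at a single large $n$ (pointwise values at one $n$ are evidence, not a proof of asymptotic order; the labels and the choice $n=10^6$ are mine):

```python
import math

# Evaluate each of the five functions at one large n and sort by value.
n = 10**6
L = math.log(n)          # natural log; the ordering is base-independent
LL = math.log(L)

values = {
    "log(log n)":         LL,
    "log n / log(log n)": L / LL,
    "log^3 n":            L ** 3,
    "n / log n":          n / L,
    "n^{log n}":          math.exp(L ** 2),   # n^{log n} = e^{(log n)^2}
}
ordered = sorted(values, key=values.get)
print(ordered)
```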

  • 0
    Your answer is very good and clear. I will work through these tomorrow and make sure I understand everything. Thank you very much! 2012-11-03
  • 0
    @Guy: I don't **mind** your accepting my answer. However, it is useful in general to wait somewhat longer, in order to elicit a variety of approaches. 2012-11-03
  • 0
    I commented below. Can you please advise? Thank you. 2012-11-04
1

For instance: $$\log(n^{\log n})=\log^2(n).$$ Since, asymptotically, $$\frac{\log(n)}{\log(\log(n))} < \log^2(n),$$ we know that $$n^{\log n}$$ is at least exponentially larger than $$\frac{\log(n)}{\log(\log(n))}.$$
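Both the identity and the inequality used here can be spot-checked numerically (a sketch only; the sample points are chosen arbitrarily):

```python
import math

# Spot-check (illustration, not a proof) of the identity
# log(n^{log n}) = (log n)^2 and of the asymptotic inequality
# log(n)/log(log(n)) < (log n)^2.
ok = True
for n in (10.0, 10**3, 10**6):
    L = math.log(n)
    ok = ok and math.isclose(math.log(n ** L), L ** 2, rel_tol=1e-9)
    ok = ok and L / math.log(L) < L ** 2
print(ok)
```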

  • 0
    How did you calculate these inequalities? 2012-11-03
  • 1
    Just observe: a $\log(n)$ cancels out, and $1/\infty$ is obviously smaller than $\infty$. 2012-11-03
  • 0
    Understood! Thank you very much! 2012-11-03
0

@André Nicolas I really liked the comparison of the functions with simpler functions.

I was trying to use this technique on some of these functions.

Eventually I found a couple which I was not able to solve using this technique and I don't understand why.

Assume $f(n) = \log(\log(n))$ and $g(n) = n/\log(n)$.

Now one might say $f(n) < \log(n)$ and $g(n) > 1/n$.

If I look at the ratio/limit of these bounds, $\log(n)/(1/n) = n\log(n)$, I find that $f(n)$ seems to grow much faster.

I completely understand that I chose the lower bound of $g(n)$ unwisely. Does it mean that this technique cannot be used on such functions?

Thank you in advance!!

  • 1
    If you "give away" stuff to make things simpler, you have to be sensible about it. Replacing $g(n)$ by something that approaches $0$ is unreasonable. For example, one could use $f(n)\lt \log(n)$ and keep $g(n)$. How one "weakens" an inequality has to be guided by basic intuition/experience about the sizes of things. Two things that will be constantly useful: (i) $\log(n)$ is long-run slower than any positive power of $n$; and its twin, (ii) for constant $a\gt 1$, $a^n$ is long-run faster than any polynomial in $n$. 2012-11-04
  • 0
    So my replacement was incorrect; that's what I thought. Thank you! 2012-11-04
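The point of the comment can be seen numerically: weakening $g(n)$ all the way down to $1/n$ produces a bound that says nothing, while weakening only $f(n)$ to $\log(n)$ and keeping $g(n)$ gives a ratio that visibly tends to $0$ (a sketch; the variable names are mine):

```python
import math

# Comparing f(n) = log(log n) with g(n) = n/log(n): why the choice
# of bound matters (numerical illustration, not a proof).
n = 10**6
L = math.log(n)

misleading = L / (1 / n)     # upper-bound f by log n, lower-bound g by 1/n
sensible = L ** 2 / (n / L)  # upper-bound f by log n, keep g as it is

print(misleading)  # huge: this bound says nothing about f(n)/g(n)
print(sensible)    # small: this bound shows f(n)/g(n) -> 0
```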