4

Let's take, for example, $\lim_{x\rightarrow\infty} \log(x)$. From a mathematical point of view this limit is $+\infty$, but intuitively it is clear that $x$ diverges to $+\infty$ much more quickly than $\log(x)$ does. Is there any kind of mathematics which takes this into account?
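A minimal numerical sketch of the phenomenon the question describes (the sample points below are an arbitrary choice): both quantities diverge, but the ratio $\log(x)/x$ collapses toward $0$.

```python
import math

# Both x and log(x) diverge, but their ratio shrinks toward 0,
# which makes the difference in "speed" concrete.
for x in [10, 10**3, 10**6, 10**9]:
    print(f"x = {x:>10}  log(x) = {math.log(x):7.2f}  log(x)/x = {math.log(x)/x:.2e}")
```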

  • 0
    Could you please clarify your question? Is there anything specific that is bothering you, or that you would like to have a second opinion on? The question "Is there any kind of mathematics which takes this into account?" is rather vague as asked. 2012-02-15
  • 0
    Let me just add that, from a strictly mathematical point of view (which I presume won't help your intuition much, but perhaps the answer is in line with your question), we have: the derivative of $\log(x)$ is $1/x$, while the second derivative is $-1/x^2$. It follows that while $\log(x)$ diverges to infinity, its "drive" toward infinity decelerates quadratically (i.e., it goes to infinity, so to speak, at ever-decreasing speed, if you regard $x$ as time). 2012-02-15
  • 0
    @WNY Well, to clarify the question: I just wondered whether there are any numerical-analysis methods to compute the speed of convergence of a given function. And if so, I wondered whether this notion of "speed of convergence" can be taken into account to compute F(+inf, with_a_convergence_speed) instead of "the limit of F(x) as x-->+inf". 2012-02-15
  • 2
    You can say that the function $f(x)$ diverges to infinity faster than the function $g(x)$ if $\lim_{x\rightarrow \infty }\frac{f(x)}{g(x)}=\infty $ or, equivalently, $\lim_{x\rightarrow \infty }\frac{g(x)}{f(x)}=0$. 2012-02-15
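For the concrete pair in the question, the criterion in the last comment can be checked with one application of L'Hôpital's rule:

$$\lim_{x\rightarrow\infty}\frac{x}{\log x}=\lim_{x\rightarrow\infty}\frac{1}{1/x}=\lim_{x\rightarrow\infty}x=\infty,$$

so $x$ diverges to infinity faster than $\log(x)$ in exactly this sense.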

1 Answer

7

The big O notation (and its various siblings and cousins), used to classify the asymptotic growth rates of functions, addresses this problem head-on.

For your two examples $f(x)=x$ and $g(x)=\log x$, we say that $g(x)$ is $O(f(x))$ but $f(x)$ is not $O(g(x))$. The $O(\cdots)$ intuitively denotes "grows no quicker than" or "is eventually proportional to or smaller than". Formally, $f(x)$ is $O(g(x))$ if there exist an $N$ and a constant $c>0$ such that $f(x)\le cg(x)$ for all $x>N$. This is essentially equivalent to the limit-based comparison given by Américo Tavares in the comment above; the result is a partial order on the set of all functions from the positive reals (or integers) to the positive reals (or integers).
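As a rough, sample-based illustration of this definition (a sketch only: the helper name `seems_big_o`, the witness $c=2$, and the cutoff $N=100$ are arbitrary choices made here, and checking finitely many points can suggest, but never prove, a big-O bound):

```python
import math

def seems_big_o(f, g, c=2.0, N=100, samples=10**5):
    """Heuristic, sample-based check of 'f is O(g)' with witnesses c and N.

    Only finitely many points are tested, so a True result is evidence,
    not a proof; a False result exhibits a concrete counterexample point.
    """
    return all(f(x) <= c * g(x) for x in range(N + 1, N + 1 + samples))

f = lambda x: x            # f(x) = x, as in the answer
g = lambda x: math.log(x)  # g(x) = log x, as in the answer

print(seems_big_o(g, f))   # True: log(x) <= 2x at every sampled x > 100
print(seems_big_o(f, g))   # False: x exceeds 2*log(x) almost immediately
```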

Such properties are intensively used, for example, in the study of algorithmic complexity, where the functions being compared are the running times (or resource usage) of computer programs as a function of the size of their input.

Incidentally, there's no good way to assign numbers to different asymptotic growth rates. A coarse first approximation would be to classify a function $f$ according to the infimum of the $b$'s such that it is $O(x^b)$. But it turns out that $x\mapsto x^b\log x$ has a growth rate strictly between $x^b$ and $x^{b+\varepsilon}$ for any $\varepsilon>0$ -- and there is an entire hierarchy of logarithmically differing growth rates in that gap, which would be lost if we reduced it all to being about the exponent $b$.
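Concretely, for any $b\ge 0$ and $\varepsilon>0$, the two limits behind that claim are

$$\lim_{x\rightarrow\infty}\frac{x^b\log x}{x^b}=\lim_{x\rightarrow\infty}\log x=\infty \qquad\text{and}\qquad \lim_{x\rightarrow\infty}\frac{x^b\log x}{x^{b+\varepsilon}}=\lim_{x\rightarrow\infty}\frac{\log x}{x^{\varepsilon}}=0,$$

so $x\mapsto x^b\log x$ strictly dominates $x^b$ yet is strictly dominated by every $x^{b+\varepsilon}$; no single exponent can capture that distinction.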

  • 0
    Is there any calculable convergence rate that accompanies this notion of big O for functions? For sequences, the rate of convergence is the speed at which a convergent sequence approaches its limit: http://en.wikipedia.org/wiki/Rate_of_convergence. By the way, for any given $x > 1$ there exists a constant $c > 0$ such that $x \le c\log(x)$; according to the definition, that would mean $x$ is $O(\log(x))$, which is not correct since $x$ grows more quickly than $\log(x)$ -- or have I missed something? Incidentally, I had this question when I was thinking about how to define worst-case bounds for a program. 2012-02-15
  • 0
    And formally, can we also define big O as: $f(x)$ is $O(g(x))$ if and only if the limit of $g(x)/f(x)$ is $0$ as $x$-->+inf, as @AméricoTavares said? 2012-02-15
  • 1
    @user995434: First, no: the same $c$ must work for _all_ $x>N$, and you cannot achieve that for $x \le c\log x$ (see the numerical sketch at the end of this thread). The big-O notation is more fine-grained than the "rate of convergence" vocabulary you link to -- when faced with a _finite_ limit $L$, one might speak about the asymptotic behavior of the function $x\mapsto f(x)-L$ -- only then _smaller_ values mean quicker convergence, so many details end up looking opposite to the $f(x)\to\infty$ case. 2012-02-15
  • 1
    _(continued...)_ Yes, the partial order expressed by big O is the same as the one Américo defines using limits (except for some usually irrelevant caveats about what happens if one of the functions is $0$ infinitely often, so that the ratios don't exist, etc.). Finally, congratulations on rediscovering the asymptotic analysis of algorithms! If you want to go deeper, there are lots of excellent textbooks on the matter, often with "algorithms" as the central keyword in their titles. 2012-02-15
  • 0
    Thanks for the clarification. I'm looking at the free book [The Design of Approximation Algorithms](http://www.designofapproxalgs.com/download.php); I'm a beginner in this area. 2012-02-15
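To see numerically why no single constant $c$ can witness $x \le c\log(x)$ for all large $x$, as the comment above explains, here is a small sketch (the values of $c$ tried are arbitrary):

```python
import math

# For each fixed c, find the first integer x > 1 with x > c * log(x).
# The crossing always exists, so no single witness c works for all x.
for c in [10, 100, 1000]:
    x = 2
    while x <= c * math.log(x):
        x += 1
    print(f"c = {c:>4}: x > c*log(x) first holds at x = {x}")
```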