15
$\begingroup$

I find the frequent emergence of logarithms and even nested logarithms in number theory, especially the prime number counting business, somewhat unsettling. What is the reason for them?

Does it maybe have to do with the series expansion of the logarithm? Or is there something inherently exponential in any of the relevant number distributions, as in complexity theory or combinatorial problems? I suspect it may relate to how you can construct bigger integers out of smaller ones.

  • 0
    @anon: Why do you equate $\mathrm e$ with probabilistic? Central limit theorem? Or something simpler? 2014-02-13

3 Answers 3

14

If you are asking, why do you find it unsettling that logarithms occur in Number Theory, I'm afraid you will have to ask a psychiatrist.

If you are asking, why are there logarithms in Number Theory, consider the following naive effort to find the number of primes up to $N$:

There are $N$ integers up to $N$; $(1/2)N$ odd integers up to $N$; $(1/2)(2/3)N$ are prime to 2 and 3; $(1/2)(2/3)(4/5)N$ are prime to 2, 3, and 5; and so on. Continue this reasoning up to the largest prime $p$ not exceeding $\sqrt N$, and what's left should be the primes between $\sqrt N$ and $N$. So you have to estimate $(1/2)(2/3)(4/5)\cdots((p-1)/p)$, and in the limit as $N\to\infty$, that product looks something like $1/\log N$ (but not exactly - the naive argument needs a fair bit of sophisticated tweaking). It's the limiting process that lets logarithms into the mix.
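As a numerical sanity check of this sieve argument, here is a short Python sketch (the helper names and the hard-coded Euler–Mascheroni constant are my own) comparing the product $(1/2)(2/3)(4/5)\cdots((p-1)/p)$ over primes up to $\sqrt N$ with the naive guess $1/\log N$ and with the corrected limit $2e^{-\gamma}/\log N$ given by Mertens' third theorem, which is the "sophisticated tweaking" the naive argument needs:

```python
from math import exp, isqrt, log

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant (hard-coded)

def primes_up_to(n):
    """Return all primes <= n via a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

N = 10**6
product = 1.0
for p in primes_up_to(isqrt(N)):
    product *= (p - 1) / p  # fraction of integers surviving the sieve so far

print(product)                   # the sieve density, ~0.0813
print(1 / log(N))                # the naive guess, ~0.0724
print(2 * exp(-GAMMA) / log(N))  # Mertens' third theorem, ~0.0813
```

So the naive product is off from $1/\log N$ by the constant factor $2e^{-\gamma} \approx 1.12$, exactly the kind of discrepancy the answer alludes to.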

  • 0
    A beautifully simple explanation! I have just come across this answer by chance - what a pleasant surprise! :) 2014-06-18
11

One reason that it occurs in analytic number theory has to do with the Riemann zeta function:

$\zeta(s) = \displaystyle\sum_{n=1}^{\infty} \frac{1}{n^s},$

where $s = \sigma + it$ is a complex number, and for reasons of convergence, we require that $\sigma > 1$. It turns out that, due to an argument by Euler, we have

$\zeta(s) = \displaystyle\sum_{n=1}^{\infty} \frac{1}{n^s} = \displaystyle\prod_p \frac{1}{1-p^{-s}}.$

(Here $\displaystyle\prod_p$ indicates a product over all prime numbers $p$, and $\displaystyle\sum_p$ a sum over all primes $p$.) Big products like that aren't as easy to use and manipulate as sums, so mathematicians will do just about anything to turn a product into a sum to make it easier to work with. You probably know these three facts about logarithms:

i) $\log(ab) = \log(a)+\log(b)$,

ii) $\log{\frac{a}{b}}=\log(a)-\log(b)$, and

iii) $\log(1)=0$.

We can exploit those three rules after taking a logarithm of both sides of the above equation and we get the following:

$\begin{array}{ll} \log(\zeta(s)) &= \log \left( \displaystyle\prod_p \frac{1}{1-p^{-s}} \right) \\ &= \displaystyle\sum_p \log \left(\frac{1}{1-p^{-s}} \right) \\ &= \displaystyle\sum_p \left[ \log(1) - \log({1-p^{-s}}) \right] \\ &= -\displaystyle\sum_p \log({1-p^{-s}}). \end{array}$
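The Euler product itself is easy to test numerically. Here is a small Python sketch (the helper names are mine) comparing a truncated Dirichlet series, a truncated Euler product, and the known closed form $\zeta(2) = \pi^2/6$:

```python
from math import isqrt, pi

def primes_up_to(n):
    """Return all primes <= n via a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

s = 2
zeta_sum = sum(1 / n**s for n in range(1, 100_001))  # truncated sum over n
zeta_prod = 1.0
for p in primes_up_to(10_000):
    zeta_prod *= 1 / (1 - p ** (-s))                 # truncated product over primes

print(zeta_sum, zeta_prod, pi**2 / 6)  # all three agree to ~4 decimal places
```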

Without going into too much detail, once you get to this point you take a derivative of both sides and scrutinize the result with the tools of complex analysis.

This general technique of investigation leads directly to proving the prime number theorem, which says

$\displaystyle\lim_{x \rightarrow \infty} \frac{\pi(x)}{\frac{x}{\log(x)}} = 1,$

which can be stated roughly in English as "as you go farther down the number line, calculating $\frac{x}{\log(x)}$ gives you a pretty good estimate of how many primes there are up to $x$".
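To see this in action, here is a short Python sketch (the function name is mine) that computes the ratio $\pi(x)\big/\frac{x}{\log x}$ for increasing $x$; the convergence to $1$ is real but quite slow:

```python
from math import isqrt, log

def prime_count(n):
    """pi(n): count the primes <= n with a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

ratios = []
for x in (10**3, 10**4, 10**5, 10**6):
    ratios.append(prime_count(x) / (x / log(x)))
    print(x, ratios[-1])
# the ratio decreases steadily toward 1: ~1.16 at 10^3, ~1.08 at 10^6
```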

2

From A source book in Mathematics: Gauss (1791, at the age of fourteen) was the first to suggest, in a purely empirical way, the asymptotic formula $ \displaystyle \frac{x}{\log{x}}$ for $\phi(x)$.$^1$ Later on (1792–1793, 1849) he suggested another formula, $ \displaystyle \int_2^x \frac{dx}{\log{x}}$, of which $ \displaystyle \frac{x}{\log{x}}$ is the leading term.$^2$ Legendre, of course unaware of Gauss's results, suggested another empirical formula, $ \displaystyle \frac{x}{A \log x+B}$,$^3$ and specified the constants $A$ and $B$ as $A=1$, $B=-1.08366$ in the second edition of the Essai.$^4$ Legendre's formula, which Abel quoted as "the most marvelous in mathematics",$^5$ is correct up to the leading term only. This fact was recognized by Dirichlet.$^6$ In this note Dirichlet suggested yet another formula, $\displaystyle \sum_{n \le x} \frac{1}{\log n}$.

$1$: (Werke, Vol.X$_1$, p.11, 1917).

$2$: (Gauss's letter to Encke, 1849, Werke, Vol II, pp.444-447, 1876)

$3$: (Essai sur la théorie des nombres, 1st ed., pp. 18-19, 1798)

$4$: [Not from the book, but Wikipedia] He guessed $B$ to be about $1.08366$, but regardless of its exact value, the existence of $B$ implies the prime number theorem.

$5$: (letter to Holmboe, Abel Memorial, 1902, Correspondence, p.5)

$6$: ("Sur l'usage des séries infinies dans la théorie des nombres," Crelle's Journal, Vol. 18, p. 272, 1838, in his remark written on the copy presented to Gauss. Cf. Dirichlet, Werke, Vol. 1, p. 372, 1889)
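Since these historical formulas invite comparison, here is a small Python sketch (the function name is mine) pitting them against the true $\pi(x)$ at $x = 10^6$: Gauss's $x/\log x$, Legendre's $x/(\log x - 1.08366)$, and Dirichlet's sum $\sum 1/\log n$, which is essentially $\int_2^x \frac{dt}{\log t}$:

```python
from math import isqrt, log

def prime_count(n):
    """pi(n): count the primes <= n with a sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

x = 10**6
pi_x = prime_count(x)                                 # 78498
gauss = x / log(x)                                    # leading term only
legendre = x / (log(x) - 1.08366)                     # Legendre's empirical fit
dirichlet = sum(1 / log(n) for n in range(2, x + 1))  # ~ logarithmic integral

for name, val in [("pi(x)", pi_x), ("x/log x", gauss),
                  ("Legendre", legendre), ("Dirichlet", dirichlet)]:
    print(f"{name:10s} {val:12.1f}")
```

The output shows why the refinements mattered: $x/\log x$ is off by several thousand at $10^6$, while Legendre's fit and Dirichlet's sum both land within a few hundred of the true count.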

From Wikipedia: Initially, it might seem that since our numbering system is base 10, this base would be more "natural" than base $e$. But mathematically, the number $10$ is not particularly significant. Its use culturally—as the basis for many societies’ numbering systems—likely arises from humans’ typical number of fingers. Other cultures have based their counting systems on such choices as 5, 8, 12, 20, and 60.

$\log_e$ is a "natural" $\log$ because it automatically springs from, and appears so often in, mathematics.

Further senses of this naturalness make no use of calculus. As an example, there are a number of simple series involving the natural logarithm. Pietro Mengoli and Nicholas Mercator called it logarithmus naturalis a few decades before Newton and Leibniz developed calculus.

  • 0
    It's sort of OK to me. +1 2012-02-12