18

Are there any results relevant to the distribution of the sequence $\{\log \log n!\}$ for integers $n$, where $\{x\}$ denotes the fractional part of $x$?

For instance, it is known that for irrational real numbers $\alpha$, the sequence $\{n\alpha\}$ is dense in $[0,1]$ and in fact equidistributed. Does something similar hold for the logarithms of the logarithms of the factorials?

(This curiosity is provoked by this question, and an affirmative answer here would complete the answer to that question.)

  • @robjohn, thanks for notifying me. See my comments there. (2011-10-18)

3 Answers

3

A (very weak) version of Stirling's approximation says that $n!=n^{n+o(n)}$, hence $\log\log(n!)=\log(n\log n)+o(1)$. If two sequences $(a_n)$ and $(b_n)$ are such that $a_n\to\infty$, $b_n\to\infty$, and $a_n-b_n=o(1)$, then the limiting distributions of $\{a_n\}$ and $\{b_n\}$ are the same (of course, one should prove this); hence, from now on, we look at the asymptotic distribution of $x_n=\{\log(n\log n)\}$.
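As a quick numerical sanity check of this reduction (my addition, not part of the original argument), the sketch below uses `math.lgamma`, so that `lgamma(n+1)` equals $\log(n!)$; the printed gap shrinks roughly like $1/\log n$:

```python
# Sanity check (not part of the original argument): the gap between
# log(log(n!)) and log(n*log(n)) shrinks, roughly like 1/log(n).
import math

for n in [10, 100, 10_000, 1_000_000]:
    loglog_fact = math.log(math.lgamma(n + 1))   # lgamma(n+1) = log(n!)
    approx = math.log(n * math.log(n))           # log(n log n)
    print(n, loglog_fact - approx)
```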

For every nonnegative $x$, call $n(x)$ the smallest integer such that $n(x)\log n(x)\geqslant\mathrm e^x$. Fix $x$ in $(0,1)$. The number of integers $n$ such that the integer part of $\log(n\log n)$ is $k$ is $n(k+1)-n(k)$. Among these, $n(k+x)-n(k)$ are such that $x_n\leqslant x$. Hence the proportion of integers $n$ in $[n(k),n(k+1)-1]$ such that $x_n\leqslant x$ is
$$F_k(x)=\frac{n(k+x)-n(k)}{n(k+1)-n(k)}.$$
Here is a result: $F_k(x)\to F^0(x)$ when $k\to\infty$, with
$$F^0(x)=\frac{\mathrm e^x-1}{\mathrm e-1}=\int_0^xf^0(y)\,\mathrm dy,\qquad f^0(x)=\frac{\mathrm e^x}{\mathrm e-1}.$$
This (which should also be proved) implies that the proportion of integers $n\leqslant n(k)$ such that $x_n\leqslant x$ converges to $F^0(x)$ when $k\to\infty$. In this sense, the sequences $(x_n)$ and $(\{\log\log n!\})$ both follow the distribution with density $f^0$ on $(0,1)$.
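A short numerical illustration of this limit (again my addition, not a proof; the helper `n_of` is a hypothetical name implementing $n(x)$ by binary search) compares $F_k(x)$ for a moderate $k$ with $F^0(x)$:

```python
# Illustration (not part of the original answer): compare F_k(x) with
# F^0(x) = (e^x - 1)/(e - 1) for a moderately large k.
import math

def n_of(x):
    """Smallest integer n >= 2 with n*log(n) >= e^x, found by binary search."""
    target = math.exp(x)
    lo, hi = 2, 2
    while hi * math.log(hi) < target:
        hi *= 2
    while lo < hi:
        mid = (lo + hi) // 2
        if mid * math.log(mid) >= target:
            hi = mid
        else:
            lo = mid + 1
    return lo

k = 25
for x in [0.25, 0.5, 0.75]:
    F_k = (n_of(k + x) - n_of(k)) / (n_of(k + 1) - n_of(k))
    F_0 = (math.exp(x) - 1) / (math.e - 1)
    print(x, round(F_k, 4), round(F_0, 4))
```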

Note however that this result is not based on the usual version of the density of a subset $A$ of the integers, defined as the limit (when it exists) of the sequence
$$d_n(A)=\frac1n\sum_{k=1}^n\mathbf 1_{k\in A}.$$
Considering the set $A(x)$ of integers $n$ such that $x_n\leqslant x$, we proved that $d_{n(k)}(A(x))\to F^0(x)$ when $k\to\infty$, but the sequence $(d_n(A(x)))$ diverges (except for $x=0$ and $x=1$) since, for example,
$$d_{n(k+x)}(A(x))\to\mathrm e^{1-x} F^0(x)>F^0(x).$$
By the way, one can prove that $F^0(x)$ and $F^x(x)=\mathrm e^{1-x} F^0(x)$ are the lower and upper densities of $A(x)$, in the sense that
$$F^0(x)=\liminf_n d_n(A(x))<\limsup_n d_n(A(x))=F^x(x).$$
Coming back to our problem, one could very plausibly define the density of $(x_n)$ by fixing a real number $\alpha$, considering the proportion of integers $n\leqslant n(k+\alpha)$ such that $x_n\leqslant x$, and taking the limit $F^\alpha(x)$ of these proportions when $k\to\infty$. It happens that this limit $F^\alpha(x)$ exists for every $x$ and suggests as density the function $f^\alpha$ such that
$$F^\alpha(x)=\int_0^xf^\alpha(y)\,\mathrm dy.$$
Thus, each $f^\alpha$ is as good a candidate as any other to be the asymptotic density of the sequence with general term $\{\log\log(n!)\}$.
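The oscillation can also be seen numerically. The sketch below (an added illustration, not a proof; `n_of` and `favorable` are my hypothetical helpers, the latter counting the $n<N$ with $x_n\leqslant x$ by summing the block lengths $n(j+x)-n(j)$, up to $O(1)$ boundary terms) evaluates $d_N(A(x))$ along the two subsequences $N=n(k)$ and $N=n(k+x)$:

```python
# Illustration (not part of the original answer): d_N(A(x)) oscillates.
# Along N = n(k) it approaches F^0(x); along N = n(k+x), e^{1-x} F^0(x).
import math

def n_of(x):
    """Smallest integer n >= 2 with n*log(n) >= e^x (binary search)."""
    target = math.exp(x)
    lo, hi = 2, 2
    while hi * math.log(hi) < target:
        hi *= 2
    while lo < hi:
        mid = (lo + hi) // 2
        if mid * math.log(mid) >= target:
            hi = mid
        else:
            lo = mid + 1
    return lo

def favorable(N, x, k_max):
    """Number of n < N with {log(n log n)} <= x (up to O(1) boundary terms)."""
    return sum(max(0, min(n_of(j + x), N) - min(n_of(j), N))
               for j in range(k_max + 2))

x, k = 0.5, 25
for N in (n_of(k), n_of(k + x)):
    print(N, favorable(N, x, k) / N)
print("F^0(x) =", (math.exp(x) - 1) / (math.e - 1),
      " e^{1-x} F^0(x) =", math.exp(1 - x) * (math.exp(x) - 1) / (math.e - 1))
```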

Note: As said in the comments, Benford's law yields similar oscillatory phenomena. Of course, the procedures developed in the case of Benford's law to escape the conundrum we described above may be applied to the present problem as well.

Other, more advanced, keywords for the interested reader are divergent series and summation methods; see here, or the version in French here, which seems to mention more references.

  • @joriki Thanks for the appreciation and for the correction. Indeed, rereading the whole page after all this time makes for an odd experience. For some reason, at the time, I obviously failed to convey the message I was trying to convey, namely, that the two other undeleted answers were not addressing the question at all, mathematically speaking (its *density* part excepted). (2015-09-07)
11

The definition of factorial gives
$$\log(n!)-\log((n-1)!)=\log(n)\tag{1}$$
Since the derivative of $\log(x)$ is $1/x$, $(1)$ and the Mean Value Theorem yield
$$\log(\log(n!))-\log(\log((n-1)!))\in\left(\frac{\log(n)}{\log(n!)},\frac{\log(n)}{\log(n!)-\log(n)}\right),$$
and both endpoints equal
$$\frac{\log(n)}{n\log(n)-n+O(\log(n))}\tag{2}$$
The density of $\log(\log(n!))$ is the reciprocal of $(2)$, namely $n-\frac{n}{\log(n)}+O(1)$, and
$$\log(\log(n!))=\log(n)+\log(\log(n))-\frac{1}{\log(n)}+O\left(\frac{1}{\log(n)^2}\right)\tag{3}$$
As $n\to\infty$, $\log(\log(n!))\sim\log(n)$ and the density $\sim n$. Since the limiting density is determined when $n$ is large, we get that the density of $\{\log(\log(n!))\}$ is $\frac{e^x}{e-1}$ on $[0,1]$.
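As a numerical sanity check of $(3)$ (my addition, not part of the original answer), one can scale the error by $\log(n)^2$ and watch it stay bounded; `math.lgamma(n+1)` is used for $\log(n!)$:

```python
# Check (not part of the original answer) of expansion (3):
# log(log(n!)) - (log n + log log n - 1/log n) should be O(1/log(n)^2).
import math

for n in [10**3, 10**5, 10**7]:
    lhs = math.log(math.lgamma(n + 1))
    rhs = math.log(n) + math.log(math.log(n)) - 1 / math.log(n)
    print(n, (lhs - rhs) * math.log(n) ** 2)   # stays bounded (around -1/2)
```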

More explanation about $\frac{e^x}{e-1}$:

Since $\log(\log(n!))=\log(n)+\log(\log(n))-\frac{1}{\log(n)}+O\left(\frac{1}{\log(n)^2}\right)$, $\{\log(\log(n!))\}$ cycles through $[0,1]$ just a bit quicker than $\log(n)$ does: approximately, one full cycle takes $n$ to $ne\left(1-\frac{1}{\log(n)}\right)$. In each of those cycles, the logarithm of the density, $\log\left(n-\frac{n}{\log(n)}\right)+O\left(\frac{1}{n}\right)$, increases approximately linearly by a total of $1$. Thus, within a cycle, the density of $\{\log(\log(n!))\}$ is proportional to $e^x$, and $\frac{e^x}{e-1}$ is normalized to have total weight $1$.
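Here is a small numerical illustration of that cycle length (an addition, not part of the original answer; `loglog_fact` is my hypothetical helper): for a few $n$, it locates the $m$ with $\log\log(m!)=\log\log(n!)+1$ by binary search and compares $m/n$ with $e\left(1-\frac1{\log n}\right)$:

```python
# Added illustration: {log(log(n!))} advances by one full unit roughly as
# n grows to n*e*(1 - 1/log(n)).
import math

def loglog_fact(n):
    return math.log(math.lgamma(n + 1))          # log(log(n!))

for n in [10**3, 10**5, 10**7]:
    target = loglog_fact(n) + 1                  # one full cycle later
    lo, hi = n, 4 * n                            # the cycle end lies below 4n
    while lo < hi:                               # binary search for it
        mid = (lo + hi) // 2
        if loglog_fact(mid) >= target:
            hi = mid
        else:
            lo = mid + 1
    print(n, lo / n, math.e * (1 - 1 / math.log(n)))
```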

Density Details:

Let $I_n=\{k\in\mathbb{Z}:n-1<\log(\log(k!))\le n\}$. The density approximated here is the function $\phi:[0,1]\to\mathbb{R}$ such that
$$\int_a^b\phi(x)\;\mathrm{d}x=\lim_{n\to\infty}\left.\left|\{k\in\mathbb{Z}:n-1+a<\log(\log(k!))\le n-1+b\}\right|\middle/\left|I_n\right|\right.$$
Within a given $I_n$, this density is roughly proportional to the reciprocal of the distance between $\log(\log((k-1)!))$ and $\log(\log(k!))$, which is $k-\frac{k}{\log(k)}+O(1)$. For $k\in I_n$, let $x=\log(\log(k!))-n+1=\log(k)+\log(\log(k))-n+1+O\left(\frac{1}{\log(k)}\right)$. Then
$$k=\frac{e^{x+n-1+O(1/(x+n))}}{(x+n-1)^{1-1/(x+n)}}=\frac{e^{n-1+O(1/(x+n))}}{(x+n-1)^{1-1/(x+n)}}e^x$$
Thus, in terms of $x$, the density is
$$k-\frac{k}{\log(k)}+O(1)=\frac{e^{n-1+O(1/(x+n))}}{(x+n-1)^{1-1/(x+n)}}e^x$$
As $n\to\infty$, the coefficient of $e^x$ tends toward constancy. Normalizing this so that the integral over $[0,1]$ is $1$, we get $\phi(x)=\frac{e^x}{e-1}$.
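The following sketch (an added illustration, not part of the original answer) reproduces this count directly: it collects $\{\log(\log(k!))\}$ for $k\in I_{12}$, bins the values into ten subintervals, and compares each observed proportion with $\int e^x/(e-1)\,\mathrm dx$ over that subinterval:

```python
# Added illustration: histogram of {log(log(k!))} over I_n for n = 12,
# compared with the predicted density e^x/(e-1).
import math

def loglog_fact(k):
    return math.log(math.lgamma(k + 1))

n = 12
vals = []
k = 2                                     # k = 1 gives log(0), so start at 2
while True:
    v = loglog_fact(k)
    if v > n:
        break
    if v > n - 1:                         # k belongs to I_n
        vals.append(v - (n - 1))          # position of {log(log(k!))} in (0,1]
    k += 1

bins = [0] * 10
for v in vals:
    bins[min(int(v * 10), 9)] += 1
for i, c in enumerate(bins):
    predicted = (math.exp((i + 1) / 10) - math.exp(i / 10)) / (math.e - 1)
    print(f"[{i/10:.1f},{(i+1)/10:.1f})  observed {c/len(vals):.3f}  "
          f"predicted {predicted:.3f}")
```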

We get the same density if we consider $\displaystyle\lim_{N\to\infty} \bigcup_{n\le N} I_n$. However, if we use a partial $I_N$, the fact that $|I_N|$ is approximately $(e-1)\left|\bigcup_{n<N} I_n\right|$ causes bad behavior.

Thus, the fractional parts of $\log(\log(n!))$ are dense in $[0,1]$, but not uniformly distributed.


Charts

[Chart: counts of $\{\log(\log(k!))\}$ on $I_{10}$, in intervals of size $0.01$]

[Chart: counts on $I_{11}$, in intervals of size $0.01$]

[Chart: counts on $I_{12}$, in intervals of size $0.01$]

[Chart: counts on $I_{13}$, in intervals of size $0.01$]

So as described above, on each $I_n$, the density is $\frac{e^x}{e-1}$. However, since the counts on $I_{n+1}$ are approximately $e$ times the counts on $I_n$, the picture is different if we don't consider complete intervals $I_n$:

[Chart: counts when partial intervals $I_n$ are included, showing a jump at each interval boundary]

The jump has a ratio of about $e\left(1-\frac1{\log(n)}\right)$ since $\{\log(\log(n!))\}$ cycles through $[0,1]$ as $n$ goes to $ne\left(1-\frac1{\log(n)}\right)$, as mentioned above.
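A related check (an added illustration, not from the original answer; `loglog_fact` is again my hypothetical helper): computing $|I_n|$ for consecutive $n$ shows the ratio $|I_{n+1}|/|I_n|$ approaching $e$ from below, which is what makes the picture depend on where the count stops:

```python
# Added illustration: sizes |I_n| and ratios |I_{n+1}| / |I_n|,
# which approach e from below.
import math

def loglog_fact(k):
    return math.log(math.lgamma(k + 1))

sizes = {}
k, n_max = 2, 12
while True:
    v = loglog_fact(k)
    if v > n_max:
        break
    if v > 0:
        sizes[math.ceil(v)] = sizes.get(math.ceil(v), 0) + 1   # k is in I_ceil(v)
    k += 1

for n in range(9, n_max + 1):
    print(n, sizes[n], round(sizes[n] / sizes[n - 1], 3))
```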

  • To the downvoter: I am not claiming that the distribution has a limit, but only that on the intervals $I_n$, the distribution tends to $\frac{e^x}{e-1}$. (2015-09-07)
10

The sequence $\{\log n\}$ is not equidistributed. The reason is that $\log n$ grows very slowly, so for a long stretch of integers after $n$ passes a power of $\mathrm e$, the fractional parts $\{\log n\}$ concentrate in the lower part of $[0,1]$; the empirical distribution therefore keeps oscillating instead of converging. Since $\log\log n!$ grows approximately like $\log(n\log n)$, the same argument should apply.
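A small numerical illustration of this point (my addition, not part of the original answer; `proportion_below` is a hypothetical helper): the fraction of $n\le N$ with $\{\log n\}\le\tfrac12$ depends strongly on where $N$ sits relative to the powers of $e$, so it cannot converge to $\tfrac12$ as equidistribution would require:

```python
# Added illustration: the proportion of n <= N with {log n} <= 1/2 depends
# on where N falls relative to the powers of e, so it cannot converge to 1/2.
import math

def proportion_below(N, x):
    """Fraction of 2 <= n <= N whose fractional part {log n} is <= x."""
    count = sum(1 for n in range(2, N + 1) if math.log(n) % 1 <= x)
    return count / (N - 1)

for N in [round(math.e ** 12), round(math.e ** 12.5)]:
    print(N, proportion_below(N, 0.5))
```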

  • Not sure @GenericHuman's comment above was addressing any of the points I was making. But since all this dates from several years ago, and since the accepted answer does not actually address the equidistribution part of the question (in the mathematical sense of the term), I guess none of this really matters. (2015-09-07)