
For fixed $a,b,c \in \mathbb{R}$ with $ac \neq 0$, it seems to me that one can find an increasing sequence of integers $\{\alpha_n\}$ such that the quantity $c \log \alpha_n$ becomes arbitrarily close to elements of the set

$$ A = \{ ak+b \,\colon k \in \mathbb{Z} \}. $$

First, is this true? If so, my question is: how good is the approximation?

For example, given any $\epsilon > 0$, is it possible to find infinitely-many integers $n$ such that, for some constant $C$,

$$ \textrm{dist}(c \log n,A) := \inf_{k \in \mathbb{Z}} |c \log n - ak-b| \leq C n^{-\epsilon}? $$

How about

$$ \textrm{dist}(c \log n,A) \leq C e^{-n}? $$

I am also interested in results which might say something like "There are at most finitely-many $n$ satisfying

$$ \textrm{dist}(c \log n,A) \leq C e^{-n^2} $$

for any positive constant $C$" to illustrate the "best-possible" nature of a less restrictive bound.
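For concreteness, here is a small numerical experiment (Python, with arbitrary illustrative values $a=1$, $b=0.3$, $c=2$, none of which are fixed by the question) that tracks record-small values of $\textrm{dist}(c\log n, A)$, together with $n\cdot\textrm{dist}(c\log n,A)$ to probe the conjectured $\sim 1/n$ rate:

```python
import math

# Illustrative parameters (not fixed by the question): a, c nonzero.
a, b, c = 1.0, 0.3, 2.0

def dist_to_A(x, a, b):
    """Distance from x to the set A = {a*k + b : k in Z}."""
    t = (x - b) / a
    return abs(a) * abs(t - round(t))

# Record-setting n: each time dist(c*log n, A) hits a new minimum, store it
# along with n * dist, to see on what scale the approximation improves.
best = float("inf")
records = []
for n in range(2, 200_000):
    d = dist_to_A(c * math.log(n), a, b)
    if d < best:
        best = d
        records.append((n, d, n * d))

for n, d, nd in records[-5:]:
    print(n, d, nd)
```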


Motivation

I am trying to determine the behavior of a quantity like

$$ \left|\left(1-e^{i c \log n}\right)g(n)\right|^{1/n}, $$

where $g(n)$ is well-behaved. Until now I have simply been excluding all $n$ for which $1-e^{i c \log n}$ lies in some small fixed neighborhood of the origin, but doing this I lose infinitely-many $n$.

If, for example, it turns out that, for some positive constant $C$, there are only finitely-many $n$ satisfying

$$ \left| 1-e^{i(\theta + c \log n)} \right| \leq C n^{-1-\epsilon} \tag{1} $$

for any $\epsilon > 0$, then all but finitely-many $n$ satisfy

$$ C n^{-1} \leq \left| 1-e^{i(\theta + c \log n)} \right| \leq 2. $$

In that case we could exclude at most finitely-many $n$ to obtain the desirable property

$$ \left| 1-e^{i(\theta + c \log n)} \right|^{1/n} \to 1 $$

as $n \to \infty$.

Now, if we let

$$ B = \{2\pi k - \theta \,\colon k \in \mathbb{Z}\}, $$

then equation $(1)$ is equivalent to the existence of a positive constant $C_1$ such that only finitely-many $n$ satisfy

$$ \textrm{dist}(c \log n,B) \leq C_1 n^{-1-\epsilon} $$

for any $\epsilon > 0$.

  • I think it's a very hard problem. Let's look at one of the simplest cases. Let $a=1,b=0$; then $A=\Bbb Z$, and the distance of a particular $\beta$ to $A$ becomes the distance, say $\|\beta\|$, to the nearest integer. If $c\log n=\log_2 n$, we could choose $\alpha_n=2^n$, so that $\|\log_2\alpha_n\|=0$. If $c\log n=\ln n$, that's a bit more complicated. Suppose $\epsilon_n$ is a sequence for which no such $\alpha_n$ exists. That means that for any large $N$ there is some $n>N$ such that $\|\ln\alpha_n\|<\epsilon_n$ fails. (2012-08-17)
  • It's certainly true that $c \log n$ gets arbitrarily close to any given unbounded subset of $\mathbb R^+$. This holds in general for any increasing sequence $a_n \to \infty$ such that $a_{n+1}-a_n \to 0$. Actually, we can probably remove "increasing". (2012-08-17)
  • Continuing: suppose $l$ is the closest integer to $\ln\alpha_n$, so $l=\ln\alpha_n+O(1)$, and there is no $\alpha_n$ with $e^{l-\epsilon_n}<\alpha_n<e^{l+\epsilon_n}$, so $e^l(e^{\epsilon_n}-e^{-\epsilon_n})\le1$. Noticing that $e^\epsilon=1+\epsilon+O(\epsilon^2)$, we have $e^{\epsilon_n}-e^{-\epsilon_n}=2\epsilon_n+O(\epsilon_n^2)$, so $\epsilon_n=O(e^{-l})=O(1/\alpha_n)$. (2012-08-17)
  • @ErickWong Yes, it can get arbitrarily close, but the OP wants a speed related to $n$, which is not easy. (2012-08-17)
  • @ErickWong And notice that the OP wants the sequence to comprise integers! Therefore, in your notation, $a_{n+1}-a_n\ge1$, which cannot tend to zero. (2012-08-17)
  • @FrankScience I mean that $a_n = c \log n$ satisfies the conditions I wrote. (2012-08-17)
  • @ErickWong But what about $\{n\}$ as the OP described? It seems that the OP considers an increasing integer sequence $\alpha_n$ and wants to estimate the distance of $c\log\alpha_n$ to $A$ as an asymptotic in $\alpha_n$ (such as $e^{-\alpha_n}$). (2012-08-17)
  • @FrankScience Yes, your interpretation of the question is correct. (2012-08-17)
  • @AntonioVargas And I think you're only interested in the *least* (or nearly *least*) asymptotic of a function $f(x)$ such that there is **some** $\alpha_n$ for which $\textrm{dist}(c\log\alpha_n,A)\le f(\alpha_n)$ holds for all $n$. (2012-08-17)
  • @FrankScience Yes, ideally. Though I am also interested in answers which prove *some* bound of the type I describe, even if it's not optimal. (2012-08-17)
  • @AntonioVargas It seems that the problem is related to the distribution of $\|c\ln n\|$, where $c$ is a constant. As far as I know, $cn\bmod1$ is uniformly distributed if $c$ is irrational. (2012-08-17)
  • @FrankScience The point is that $\ln n$ grows much more slowly than $n$. Uniform distribution is therefore impossible, but density mod $1$ is easy to obtain for any $c$ (I agree that sharp bounds are probably very hard). (2012-08-17)
  • Could I inquire about your motivation for this problem? (2012-08-17)
  • @Raskolnikov, I have added more on my motivation to the question. (2012-08-17)
  • Since $\ln n$ increases in steps $\approx1/n$, for a suitable $C$ you can find $n$ with $\text{dist}(c\ln n,a)<C/n$ for all sufficiently large $a\in A$. My guess is that (for most $a,b,c$) you should be able to find an infinite number of $n$ with $\text{dist}(c\ln n,A)<C/n^2$: this is based on the idea that for positive $a\in A$ the likelihood of finding such an approximation (if the match between $c\ln n$ and $A$ is fairly random) is proportional to $1/a$. By the same argument, the bound $C/n^{2+\epsilon}$ should only give a finite number. (2012-08-17)
  • @EinarRødland, I would be very interested if you would develop your thoughts into a formal proof as an answer. (2012-08-17)
  • I think it might be more appropriate for MathOverflow. (2012-08-18)
  • Just to correct my above claim: I would **not** expect an infinite number of solutions with error $C/n^2$; that should be with error $C/(n\ln n)$. I've written a more extensive answer below. (2012-08-20)
  • @AntonioVargas You say you want to investigate the behaviour of the function $\left|\left(1-e^{i c \log n}\right)g(n)\right|^{1/n}$. Would it be possible to know exactly what characteristics you are concerned with? For example, $\log 4$ is almost $\pi/2$ and $\log 23$ is almost $\pi$. I guess you want to know how many $n$ there are such that the term $c \log n$ is approximately equal to some angle $\phi + 2k\pi$. Is this correct? (2012-08-24)

3 Answers


Here's a more formal proof of the main argument presented in the comment.

First, to make things a little easier on ourselves, since $|ak+b-c\ln n|=c\cdot|a'k+b'-\ln n|$ where $a'=a/c$ and $b'=b/c$, we can pick $c=1$ without loss of generality. Secondly, we can assume that $b$ is the smallest positive number in $A$ and $a>0$ so that we're only concerned with the numbers $ak+b\in A$ for non-negative $k$.

To find $\ln n\approx ak+b$, we take $m_k=\lfloor e^{ak+b}\rfloor$. Then we get $$\ln m_k \le ak+b < \ln(m_k+1)=\ln m_k+\ln\left(1+\frac{1}{m_k}\right) < \ln m_k+\frac{1}{m_k}, $$ so since $ak+b$ lies inside an interval of length less than $1/m_k$, its distance to the nearer endpoint, i.e. either $\ln m_k$ or $\ln(m_k+1)$, must be $\epsilon_k<1/(2m_k)$. If we let $n_k$ be either $m_k$ or $m_k+1$, depending on which logarithm gave the better estimate, we get $\epsilon_k=|ak+b-\ln n_k|<\frac{1}{2(n_k-1)}$.

Thus, for any $C>1/2$, we'll have $\epsilon_k<C/n_k$ for all sufficiently large $k$, providing an infinite number of solutions.
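The construction above is easy to check numerically. Here is a short sketch (Python, with illustrative values $a=0.7$, $b=0.4$, which are assumptions rather than values from the answer):

```python
import math

a, b = 0.7, 0.4  # illustrative choices; any a > 0, b > 0 works here

solutions = []
for k in range(1, 19):
    t = a * k + b                 # the target a*k + b
    m = math.floor(math.exp(t))   # m_k = floor(e^{a k + b})
    # n_k is whichever of m, m+1 has its logarithm closer to t
    n = m if t - math.log(m) <= math.log(m + 1) - t else m + 1
    eps = abs(t - math.log(n))
    assert eps < 1 / (2 * (n - 1))  # the bound epsilon_k < 1/(2(n_k - 1))
    solutions.append((k, n, eps))

print(solutions[:3])
```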

The second claim I made I'll have to give a little more thought on how to formalise, although I can give the main idea more clearly. In any case, I think it was wrong as stated: I'd expect an infinite number of solutions with $\epsilon_k<C/(k\,n_k)$, which would translate to $\epsilon_n<C'/(n\ln n)$ (for a different constant). I'd mixed up the $k$ and $n$ when I was thinking this through.

For generic $a$ and $b$, we would expect $ak+b$ to lie at a seemingly random place inside the interval $[\ln m_k,\ln(m_k+1))$. If we let $l_k=\ln(1+1/m_k)/2$ be half the length of the interval, we should then expect the distance $\epsilon_k$ from the closest end to be uniformly distributed on $[0,l_k]$. Stated differently, we would expect the $\epsilon_k/l_k$ to be uniformly distributed on $[0,1]$.

If we take a sequence $p_k\in(0,1]$, the likelihood that $\epsilon_k/l_k<p_k$ is $p_k$. The expected number of $k$ for which $\epsilon_k<p_k l_k$ is then $\sum_k p_k$. If we let $p_k=1/k$, the expected number of solutions with $\epsilon_k<l_k/k$ is thus infinite, but by a small margin: $\epsilon_n<C/[n(\ln n)^{1+\epsilon}]$ should be expected to give only a finite number of solutions.
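One can probe this uniformity heuristic numerically. The sketch below (Python, using `decimal` for the precision that $e^{ak+b}$ demands; $a=\sqrt2$, $b=0.1$ are illustrative "generic" choices, not values from the answer) computes $\epsilon_k/l_k$ over a range of $k$ and prints the sample mean, which should hover near $1/2$ if the ratios are roughly uniform on $[0,1]$:

```python
from decimal import Decimal, getcontext

getcontext().prec = 260          # enough digits for e^{a k + b} with k < 300
a, b = Decimal(2).sqrt(), Decimal("0.1")

ratios = []
for k in range(1, 300):
    t = a * k + b
    m = int(t.exp())             # m_k = floor(e^{a k + b})
    lo, hi = Decimal(m).ln(), Decimal(m + 1).ln()
    eps = min(t - lo, hi - t)    # distance to the nearer interval end
    half = (hi - lo) / 2         # l_k, half the interval length
    ratios.append(float(eps / half))

# sample mean of eps_k / l_k; near 0.5 under the uniformity heuristic
print(sum(ratios) / len(ratios))
```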


To get a better understanding of the randomness perspective, let's rewrite the inequality as $$ m_k\le\beta\alpha^k<m_k+1, \qquad \alpha=e^a>1,\quad \beta=e^b\ge1, $$ and note that $\delta_k=\text{Dist}(\beta\alpha^k,\mathbb{Z})\approx\epsilon_k m_k$.

A case where the randomness argument fails is when $\alpha$ is an odd natural number and $\beta=3/2$: then $\beta\alpha^k$ is always a half-odd integer, which makes $\delta_k=1/2$ for all $k$. I suspect numerous similar examples can be made for algebraic $\alpha$.
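The exactness of this example is easy to verify with exact rational arithmetic (Python's `fractions`):

```python
from fractions import Fraction

# For odd integer alpha and beta = 3/2, beta * alpha**k is a half-odd
# integer (odd numerator over 2), so its distance to Z is exactly 1/2.
beta = Fraction(3, 2)
for alpha in (3, 5, 7, 9, 11):
    for k in range(8):
        x = beta * alpha ** k
        frac = x % 1                # fractional part, an exact Fraction
        dist = min(frac, 1 - frac)  # distance to the nearest integer
        assert dist == Fraction(1, 2)
```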

I would hypothesise that the set of $(\alpha,\beta)$ for which the $\epsilon_k$ (or $\delta_k$) fail to become arbitrarily small has measure zero.

  • Thank you, this is a really nice answer! (2012-08-26)

PART I

We can solve this problem by constructing such a sequence explicitly. First we simplify the terms: divide both quantities by $c$, and let the new terms be $\log\alpha_{n}$ and $a'k+b'$, where $a'=a/c$ and $b'=b/c$.

So, now the problem is to find a sequence $\{\alpha_{n}\}$ such that

\begin{equation} \log(\alpha_{n+1}) - (a'k_{n+1}+b') < \log(\alpha_{n}) - (a'k_{n}+b') \end{equation}

I say the following is such a sequence \begin{equation} \alpha_{n} = \lceil {e^{a'n+b'}} \rceil \end{equation}

PROOF. By definition of the ceiling function, \begin{equation} 0 \leq \alpha_{n} - {e^{a'n+b'}} < 1. \end{equation}

Hence, \begin{align} &\log(\alpha_{n}) - (a'n+b') \\ &= \log\left(\frac{\alpha_{n}}{e^{a'n+b'}}\right) \\ &\leq \log\left(\frac{e^{a'n+b'} + 1}{e^{a'n+b'}}\right) \\ &= \log\left(1 + \frac{1}{e^{a'n+b'}}\right) \end{align}

Now, assuming $a' > 0$, \begin{align} \log\left(1 + \frac{1}{e^{a'(n+1)+b'}}\right) < \log\left(1 + \frac{1}{e^{a'n+b'}}\right), \end{align} so the error bound decreases with $n$ and tends to $0$.

And hence the proof of the first part is complete.
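A quick numerical sanity check of this construction (Python; $a'=0.5$, $b'=0.2$ are illustrative values, and $a'>0$ is assumed as in the proof):

```python
import math

ap, bp = 0.5, 0.2  # illustrative a', b' with a' > 0

prev_bound = float("inf")
for n in range(1, 31):
    alpha = math.ceil(math.exp(ap * n + bp))      # alpha_n = ceil(e^{a'n + b'})
    err = math.log(alpha) - (ap * n + bp)         # >= 0 by the ceiling
    bound = math.log1p(math.exp(-(ap * n + bp)))  # log(1 + e^{-(a'n + b')})
    assert 0 <= err <= bound
    assert bound < prev_bound                     # the bounds strictly decrease
    prev_bound = bound
```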

  • The $n$ in the bound in the original problem corresponds to the $n=\alpha_k$ values in your formulation. In terms of the original formulation, i.e. approximations by $\log n$, the errors you find in both parts II and III are of order $C/n\sim C/\alpha_k$. (2012-08-24)
  • @EinarRødland Ohh, I see. So basically I duplicated your answer. Let it stay here though; I find my formulation easier to understand. I will correct the answer, however, to reflect the nature of the approximation and delete the $e^{-n^2}$ part. I guess that would be correct, right? (2012-08-24)

Without loss of generality, we may assume $a = 1$. Then,

$$ \inf_{k \in \mathbb{Z}} |c \log n - k - b| $$

is really just computing the distance in the circle $\mathbb{R}$ modulo $1$. That is, we are comparing the fractional parts of $c \log n$ and $b$, with wraparound (e.g. $0.9$ and $0.1$ are separated by a distance of $0.2$).

The statement

$$ \text{dist}(c \log n, A) < C n^{-\epsilon} $$

is equivalent to the statement

$$ b \in (c \log n - C n^{-\epsilon}, c \log n + C n^{-\epsilon}) $$

in the circle. (remember the interval wraps around from 0 to 1!)

Each interval has length $\min\{1, 2 C n^{-\epsilon}\}$. If $\epsilon > 1$, the sum of the lengths of these intervals converges for every $C$.

In particular, this means for any $\delta > 0$ there exists some $M$ such that the union of all the intervals for $n>M$ has total length less than $\delta$.
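The convergence step can be made concrete with a short computation (Python; $C=1$ and $\epsilon=1.5$ are arbitrary illustrative choices with $\epsilon>1$):

```python
import math

C, eps = 1.0, 1.5  # any C > 0 and eps > 1 work here

def tail_length(M, terms=10 ** 6):
    """Total length of the intervals for n = M+1 .. M+terms."""
    return sum(min(1.0, 2 * C * n ** -eps) for n in range(M + 1, M + 1 + terms))

print(tail_length(10))      # moderate
print(tail_length(10_000))  # far smaller: the tail of a convergent series
```

So by pushing $M$ up, the union of the remaining intervals can be made as short as desired, which is the $\delta$ in the statement above.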

From this we can conclude, by a Borel–Cantelli-type argument, that the set of $b$'s for which $\text{dist}(c \log n, A) < C n^{-\epsilon}$ holds infinitely often has measure $0$.

It is still possible that such $b$'s exist, however. I bet there is a phenomenon related to irrationality measure that applies here, but that topic is beyond me.

  • In retrospect, this is closely related to Einar's idea about the "probability" of success. (2012-08-21)
  • I really appreciate the perspective this answer gave. Thank you for posting it :) (2012-08-26)