
For fixed $a,b,c \in \mathbb{R}$ with $ac \neq 0$, it seems to me that one can find an increasing sequence of integers $\{\alpha_n\}$ such that the quantity $c \log \alpha_n$ becomes arbitrarily close to elements of the set

$ A = \{ ak+b \,\colon k \in \mathbb{Z} \}. $

First, is this true? If so, my question is: how good is the approximation?

For example, given any $\epsilon > 0$, is it possible to find infinitely many integers $n$ such that, for some constant $C$,

$ \textrm{dist}(c \log n,A) := \inf_{k \in \mathbb{Z}} |c \log n - ak-b| \leq C n^{-\epsilon}? $

How about

$ \textrm{dist}(c \log n,A) \leq C e^{-n}? $

I am also interested in results which might say something like "There are at most finitely many $n$ satisfying

$ \textrm{dist}(c \log n,A) \leq C e^{-n^2} $

for any positive constant $C$" to illustrate the "best-possible" nature of a less restrictive bound.
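As a quick numerical experiment (with the illustrative values $a=c=1$, $b=0$, which are my choice and not fixed by the problem), taking $n$ to be the nearest integer to $e^k$ already produces very small distances:

```python
import math

def dist_to_A(x, a, b):
    """Distance from x to the set A = {a*k + b : k an integer}."""
    t = (x - b) / a
    return abs(a) * abs(t - round(t))

# Illustrative parameters (my choice; the problem allows any a, b, c with ac != 0).
a, b, c = 1.0, 0.0, 1.0

# n = round(e^k) makes c*log(n) land close to the element a*k + b of A.
best = []
for k in range(1, 15):
    n = round(math.exp(k))
    best.append((n, dist_to_A(c * math.log(n), a, b)))

for n, d in best:
    print(n, d)
```

Each printed distance is below $1/(2(n-1))$, which already suggests a polynomial rate of approximation.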


Motivation

I am trying to determine the behavior of a quantity like

$ \left|\left(1-e^{i c \log n}\right)g(n)\right|^{1/n}, $

where $g(n)$ is well-behaved. Until now I have simply been excluding all $n$ for which $1-e^{i c \log n}$ lies in some small fixed neighborhood of the origin, but doing this I lose infinitely many $n$.

If, for example, it turns out that, for some positive constant $C$, there are only finitely many $n$ satisfying

$ \left| 1-e^{i(\theta + c \log n)} \right| \leq C n^{-1-\epsilon} \tag{1} $

for any $\epsilon > 0$, then all but finitely many $n$ satisfy

$ C n^{-1} \leq \left| 1-e^{i(\theta + c \log n)} \right| \leq 2. $

In that case we could exclude at most finitely many $n$ to obtain the desirable property

$ \left| 1-e^{i(\theta + c \log n)} \right|^{1/n} \to 1 $

as $n \to \infty$.

Now, if we let

$ B = \{2\pi k - \theta \,\colon k \in \mathbb{Z}\}, $

then the condition that only finitely many $n$ satisfy $(1)$ is equivalent to the existence of a positive constant $C_1$ such that only finitely many $n$ satisfy

$ \textrm{dist}(c \log n,B) \leq C_1 n^{-1-\epsilon} $

for any $\epsilon > 0$.
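The equivalence rests on the identity $|1-e^{i\varphi}| = 2|\sin(\varphi/2)|$, which gives the two-sided comparison $\frac{2}{\pi}\,\textrm{dist}(\varphi, 2\pi\mathbb{Z}) \le |1-e^{i\varphi}| \le \textrm{dist}(\varphi, 2\pi\mathbb{Z})$. A small numerical check (the values of $c$ and $\theta$ below are arbitrary illustrative choices):

```python
import cmath
import math

def dist_to_B(x, theta):
    """Distance from x to B = {2*pi*k - theta : k an integer}."""
    t = (x + theta) / (2 * math.pi)
    return 2 * math.pi * abs(t - round(t))

c, theta = 2.0, 0.7  # arbitrary illustrative values

for n in (5, 50, 500):
    d = dist_to_B(c * math.log(n), theta)
    m = abs(1 - cmath.exp(1j * (theta + c * math.log(n))))
    # |1 - e^{i phi}| = 2|sin(phi/2)| is squeezed between (2/pi)*d and d
    assert (2 / math.pi) * d <= m <= d + 1e-12
    print(n, d, m)
```

So the modulus $|1-e^{i(\theta + c\log n)}|$ and $\textrm{dist}(c\log n, B)$ are comparable up to absolute constants, which is why $(1)$ can be restated in terms of the distance.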

  • @AntonioVargas You say you want to investigate the behaviour of the function $\left|\left(1-e^{i c \log n}\right)g(n)\right|^{1/n}$. Would it be possible to know exactly which characteristics you are concerned with? For example, $\log 4$ is almost $\pi/2$ and $\log 23$ is almost $\pi$. I guess you want to know how many $n$ there are such that the term $c \log n$ is approximately equal to some angle $\phi + 2k\pi$. Is this correct? (2012-08-24)

3 Answers


Here's a more formal proof of the main argument presented in the comment.

First, to make things a little easier on ourselves, since $|ak+b-c\ln n|=|c|\cdot|a'k+b'-\ln n|$ where $a'=a/c$ and $b'=b/c$, we can take $c=1$ without loss of generality. Secondly, we can assume that $a>0$ and that $b$ is the smallest positive number in $A$, so that we're only concerned with the numbers $ak+b\in A$ for non-negative $k$.

To find $\ln n\approx ak+b$, we take $m_k=\lfloor e^{ak+b}\rfloor$. Then we get $\ln m_k \le ak+b < \ln(m_k+1)=\ln m_k+\ln\left(1+\frac{1}{m_k}\right) < \ln m_k+\frac{1}{m_k},$ so since $ak+b$ lies inside an interval of length less than $1/m_k$, its distance to the nearer endpoint, i.e. to $\ln m_k$ or $\ln(m_k+1)$, must be $\epsilon_k<1/(2m_k)$. If we let $n_k$ be either $m_k$ or $m_k+1$, depending on which logarithm gave the better estimate, we get $\epsilon_k=|ak+b-\ln n_k|<\frac{1}{2(n_k-1)}$.

Thus, for any $C>1/2$, we'll have $\epsilon_k<C/n_k$ for all sufficiently large $k$ (since $\frac{1}{2(n_k-1)}\le\frac{C}{n_k}$ once $n_k\ge\frac{2C}{2C-1}$), providing an infinite number of solutions.
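A quick numerical check of the construction (with the illustrative values $a=0.9$, $b=0.4$, my choice, and $c$ normalised to $1$): the better of $m_k$, $m_k+1$ always lands within $1/(2(n_k-1))$ of $ak+b$.

```python
import math

a, b = 0.9, 0.4  # illustrative values, with c normalised to 1

records = []
for k in range(1, 12):
    x = a * k + b
    m = math.floor(math.exp(x))
    # pick whichever of m, m+1 gives the better logarithmic approximation
    n = min(m, m + 1, key=lambda t: abs(x - math.log(t)))
    eps = abs(x - math.log(n))
    assert eps < 1 / (2 * (n - 1))
    records.append((k, n, eps))

print(records[-1])
```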

The second claim I made, I'll have to give a little more thought on how to formalise, although I can give the main idea more clearly. Anyway, I think it was wrong as stated: I'd expect an infinite number of solutions with $\epsilon_k<C/(k\,n_k)$, which would translate to $\epsilon_k<C/(n_k\ln n_k)$ (for a different $C$). I'd mixed up the $k$ and $n$ when I was thinking this through.

For generic $a$ and $b$, we would expect $ak+b$ to lie at some seemingly random place inside the interval $[\ln m_k,\ln(m_k+1))$. If we let $l_k=\ln(1+1/m_k)/2$ be half the length of the interval, we should then expect the distance $\epsilon_k$ from the closest end to be uniformly distributed on $[0,l_k]$. Stated differently, we would expect the $\epsilon_k/l_k$ to be uniformly distributed on $[0,1]$.

If we take a sequence $p_k\in(0,1]$, the likelihood that $\epsilon_k/l_k<p_k$ is $p_k$. The expected number of $k$ for which $\epsilon_k<p_k l_k$ is then $\sum_k p_k$. If we let $p_k=1/k$, the expected number of solutions with $\epsilon_k<l_k/k$ is thus infinite, but only by a small margin: $\epsilon_k<C/[n_k(\ln n_k)^{1+\epsilon}]$ should be expected to give a finite number of solutions.


To get a better understanding of the randomness perspective, let's rewrite the inequality as $ m_k\le\beta\alpha^k<m_k+1, \quad\text{where } \alpha=e^a>1,\ \beta=e^b\ge1, $ and note that $\delta_k=\text{dist}(\beta\alpha^k,\mathbb{Z})\approx\epsilon_k m_k$.

A case where the randomness argument fails is when $\alpha$ is an odd natural number and $\beta=3/2$. This makes $\delta_k=1/2$ for all $k$. I suspect numerous similar examples can be made for algebraic $\alpha$.
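The failure case can be verified in exact rational arithmetic, e.g. with $\alpha=5$ (an arbitrary odd choice):

```python
from fractions import Fraction

# With alpha an odd natural number and beta = 3/2, the number
# beta * alpha^k = (3 * alpha^k) / 2 has an odd numerator, so its
# fractional part is exactly 1/2 for every k.
alpha, beta = 5, Fraction(3, 2)

for k in range(1, 20):
    x = beta * alpha**k
    frac = x - int(x)  # exact fractional part
    delta = min(frac, 1 - frac)  # distance to the nearest integer
    assert delta == Fraction(1, 2)

print("delta_k = 1/2 for k = 1, ..., 19")
```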

I would hypothesise that the set of $(\alpha,\beta)$ for which the $\epsilon_k$ (or $\delta_k$) fail to become arbitrarily small has measure zero.

  • Thank you, this is a really nice answer! (2012-08-26)

PART I

We can solve this problem by constructing such a sequence explicitly. First we simplify: dividing through by $c$, we work with $\log\alpha_{n}$ and $a'k+b'$, where $a'=a/c$ and $b'=b/c$.

So, now the problem is to find a sequence $\{\alpha_{n}\}$ such that

So, now the problem is to find a sequence $\{\alpha_{n}\}$ for which the error \begin{equation} \log(\alpha_{n}) - (a'k_{n}+b') \end{equation} tends to $0$ as $n \to \infty$.

I say the following is such a sequence (taking $k_n = n$): \begin{equation} \alpha_{n} = \lceil {e^{a'n+b'}} \rceil \end{equation}

PROOF. Since $\alpha_{n} = \lceil e^{a'n+b'} \rceil$, \begin{equation} 0 \leq \alpha_{n} - {e^{a'n+b'}} \leq 1 \end{equation}

Hence, \begin{align} &\log(\alpha_{n}) - (a'n+b') \\ &= \log\left(\frac{\alpha_{n}}{e^{a'n+b'}}\right) \\ &\leq \log\left(\frac{e^{a'n+b'} + 1}{e^{a'n+b'}}\right) \\ &= \log\left(1 + \frac{1}{e^{a'n+b'}}\right) \end{align}

Now, since $e^{a'(n+1)+b'} > e^{a'n+b'}$, \begin{align} \log\left(1 + \frac{1}{e^{a'(n+1)+b'}}\right) < \log\left(1 + \frac{1}{e^{a'n+b'}}\right), \end{align} and the bound $\log\left(1 + e^{-(a'n+b')}\right)$ tends to $0$ as $n \to \infty$.

This completes the proof of the first part.
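A numerical check of this construction (with the illustrative values $a'=0.8$, $b'=0.3$, my choice) confirms that the error is non-negative, stays below $\log(1+e^{-(a'n+b')})$, and tends to $0$:

```python
import math

ap, bp = 0.8, 0.3  # illustrative values for a', b'

errs = []
for n in range(1, 15):
    alpha_n = math.ceil(math.exp(ap * n + bp))
    err = math.log(alpha_n) - (ap * n + bp)
    # 0 <= err <= log(1 + e^{-(a'n + b')}), up to floating-point slack
    assert -1e-12 <= err <= math.log(1 + math.exp(-(ap * n + bp)))
    errs.append(err)

print(errs[-1])
```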

  • @EinarRødland Ohh, I see. So basically I duplicated your answer. Let it stay here though; I find my formulation easier to understand. I will correct the answer, though, to reflect the nature of the approximation and delete the $e^{-n^2}$ part. I guess that would be correct, right? (2012-08-24)

Without loss of generality, we may assume $a = 1$. Then,

$ \inf_{k \in \mathbb{Z}} |c \log n - k - b| $

is really just computing the distance in the circle $\mathbb{R}/\mathbb{Z}$, i.e. $\mathbb{R}$ modulo $1$. That is, we are comparing the fractional parts of $c \log n$ and $b$, with wraparound (e.g. $0.9$ and $0.1$ are separated by a distance of $0.2$).

The statement

$ \text{dist}(c \log n, A) < C n^{-\epsilon} $

is equivalent to the statement

$ b \in (c \log n - C n^{-\epsilon}, c \log n + C n^{-\epsilon}) $

in the circle. (remember the interval wraps around from 0 to 1!)

Each interval has length $\min\{1, 2 C n^{-\epsilon}\}$. If $\epsilon > 1$, the sum of the lengths of these intervals converges for every $C$.

In particular, this means for any $\delta > 0$ there exists some $M$ such that the union of all the intervals for $n>M$ has total length less than $\delta$.

From this, we can conclude that the set of $b$'s for which $\text{dist}(c \log n, A) < C n^{-\epsilon}$ infinitely often has measure $0$.

It is still possible that such $b$'s exist, however. I bet there is a phenomenon related to irrationality measure that applies here, but that topic is beyond me.
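As a sanity check on the convergence claim, the interval lengths can be summed numerically (the constants $C$ and $\epsilon$ below are arbitrary illustrative choices with $\epsilon > 1$): the tail beyond $M$ behaves like $\frac{2C}{\epsilon-1}M^{1-\epsilon}$ and so can be made as small as desired.

```python
C, eps = 1.0, 1.5  # illustrative constants with eps > 1

def tail_length(M, N=10**5):
    """Total length of the intervals for M <= n < N (N truncates the sum)."""
    return sum(min(1.0, 2 * C * n ** -eps) for n in range(M, N))

t1, t2 = tail_length(10), tail_length(1000)
assert t2 < t1  # the tail shrinks as M grows
print(t1, t2)
```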

  • I really appreciate the perspective this answer gave. Thank you for posting it :) (2012-08-26)