
Consider a cousin of the Chebyshev function:

$$t(x) = \sum \alpha_i $$ such that $$ p_i^{\alpha_i }= x, \ p_i \leq x$$

I speculated that $t(x) \sim C(x)$, where $C(x)$ is the number of composites $\leq x$.

If $t(x) = \sum \alpha_i = \sum \log_{p_i}x = \sum \frac{\ln x}{\ln p_i} = \ln x \sum \frac{1}{\ln p_i} \sim \ln x\,\operatorname{Li}(x)$, then $t(x) \sim \ln x \cdot \frac{x}{\ln x} \sim x \sim C(x)$.

After some thought the above seems okay, and also for

$$r(x) = \sum \beta_j $$ such that $$c_j^{\beta_j }= x, \ c_j \leq x,$$ in which $c_j$ is the $j$th composite (excluding 1).

Because $s = \sum\frac{1}{\ln c_j}$ is nearly $\sum \frac{1}{\ln x}$, we would expect that $\ln x\sum \frac{1}{\ln c_j}\sim x$ also.

My lingering question is this: why would

$\sum \beta_j - x \sim \pi(x)$, or equivalently, $\ln x\sum \frac{1}{\ln c_j}- x \sim \pi(x)$?

A typical calculation: for $x = 900{,}000$ I get $r(x) = 971141$, $r(x) - x = 71141$, and $\pi(x) = 71274$.
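In case anyone wants to reproduce this kind of calculation, here is a rough Python sketch of mine (just illustrative; the helper names `sieve` and `t_r_pi` are arbitrary) that computes $t(x)$, $r(x)$, and $\pi(x)$ directly from the definitions above. The sums are brute force, so it is only practical for moderate $x$.

```python
# Illustrative sketch: compute
#   t(x) = log(x) * sum over primes p <= x of 1/log(p),
#   r(x) = log(x) * sum over composites 1 < c <= x of 1/log(c),
# and pi(x), so the comparison r(x) - x vs. pi(x) can be reproduced.
from math import log

def sieve(n):
    """Return a boolean list is_prime[0..n] via the sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, n + 1, i):
                is_prime[j] = False
    return is_prime

def t_r_pi(x):
    is_prime = sieve(x)
    lx = log(x)
    t = lx * sum(1 / log(n) for n in range(2, x + 1) if is_prime[n])
    r = lx * sum(1 / log(n) for n in range(2, x + 1) if not is_prime[n])
    pi = sum(is_prime[2:x + 1])  # count of primes <= x
    return t, r, pi

if __name__ == "__main__":
    x = 900_000
    t, r, pi = t_r_pi(x)
    print(f"t(x) = {t:.2f}, r(x) = {r:.2f}, r(x) - x = {r - x:.2f}, pi(x) = {pi}")
```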

  • The number of composites less than $x$ is asymptotic to $x$, so I don't see what isn't satisfying. I'm sure $\sum_{p\lt x}(1/\log p)$ is discussed in intro analytic number theory texts, but I'm away from my references so I can't be more specific. (2011-12-30)
  • @GerryMyerson: I worked through most of my questions about the above but edited in one lingering question. Thanks for any thoughts. (2011-12-30)
  • It should be easy to get error terms for $\sum(1/\log m)$. Then it should be possible to find good estimates for $\sum(1/\log p)$ in the literature. Once you have both of these you'll be able to subtract and get something about the sum over the composites. (2011-12-31)

1 Answer


First, I wanted to point out that most of what you wrote is not correct. $t(x)\sim\frac{x}{\log x}$, not $\sim x$. This is because $\sum_{p\leq x}1=\pi(x)\sim \text{li}(x)$, so dividing each term by $\log p$ should give roughly $\text{li}(x)/\log x$, and then $t(x)=\log x\sum_{p\leq x}\frac{1}{\log p}\sim \text{li}(x)\sim\frac{x}{\log x}$. Also, the computations are way off. Checking in Matlab, I have that $$r(900000)=900090.94\dots$$ and hence $$r(900000)-900000=90.94\dots$$ You did not remove the primes correctly. Now let's prove all of this in full.

Some Rigorous Proofs: Let's go over everything in detail. The two series you are considering are

$$t(x)=\log x\sum_{p\leq x}\frac{1}{\log p}$$ and $$r(x)=\log x\sum_{\substack{1<n\leq x\\ n\text{ composite}}}\frac{1}{\log n}.$$ Notice that $$r(x)+t(x)=\log x\sum_{2\leq n\leq x}\frac{1}{\log n},$$ so we need only evaluate $\sum_{2\leq n\leq x}\frac{1}{\log n}$ and $\sum_{p\leq x}\frac{1}{\log p}$. First:

Lemma 1: $$\sum_{2\leq n\leq x}\frac{1}{\log n}=\text{li}(x)-C+O\left(\frac{1}{\log x}\right)$$ where the constant $C$ is given by $$C=\int_{2}^{\infty}\frac{\left\{ t\right\} }{t\left(\log t\right)^{2}}dt.$$

Proof: We may write this as a Riemann-Stieltjes integral: $$\sum_{2\leq n\leq x}\frac{1}{\log n}=\int_{2}^{x}\frac{1}{\log t}d\left[t\right]=\int_{2}^{x}\frac{1}{\log t}dt-\int_{2}^{x}\frac{1}{\log t}d\left\{ t\right\},$$ where $\left[t\right]$ and $\left\{ t\right\}$ are the floor and fractional parts of $t$, respectively. Using integration by parts and the definition of $\text{li}(x)$, this is $$\text{li}(x)-\frac{\left\{ x\right\} }{\log x}-\int_{2}^{x}\frac{\left\{ t\right\} }{t\left(\log t\right)^{2}}dt.$$ Extending the last integral to infinity costs at most $\int_{x}^{\infty}\frac{dt}{t\left(\log t\right)^{2}}=\frac{1}{\log x}$, and $\frac{\left\{ x\right\}}{\log x}=O\left(\frac{1}{\log x}\right)$, which gives the stated formula with $C=\int_{2}^{\infty}\frac{\left\{ t\right\} }{t\left(\log t\right)^{2}}dt$.
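As a quick numerical sanity check of Lemma 1 (my own sketch, not part of the original argument), one can watch $\text{li}(x)-\sum_{2\leq n\leq x}\frac{1}{\log n}$ settle down to a constant. Here $\text{li}(x)$ is computed as $\operatorname{Ei}(\log x)$ via SciPy, which may differ from the convention used above by an additive constant, so the limit need not be exactly the $C$ of the lemma.

```python
# Sanity check of Lemma 1 (illustrative sketch): the difference
# li(x) - sum_{2 <= n <= x} 1/log(n) should stabilize, up to O(1/log x),
# at a constant (its exact value depends on the convention chosen for li).
from math import log
from scipy.special import expi  # li(x) = Ei(log x)

def li(x):
    return expi(log(x))

partial = 0.0  # running value of sum_{2 <= n <= x} 1/log(n)
n = 1
for x in [10**3, 10**4, 10**5, 10**6]:
    while n < x:
        n += 1
        partial += 1 / log(n)
    print(x, li(x) - partial)  # should approach a constant as x grows
```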

Remark: Although I have not done the computation myself, I believe the constant $C$ can be cleaned up in terms of other known constants.

Lemma 2: We have that $$\sum_{p\leq x}\frac{1}{\log p}=\text{li}\left(x\right)-\frac{x}{\log x}+O\left(xe^{-c\sqrt{\log x}}\right).$$

Proof: If $\theta(x)=\sum_{p\leq x}\log p$, we may write $$\sum_{p\leq x}\frac{1}{\log p}=\int_{2}^{x}\frac{1}{\left(\log t\right)^{2}}d\theta\left(t\right)=\int_{2}^{x}\frac{1}{\left(\log t\right)^{2}}dt+\int_{2}^{x}\frac{1}{\left(\log t\right)^{2}}d\left(\theta(t)-t\right).$$ For the first term, notice that by integration by parts $$\int_{2}^{x}\frac{1}{\log t}dt=\frac{x}{\log x}+\int_{2}^{x}\frac{1}{\left(\log t\right)^{2}}dt+O(1).$$ For the second term we can use integration by parts along with the prime number theorem, which states that $\theta(t)-t=O\left(te^{-c\sqrt{\log t}}\right)$. Then $$\int_{2}^{x}\frac{1}{\left(\log t\right)^{2}}d\left(\theta(t)-t\right)=\frac{\theta(x)-x}{\left(\log x\right)^{2}}+2\int_{2}^{x}\frac{\theta(t)-t}{t\left(\log t\right)^{3}}dt+O(1)=O\left(xe^{-c\sqrt{\log x}}\right),$$ and the lemma follows.
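Again as a numerical sanity check (my own sketch), one can compare $\sum_{p\leq x}\frac{1}{\log p}$ directly against $\text{li}(x)-\frac{x}{\log x}$; the two should track each other, up to lower-order terms and the choice of convention for $\text{li}$.

```python
# Sanity check of Lemma 2 (illustrative sketch): compare
# sum_{p <= x} 1/log(p) with li(x) - x/log(x).
from math import log
from scipy.special import expi  # li(x) = Ei(log x)

def li(x):
    return expi(log(x))

def primes_up_to(n):
    """Simple sieve of Eratosthenes returning the list of primes <= n."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, n + 1, i):
                is_prime[j] = False
    return [p for p in range(2, n + 1) if is_prime[p]]

for x in [10**4, 10**5, 10**6]:
    s = sum(1 / log(p) for p in primes_up_to(x))
    print(x, s, li(x) - x / log(x))  # the two columns should track each other
```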

Consequences: Putting these two lemmas together, we find that $$t(x)=\text{li}(x)\log x-x+O\left(xe^{-c\sqrt{\log x}}\right),$$ $$r(x)+t(x)=\text{li}(x)\log x -C\log x+O\left(xe^{-c\sqrt{\log x}}\right).$$ We have written the larger error term in the second formula because we are about to subtract $t(x)$ to isolate $r(x)$, and that error term consumes the $C\log x$ term, since it is larger. Hence $$r(x)=x+O\left(xe^{-c\sqrt{\log x}}\right)$$ and $$r(x)-x=O\left(xe^{-c\sqrt{\log x}}\right).$$
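In case it is helpful, here is the subtraction written out in one line (it is nothing more than the two displays above combined): $$r(x)=\bigl[r(x)+t(x)\bigr]-t(x)=\left(\text{li}(x)\log x-C\log x\right)-\left(\text{li}(x)\log x-x\right)+O\left(xe^{-c\sqrt{\log x}}\right)=x-C\log x+O\left(xe^{-c\sqrt{\log x}}\right).$$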

I hope that helps,

  • The Riemann Hypothesis gives $\theta(t)-t=O(t^{1/2+\epsilon})$, so I'd expect the agreement between $r(x)$ and $x$ to be better in practice than what we can actually prove. (2012-01-01)
  • @GerryMyerson: Certainly, if we assume RH then $$r(x)-x=O_\epsilon\left(x^{\frac{1}{2}+\epsilon}\right)$$ for any $\epsilon$. (Of course the other direction is true as well, and this is equivalent to RH.) (2012-01-01)
  • @EricNaslund: I will be working through this for a while. Lemma 2 is especially interesting. Thanks. (2012-01-02)
  • @daniel: I just wanted to add: notice in particular that since it is $\text{li}(x)-\frac{x}{\log x}$, the sum will be asymptotic to $\frac{x}{(\log x)^2}$. (2012-01-02)
  • @EricNaslund: The last integration by parts in Lemma 2: if $v$ is $(1/\log t)^2$, isn't the coefficient of the integral 2 rather than 1/2, since $dv$ is $-2(\log t)^{-3}(1/t)\,dt$? (2012-01-06)
  • @Daniel: I think you are right, it is a 2. This doesn't affect the proof, as the constant is sucked into the big-O term, so I might have been a bit sloppy. (2012-01-06)
  • Right, it doesn't affect it. This has been very helpful. I was able to parse the R-S integrals using definitions and am close to having a feel for what you have done. Thanks again. (2012-01-06)