
In a previous question, Summing something with random and non-random parts, I asked how one could estimate (specifically, I am interested in an upper bound for) $\sum_{k=1}^n e^{-k B_k^2}$, where $B_k$ is a uniformly distributed sequence on $[0,1]$. The response was to take the expected value of the summand. But what can I do if my $B_k$ are deterministic (and not random variables)? To clarify, by uniformly distributed I mean that $\{B_k\}$ satisfies $\lim_{n \rightarrow \infty} \frac{|[a,b] \cap \{B_1,B_2,\ldots, B_n\}|}{n} = b-a$ for every $0 \le a \le b \le 1$.

Edit: Suppose $B_n = \{ \alpha n \}$ is the fractional part of $\alpha n$ for some irrational number $\alpha$. By Weyl's Criterion I know that $\{B_n\}$ is uniformly distributed. From some numerical calculations it seems that there exists some $K > 0$ that only depends on $\alpha$ such that $\sum_{k=1}^n e^{-k B_k^2} < K \sqrt{n}$ for all sufficiently large $n$. How would I go about demonstrating this?
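
If it helps, the numerical evidence is easy to reproduce. Here is a minimal sketch (assuming, purely for illustration, $\alpha = \sqrt{2}$; it uses plain double-precision floating point, which loses some accuracy in the fractional parts for large $k$):

```python
import math

ALPHA = math.sqrt(2)  # illustrative choice of irrational alpha

def partial_sum(n, alpha=ALPHA):
    """Compute sum_{k=1}^n exp(-k * B_k^2) with B_k = {alpha * k} (fractional part)."""
    total = 0.0
    for k in range(1, n + 1):
        b = (alpha * k) % 1.0          # fractional part; inexact for very large k
        total += math.exp(-k * b * b)
    return total

for n in (10**3, 10**4, 10**5, 10**6):
    s = partial_sum(n)
    # if S_n < K * sqrt(n), the ratio below should stay bounded as n grows
    print(n, round(s, 3), round(s / math.sqrt(n), 4))
```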

Edit 2: Here is an idea I had. Split the sum as
$$\sum_{k=1}^n e^{-k B_k^2} = \sum_{\substack{k \leq n \\ e^{-k B_k^2} \leq n^{-1/2}}} e^{-k B_k^2} + \sum_{\substack{k \leq n \\ e^{-k B_k^2} > n^{-1/2}}} e^{-k B_k^2} \leq \sum_{k=1}^n n^{-1/2} + \sum_{\substack{k \leq n \\ e^{-k B_k^2} > n^{-1/2}}} 1.$$
The first sum is just $n^{1/2}$. For the second, $e^{-k B_k^2} > n^{-1/2}$ is equivalent to $B_k < \sqrt{\log(n)/(2k)}$, so
$$\#\{k \leq n : e^{-k B_k^2} > n^{-1/2}\} = \#\{k \leq n : B_k < \sqrt{\log(n)/(2k)}\}.$$
I believe that
$$\#\{k \leq n : B_k < \sqrt{\log(n)/(2k)}\} \approx \int_1^n \sqrt{\frac{\log (n)}{2t}} \, dt = \sqrt{2\log n}\,(\sqrt{n}-1),$$
or maybe that integral should be a sum, I don't know. But that implies there exists $\epsilon > 0$ such that, for sufficiently large $n$, $\#\{k \leq n : B_k < \sqrt{\log(n)/(2k)}\} < n^{1/2+\epsilon}$. So finally we have
$$\sum_{k=1}^n e^{-k B_k^2} < n^{1/2} + n^{1/2+\epsilon} < 2 n^{1/2+\epsilon}.$$
Clearly the above argument has quite a few gaps in it. I am hoping that someone can give me some feedback as to whether or not the gaps can be filled.
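
For whatever it's worth, the counting step can be compared against the integral numerically. A rough sketch, again assuming the illustrative choice $B_k = \{\sqrt{2}\,k\}$ and plain floating point:

```python
import math

ALPHA = math.sqrt(2)  # same illustrative alpha as above

def count_large_terms(n, alpha=ALPHA):
    """#{k <= n : B_k < sqrt(log(n)/(2k))}, i.e. how many terms exceed n^{-1/2}."""
    count = 0
    for k in range(1, n + 1):
        b = (alpha * k) % 1.0
        if b < math.sqrt(math.log(n) / (2 * k)):
            count += 1
    return count

for n in (10**3, 10**4, 10**5):
    cnt = count_large_terms(n)
    integral = math.sqrt(2 * math.log(n)) * (math.sqrt(n) - 1)  # value of the integral above
    print(n, cnt, round(integral, 1), round(math.sqrt(n), 1))
```

This only shows whether the heuristic is in the right ballpark; it says nothing about why the equidistribution of $\{B_k\}$ should justify the $\approx$ step.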

  • 0
    Why does it "clearly depend strongly on the first few terms"? (2011-02-09)

3 Answers

2

There are some uniformly distributed sequences $B_k$ for which $\sum_{k=0}^n e^{-kB_k^2}$ grows faster than $\sqrt{n}$:

Simply pick $B_k = 0$ for about $n^{2/3}$ indices out of the first $n$, and pad the remaining values with any uniformly distributed sequence. You still get a uniformly distributed sequence, because the density $n^{2/3}/n$ of the altered indices converges to $0$.

But with such a sequence you have $\sum_{k=0}^n e^{-kB_k^2} > n^{2/3}$ (and we could use any function $g(n)$ in place of $n^{2/3}$, provided $g(n)/n$ converges to $0$).
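
For illustration, here is a small sketch of this construction. The base sequence and the set of zeroed indices are my own arbitrary choices: the base is $\{\sqrt{2}\,k\}$, and the zeros sit at $k = \lfloor j^{3/2} \rfloor$, so that about $n^{2/3}$ of the first $n$ indices are zeroed:

```python
import math

ALPHA = math.sqrt(2)  # base uniformly distributed sequence {alpha * k}

def padded_sum(n):
    """Sum for a sequence equal to {alpha k} except B_k = 0 at k = floor(j^{3/2})."""
    zero_indices = set()
    j = 1
    while int(j ** 1.5) <= n:          # about n^{2/3} zeroed indices among the first n
        zero_indices.add(int(j ** 1.5))
        j += 1
    total = 0.0
    for k in range(1, n + 1):
        b = 0.0 if k in zero_indices else (ALPHA * k) % 1.0
        total += math.exp(-k * b * b)
    return total, len(zero_indices)

for n in (10**3, 10**4, 10**5):
    s, zeros = padded_sum(n)
    print(n, round(s, 1), zeros, round(n ** (2 / 3), 1), round(math.sqrt(n), 1))
```

The zeroed indices alone contribute about $n^{2/3}$ to the sum, which already dwarfs any $K\sqrt{n}$ bound.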

Truly random sequences have many properties beyond being "uniformly distributed". Here, a property that would suffice, and that random sequences have, is that for any bounded interval $I \subset [0,\infty)$, $$\lim_{n \rightarrow \infty} \frac{\#\{k \in \{0,\ldots, n\} : \sqrt{k}\,B_k \in I\}}{\sqrt{n}\,\operatorname{length}(I)} = 2.$$

Then you can show that the sum $\sum_{k=0}^n e^{-kB_k^2}$ behaves the same as for random sequences, for which $\begin{array}{rcl} \sum_{k=1}^n \int_{t=0}^1 e^{-kt^2}\, dt &<& \sum_{k=1}^n \int_{t=0}^\infty e^{-kt^2}\, dt = \sum_{k=1}^n \int_{t=0}^\infty \frac{e^{-t^2}}{\sqrt{k}}\, dt \\ &=& \left(\int_{t=0}^\infty e^{-t^2}\, dt \right) \left(\sum_{k=1}^n \frac{1}{\sqrt{k}} \right) \sim \sqrt{\pi n} \end{array}$ (the $k=0$ term equals $1$ and does not affect the asymptotics).

The property described above implies that in an interval $I = [x, x+dx]$ there are about $2\sqrt{n}\, dx$ integers $k \leq n$ such that $\sqrt{k}\,B_k$ lies in $I$. By dividing $\mathbb{R}^+$ into such intervals and regrouping the terms according to which interval they land in, you obtain bounds for $\sum_{k=0}^n e^{-k B_k^2}$. The finer you divide, the closer you get to the value of the integral, until you arrive at $\sqrt{\pi n}$.
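
As a sanity check of the $\sqrt{\pi n}$ estimate, here is a small Monte Carlo experiment for genuinely i.i.d. uniform $B_k$ (a sketch only; it assumes NumPy is available, and the seed and number of trials are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed for reproducibility

def random_sum(n):
    """Sum_{k=1}^n exp(-k * B_k^2) for i.i.d. uniform B_k on [0,1]."""
    k = np.arange(1, n + 1)
    b = rng.random(n)
    return float(np.exp(-k * b * b).sum())

for n in (10**3, 10**4, 10**5):
    trials = [random_sum(n) for _ in range(20)]
    print(n, round(float(np.mean(trials)), 1), round(float(np.sqrt(np.pi * n)), 1))
```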

0

I would have thought the lower bound was $0.5819767068693\ldots$, which can be approached by taking $B_k$ very close to $1$ for small $k$. Plouffe's inverse symbolic calculator tells me this is the solution to $\log_e(1+x) = 1+\log_e(x)$.

Similarly, there is no upper bound, since taking $B_k$ very close to $0$ for small $k$ will give an arbitrarily large sum.


Forget that, but I still think your $K \sqrt{n}$ may not be an upper bound.

Take for example $B_k = \dfrac{2k-1}{2n}$, i.e. $\frac{1}{2n}, \frac{3}{2n}, \frac{5}{2n}, \ldots, \frac{2n-1}{2n}$, which might be described as being uniformly distributed on $[0,1]$. Then I think you will find that, for any $n$ you care to check, $0.77\, n^{2/3} < \sum_{k=1}^n e^{-k B_k^2} < 0.9\, n^{2/3}$, so the sum increases faster than $K \sqrt{n}$.

This may not be what you describe as uniformly distributed (in particular the $B_k$ depend on $n$). Nor is it necessarily an upper bound, as I have not checked whether there may be another way of ordering the $B_k$ to give a higher sum.
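
For the record, the claimed $n^{2/3}$ growth for $B_k = \frac{2k-1}{2n}$ is easy to check numerically. A minimal sketch (pure Python, nothing assumed beyond the formula above):

```python
import math

def sum_for_midpoints(n):
    """Sum_{k=1}^n exp(-k * B_k^2) with B_k = (2k - 1) / (2n)."""
    return sum(math.exp(-k * ((2 * k - 1) / (2 * n)) ** 2) for k in range(1, n + 1))

for n in (10**3, 10**4, 10**5, 10**6):
    s = sum_for_midpoints(n)
    print(n, round(s, 1), round(s / n ** (2 / 3), 4))  # ratio to n^{2/3}
```

Heuristically, the exponent $k B_k^2 \approx k^3/n^2$ in the dominant range, so the sum is roughly $\int_0^\infty e^{-t^3/n^2}\, dt = \Gamma(4/3)\, n^{2/3} \approx 0.893\, n^{2/3}$, which is consistent with the observed ratio.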

  • 0
    $0.5819767068693\ldots = 1/(\mathrm{e}-1)$. (2011-02-10)
0

A trivial upper bound: $\sum_{k=1}^n e^{-k B_k^2} \leq \sum_{k=1}^n e^{0} = n$.

The rest of this question makes no sense to me. What does "what can I do" mean? Either you know the $B_k$ and you calculate the sum, or you don't and you estimate it.