
Suppose I have a deterministic sequence $\{t_n\}$ that is uniformly distributed on $[0,1]$ (for example $t_n = \{ \pi n \}$, i.e. the fractional part of $\pi n$) and a decreasing function $f : \mathbb{R} \rightarrow [0,1]$.

I think it is reasonable to expect that $\#\{t_k < f(k) : k \leq n \} \approx \sum_{k = 1}^n f(k)$, but I'm not sure how I could demonstrate something like that, nor what the correct error term would be.

I know that $\#\{t_i < f(k) : i \leq n \} \approx f(k)\,n$ for each fixed $k$ (this is just equidistribution), but I am unsure how to translate that into a (rigorous) statement about $\#\{t_k < f(k) : k \leq n \} = \sum_{k=1}^n [t_k < f(k)]$, where $[x]$ is the Iverson bracket, equal to $1$ if $x$ is true and $0$ if $x$ is false.
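
For intuition, here is a quick numerical sketch of the heuristic, using the $t_n = \{\pi n\}$ sequence from the question and the illustrative choice $f(k) = 1/\sqrt{k}$ (my choice, not anything forced by the problem):

```python
from math import pi, sqrt

def frac(x):
    return x - int(x)  # fractional part, for x >= 0

def f(k):
    return 1 / sqrt(k)  # hypothetical decreasing f with values in (0, 1]

# Compare the count #{k <= n : t_k < f(k)} with the sum sum_{k <= n} f(k).
for n in (10**3, 10**4, 10**5):
    count = sum(1 for k in range(1, n + 1) if frac(pi * k) < f(k))
    total = sum(f(k) for k in range(1, n + 1))
    print(n, count, round(total, 1))
```

On this range the two quantities track each other closely (both grow like $2\sqrt{n}$), which is consistent with the heuristic, though of course it proves nothing.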

Edit: As chandok has pointed out, it is possible to pick a function and sequence where the above does not hold. If we require that $\#\{t_k < f(n) : k \leq n\}$ be unbounded, can we avoid "bad" functions?

Edit 2: The motivation for this question comes from "Estimating a sum containing a uniformly distributed but deterministic term".

1 Answer


The problem with deterministic "random" sequences is that you can easily use them to build an object (here, a decreasing function) against which they won't look "random" at all:

If you pick $f(k) = \inf \{t_m : 1 \leq m \leq k\}/2$, then $f(k) \leq t_k/2$, so $t_k < f(k)$ would force $t_k < t_k/2$, which is impossible for $t_k \geq 0$. Hence $\#\{t_k < f(k) : k \leq n \} = 0$ for every $n$ (and with $\leq$ in place of $<$, the count is exactly $\#\{t_k = 0 : k \leq n \}$).

You can easily pick "uniformly distributed" sequences where even the $\leq$-count is $0$ (just avoid the value $0$ entirely), while $\sum_k f(k)$ gets as large as you want (in fact I think it will diverge almost surely if you pick a random uniformly distributed sequence).
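
A minimal sketch of this construction against the $t_k = \{\pi k\}$ sequence from the question (since $\pi$ is irrational, $t_k$ is never $0$, so the count is $0$ under either inequality):

```python
from math import pi

def frac(x):
    return x - int(x)  # fractional part, for x >= 0

n = 10**5
running_min = 1.0
count, total = 0, 0.0
for k in range(1, n + 1):
    t = frac(pi * k)
    running_min = min(running_min, t)
    fk = running_min / 2   # f(k) = inf{t_m : 1 <= m <= k} / 2
    if t < fk:
        count += 1
    total += fk

print(count, total)  # count stays 0 while the partial sum keeps growing
```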

  • Asking for $f$ to have a non-zero limit should work better. Otherwise, if you pick a random sequence against a deterministic $f$, or a deterministic sequence against a random $f$, there should be some good result that holds with probability $1$. If you pick both deterministically, you will always be able to pick specific things that just won't work. (2011-02-09)
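
A rough check of the comment's probability-1 claim, in the random-sequence-versus-deterministic-$f$ direction: against i.i.d. uniform $t_k$, the indicators $[t_k < f(k)]$ are independent Bernoulli$(f(k))$, so when $\sum_k f(k)$ diverges the count divided by the sum should tend to $1$ almost surely (an SLLN-type statement; $f(k) = 1/\sqrt{k}$ is again just an illustrative choice):

```python
import random

random.seed(0)
n = 10**5
count = sum(1 for k in range(1, n + 1) if random.random() < 1 / k**0.5)
total = sum(1 / k**0.5 for k in range(1, n + 1))
print(count / total)  # typically close to 1
```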