Suppose I have a deterministic sequence $\{t_n\}$ that is uniformly distributed on $[0,1]$ (for example $t_n = \{ \pi n \}$, i.e. the fractional part of $\pi n$) and a decreasing function $f : \mathbb{R} \rightarrow [0,1]$.
I think it is reasonable to expect that $\#\{t_k < f(k) : k \leq n \} \approx \sum_{k = 1}^n f(k)$, but I'm not sure how I could prove something like that, nor what the correct error term would be.
I know (by equidistribution) that for each fixed $k$ we have $\#\{t_i < f(k) : i \leq n \} \approx f(k)\,n$, but I am unsure how to translate that into a (rigorous) statement about $\#\{t_k < f(k) : k \leq n \} = \sum_{k=1}^n [t_k < f(k)]$, where $[\cdot]$ is the Iverson bracket, equal to $1$ if the enclosed statement is true and $0$ if it is false.
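For what it's worth, here is a quick numerical sanity check of the heuristic with $t_k = \{\pi k\}$. This is only a sketch: the choice $f(k) = 1/\sqrt{k}$ is my own illustrative decreasing function (not part of the question), and the floating-point computation of $\{\pi k\}$ is only reliable for moderate $k$.

```python
from math import pi, sqrt

# Numerical check of the heuristic #{k <= n : t_k < f(k)} ~ sum_{k<=n} f(k)
# with t_k = {pi k}.
# NOTE: f(k) = 1/sqrt(k) is an assumed illustrative choice, and for very
# large k the fractional part {pi k} should be computed in higher precision
# (e.g. with mpmath) to avoid floating-point error.

def frac(x):
    """Fractional part of x (for x >= 0)."""
    return x - int(x)

def f(k):
    return 1.0 / sqrt(k)

N = 10**6
count = 0           # #{k <= N : t_k < f(k)}
partial_sum = 0.0   # sum_{k <= N} f(k)
for k in range(1, N + 1):
    if frac(pi * k) < f(k):
        count += 1
    partial_sum += f(k)

print(count, partial_sum, count - partial_sum)
```

If the heuristic is right, `count` and `partial_sum` should stay close (here $\sum_{k \le n} 1/\sqrt{k} \approx 2\sqrt{n}$), though of course such an experiment says nothing rigorous about the error term.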
Edit: As chandok has pointed out, it is possible to pick a function and a sequence for which the above does not hold. If we require that $\#\{t_k < f(n) : k \leq n\}$ be unbounded as $n \to \infty$, can we rule out such "bad" functions?
Edit 2: The motivation for this question comes from Estimating a sum containing a uniformly distributed but deterministic term.