If $f:\mathbb{R}\to\mathbb{R}$ is continuous and monotonically increasing on the interval $[1,\infty)$, with $f'(x)\leq\frac{1}{x}$ for all $x\in[1,\infty)$, is it true that:
$$\lim_{n \rightarrow \infty} \frac{1}{nf(n)}\sum_{k=1}^n f(k)=1$$
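For what it's worth, the special case $f=\ln$ can be checked directly with Stirling's formula $\ln(n!) = n\ln n - n + O(\ln n)$:
$$\frac{1}{n\ln n}\sum_{k=1}^{n}\ln k \;=\; \frac{\ln(n!)}{n\ln n} \;=\; 1-\frac{1}{\ln n}+O\!\left(\tfrac{1}{n}\right)\;\longrightarrow\;1,$$
so at least in that case the limit holds; the question is whether the condition $f'(x)\leq\frac{1}{x}$ alone forces it in general.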
This is by no means a theorem; it's just a guess I made after experimenting with sums of the natural logarithm. It makes intuitive sense to me: a monotonically increasing function $f$ whose derivative is bounded by $\frac{1}{x}$ keeps growing as $x$ increases, but the rate at which it grows dies off, so in a sense $f$ is 'increasing at a very slow rate', almost levelling off. That makes the terms just below $f(x)$, i.e. $f(x-1), f(x-2),\ldots$, all very close in value to $f(x)$. So the summands towards the end of the sum should all be nearly equal, while the early ones like $f(1), f(2),\ldots$ are negligible. So although clearly $\sum_{k=1}^n f(k)$ falls short of $nf(n)$, the ratio $\frac{1}{nf(n)}\sum_{k=1}^n f(k)$ should still tend to $1$.
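Here is a minimal sketch of the kind of experiment I mean (Python, purely for illustration), for $f=\ln$; the ratio does creep toward $1$, though very slowly:

```python
import math

# Numerical check of the conjectured limit for f(x) = ln(x),
# which satisfies f'(x) = 1/x <= 1/x on [1, infinity).
def ratio(n: int) -> float:
    partial_sum = sum(math.log(k) for k in range(1, n + 1))
    return partial_sum / (n * math.log(n))

for n in (10, 1_000, 100_000, 1_000_000):
    print(f"n = {n:>9}: ratio = {ratio(n):.6f}")
```

The convergence is slow (for $f=\ln$ the error is about $\frac{1}{\ln n}$), so the numbers only suggest, rather than confirm, the limit.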
A proof or disproof of the claim would be nice, and some background intuition would also be greatly appreciated.