I was trying to prove the convergence of the following integral:
$\int^{\infty}_{0}\frac{\ln{x}}{1+x^{2}}\,\mathrm{d}x$
The only way that came to mind (and indeed quite a convenient one) was to use the fact that the integral converges if and only if the series
$\sum^{\infty}_{n=1}\frac{\ln{n}}{1+n^{2}}$
converges. Indeed, it is quite easy to show with the Cauchy condensation test that it does (sketched below). However, it seems to me that we cannot use this approach: the theorem which states that $\int^{\infty}_{a}f(x)\,\mathrm{d}x$ and $\sum^{\infty}_{n=[a+1]}f(n)$ converge or diverge together for $a\geq{0}$ relies on the assumption that $f$ is decreasing. This is not the case for our function, since we have
$\lim_{x\to{0^{+}}}\frac{\ln{x}}{1+x^{2}}=-\infty$
and our function is increasing up to some $x_{0}\in(1,3)$. I thought of breaking the integral up into $\int^{y}_{0}f(x)\,\mathrm{d}x$ and $\int^{\infty}_{y}f(x)\,\mathrm{d}x$ for some $y\geq{3}$ and then applying the theorem I mentioned to the second piece. Unfortunately, it seems to me that the first integral doesn't converge...
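For completeness, here is a sketch of the condensation step (assuming the standard form of the Cauchy condensation test for nonnegative, eventually nonincreasing terms):
$\sum^{\infty}_{k=1}2^{k}\,\frac{\ln{2^{k}}}{1+2^{2k}}=\ln{2}\sum^{\infty}_{k=1}\frac{k\,2^{k}}{1+4^{k}}\leq\ln{2}\sum^{\infty}_{k=1}\frac{k}{2^{k}}<\infty,$
so the condensed series converges; since the terms $\frac{\ln{n}}{1+n^{2}}$ are nonnegative and eventually decreasing (and finitely many initial terms do not affect convergence), the original series converges as well.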
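As for the claim that the integrand increases up to some $x_{0}\in(1,3)$, a quick derivative check gives
$\frac{\mathrm{d}}{\mathrm{d}x}\,\frac{\ln{x}}{1+x^{2}}=\frac{1+x^{2}-2x^{2}\ln{x}}{x(1+x^{2})^{2}},$
and the numerator is positive for $x\leq{1}$, strictly decreasing for $x>1$ (its derivative is $-4x\ln{x}$), equal to $2$ at $x=1$, and equal to $10-18\ln{3}<0$ at $x=3$, so it changes sign exactly once, at some $x_{0}\in(1,3)$.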
My question is therefore as follows: can we violate the assumption that $f$ is decreasing? If yes, then how can we show that it is still legitimate to use that theorem? If no, in what other way can we prove the convergence of the integral above?