In Titchmarsh's book on Fourier Transforms, it is stated that the integral
$$\int\nolimits_0^\infty \frac{x^{-a}}{1+x^2}\, dx, \qquad 0 < a < 1,$$
may be calculated either by contour integration or by series expansion. Technically, it certainly is easy to do. However, since the series for $\frac{1}{1+x^2}$ only has radius of convergence $1$, owing to the poles at $\pm i$, how does one justify performing the expansion and then integrating term by term?
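For concreteness, here is a sketch of the computation I have in mind (assuming the integral is split at $x = 1$ and the substitution $x \mapsto 1/x$ is applied on $(1,\infty)$, so that the geometric series is only used on $(0,1)$, where it converges):

$$\int_0^\infty \frac{x^{-a}}{1+x^2}\, dx = \int_0^1 \frac{x^{-a} + x^{a}}{1+x^2}\, dx = \sum_{n=0}^\infty (-1)^n \int_0^1 \left(x^{2n-a} + x^{2n+a}\right) dx = \sum_{n=0}^\infty (-1)^n \left(\frac{1}{2n+1-a} + \frac{1}{2n+1+a}\right),$$

which, granted the interchange of sum and integral, is the partial-fraction expansion of $\frac{\pi}{2\cos(\pi a/2)}$. It is precisely that interchange that I would like to justify.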
I know this must be simple, but I'm not sure why it is true. Is it because the terms, when combined with $x^{-a}$, form an absolutely convergent series?
Thanks, Tom