Problem 17 of Chapter 6 of Rudin's Principles of Mathematical Analysis asks us to prove the following:
Suppose $\alpha$ increases monotonically on $[a,b]$, $g$ is continuous, and $g(x)=G'(x)$ for $a \leq x \leq b$. Prove that,
$\int_a^b\alpha(x)g(x)\,dx=G(b)\alpha(b)-G(a)\alpha(a)-\int_a^bG\,d\alpha.$
It seems to me that the continuity of $g$ is not necessary for the result above. It is enough to assume that $g$ is Riemann integrable. Am I right in thinking this?
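As a sanity check that the weaker hypothesis actually covers new cases, here is a standard example (my own addition, not from Rudin) of a derivative that is Riemann integrable but not continuous: on $[-1,1]$ take
$$G(x)=\begin{cases}x^2\sin(1/x), & x\neq 0,\\ 0, & x=0,\end{cases}\qquad g(x)=G'(x)=\begin{cases}2x\sin(1/x)-\cos(1/x), & x\neq 0,\\ 0, & x=0.\end{cases}$$
Here $g$ is bounded and continuous except at $0$, hence Riemann integrable on $[-1,1]$, but it is not continuous there.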
My reasoning is as follows:
$\int_a^bG\,d\alpha$ exists because $G$ is differentiable, hence continuous, and every continuous function is Riemann-Stieltjes integrable with respect to the monotonically increasing function $\alpha$.
$\alpha(x)$ is Riemann integrable with respect to $x$ since it is monotonic. If $g(x)$ is also Riemann integrable with respect to $x$, then $\int_a^b\alpha(x)g(x)\,dx$ exists as well, since the product of two Riemann-integrable functions is again Riemann integrable.
To prove the given formula, I start from the hint given by Rudin: for a partition $\{x_0,\dots,x_n\}$ of $[a,b]$, choose $t_i\in(x_{i-1},x_i)$ so that $g(t_i)\Delta x_i=\Delta G_i:=G(x_i)-G(x_{i-1})$, which is possible by the mean value theorem applied to $G$ on $[x_{i-1},x_i]$; then
$$\sum_{i=1}^n\alpha(x_i)g(t_i)\Delta x_i=G(b)\alpha(b)-G(a)\alpha(a)-\sum_{i=1}^nG(x_{i-1})\Delta \alpha_i.$$
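For completeness, here is a sketch (my own fill-in, not part of Rudin's hint) of why that identity holds: adding the two sums and telescoping gives
$$\sum_{i=1}^n\alpha(x_i)\,\Delta G_i+\sum_{i=1}^nG(x_{i-1})\,\Delta\alpha_i=\sum_{i=1}^n\bigl[\alpha(x_i)G(x_i)-\alpha(x_{i-1})G(x_{i-1})\bigr]=G(b)\alpha(b)-G(a)\alpha(a),$$
and rearranging yields the displayed identity, since $g(t_i)\Delta x_i=\Delta G_i$.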
Now, as the partition is refined, the sum on the right-hand side converges to $\int_a^bG\,d\alpha$. The sum on the left-hand side would converge to $\int_a^b\alpha(x)g(x)\,dx$ if it were $\sum_{i=1}^n \alpha(x_i)g(x_i)\Delta x_i$ instead. The absolute difference between that sum and the one we actually have is bounded above by $\max(|\alpha(a)|,|\alpha(b)|)\sum_{i=1}^n |g(x_i)-g(t_i)|\Delta x_i$, and this can be made arbitrarily small because $g(x)$ is Riemann integrable with respect to $x$.
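To spell out that last bound (again my own fill-in): with $M_i=\sup_{[x_{i-1},x_i]}g$ and $m_i=\inf_{[x_{i-1},x_i]}g$, both $x_i$ and $t_i$ lie in $[x_{i-1},x_i]$, so
$$\sum_{i=1}^n |g(x_i)-g(t_i)|\,\Delta x_i\le\sum_{i=1}^n (M_i-m_i)\,\Delta x_i=U(P,g)-L(P,g),$$
which can be made smaller than any $\varepsilon>0$ by a suitable choice of partition $P$, precisely because $g$ is Riemann integrable on $[a,b]$.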