While studying for my exams, I came across this question and I'm trying to think of an intelligent way to solve it (the context is Lebesgue integration):
Let $f:\mathbb{R} \to \mathbb{R}$ be a function that is absolutely continuous (please see the definition below) on every bounded interval. Show that if $f$ and $f'$ are integrable on $\mathbb{R}$ (meaning $\displaystyle \int_{\mathbb{R}} |f(x)|\,dx < \infty$ and $\displaystyle\int_{\mathbb{R}} |f'(x)|\,dx < \infty$), then:
$\lim_{x \to \infty} f(x) = \lim_{x \to -\infty} f(x) = 0$
and:
$\int_{-\infty}^{\infty} f'(x)\,dx = 0$
Definition:
Absolute continuity (I didn't know what this property is called in English): for every $\epsilon > 0$ there is a $\delta > 0$ such that for every finite collection of pairwise disjoint intervals $\{[x_i, y_i]\}_{i=1}^{n}$, if $\sum_{i=1}^{n} (y_i - x_i) < \delta$, then $\sum_{i=1}^{n} |f(y_i) - f(x_i)| < \epsilon$.
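My thought so far (only a sketch, assuming the fundamental theorem of calculus is valid for functions that are absolutely continuous on bounded intervals): absolute continuity on $[0, x]$ should give
$$f(x) - f(0) = \int_0^x f'(t)\,dt,$$
and since $f' \in L^1(\mathbb{R})$, the right-hand side converges as $x \to \infty$, so $\lim_{x \to \infty} f(x)$ exists (similarly at $-\infty$). I suspect these limits must then be $0$ because $\int_{\mathbb{R}} |f(x)|\,dx < \infty$, which would give
$$\int_{-\infty}^{\infty} f'(x)\,dx = \lim_{x \to \infty} f(x) - \lim_{x \to -\infty} f(x) = 0,$$
but I'm not sure how to justify each step rigorously.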