
I need to prove that if $f \in L^{1}(\mathbb{R})$, then $\int |f(x+t)-f(x)|\,dx \to 0$ as $t \to 0$. I thought about approximating $f$ by simple or continuous functions, but then realized I couldn't apply any of the standard convergence theorems directly, since they are stated for sequences. I can prove that $\int |f(x+\frac{1}{n})-f(x)|\,dx \to 0$ as $n \to \infty$, but I'm not sure how this would imply the desired statement. Are there generalized versions of the convergence theorems that apply to non-discrete indexing parameters?
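For orientation, here is a minimal sketch of how a sequential statement yields the continuous-parameter one, using the shorthand $\varphi(t) := \int_{\mathbb{R}} |f(x+t)-f(x)|\,dx$ (notation introduced only for this sketch):

$$
\varphi(t) \to 0 \ \text{ as } t \to 0
\quad\Longleftrightarrow\quad
\varphi(t_n) \to 0 \ \text{ for every sequence } t_n \to 0,\ t_n \neq 0.
$$

The forward direction is immediate. For the reverse, if $\varphi(t) \not\to 0$ as $t \to 0$, then there exist $\varepsilon > 0$ and a sequence $t_n \to 0$ with $\varphi(t_n) \ge \varepsilon$ for all $n$, contradicting the sequential statement. So it suffices to carry out the $\frac{1}{n}$ argument along an arbitrary null sequence $(t_n)$, not just $t_n = \frac{1}{n}$.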

  • 0
    You might look at this: http://math.stackexchange.com/questions/59363/when-can-the-order-of-limit-and-integral-be-exchanged (2011-08-24)
  • 3
    Pass to sequences. (2011-08-24)

1 Answer