
Another question in my studies (of Lebesgue integration):

I'm given a continuous function $f:\mathbb{R}\to\mathbb{R}$, a nonempty compact set $E\subseteq\mathbb{R}$, and a sequence of nonempty compact sets $E_i$ such that $\lim d(E_i, E) = 0$, where $d$ is the Hausdorff distance.
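(For reference, by the Hausdorff distance between nonempty compact sets $A,B\subseteq\mathbb{R}$ I mean

$d(A,B) = \max\left( \sup_{a\in A}\inf_{b\in B}|a-b|,\ \sup_{b\in B}\inf_{a\in A}|a-b| \right),$

so $d(A,B)$ is small exactly when each set lies in a small neighborhood of the other.)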

I am asked to prove that $\lim \int_{E_i} f=\int_E f$

Visually, this seems pretty straightforward: the $E_i$ are forced to look more and more like $E$, and the continuity of $f$ ensures that the farther along in the sequence we go, the less $f$ can vary on the parts where $E_i$ and $E$ don't coincide.

I've been at this for a while and seem to be stuck. For a while though I hadn't noticed the compactness assumed of $E$ and the $E_i$. Putting the continuity and convergence in Hausdorff distance together, I see that for any $x\in E$, $\varepsilon>0$, and $\delta > 0$, there is eventually some $E_i$ for which I can find a $y\in E_i$ within $\delta$ of $x$ and with $|f(x)-f(y)|<\varepsilon$. I guess using compactness, I can make this uniform over all $x\in E$ (so that my picture described above is actually as nicely behaved as I was probably picturing it to begin with).
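To make that uniformity explicit (just a sketch of what I mean, using the uniform continuity of $f$ on the compact set $K=\{x:\operatorname{dist}(x,E)\le 1\}$, which contains $E$ and every $E_i$ with $d(E_i,E)\le 1$): for every $\varepsilon>0$ there is a single $\delta>0$ such that

$|f(x)-f(y)|<\varepsilon \quad\text{whenever } x,y\in K \text{ and } |x-y|<\delta,$

and once $d(E_i,E)<\min(\delta,1)$, every $x\in E$ has some $y\in E_i$ with $|x-y|<\delta$ (and vice versa), hence $|f(x)-f(y)|<\varepsilon$.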

At this point though I'm not sure how to proceed. Should I try to bound the integral over the symmetric difference of $E_i$ and $E$ or something like that? I'm not sure how I would control it, despite knowing it would be constrained somehow by "nearby" values of $f$. Should I cut up $E$ somehow, or select some dense subset of $E$ to use representative values of $f$ on? Is there some subtlety my visual intuition is overlooking?

Added

It seems like there is a counterexample to this theorem as stated:

Let $E=[0,1]$ and let $E_i = \{ k 2^{-i} : k=0,1,\dots,2^i \}$, so that

$\begin{array}{rcl} E_1 & = & \{0,\frac{1}{2}, 1\} \\ E_2 & = & \{0,\frac{1}{4}, \frac{2}{4}, \frac{3}{4}, 1\} \\ & \vdots & \end{array}$

Then, for example, with $f$ taken to be the constant $1$ function on $[0,1]$, we have $\int_{E_i} f = 0$ for all $i$, yet $\int_{E} f = 1$.
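To spell out the two claims used here: $E_i\subseteq E$, and the point of $[0,1]$ farthest from the grid $E_i$ is a midpoint of two adjacent grid points, so

$d(E_i,E)=\sup_{x\in E}\operatorname{dist}(x,E_i)=2^{-(i+1)}\longrightarrow 0,$

while each $E_i$ is a finite set, hence $m(E_i)=0$ and $\int_{E_i} f = 0$.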

  • My opinion is that with the counterexample, this question was satisfactorily answered, and should have been left like that. Then, a new question could be opened (and linked to this one) to ask about appropriate hypotheses so that the conclusion holds. (2011-05-02)

2 Answers

2

I might have an idea; don't shoot me if it is wrong.

Given compact $E$, for every $\epsilon > 0$ we can find a finite number of closed sets $Q_j$ such that

$F = \bigcup_{j = 1}^N Q_j \text{ and } m(E \Delta F) \leq \epsilon$

So define a function $f_\epsilon$ which is equal to $\sup f|_{Q_j}$ on $Q_j$.
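Written out (one possible convention, since the overlaps of the $Q_j$ and the values off $F$ are not specified above: assign each point of $F$ to the smallest index $j$ with $x\in Q_j$, and set $f_\epsilon=0$ off $F$):

$f_\epsilon(x) = \sup_{Q_j} f \ \text{ for } x\in Q_j\setminus\bigcup_{k<j}Q_k, \qquad f_\epsilon(x)=0 \ \text{ for } x\notin F.$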

Now we want to estimate

$\int_E |f - f_\epsilon| \, dm$

We can split the integral into three parts:

$\int_E |f - f_\epsilon| \, dm = \left (\int_{E \Delta F} + \int_{E \cap F} - \int_{F \setminus E} \right ) |f - f_\epsilon| \, dm$

So this can be made smaller than $C \epsilon$ (by uniform continuity, maybe adjust $f_\epsilon$ a bit).
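Roughly the bound I have in mind (a sketch, assuming the $Q_j$ are compact with diameter at most $\eta$, writing $M=\sup_{E\cup F}|f|$ and $\omega$ for a modulus of continuity of $f$ on a compact set containing $E\cup F$, and taking $f_\epsilon=0$ off $F$ as above):

$\int_{E\Delta F}|f-f_\epsilon|\,dm \le 2M\,m(E\Delta F)\le 2M\epsilon \qquad\text{and}\qquad \int_{E\cap F}|f-f_\epsilon|\,dm \le \omega(\eta)\,m(E),$

while $\int_{F\setminus E}|f-f_\epsilon|\,dm\le 2M\epsilon$ as well, since $F\setminus E\subseteq E\Delta F$; altogether this is at most $C\epsilon$ once $\omega(\eta)\le\epsilon$.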

Now we need to estimate

$\int_{E_i} |f - f_\epsilon| \, dm.$ We note that $E_i \subset E \cup [\inf E - d(E, E_i), \inf E] \cup [\sup E, \sup E + d(E, E_i)]$. The integral over $E$ is treated the same way as before, and the integrals over the two edge intervals follow by Lebesgue dominated convergence.
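For the two edge intervals a direct bound also works (a sketch, writing $d_i=d(E,E_i)$ and assuming, as we may, that $F$ lies in the compact set $K=\{x:\operatorname{dist}(x,E)\le 1\}$, so that $|f-f_\epsilon|\le 2\sup_K|f|$ on $K$): once $d_i\le 1$,

$\int_{[\inf E - d_i,\ \inf E]}|f-f_\epsilon|\,dm \le 2\left(\sup_K|f|\right) d_i \longrightarrow 0,$

and similarly for the interval on the right, since both have measure $d_i\to 0$.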

Remarks?

  • I think this is the right idea -- as mentioned by Robert, any correct argument has to use convergence in measure. One question: is this equivalent to adding $\mu(E_i) \rightarrow \mu(E)$ as an additional hypothesis? (so that we have convergence in the Hausdorff metric and in measure) I would think so, and then it seems like it should definitely work. (2011-05-02)
2

Let $\varepsilon >0$ and choose $\delta >0$ so that whenever $m(A)<\delta$, it follows that $\int _A|f|dx<\varepsilon$.
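For this choice of $\delta$ to make sense, $A$ should range over subsets of a fixed bounded set, say $K=\{x:\operatorname{dist}(x,E)\le 1\}$, which contains $E$ and every $E_N$ with $d(E_N,E)\le 1$; on $K$ we have $|f|\le M$ for some $M$ (continuity plus compactness), so one can simply take $\delta =\varepsilon /(M+1)$:

$m(A)<\delta \text{ and } A\subseteq K \implies \int _A|f|dx\leq M\,m(A)<\frac{M\varepsilon }{M+1}<\varepsilon .$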

Then, choose $N$ sufficiently large so that $d(E_N,E)<\delta$. Then,

$ \left| \int _{E_N}fdx-\int _Efdx\right| =\left| \int _{E_N-E}fdx-\int _{E-E_N}fdx\right| \leq \int _{E_N-E}|f|dx+\int _{E-E_N}|f|dx. $

Now, from here you have to show, because $d(E_N,E)<\delta$, that $m(E_N-E),m(E-E_N)<\delta$ (up to a constant factor anyway, so you might have to rescale the choice of $\delta$). Then, by the above inequality, you would have

$ \left| \int _{E_N}fdx-\int _Efdx\right| <2\varepsilon , $

and hence $\lim \int _{E_N}fdx=\int _Efdx$.

Simplifying the picture of Hausdorff distance given on Wikipedia to one dimension, I think that the argument that I left out shouldn't be too bad. If I have time I'll try to help with the details later.

-Jonny Gleason

  • @GleasSpty I think you should probably edit the answer. (2011-05-02)