I'm working through a past exam to study for my finals, and I came across the following question:
Let $\varepsilon>0$ and define $A=\bigcup_{j=1}^\infty(x_j-\varepsilon,x_j+\varepsilon)$ where $x_j\in\mathbb{R}$. Suppose that $A\cap[0,1]$ is dense in $[0,1]$. Then $|A\cap[0,1]|=1$ (Lebesgue measure).
A hint is provided, suggesting the use of Lebesgue's Differentiation Theorem. I don't see how to apply the theorem here, but is it even necessary? My line of thinking is as follows:
Since we're in $\mathbb{R}$ and $A$ is open (as a union of open intervals), $A$ can be written as a countable union of disjoint open intervals, say $\bigcup_{j=1}^\infty(a_j,b_j)$. Suppose, for contradiction, that $|A\cap[0,1]|<1$, say $|A\cap[0,1]|=1-\delta$ for some $\delta>0$. Each component $(a_j,b_j)$ contains at least one of the original intervals, so $b_j-a_j\geq2\varepsilon$, and hence only finitely many components can meet $[0,1]$. The complement $[0,1]\setminus A$ then has measure $\delta>0$ and, being a finite union of closed intervals, contains a nondegenerate interval; its interior points have open neighborhoods disjoint from $A$, which contradicts the density of $A\cap[0,1]$ in $[0,1]$.
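The finiteness step can be made quantitative; here is a short sketch (the explicit bound $N$ below is my own estimate, not part of the exam problem):

```latex
Each component $(a_j,b_j)$ of $A$ contains some $(x_i-\varepsilon,x_i+\varepsilon)$,
so $b_j-a_j\ge 2\varepsilon$. Any component meeting $[0,1]$ lies in
$[-2\varepsilon,\,1+2\varepsilon]$, an interval of length $1+4\varepsilon$;
since the components are pairwise disjoint and each has length $\ge 2\varepsilon$,
at most
\[
  N \le \frac{1+4\varepsilon}{2\varepsilon}
\]
of them meet $[0,1]$. Thus $[0,1]\setminus A$ is a finite union of closed
intervals of total length $\delta>0$, so one of them is nondegenerate, and its
interior is a nonempty open subset of $[0,1]$ disjoint from $A$.
```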
In fact, if I'm thinking about the construction correctly, $A$ would seem to contain all but finitely many points of $[0,1]$: since $b_j-a_j\geq2\varepsilon$ for each $j$, the set $A\cap[0,1]$ is a finite union of intervals, and density forces its closure to be all of $[0,1]$. I realize my proof doesn't generalize to higher dimensions (although something similar might work), since an open set in $\mathbb{R}^n$ for $n\geq2$ can't in general be written as a countable union of disjoint open balls. Would anyone critique my proof and possibly suggest a line of proof using Lebesgue's Differentiation Theorem? Thanks!
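As a finite sanity check (an illustration, not a proof): if the centers $x_j$ are spaced less than $2\varepsilon$ apart, the $\varepsilon$-intervals around them merge into components of length $\geq2\varepsilon$ and end up covering all of $[0,1]$. The particular values `eps = 0.05` and the grid of centers below are arbitrary choices of mine:

```python
eps = 0.05
centers = [k * 0.08 for k in range(14)]  # spacing 0.08 < 2*eps = 0.1

# Build the intervals (x_j - eps, x_j + eps) and merge overlapping ones.
intervals = sorted((x - eps, x + eps) for x in centers)
merged = []
for lo, hi in intervals:
    if merged and lo <= merged[-1][1]:
        # Overlaps the previous component: extend it.
        merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
    else:
        merged.append((lo, hi))

# Total length of the union intersected with [0, 1].
length = sum(max(0.0, min(hi, 1.0) - max(lo, 0.0)) for lo, hi in merged)
print(len(merged), length)  # a single merged component covering all of [0, 1]
```

Every merged component here has length at least $2\varepsilon$, matching the observation that each component $(a_j,b_j)$ satisfies $b_j-a_j\geq2\varepsilon$.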