For $\alpha \in (0,\frac{1}{2}]$, I am trying to show that there is no $E \in \mathcal{L}(\mathbb{R})$ such that for every interval $I$ we have $ \alpha\lambda(I) \leq \lambda(E \cap I) \leq (1-\alpha)\lambda(I). $
I know of the Lebesgue Density Theorem, which immediately blows this question out of the water, but we are far from that point in class, and I am sure that proving this theorem is not the intent of the problem. I have already shown that there is a set $E \in \mathcal{L}(\mathbb{R})$ such that $0 < \lambda(E \cap I) < \lambda(I)$ for every interval $I$ with $\lambda(I) > 0$.
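(For reference, in case the details of my set matter for a hint, the construction I have in mind is the standard one, though my version may differ: enumerate the intervals with rational endpoints as $(I_n)_{n \geq 1}$ and recursively choose disjoint fat Cantor sets $A_n, B_n \subset I_n$, each avoiding the finitely many previously chosen sets, with $\lambda(A_n), \lambda(B_n) \leq 2^{-n}$. Taking $E = \bigcup_n A_n$, every nondegenerate interval $I$ contains some $I_n$, so $\lambda(E \cap I) \geq \lambda(A_n) > 0$, while $B_n \subset I \setminus E$ gives $\lambda(E \cap I) < \lambda(I)$.)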
I have stared at it all day now and I think I need some help. There is supposedly a very short proof using only basic properties of $\lambda$. If you have any hints, I would appreciate them.