Suppose $Q$ is a rectangle in $\mathbb R^n$ and $f: Q \to \mathbb R$ is a bounded function. How can I show that if $f$ vanishes outside a closed set $B$ of measure zero, then $\int_Q f$ exists and equals zero?
My idea is to introduce $D$, the set of points of $Q$ at which $f$ fails to be continuous, and argue from there that $\int_Q f$ exists and equals zero. But how do I make this rigorous?
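For reference, the tool I am implicitly using is Lebesgue's criterion for Riemann integrability: a bounded function on a rectangle is integrable if and only if its set of discontinuities has measure zero,
$$\int_Q f \ \text{exists} \iff m(D) = 0, \qquad D = \{\, x \in Q : f \text{ is not continuous at } x \,\}.$$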
My proof attempt:
Choose a partition $P$ of $Q$, and split its subrectangles $Q_j$ into those that intersect $B$ and those that don't.
The subrectangles that don't intersect $B$ at all contain no discontinuities, since $f \equiv 0$ on them. A discontinuity can only occur at a point of $B \cap Q_j$, which is a subset of $B$; since $B$ has measure zero, $B \cap Q_j$ has measure zero for every $j$.
Because $B$ is closed, $B \cap Q$ is closed, and it contains every point of discontinuity of $f$, so $D \subseteq B \cap Q$ has measure zero.
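If this gives existence, then I think the value $0$ follows from the upper and lower sums, using the fact that a set of measure zero has empty interior (here $m_j$ and $M_j$ denote the infimum and supremum of $f$ over a subrectangle $Q_j$ of a partition $P$, and $v(Q_j)$ its volume; this is my guess at the finishing step, not something I have verified):
$$\text{each } Q_j \text{ contains a point } x \notin B \ (\text{otherwise } Q_j \subseteq B, \text{ impossible since } m(B) = 0), \text{ and there } f(x) = 0,$$
$$\text{so } m_j \le 0 \le M_j \text{ for all } j, \quad\text{hence}\quad L(f,P) = \sum_j m_j\, v(Q_j) \le 0 \le \sum_j M_j\, v(Q_j) = U(f,P),$$
$$\text{and since } \int_Q f \text{ exists}, \quad \int_Q f = \sup_P L(f,P) = \inf_P U(f,P) = 0.$$
Is this the right way to conclude that the integral is zero?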