I will start with a definition.
A monotone function $f$ on $[a,b]$ is called singular if $f'=0$ almost everywhere.
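If it helps to have a concrete example in mind, the Cantor–Lebesgue function $c$ on $[0,1]$ is the standard one: it is continuous and nondecreasing with $c'=0$ almost everywhere, yet $c(1)-c(0)=1$, so it is singular.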
Let $f$ be a nondecreasing function on $[a,b]$ such that for every $\epsilon,\delta\gt 0$ there exists a finite collection $\{[y_k,x_k]\}$ of nonoverlapping intervals such that $$\sum_k |x_k-y_k|\lt \delta\quad\text{and}\quad\sum_k\left(f(x_k)-f(y_k)\right)\gt f(b)-f(a)-\epsilon.$$
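To see that this hypothesis is not vacuous, the Cantor–Lebesgue function $c$ satisfies it: at stage $n$ of the Cantor set construction there are $2^n$ closed intervals $[y_k,x_k]$, each of length $3^{-n}$, and $c$ increases by exactly $2^{-n}$ across each of them, so $$\sum_k |x_k-y_k|=\left(\tfrac{2}{3}\right)^n\quad\text{and}\quad\sum_k\left(c(x_k)-c(y_k)\right)=1=c(1)-c(0).$$ Choosing $n$ large enough that $(2/3)^n\lt\delta$ gives the required collection for any $\epsilon\gt 0$.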
I would like to show that $f$ is singular.
Attempt:
From a previous exercise, I showed that a monotone function $f$ can be written as the sum of an absolutely continuous function $g$ and a singular function $h$. Thus $$f = g + h,\qquad\text{where}\quad g(x)=\int_a^x f'(t)\,dt\quad\text{and}\quad h=f-g.$$
My goal is to show that $g\equiv 0$, i.e. that $\int_a^b f'=0$; since $f'\ge 0$ a.e., this would give $f'=0$ a.e., so $f=h$ is singular. Fix $\epsilon\gt 0$. Since $f'$ is integrable, by absolute continuity of the integral I can choose $\delta\gt 0$ so that $\int_E f'\lt\epsilon$ whenever $m(E)\lt\delta$. Applying the hypothesis with this $\epsilon$ and $\delta$ gives the intervals $[y_k,x_k]$; let $I=\bigcup_k (y_k,x_k)$. Then $m(I)\le\sum_k|x_k-y_k|\lt\delta$, so $$\int_I f'\lt\epsilon.$$
On the other side, $[a,b]\setminus I$ is a finite union of intervals (the gaps between the $[y_k,x_k]$, which I may assume are ordered), and a nondecreasing function satisfies $\int_c^d f'\le f(d)-f(c)$ on every interval $[c,d]$. Summing this over the gaps gives $$\int_{[a,b]\setminus I} f'\le f(b)-f(a)-\sum_k\left(f(x_k)-f(y_k)\right)\lt\epsilon.$$ Combining the two estimates, \begin{align*} 0\leq \int_a^b f' & = \int_I f'+\int_{[a,b]\setminus I} f'\\ & \lt \epsilon + \epsilon\\ & = 2\epsilon. \end{align*}
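The inequality $\int_c^d f'\le f(d)-f(c)$ used above is the usual estimate for nondecreasing functions, which I'm assuming is available from earlier in the course; one proof applies Fatou's lemma to the difference quotients $n\left(f(t+1/n)-f(t)\right)$ after extending $f$ by $f(t)=f(d)$ for $t\ge d$.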
Since $\epsilon\gt 0$ was arbitrary, this forces $$\int_a^b f'=0,$$ and hence $f'=0$ almost everywhere, so $g\equiv 0$ and $f=h$ is singular.
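As a sanity check against the Cantor–Lebesgue example above: there $\int_0^1 c'=0$ while $c(1)-c(0)=1$, so all of the increase of $c$ sits in the singular part, which is exactly what the conclusion says.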
Is what I've done right? Is there another way of approaching the problem?
Thanks.