A theorem in my notes claims the following:
If $f\in L_1(\mathbb{T})$, and both $f(x+0) = \lim\limits_{t\to x^+} f(t)$ and $f(x - 0) = \lim\limits_{t\to x^-} f(t)$ exist, then
$\lim_{n\to\infty}[\sigma_n (f)](x) = \frac{f(x + 0) + f(x - 0)}{2}.$
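To make sure I am reading the statement correctly, here is a concrete example of my own (not from the notes): take $f(t) = \operatorname{sgn}(t)$ on $(-\pi, \pi)$, extended $2\pi$-periodically, and $x = 0$. Then
$f(0 + 0) = 1, \qquad f(0 - 0) = -1, \qquad [\sigma_n(f)](0) = 0 \text{ for every } n,$
since the Fourier series of $f$ contains only sine terms, so the limit is indeed the average $\frac{1 + (-1)}{2} = 0$ promised by the theorem.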
The proof begins by defining
$I := [\sigma_n(f)](x) - \frac{f(x + 0) + f(x - 0)}{2}.$
Next it is noted that
$I = \frac1{2\pi}\int_0^\pi K_n (t)\{[f(x + t) - f(x+0)] + [f(x - t) - f(x - 0)]\}\mathrm dt. \tag{1}$
I just can't see it.
In a previous argument we showed that
$[\sigma_n(f)](x) - f(x) = \frac1{2\pi}\int_{-\pi}^\pi [f(x + t) - f(x)]K_n (t)\mathrm dt. \tag{2}$
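For reference, I assume $K_n$ here denotes the Fejér kernel (as far as I can tell, that is how my notes use it), so in particular
$K_n(t) = \frac1{n+1}\left(\frac{\sin\frac{(n+1)t}{2}}{\sin\frac{t}{2}}\right)^2 \ge 0, \qquad K_n(-t) = K_n(t), \qquad \frac1{2\pi}\int_{-\pi}^{\pi} K_n(t)\,\mathrm dt = 1.$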
Can someone help me understand how (1) is obtained? I'm not sure whether it is a consequence of (2) or something unrelated.
Thank you.