Show that there exists $f \in C$ with $\|f\|_\infty \le 1$ whose Fourier series diverges.
The proof is in our textbook (Katznelson, An Introduction to Harmonic Analysis) and uses the following argument.
Let $D_n(t)=\sum_{k=-n}^n e^{ikt}$ be the Dirichlet kernel, let $g(t)=\operatorname{sgn}(D_n(t))$, and let $E=\{f \in C: \|f\|_\infty \le 1\}$.
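For reference, summing the geometric series gives the standard closed form of the Dirichlet kernel (under the normalization above):

$$D_n(t)=\sum_{k=-n}^{n} e^{ikt}=\frac{\sin\!\big((n+\tfrac12)t\big)}{\sin(t/2)},\qquad t\not\equiv 0 \pmod{2\pi},$$

and in particular $\|D_n\|_\infty = D_n(0) = 2n+1$.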
Choose $f \in E$ for which $f(t)=g(t)$ except on small intervals around the discontinuities of $g$, and for which the sum $S$ of the lengths of the intervals where $f$ and $g$ differ is less than $\epsilon/2n$.
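As I read it (this computation is my own, not quoted from the book), the specific bound $S<\epsilon/2n$ is what makes the approximation harmless: since $|f-g|\le 2$ and $\|D_n\|_\infty = 2n+1$,

$$\left|\frac{1}{2\pi}\int_{-\pi}^{\pi} \big(f(t)-g(t)\big)\,D_n(t)\,dt\right| \;\le\; \frac{1}{2\pi}\cdot 2\,(2n+1)\,S \;<\; \frac{2n+1}{2\pi n}\,\epsilon \;\le\; \frac{3\epsilon}{2\pi},$$

so $\int f\,D_n$ stays within a fixed multiple of $\epsilon$ of $\int g\,D_n = \frac{1}{2\pi}\int |D_n|\,dt = \|D_n\|_{L^1}$, which is unbounded in $n$.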
Why is it possible to choose an $f$ with $S \le \epsilon/2n$?
If $g$ has only countably many discontinuities, I understand why.
But if not, I don't see how to justify this choice, and I don't know the cardinality of the set of discontinuities of $g$.