Many other posts have discussed the standard result that the smoothness of a function is related to the rate at which its Fourier coefficients decay. For example, there are proofs that show that if $f$ is of class $C^k$ then $|c_n|$ tends to zero faster than $|n|^{-k}$ as $n\to\pm\infty$.
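(For reference, the mechanism behind that standard result, as I understand it, is just repeated integration by parts: assuming $f$ is $2\pi$-periodic and of class $C^k$, the boundary terms vanish and

$$c_n = \frac{1}{2\pi}\int_{-\pi}^{\pi} f(\theta)\,e^{-in\theta}\,d\theta = \frac{1}{in}\cdot\frac{1}{2\pi}\int_{-\pi}^{\pi} f'(\theta)\,e^{-in\theta}\,d\theta = \cdots = \frac{1}{(in)^k}\cdot\frac{1}{2\pi}\int_{-\pi}^{\pi} f^{(k)}(\theta)\,e^{-in\theta}\,d\theta,$$

so $|c_n| \le \|f^{(k)}\|_\infty/|n|^k$, and the Riemann–Lebesgue lemma applied to $f^{(k)}$ upgrades this to $o(|n|^{-k})$. But this is a purely analytic manipulation.)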
Is there an intuitive geometric picture that explains this? If $f$ is not differentiable, imagine slowly changing the function so that it becomes twice differentiable; the Fourier coefficients $c_n = \frac{1}{2\pi}\int_{-\pi}^{\pi}f(\theta)\, e^{-in\theta}\,d\theta$ should then change their rate of decay, but how is that evident from the definition of the Fourier coefficients?
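To make the claim concrete, here is a rough numerical check (a sketch of my own; the three test functions, the grid size, and the sampled indices are arbitrary choices) that approximates $c_n$ by a Riemann sum and compares the decay for functions of different smoothness:

```python
import numpy as np

# Approximate c_n = (1/2pi) * integral over [-pi, pi) of f(theta) exp(-i n theta) dtheta
# by a Riemann sum on a fine uniform grid.
theta = np.linspace(-np.pi, np.pi, 20000, endpoint=False)
dtheta = theta[1] - theta[0]

def fourier_coeff(f, n):
    return np.sum(f(theta) * np.exp(-1j * n * theta)) * dtheta / (2 * np.pi)

tests = [
    ("sign(theta)     (jump discontinuity)",    np.sign),
    ("|theta|         (continuous, kink at 0)", np.abs),
    ("exp(cos(theta)) (smooth and periodic)",   lambda t: np.exp(np.cos(t))),
]

# Odd indices are used so none of the coefficients vanish merely by symmetry.
for label, f in tests:
    mags = ["%.1e" % abs(fourier_coeff(f, n)) for n in (1, 11, 101)]
    print(label, mags)
```

The magnitudes drop roughly like $1/n$, like $1/n^2$, and faster than any power (down to round-off), respectively, but this only confirms the theorem numerically; it doesn't make it geometrically obvious.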
What about considering the partial integral $c_n(\theta) = \frac{1}{2\pi}\int_{-\pi}^{\theta}f(\phi)\, e^{-in\phi}\,d\phi$? Is there some kind of bad behavior in $c_n(\theta)$ when it crosses a discontinuity of $f^{(k)}$ that would prevent the final value $c_n(\pi) = c_n$ from being small?
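For instance, in the simplest discontinuous case (taking $f(\phi)=\operatorname{sgn}(\phi)$ on $(-\pi,\pi)$ as a test function of my own choosing), the partial integral can be computed explicitly: for $0<\theta\le\pi$,

$$c_n(\theta) = \frac{1}{2\pi}\left[\int_{-\pi}^{0}(-1)\,e^{-in\phi}\,d\phi + \int_{0}^{\theta} e^{-in\phi}\,d\phi\right] = \frac{2-(-1)^n-e^{-in\theta}}{2\pi i n},$$

which oscillates with amplitude of order $1/|n|$ and at $\theta=\pi$ equals $\frac{2-2(-1)^n}{2\pi i n}$, i.e. the familiar $\frac{2}{i\pi n}$ for odd $n$ (and $0$ for even $n$). Is the persistence of that $O(1/n)$ term past the jump the kind of bad behavior one should be looking at?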