Since your question was about the geometry behind convergence, I'll chime in with a very geometric way to think about these concepts. However, as Qiaochu Yuan mentions, in order to do so, we must first nail down in what sense we mean convergence. I'll discuss the "big three" types of convergence: pointwise, uniform, and mean-square (also called $L^2$) convergence.
Let's begin by defining a notion of *error* between $f(x)$ and the $N$th partial sum of its Fourier series, denoted by $F_N(x)$, on $-\ell<x<\ell$. Define the (absolute) pointwise error, $p_N(x)$, by $$p_N(x)=|f(x)-F_N(x)|, \quad -\ell<x<\ell.$$ The name reflects the geometry of the situation: $p_N(x)$ measures the point-by-point difference (or error) between $f(x)$ and $F_N(x)$.
We can then define the following three types of convergence based on the behavior of $p_N(x)$ as $N\to\infty$.
- $F_N(x)$ converges pointwise to $f(x)$ on $-\ell<x<\ell$ if $p_N(x)\to 0 \text{ as } N\to\infty$ for each fixed $x\in(-\ell,\ell)$.
- $F_N(x)$ converges uniformly to $f(x)$ on $-\ell<x<\ell$ if $\sup_{-\ell<x<\ell} p_N(x)\to 0 \text{ as } N\to\infty.$
- $F_N(x)$ converges in the mean-square or $L^2$ sense to $f(x)$ on $-\ell<x<\ell$ if $\int_{-\ell}^\ell p_N^2(x)\,dx\to 0 \text{ as } N\to\infty.$
Think of each of these in terms of what is happening with the pointwise error as $N\to \infty$. The first says that at a fixed $x$, the difference between $f(x)$ and $F_N(x)$ is going to zero. This may happen for some $x$ in the interval and fail for others. On the other hand, uniform convergence says that the supremum of all the pointwise errors tends to zero. Finally, mean-square convergence says that the area under $p_N^2(x)$ must tend to zero as $N\to\infty$.
The first is a very local way to measure error (at a point), whereas the latter two are global ways to measure the error (across the entire interval).
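To see pointwise convergence numerically, here is a minimal sketch using an assumed example (not from the discussion above): the Fourier sine series of $f(x)=x$ on $(0,\pi)$, whose coefficients work out to the standard $b_n = 2(-1)^{n+1}/n$.

```python
# Pointwise error p_N(x0) at a fixed x0, for an assumed example:
# the Fourier sine series of f(x) = x on (0, pi), b_n = 2(-1)^(n+1)/n.
import numpy as np

def F_N(x, N):
    """N-th partial sum of the sine series of f(x) = x on (0, pi)."""
    n = np.arange(1, N + 1)
    b = 2.0 * (-1.0) ** (n + 1) / n
    return np.dot(np.sin(x * n), b)

x0 = 1.0  # a fixed point inside the interval
errors = [abs(x0 - F_N(x0, N)) for N in (5, 50, 500)]
print(errors)  # the pointwise error p_N(x0) shrinks as N grows
```

At a different fixed $x_0$ the errors would be different, but each would still tend to zero; that per-point behavior is exactly what the first definition captures.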
We can formulate this in terms of norms by setting $\|f-F_N\|_\infty:=\sup_{-\ell<x<\ell}|f(x)-F_N(x)|.$ Then, $F_N(x)\to f(x)$ uniformly on $-\ell<x<\ell$ provided $\|f-F_N\|_\infty\to 0$ as $N\to\infty$. (This is why we call it the uniform norm!)
On the other hand, if we set $\|f-F_N\|_{L^2}:=\sqrt{\int_{-\ell}^\ell |f(x)-F_N(x)|^2\,dx},$ then $F_N(x)\to f(x)$ in the $L^2$ sense on $-\ell<x<\ell$ provided $\|f-F_N\|_{L^2}\to 0$ as $N\to\infty$. (This is called the $L^2$ norm on $-\ell<x<\ell$.)
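Both norms are easy to approximate numerically; here is a rough sketch, again under the assumed example $f(x)=x$ on $(0,\pi)$ with sine coefficients $b_n=2(-1)^{n+1}/n$ (the sup is taken over a grid, and the $L^2$ integral is a Riemann sum):

```python
# Approximate the uniform and L^2 norms of f - F_N on a grid, for the
# assumed example f(x) = x on (0, pi), b_n = 2(-1)^(n+1)/n.
import numpy as np

x = np.linspace(0.0, np.pi, 2001)
dx = x[1] - x[0]
f = x

def partial_sum(N):
    n = np.arange(1, N + 1)
    b = 2.0 * (-1.0) ** (n + 1) / n
    return np.sin(np.outer(x, n)) @ b

for N in (10, 100, 1000):
    err = np.abs(f - partial_sum(N))
    sup_norm = err.max()                    # uniform norm of f - F_N
    l2_norm = np.sqrt(np.sum(err**2) * dx)  # L^2 norm of f - F_N
    print(N, sup_norm, l2_norm)
```

Here the $L^2$ norm shrinks as $N$ grows while the uniform norm does not (every partial sum vanishes at $x=\pi$, so the sup error stays near $\pi$): a concrete case of $L^2$ convergence without uniform convergence, which is exactly the distinction the two norms are designed to detect.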
To illustrate this geometrically, here's $f(x)=x^2$ (black) and its Fourier sine series $F_N(x)$ (blue) on $0<x<\ell$ for $N=5,\dots,50$, along with the corresponding pointwise error (red). We can see this series converges pointwise but not uniformly on $0<x<\ell$. You can also get an idea of the $L^2$ convergence by envisioning the area under the square of the red curve and seeing it tend to zero as well. I was going to post that picture too, but the shaded area is so thin it is difficult to see.

These illustrations are of course not a proof of the convergences, but simply a way to interpret them geometrically.
For the sake of completeness, here's an example which does converge uniformly: the same function and interval as above, but $F_N(x)$ is the Fourier cosine series.

Hope that helps.