Let $C$ be a smooth, geometrically irreducible projective curve defined over a finite field $\mathbb{F}_q$. Given a (scheme-theoretic) point $x \in C$, define the degree of $x$ to be the degree $[k(x): \mathbb{F}_q]$ of the residue field extension. The degree of a divisor on $C$ is then defined by extending linearly.
We proved today in my algebraic number theory class that there is always a divisor of degree one. The argument was to suppose that all the degrees $[k(x): \mathbb{F}_q]$ were divisible by some $m > 1$, and then to apply (a very weak form of) the Chebotarev density theorem to the extension of function fields $k(C) \to k(C) \otimes_{\mathbb{F}_q} \mathbb{F}_{q^m}$. If $\mathbb{F}_{q^m}$ were contained in every residue field, then every place would split completely in this extension; but Chebotarev gives the completely split places density $1/m < 1$, a contradiction.
I also learned another proof from a comment of Felipe Voloch on MO: for large $n$, the Weil bound on the number of $\mathbb{F}_{q^n}$-rational points implies that there is a closed point of degree $n$ and a closed point of degree $n+1$. The difference of a degree-$(n+1)$ point and a degree-$n$ point is then a divisor of degree one.
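To see the Weil-bound argument numerically, here is a minimal sketch (the function names and the choice of example are mine, not from the argument above). Writing $N_m = \#C(\mathbb{F}_{q^m})$ and $a_d$ for the number of closed points of degree $d$, we have $N_n = \sum_{d \mid n} d\, a_d$, so $n\, a_n \geq N_n - \sum_{d \mid n,\, d < n} N_d$, and the Weil bound $|N_m - (q^m + 1)| \leq 2g\, q^{m/2}$ turns this into an explicit lower bound that is eventually positive:

```python
def degree_n_lower_bound(q, g, n):
    """Crude lower bound on the number of closed points of degree n
    on a (hypothetical) genus-g curve over F_q.

    Combines the Weil bound  |N_m - (q^m + 1)| <= 2 g q^{m/2}
    with  n * a_n = N_n - sum_{d | n, d < n} d * a_d  and  d * a_d <= N_d.
    """
    weil_lower = q**n + 1 - 2 * g * q ** (n / 2)          # lower bound on N_n
    lower_degree = sum(                                    # upper bound on the
        q**d + 1 + 2 * g * q ** (d / 2)                    # contribution of
        for d in range(1, n) if n % d == 0                 # smaller degrees
    )
    return (weil_lower - lower_degree) / n

def first_consecutive_degrees(q, g):
    """Smallest n for which the bound guarantees points of degree n and n+1."""
    n = 1
    while not (degree_n_lower_bound(q, g, n) > 0
               and degree_n_lower_bound(q, g, n + 1) > 0):
        n += 1
    return n
```

For instance, for a genus-1 curve over $\mathbb{F}_2$ this crude bound already guarantees points of degrees $5$ and $6$, hence a divisor of degree one; for any fixed $q$ and $g$ the dominant term $q^n/n$ makes the bound positive for all large $n$.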
Is there an elementary geometric way of seeing this? (Related question: are there other base fields $k$ for which every such curve has a divisor of degree one?)