Since Mark already gave, in the comments, a proof using duals, let me sketch here a proof using convexity.
We make the following simplifying assumptions:
- You are integrating over a space $X$ with finite total volume. (If not, approximate $f$ by cut-offs of $f$ on subsets of finite volume; that $f$ is integrable guarantees that you can do so, by Chebyshev's inequality.)
- $X$ has total volume 1. This you can arrange by rescaling the measure, since the norm scales linearly by definition (the rescaling is made explicit below).
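To spell out the second reduction (the normalised measure $\tilde\mu$ is notation introduced only for this remark): if $0 < \mu(X) < \infty$, set $\tilde\mu = \mu/\mu(X)$, which is a probability measure. If the desired inequality holds for $\tilde\mu$, then homogeneity of the norm gives
$ \left\| \int f\, d\mu \right\| = \mu(X) \left\| \int f\, d\tilde\mu \right\| \leq \mu(X) \int \|f\|\, d\tilde\mu = \int \|f\|\, d\mu $
so it holds for $\mu$ as well.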
Observe that the norm is a convex function. We shall prove here Jensen's inequality for a probability space, which will then imply the desired triangle inequality.
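Indeed, convexity of the norm is just the triangle inequality for vectors combined with homogeneity: for $x, y \in V$ and $0 \leq \lambda \leq 1$,
$ \|\lambda x + (1-\lambda) y\| \leq \|\lambda x\| + \|(1-\lambda) y\| = \lambda \|x\| + (1-\lambda) \|y\| $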
Theorem (Jensen's inequality). Let $(X,\Sigma,\mu)$ be a probability space (that is, a measure space with total volume 1). Let $f:X\to V$ be an integrable function taking values in some (real) topological vector space $V$, and let $\Psi:V\to\mathbb{R}$ be a convex function. Then we have $ \Psi(\int f d\mu) \leq \int \Psi(f) d\mu $
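As a quick sanity check (a toy example, not part of the proof): take $X = \{1,2\}$ with $\mu(\{1\}) = \mu(\{2\}) = \tfrac{1}{2}$, $V = \mathbb{R}$, $\Psi(t) = t^2$, and $f(1) = a$, $f(2) = b$. The theorem then reads
$ \left( \frac{a+b}{2} \right)^2 \leq \frac{a^2 + b^2}{2} $
which is equivalent to $(a-b)^2 \geq 0$.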
Sketch of Proof:
Let $g = \int f d\mu \in V$. By convexity, $\Psi$ admits a subgradient at $g$, in the sense that there exists a linear functional $k\in V^*$ such that $\Psi(g) + k(h-g) \leq \Psi(h)$ for any $h\in V$. (This is the generalisation of the supporting hyperplane theorem; in the finite-dimensional case you can just use the supporting hyperplane theorem. For the norm itself the subgradient can be written down explicitly; see the remark after the proof.) Integrating this expression, we get
$ \int \Psi(g) d\mu + \int k(f-g) d\mu \leq \int \Psi(f) d\mu $
Since the space has total mass 1, and $g$ is independent of the position $x\in X$, the first integral on the LHS is just $\Psi(g) = \Psi(\int f d\mu)$. Now $k$ is a linear functional, so it commutes with integration, and (again using that the total mass is 1)
$ \int (f-g)d\mu = \int f d\mu - \int g\, d\mu = \int f d\mu - g = 0 $
so the second term on the LHS is 0. Q.E.D.
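To recover the triangle inequality, apply the theorem with $\Psi = \|\cdot\|$, which is convex as checked above:
$ \left\| \int f\, d\mu \right\| \leq \int \|f\|\, d\mu $
Remark (an illustration, not needed for the proof above): when $\Psi$ is the norm on a normed space and $g = \int f d\mu \neq 0$, the subgradient is explicit. Hahn–Banach gives $k \in V^*$ with $\|k\| = 1$ and $k(g) = \|g\|$, and then $\Psi(g) + k(h-g) = k(h) \leq \|h\| = \Psi(h)$ for every $h \in V$; for $g = 0$ one can take $k = 0$.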