
Let $E$ be a finite dimensional real vector space with a norm $\|.\|$. Define the integral of measurable functions with values in $E$ by choosing a basis and integrating componentwise. How do we prove the triangle inequality $ \left\| \int_X f \right\| \leq \int_X \| f \| $ for $f : X \rightarrow E$?

I know a proof when $E=\mathbb{C}$ or $\mathbb{H}$, i.e. when $\|.\|=\|.\|_2$ in dimension $2$ or $4$ (the key is to use the multiplicative law). I think I also came up with a proof of this inequality by approximating $f$ by simple functions, but my proof is very messy...

  • The Pettis integral has the advantage that one doesn't really need to develop a new integration theory, as everything is reduced to the scalar case. The definition of the Bochner integral of a general function is not technically complicated, but it sometimes baffles people who are used to the approach where integrals of non-negative functions are defined first and then generalized to general functions.
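To spell out the scalar-case reduction mentioned in this comment (a standard sketch, reconstructed here rather than quoted from the comment thread): for every linear functional $k \in E^*$ with dual norm $\|k\|_* \leq 1$, linearity of the integral gives $k\left(\int_X f\right) = \int_X k(f) \leq \int_X \|f\|$, since $k(f(x)) \leq \|k\|_* \|f(x)\| \leq \|f(x)\|$ pointwise. Taking the supremum over all such $k$ and using $\|v\| = \sup_{\|k\|_* \leq 1} k(v)$ (Hahn–Banach, which is elementary in finite dimensions) yields $\left\| \int_X f \right\| \leq \int_X \|f\|$.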

1 Answer


Since Mark already gave, in the comments, a proof using duals, let me sketch here a proof using convexity.

We make the following simplifying assumptions:

  1. You are integrating over a space $X$ with finite total volume. (If not, approximate $f$ by cut-offs of $f$ on subsets of finite volume; the integrability of $f$ guarantees that you can do so, by Chebyshev's inequality.)
  2. $X$ has total volume $1$. This you can arrange by rescaling the measure, since the norm scales linearly by definition; see the one-line check after this list.
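Explicitly, for the rescaling in step 2: if $0 < \mu(X) = c < \infty$, set $\tilde\mu = \mu/c$, a probability measure; then

$ \left\| \int_X f \, d\mu \right\| = c \left\| \int_X f \, d\tilde\mu \right\| \leq c \int_X \|f\| \, d\tilde\mu = \int_X \|f\| \, d\mu, $

so the total-volume-$1$ case implies the general finite-volume case.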

Observe that the norm is a convex function: $\|ta + (1-t)b\| \leq t\|a\| + (1-t)\|b\|$ for $t \in [0,1]$, by the (pointwise) triangle inequality and homogeneity. We shall prove here Jensen's inequality for a probability space, which will then imply the desired triangle inequality.

Theorem (Jensen's inequality) Let $(X,\Sigma,\mu)$ be a probability space (that is, a measure space with total volume $1$). Let $f:X\to V$ be an integrable function taking values in some (real) topological vector space $V$, and let $\Psi:V\to\mathbb{R}$ be a convex function. Then $ \Psi\left(\int f \, d\mu\right) \leq \int \Psi(f) \, d\mu. $

Sketch of Proof:

Let $g = \int f \, d\mu \in V$. By convexity, $\Psi$ admits a subgradient at $g$, in the sense that there exists a linear functional $k\in V^*$ such that $\Psi(g) + k(h-g) \leq \Psi(h)$ for every $h\in V$. (This is the generalisation of the supporting hyperplane theorem; in the finite dimensional case you can just use the supporting hyperplane theorem itself.) Taking $h = f(x)$ and integrating over $X$, we get

$ \int \Psi(g) d\mu + \int k(f-g) d\mu \leq \int \Psi(f) d\mu $

Since the space has total mass $1$, and $g$ is independent of the position $x\in X$, the first integral on the LHS is just $\Psi(g) = \Psi(\int f \, d\mu)$. Now $k$ is a linear functional, so it commutes with integration, and

$ \int (f-g)\, d\mu = \int f \, d\mu - g \, \mu(X) = \int f \, d\mu - \int f \, d\mu = 0, $

so the second term on the LHS is 0. Q.E.D.
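Applying the theorem with $V = E$ and $\Psi = \|\cdot\|$ gives exactly the triangle inequality $\left\| \int_X f \right\| \leq \int_X \|f\|$.

As a quick numerical sanity check (not a proof; the function $f$ and the weighted $1$-norm below are arbitrary illustrative choices), one can compare Monte Carlo estimates of the two sides on the probability space $X = [0,1]$ with Lebesgue measure:

```python
# Monte Carlo check of ||∫ f dμ|| <= ∫ ||f|| dμ on X = [0, 1] with
# Lebesgue (probability) measure; the norm and f are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

weights = np.array([1.0, 2.0, 0.5])  # a weighted 1-norm on R^3

def norm(v):
    return np.sum(weights * np.abs(v), axis=-1)

def f(x):
    # an arbitrary bounded measurable function f : [0, 1] -> R^3
    return np.stack([np.sin(2 * np.pi * x), x**2 - 0.3, np.cos(5 * x)], axis=-1)

x = rng.uniform(0.0, 1.0, size=100_000)  # samples from μ
values = f(x)                            # shape (100000, 3)

lhs = norm(values.mean(axis=0))          # ≈ || ∫ f dμ ||
rhs = norm(values).mean()                # ≈ ∫ ||f|| dμ

print(f"||∫ f dμ|| ≈ {lhs:.4f},  ∫ ||f|| dμ ≈ {rhs:.4f}")
assert lhs <= rhs + 1e-12  # holds for the empirical measure, up to rounding
```

Here the empirical average is itself a finite convex combination, so the inequality for the samples is just the finite triangle inequality; the Monte Carlo estimates converge to the two integrals as the sample size grows.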