
My question is at the bottom, but I'll give the set up first:

Let $\mathbb{P}$ be a probability measure on a set $\Omega$ equipped with a sigma algebra $\mathcal{F}$. A random variable $X:\Omega \rightarrow \mathbb{R}^n$ is defined to be an $\mathcal{F}$-measurable function, i.e. one for which the preimage $$ X^{-1}(\mathcal{U}) \equiv \{\omega \in \Omega : X(\omega)\in \mathcal{U} \}$$

lies in $\mathcal{F}$ for every open $\mathcal{U} \subset \mathbb{R}^n$. A simple random variable $Y$ is one of the form $ Y = \sum_{i=1}^m{y_i}1_{A_i}$ with $y_i \in \mathbb{R}^n$ and $A_i \in \mathcal{F}$. We define the expected value of the simple random variable $Y$ w.r.t. the probability measure $\mathbb{P}$ to be $\mathbb{E}^\mathbb{P}[Y] = \sum_{i=1}^m{y_i\,\mathbb{P}(A_i)}$, which all makes sense to me. Now for my QUESTION:

I am looking at a book that defines the expectation of a general random variable $X$ as: $ \mathbb{E}^\mathbb{P}[X] = \sup\{\mathbb{E}^\mathbb{P}[Y],Y\hbox{ is a simple r.v.},0 \leq Y \leq X \}. $

What is the meaning of $0 \leq Y \leq X$ for functions $X,Y\colon\Omega \rightarrow \mathbb{R}^n$? It seems like this is analogous to the construction of the Lebesgue integral, but extending to $\mathbb{R}^n$ for $n>1$ seems nontrivial to me.
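For concreteness, here is a small numerical sketch of the simple-random-variable expectation formula $\mathbb{E}^\mathbb{P}[Y] = \sum_i y_i\,\mathbb{P}(A_i)$ on a toy finite $\Omega$ (the sample space, events, and values are all made up for illustration):

```python
import numpy as np

# Toy example: Omega = {0, 1, 2, 3} with the uniform probability measure.
P = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}

# Events A_i in F and corresponding vector values y_i in R^2.
A = [{0, 1}, {2}, {3}]
y = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([0.0, 5.0])]

def expectation_simple(y, A, P):
    """E[Y] = sum_i y_i * P(A_i), a vector in R^n computed componentwise."""
    return sum(y_i * sum(P[w] for w in A_i) for y_i, A_i in zip(y, A))

print(expectation_simple(y, A, P))  # 0.5*[1,2] + 0.25*[3,0] + 0.25*[0,5]
```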

1 Answer


We can define the Lebesgue integral for functions with values in $\Bbb R^n$ componentwise, that is, $\int g\,d\mu:=\left(\int g_j\,d\mu\right)_{j=1}^n$. With this convention, the definition in your book is equivalent to $$E^{\mu}[X]:=\sup\{E^{\mu}[Y] : Y \mbox{ is a simple random variable},\ 0\leq \langle Y,e_j\rangle\leq \langle X,e_j\rangle,\ 1\leq j\leq n \},$$ i.e. the inequality $0 \leq Y \leq X$ is understood coordinatewise.
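The coordinatewise supremum can be checked numerically with the standard dyadic simple approximation $Y_k = \lfloor 2^k X\rfloor / 2^k$, which satisfies $0 \leq \langle Y_k, e_j\rangle \leq \langle X, e_j\rangle$ in each coordinate and whose expectation increases to $E^\mu[X]$. A minimal sketch, assuming $\Omega = [0,1]$ with Lebesgue measure and the made-up example $X(\omega) = (\omega, \omega^2)$, so that $E[X] = (1/2, 1/3)$:

```python
import numpy as np

def dyadic_expectation(k, m=100_000):
    """E[Y_k] for the dyadic simple approximation Y_k = floor(2^k X)/2^k,
    estimated on a midpoint grid of Omega = [0,1] with uniform measure."""
    w = (np.arange(m) + 0.5) / m          # midpoint grid on Omega = [0,1]
    X = np.stack([w, w**2], axis=1)       # X(w) = (w, w^2) in R^2
    Y = np.floor(2**k * X) / 2**k         # simple r.v.; 0 <= Y <= X componentwise
    return Y.mean(axis=0)                 # E[Y_k], averaged over the uniform measure

# E[Y_k] increases with k toward E[X] = (1/2, 1/3).
for k in (2, 6, 12):
    print(k, dyadic_expectation(k))
```

Each $Y_k$ is simple (it takes finitely many values on measurable level sets), and refining $k$ only raises its value, which is why the supremum in the definition is attained in the limit.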