
Now that I am studying probability in a more mathematical manner, I keep running into gaps in my background, especially in measure theory. My problem is the following: given

$I = \int\limits_{\mathbb{R}} F_{\mu}(x) \mu(dx) $

where $\mu$ is a finite measure on $\mathbb{R}$ with $\mu(\mathbb{R})=1$ and $F_{\mu}(x)=\mu((-\infty,x])$ is its distribution function, show that

  • $I=1$ $\Rightarrow$ $\mu=\delta_a$
  • $I\geq \frac{1}{2}$
  • if $\mu(dx)=\rho(x)dx$ then $I=\frac{1}{2}$

There was also a requirement to show that $I$ converges and is at most $1$. I did this using the fact that

$F_{\mu}(x) \leq \mathbb{1}_{\mathbb{R}}(x)$, so $I\leq \int\limits_{\mathbb{R}} \mathbb{1}_{\mathbb{R}}(x) \mu(dx)= \mu(\mathbb{R})=1$

It was then mentioned that this integral is related to $\mathbb{E}(F_X (X))$.
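Since $I=\mathbb{E}[F_X(X)]$, the claims can be sanity-checked by simulation. A minimal sketch, assuming $X$ is standard exponential (a hypothetical choice) so that $F$ is continuous and $F(X)$ is uniform on $[0,1]$:

```python
import math
import random

random.seed(0)

# Monte Carlo sketch (hypothetical example): for X ~ Exponential(1),
# F(x) = 1 - exp(-x) is continuous, so F(X) is uniform on [0, 1] and
# I = E[F(X)] should be close to 1/2.
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]
I_hat = sum(1.0 - math.exp(-x) for x in samples) / n
print(I_hat)  # approximately 0.5
```

This only illustrates the density case $I=\tfrac12$; the other two claims involve atoms and need an exact computation rather than sampling.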

Still, I find it hard to build intuition for the whole measure-theoretic machinery. Could someone also recommend undergraduate probability books at a similar (and not too heavily axiomatic) level?

  • For your first question, about showing that $\mu$ must be a point mass, think about what it means when $\mu$ is NOT a point mass. Convince yourself that you can then find two distinct points $a<b$ with $0<F(a)$ and $F(b)<1$. It may help you to know that $F$ is right-continuous and $0\leq F(x)\leq 1$ for every $x$. (2012-03-26)

2 Answers


For the first one. Let $G(x)=\mu((x,+\infty))$. Then $G(x)+F(x)=1$ for all $x \in \mathbb{R}$.

So $\int_{\mathbb{R}} (F(x)+G(x))\, d\mu=1$, and since we are assuming $I=\int_{\mathbb{R}} F(x)\, d\mu=1$, it follows that $\int_{\mathbb{R}} G(x)\, d\mu=0$.

Since $G$ is nonnegative, we have $G=0$ $\mu$-a.s.; so there exists a measurable set $A$ with $\mu(A)=1$ and $G(x)=0$ for all $x \in A$.

The set $A$ can be chosen of the form $[a,+\infty)$. Indeed, $A$ is bounded from below because $\lim_{x \to -\infty}G(x)=1$. So set $a=\inf A$. By the defining property of the infimum and the right continuity of $G$, we get $G(a)=0$. Finally, $G$ is non-increasing, so $G(x)=0$ for all $x>a$.

Then $G(a)=\mu((a,+\infty))=0$, while $\mu([a,+\infty))\geq\mu(A)=1$, so $\mu(\{a\})=1$.
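For a purely atomic $\mu$ the integral reduces to $I=\sum_i p_i F(x_i)$ with $F(x_i)=\sum_{x_j\leq x_i} p_j$, so the point-mass claim can be checked numerically. The helper below is a hypothetical sketch, not part of the proof:

```python
# Hypothetical helper: compute I = sum_i p_i * F(x_i) for a discrete
# measure given as {atom: weight}; F(x) = mu((-inf, x]) includes the
# atom at x itself.
def I_discrete(atoms):
    cum, total = 0.0, 0.0
    for x, p in sorted(atoms.items()):
        cum += p          # F(x) after adding the atom at x
        total += p * cum
    return total

print(I_discrete({3.0: 1.0}))            # point mass delta_3: I = 1.0
print(I_discrete({0.0: 0.5, 1.0: 0.5}))  # fair coin: I = 0.75 >= 1/2
```

Note that any genuinely two-point measure with weights $p$ and $1-p$ gives $I=p^2+1-p$, which equals $1$ only at $p\in\{0,1\}$, matching the argument above.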


It is worthwhile getting acquainted with the quantile function $q$, which is a kind of inverse to the distribution function $F$. For $0<u<1$ it is defined by $q(u)=\inf\{x: F(x)\geq u\}$, and it satisfies $F(x)\geq u$ if and only if $x\geq q(u)$.

In particular, taking $x=q(u)-\varepsilon$ and letting $\varepsilon\to 0$, we see that $F(q(u)-)\leq u$.
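The defining equivalence $F(x)\geq u \iff x\geq q(u)$ can be sanity-checked numerically. The following sketch uses a hypothetical three-point distribution:

```python
import bisect

# Hypothetical three-point distribution: atoms xs with weights ps.
xs = [0.0, 1.0, 2.0]
ps = [0.2, 0.5, 0.3]
cum = [0.2, 0.7, 1.0]  # running totals of ps

def F(x):
    # F(x) = mu((-inf, x]): cumulative weight of atoms <= x.
    i = bisect.bisect_right(xs, x)
    return cum[i - 1] if i else 0.0

def q(u):
    # q(u) = inf{x : F(x) >= u}: first atom where cum reaches u.
    i = bisect.bisect_left(cum, u)
    return xs[i]

# Check the equivalence F(x) >= u  <=>  x >= q(u) on a grid.
grid_x = [-0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
grid_u = [0.1, 0.2, 0.5, 0.7, 0.9, 1.0]
ok = all((F(x) >= u) == (x >= q(u)) for x in grid_x for u in grid_u)
print(ok)  # True
```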

The random variable $F(X)$ takes values in $[0,1]$, so its expectation is $I=\mathbb{E}[F(X)]=\int_0^1\mathbb{P}(F(X)\geq u)\,du$, which we can rewrite as $\begin{eqnarray*} \mathbb{E}[F(X)] &=&\int_0^1\mathbb{P}(X\geq q(u))\,du\\ &=&\int_0^1 \big(1-F(q(u)-)\big) \,du\\ &\geq&\int_0^1 (1-u)\,du\\ &=&1/2. \end{eqnarray*}$

If $X$ has a density function, or more generally, if $F$ is continuous, then $q$ is actually the inverse of $F$ and we get equality above.