So now that I am studying probability in a more mathematical manner, I keep running into gaps in my analysis background, especially in measure theory. My problem is the following: knowing that
$I = \int\limits_{\mathbb{R}} F_{\mu}(x) \mu(dx) $
where $\mu$ is a finite measure on $\mathbb{R}$ with $\mu(\mathbb{R})=1$ and $F_{\mu}(x)=\mu((-\infty,x])$ is its distribution function, show that:
- $I=1$ $\Rightarrow$ $\mu=\delta_a$ for some $a\in\mathbb{R}$
- $I\geq \frac{1}{2}$
- if $\mu(dx)=\rho(x)\,dx$ for some density $\rho$, then $I=\frac{1}{2}$ (see my guess for this case right after this list)
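For the last point my (unverified) idea is to substitute $u=F_{\mu}(x)$: if $\mu(dx)=\rho(x)\,dx$, then $F_{\mu}'=\rho$ almost everywhere, so formally
$I=\int\limits_{\mathbb{R}} F_{\mu}(x)\,\rho(x)\,dx = \int\limits_{0}^{1} u\, du = \frac{1}{2},$
but I am not sure how to justify this substitution rigorously in the measure-theoretic setting.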
There was also a requirement to show that $I$ is well defined (convergent) and less than or equal to $1$. I did this using the fact that
$0 \leq F_{\mu}(x) \leq \mathbb{1}_{\mathbb{R}}(x)$ for all $x$, so $I\leq \int\limits_{\mathbb{R}} \mathbb{1}_{\mathbb{R}}(x)\, \mu(dx)= \mu(\mathbb{R})=1.$
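As a quick check of my own (not part of the exercise) that this upper bound can be attained: for $\mu=\delta_a$ one has $F_{\mu}(x)=\mathbb{1}_{[a,\infty)}(x)$, hence $I=\int\limits_{\mathbb{R}} F_{\mu}(x)\,\delta_a(dx)=F_{\mu}(a)=1$, which is consistent with the first claim.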
It was then mentioned that this integral is related to $\mathbb{E}(F_X(X))$, where $X$ is a random variable with law $\mu$.
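To get a rough feeling for the values involved, I also tried a small numerical experiment (a sketch of my own, assuming `numpy` and `scipy` are available; the particular distributions below are just examples I picked):

```python
# Sanity checks of I = E[F_mu(X)] for a few concrete choices of mu.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Continuous case mu(dx) = rho(x) dx with X ~ N(0, 1): expect I close to 1/2.
x = rng.standard_normal(10**6)
print(norm.cdf(x).mean())        # ~ 0.5

# Purely atomic case, a fair coin on {0, 1}: F(0) = 1/2 and F(1) = 1,
# so I = (1/2)(1/2) + (1/2)(1) = 3/4, which is >= 1/2 but < 1.
print(0.5 * 0.5 + 0.5 * 1.0)     # 0.75

# Point mass mu = delta_a: F_mu(a) = 1, so I = F_mu(a) = 1.
```

I guess the continuous case reflects the fact that $F_X(X)$ is uniform on $[0,1]$ when $F_X$ is continuous, but I would appreciate a measure-theoretic explanation.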
Still, I find it hard to picture the whole measure-theoretic setup. Could someone also recommend some undergraduate probability books at a similar, not too demanding, level of axiomatics?