
If I have a random variable $X$ with probability density function $f(x)$, would somebody give me an intuitive reason why the expected value of $X$ is $\int^t_0 x f(x)\, dx$?

  • Do you understand the discrete case, $\mathsf{E}[X] = \sum x\, p(x)$? (2011-04-14)

2 Answers


The integral should be taken not over $[0,t]$ but over the set of possible values of $X$. In this integral each possible value $x$ is weighted by its density $f(x)$ (a measure of its importance, or likelihood), and then averaged: the integral sums all possible values of the random variable $X$, taking their likelihoods into account.

In the discrete case the same idea gives the sum $ \mathsf{E}[X] = \sum\limits_i x_i P(X = x_i). $
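As a minimal numerical sketch of that discrete sum (the fair die here is a hypothetical example, not from the question): each value is weighted by its probability and the weighted values are summed.

```python
# Discrete expectation: E[X] = sum over i of x_i * P(X = x_i).
# Hypothetical example: a fair six-sided die, P(X = k) = 1/6,
# whose expectation is 21/6 = 3.5.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected = sum(x * p for x, p in zip(values, probs))
print(expected)
```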

In the continuous case, on the other hand, $P(X = x) = 0$ for every $x$ (provided the distribution has no atoms). That is why we use a density rather than probabilities, and an integral rather than a sum.


Let us suppose for simplicity that the function $f(x)$ is $0$ outside the interval $[a,b]$.

Divide the interval into $n$ subintervals, each of length $h =(b-a)/n$. Let $x_0=a$, $x_1=a+h$, $x_2=a+2h$, and so on up to $x_{n}=a+nh=b$. The smooth curve $y=f(x)$ can be approximated by the broken-line curve $y=f(x_0)$ on the interval $[x_0,x_1)$, $y=f(x_1)$ on the interval $[x_1,x_2)$, and so on up to $y=f(x_{n-1})$ on the last interval. The probability that $X$ lies in the interval $[x_i,x_{i+1})$ is then approximately $f(x_i)h$.

Our random variable behaves much like the discrete random variable which is $x_0$ with probability $f(x_0)h$, $x_1$ with probability $f(x_1)h$, and so on up to $x_{n-1}$ with probability $f(x_{n-1})h$. This discrete random variable has expectation $$\sum_{i=0}^{n-1} x_i f(x_i) h=\sum_{i=0}^{n-1} x_i f(x_i) \frac{b-a}{n}.$$ Now let $n \to\infty$. The approximation gets better and better, and the displayed sum approaches the Riemann integral $\int_a^b x f(x)\,dx$.
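The limiting argument above can be checked numerically. A minimal sketch, using the hypothetical density $f(x) = 2x$ on $[0,1]$ (not from the answer), whose exact expectation is $\int_0^1 x \cdot 2x\, dx = 2/3$:

```python
# Riemann-sum approximation of E[X] = integral of x f(x) dx over [a, b],
# exactly as in the argument above: the discrete random variable takes
# value x_i with probability f(x_i) * h, where h = (b - a) / n.

def f(x):
    # Hypothetical density on [0, 1]; integrates to 1, with E[X] = 2/3.
    return 2 * x

a, b = 0.0, 1.0

def approx_expectation(n):
    h = (b - a) / n
    # Sum of x_i * f(x_i) * h over the left endpoints x_i = a + i*h.
    return sum((a + i * h) * f(a + i * h) * h for i in range(n))

for n in (10, 100, 1000, 10000):
    print(n, approx_expectation(n))
# As n grows, the sums approach the exact value 2/3.
```

Refining the partition (larger $n$) makes the broken-line approximation of $f$ tighter, so the discrete expectation converges to the integral.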