
For a random variable $X$ that follows some distribution, $f(x)$ is the probability density function of that distribution if and only if $\mathbb{E}[\phi(X)] = \int_{-\infty}^\infty \phi(x) f(x)\,dx$ for all functions $\phi$.

Context: My professor used this in lecture to demonstrate a way to find the distribution of $cX$ given a random variable $X$ that follows a specific distribution.
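If I have understood the technique correctly, it goes something like this (assuming a constant $c>0$ for simplicity): $\mathbb{E}[\phi(cX)] = \int_{-\infty}^\infty \phi(cx) f(x)\,dx = \int_{-\infty}^\infty \phi(y)\,\tfrac{1}{c} f\!\left(\tfrac{y}{c}\right) dy$ after substituting $y=cx$, and matching this against the statement above suggests that $\tfrac{1}{c}f(y/c)$ is the density of $cX$.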

  1. What is the name of this theorem?
  2. Have I missed any qualifications/caveats? I am particularly curious whether "for all functions" is correct. My professor mentioned "positive, bounded, deterministic functions," but I am not sure what he meant by that.
  3. Where can I find a proof of this theorem?
  • @Donkey_2009 Yes, this is how Ross cites it, if I recall. (2015-05-17)

1 Answer


A probability density function is usually defined in the following way:

Let $X$ be a random variable. Then $f$ is the probability density function of the distribution given by $X$ if and only if $f(x)\geq 0$ for all $x\in\mathbb{R}$ and $ P(X\in A)=\int_A f(x)\,\lambda(\mathrm d x), \quad A\in\mathcal{B}(\mathbb{R}). \tag{1} $
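To make $(1)$ concrete: if, for instance, $X$ is exponentially distributed with rate $\lambda>0$, then $f(x)=\lambda e^{-\lambda x}$ for $x\geq 0$ (and $f(x)=0$ otherwise) satisfies $(1)$, since $P(X\in[a,b])=\int_a^b\lambda e^{-\lambda x}\,\mathrm dx=e^{-\lambda a}-e^{-\lambda b}$ for $0\leq a\leq b$.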

However, $(1)$ is equivalent to both of the following

$ P(a\leq X\leq b)=\int_a^b f(x)\,\lambda(\mathrm dx),\quad -\infty<a\leq b<\infty, \tag{2} $

and

$ \mathrm{E}[\varphi(X)]=\int_\mathbb{R}\varphi(x)f(x)\,\lambda(\mathrm dx),\tag{3} $ for every measurable and integrable (with respect to the measure $f\lambda$) function $\varphi$.

It is clear that $(3)\Rightarrow(2)$ (take $\varphi=1_{[a,b]}$) and $(1)\Rightarrow(2)$ (take $A=[a,b]$). On the other hand, $(2)\Rightarrow(1)$ by Dynkin's lemma (the $\pi$-$\lambda$ theorem), and $(1)\Rightarrow(3)$ by a standard argument that is often used in probability theory (sometimes called the "standard machine"). The argument goes as follows:

a) The property $(3)$ holds for all indicator functions according to $(1)$.
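Spelled out: for $\varphi=1_A$ with $A\in\mathcal{B}(\mathbb{R})$, $\mathrm{E}[1_A(X)]=P(X\in A)=\int_A f(x)\,\lambda(\mathrm dx)=\int_\mathbb{R}1_A(x)f(x)\,\lambda(\mathrm dx)$, which is exactly $(3)$.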

b) If $\varphi$ and $\psi$ are two functions satisfying $(3)$, then $ \begin{align*} \mathrm{E}[(\varphi+\psi)(X)]=\mathrm{E}[\varphi(X)]+\mathrm{E}[\psi(X)]&=\int_\mathbb{R}\varphi(x)f(x)\,\lambda(\mathrm dx)+\int_\mathbb{R}\psi(x)f(x)\,\lambda(\mathrm dx)\\ &=\int_\mathbb{R}(\varphi+\psi)(x)f(x)\,\lambda(\mathrm dx), \end{align*} $ and so $\varphi+\psi$ satisfies $(3)$. In a similar fashion one can show that $\mathrm{E}[\alpha\varphi(X)]=\alpha\,\mathrm{E}[\varphi(X)]$ for $\alpha\in\mathbb{R}$, and hence the set of functions satisfying $(3)$ forms a vector space.
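In particular, combining a) and b), the property $(3)$ holds for every simple function $\varphi=\sum_{i=1}^n\alpha_i 1_{A_i}$ with $\alpha_i\in\mathbb{R}$ and $A_i\in\mathcal{B}(\mathbb{R})$.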

c) Suppose that $(\varphi_n)_{n\geq 1}$ is a pointwise non-decreasing sequence of non-negative functions satisfying $(3)$ such that $\varphi=\lim_{n\to\infty}\varphi_n$ exists pointwise. Then by applying the monotone convergence theorem (twice, once on each side) we have $ \mathrm{E}[\varphi(X)]=\lim_{n\to\infty}\mathrm{E}[\varphi_n(X)]=\lim_{n\to\infty}\int_\mathbb{R}\varphi_n(x)f(x)\,\lambda(\mathrm dx)=\int_\mathbb{R}\varphi(x)f(x)\,\lambda(\mathrm dx). $
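This covers every non-negative measurable $\varphi$: for example, the standard dyadic truncations $\varphi_n=\min\!\left(n,\,2^{-n}\lfloor 2^n\varphi\rfloor\right)$ are simple, non-negative, and increase pointwise to $\varphi$, so a), b) and c) together give $(3)$ for all such $\varphi$.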

Now the standard argument yields that $(3)$ holds for every $\varphi$ that is measurable and integrable with respect to the measure $f\lambda$: such a $\varphi$ splits as $\varphi=\varphi^+-\varphi^-$ with $\varphi^\pm$ non-negative and measurable, and b) and c) apply to each part.

This also shows that $\varphi$ in $(3)$ could just as well have been required to be "positive and bounded": a non-negative, bounded, measurable $\varphi$ is automatically integrable with respect to the probability measure $f\lambda$. (The word "deterministic" presumably just emphasizes that $\varphi$ is a fixed function, not itself random.)
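As a concrete instance, taking the bounded, non-negative $\varphi=1_{(-\infty,t]}$ in $(3)$ recovers the distribution function: $F_X(t)=P(X\leq t)=\int_{-\infty}^t f(x)\,\lambda(\mathrm dx)$.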
