
I have a random variable, $x$, distributed according to a probability density function (pdf), $f\left(x\right)$. The Strong Law of Large Numbers (SLLN) implies that, for the expected value given by:

$ E\left[x\right]=\int x f\left(x\right)\text{d}x $

if I take $N$ samples from $f\left(x\right)$, call them $x_j$ (where $j=1,2,\dots,N$), then the sample average will converge to the expected value as $N\rightarrow\infty$:

$\bar{x}_N = \frac{1}{N}\sum_{j=1}^N x_j \rightarrow E\left[x\right]\ \ \text{as}\ \ N\rightarrow\infty$
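This convergence is easy to check numerically. Below is a minimal Monte Carlo sketch, using $x\sim\text{Exponential}(1)$ (so $E\left[x\right]=1$) purely as an illustrative choice of $f\left(x\right)$, not a distribution from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative density (assumption): X ~ Exponential(1), so E[X] = 1.
N = 100_000
x = rng.exponential(scale=1.0, size=N)

# Running sample averages x̄_N for N = 1, 2, ..., 100000.
running_mean = np.cumsum(x) / np.arange(1, N + 1)

print(running_mean[99], running_mean[-1])  # the second value is much closer to 1
```

The running average wanders for small $N$ and settles near $E\left[x\right]=1$ as $N$ grows, as the SLLN guarantees.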

Now say that I have another random variable defined by a transformation of $x$:

$y=g\left(x\right)$

with corresponding pdf, $h\left(y\right)$. To use a specific example, let $y=e^{-ikx}$.

  1. Is $y$ a random variable?
  2. What is the relationship between the two pdfs, $f\left(x\right)$ and $h\left(y\right)$? E.g. is $h\left(y\right)=f\left(g\left(x\right)\right)$? It seems not, but there ought to be a straightforward relationship.
  3. Is there any meaning to the following expression?

$\int y f\left(x\right) \text{d}x=\int e^{-ikx} f\left(x\right) \text{d}x$

The reason for $\left(3\right)$ is that I want to know if the following is true:

$ \bar{y}_N=\frac{1}{N}\sum_{j=1}^N y_j=\frac{1}{N}\sum_{j=1}^N e^{-ikx_j}\rightarrow\int e^{-ikx} f\left(x\right) \text{d}x\ \ \text{as}\ \ N\rightarrow\infty $

The answer given in this post seems to suggest that this last expression holds, but the SLLN applied to $y$ implies $ \bar{y}_N=\frac{1}{N}\sum_{j=1}^N y_j\rightarrow\int y\, h\left(y\right) \text{d}y$, and the two seem contradictory. Your thoughts?
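The limit in question can be tested numerically for a case where the integral is known in closed form. As an assumed example (not from the question), take $x\sim N(0,1)$, for which $\int e^{-ikx} f\left(x\right)\text{d}x = e^{-k^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: X ~ N(0,1), for which the integral
# ∫ e^{-ikx} f(x) dx has the closed form exp(-k**2 / 2).
k = 0.7
N = 200_000
x = rng.standard_normal(N)

y_bar = np.mean(np.exp(-1j * k * x))   # sample average of y_j = e^{-i k x_j}
exact = np.exp(-k**2 / 2)

print(y_bar, exact)  # the two agree to a few decimal places
```

The sample average of $e^{-ikx_j}$ matches the closed-form integral, consistent with the claimed limit.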

1 Answer


I always find it irritating when people use the same letter to refer both to the random variable and to the variable being integrated in expressions like your $\int xf(x)\,dx$ above. I prefer in most circumstances to use a capital $X$ to refer to the random variable and lower-case $x$ to refer to the bound variable in the integral. That way when one refers to $f_X(x)$, one knows what is meant, and an expression like $f_X(3)$ means the value of the density function at $3$.

That is the notation I will follow below.

If $X$ is a random variable then so is $g(X)$. One need not find $f_{g(X)}(x)$ in order to find $\mathbb{E}(g(X))$, since, by the law of the unconscious statistician,

$ \mathbb{E}(g(X)) = \int_{-\infty}^\infty g(x) f_X(x)\,dx. $

If $\mathbb{E}\left(|g(X)|^2\right)<\infty$, then the strong law of large numbers is applicable to the probability distribution of $g(X)$, so the sample average of an i.i.d. sample from that distribution will converge almost surely to $\mathbb{E}(g(X))$. (I use an absolute value since I don't want to require $g(X)$ to be real, so the square would not otherwise necessarily be positive.)
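The law of the unconscious statistician can be verified numerically: the LOTUS integral and the sample average of $g(X_j)$ agree, with no need for the density of $g(X)$. A minimal sketch, assuming $g(x)=x^2$ and $X\sim N(0,1)$ (so $\mathbb{E}(g(X))=1$ exactly) as an illustrative case:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative case (assumption): g(x) = x**2, X ~ N(0,1), E[g(X)] = 1.
f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # density f_X
g = lambda x: x**2

# E[g(X)] via the LOTUS integral ∫ g(x) f_X(x) dx (trapezoidal rule).
grid = np.linspace(-8.0, 8.0, 100_001)
vals = g(grid) * f(grid)
lotus = np.sum((vals[1:] + vals[:-1]) / 2 * np.diff(grid))

# E[g(X)] via the sample average of g(X_j) -- density of g(X) never used.
sample = np.mean(g(rng.standard_normal(500_000)))

print(lotus, sample)  # both near 1
```

Both routes give the same expectation, which is exactly why one never needs $f_{g(X)}$ to compute $\mathbb{E}(g(X))$.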