
In probability theory we often use the existence of a sequence $(X_n)_n$ of independent and identically distributed random variables. This was already discussed here. One of the answers says:

As Ahriman has pointed out, if you are given a random variable $X:\Omega\to E$ it may not be possible to construct the whole sequence on $\Omega$ as the latter may be quite a poor space, so you would have to go for a richer space.

My question is the following: How could I "enrich" my given probability space $\Omega$ such that I can ensure the existence of iid random variables on this probability space?

My idea was the following: Assume that I have a given probability space $(\Omega_1,\mathcal{A}_1,\mathbb{P}_1)$ and a random variable $X:\Omega_1 \to E$. Now I can construct a probability space $(\Omega_2,\mathcal{A}_2,\mathbb{P}_2)$ on which there exists a sequence of iid random variables $X_n: \Omega_2 \to E$. Let $(\Omega,\mathcal{A},\mathbb{P}) := (\Omega_1,\mathcal{A}_1,\mathbb{P}_1) \otimes (\Omega_2,\mathcal{A}_2,\mathbb{P}_2)$ be the product space; then

$X_n'(\omega_1,\omega_2) := X_n(\omega_2) \qquad \qquad X'(\omega_1,\omega_2) := X(\omega_1)$

would still satisfy $X' \sim X$ and $X_n' \sim X_n$, and the random variables $X_n'$ would be independent. Is this correct?
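As a sanity check, the product construction above can be simulated. The following is a minimal Python sketch (all names hypothetical, not from the question): the two coordinates of $(\omega_1,\omega_2)$ are drawn independently, which is exactly what the product measure $\mathbb{P}_1 \otimes \mathbb{P}_2$ prescribes, and here $\Omega_1 = [0,1]$ with Lebesgue measure and $X$ is taken to be a Bernoulli(1/2) variable for concreteness.

```python
import random

def sample_product_point(n, rng):
    """Draw one point (w1, w2) of the product space Omega_1 x Omega_2.

    The coordinates are sampled independently, mirroring the product
    measure P1 (x) P2.  Here Omega_1 = [0,1] and Omega_2 = [0,1]^n,
    both with Lebesgue measure (an illustrative choice).
    """
    w1 = rng.random()                       # point of Omega_1
    w2 = [rng.random() for _ in range(n)]   # point of Omega_2
    return w1, w2

def X(w1):
    """The originally given random variable on Omega_1: Bernoulli(1/2)."""
    return w1 < 0.5

def X_n(w2, n):
    """The n-th member of the iid sequence, defined on Omega_2 only."""
    return w2[n] < 0.5

# X'(w1, w2) := X(w1) and X_n'(w1, w2) := X_n(w2) as in the question:
rng = random.Random(0)
w1, w2 = sample_product_point(3, rng)
X_prime = X(w1)
X1_prime = X_n(w2, 0)
```

Because $X'$ depends only on the first coordinate and each $X_n'$ only on the second, independence of $X'$ from the sequence (and of the $X_n'$ from each other) follows directly from the product structure.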

@dtldarek FYI, a sufficient condition (certainly satisfied here, since we are dealing with probability measures) for constructing a product space out of two measure spaces is that the measures be sigma-finite. This is the condition I operate under, and I *believe* it is what most working analysts use. You can always construct the product space over an index set of arbitrary cardinality if you ignore the third coordinate of the data: the measure. But if you want the measure too, then you need some regularity conditions, and the measures must be probability measures if the index set of the product is infinite. – 2014-12-25

3 Answers


Saz, your approach is correct; in fact, it is an instance of Kolmogorov's existence theorem. A proof can be found on page 482 of Probability and Measure by Patrick Billingsley. In that book you can see how to construct independent processes and random variables via finite-dimensional distributions. These independent processes and random variables will be projections of the original ones, and they will indeed have the same distributions.


I cannot comment yet, so I'm posting this as an answer.$\def\ci{\perp\!\!\!\perp}$

This is probably not what you were asking, but I think it's interesting and relevant enough to post.

It is possible to construct arbitrary distributions from uniform random variables. Furthermore, given a single $\mathcal U[0,1]$ variable, it is possible to produce from it an i.i.d. sequence of such variables, which can then be used to obtain more general distributions. We can always extend a space so that it carries such a variable: set $\hat{\Omega}=\Omega\times[0,1]$, $\hat{\mathscr{A}}=\mathscr{A}\otimes\mathscr{B}$, $\hat{P}=P\otimes\lambda$, in which case $\vartheta(\omega,t):= t$ is $\mathcal{U}[0,1]$ and $\vartheta\ci \mathscr{A}$.
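The claim that one $\mathcal U[0,1]$ variable yields an i.i.d. sequence rests on the fact that the binary digits of a uniform variable are i.i.d. Bernoulli(1/2), so dealing the digits out into $k$ disjoint streams yields $k$ independent uniforms. Here is an illustrative Python sketch of that digit-splitting (hypothetical helper, not from the answer); note that a machine float carries only about 53 significant bits, so the streams are truncated approximations, whereas the measure-theoretic argument uses the exact binary expansion.

```python
def split_uniform(u, k, n_bits=24):
    """Split one number u in [0,1) into k numbers in [0,1).

    Extracts the first n_bits * k binary digits of u and routes digit i
    to stream i mod k.  Since the digits of a U[0,1] variable are iid
    Bernoulli(1/2) and the streams use disjoint digit sets, the outputs
    are (approximately, given finite precision) independent uniforms.
    """
    bits = []
    x = u
    for _ in range(n_bits * k):
        x *= 2
        b = int(x)       # next binary digit of u
        bits.append(b)
        x -= b
    streams = [[] for _ in range(k)]
    for i, b in enumerate(bits):
        streams[i % k].append(b)
    # reassemble each digit stream into a number in [0, 1)
    return [sum(b * 2.0 ** -(j + 1) for j, b in enumerate(s)) for s in streams]
```

For example, $u = 0.5 = 0.1000\ldots_2$ splits into the streams $0.5$ (which receives the leading $1$) and $0.0$. Iterating the same idea on each output, or splitting into countably many streams along a pairing of $\mathbb N \times \mathbb N$, gives a whole i.i.d. sequence from one variable.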

For more details see Kallenberg - Foundations of Modern Probability (2002), in particular the discussion before Theorem 6.10 (transfer).


To give this a simple answer:

Yes, the approach described in the question works fine.