
One learns in a probability course that a (real) random variable is a measurable mapping of some probability space $(\Omega,\mathcal{A},\mathbf{P})$ into $(\mathbb{R},\mathcal{B}(\mathbb{R}))$. But as soon as one gets into topics that are a little advanced, the space $(\Omega,\mathcal{A},\mathbf{P})$ is not mentioned unless it is absolutely necessary. After a long time of frustration, I have become quite comfortable with this language. But some things still trouble me. The following kind of reasoning comes up in the book I'm reading:

The author says that $(X_i)_{i\in I}$ is a family of random variables and specifies the distribution of each random variable. Then he states some (random) proposition $A((X_i)_{i\in I})$ (this is a little imprecise, I hope you get the meaning) and talks about $\mathbf{P}[A((X_i)_{i\in I}) \text{ holds}]$.

My question: Let $(\Omega',\mathcal{A}',\mathbf{P}')$ be another probability space and $(Y_i)_{i\in I}$ random variables such that, for each $i\in I$, the distribution of $Y_i$ is the same as the distribution of $X_i$. Is it then obvious that $\mathbf{P}[A((X_i)_{i\in I}) \text{ holds}]=\mathbf{P}'[A((Y_i)_{i\in I}) \text{ holds}]$?

Now my guess is that this is true, but needs a proof, which is not completely trivial in the case that $I$ is infinite, at least not for a beginner. However, in the book this problem isn't discussed at all. So did I miss something?

Edit:

I'm not sure whether the question was correctly understood, so I'll rephrase it a little.

Let $(\Omega,\mathcal{A},\mathbf{P})$ and $(\Omega',\mathcal{A}',\mathbf{P}')$ be two probability spaces, $I$ a set, and $(X_i)_{i\in I}$ and $(Y_i)_{i\in I}$ families of random variables on $(\Omega,\mathcal{A},\mathbf{P})$ and $(\Omega',\mathcal{A}',\mathbf{P}')$ respectively such that, for each $i\in I$, the distribution of $X_i$ is equal to the distribution of $Y_i$. Let $J$ be a countable subset of $I$ and $B_j$ a Borel set for each $j\in J$. The question is:

Is it obvious that $\mathbf{P}\left[\bigcup_{j\in J}\{X_j\in B_j\}\right]=\mathbf{P}'\left[\bigcup_{j\in J}\{Y_j\in B_j\}\right]$?

The sets $B_j$ and the union over $J$ are just an example. What I mean, but cannot formalize, is this: let $A\in\mathcal{A}$ and $A'\in\mathcal{A}'$ be such that there is an expression for $A$ in terms of the $X_i$, and $A'$ is given by the same expression with $X_i$ replaced by $Y_i$ for each $i$. Is it obvious that $\mathbf{P}[A]=\mathbf{P}'[A']$?

  • If I'm reading your question correctly, the answer is no: you need to know that the _joint_ distributions are the same. If the variables are independent then this is automatic. – 2011-02-21
  • If it is an uncountably infinite collection of random variables (often the case in stochastic process theory), then even independence is not enough. Edit: Now that I have read Shai Covo's answer, I see he is also making this point. – 2011-02-21
  • @Qiaochu @George: I hope I've clarified the question. Do your comments still apply? – 2011-02-21
  • @Stefan: yes. You need to know the joint distribution or you don't know anything. You can find a counterexample with two Bernoulli random variables (taking one pair to be independent and the other to be identical); see the sketch after these comments. – 2011-02-21
  • @Qiaochu: So to prove it in the case that the families are independent and $I$ is countably infinite, do I have to know that there is a unique product measure and that this has to be the joint distribution of an independent family of random variables? Infinite product measures are in chapter 14 of my book, while the proof for which I think one needs this statement is in chapter 2! – 2011-02-21
  • @Qiaochu: *non-constant*. – 2011-02-21
  • @Stefan: morally you just need to show that you can compute any desired probability of the above form in terms of probabilities of the form $\mathbb{P}(X_i \in B_i)$ by repeatedly applying independence (spelled out after these comments). @Didier: ah, my mistake. – 2011-02-21
  • @Stefan Walter, [Kolmogorov's extension theorem](http://en.wikipedia.org/wiki/Kolmogorov_extension_theorem) might give you the answer you need. If the $X_i$ and the $Y_i$ are independent, then the collections $(X_i)_i$ and $(Y_i)_i$ satisfy the conditions of the theorem. – 2011-02-21
  • @Qiaochu: Thanks for your help! – 2011-02-22
  • Did you get something out of one of the answers below? – 2011-04-07
  • @Didier: Yes, your answer gave a counterexample to my much too general statement (and I upvoted it then). I didn't accept it because the important point for me was not that more assumptions are needed, but rather the nature of a proof under these extra assumptions (how trivial is it?). This was at least indirectly answered by Qiaochu in the comments, so I left it at that. – 2011-04-07
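
To make the two-Bernoulli counterexample from the comments concrete, here is a minimal sketch; the fair-coin pmfs and the helper `prob` are my own choices for illustration, not anything from the book. Both pairs have Bernoulli$(1/2)$ marginals, but the union event from the edit gets different probabilities:

```python
from itertools import product

# Joint pmf of (X_1, X_2): independent fair Bernoulli variables.
p_indep = {(x1, x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

# Joint pmf of (Y_1, Y_2): identical fair Bernoulli variables (Y_1 = Y_2 almost surely).
p_ident = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}

def prob(pmf, event):
    """P[event], where the event is a predicate on the outcome (x1, x2)."""
    return sum(p for (x1, x2), p in pmf.items() if event(x1, x2))

# The marginals agree: P[X_i = 1] = P'[Y_i = 1] = 1/2 for i = 1, 2.
assert prob(p_indep, lambda x1, x2: x1 == 1) == prob(p_ident, lambda x1, x2: x1 == 1) == 0.5
assert prob(p_indep, lambda x1, x2: x2 == 1) == prob(p_ident, lambda x1, x2: x2 == 1) == 0.5

# But the union event {X_1 = 1} ∪ {X_2 = 1} (i.e. B_1 = B_2 = {1}) gets
# different probabilities: 3/4 under independence, 1/2 when Y_1 = Y_2.
print(prob(p_indep, lambda x1, x2: x1 == 1 or x2 == 1))  # 0.75
print(prob(p_ident, lambda x1, x2: x1 == 1 or x2 == 1))  # 0.5
```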
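
For the positive direction under independence sketched by Qiaochu in the comments, here is my own spelled-out version of the computation for the union event from the edit, assuming the $X_j$, $j\in J$, are independent and writing $J_1\subseteq J_2\subseteq\cdots$ for finite sets with $\bigcup_n J_n=J$:

$$\mathbf{P}\left[\bigcup_{j\in J}\{X_j\in B_j\}\right]=1-\mathbf{P}\left[\bigcap_{j\in J}\{X_j\notin B_j\}\right]=1-\lim_{n\to\infty}\prod_{j\in J_n}\bigl(1-\mathbf{P}[X_j\in B_j]\bigr),$$

using continuity from above for the decreasing intersections and independence for the finite products. The right-hand side involves only the marginal distributions, so the same value comes out for the $Y_j$. For general events in $\sigma(X_j:j\in J)$ one falls back on the uniqueness statements (product measures, Kolmogorov extension) mentioned in the comments.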

2 Answers