
One learns in a probability course that a (real) random variable is a measurable mapping of some probability space $(\Omega,\mathcal{A},\mathbf{P})$ into $(\mathbb{R},\mathcal{B}(\mathbb{R}))$. But as soon as one gets into topics that are a little advanced, the space $(\Omega,\mathcal{A},\mathbf{P})$ is not mentioned unless it is absolutely necessary. After a long time of frustration, I have become quite comfortable with this language. But some things still trouble me. The following kind of reasoning comes up in the book I'm reading:

The author says that $(X_i)_{i\in I}$ is a family of random variables and specifies the distribution of each random variable. Then he phrases some (random) proposition $A((X_i)_{i\in I})$ (this is a little imprecise, I hope you get the meaning) and talks about $\mathbf{P}[A((X_i)_{i\in I}) \text{ holds}]$.

My question: Let $(\Omega',\mathcal{A}',\mathbf{P}')$ be another probability space and $(Y_i)_{i\in I}$ random variables such that, for each $i\in I$, the distribution of $Y_i$ is the same as the distribution of $X_i$. Is it then obvious that $\mathbf{P}[A((X_i)_{i\in I}) \text{ holds}]=\mathbf{P}'[A((Y_i)_{i\in I}) \text{ holds}]$?

Now my guess is that this is true, but needs a proof, which is not completely trivial in case $I$ is infinite, at least not for a beginner. However, in the book this problem isn't discussed at all. So did I miss something?

Edit:

I'm not sure whether the question was correctly understood, so I'll rephrase it a little.

Let $(\Omega,\mathcal{A},\mathbf{P})$ and $(\Omega',\mathcal{A}',\mathbf{P}')$ be two probability spaces, $I$ a set, and $(X_i)_{i\in I}$ and $(Y_i)_{i\in I}$ families of random variables on $(\Omega,\mathcal{A},\mathbf{P})$ and $(\Omega',\mathcal{A}',\mathbf{P}')$ respectively, such that, for each $i\in I$, the distribution of $X_i$ is equal to the distribution of $Y_i$. Let $J$ be a countable subset of $I$ and $B_j$ a Borel set for each $j\in J$. The question is:

Is it obvious that $$\mathbf{P}\left[\bigcup_{j\in J}\{X_j\in B_j\}\right]=\mathbf{P}'\left[\bigcup_{j\in J}\{Y_j\in B_j\}\right]?$$

The sets $B_j$ and the union over $J$ are just an example. What I mean, but cannot formalize: Let $A\in\mathcal{A}$ and $A'\in\mathcal{A}'$ be such that there is an expression for $A$ in terms of the $X_i$, and $A'$ is given by the same expression replacing $X_i$ by $Y_i$ for each $i$. Is it obvious that $\mathbf{P}[A]=\mathbf{P}'[A']$?

  • @Didier: Yes, your answer gave a counterexample to my much too general statement (and I upvoted it then). I didn't accept it because the important point for me was not that more assumptions are needed, but rather the nature of a proof under these extra assumptions (how trivial is it?). This was at least indirectly answered by Qiaochu in the comments, so I left it at that. 2011-04-07

2 Answers


The distributions of each $X_i$ and each $Y_i$ are far from being sufficient to decide anything about the families $(X_i)_i$ and $(Y_i)_i$.

Assume for instance that $X_1$, $X_2$, $Y_1$ and $Y_2$ are all uniform $\pm1$ Bernoulli random variables and that $X_1=X_2=Y_1=-Y_2$. Then the event $[X_1=1\ \mbox{or}\ X_2=1]$ has probability $\frac12$ while the event $[Y_1=1\ \mbox{or}\ Y_2=1]$ has probability $1$.
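Not part of the answer, but here is a minimal simulation sketch of this coupling (the variable names are mine) that reproduces the two probabilities:

```python
import random

N = 100_000
count_X = count_Y = 0

for _ in range(N):
    s = random.choice([-1, 1])   # one uniform ±1 Bernoulli draw
    X1, X2 = s, s                # X2 is forced to equal X1
    Y1, Y2 = s, -s               # Y2 is forced to be -Y1
    if X1 == 1 or X2 == 1:
        count_X += 1
    if Y1 == 1 or Y2 == 1:
        count_Y += 1

print(count_X / N)  # approximately 0.5
print(count_Y / N)  # exactly 1.0
```

All four variables have the same marginal distribution, yet the two events have different probabilities because the joint distributions differ.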


The question is not very clear, but here is some idea. Let $U$ be a uniform $[0,1]$ random variable, and define a random process $X=\{X_t: t \in [0,1]\}$ by $X_t = \mathbf{1}(t=U)$, where $\mathbf{1}$ is the indicator function. Then $X$ is identical in law to the zero process $Y$ defined by $Y_t = 0$ for all $t \in [0,1]$; that is, ${\rm P}[X_{t_1} = 0, X_{t_2}=0,\ldots,X_{t_n}=0]=1$ for any choice of $n \geq 1$ and $0 \leq t_1 < \cdots < t_n \leq 1$. However, $X$ and $Y$ are quite different: $X$ is not continuous, its $\sup$ is equal to $1$, etc.
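To spell out why the finite-dimensional distributions agree (a short computation added here, not in the original answer): since $U$ has a continuous distribution, for any fixed $t_1,\ldots,t_n$,
$$
{\rm P}[X_{t_1}=0,\ldots,X_{t_n}=0]
= {\rm P}\bigl[U \notin \{t_1,\ldots,t_n\}\bigr]
= 1 - \sum_{k=1}^{n} {\rm P}[U=t_k]
= 1,
$$
which matches the zero process $Y$; nevertheless $\sup_t X_t = X_U = 1$ almost surely.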

  • I'm not familiar with stochastic processes, so I can't completely understand your answer. But I think you're talking about the converse of what I meant. I hope my question is clearer now, my apologies. 2011-02-21