
Does the order of the variables in the joint probability $P(X,Y,\dots)$ have any bearing on the meaning of the statement? Concretely, is the following true:

$$P(X,Y,Z) = P(Y,X,Z)$$

To my mind this is clearly correct; can someone confirm?

  • 0
    Yes, it is correct, assuming that $X,Y,Z$ denote events. (2012-05-14)
  • 0
    @Gigili If $X, Y, Z$ are random variables, what does $P(X,Y,Z)$ mean? For a joint distribution of random variables, the order _does_ matter very much. (2012-05-14)
  • 0
    @DilipSarwate: According to WP, the joint distribution for $X$ and $Y$ defines the probability of events defined in terms of both $X$ and $Y$, so the order doesn't really matter. I don't get your point; maybe my previous comment is not well-worded. (2012-05-14)
  • 0
    @Gigili If $F(\cdot, \cdot)$ is, say, the cumulative joint probability distribution function of the random variables $X$ and $Y$ **in that order**, that is, $$F(a,b) = \Pr(X \leq a, Y \leq b),$$ then it is **not true** that $F(a,b) = F(b,a)$ since, as Didier indicates with greater generality in his answer, there is no reason why $\Pr(X \leq a, Y \leq b)$ should equal $\Pr(X \leq b, Y \leq a)$. (2012-05-14)
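To make the comment above concrete, here is a small numerical sketch. The distributions are made up for illustration: $X$ uniform on $\{0,1\}$ and $Y$ uniform on $\{0,1,2\}$, taken independent so the joint CDF factors.

```python
# Illustration that F(a,b) = Pr(X <= a, Y <= b) need not be symmetric.
# X uniform on {0, 1}, Y uniform on {0, 1, 2}, independent (made-up example).

def cdf_X(t):
    """Pr(X <= t) for X uniform on {0, 1}."""
    return sum(1 for v in (0, 1) if v <= t) / 2

def cdf_Y(t):
    """Pr(Y <= t) for Y uniform on {0, 1, 2}."""
    return sum(1 for v in (0, 1, 2) if v <= t) / 3

def F(a, b):
    """Joint CDF; factors as Pr(X <= a) * Pr(Y <= b) by independence."""
    return cdf_X(a) * cdf_Y(b)

print(F(0, 2))  # 0.5  = Pr(X <= 0) * Pr(Y <= 2)
print(F(2, 0))  # 1/3  = Pr(X <= 2) * Pr(Y <= 0)  -- not equal to F(0, 2)
```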

1 Answer

4

First, let us use canonical notations: for every random variable $U$ with values in $\mathbb R^n$, $\mathrm P_U$ is the measure on $\mathcal B(\mathbb R^n)$ defined by $\mathrm P_U(B)=\mathrm P(U\in B)$ for every $B$ in $\mathcal B(\mathbb R^n)$. Hence your question is whether, for all real-valued random variables $X$, $Y$ and $Z$ defined on the same probability space and every $B$ in $\mathcal B(\mathbb R^3)$, $$ \mathrm P((X,Y,Z)\in B)=\mathrm P((Y,X,Z)\in B)\ ? $$ In particular, one would have $$ \mathrm P(X\in B,Y\in C)=\mathrm P(Y\in B,X\in C). $$ Dubious, don't you think?

Edit Here is an example. Assume that $X$ and $Y$ are independent, that $X$ is Bernoulli with parameter $p$, hence $\mathrm P(X=1)=1-\mathrm P(X=0)=p$, and that $Y$ is geometric with parameter $a$, hence $\mathrm P(Y=n)=(1-a)a^n$ for every integer $n\geqslant0$. Choose $B=\{0\}$ and $C=\{1\}$. Then, $$ \mathrm P(X\in B,Y\in C)=(1-p)(1-a)a,\qquad \mathrm P(Y\in B,X\in C)=(1-a)p. $$
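The example above can be checked numerically; $p$ and $a$ below are arbitrary illustrative values, not part of the answer.

```python
# Numerical check of the Bernoulli/geometric example above.
# p, a are arbitrary illustrative values.
p, a = 0.3, 0.5

# X ~ Bernoulli(p):  P(X=1) = p,  P(X=0) = 1-p
# Y ~ geometric(a):  P(Y=n) = (1-a) * a**n  for n = 0, 1, 2, ...
# B = {0}, C = {1}
prob_X_in_B_Y_in_C = (1 - p) * (1 - a) * a   # P(X=0) * P(Y=1)
prob_Y_in_B_X_in_C = (1 - a) * p             # P(Y=0) * P(X=1)

print(prob_X_in_B_Y_in_C)  # 0.175
print(prob_Y_in_B_X_in_C)  # 0.15
```

The two probabilities differ, so $(X,Y)$ and $(Y,X)$ cannot have the same distribution.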

Second edit Let us recall that, if $X$ is a random variable with values in, say, a discrete space $E$, then $\mathrm P_X$ is the unique distribution on $(E,2^E)$ such that, for every $B\subseteq E$, $\mathrm P_X(B)=\mathrm P(X\in B)$. Likewise, if $X$ and $Y$ are random variables (defined on the same probability space) with values in some discrete spaces $E$ and $F$, then $\mathrm P_{(X,Y)}$ is the unique distribution on $(E\times F,2^{E\times F})$ such that, for every $B\subseteq E\times F$, $\mathrm P_{(X,Y)}(B)=\mathrm P((X,Y)\in B)$.

Thus, three probability measures are involved: $\mathrm P$ is a probability measure on the probability space (usually denoted) $\Omega$ (and in fact one never uses $\Omega$ nor $\mathrm P$ to perform computations), $\mathrm P_X$ is a probability measure on $(E,2^E)$, and $\mathrm P_{(X,Y)}$ is a probability measure on $(E\times F,2^{E\times F})$.

Recall finally that a probability measure is a function defined on a collection of subsets of a given set, with values in $[0,1]$. For example, $\mathrm P_X:2^E\to[0,1]$. The images $\mathrm P_X(B)$ for $B\subseteq E$ are real numbers in $[0,1]$, not $\mathrm P_X$ itself.
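In the discrete case, these objects are easy to make explicit. A minimal sketch, with made-up weights on $E\times F$ (they sum to $1$; none of the numbers come from the answer):

```python
# A toy discrete joint law P_{(X,Y)} on E x F, as a dict of weights.
E, F = [0, 1], [0, 1, 2]
joint = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
         (1, 0): 0.3, (1, 1): 0.2, (1, 2): 0.1}

def P_joint(B):
    """P_{(X,Y)}(B) for a subset B of E x F: a number in [0, 1]."""
    return sum(joint[pair] for pair in B if pair in joint)

def P_X(B):
    """Marginal P_X(B) = P_{(X,Y)}(B x F)."""
    return P_joint({(x, y) for x in B for y in F})

# P((X,Y) in {(0,1)}) versus P((Y,X) in {(0,1)}) = P((X,Y) in {(1,0)}):
print(P_joint({(0, 1)}))  # 0.2
print(P_joint({(1, 0)}))  # 0.3  -> swapping the coordinates changes the value
```

Note that `P_joint` and `P_X` are functions (measures) while their values, such as `P_joint({(0, 1)})`, are numbers in $[0,1]$, which is exactly the distinction drawn above.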

  • 0
    But shouldn't it be correct? Looking at the product rule $$P(X,Y) = P(X|Y) \cdot P(Y) = P(Y|X) \cdot P(X)$$ it seems that this is commutative. (2012-05-14)
  • 0
    The origin of this question is the proof on pages 9/10 of http://dis.cs.umass.edu/classes/cs683/qa.pdf — I need this in order to prove the conditionalized product rule and Bayes' rule. (2012-05-14)
  • 0
    Also, I am restricting this to the discrete case: http://en.wikipedia.org/wiki/Joint_probability_distribution#Discrete_case (2012-05-14)
  • 2
    @Mahoni: It is not for nothing that my answer begins by fixing your notation. In fact, I think the main source of your trouble is some confusion in the notations you use (but you are not alone in this...). So let me reiterate: what is it **exactly** that you call $P(X,Y)$? A number? A function? A measure? Something else? If we do not fix this once and for all, we could chase our tails until the end of time... (2012-05-14)
  • 0
    $P(X,Y)$ is the joint distribution, where $X$ and $Y$ are random variables. $P$ is the probability for the given random variables, thus a value in $[0,1]$. (2012-05-14)
  • 0
    @Mahoni: Pages 9/10 of the document you link to are concerned with **events**, not with random variables. Your question mentions **variables** and my answer explains the situation for **random variables**. Please make up your mind... (2012-05-14)
  • 0
    I am sorry if I have caused confusion; I never realized that an event $X$ and a random variable $X$ have different meanings in the context of $P(X)$. I edited my question to clarify that $X,Y,Z$ are events. (2012-05-14)
  • 0
    You should have asked *another* question instead of modifying the present one (hence rendering irrelevant any correct answer to your original question). Anyway, if you intend to study randomness, a good idea could be to get acquainted with the basic vocabulary of the field. What would you say if a geometer started calling lines triangles and triangles lines? (2012-05-14)
  • 0
    I can only hope that the renaming happened before I learned this vocabulary. (2012-05-14)
  • 0
    Besides, I didn't alter the original question; I just added information which was not given. But since you've made your point, and for later look-ups, I will restore the original question. I am helped by the fact that I can interchange the order of events. All was well. (2012-05-14)