
I came across a proof claiming that, for arbitrary (discrete) random variables $A$, $B$, $C$, if $P(A \mid B, C) = P(B \mid A, C)$ for every assignment, then it follows that $P(A \mid C) = P(B \mid C)$. The proof is very short: $$P(A, B \mid C) = P(A \mid B, C)\,P(B \mid C) = P(B \mid A, C)\,P(A \mid C),$$ and from the last equality the result follows by cancelling the terms that are equal by assumption.

However, I am not completely sure what happens in the case where $P(A, B \mid C) = 0$ for some assignment to $A$, $B$, $C$. I've been (unsuccessfully) trying to find a counterexample for such a case. Any ideas in either direction?

  • @DidierPiau: It was just given and solved superficially (in the way I described above) as an exercise in a course I attended. — 2012-01-31

1 Answer


Let $A$ and $B$ denote events such that $P(A|B) = P(B|A)$. Does it follow that $P(A) = P(B)$? At a cursory glance, an unwary student might say that $$P(A|B) = P(B|A) \Rightarrow \frac{P(A\cap B)}{P(B)} = \frac{P(A\cap B)}{P(A)} \Rightarrow \frac{1}{P(A)} = \frac{1}{P(B)} \Rightarrow P(A) = P(B).$$ But a warier student would note that if $A \cap B$ is an event of zero probability, then the equality $P(A|B) = P(B|A)$ holds with both sides equal to zero, without it necessarily being true that $P(A) = P(B)$. Thus, if $P(A|B) = P(B|A) > 0$, then $P(A) = P(B)$; but $P(A|B) = P(B|A) = 0$ need not imply that $P(A) = P(B)$.
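This caveat is easy to check numerically. A minimal sketch (with a made-up three-point sample space and disjoint events $A$ and $B$; the specific probabilities are arbitrary):

```python
# Discrete sample space {0, 1, 2}; p[w] is the probability of outcome w.
p = {0: 0.3, 1: 0.5, 2: 0.2}
A = {0}  # P(A) = 0.3
B = {1}  # P(B) = 0.5, disjoint from A, so P(A ∩ B) = 0

def prob(event):
    """Probability of an event (a set of outcomes)."""
    return sum(p[w] for w in event)

def cond(event, given):
    """Conditional probability P(event | given)."""
    return prob(event & given) / prob(given)

assert cond(A, B) == cond(B, A) == 0.0  # both conditionals vanish...
assert prob(A) != prob(B)               # ...yet the marginals differ
```

Since $A \cap B = \emptyset$, both conditional probabilities are zero and the naive cancellation argument breaks down.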

Now suppose that $X$ and $Y$ are discrete random variables taking values in $\{x_i\}$ and $\{y_j\}$ respectively. If $P\{X = x_i\mid Y = y_j\} = P\{Y = y_j\mid X = x_i\} ~~ \text{for all}~ i, j$, then, since it cannot be true that $P\{X = x_i, Y = y_j\} = 0$ for all choices of $i$ and $j$, there must be some $i$ and $j$ such that $P\{X = x_i, Y = y_j\} > 0$. All we can deduce from this is that $P\{X = x_i\} = P\{Y = y_j\}$ for those particular $i$ and $j$. But your statement seems to mean that $X$ and $Y$ have identical marginal distributions, and why that should follow is something I cannot see immediately. As a simple counterexample, suppose that $X$ and $Y$ are Bernoulli random variables such that $\begin{align*} P\{X = 0 \mid Y = 0\} &= P\{Y = 0 \mid X = 0\} = 0\\ P\{X = 0 \mid Y = 1\} &= P\{Y = 1 \mid X = 0\} = 1\\ P\{X = 1 \mid Y = 0\} &= P\{Y = 0 \mid X = 1\} = 1\\ P\{X = 1 \mid Y = 1\} &= P\{Y = 1 \mid X = 1\} = 0 \end{align*}$

From the two middle equations, we deduce that $P\{Y=1\} = P\{X=0\}$ and $P\{Y=0\}=P\{X=1\}$, but it does not follow that $P\{X=1\} = P\{Y=1\}$. In other words, $X$ and $Y$ could be Bernoulli random variables with parameters $p$ and $1-p$ respectively where $p \neq \frac{1}{2}$, and thus have different distributions.
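This counterexample can be checked mechanically. A small sketch, assuming the coupling $X = 1 - Y$ with $Y \sim \text{Bernoulli}(p)$ and the illustrative choice $p = 0.25$ (any $p \neq \tfrac{1}{2}$ works):

```python
# Y ~ Bernoulli(p) and X = 1 - Y, so the joint distribution puts mass
# only on the outcomes (X, Y) = (0, 1) and (1, 0).
p = 0.25  # P{Y = 1}; any p != 1/2 gives different marginals

# joint[x][y] = P{X = x, Y = y}
joint = {0: {0: 0.0, 1: p}, 1: {0: 1 - p, 1: 0.0}}

px = {x: sum(joint[x].values()) for x in (0, 1)}      # marginal of X
py = {y: joint[0][y] + joint[1][y] for y in (0, 1)}   # marginal of Y

def p_x_given_y(x, y):
    return joint[x][y] / py[y]

def p_y_given_x(y, x):
    return joint[x][y] / px[x]

# Every conditional matches its mirror image, as in the four equations above...
for x in (0, 1):
    for y in (0, 1):
        assert p_x_given_y(x, y) == p_y_given_x(y, x)

# ...yet the marginal distributions differ: X ~ Bernoulli(1 - p), Y ~ Bernoulli(p).
assert px[1] != py[1]
```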

Feel free to insert a "$,C$" or "$,Z = z_k$" to the right of $\mid$ everywhere; the same argument goes through conditionally on $C$.

  • Thanks to @DidierPiau and OP 3electrologos for pointing out the mistake: I wrote joint probabilities instead of conditional probabilities in my displayed equations. I have fixed this and rewritten the text to reinforce the claim that the stated conditions do not necessarily lead to identical distributions for $X$ and $Y$. — 2012-01-31