
I came across a proof claiming that given arbitrary (discrete) random variables $A$, $B$, $C$, if for every assignment $$P(A | B, C) = P(B | A, C)$$ then it follows that $$ P(A|C) = P(B|C)$$ The proof is very short: $$P(A,B|C) = P(A|B,C)P(B|C) = P(B|A,C)P(A|C)$$ and from the last equation the result follows by cancelling the terms that are equal according to the assumption.

However, I am not completely sure what happens in case $P(A,B|C) = 0$ for some assignment to $A$, $B$, $C$. I've been (unsuccessfully) trying to find some counterexample for such a case. Any ideas towards either direction?
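For what it's worth, the chain-rule identity the proof starts from is easy to sanity-check numerically on a strictly positive joint pmf (so that every conditional is well defined). A throwaway Python sketch; the pmf below is randomly generated and purely illustrative:

```python
import itertools
import random

random.seed(0)

# A hypothetical, strictly positive joint pmf over (A, B, C), each binary,
# so every conditional probability below is well defined.
vals = list(itertools.product([0, 1], repeat=3))
w = [random.random() + 0.1 for _ in vals]
total = sum(w)
p = {v: wi / total for v, wi in zip(vals, w)}  # p[(a, b, c)] = P(A=a, B=b, C=c)

def P(pred):
    """Probability of the event {(a, b, c) : pred(a, b, c)}."""
    return sum(pr for (a, b, c), pr in p.items() if pred(a, b, c))

for a, b, c in vals:
    Pc   = P(lambda x, y, z: z == c)            # P(C=c)
    Pbc  = P(lambda x, y, z: (y, z) == (b, c))  # P(B=b, C=c)
    Pac  = P(lambda x, y, z: (x, z) == (a, c))  # P(A=a, C=c)
    Pabc = p[(a, b, c)]                         # P(A=a, B=b, C=c)
    lhs   = Pabc / Pc                  # P(A=a, B=b | C=c)
    via_b = (Pabc / Pbc) * (Pbc / Pc)  # P(A=a | B=b, C=c) P(B=b | C=c)
    via_a = (Pabc / Pac) * (Pac / Pc)  # P(B=b | A=a, C=c) P(A=a | C=c)
    assert abs(lhs - via_b) < 1e-12 and abs(lhs - via_a) < 1e-12
```

The check says nothing about the degenerate case I'm asking about; it only confirms both factorizations agree when all the conditioning events have positive probability.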

  • Do you (really) mean $A,B,C$ are discrete random variables or simply *events*? (2012-01-30)
  • Which, if any, of the following are true statements? $$\text{If}~ P(A) = P(B), \text{then}~ P(A|B) = P(B|A).$$ $$\text{If}~ P(A|B) = P(B|A), \text{then}~ P(A) = P(B).$$ If you get the right answer to the above question, what happens if you condition everything on $C$? (2012-01-30)
  • Surely, if $P(A,B|C) = 0$, then both sides of the equality that you're trying to prove are 0? (2012-01-30)
  • @cardinal: No, they are definitely RVs. (2012-01-30)
  • @DilipSarwate: Interesting, I'll think about that. (2012-01-30)
  • @DavidWallace: Why is that? (2012-01-30)
  • If $A$ and $B$ are random variables, what is $P(A\mid B)$? (2012-01-30)
  • @DidierPiau: Just a convenient notation for $P(A=a|B=b)$. (2012-01-30)
  • In other words, the objects that really concern you are the three EVENTS $[A=a]$, $[B=b]$ and $[C=c]$, for some fixed $(a,b,c)$. You could call these events $A$, $B$ and $C$, in agreement with 99% of the literature on the subject... (2012-01-30)
  • @DidierPiau: No, what (I think) the proof intends to say is that the equations hold for every possible assignment $a, b, c$ to $A, B, C$. (2012-01-30)
  • Would you indicate the source? (2012-01-31)
  • @3lectrologos (to your question addressed to me): because that means that $A$, $B$, $C$ can't all happen at once. (2012-01-31)
  • @DidierPiau: It was just given and solved superficially (in the way I described above) as an exercise in a course I attended. (2012-01-31)

1 Answer

Let $A$ and $B$ denote events such that $P(A|B) = P(B|A)$. Does it follow that $P(A) = P(B)$? At a cursory glance, an unwary student might say that $$P(A|B) = P(B|A) \Rightarrow \frac{P(A\cap B)}{P(B)} = \frac{P(A\cap B)}{P(A)} \Rightarrow \frac{1}{P(A)} = \frac{1}{P(B)} \Rightarrow P(A) = P(B).$$ But a more wary student would note that the second implication divides through by $P(A\cap B)$: if the intersection of $A$ and $B$ is an event of zero probability, then the equality $P(A|B) = P(B|A)$ holds with both sides being zero without it being necessarily true that $P(A) = P(B)$. Thus, if $P(A|B) = P(B|A) > 0$, then $P(A) = P(B)$, but $P(A|B) = P(B|A) = 0$ need not imply that $P(A) = P(B)$.
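The wary student's point can be made concrete on a three-point sample space. A minimal sketch; the uniform measure and the particular events $A$, $B$ are my own illustrative choice, not from the argument above:

```python
from fractions import Fraction

# Uniform probability on Omega = {1, 2, 3}.
omega = frozenset({1, 2, 3})

def Pr(E):
    return Fraction(len(E & omega), len(omega))

A = frozenset({1})      # P(A) = 1/3
B = frozenset({2, 3})   # P(B) = 2/3, and A, B are disjoint

# Both conditional probabilities vanish because P(A ∩ B) = 0 ...
assert Pr(A & B) / Pr(B) == Pr(A & B) / Pr(A) == 0
# ... yet the unconditional probabilities differ.
assert Pr(A) != Pr(B)
```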

Now suppose that $X$ and $Y$ are discrete random variables taking on values in $\{x_i\}$ and $\{y_j\}$ respectively. If $$P\{X = x_i\mid Y = y_j\} = P\{Y = y_j\mid X = x_i\} ~~ \text{for all}~ i, j$$ then, since it cannot be true that $P\{X = x_i, Y = y_j\} = 0$ for all choices of $i$ and $j$, there must be some $i$ and $j$ such that $P\{X = x_i, Y = y_j\} > 0$. All we can deduce from this is that $P\{X = x_i\} = P\{Y = y_j\}$ for those particular $i$ and $j$. But your statement seems to mean that $X$ and $Y$ have identical marginal distributions, and why $x_i$ must necessarily equal $y_j$ is something that I cannot see immediately. As a trivial example, suppose that $X$ and $Y$ are Bernoulli random variables with $$\begin{align*} P\{X = 0 \mid Y = 0\} &= P\{Y = 0 \mid X = 0\} = 0\\ P\{X = 0 \mid Y = 1\} &= P\{Y = 1 \mid X = 0\} = 1\\ P\{X = 1 \mid Y = 0\} &= P\{Y = 0 \mid X = 1\} = 1\\ P\{X = 1 \mid Y = 1\} &= P\{Y = 1 \mid X = 1\} = 0 \end{align*}$$

From the two middle equations, we deduce that $P\{Y=1\} = P\{X=0\}$ and $P\{Y=0\}=P\{X=1\}$, but it does not follow that $P\{X=1\} = P\{Y=1\}$. In other words, $X$ and $Y$ could be Bernoulli random variables with parameters $p$ and $1-p$ respectively where $p \neq \frac{1}{2}$, and thus have different distributions.
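This counterexample (which amounts to $Y = 1 - X$) is easy to verify mechanically. A small sketch using exact rational arithmetic; the value $p = 3/10$ is an arbitrary choice, any $p \neq \frac{1}{2}$ works:

```python
from fractions import Fraction

p = Fraction(3, 10)   # any p != 1/2 works; 3/10 is an arbitrary choice

# Joint pmf of (X, Y) when X ~ Bernoulli(p) and Y = 1 - X.
joint = {(0, 0): Fraction(0), (0, 1): 1 - p,
         (1, 0): p,           (1, 1): Fraction(0)}

PX = {x: joint[(x, 0)] + joint[(x, 1)] for x in (0, 1)}  # marginal of X
PY = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}  # marginal of Y

# Every conditional symmetry P{X=x | Y=y} = P{Y=y | X=x} holds ...
for x in (0, 1):
    for y in (0, 1):
        assert joint[(x, y)] / PY[y] == joint[(x, y)] / PX[x]

# ... yet the marginal distributions differ (p versus 1 - p).
assert PX[1] == p and PY[1] == 1 - p
assert PX[1] != PY[1]
```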

Feel free to insert a $,C$ or $Z = z_i$ to the right of $\mid$ everywhere.

  • Thanks for the answer! However, your trivial example is faulty, because the conditionals don't sum up to 1 (e.g. $P\{X = 0 | Y = 0\} + P\{X = 1 | Y = 0\} = p$). In fact, if you work out the general case of binary-valued RVs, you'll see that if $P\{X = x_i | Y = y_j\} = P\{Y = y_j | X = x_i\}$ holds, then the normalization constraints lead to marginals that are indeed always identical, specifically $P\{X = 0\} = P\{X = 1\} = P\{Y = 0\} = P\{Y = 1\} = 0.5$. (2012-01-31)
  • In the example, $P(U=i\mid V=1-i)=1$ for every $i$ in $\{0,1\}$ and every $U$ and $V$ such that $\{U,V\}=\{X,Y\}$. (2012-01-31)
  • Thanks to @DidierPiau and OP 3lectrologos for pointing out the mistake: I wrote the joint probabilities instead of the conditional probabilities in my displayed equations. I have fixed this and rewritten the text to reinforce the claim that the conditions specified do not necessarily lead to identical distributions for $X$ and $Y$. (2012-01-31)