
I've started a course on Markov chains which requires use of conditional probabilities all the time. I don't have much experience with conditional probabilities from my previous courses in probability theory. It's natural to wonder then, which of the standard results/identities/tricks continue to hold for conditional probabilities?

Suppose $A,B,A_j$ are events for all $j$. Then we know that $$ (1) \quad P(A \cup B) = P(A) + P(B) - P(A \cap B)\\ (2) \quad P(A^c) = 1-P(A) \\ (3) \quad P(A_1 \cap A_2 \cap \cdots \cap A_n) = \prod_j P(A_j) \quad \text{if $A_1,\dots,A_n$ are mutually independent}\\ (4) \quad A \subset B \quad \implies P(A) \leq P(B)\\ (5) \quad P(\varnothing) = 0 \\ \vdots $$ (If I have forgotten some standard tricks, feel free to edit them in.)

My question is then, do all the standard tricks we use from ordinary probability theory continue to hold for conditional probabilities in general? i.e. does $$P(A \cup B) = P(A) + P(B) - P(A \cap B) \implies P(A \cup B|C) = P(A|C) + P(B|C) - P(A \cap B|C) \\ P(A^c) = 1 - P(A) \implies P(A^c|C) = 1 - P(A|C)\\ \vdots $$ and so on for all the tricks?

  • The "conditioned" probability theory is the "ordinary" theory; it is the case when the variables are independent that is special. 2017-02-04

3 Answers


Yes, the standard tricks used in ordinary probability theory continue to hold for conditional probabilities.

You can verify this with a Venn diagram.

In general, when you talk about the conditional probability of $A$ given $B$, you are actually reducing your sample space to the event that $B$ occurs.
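The "reduced sample space" view is easy to sanity-check on a toy example. Here is a quick sketch in Python (the fair die and the particular events are my own choices, not from the answer):

```python
from fractions import Fraction

# Fair six-sided die: conditioning on B = {even} reduces the sample
# space from {1,...,6} to {2,4,6}, with probabilities renormalized.
omega = range(1, 7)
p = {w: Fraction(1, 6) for w in omega}

B = {2, 4, 6}                      # conditioning event
pB = sum(p[w] for w in B)

def cond(A):
    """P(A | B), computed as P(A ∩ B) / P(B)."""
    return sum(p[w] for w in A & B) / pB

A = {1, 2, 3}
# The complement rule survives conditioning:
assert cond(set(omega) - A) == 1 - cond(A)
# And conditioning agrees with renormalizing over the reduced space B:
assert cond(A) == Fraction(len(A & B), len(B))
```

Here $P(A \mid B) = 1/3$, exactly the proportion of $A$'s outcomes inside the reduced space $\{2,4,6\}$.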


All of the tricks should still work. The "tricks" hold for any probability function, that is, any function $P$ that is:

  1. non-negative,

  2. normalized: $P(\Omega) = 1$ (where $\Omega$ is your entire sample space), and

  3. additive: $P(A) = \sum_{\omega\in A}P(\omega)$ (in a discrete sample space).

You can verify that a conditional probability $P(\,\cdot \mid B)$, given that $P(B)\neq 0$, is just another probability function.

So, all the tricks work for any probability function, so if I give you a probability function $P'$, you'd expect the tricks to work. What if $P'$ takes the form $$P'(A) = P(A\mid B)?$$ It doesn't matter. It's still a probability function, so things that are true for probability functions are true for conditional probabilities.
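The claim that $P'(\cdot) = P(\cdot \mid B)$ satisfies all three defining properties can be checked exhaustively on a small discrete space. A minimal sketch (the weights and the event $B$ are my own toy choices):

```python
from fractions import Fraction
from itertools import chain, combinations

# Toy discrete space with unequal weights summing to 1.
p = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 8), 4: Fraction(1, 8)}
omega = set(p)

B = {1, 3}                                   # conditioning event, P(B) != 0
pB = sum(p[w] for w in B)

def p_prime(A):
    """P'(A) = P(A | B) = P(A ∩ B) / P(B)."""
    return sum(p[w] for w in A & B) / pB

# Enumerate every subset of omega.
subsets = [set(s) for s in chain.from_iterable(
    combinations(omega, r) for r in range(len(omega) + 1))]

assert all(p_prime(A) >= 0 for A in subsets)               # 1. non-negative
assert p_prime(omega) == 1                                 # 2. P'(Ω) = 1
assert all(p_prime(A) == sum(p_prime({w}) for w in A)      # 3. additive
           for A in subsets)
```

Since $P'$ passes all three checks, every identity derived from those properties alone automatically transfers to it.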

  • I want to accept this as the answer. For completeness, can you explain what happens if we consider $P'(A|C)$? Is that $P'(A|C) = P(A|B|C)$ or something else, such as $P'(A|C) = P(A|B \cup C)$? How can we define and do math on two conditional statements induced by the $P'$ probability function? Thanks. 2017-02-04

Yes.

For your first identity in particular, it is easy to see that:

$P((A \cup B) \cap C) = $

$P((A \cap C) \cup (B \cap C)) =$

$P(A \cap C) + P(B \cap C) - P(A \cap C \cap B \cap C) =$

$P(A \cap C) + P(B \cap C) - P(A \cap B \cap C)$

Moreover:

$P((A \cup B) \cap C) = P(A \cup B|C) * P(C)$

$P(A \cap C) = P(A|C)*P(C)$

$P(B \cap C) = P(B|C)*P(C)$

$P(A \cap B \cap C) = P(A \cap B|C)*P(C)$

So plug these into the equality we just derived, and divide both sides by $P(C)$ (which is fine, since conditional probabilities assume the conditioning event is possible, i.e. $P(C) \neq 0$), and you obtain:

$P(A \cup B|C) = P(A|C) + P(B|C) - P(A\cap B|C)$

Other kinds of equalities that hold for 'unconditional' probabilities can likewise be extended to conditional probabilities.