I've started a course on Markov chains which uses conditional probabilities constantly. I don't have much experience with conditional probabilities from my previous courses in probability theory, so it's natural to wonder: which of the standard results/identities/tricks continue to hold for conditional probabilities?
Suppose $A,B,A_j$ are events for all $j$. Then we know that $$ (1) \quad P(A \cup B) = P(A) + P(B) - P(A \cap B)\\ (2) \quad P(A^c) = 1-P(A) \\ (3) \quad P(A_1 \cap A_2 \cap \cdots \cap A_n) = \prod_j P(A_j) \quad \text{if $A_1,\dots,A_n$ are mutually independent}\\ (4) \quad A \subseteq B \implies P(A) \leq P(B)\\ (5) \quad P(\varnothing) = 0 \\ \vdots $$ (If I have forgotten some standard tricks, feel free to edit them in.)
My question, then, is: do all the standard identities from ordinary probability theory continue to hold for conditional probabilities in general? That is, do we have $$P(A \cup B) = P(A) + P(B) - P(A \cap B) \implies P(A \cup B \mid C) = P(A \mid C) + P(B \mid C) - P(A \cap B \mid C) \\ P(A^c) = 1 - P(A) \implies P(A^c \mid C) = 1 - P(A \mid C)\\ \vdots $$ and so on for all the tricks?
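As a sanity check (not a proof), the identities in question can be verified exactly on a small finite sample space. The sketch below uses two fair dice and a hypothetical conditioning event $C$ of my own choosing; the events $A$, $B$, $C$ are just illustrative assumptions, and exact rational arithmetic avoids any floating-point doubt.

```python
from fractions import Fraction

# Finite sample space: two fair six-sided dice, uniform measure.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def P(event, given=None):
    """Exact probability of `event`, optionally conditioned on `given`."""
    space = omega if given is None else [w for w in omega if given(w)]
    return Fraction(sum(1 for w in space if event(w)), len(space))

A = lambda w: w[0] == 6          # first die shows 6
B = lambda w: w[1] == 6          # second die shows 6
C = lambda w: w[0] + w[1] >= 8   # conditioning event: sum is at least 8

# Inclusion-exclusion under conditioning:
# P(A ∪ B | C) = P(A|C) + P(B|C) - P(A ∩ B|C)
lhs = P(lambda w: A(w) or B(w), given=C)
rhs = P(A, given=C) + P(B, given=C) - P(lambda w: A(w) and B(w), given=C)
assert lhs == rhs  # both equal 9/15 = 3/5

# Complement rule under conditioning: P(A^c | C) = 1 - P(A | C)
assert P(lambda w: not A(w), given=C) == 1 - P(A, given=C)

print("both identities hold exactly on this space")
```

Of course, one finite example cannot settle the general question; it only shows these two identities survive conditioning in at least one concrete case.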