Suppose I have events $A,B,C,D,\ldots$ and that my probability space is discrete and finite (as nice as possible). Now, suppose I give you the values of, say, $P(A,B)$, $P(A\mid B,C)$, $P(D,B\mid A)$, etc. When can I determine whether I have enough information to calculate a quantity like $P(C,D\mid A,B)$? In other words, I give you a bunch of single, joint, and conditional probabilities. Which functions of the form $f(x_1,x_2,\ldots,x_1',x_2',\ldots) := P(x_1,x_2,\ldots \mid x_1',x_2',\ldots)$ are calculable? Notice that by basic conditioning rules I can reduce most statements to sums and ratios of probabilities of unions or intersections. I don't expect there to be a "closed form" answer to any of this, but is there at least some general overview?
Conditions for an "Algebra" of probabilities
1 Answer
If the events are $A_j$, $j=1\ldots n$, express everything in terms of the probabilities $p_k$ of the $2^n$ events $\bigcap_j B_j$ where each $B_j$ is either $A_j$ or $A_j^c$. The probability of an event is the sum of some subset of the $p_k$; a conditional probability is the quotient of two such sums. So specifying some probabilities and/or conditional probabilities amounts to linear equality constraints on the $p_k$ (plus, in the case of conditional probabilities, the requirement that the denominator is nonzero). Deciding whether a certain probability is determined by these constraints plus $p_k \ge 0$ and $\sum_k p_k = 1$ is then a linear programming problem.
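As a concrete sketch of this reduction (the two events, the atom indexing, and the specific given probabilities here are my own illustrative choices, not from the answer): with $n=2$ events there are $2^2=4$ atoms $p_k$, every (unconditional) probability is a 0/1-weighted sum of the $p_k$, and `scipy.optimize.linprog` can report the range of such a sum over the feasible polytope. A probability is determined exactly when its minimum and maximum coincide.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical setup: two events A1, A2, so 2^2 = 4 atoms p_k, indexed by
# the bitmask of which events occur (bit j set  <=>  A_{j+1} occurs):
#   k=0: A1^c ∩ A2^c,  k=1: A1 ∩ A2^c,  k=2: A1^c ∩ A2,  k=3: A1 ∩ A2

def indicator(event):
    """0/1 vector selecting the atoms consistent with `event`,
    a dict {j: occurs} of constraints on individual events."""
    v = np.zeros(4)
    for k in range(4):
        if all(bool(k >> j & 1) == occ for j, occ in event.items()):
            v[k] = 1.0
    return v

# Given information as linear equality constraints on the p_k:
#   sum_k p_k = 1,  P(A1) = 0.5,  P(A1 ∩ A2) = 0.2
A_eq = np.array([np.ones(4), indicator({0: True}), indicator({0: True, 1: True})])
b_eq = np.array([1.0, 0.5, 0.2])

def prob_range(c):
    """Min and max of sum(c * p) over the feasible polytope."""
    lo = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
    hi = linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
    return lo.fun, -hi.fun

# P(A1 ∩ A2) is pinned down at 0.2; P(A2) is only known to lie in [0.2, 0.7].
print(prob_range(indicator({0: True, 1: True})))  # ≈ (0.2, 0.2): determined
print(prob_range(indicator({1: True})))           # ≈ (0.2, 0.7): undetermined
```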
Deciding whether a conditional probability is determined is slightly trickier, but still decidable using linear programming. Suppose you're interested in the quotient $a/b$ coming from a conditional probability. First test whether the constraints are inconsistent or $b=0$ is possible: either of these means $a/b$ does not have a well-defined value. Otherwise, find one possible value $r$ of $a/b$. Then maximize and minimize $a - r b$ subject to the constraints. In order for $a/b$ to be uniquely determined, the minimum and maximum must both be $0$ (since $b > 0$ everywhere on the feasible set, $a/b = r$ holds exactly when $a - rb = 0$).
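The three steps can be sketched as follows, again on a hypothetical two-event instance of my own choosing (atoms indexed by bitmask as before, with $P(A_1)=0.5$ and $P(A_1\cap A_2)=0.2$ given, querying $P(A_2\mid A_1)$):

```python
import numpy as np
from scipy.optimize import linprog

# Atoms p_0..p_3 indexed by bitmask (bit 0 = A1 occurs, bit 1 = A2 occurs).
# Query: P(A2 | A1) = a/b with a = p_3 (A1 ∩ A2) and b = p_1 + p_3 (A1).
a = np.array([0.0, 0.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 0.0, 1.0])
# Constraints: sum p_k = 1,  P(A1) = 0.5,  P(A1 ∩ A2) = 0.2.
A_eq = np.array([[1, 1, 1, 1], [0, 1, 0, 1], [0, 0, 0, 1]], dtype=float)
b_eq = np.array([1.0, 0.5, 0.2])

def solve(c):
    return linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))

# Step 1: is b = 0 possible?  Minimize b; infeasibility or a minimum of 0
# means a/b has no well-defined value.
res = solve(b)
assert res.status == 0 and res.fun > 1e-9, "a/b is not well defined"

# Step 2: take any feasible point and compute the candidate ratio r.
p = res.x
r = (a @ p) / (b @ p)

# Step 3: a/b is uniquely determined iff min and max of a - r*b are both 0.
g = a - r * b
lo, hi = solve(g).fun, -solve(-g).fun
determined = abs(lo) < 1e-9 and abs(hi) < 1e-9
print(r, determined)  # 0.4 True: P(A2 | A1) is pinned down at 0.4
```

Here $a$ and $b$ happen to be individually determined, so the conclusion is easy to check by hand ($0.2/0.5 = 0.4$); the same code handles cases where the ratio is determined even though numerator and denominator separately are not.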