3

If I calculate $\Pr(X \mid E_1, E_2, E_3, \dots)$ using Bayes' Theorem all at once (applying Bayes' Theorem with the joint observations $E_1, E_2, E_3, \dots$) vs. one update at a time (i.e. starting with my prior, updating it with $E_1$, then updating the result with $E_2$, etc.), will I get the same result? How might I demonstrate this mathematically?

  • 0
    Alternatively, has anyone seen a proof online? 2017-01-08

1 Answer

4

I'll just do the case of two updates. You can generalize the argument by induction without much difficulty.

I assume that by an "update" of the prior $P$ on an event $E$, you mean the probability measure $P_E$ defined by $P_E(\cdot) = P(\cdot \mid E)$.

We want to show that $P_{E_1 \cap E_2} = (P_{E_1})_{E_2}$, where the left-hand side is the "all at once" update, and the right-hand side is the "one at a time" update. There's no need to use Bayes' Theorem; using just the definition of conditional probability we have

$$P_{E_1 \cap E_2}(X) = P(X \mid E_1 \cap E_2) = \frac{P(X \cap E_1 \cap E_2)}{P(E_1 \cap E_2)}.$$

Now, for the one at a time case, we have

$$(P_{E_1})_{E_2}(X) = P_{E_1}(X \mid E_2) = \frac{P_{E_1}(X \cap E_2)}{P_{E_1}(E_2)} = \frac{P(X \cap E_2 \mid E_1)}{P(E_2 \mid E_1)} = \frac{P(X \cap E_2 \cap E_1)/P(E_1)}{P(E_2 \cap E_1)/P(E_1)} = \frac{P(X \cap E_2 \cap E_1)}{P(E_2 \cap E_1)} = P_{E_1 \cap E_2}(X).$$

So all at once and one at a time yield the same probability measure.
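To see the identity numerically, here is a small sketch on a toy discrete sample space (the outcomes, events, and prior weights below are made up for illustration); it computes the "all at once" conditional measure and the "one at a time" measure and checks that they agree exactly, using `Fraction` to avoid floating-point noise:

```python
from fractions import Fraction

# Toy sample space {0, 1, 2, 3} with an illustrative prior (weights are arbitrary).
prior = {0: Fraction(1, 2), 1: Fraction(1, 4), 2: Fraction(1, 8), 3: Fraction(1, 8)}

X  = {0, 1}      # event of interest
E1 = {0, 1, 2}   # first observation
E2 = {1, 2, 3}   # second observation

def prob(p, A):
    """P(A) under the measure p."""
    return sum(p[w] for w in A if w in p)

def update(p, E):
    """Return the conditional measure p(. | E), i.e. P_E."""
    pE = prob(p, E)
    return {w: (p[w] / pE if w in E else Fraction(0)) for w in p}

# All at once: condition on E1 ∩ E2.
all_at_once = update(prior, E1 & E2)

# One at a time: condition on E1, then condition the result on E2.
one_at_a_time = update(update(prior, E1), E2)

print(prob(all_at_once, X))    # → 2/3
print(prob(one_at_a_time, X))  # → 2/3
assert all_at_once == one_at_a_time
```

The two resulting measures are identical on every outcome, not just on $X$, matching the conclusion $P_{E_1 \cap E_2} = (P_{E_1})_{E_2}$.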

  • 0
    How might $P_{E_1}$ be defined? Is it like the probability given $E_1$? 2017-01-08
  • 1
    Yes. I've edited to clarify. 2017-01-08
  • 0
    This is probably a dumb question, but the answer you gave doesn't directly use Bayes' Theorem. How do I know that multiplying my posterior from $E_1$ by the likelihood of $E_2$ and normalizing gives $P(X \mid E_1, E_2)$? Thanks so much! 2017-01-08
  • 0
    I'm sorry, I'm not sure I understand your question. Could you point out which step of the proof you're worried about? Note that Bayes' Theorem is just a trivial consequence of the definition of conditional probability, which is all my answer uses. 2017-01-08
  • 0
    Ahh, that helps -- thank you so much! If I'm using Bayes' Theorem for the second update, I'd have $P_{E_1}(X \mid E_2) = \frac{P_{E_1}(E_2 \mid X) \cdot P_{E_1}(X)}{P_{E_1}(E_2)}$, right? Expanding, would this become $$P(X \mid E_1, E_2) = \frac{P(E_2 \mid X, E_1) \cdot P(X \mid E_1)}{P(E_2 \mid E_1)}$$? 2017-01-08