
I've read on Wikipedia that the sum (or integral, for continuous $Y$) of $P(Y=y|X=x)$ over $y$ is always equal to one. I've attempted a proof of this statement for the discrete (sum) case:

Proof: By the Kolmogorov definition of conditional probability and the Law of Total Probability, with $\{A_k\}$ a partition of the sample space (here $A_k = \{Y = y_k\}$) and $P(B) > 0$,

$$\sum_k P(A_k | B) = \sum_k \frac{P(A_k \cap B)}{P(B)} = \frac{1}{P(B)}\sum_kP(A_k \cap B) = \frac{1}{P(B)}P(B)=1.\ \square$$

Is this a correct proof? How would I prove the above statement for the continuous case, i.e. that $\int_y P(Y=y \mid X=x)\,dy$ always equals one?

1 Answer


Your proof is correct! For the continuous case, your "probability" is actually a density, so you would write \begin{align*} \int_{\mathbb{R}} f(y \mid x)\, dy = \int_{\mathbb{R}} \frac{f(y,x)}{f(x)}\, dy = \frac{1}{f(x)}\int_{\mathbb{R}} f(y,x)\, dy = \frac{f(x)}{f(x)} = 1, \end{align*} since integrating the joint density over the support of $Y$ yields the marginal density of $X$. So the proof is more or less the same.
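As a quick sanity check of the discrete case, here is a small numerical sketch (not from the original answer, and the joint pmf below is an arbitrary made-up example): for any joint pmf $p(x,y)$, the conditional pmf $p(y \mid x) = p(x,y)/p_X(x)$ should sum to $1$ over $y$ for each fixed $x$.

```python
# Hypothetical joint pmf on {0, 1} x {0, 1, 2}; the six values sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.25, (0, 2): 0.15,
    (1, 0): 0.05, (1, 1): 0.20, (1, 2): 0.25,
}

xs = {x for (x, _) in joint}
ys = {y for (_, y) in joint}

for x in sorted(xs):
    # Marginal P(X = x) = sum over y of the joint pmf.
    p_x = sum(joint[(x, y)] for y in ys)
    # Sum over y of the conditional pmf P(Y = y | X = x).
    total = sum(joint[(x, y)] / p_x for y in ys)
    print(x, total)  # each total equals 1 up to floating-point error
```

This is just the finite version of the computation in the answer: dividing by the marginal and summing the joint recovers $p_X(x)/p_X(x) = 1$.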

  • Beautiful -- thank you! Can I prove that the integral over the joint distribution yields the marginal for $X$? (2017-01-05)
  • This is the definition of the marginal density of a joint distribution, so you wouldn't need to prove anything. Here is a reference: https://en.wikipedia.org/wiki/Marginal_distribution (2017-01-05)
  • Excellent. Thank you! Huge kudos for the lightning-fast answer. I'll accept as soon as the time limit allows it :) (2017-01-05)