
I have difficulties in the very beginning of probability theory.

A (discrete) random variable $X$ is a function from a finite or countably infinite sample space $S$ to the real numbers. The probability density function of the random variable $X$ is $$f(x)=\Pr\{X=x\}=\sum_{s\in S:X(s)=x}\Pr\{s\}.$$ If I want to know the probability that $X=x_1$ or $X=x_2$, then $$\Pr\{X=x_1 \text{ or } X=x_2\}=\sum_{s\in S:X(s)=x_1\text{ or }X(s)=x_2}\Pr\{s\}.$$ Since the events $X=x_1$ and $X=x_2$ are mutually exclusive, I can split the sum: $$\sum_{s\in S:X(s)=x_1}\Pr\{s\}+\sum_{s\in S:X(s)=x_2}\Pr\{s\}=\Pr\{X=x_1\}+\Pr\{X=x_2\}.$$ The problem is the case where there are two independent random variables. How can I show that $$\Pr\{X=x \text{ and } Y=y\}=\Pr\{X=x\}\Pr\{Y=y\}?$$ Thanks.
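The summation in the question can be checked numerically. Below is a minimal sketch using a hypothetical sample space (two fair coin flips, chosen only for illustration) where $X$ counts heads; it computes $\Pr\{X=x\}$ by summing $\Pr\{s\}$ over outcomes, and verifies that the sum splits over the mutually exclusive events $X=x_1$ and $X=x_2$:

```python
from fractions import Fraction

# Hypothetical sample space: two fair coin flips, each outcome with probability 1/4.
S = {("H", "H"): Fraction(1, 4), ("H", "T"): Fraction(1, 4),
     ("T", "H"): Fraction(1, 4), ("T", "T"): Fraction(1, 4)}

def X(s):
    # X counts the number of heads in the outcome s.
    return s.count("H")

def pr_X(x):
    # Pr{X = x} = sum of Pr{s} over all s in S with X(s) = x.
    return sum(p for s, p in S.items() if X(s) == x)

def pr_X_either(x1, x2):
    # Pr{X = x1 or X = x2}: sum over all s with X(s) equal to x1 or x2.
    return sum(p for s, p in S.items() if X(s) in (x1, x2))

# The events {X = 0} and {X = 2} are disjoint, so the sum splits:
assert pr_X_either(0, 2) == pr_X(0) + pr_X(2)
```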

  • .... there are two independent random variables. How can I show that $\Pr\{X=x\ \text{ and } Y=y\}=\Pr\{X=x\}\Pr\{Y=y\}$? You cannot **show** the desired result, in the sense that it is part of the **definition** of independent random variables: $X$ and $Y$ are said to be independent (discrete) random variables if and only if $\Pr\{X=x\ \text{ and } Y=y\}=\Pr\{X=x\}\Pr\{Y=y\}$ holds for _all_ choices of $x$ and $y$. So once you say that $X$ and $Y$ are independent, there is nothing left to **show** or **prove**. – 2012-10-18

2 Answers


$\Pr\{X=x\ \text{ and } Y=y\}=\Pr\{X=x\mid Y=y \}\,\Pr\{Y=y\}$ by the definition of conditional probability.

$\Pr\{X=x\mid Y=y \}=\Pr\{X=x\}$ if $X$ and $Y$ are independent random variables.

So $\Pr\{X=x\ \text{ and } Y=y\}=\Pr\{X=x \}\,\Pr\{Y=y\}$ for independent random variables.


Added: perhaps you want something like

$$\sum_{s\in S:X(s)=x \text{ and }Y(s)=y}\Pr\{s\} = \frac{\sum_{s\in S:X(s)=x \text{ and }Y(s)=y}\Pr\{s\}}{\sum_{s\in S:Y(s)=y}\Pr\{s\}} \sum_{s\in S:Y(s)=y}\Pr\{s\}$$ $$= \frac{\sum_{s\in S_{Y(s)=y}:X(s)=x }\Pr\{s\}}{\sum_{s\in S_{Y(s)=y}}\Pr\{s\}} \sum_{s\in S:Y(s)=y}\Pr\{s\}= \sum_{s\in S:X(s)=x }\Pr\{s\} \sum_{s\in S:Y(s)=y}\Pr\{s\}$$

where $S_{Y(s)=y}$ means $\{s\in S:Y(s)=y\}$, and the final equality is due to independence.
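The factorization above can also be verified directly on a small example. The sketch below uses a hypothetical product sample space (a fair die roll paired with a fair coin flip, chosen only for illustration), on which $X$ and $Y$ are independent; it checks that the joint probability equals the product of the marginals for every pair $(x, y)$:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sample space: pairs (die roll, coin flip), 12 equally likely outcomes,
# so X (the roll) and Y (the flip) are independent by construction.
S = {(d, c): Fraction(1, 12) for d, c in product(range(1, 7), "HT")}

def X(s):
    return s[0]  # the die roll

def Y(s):
    return s[1]  # the coin flip

def pr(pred):
    # Probability of an event = sum of Pr{s} over the outcomes satisfying pred.
    return sum(p for s, p in S.items() if pred(s))

# Pr{X = x and Y = y} = Pr{X = x} * Pr{Y = y} for all x and y.
for x in range(1, 7):
    for y in "HT":
        joint = pr(lambda s: X(s) == x and Y(s) == y)
        assert joint == pr(lambda s: X(s) == x) * pr(lambda s: Y(s) == y)
```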


To answer your comment to Henry's answer, yes, $P(X = x \land Y = y)$ can be expressed as a sum of probabilities of elementary events:

$ P(X = x \land Y = y) = \sum_{s \in S: X(s) = x \land Y(s) = y} P(\{s\}) $

More generally, for any predicate $\Phi(X, Y, Z, \dotsc)$ we have:

$ P(\Phi(X,Y,Z,\dotsc)) = \sum_{s \in S: \Phi(X(s),Y(s),Z(s),\dotsc)} P(\{s\}) $

i.e. the probability that the predicate is true is the sum of the probabilities of the elementary events for which it is true.

(Of course, all this is assuming that the sample space $S$ is countable, so that summing over it makes sense. Corresponding statements do hold for uncountable sample spaces too, but the notation gets more complicated since we have to replace the simple sum with an integral over a measure.)
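The predicate formula above is easy to try out in code. The sketch below uses a hypothetical sample space (two fair die rolls, chosen only for illustration) and computes $P(\Phi(X,Y))$ by summing $P(\{s\})$ over the elementary events where the predicate holds:

```python
from fractions import Fraction

# Hypothetical sample space: two rolls of a fair die, 36 equally likely outcomes.
S = {(a, b): Fraction(1, 36) for a in range(1, 7) for b in range(1, 7)}

def X(s):
    return s[0]

def Y(s):
    return s[1]

def pr(phi):
    # P(Phi(X, Y)) = sum of P({s}) over the elementary events s where Phi holds.
    return sum(p for s, p in S.items() if phi(X(s), Y(s)))

# Example predicate: the two rolls sum to 7 (six of the 36 outcomes qualify).
assert pr(lambda x, y: x + y == 7) == Fraction(1, 6)
```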