
Suppose $X$ and $Y$ are i.i.d. random variables taking values in the set $\{1, 2, \dots, n\}$. Does this mean that $P(X=1, Y=1) = P(X=1) \cdot P(Y=1)$, $P(X=1, Y=2) = P(X=1) \cdot P(Y=2)$, and so on?

So there are $\binom{n}{2}$ cases for which $P(X=i, Y=j) = P(X=i) \cdot P(Y=j)$? If we didn't know that they were independent, would we have to check all of these cases?

  • For i.i.d. discrete random variables $X$ and $Y$, $(X,Y) \in \{1,2,\ldots, n\}^2$ has $n^2$ different points where $P\{X = i, Y = j\} = P\{X = i\}P\{Y = j\}$ holds, $i, j \in \{1,2,\ldots, n\}$, not $\binom{n}{2}$ points. To _prove_ dependence, all you need to do is find _one_ $(i,j)$ such that $P\{X = i, Y = j\} \neq P\{X = i\}P\{Y = j\}$. To prove that a given joint pmf corresponds to i.i.d. random variables, you need to work harder. – 2012-03-12
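The brute-force check described in the comment can be sketched in a few lines. This is a hypothetical illustration (the dict layout, function name, and tolerance are my own choices, not from the thread): the joint pmf is stored as a dict mapping $(i,j)$ to $P(X=i, Y=j)$, and the factorization is tested at all $n^2$ points.

```python
from itertools import product

def is_independent(joint, n, tol=1e-12):
    """Check P(X=i, Y=j) == P(X=i) * P(Y=j) at every one of the n^2 points.

    `joint[(i, j)]` gives P(X=i, Y=j) for i, j in {1, ..., n}.
    """
    # Marginals, obtained by summing the joint pmf over the other variable.
    p_x = {i: sum(joint[(i, j)] for j in range(1, n + 1)) for i in range(1, n + 1)}
    p_y = {j: sum(joint[(i, j)] for i in range(1, n + 1)) for j in range(1, n + 1)}
    # Independence requires the product formula to hold at ALL n^2 points.
    return all(abs(joint[(i, j)] - p_x[i] * p_y[j]) < tol
               for i, j in product(range(1, n + 1), repeat=2))

n = 3
# The i.i.d. uniform case: every point has probability 1/n^2.
uniform = {(i, j): 1 / n**2 for i in range(1, n + 1) for j in range(1, n + 1)}
print(is_independent(uniform, n))   # True

# One violated point suffices to prove dependence, e.g. X = Y always:
diagonal = {(i, j): (1 / n if i == j else 0.0)
            for i in range(1, n + 1) for j in range(1, n + 1)}
print(is_independent(diagonal, n))  # False
```

For the diagonal pmf, every marginal probability is $1/n$, so the product $1/n^2$ disagrees with the joint value $1/n$ already at $(1,1)$, which is the single counterexample the comment mentions.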

1 Answer


Knowing $P(X=1, Y= 2) = P(X=1) \cdot P(Y=2)$ does not tell you $P(X=2, Y= 1) = P(X=2) \cdot P(Y=1)$ unless you have extra information.

So if you do not know that they are independent, then you have to check almost all $n^2$ pairs, though you can save a few checks since the probabilities sum to $1$.

To show they are identically distributed, you might also want to check $P(X=3)= P(Y=3)$ etc.
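The point about checking the marginals separately can be illustrated with a small hypothetical joint pmf (the numbers below are my own example, not from the answer): $X$ and $Y$ have identical marginal distributions even though they are dependent, so matching marginals alone prove nothing about independence.

```python
# Hypothetical joint pmf on {1, 2}^2: dependent, but with equal marginals.
n = 2
joint = {(1, 1): 0.4, (1, 2): 0.1,
         (2, 1): 0.1, (2, 2): 0.4}

# Marginals P(X = k) and P(Y = k).
p_x = {i: sum(joint[(i, j)] for j in range(1, n + 1)) for i in range(1, n + 1)}
p_y = {j: sum(joint[(i, j)] for i in range(1, n + 1)) for j in range(1, n + 1)}

# Identically distributed: P(X = k) == P(Y = k) for every k.
identically_distributed = all(abs(p_x[k] - p_y[k]) < 1e-12 for k in range(1, n + 1))
print(identically_distributed)  # True

# Yet independence fails: P(X=1, Y=1) = 0.4, while P(X=1) * P(Y=1) = 0.25.
print(abs(joint[(1, 1)] - p_x[1] * p_y[1]) < 1e-12)  # False
```

Here both marginals are $(0.5, 0.5)$, so the identical-distribution check passes, while the joint value $0.4$ at $(1,1)$ differs from the product $0.25$; this is exactly the distinction the answer (and the follow-up comment) is drawing.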

  • @alexm: yes, though identical marginal distributions do not guarantee identical conditional distributions. – 2012-03-12