
(In the following I'm only talking about discrete space, variables etc.)

The definition of independence is such that, when two events are dependent, it does not tell us exactly what form the dependence takes.

Let me specify this: suppose we have two dependent random variables $X,Y$ on our space $\Omega$; that means there exist $A,B\subseteq\mathbb{R}$ such that the events $X^{-1}(A)\subseteq\Omega$ and $Y^{-1}(B)\subseteq\Omega$ are dependent. But how does this translate into what we can say about the distribution of, say, $Y$, if we know the distribution of $X$? What is the connection between their distributions?

I'm asking this since here, right at the beginning, it says that "two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other", so if the variables are dependent there would have to be a connection between their distributions.

  • ah, yes sorry, I forgot I talked about dependence rather than independence. (2012-12-29)

1 Answer


Conditional probability tells how one variable depends (pointwise) on another. Assuming discrete distributions: $P(Y=y \mid X=x) = \frac{P(X=x \cap Y=y)}{P(X=x)}$
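As a small sketch of this formula, assuming a made-up joint distribution over two binary variables (the table entries below are illustrative, not taken from the question), the conditional probability can be computed directly from the joint:

```python
# Hypothetical joint distribution P(X=x, Y=y) of two dependent discrete
# variables; the values are invented for illustration and sum to 1.
joint = {
    (0, 0): 0.30,
    (0, 1): 0.20,
    (1, 0): 0.10,
    (1, 1): 0.40,
}

def p_x(joint, x):
    """Marginal P(X=x), obtained by summing the joint over all y."""
    return sum(p for (xv, _), p in joint.items() if xv == x)

def p_y_given_x(joint, y, x):
    """Conditional P(Y=y | X=x) = P(X=x, Y=y) / P(X=x)."""
    return joint.get((x, y), 0.0) / p_x(joint, x)

print(p_y_given_x(joint, 1, 0))  # 0.2 / 0.5 = 0.4
print(p_y_given_x(joint, 1, 1))  # 0.4 / 0.5 = 0.8
```

Since $P(Y=1 \mid X=0) \neq P(Y=1 \mid X=1)$, this toy pair is dependent: learning the value of $X$ shifts the distribution of $Y$.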

If they are independent:

$P(Y=y \mid X=x) = \frac{P(X=x)\,P(Y=y)}{P(X=x)} = P(Y=y)$

Conversely, if they are not independent, knowing the value of $X$ changes the distribution of $Y$ (at least for some values of $X$ and $Y$).

> what we can say about the distribution of, say, $Y$, if we know the distribution of $X$

Nothing. But if we know the value of $X$, and assuming that we know the full joint distribution of $(X,Y)$, we can determine the conditional distribution of $Y$, namely $P(Y \mid X)$.
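To illustrate this last point with an assumed toy joint distribution (the numbers are invented for illustration), the full conditional distribution $P(Y \mid X=x)$ can be read off from the joint and compared with the marginal of $Y$; when the two differ, the variables are dependent:

```python
# Hypothetical joint distribution P(X=x, Y=y); values are illustrative only.
joint = {
    (0, 0): 0.30, (0, 1): 0.20,
    (1, 0): 0.10, (1, 1): 0.40,
}

def conditional_of_y(joint, x):
    """Return the conditional distribution P(Y=. | X=x) as a dict y -> prob."""
    px = sum(p for (xv, _), p in joint.items() if xv == x)
    return {yv: p / px for (xv, yv), p in joint.items() if xv == x}

def marginal_of_y(joint):
    """Return the marginal distribution P(Y=.) by summing the joint over x."""
    dist = {}
    for (_, yv), p in joint.items():
        dist[yv] = dist.get(yv, 0.0) + p
    return dist

# The conditional given X=0 differs from the unconditional marginal,
# so in this example X carries information about Y.
print(conditional_of_y(joint, 0))  # P(Y=0|X=0)=0.6, P(Y=1|X=0)=0.4
print(marginal_of_y(joint))        # P(Y=0)=0.4, P(Y=1)=0.6 (up to float rounding)
```

Note that the marginal distribution of $X$ alone would not suffice here; the whole computation runs through the joint distribution.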

  • @user26698: It's just the first equation (I rewrote it with the letters swapped, so that it fits the statement better). (2012-12-29)