In Billingsley's Probability and Measure, a family of classes of events on a probability space is said to be independent if, whenever we choose one event from each class of the family, the chosen events are independent (that is, the probability of the intersection of any finite subcollection of them equals the product of their probabilities).
Applying the above definition to the two classes $\{A\}$ and $\{B_1, B_2, \dots, B_m\}$, I deduce the following definition:
An event $A$ is said to be independent of events $B_1, B_2, \dots, B_m$ if $A$ and $B_i$ are independent for every $i = 1, \dots, m$.
However, I saw in someone's private notes that
An event $A$ is said to be independent of events $B_1, B_2, \dots, B_m$ if for every subset $S$ of $\{1, 2, \dots, m\}$, $P[A \mid \cap_{i \in S} B_i] = P[A]$.
Are the two different definitions equivalent?
In other words, is the following true:
$A$ and $B_i$ are independent for every $i = 1, \dots, m$, if and only if for every subset $S$ of $\{1, 2, \dots, m\}$, $P[A \mid \cap_{i \in S} B_i] = P[A]$?
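To make the comparison concrete, here is a minimal Python sketch that checks both conditions on a finite probability space (the function names and the two-coin toy example are only illustrative, not taken from any reference):

```python
from itertools import combinations

def prob(space, event):
    """Probability of an event (a set of outcomes) in a finite probability space."""
    return sum(p for outcome, p in space.items() if outcome in event)

def pairwise_independent(space, A, Bs, tol=1e-12):
    """First definition: P[A & B_i] == P[A] * P[B_i] for every i."""
    pA = prob(space, A)
    return all(abs(prob(space, A & B) - pA * prob(space, B)) <= tol for B in Bs)

def conditional_independent(space, A, Bs, tol=1e-12):
    """Second definition: P[A | intersection of B_i over i in S] == P[A] for every
    nonempty subset S (checked only when the conditioning event has positive probability)."""
    pA = prob(space, A)
    for r in range(1, len(Bs) + 1):
        for S in combinations(Bs, r):
            inter = set.intersection(*S)
            pS = prob(space, inter)
            if pS > 0 and abs(prob(space, A & inter) / pS - pA) > tol:
                return False
    return True

# Toy space: two fair coin tosses, uniform probabilities.
space = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
A  = {("H", "H"), ("H", "T")}   # first toss is heads
B1 = {("H", "H"), ("T", "H")}   # second toss is heads
B2 = {("H", "H"), ("T", "T")}   # the two tosses agree

print(pairwise_independent(space, A, [B1, B2]))    # True
print(conditional_independent(space, A, [B1, B2])) # False: P[A | B1 ∩ B2] = 1
```

On this toy example the first (pairwise) condition holds but the second does not, since $P[A \mid B_1 \cap B_2] = 1 \neq P[A]$, which is part of why I am unsure how the two definitions relate.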
- How should independence between an event and a set of events be correctly defined?
- The same questions when there are infinitely many events $B_i$, $i \in I$?
- Added: What is the definition of independence between an event and a set of events in the Lovász Local Lemma?
Thanks!