
Here is a problem I couldn't solve.

Let $(\Omega, \mathcal{A},\mathbb{P})$ be a probability space, and let $\mathcal{F}, \mathcal{G}, \mathcal{B}$ be sub-$\sigma$-algebras of $\mathcal{A}$.

Is it true that conditional independence of $\mathcal{B}$ and $\mathcal{F}$ given $\mathcal{G}$, which we denote by $\mathcal{F}\;\amalg_{\mathcal{G}} \mathcal{B}$, implies the following property:

$\mathbb{P}(F\cap B \mid G) = \mathbb{P}(F \mid G)\,\mathbb{P}(B \mid G)$ almost surely, for all $(F,G,B)\in \mathcal{F} \times \mathcal{G}\times \mathcal{B}$,

where I write
$\mathbb{P}(A \mid G) = \mathbb{E}[1_A \mid \sigma(G)]$ for all $A\in \mathcal{A}$, or more generally, for any sub-$\sigma$-algebra $\mathcal{C}$ of $\mathcal{A}$,
$\mathbb{P}(A \mid \mathcal{C}) = \mathbb{E}[1_A \mid \mathcal{C}]$ for all $A\in \mathcal{A}$,

and where conditional independence $\mathcal{F}\;\amalg_{\mathcal{G}} \mathcal{B}$ is defined by:

$\mathbb{P}(F\cap B \mid \mathcal{G}) = \mathbb{P}(F \mid \mathcal{G})\,\mathbb{P}(B \mid \mathcal{G})$ almost surely, for all $(F,B)\in \mathcal{F} \times \mathcal{B}$.
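To make the notation concrete, here is a minimal sketch on a finite space (the die, the events and the helper names are my own illustrative choices): assuming $0<\mathbb{P}(G)<1$, $\mathbb{P}(A \mid G)=\mathbb{E}[1_A\mid\sigma(G)]$ is the random variable equal to $\mathbb{P}(A\cap G)/\mathbb{P}(G)$ on $G$ and to $\mathbb{P}(A\cap G^c)/\mathbb{P}(G^c)$ on $G^c$.

```python
# Illustration of the notation P(A | G) = E[1_A | sigma(G)] on a finite space.
# The die, the events and the helper names are my own choices, just to fix ideas.
from fractions import Fraction

Omega = frozenset(range(1, 7))                 # a fair six-sided die
P = {w: Fraction(1, 6) for w in Omega}

def prob(E):
    """P(E) for an event E, i.e. a subset of Omega."""
    return sum(P[w] for w in E)

def cond_prob_given_event(A, G):
    """P(A | G) = E[1_A | sigma(G)], returned as a function omega -> value.

    sigma(G) = {emptyset, G, G^c, Omega}, so this conditional expectation
    is constant on G and constant on G^c.
    """
    Gc = Omega - G
    on_G  = prob(A & G)  / prob(G)             # value taken on G
    on_Gc = prob(A & Gc) / prob(Gc)            # value taken on G^c
    return lambda w: on_G if w in G else on_Gc

A = frozenset({2, 4, 6})                       # "the outcome is even"
G = frozenset({1, 2, 3})                       # "the outcome is at most 3"
PA_given_G = cond_prob_given_event(A, G)
print([PA_given_G(w) for w in sorted(Omega)])  # 1/3 on {1,2,3}, 2/3 on {4,5,6}
```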

I think this implication is not true but I can't find a counter-example.

Moreover, I think the reverse implication holds true (see the edit below).

Best Regards

Edit: I thought I had a proof of the reverse implication, but, as gnometorule suggested, it wasn't correct: I rechecked it and found a mistake. So I take that last claim back; I don't know whether the reverse implication is true or not.

  • @gnometorule: Unless there is something obvious I miss (which I hope), it only seems to be a restatement, because rewriting all this under the expectation operator I couldn't derive the result. Best Regards (2012-01-06)

2 Answers


The first implication is not true, since for $G = \Omega$ it would mean that $\mathcal{F}$ and $\mathcal{B}$ are unconditionally independent as soon as they are conditionally independent given some $\sigma$-algebra $\mathcal{G}$, which is absurd.

Edit: I feel the need to write this down in more detail:

$\mathbb{E}[1_F \mid \sigma(\Omega)] = \frac{\mathbb{E}[1_\Omega 1_F]}{\mathbb{P}(\Omega)}\,1_\Omega = \mathbb{E}[1_F] = \mathbb{P}(F)$, and similarly $\mathbb{E}[1_B \mid \sigma(\Omega)] = \mathbb{P}(B)$ and $\mathbb{E}[1_{B \cap F} \mid \sigma(\Omega)] = \mathbb{P}(B \cap F)$. The claimed property would therefore give $\mathbb{P}(B \cap F) = \mathbb{P}(F)\,\mathbb{P}(B)$ for all $(F,B) \in \mathcal{F} \times \mathcal{B}$, which is to say that $\mathcal{F}$ and $\mathcal{B}$ are independent.
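For a concrete instance of this failure, here is a small exact-arithmetic sketch (the construction, a randomly biased coin tossed twice, and all names in the code are my own choices, not taken from the post): the two tosses are conditionally independent given the bias, yet unconditionally dependent, so the property in the question fails for $G = \Omega$.

```python
# Counterexample on a finite space: a coin whose bias theta is 1/4 or 3/4 with
# equal probability, then two tosses that are independent given theta.
# F = "first toss is heads", B = "second toss is heads", G = sigma(theta).
from fractions import Fraction
from itertools import product

biases = {"lo": Fraction(1, 4), "hi": Fraction(3, 4)}
P = {}
for name, th in biases.items():
    for x, y in product((0, 1), repeat=2):
        px = th if x else 1 - th
        py = th if y else 1 - th
        P[(name, x, y)] = Fraction(1, 2) * px * py   # P(theta) * P(x | theta) * P(y | theta)

Omega = set(P)
def prob(E): return sum(P[w] for w in E)

F = {w for w in Omega if w[1] == 1}
B = {w for w in Omega if w[2] == 1}
atoms_of_G = [{w for w in Omega if w[0] == name} for name in biases]  # atoms of sigma(theta)

# Conditional independence given G: the factorization holds on every atom of G.
for G_atom in atoms_of_G:
    lhs = prob(F & B & G_atom) / prob(G_atom)
    rhs = (prob(F & G_atom) / prob(G_atom)) * (prob(B & G_atom) / prob(G_atom))
    assert lhs == rhs

# But unconditionally (i.e. taking G = Omega in the claimed property) F and B are dependent:
print(prob(F & B), prob(F) * prob(B))   # 5/16 vs 1/4
```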

Regarding the other answer posted here: if $\mathcal{G} = \mathcal{A}$, there is no contradiction. It is then simply true that $\mathcal{F}$ and $\mathcal{B}$ must be independent, as you have neatly shown. However, $\mathcal{G}$ here is given and fixed, as are $\mathcal{F}$ and $\mathcal{B}$; what varies is $G \in \mathcal{G}$.

On the other hand, the reverse implication holds true, because

$\mathbb{E}\big[\mathbb{E}[1_{F \cap B} \mid \mathcal{G}]\,1_G\big] = \mathbb{E}\big[\mathbb{E}[1_{F \cap B} \mid G]\,1_G\big] = \mathbb{E}\big[\mathbb{E}[1_F \mid G]\,\mathbb{E}[1_B \mid G]\,1_G\big] = \mathbb{E}\big[\mathbb{E}\big[\mathbb{E}[1_B \mid G]\,1_F \mid G\big]\,1_G\big]$

$= \mathbb{E}\big[\mathbb{E}\big[\mathbb{E}[1_B \mid G]\,1_F \mid \mathcal{G}\big]\,1_G\big] = \mathbb{E}\big[\mathbb{E}\big[\mathbb{E}[1_F \mid G]\,1_B \mid \mathcal{G}\big]\,1_G\big] = \mathbb{E}\Big[\mathbb{E}[1_F \mid G]\,\underbrace{1_G\,\mathbb{E}[1_B \mid \mathcal{G}]}_{=\,1_{G'},\ G' \in \mathcal{G}}\Big] = \mathbb{E}\big[\mathbb{E}[1_F \mid \mathcal{G}]\,\mathbb{E}[1_B \mid \mathcal{G}]\,1_G\big].$
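As a purely finite sanity check of this chain of equalities, and nothing more (when $\mathcal{G}$ is generated by a finite partition, the hypothesis applied to the atoms already contains the conclusion, so the genuinely delicate case of a non-atomic $\mathcal{G}$ is out of reach here), one can brute-force a small example; the space, the partition and the masses below are arbitrary choices of mine.

```python
# Brute-force sanity check of the reverse implication on one small finite space:
# Omega = {0,...,5} with a non-uniform probability, G generated by the two-block
# partition {0,1,2} / {3,4,5}.  For every pair (F, B) satisfying the event-wise
# factorization for all G in the sigma-algebra, we confirm the factorization
# given the sigma-algebra (i.e. on both atoms).
from fractions import Fraction
from itertools import chain, combinations

Omega = frozenset(range(6))
P = {w: Fraction(w + 1, 21) for w in Omega}       # masses 1/21, ..., 6/21
def prob(E): return sum(P[w] for w in E)

atoms = [frozenset({0, 1, 2}), frozenset({3, 4, 5})]
G_events = [atoms[0], atoms[1], Omega]            # nonempty events of sigma(partition)

def factorizes_given_event(F, B, G):
    """Does P(F n B | sigma(G)) = P(F | sigma(G)) P(B | sigma(G)) hold a.s.?"""
    for piece in (G, Omega - G):
        if prob(piece) == 0:
            continue
        lhs = prob(F & B & piece) / prob(piece)
        rhs = (prob(F & piece) / prob(piece)) * (prob(B & piece) / prob(piece))
        if lhs != rhs:
            return False
    return True

def powerset(s):
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

for F in map(frozenset, powerset(Omega)):
    for B in map(frozenset, powerset(Omega)):
        hypothesis = all(factorizes_given_event(F, B, G) for G in G_events)
        conclusion = all(factorizes_given_event(F, B, A) for A in atoms)
        if hypothesis:
            assert conclusion   # never fails on this example
```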

Intuitively, if you reduce the information given, you cannot guarantee that two events will remain independent. On the other hand, if two events are independent, adding more information on top of the existing information won't change that.

  • That is true, I was thinking about that same part after I wrote the answer. (2012-01-09)

Suppose we take $\mathcal{G}$ as large as possible, i.e. $\mathcal{G}=\mathcal{A}$.

Then, for each (integrable) random variable $Z$ we have $\mathbb{E}[Z\mid\mathcal{G}]=Z$ almost surely, since $Z$ is already $\mathcal{G}$-measurable. Applying this to your definition of conditional independence, and using $1_{F\cap B}=1_F 1_B$, gives $\mathcal{F}\;\amalg_{\mathcal{G}} \mathcal{B}$ automatically. Now $\Omega\in \mathcal{G}$, so if the claimed result were to hold, we would have $\mathbb{P}(B\cap F)=\mathbb{P}(B)\,\mathbb{P}(F)$ for all $B\in\mathcal{B}$, $F\in\mathcal{F}$, i.e. $\mathcal{F}$ and $\mathcal{B}$ would have to be independent, which of course need not be the case.
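If it helps, here is a tiny numerical companion to this argument (the die and the two events are my own choices, only meant as an illustration): conditional independence given $\mathcal{A}$ holds automatically since $1_{F\cap B}=1_F 1_B$, while the two chosen events are clearly not independent.

```python
# With G = A, conditional independence is automatic, because
# E[1_{F n B} | A] = 1_{F n B} = 1_F * 1_B = E[1_F | A] * E[1_B | A] pointwise,
# yet the two events below are not independent, so the claimed property
# applied to G = Omega (i.e. P(F n B) = P(F) P(B)) fails.
from fractions import Fraction

Omega = set(range(1, 7))                  # a fair die
P = {w: Fraction(1, 6) for w in Omega}
def prob(E): return sum(P[w] for w in E)

F, B = {1, 2, 3}, {1}
assert all(int(w in F & B) == int(w in F) * int(w in B) for w in Omega)  # 1_{F n B} = 1_F 1_B
print(prob(F & B), prob(F) * prob(B))     # 1/6 vs 1/12 -- not independent
```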