Suppose $A$ and $B$ are independent events with indicator random variables $I_A$ and $I_B$. How can we describe the distribution of $(I_A + I_B)^2$ in terms of $P(A)$ and $P(B)$? I would think it's sufficient to say:

a) it has 2 values, 0 or 1

b) it has a maximum of 4 and a minimum of 0

c) and that its expected value is $E((I_A)^2)+2E(I_AI_B)+E((I_B)^2)$ which is equal to $P(A)^2+2P(A)P(B)+P(B)^2$

Is there anything else I can say?

  • $(I_A + I_B)^2 = I_A^2 + 2 I_A I_B + I_B^2$, so this RV takes the values 0, 1 and 4. Besides, 'describe the distribution' is quite vague. (2012-10-16)

1 Answer


Let $X:=(\chi_A+\chi_B)^2$, where $\chi_A=I_A$ and $\chi_B=I_B$, and let $\omega\in\Omega$. If $\chi_A(\omega)=0=\chi_B(\omega)$, then $X(\omega)=0$; if $\chi_A(\omega)=1$ and $\chi_B(\omega)=0$, or $\chi_A(\omega)=0$ and $\chi_B(\omega)=1$, then $X(\omega)=1$; and if $\chi_A(\omega)=1=\chi_B(\omega)$, then $X(\omega)=4$. By independence of $A$ and $B$,
$$P\{X=0\}=P(A^c\cap B^c)=(1-P(A))(1-P(B)),$$
$$P\{X=1\}=P(A^c\cap B)+P(A\cap B^c)=(1-P(A))P(B)+P(A)(1-P(B))=P(A)+P(B)-2P(A)P(B),$$
$$P\{X=4\}=P(A\cap B)=P(A)P(B).$$
Denoting by $\mu$ the law of $X$, we can write
$$\mu=(1-P(A))(1-P(B))\,\delta_0+\bigl(P(A)+P(B)-2P(A)P(B)\bigr)\,\delta_1+P(A)P(B)\,\delta_4.$$
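
A quick way to sanity-check these three point masses is a small Monte Carlo simulation. The sketch below (Python/NumPy, not part of the original answer) uses arbitrary illustrative values for $P(A)$ and $P(B)$ and compares empirical frequencies with the formulas above.

```python
# Minimal Monte Carlo sketch checking the law of X = (I_A + I_B)^2
# for independent events A and B; p and q are illustrative choices of
# P(A) and P(B), not values taken from the question.
import numpy as np

rng = np.random.default_rng(0)
p, q = 0.3, 0.6        # P(A), P(B)
n = 1_000_000

I_A = (rng.random(n) < p).astype(int)   # indicator of A
I_B = (rng.random(n) < q).astype(int)   # indicator of B, independent of A
X = (I_A + I_B) ** 2                    # takes values 0, 1, 4

theory = {0: (1 - p) * (1 - q),
          1: p + q - 2 * p * q,
          4: p * q}
for value, prob in theory.items():
    print(value, float(np.mean(X == value)), prob)  # empirical vs. exact
```

With $n$ this large, the empirical frequencies should match the three probabilities to a few decimal places.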