Given a probability distribution function $F(x)$, consider other probability distribution functions $F_1$ and $F_2$ such that $aF_1(x)+bF_2(x)=F(x)$ for some constants $a,b$ and all $x$. Under what conditions on $F_1$ and $F_2$ do we have $F_1(x)(1-F_1(x))+F_2(x)(1-F_2(x)) \ge F(x)(1-F(x))$?
Some inequality
1 Answer
In order for $a F_1 + b F_2$ to be a probability distribution function, you need $a + b = 1$. I'll assume you're interested in the case $0 < a < 1$. If $F = a F_1 + (1 - a) F_2$, then $G = F_1 (1-F_1) + F_2 (1 - F_2) - F (1 - F) = (F_1 - F_2)^2 a^2 + (F_1 - F_2)(2 F_2 - 1) a + F_1 - F_1^2$. The coefficient of $a^2$ is a square and the constant term equals $F_1(1 - F_1) \ge 0$, so both are certainly nonnegative. So one sufficient condition is that $(F_1 - F_2)(2 F_2 - 1) \ge 0$, i.e. either $F_1 \ge F_2 \ge 1/2$ or $F_1 \le F_2 \le 1/2$.
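For a quick sanity check of this expansion (my own sketch, not part of the answer), one can verify the quadratic-in-$a$ form symbolically; here `f1` and `f2` are placeholder symbols for the values $F_1(x)$ and $F_2(x)$ at a fixed $x$:

```python
# Symbolic check of the quadratic-in-a expansion of G (sketch; f1, f2 stand for F_1(x), F_2(x)).
import sympy as sp

a, f1, f2 = sp.symbols('a f1 f2')
F = a*f1 + (1 - a)*f2                       # value of the mixture CDF at x
G = f1*(1 - f1) + f2*(1 - f2) - F*(1 - F)   # quantity we want to be nonnegative

claimed = (f1 - f2)**2*a**2 + (f1 - f2)*(2*f2 - 1)*a + f1 - f1**2
print(sp.expand(G - claimed))               # prints 0, confirming the expansion
```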
By symmetry, it also holds if $F_2 \ge F_1 \ge 1/2$ or $F_2 \le F_1 \le 1/2$. On the other hand, when $F_2 < 1/2 < F_1$ or $F_1 < 1/2 < F_2$, the minimum of $G$ over $a$ is attained at $a = \frac{1/2 - F_2}{F_1 - F_2}$, which lies in $(0,1)$ in these cases, and there $G = F_1 - F_1^2 + F_2 - F_2^2 - 1/4$. Note that the curve $F_1 - F_1^2 + F_2 - F_2^2 = 1/4$ is a circle of radius $1/2$ centred at $(1/2,1/2)$. So the condition for $G \ge 0$ to hold for all $0 \le a \le 1$ is that $(F_1, F_2)$ avoids the two regions of the unit square outside that circle where $F_1$ and $F_2$ lie on opposite sides of $1/2$, namely the regions near the corners $(1,0)$ and $(0,1)$.
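As a purely numerical illustration (again my own sketch; the sample pairs are arbitrary), one can scan $a$ over $[0,1]$ and compare the sign of $\min_a G$ with the circle test $(F_1 - 1/2)^2 + (F_2 - 1/2)^2 \le 1/4$:

```python
# Numerical sketch: compare the minimum over a of G with the circle condition for a few (F1, F2) pairs.
import numpy as np

def min_G(f1, f2, n=100001):
    a = np.linspace(0.0, 1.0, n)
    F = a*f1 + (1 - a)*f2
    return np.min(f1*(1 - f1) + f2*(1 - f2) - F*(1 - F))

# Arbitrary test values with F2 < 1/2 < F1: (0.9, 0.2) sits on the circle, (0.95, 0.05) lies outside it.
for f1, f2 in [(0.6, 0.45), (0.9, 0.2), (0.95, 0.05)]:
    inside = (f1 - 0.5)**2 + (f2 - 0.5)**2 <= 0.25
    print(f"F1={f1}, F2={f2}: min G = {min_G(f1, f2):+.4f}, inside circle: {inside}")
```

The minimum is nonnegative exactly for the pairs inside (or on) the circle, matching the region described above.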