
Given $0<p_i<1$ and $0<q_i<1$ for $i=1,\dots,k$, define $E_k$ as

$\begin{aligned} E_{1} &= \min(p_{1},\,q_{1})+\min(1-p_{1},\,1-q_{1}) \\ E_{2} &= \min(p_{1}p_{2},\,q_{1}q_{2})+\min(p_{1}(1-p_{2}),\,q_{1}(1-q_{2})) \\ &\quad +\min((1-p_{1})p_{2},\,(1-q_{1})q_{2})+\min((1-p_{1})(1-p_{2}),\,(1-q_{1})(1-q_{2})) \\ E_{3} &= \min(p_{1}p_{2}p_{3},\,q_{1}q_{2}q_{3})+\min(p_{1}p_{2}(1-p_{3}),\,q_{1}q_{2}(1-q_{3}))+\cdots \end{aligned}$

etc.

Notice that $E_{k+1}$ is formed by splitting each summand of $E_k$ into two, using $p_{k+1}$ and $q_{k+1}$. Notice also that $E_k$ has $2^k$ terms.
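To make the definition concrete, here is a small sketch that evaluates $E_k$ directly by enumerating all $2^k$ binary outcomes (the function name `E` is mine, not from the question):

```python
from itertools import product

def E(p, q):
    """E_k = sum over all binary vectors b of
    min( prod_i p_i^{b_i}(1-p_i)^{1-b_i}, prod_i q_i^{b_i}(1-q_i)^{1-b_i} ),
    i.e. the sum of coordinatewise minima of two product-Bernoulli pmfs."""
    total = 0.0
    for b in product((0, 1), repeat=len(p)):  # all 2^k outcomes
        pp = qq = 1.0
        for pi, qi, bi in zip(p, q, b):
            pp *= pi if bi else 1 - pi
            qq *= qi if bi else 1 - qi
        total += min(pp, qq)
    return total

# E_1 with p_1 = 0.3, q_1 = 0.5: min(0.3, 0.5) + min(0.7, 0.5) = 0.8
print(E([0.3], [0.5]))
```

When $p_i = q_i$ for all $i$ the two products coincide and the minima sum to $1$, matching the fact that $E_k = 1$ exactly when the two distributions are identical.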

$E_k$ is in fact twice the Bayes error for two equiprobable classes with multivariate Bernoulli distributions.

I would like to prove that for any $1 \leq m \leq k$,

$\forall\,\varepsilon,\; 0<\varepsilon<1-q_{m}:\ \text{if } q_{m}-p_{m}>0 \text{ then } E'_{k}\leq E_{k},$ where $E'_k$ is obtained from $E_k$ by replacing $q_m$ with $q_m+\varepsilon$.
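This is not a proof, but the claim can at least be checked empirically. The sketch below (all names are mine) evaluates $E_k$ by brute force, then perturbs a randomly chosen $q_m$ satisfying $q_m > p_m$ by a random $\varepsilon \in (0, 1-q_m)$ and verifies that $E_k$ does not increase:

```python
import random
from itertools import product

def E(p, q):
    # sum over all 2^k outcomes of the minimum of the two
    # product-Bernoulli probabilities
    total = 0.0
    for b in product((0, 1), repeat=len(p)):
        pp = qq = 1.0
        for pi, qi, bi in zip(p, q, b):
            pp *= pi if bi else 1 - pi
            qq *= qi if bi else 1 - qi
        total += min(pp, qq)
    return total

random.seed(0)
k = 4
for _ in range(1000):
    p = [random.random() for _ in range(k)]
    q = [random.random() for _ in range(k)]
    m = random.randrange(k)
    if q[m] <= p[m]:
        p[m], q[m] = q[m], p[m]          # enforce q_m > p_m
    eps = random.uniform(0.0, 1 - q[m])  # eps in (0, 1 - q_m)
    q2 = q.copy()
    q2[m] = q[m] + eps
    assert E(p, q2) <= E(p, q) + 1e-12   # claimed: E'_k <= E_k
print("ok")
```

For intuition, fix all coordinates except $m$ and group the $2^k$ terms in pairs sharing the same $b_{-m}$: each pair has the form $\min(ap_m, cq_m) + \min(a(1-p_m), c(1-q_m))$ for constants $a, c \geq 0$, and when $q_m > p_m$ this is (piecewise) non-increasing in $q_m$, which is what the experiment reflects.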
