
Given $0<p_{i}<1$ and $0<q_{i}<1$ for $i=1,2,\dots$, define $E_{k}$ as

$$
\begin{aligned}
E_{1} &= \min(p_{1},\,q_{1})+\min(1-p_{1},\,1-q_{1}),\\
E_{2} &= \min(p_{1}p_{2},\,q_{1}q_{2})+\min(p_{1}(1-p_{2}),\,q_{1}(1-q_{2}))\\
&\quad+\min((1-p_{1})p_{2},\,(1-q_{1})q_{2})+\min((1-p_{1})(1-p_{2}),\,(1-q_{1})(1-q_{2})),\\
E_{3} &= \min(p_{1}p_{2}p_{3},\,q_{1}q_{2}q_{3})+\min(p_{1}p_{2}(1-p_{3}),\,q_{1}q_{2}(1-q_{3}))+\cdots
\end{aligned}
$$

and so on; in general, $E_{k}=\sum_{s\in\{0,1\}^{k}}\min\!\left(\prod_{i=1}^{k}p_{i}^{s_{i}}(1-p_{i})^{1-s_{i}},\ \prod_{i=1}^{k}q_{i}^{s_{i}}(1-q_{i})^{1-s_{i}}\right)$.

Notice that $E_{k+1}$ is formed by splitting each summand of $E_{k}$ into two, using $p_{k+1}$ and $q_{k+1}$. Notice also that $E_{k}$ has $2^{k}$ terms.
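For concreteness, here is a small brute-force sketch (Python; my own illustration, not part of the question) that computes $E_k$ directly from the definition by enumerating all $2^k$ strings; the function name and example sequences are arbitrary choices:

```python
from itertools import product

def E(k, p, q):
    """Sum, over all 2^k binary strings, of the min of the two product probabilities."""
    total = 0.0
    for s in product((0, 1), repeat=k):
        prob_p = prob_q = 1.0
        for bit, pi, qi in zip(s, p, q):
            prob_p *= pi if bit else 1.0 - pi
            prob_q *= qi if bit else 1.0 - qi
        total += min(prob_p, prob_q)
    return total

# Example sequences (arbitrary): p_i = 0.6, q_i = 0.3 for all i.
p, q = [0.6] * 12, [0.3] * 12
print([round(E(k, p, q), 4) for k in range(1, 13)])  # visibly decreasing toward 0
```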

$E_{k}$ is actually twice the Bayes error for two equiprobable classes with multivariate Bernoulli distributions.

It is easy to prove that $E_{k+1}\le E_{k}$ for all $k$.
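(One way to see this, my own sketch: each summand $\min(A,B)$ of $E_{k}$ splits into two summands of $E_{k+1}$, and if, say, $A\le B$, then

$$\min(Ap_{k+1},\,Bq_{k+1})+\min(A(1-p_{k+1}),\,B(1-q_{k+1}))\le Ap_{k+1}+A(1-p_{k+1})=A=\min(A,B),$$

with the symmetric argument when $B\le A$; summing over all $2^{k}$ summands gives $E_{k+1}\le E_{k}$.)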

I would like to prove that,

$$\forall\,\varepsilon\in(0,1):\quad \text{if } |p_{i}-q_{i}|>\varepsilon \text{ for all } i, \text{ then } \lim_{n\to\infty}E_{n}=0.$$

Any ideas?

1 Answer


Here is an old trick. Your question can be restated as follows. Suppose that $x_i$ are independent random variables such that $P(x_i=1)=p_i$ and $P(x_i=0)=1-p_i$; suppose that $y_i$ are defined similarly with the $q_i$. Your task is to show that, no matter how the $x_i$ and $y_i$ are coupled, the probability that $x_i=y_i$ for all $i$ is small. Choose $a_i=\operatorname{sign}(p_i-q_i)$ and consider $X=\sum a_ix_i$ and $Y=\sum a_iy_i$. Then $EX-EY\ge n\varepsilon$ and $VX,VY\le n$. The probability in question is $P(X=Y)\le P(X\le M)+P(Y\ge M)$, where $M=\frac 12(EX+EY)$. By the Chebyshev inequality, both probabilities on the right are at most $\frac 4{\varepsilon^2 n}$. If you want to get a better (exponential) bound, just use the CLT instead of Chebyshev.
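As a numerical illustration of the argument (my own sketch, not from the answer; the sequences and $\varepsilon$ are arbitrary choices): the two Chebyshev terms together give $P(X=Y)\le 8/(\varepsilon^{2}n)$, which we can compare with the exact $E_n$ computed by enumeration. Note the bound only becomes nontrivial for $n>8/\varepsilon^{2}$, while $E_n$ itself decays much faster:

```python
from itertools import product

def E(n, p, q):
    # Exact E_n: sum over all 2^n strings of the min of the product probabilities.
    total = 0.0
    for s in product((0, 1), repeat=n):
        prob_p = prob_q = 1.0
        for bit, pi, qi in zip(s, p, q):
            prob_p *= pi if bit else 1.0 - pi
            prob_q *= qi if bit else 1.0 - qi
        total += min(prob_p, prob_q)
    return total

eps = 0.3
p = [0.55] * 16          # arbitrary sequences with |p_i - q_i| = 0.35 > eps
q = [0.20] * 16
for n in (4, 8, 12, 16):
    print(n, round(E(n, p, q), 6), round(8 / (eps**2 * n), 3))
```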

  • What we need here is that this bound can be attained by some coupling: for every string, just allocate the minimum of the two probabilities to the same place for both processes, after which you can extend arbitrarily to the whole space. – 2011-07-06
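(A small sketch of that maximal coupling, my own illustration: put mass $\min(P(s),Q(s))$ on the diagonal pair $(s,s)$; at each $s$ at most one of the two leftovers is positive, so no extension of the leftover mass can add further matches, and $P(X=Y)$ equals $E_n$ exactly.)

```python
from itertools import product

def string_prob(s, r):
    # Probability of binary string s under independent Bernoulli(r_i) coordinates.
    prob = 1.0
    for bit, ri in zip(s, r):
        prob *= ri if bit else 1.0 - ri
    return prob

p, q = [0.6, 0.7, 0.55], [0.2, 0.3, 0.15]   # arbitrary small example
diag = 0.0
for s in product((0, 1), repeat=3):
    Ps, Qs = string_prob(s, p), string_prob(s, q)
    m = min(Ps, Qs)
    diag += m   # mass the coupling places on the pair (s, s)
    # At most one leftover is nonzero, so however the remaining mass
    # is coupled off the diagonal, it cannot create additional matches.
    assert Ps - m == 0.0 or Qs - m == 0.0
print(diag)     # = E_3: P(X = Y) under the maximal coupling
```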