If I have independent, non-identically distributed Bernoulli random variables $x_1, x_2,\ldots, x_n$, how do I show that $$\mathrm{Pr}\left(\sum_{i=1}^nx_i>\beta\mu\right)\le e^{-g(\beta)\mu}$$

where $\beta>1$, $\mu=E\left(\sum_{i=1}^nx_i\right)$, and $g(\beta)=\beta\ln\beta-\beta+1$? I believe this can be accomplished using Markov's inequality (because that's what we've been covering), but I'm still not sure how to apply it.

1 Answer


To set up Markov's inequality, note that, since $\beta > 1$, the map $t \mapsto \beta^t$ is strictly increasing, so

$$P(\sum_{i=1}^n x_i > \beta \mu) = P(\beta^{\sum_{i=1}^n x_i} > \beta^{\beta \mu})$$

Now apply Markov's inequality to the right-hand side (the random variable $\beta^{\sum_{i=1}^n x_i}$ is nonnegative), and use the independence of the $x_i$ to factor the expectation into a product:

$$\leq \frac{E[\beta^{\sum_{i=1}^n x_i}]}{\beta^{\beta \mu}} = \frac{\prod_{i=1}^nE[\beta^{x_i}]}{\beta^{\beta \mu}}=\frac{\prod_{i=1}^nE[\beta^{x_i}]}{e^{(\beta \ln \beta) \mu}}$$

Next, $E[\beta^{x_i}] = (1-p_i)\beta^0 + p_i\beta^1 = 1 + p_i(\beta-1)$, where $p_i$ is the mean of the $i$-th Bernoulli random variable. Using $1+t \le e^t$ for all real $t$, observe that

$$\prod_{i=1}^nE[\beta^{x_i}] = \prod_{i=1}^n(1 + p_i(\beta-1)) \leq \prod_{i=1}^n e^{p_i(\beta-1)}$$ $$= e^{(\beta - 1)\sum_{i=1}^n p_i} = e^{(\beta - 1)\mu}$$
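As a quick sanity check (not part of the proof), here is a minimal sketch that verifies these two facts numerically by exact enumeration, assuming a few hypothetical values for the $p_i$ and $\beta = 2$:

```python
# Sanity check (hypothetical p_i and beta; not part of the proof):
# verify E[beta^(sum x_i)] = prod(1 + p_i*(beta-1)) <= e^{(beta-1)*mu}.
import itertools
import math

p = [0.1, 0.3, 0.5, 0.7]   # hypothetical Bernoulli means p_i
beta = 2.0

# Exact E[beta^(sum x_i)] by enumerating all outcomes of (x_1, ..., x_n).
mgf = 0.0
for xs in itertools.product([0, 1], repeat=len(p)):
    prob = math.prod(pi if x == 1 else 1 - pi for pi, x in zip(p, xs))
    mgf += prob * beta ** sum(xs)

product_form = math.prod(1 + pi * (beta - 1) for pi in p)  # prod E[beta^{x_i}]
mu = sum(p)
upper = math.exp((beta - 1) * mu)                          # e^{(beta-1) mu}

print(mgf, product_form, upper)   # mgf == product_form <= upper
```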

Putting it all together, we get $$P(\sum_{i=1}^n x_i > \beta \mu) \leq e^{(-\beta \ln \beta +\beta -1)\mu} = e^{-g(\beta)\mu}$$

QED
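If it helps to see the final bound in action, here is a rough Monte Carlo sketch (assuming NumPy, hypothetical $p_i$, and $\beta = 2$) comparing the empirical tail probability with $e^{-g(\beta)\mu}$:

```python
# Monte Carlo illustration of the bound (hypothetical p_i and beta = 2).
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.1, 0.2, 0.3, 0.4, 0.5])          # hypothetical Bernoulli means
beta = 2.0
mu = p.sum()                                      # mu = 1.5, so beta*mu = 3
g = beta * np.log(beta) - beta + 1                # g(beta) = beta ln(beta) - beta + 1

samples = rng.random((200_000, p.size)) < p       # each row is one draw of (x_1, ..., x_n)
tail = (samples.sum(axis=1) > beta * mu).mean()   # empirical P(sum x_i > beta*mu)
bound = np.exp(-g * mu)                           # e^{-g(beta) mu}

print(f"empirical tail ~ {tail:.4f}, Chernoff bound = {bound:.4f}")
```

With this hypothetical setup the empirical tail comes out around $0.02$, well below the bound of roughly $0.56$; the bound is loose here, but it holds.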

  • How are you showing that $E[\beta^{X_i}] = 1 + p_i(\beta-1)$? (2017-10-16)
  • If $p_i = P[X_i=1]$, then $E[\beta^{X_i}] = \beta^0 P[X_i=0] + \beta^1 P[X_i=1] = 1 - p_i + p_i\beta$. (2017-10-16)
  • @GautamShenoy thank you. (2017-10-16)