
I have to prove $\sup_k|P_{n}(k)-\pi_{\lambda}(k)| \leq \frac{\lambda^2}{n}$, where $P_{n}(k)$ is the distribution of a binomial random variable with parameters $(n, \frac{\lambda}{n})$ and $\pi_{\lambda}(k)$ is Poisson with parameter $\lambda$.

I've shown that if $\xi=\sum_{j=1}^{n}\xi_j$, where $\xi_j \sim \text{Ber} \left(\frac{\lambda}{n}\right)$, and $\eta=\sum_{j=1}^{n} \eta_j$, where $\eta_j \sim \text{Poi} \left(\frac{\lambda}{n}\right)$ are independent, then $\xi$ has probability distribution $P_{n}(k)$ and $\eta$ has distribution $\pi_{\lambda}(k)$.

Any suggestions will be massively appreciated.

  • There isn't any "then" after your "if", so it's not clear to me what you are saying you have shown. (2011-07-04)

1 Answer


See pages 20-21 of the book *Stochastic Processes and Models*, by David Stirzaker, here.

Adapting the linked solution to your case and notation, choose a joint distribution $f(x,y)$ for $(\xi_j, \eta_j)$ as follows:
$$ f(0,0) = 1 - \lambda /n, $$
$$ f(1,0) = e^{ - \lambda /n} - 1 + \lambda /n, $$
$$ f(1,y) = \frac{(\lambda /n)^y e^{ - \lambda /n} }{y!}, \;\; y \geq 1, $$
$$ f(0,y) = 0,\;\; y \geq 1. $$

Note that $f(\cdot,\cdot)$ indeed determines a joint distribution, as
$$ f(0,0) + f(1,0) + \sum\limits_{y = 1}^\infty \big( f(1,y) + f(0,y) \big) = 1. $$
Further note that $\xi_j$ and $\eta_j$ are not independent under this joint distribution. With this coupling,
$$ {\rm P}(\xi _j \ne \eta _j ) = f(1,0) + {\rm P}(\eta _j \ge 2) = \lambda /n - (\lambda /n)e^{ - \lambda /n} \le (\lambda /n)^2 . $$

Next, using your notation,
$$ |{\rm P}(\xi = k) - {\rm P}(\eta = k)| \le {\rm P}(\xi \ne \eta ) \le {\rm P}(\xi _j \ne \eta _j \;{\rm for}\;{\rm some}\; j) $$
(the first inequality is derived in the edit below; the second is trivial, since $\xi \ne \eta$ implies $\xi_j \ne \eta_j$ for some $j$). Thus,
$$ |{\rm P}(\xi = k) - {\rm P}(\eta = k)| \leq \sum\limits_{j = 1}^n {{\rm P}(\xi _j \ne \eta _j )} \leq n \cdot \frac{\lambda^2}{n^2} = \frac{\lambda ^2 }{n}, $$
and so
$$ |P_n (k) - \pi _\lambda (k)| \le \frac{\lambda ^2 }{n}. $$
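As a numerical sanity check of the final bound (an illustration, not part of the proof), the sketch below computes $\sup_k |P_n(k) - \pi_\lambda(k)|$ directly from the two pmfs, using only the Python standard library; the function names are my own:

```python
import math

def binom_pmf(k, n, p):
    # P_n(k): Binomial(n, p) mass at k (zero outside 0..n)
    if not 0 <= k <= n:
        return 0.0
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # pi_lambda(k): Poisson(lam) mass at k
    return math.exp(-lam) * lam**k / math.factorial(k)

def sup_diff(n, lam):
    # sup_k |P_n(k) - pi_lambda(k)|; beyond k = n plus a short tail
    # both masses are negligible, so a finite range of k suffices
    p = lam / n
    return max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam))
               for k in range(n + 50))

# the theorem guarantees sup_diff(n, lam) <= lam**2 / n
for n, lam in [(10, 2.0), (100, 2.0), (50, 1.5)]:
    assert sup_diff(n, lam) <= lam**2 / n
```

In practice the actual supremum is much smaller than $\lambda^2/n$; the bound from the coupling argument is convenient, not tight.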

EDIT: Derivation of the inequality $$ |{\rm P}(\xi = k) - {\rm P}(\eta = k)| \le {\rm P}(\xi \ne \eta ) . $$ From $$ {\rm P}(\xi = k) - {\rm P}(\eta = k) = {\rm P}(\xi = k,\eta \ne k) + {\rm P}(\xi = k,\eta = k) - {\rm P}(\eta = k) $$ and ${\rm P}(\xi = k,\eta = k) \leq {\rm P}(\eta = k)$, it follows that $$ {\rm P}(\xi = k) - {\rm P}(\eta = k) \leq {\rm P}(\xi = k,\eta \ne k) \leq {\rm P}(\xi \neq \eta). $$ Analogously, ${\rm P}(\eta = k) - {\rm P}(\xi = k) \leq {\rm P}(\xi \neq \eta)$; hence the result.

Also, in view of the proof (above the edit), it is worth noting that, by Boole's inequality, $$ {\rm P}(\xi _j \ne \eta _j \;{\rm for}\;{\rm some}\; j) = {\rm P}(\xi_1 \neq \eta_1 \vee \cdots \vee \xi_n \neq \eta_n) \leq \sum\limits_{j = 1}^n {{\rm P}(\xi _j \ne \eta _j )} . $$
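The coupling $f$ above can also be checked numerically. The sketch below (my own illustration; the variable names are assumptions) tabulates $f$ for one value of $a = \lambda/n$, truncating the infinite sum over $y$, and confirms the normalization, the marginals, and the mismatch probability:

```python
import math

def f(x, y, a):
    # joint pmf of (xi_j, eta_j) from the answer, with a = lambda/n
    if x == 0 and y == 0:
        return 1 - a
    if x == 1 and y == 0:
        return math.exp(-a) - 1 + a
    if x == 1 and y >= 1:
        return a**y * math.exp(-a) / math.factorial(y)
    return 0.0  # f(0, y) = 0 for y >= 1

a, YMAX = 0.02, 60  # truncate the sum over y at YMAX; the tail is negligible

total = sum(f(x, y, a) for x in (0, 1) for y in range(YMAX))
xi_one = sum(f(1, y, a) for y in range(YMAX))   # P(xi_j = 1), should be a
eta_zero = f(0, 0, a) + f(1, 0, a)              # P(eta_j = 0), should be e^{-a}
mismatch = f(1, 0, a) + sum(f(1, y, a) for y in range(2, YMAX))

assert abs(total - 1) < 1e-12                        # f sums to 1
assert abs(xi_one - a) < 1e-12                       # xi_j ~ Ber(a)
assert abs(eta_zero - math.exp(-a)) < 1e-12          # eta_j ~ Poi(a) at 0
assert abs(mismatch - (a - a * math.exp(-a))) < 1e-12
assert mismatch <= a**2                              # the key bound
```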

  • Further note that the joint distribution $f(\cdot,\cdot)$ yields the correct marginal distributions for $\xi_j$ and $\eta_j$. (2011-07-04)
  • I can see that $\frac{\lambda}{n} -\frac{\lambda}{n}e^{-\frac{\lambda}{n}} \leq \frac{\lambda}{n}$, but why is it less than $\Big(\frac{\lambda}{n}\Big)^2$? (2011-07-04)
  • @sigma.z.1980: Let $x=\lambda / n \,( > 0)$, so you want to show $x - xe^{-x} \leq x^2$, or (after dividing by $x$) $1 - e^{-x} \leq x$. The last inequality can be easily proved by comparing the derivatives of $1 - e^{-x}$ and $x$, which are $e^{-x}$ and $1$, respectively. (2011-07-04)
  • OK, yes, of course: it's just the Taylor series expansion of $e^{-x}$ up to the second term. I regret I didn't see this before. (2011-07-05)
  • @sigma: Note, however, that from $e^{-x} = 1 - x + x^2/2! - x^3/3! + \cdots$ it is not clear that $e^{-x} \geq 1 - x$ (though it is true). (2011-07-05)
  • That is because each positive term is larger than the negative term that follows it, if I'm not mistaken. (2011-07-05)
  • @sigma: But $x^3 / 3! > x^2 / 2!$ for $x > 3$. I recommend proving the inequality $1-e^{-x} \leq x$ by comparing derivatives, as indicated above. (2011-07-05)
  • I got the point. For the second inequality, $|P(\xi=k)-P(\eta=k)| \leq P(\xi \neq \eta)$, do I have to show explicitly that $|\binom{n}{k}(\frac{\lambda}{n})^k(1-\frac{\lambda}{n})^{n-k}-e^{-\lambda}\frac{\lambda^k}{k!}| \leq \frac{\lambda}{n}-\frac{\lambda}{n}e^{-\frac{\lambda}{n}}$? (2011-07-05)
  • @sigma: No. I'll show how to derive that inequality later on. (2011-07-05)
  • @sigma: Done... (2011-07-05)
  • Do you think you made a typo here? The sum should be the upper bound: $|{\rm P}(\xi = k) - {\rm P}(\eta = k)| \le {\rm P}(\xi \ne \eta ) \le {\rm P}(\xi _j \ne \eta _j ,\;{\rm for}\;{\rm some}\; j)$, since you sum over $j$ to obtain the inequality. (2011-07-05)
  • OK, so you used the law of total probability and then Boole's inequality. (2011-07-05)
  • @sigma: Since $\xi \neq \eta$ implies $\xi_j \neq \eta_j$ for some $j$, ${\rm P}(\xi \neq \eta) \leq {\rm P}(\xi_j \neq \eta_j,\;{\rm for}\;{\rm some}\; j)$. (2011-07-05)
  • But of course! I regret this problem wasn't touched upon in any of my stats courses. I don't find it easy at all, although it is supposed to be. (2011-07-05)