
I have to prove $\sup_{k}|P_{n}(k)-\pi_{\lambda}(k)| \leq \frac{\lambda^2}{n}$, where $P_{n}(k)$ is the distribution of a Binomial RV with parameters $(n, \frac{\lambda}{n})$ and $\pi_{\lambda}(k)$ is the Poisson distribution with parameter $\lambda$.

I've shown that if $\xi=\sum_{j=1}^{n}\xi_j$, where the $\xi_j \sim \text{Ber} \left(\frac{\lambda}{n}\right)$ are independent, and $\eta=\sum_{j=1}^{n} \eta_j$, where the $\eta_j \sim \text{Poi} \left(\frac{\lambda}{n}\right)$ are independent, then $\xi$ has distribution $P_{n}(k)$ and $\eta$ has distribution $\pi_{\lambda}(k)$.

Any suggestions will be massively appreciated.

  • There isn't any "then" after your "if", so it's not clear to me what you are saying you have shown. (2011-07-04)

1 Answer


See pages 20-21 of the book Stochastic Processes and Models, by David Stirzaker.

Adapting the linked solution to your case and notation: choose, independently across $j$, a joint distribution $f(x,y)$ for $\xi_j$ and $\eta_j$ as follows:
$$
f(0,0) = 1 - \lambda /n, \qquad f(1,0) = e^{ - \lambda /n} - 1 + \lambda /n,
$$
$$
f(1,y) = \frac{(\lambda /n)^y e^{ - \lambda /n}}{y!}, \quad y \geq 1, \qquad f(0,y) = 0, \quad y \geq 1.
$$
Note that $f(\cdot,\cdot)$ indeed determines a joint distribution, since
$$
f(0,0) + f(1,0) + \sum\limits_{y = 1}^\infty \big[f(1,y) + f(0,y)\big] = 1,
$$
and that its marginals are exactly the ones you want: summing over $y$ gives $\xi_j \sim \text{Ber}(\lambda/n)$, and summing over $x$ gives $\eta_j \sim \text{Poi}(\lambda/n)$. Further note that $\xi_j$ and $\eta_j$ are not independent under this joint distribution. One finds
$$
{\rm P}(\xi _j \ne \eta _j ) = f(1,0) + {\rm P}(\eta _j \ge 2) = \lambda /n - (\lambda /n)e^{ - \lambda /n} \le (\lambda /n)^2 .
$$
Next, using your notation,
$$
|{\rm P}(\xi = k) - {\rm P}(\eta = k)| \le {\rm P}(\xi \ne \eta ) \le {\rm P}(\xi _j \ne \eta _j \ \text{for some}\ j)
$$
(the second inequality is trivial; the first is derived in the edit below). Thus,
$$
|{\rm P}(\xi = k) - {\rm P}(\eta = k)| \leq \sum\limits_{j = 1}^n {\rm P}(\xi _j \ne \eta _j ) \leq \frac{\lambda ^2 }{n},
$$
and so
$$
|P_n (k) - \pi _\lambda (k)| \le \frac{\lambda ^2 }{n}.
$$
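If you want to see the bound numerically, here is a quick sanity check (not part of the argument; the parameter values are arbitrary). It compares the two pmfs with SciPy and prints the largest pointwise gap next to $\lambda^2/n$:

```python
# Quick numerical sanity check of the final bound (not part of the proof):
# compare the Binomial(n, lambda/n) and Poisson(lambda) pmfs directly and
# verify that their largest pointwise gap stays below lambda^2 / n.
import numpy as np
from scipy.stats import binom, poisson

for lam in (0.5, 1.0, 2.0):
    for n in (10, 100, 1000):
        k = np.arange(0, n + 1)  # support of the Binomial distribution
        gap = np.abs(binom.pmf(k, n, lam / n) - poisson.pmf(k, lam))
        assert gap.max() <= lam**2 / n
        print(f"lambda={lam}, n={n}: sup gap = {gap.max():.2e}, "
              f"bound lambda^2/n = {lam**2 / n:.2e}")
```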

EDIT: Derivation of the inequality $|{\rm P}(\xi = k) - {\rm P}(\eta = k)| \le {\rm P}(\xi \ne \eta )$. From
$$
{\rm P}(\xi = k) - {\rm P}(\eta = k) = {\rm P}(\xi = k,\eta \ne k) + {\rm P}(\xi = k,\eta = k) - {\rm P}(\eta = k),
$$
it follows (since ${\rm P}(\xi = k,\eta = k) \le {\rm P}(\eta = k)$) that
$$
{\rm P}(\xi = k) - {\rm P}(\eta = k) \leq {\rm P}(\xi = k,\eta \ne k) \leq {\rm P}(\xi \neq \eta).
$$
Analogously, ${\rm P}(\eta = k) - {\rm P}(\xi = k) \leq {\rm P}(\xi \neq \eta)$; hence the result.

Also, in view of the proof (above the edit), it is worth noting that
$$
{\rm P}(\xi _j \ne \eta _j \ \text{for some}\ j) = {\rm P}(\xi_1 \neq \eta_1 \ \vee \cdots \vee\ \xi_n \neq \eta_n) \leq \sum\limits_{j = 1}^n {\rm P}(\xi _j \ne \eta _j ),
$$
by the union bound.
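For completeness, here is a small Monte Carlo sketch of the coupling $f(x,y)$ above. The sampling scheme (draw $\eta_j$ first, then $\xi_j$ conditionally on $\eta_j$) is just one convenient way to realize that joint law, and the parameter values and variable names are my own:

```python
# Monte Carlo illustration of the coupling f(x, y) defined in the answer.
# Checks the two marginals and the bound P(xi_j != eta_j) <= (lambda/n)^2.
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 50, 500_000
p = lam / n

eta_j = rng.poisson(p, size=reps)  # eta_j ~ Poisson(lambda/n)
# Under f: if eta_j >= 1 then xi_j = 1; if eta_j = 0 then xi_j = 1 with
# conditional probability f(1,0) / P(eta_j = 0) = (e^{-p} - 1 + p) / e^{-p}.
xi_if_eta0 = (rng.random(reps) < (np.exp(-p) - 1 + p) / np.exp(-p)).astype(int)
xi_j = np.where(eta_j >= 1, 1, xi_if_eta0)

print("P(xi_j = 1)      ~", xi_j.mean(), " (target lambda/n =", p, ")")
print("E[eta_j]         ~", eta_j.mean(), " (target lambda/n =", p, ")")
print("P(xi_j != eta_j) ~", (xi_j != eta_j).mean(),
      " exact:", p - p * np.exp(-p), " bound (lambda/n)^2:", p**2)
```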

  • but of course! I regret this problem wasn't touched upon in any of my stats courses. I don't find it easy at all, although it is supposed to be. (2011-07-05)