
I know that if I have $A=B+N$, where $B$ and $N$ are independent with $B\sim\mathrm{Pois}(\lambda_1)$ and $N\sim\mathrm{Pois}(\lambda_2)$, then the (minimum mean square error) estimator of $B$ given $A$, namely $\mathbb E(B\mid A)$, is linear in $A$.

Is that because the distribution depends only on the first and second order parameters (mean and variance)? How do I show it?

Thanks.
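
For concreteness, here is a quick Monte Carlo sketch (an addition for illustration, with arbitrary parameters $\lambda_1=2$ and $\lambda_2=5$; it assumes numpy): the empirical mean of $B$ over each slice $\{A=a\}$ tracks the line $a\,\lambda_1/(\lambda_1+\lambda_2)$, which is the linear estimator in question.

```python
# Monte Carlo illustration (not part of the original post): compare the
# empirical conditional mean E[B | A = a] with the linear prediction.
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 2.0, 5.0                      # arbitrary example parameters
B = rng.poisson(lam1, 1_000_000)
N = rng.poisson(lam2, 1_000_000)
A = B + N

for a in range(2, 13):
    mask = A == a
    print(a, B[mask].mean(), a * lam1 / (lam1 + lam2))
```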

1 Answer


Consider independent integrable random variables $X$ and $Y$ such that, for some parameter $\theta$, $$ \mathbb E(X\mid X+Y)=\theta\cdot(X+Y). $$ Then $\mathbb E(X)=\theta\cdot\mathbb E(X+Y)$, hence $\theta=\mathbb E(X)/(\mathbb E(X)+\mathbb E(Y))$. Furthermore, for every real $t$, $$ \mathbb E(X\mathrm e^{\mathrm it(X+Y)})=\theta\cdot\mathbb E((X+Y)\mathrm e^{\mathrm it(X+Y)}). $$

Introduce the characteristic functions $\varphi_X$ and $\varphi_Y$ of $X$ and $Y$, defined by $\varphi_X(t)=\mathbb E(\mathrm e^{\mathrm itX})$ and $\varphi_Y(t)=\mathbb E(\mathrm e^{\mathrm itY})$. The independence of $X$ and $Y$ shows that the identity above is equivalent to $$ \varphi_X'(t)\varphi_Y(t)=\theta\cdot(\varphi_X\varphi_Y)'(t)=\theta\cdot(\varphi_X'(t)\varphi_Y(t)+\varphi_X(t)\varphi_Y'(t)). $$ Rearranging gives $(1-\theta)\,\varphi_X'/\varphi_X=\theta\,\varphi_Y'/\varphi_Y$, and integrating with the initial conditions $\varphi_X(0)=\varphi_Y(0)=1$, this differential equation leads readily to $$ \varphi_X(t)^{1-\theta}=\varphi_Y(t)^{\theta}. $$

To sum up, $\mathbb E(X\mid X+Y)=\theta\cdot(X+Y)$ if and only if the distributions of $X$ and $Y$ are linked by the identity $\varphi_X^{1-\theta}=\varphi_Y^{\theta}$ and $\theta=\mathbb E(X)/(\mathbb E(X)+\mathbb E(Y))$. In particular, it is necessary that $\varphi_Y^{\theta/(1-\theta)}$ is a characteristic function.
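
The characteristic-function identity can be checked numerically. A minimal sketch (an addition, assuming numpy; the parameters are arbitrary and kept small so that the complex powers computed by numpy stay on the principal branch):

```python
# Check (illustration only) that phi_X(t)^(1-theta) = phi_Y(t)^theta when
# X ~ Pois(lam), Y ~ Pois(mu) and theta = lam / (lam + mu), using
# phi(t) = exp(lambda * (exp(it) - 1)).
import numpy as np

lam, mu = 0.8, 1.5                         # small, so |Im log phi| < pi
theta = lam / (lam + mu)                   # theta = E(X) / (E(X) + E(Y))

t = np.linspace(-5.0, 5.0, 201)
phi_X = np.exp(lam * (np.exp(1j * t) - 1))
phi_Y = np.exp(mu * (np.exp(1j * t) - 1))

# Both sides equal exp(lam*mu/(lam+mu) * (exp(it) - 1)); the gap is ~ 1e-16.
print(np.max(np.abs(phi_X ** (1 - theta) - phi_Y ** theta)))
```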

In the setting of the question, $Y=N$ is Poisson with parameter $\mu$, hence, for every positive $\alpha$, $(\varphi_Y)^\alpha$ is a characteristic function, namely the characteristic function of the Poisson distribution with parameter $\alpha\mu$. This explains the result when $X=B$ is Poisson with parameter $\lambda$, since the equations $\lambda=\alpha\mu$ and $\alpha=\theta/(1-\theta)$ yield $\theta=\lambda/(\lambda+\mu)$.
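
This also matches a standard fact that the answer does not spell out: conditionally on $A=a$, $B$ follows the Binomial$(a,\lambda/(\lambda+\mu))$ distribution, so $\mathbb E(B\mid A)=\theta A$ with $\theta=\lambda/(\lambda+\mu)$ is immediate. A short cross-check (an addition, assuming scipy is available):

```python
# Cross-check (illustration only): P(B = b | A = a) computed from the Poisson
# pmfs coincides with the Binomial(a, lam / (lam + mu)) pmf.
from scipy.stats import binom, poisson

lam, mu, a = 2.0, 5.0, 7
p = lam / (lam + mu)

for b in range(a + 1):
    cond = poisson.pmf(b, lam) * poisson.pmf(a - b, mu) / poisson.pmf(a, lam + mu)
    print(b, cond, binom.pmf(b, a, p))     # the two columns agree
```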

Is that because the distribution depends only on the first and second order parameters (mean and variance)?

It is not clear why that would be so: the argument above uses the full characteristic functions of $X$ and $Y$, not merely their first two moments. (A numerical counterexample is sketched below.)
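
To back up this doubt with an example (an addition, not part of the original answer): let $X=2Z$ with $Z$ Bernoulli$(1/2)$, so $X$ has mean $1$ and variance $1$, the same first two moments as Pois$(1)$, and let $Y$ be Pois$(1)$. Then $\mathbb E(X\mid X+Y)$ is not linear; for instance it vanishes on $\{X+Y=1\}$, whereas linearity would force the value $1/2$ there. A simulation makes this visible:

```python
# Counterexample sketch (illustration only): X matches the mean and variance
# of Pois(1) but is not Poisson, and E[X | X + Y] is visibly non-linear.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = 2 * rng.integers(0, 2, size=n)         # X in {0, 2}: mean 1, variance 1
Y = rng.poisson(1.0, size=n)               # Y ~ Pois(1)
S = X + Y

for s in range(5):
    mask = S == s
    print(s, X[mask].mean(), s / 2)        # linearity would predict s / 2
```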

  • Thanks! Two questions, though: (1) How does $\mathbb E(X\mid X+Y)=\theta\cdot(X+Y)$ give the second displayed identity, the one with the characteristic functions? (2) How do I solve the differential equation? (2012-11-16)
  • (1) $\mathbb E(X\mid Z)=w(Z)$ means exactly that $\mathbb E(X\,u(Z))=\mathbb E(w(Z)\,u(Z))$ for every test function $u$. (2) $(1-\theta)\,g'h=\theta\,gh'$, hence $(1-\theta)\,g'/g=\theta\,h'/h$, hence $(1-\theta)\log g=C+\theta\log h$, hence... (2012-11-16)
  • Thanks again, now it's clear. However, I am wondering whether there is an intuitive explanation for this estimator being linear. (2012-11-16)