Consider independent integrable random variables $X$ and $Y$ such that, for some parameter $\theta$,
$$\mathbb E(X\mid X+Y)=\theta\cdot(X+Y).$$
Taking expectations yields $\mathbb E(X)=\theta\cdot\mathbb E(X+Y)$, hence $\theta=\mathbb E(X)/(\mathbb E(X)+\mathbb E(Y))$. Furthermore, for every real $t$,
$$\mathbb E(X\mathrm e^{\mathrm it(X+Y)})=\theta\cdot\mathbb E((X+Y)\mathrm e^{\mathrm it(X+Y)}).$$
Introduce the characteristic functions $\varphi_X$ and $\varphi_Y$ of $X$ and $Y$, defined by $\varphi_X(t)=\mathbb E(\mathrm e^{\mathrm itX})$ and $\varphi_Y(t)=\mathbb E(\mathrm e^{\mathrm itY})$. By the independence of $X$ and $Y$, the identity above is equivalent to
$$\varphi_X'(t)\varphi_Y(t)=\theta\cdot(\varphi_X\varphi_Y)'(t)=\theta\cdot(\varphi_X'(t)\varphi_Y(t)+\varphi_X(t)\varphi_Y'(t)).$$
This differential equation leads readily to
$$\varphi_X(t)^{1-\theta}=\varphi_Y(t)^{\theta}.$$
To sum up, $\mathbb E(X\mid X+Y)=\theta\cdot(X+Y)$ if and only if the distributions of $X$ and $Y$ are linked by the identity $\varphi_X^{1-\theta}=\varphi_Y^{\theta}$ and $\theta=\mathbb E(X)/(\mathbb E(X)+\mathbb E(Y))$. In particular, it is necessary that $\varphi_Y^{\theta/(1-\theta)}$ is a characteristic function (namely, that of $X$).
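For completeness, here is one way to spell out the "leads readily" step, at least on a neighborhood of $0$ where both characteristic functions are nonzero (extending the identity to all $t$ takes a standard continuity argument):

```latex
\begin{align*}
(1-\theta)\,\varphi_X'(t)\,\varphi_Y(t) &= \theta\,\varphi_X(t)\,\varphi_Y'(t)
  && \text{(collect the $\varphi_X'\varphi_Y$ terms)}\\
(1-\theta)\,\frac{\varphi_X'(t)}{\varphi_X(t)} &= \theta\,\frac{\varphi_Y'(t)}{\varphi_Y(t)}
  && \text{(divide by $\varphi_X\varphi_Y$, nonzero near $0$)}\\
(1-\theta)\log\varphi_X(t) &= \theta\log\varphi_Y(t)
  && \text{(integrate from $0$; $\varphi_X(0)=\varphi_Y(0)=1$)}
\end{align*}
```

Exponentiating the last line gives $\varphi_X(t)^{1-\theta}=\varphi_Y(t)^{\theta}$.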
In the setting of the question, $Y=N$ is Poisson with parameter $\mu$ hence, for every positive $\alpha$, $(\varphi_Y)^\alpha$ is a characteristic function, namely that of the Poisson distribution with parameter $\alpha\mu$. This explains the result when $X=B$ is Poisson with parameter $\lambda$: the equations $\lambda=\alpha\mu$ and $\alpha=\theta/(1-\theta)$ yield $\theta=\lambda/(\lambda+\mu)$.
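As a sanity check (not part of the argument above), one can verify the Poisson case numerically: with $B\sim\text{Poisson}(\lambda)$ and $N\sim\text{Poisson}(\mu)$ independent, the empirical conditional mean of $B$ given $B+N=s$ should match $\theta\cdot s$ with $\theta=\lambda/(\lambda+\mu)$. The parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo check that E(B | B + N) = theta * (B + N)
# with theta = lam / (lam + mu), for independent Poisson B and N.
rng = np.random.default_rng(0)
lam, mu = 2.0, 3.0
theta = lam / (lam + mu)  # = 0.4 for these parameters

n = 1_000_000
B = rng.poisson(lam, size=n)
N = rng.poisson(mu, size=n)
S = B + N

# Compare the empirical conditional mean E(B | S = s)
# with the predicted value theta * s for a few values of s.
for s in (2, 5, 8):
    mask = S == s
    print(s, B[mask].mean(), theta * s)
```

(The agreement reflects the well-known fact that, conditionally on $B+N=s$, $B$ is binomial with parameters $s$ and $\lambda/(\lambda+\mu)$.)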