
I am fairly new to this topic but here is my problem:

I have stumbled across a paper (Robinson and Smyth, 2008) stating that the sample sum is a sufficient statistic for NB-distributed random variables.
I have tried to verify this by using the Fisher–Neyman factorization theorem $f_\theta(x)=h(x) \, g_\theta(T(x))$.
This is how far I have come: $\prod_{i=1}^n f_\theta(x_i)=\frac{\prod_i \Gamma(x_i+r)}{\prod_i x_i!\,\Gamma(r)^n}\,(1-p)^{nr}\,p^{\sum_i x_i}$
It would be easy if it weren't for the Gamma functions in the numerator, since then $h(x)=\frac{1}{\prod_i x_i!}$ and $g_\theta(T(x))=\Gamma(r)^{-n}(1-p)^{nr}p^{\sum_i x_i}$ with $T(x)=\sum_i x_i$.
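
As a sanity check on that joint expression, here is a minimal numerical sketch (my own; `nb_joint_log` is just a throwaway helper, and note that SciPy's `nbinom` attaches its probability parameter to the $r$ successes, so it corresponds to $1-p$ in the parameterization above):

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import nbinom

def nb_joint_log(x, r, p):
    """Log of prod_i Gamma(x_i+r) / (prod_i x_i! * Gamma(r)^n) * (1-p)^(n r) * p^(sum_i x_i)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return (np.sum(gammaln(x + r)) - np.sum(gammaln(x + 1)) - n * gammaln(r)
            + n * r * np.log(1 - p) + x.sum() * np.log(p))

x, r, p = [2, 0, 5, 3], 4.0, 0.35
# SciPy's nbinom uses the success probability of the r successes,
# which is 1 - p in the formula above.
print(nb_joint_log(x, r, p))
print(np.sum(nbinom.logpmf(x, r, 1 - p)))   # should match
```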

If I am on the wrong path, could somebody please help me solve this?

1 Answer


Are you given that $r$ is known? If it is, then you can define $h(x)$ as you have, but with the product of Gamma functions from the numerator included in it. That is not a problem: $h$ may depend on the data in any way as long as it does not involve the unknown parameter, and with $r$ known, $\prod_i \Gamma(x_i+r)$ is just a function of the data.
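
To see this numerically, here is a rough sketch under the question's parameterization (`nb_loglik` is a made-up helper): for two samples with the same $n$ and the same $\sum x_i$, the log-likelihoods differ by a constant that does not involve $p$, which is exactly the part that can be absorbed into $h(x)$.

```python
import numpy as np
from scipy.special import gammaln

def nb_loglik(x, r, p):
    # log of prod_i Gamma(x_i + r) / (x_i! Gamma(r)) * (1 - p)^r * p^(x_i)
    x = np.asarray(x, dtype=float)
    return np.sum(gammaln(x + r) - gammaln(x + 1) - gammaln(r)
                  + r * np.log(1 - p) + x * np.log(p))

x1, x2, r = [1, 2, 9], [4, 4, 4], 3.0   # same n = 3, same sum = 12
print([round(nb_loglik(x1, r, p) - nb_loglik(x2, r, p), 6)
       for p in (0.1, 0.4, 0.7, 0.9)])  # the same constant for every p
```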

  • $r$ is being estimated with a maximum-likelihood estimation (MLE) approach. This is what confuses me: the sum can only be the sufficient statistic if $r$ is known, but at the same time $r$ is being estimated with MLE. What am I not seeing here? – 2012-10-31
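
One way to see what is going on (a sketch of my own, not from the paper; the helpers are hypothetical): conditional on a fixed $r$, $\sum x_i$ is sufficient for $p$, but once $r$ is treated as unknown the likelihood depends on the data through more than the sum, so two samples with equal sums can give different estimates of $r$.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize_scalar

def nb_loglik(x, r, p):
    x = np.asarray(x, dtype=float)
    return np.sum(gammaln(x + r) - gammaln(x + 1) - gammaln(r)
                  + r * np.log(1 - p) + x * np.log(p))

def profile_loglik(x, r):
    # for fixed r, the MLE of p has the closed form sum(x) / (sum(x) + n * r)
    x = np.asarray(x, dtype=float)
    p_hat = x.sum() / (x.sum() + len(x) * r)
    return nb_loglik(x, r, p_hat)

def r_hat(x):
    # maximize the profile log-likelihood over r on a bounded interval
    res = minimize_scalar(lambda r: -profile_loglik(x, r),
                          bounds=(1e-3, 100.0), method="bounded")
    return res.x

x1, x2 = [1, 2, 9], [0, 4, 8]   # same n, same sum, different spread
print(r_hat(x1), r_hat(x2))     # different r estimates despite equal sums
```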