
From Wikipedia (emphasis mine):

If $X_1, \dots, X_n$ are independent Bernoulli-distributed random variables with expected value $p$, then the sum $T(X) = X_1 + \dots + X_n$ is a sufficient statistic for $p$.

Is this still true when $X_1, \dots, X_n$ are dependent but uncorrelated? What about if they are correlated?

1 Answer


On the first question: I can construct a couple of examples where dependent Bernoulli r.v.'s are pairwise independent (and hence uncorrelated). But in all these examples the joint PDF $$f_p(x_1,\ldots,x_n)=P(X_1=x_1,\ldots, X_n=x_n|p)$$ is a function of $x_1+\ldots+x_n$ (and of $p$).

Whether it is possible to construct a non-symmetric joint PDF in this setting, I do not know yet.
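For a concrete instance of the symmetric case (my own example, restricted to $p=\tfrac12$): take $X_1, X_2$ i.i.d. $B(1/2)$ and $X_3=X_1\oplus X_2$ (XOR). All three are $B(1/2)$, any two of them are independent, yet jointly they are dependent, and the joint PDF equals $1/4$ when $x_1+x_2+x_3$ is even and $0$ when it is odd, i.e. it is a function of the sum. A short enumeration checks both claims:

```python
from itertools import product
from fractions import Fraction

# X1, X2 i.i.d. Bernoulli(1/2); X3 = X1 XOR X2.
# Support: the four points (x1, x2, x1^x2), each with probability 1/4.
pmf = {}
for x1, x2 in product((0, 1), repeat=2):
    pmf[(x1, x2, x1 ^ x2)] = Fraction(1, 4)

full = {x: pmf.get(x, Fraction(0)) for x in product((0, 1), repeat=3)}

# Pairwise independence: P(Xi=a, Xj=b) = 1/4 for every pair (i, j) and
# every pair of values (a, b), matching the product of B(1/2) marginals.
for i, j in ((0, 1), (0, 2), (1, 2)):
    for a, b in product((0, 1), repeat=2):
        pij = sum(pr for x, pr in full.items() if x[i] == a and x[j] == b)
        assert pij == Fraction(1, 4)

# The joint PMF depends on (x1, x2, x3) only through the sum:
by_sum = {}
for x, pr in full.items():
    by_sum.setdefault(sum(x), set()).add(pr)
print(by_sum)  # each sum value maps to a single probability
```

Note the caveat: this instance pins down $p=1/2$, so it does not settle the question for general $p$.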

Concerning the second question, the answer is no in general: there exist dependent Bernoulli r.v.'s whose joint PDF takes values at points with the same sum $x_1+\ldots+x_n$ whose ratio depends on $p$, which rules out the factorization required for sufficiency.

A simple counterexample: let $X_1, X_3, X_4\sim B(p)$ be independent and set $X_2=X_1$. Then $$P((X_1,X_2,X_3,X_4)=(1,1,0,0))=p(1-p)^2,$$ $$P((X_1,X_2,X_3,X_4)=(0,0,1,1))=p^2(1-p).$$

These two points have the same sum $x_1+x_2+x_3+x_4=2$, but the ratio of their probabilities, $(1-p)/p$, depends on $p$. Hence no factorization $f_p(x)=g_p(x_1+\ldots+x_4)\,h(x)$ is possible, and the sum is not a sufficient statistic for $p$ in this case.

(A word of caution: in the three-variable version with $X_2=X_1$ and $X_3$ independent, points with the same sum do get different probabilities, e.g. $p(1-p)$ and $0$, but the value $0$ does not depend on $p$ and can be absorbed into $h(x)$; in fact the sum there determines the outcome uniquely on the support, so it is sufficient after all.)
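A quick enumeration (my own sketch; it uses the four-variable setup $X_1, X_3, X_4\sim B(p)$ independent with $X_2=X_1$) confirms that two outcomes with the same sum can have a probability ratio that varies with $p$, which rules out any factorization $g_p(x_1+\ldots+x_4)\,h(x)$:

```python
from itertools import product
from fractions import Fraction

def joint_pmf(p):
    """Joint PMF of (X1, X2, X3, X4) with X1, X3, X4 independent
    Bernoulli(p) and X2 a copy of X1 (so the support has 8 points)."""
    pmf = {}
    for x1, x3, x4 in product((0, 1), repeat=3):
        prob = Fraction(1)
        for xi in (x1, x3, x4):
            prob *= p if xi else 1 - p
        pmf[(x1, x1, x3, x4)] = prob
    return pmf

# Two support points with the same sum x1 + x2 + x3 + x4 = 2:
a, b = (1, 1, 0, 0), (0, 0, 1, 1)
ratios = {}
for p in (Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)):
    pmf = joint_pmf(p)
    ratios[p] = pmf[a] / pmf[b]  # equals (1-p)/p
print(ratios)
```

Exact rational arithmetic via `Fraction` avoids floating-point noise; the ratio $(1-p)/p$ takes three different values over the three choices of $p$, so it genuinely depends on $p$.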