
Suppose we have $X_1, \ldots, X_n$ i.i.d. with $X_i \sim \mathrm{Exp}(1, \mu)$ (the pdf is $f_\mu(x) = e^{-(x-\mu)}$ for $x \geq \mu$ and $0$ for $x < \mu$). Is there a one-dimensional (i.e. $T: \mathbb{R}^n \to \mathbb{R}$) sufficient statistic for the parameter $\mu$? An obvious two-dimensional sufficient statistic is $T(x_1, \ldots, x_n) = (\sum x_i, \min x_i)$, but I am having a hard time finding a one-dimensional one.

2 Answers


The likelihood of an i.i.d. sample $x=(x_i)_{1\leqslant i\leqslant n}$ from the distribution $\mathcal E(1,\mu)$ is $ L(\mu,x)=\exp(-s(x)+n\mu)\cdot[t(x)\geqslant\mu], $ where $s(x)=\sum\limits_{i=1}^nx_i$ and $t(x)=\min\limits_{1\leqslant i\leqslant n}x_i$. This is $ L(\mu,x)=h(x)g(\mu,t(x)), $ with $h(x)=\exp(-s(x))$ and $g(m,u)=\exp(nm)\cdot[u\geqslant m]$, hence, by the factorization theorem, $t:x\mapsto t(x)=\min\limits_{1\leqslant i\leqslant n}x_i$ is a sufficient statistic for $\mu$.
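This factorization can be sanity-checked numerically: if $t(x)=\min_i x_i$ is sufficient, then for two samples with the same minimum the log-likelihood difference $\log L(\mu,x)-\log L(\mu,y)$ must be constant in $\mu$ (it equals $\log h(x)-\log h(y)$, which is free of $\mu$). A minimal sketch, with function and variable names of my own choosing:

```python
import numpy as np

def log_likelihood(mu, x):
    """log L(mu, x) = -sum(x) + n*mu on {min(x) >= mu}, else -inf."""
    x = np.asarray(x, dtype=float)
    if x.min() >= mu:
        return -x.sum() + len(x) * mu
    return -np.inf

# Two samples with the same minimum (2.0) but different sums.
x = np.array([2.0, 3.0, 5.0])
y = np.array([2.0, 4.0, 7.0])

# The difference should be the same for every admissible mu,
# because the mu-dependent factor g(mu, t(x)) cancels.
for mu in [0.0, 0.5, 1.0, 1.5, 2.0]:
    diff = log_likelihood(mu, x) - log_likelihood(mu, y)
    print(f"mu = {mu}: log-likelihood difference = {diff}")
```

Here the difference is $\bigl(-\sum x_i\bigr)-\bigl(-\sum y_i\bigr)=3$ for every $\mu \leq 2$, illustrating that once $\min_i x_i$ is known, the rest of the data carries no further information about $\mu$.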


We have $L(x_1,\dots,x_n,\mu)=\prod_{j=1}^ne^{-(x_j-\mu)}\mathbf 1_{[\mu,+\infty[}(x_j)=e^{-n\bar x}\cdot e^{n\mu}\mathbf 1_{[\mu,+\infty[}\left(\min_{1\leq j\leq n}x_j\right),$ so by the Fisher–Neyman factorization theorem, $\varphi(x_1,\ldots,x_n)=\min_{1\leq j\leq n}x_j$ is a sufficient statistic.