
Suppose $X_1,X_2,\cdots,X_n$ are independent random variables, and denote their order statistics by $Y_1,Y_2,\cdots,Y_n$.

$X_1,X_2,\cdots,X_{n-1}$ have an exponential distribution with mean $1$, and $X_n$ has an exponential distribution with mean $\theta$.

How can one find $P(Y_i=X_n)$ for $i=1,2,\cdots,n$?

3 Answers


One has (conditioning on $X_n=x$ and asking that exactly $i-1$ of the $n-1$ unit-mean variables fall below $x$)
$$ \mathrm P(Y_i=X_n) = \int_{0}^{\infty} {n-1 \choose i-1} (1-\mathrm e^{-x})^{i-1}\, \mathrm e^{-(n-i)x}\, \frac1\theta\, \mathrm e^{-x/\theta} \, \mathrm dx. $$
The change of variable $\mathrm e^{-x}=s$ yields
$$ \mathrm P(Y_i=X_n)= \frac1\theta{n-1 \choose i-1}\int_{0}^{1} (1-s)^{i-1}\, s^{n-i}\, s^{1/\theta}\, s^{-1}\,\mathrm ds, $$
that is,
$$ \mathrm P(Y_i=X_n)= \frac1\theta{n-1 \choose i-1}\mathrm{B}(i,n-i+1/\theta)=\frac1\theta\,\frac{\Gamma(n)}{\Gamma(n+1/\theta)}\,\frac{\Gamma(n-i+1/\theta)}{\Gamma(n-i+1)}. $$
Sanity checks:

(i) When $\theta=1$, the distribution is uniform on $\{1,2,\ldots,n\}$.

(ii) When $\theta\to0$, the distribution concentrates on $i=1$.

(iii) When $\theta\to+\infty$, the distribution concentrates on $i=n$.

(iv) (More involved) For every $n\geqslant1$ and $a\gt0$, $ \sum_{k=0}^{n-1}\frac{\Gamma(k+a)}{\Gamma(k+1)}=\frac{\Gamma(n+a)}{a\Gamma(n)}. $ Using this identity for $a=1/\theta$, one sees that the sum of $\mathrm P(Y_i=X_n)$ over $i$ is $1$... as it should.
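As a quick numerical cross-check of the formula and of sanity check (iv), here is a minimal Python sketch (assuming numpy and scipy are available; the values of $n$, $\theta$, and the trial count are arbitrary illustration choices):

```python
# Minimal sketch: compare the closed form
#   P(Y_i = X_n) = (1/theta) * Gamma(n)/Gamma(n+1/theta)
#                            * Gamma(n-i+1/theta)/Gamma(n-i+1)
# with a Monte Carlo estimate of the rank of X_n.
import numpy as np
from scipy.special import gammaln

def closed_form(n, theta):
    """P(Y_i = X_n) for i = 1..n, computed in log space for stability."""
    i = np.arange(1, n + 1)
    log_p = (-np.log(theta)
             + gammaln(n) - gammaln(n + 1 / theta)
             + gammaln(n - i + 1 / theta) - gammaln(n - i + 1))
    return np.exp(log_p)

def monte_carlo(n, theta, trials=200_000, seed=0):
    """Estimate P(Y_i = X_n) from the rank of X_n among all n variables."""
    rng = np.random.default_rng(seed)
    x = rng.exponential(1.0, size=(trials, n - 1))  # unit-mean exponentials
    xn = rng.exponential(theta, size=trials)        # mean-theta exponential
    ranks = 1 + (x < xn[:, None]).sum(axis=1)       # rank of X_n (ties are null)
    return np.bincount(ranks, minlength=n + 1)[1:] / trials

n, theta = 5, 2.0
p = closed_form(n, theta)
print(p, p.sum())            # the sum should be 1, per sanity check (iv)
print(monte_carlo(n, theta))
```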


You might start with $P(Y_i=X_n) = \int_{x=0}^{\infty} P(Y_i=X_n | X_n =x)\, p( X_n =x) \, dx $

$ = \int_{x=0}^{\infty} {n-1 \choose i-1} P(X_1 \lt x)^{i-1} P(X_1 \gt x)^{n-i} p(X_n = x) \, dx $

$ = \int_{x=0}^{\infty} {n-1 \choose i-1} (1-e^{-x})^{i-1} (e^{-x})^{n-i} e^{- x/\theta} /\theta\, dx $
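To see that this integral indeed defines a probability distribution over $i$, one can evaluate it numerically; a small sketch using scipy.integrate.quad (the values $n=4$ and $\theta=3$ are arbitrary):

```python
# Sketch: evaluate the last integral by quadrature and check that the
# probabilities sum to 1 over i = 1..n.
import numpy as np
from scipy.integrate import quad
from scipy.special import comb

def p_rank(i, n, theta):
    """P(Y_i = X_n) by direct quadrature of the integrand above."""
    integrand = lambda x: (comb(n - 1, i - 1)
                           * (1 - np.exp(-x)) ** (i - 1)
                           * np.exp(-(n - i) * x)
                           * np.exp(-x / theta) / theta)
    return quad(integrand, 0, np.inf)[0]

n, theta = 4, 3.0
probs = [p_rank(i, n, theta) for i in range(1, n + 1)]
print(probs, sum(probs))   # the total should be 1
```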

  • +1 Do I understand it correctly that $\mathbb{P}(Y_i=X_n \mid X_n=x)$ is evaluated using a binomial random variable $N = \sum_{j=1}^{n-1} [X_j < x]$, i.e. $\mathbb{P}(Y_i=X_n \mid X_n=x) = \mathbb{P}(N=i-1)$? This is similar to what I did, but simpler. (2012-09-02)
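The binomial reading in this comment is easy to verify numerically: conditional on $X_n=x$, the combinatorial factor in the integrand is exactly the $\mathrm{Binomial}(n-1,\,1-e^{-x})$ pmf at $i-1$. A quick sketch (arbitrary $n$ and $x$, assuming scipy):

```python
# Sketch: conditional on X_n = x, the rank of X_n is i exactly when
# N = i - 1 of the n - 1 unit-mean exponentials fall below x, where
# N ~ Binomial(n - 1, 1 - exp(-x)).
import numpy as np
from scipy.stats import binom
from scipy.special import comb

n, x = 6, 1.3
p_below = 1 - np.exp(-x)   # P(X_1 < x) for a unit-mean exponential
for i in range(1, n + 1):
    factor = comb(n - 1, i - 1) * p_below ** (i - 1) * np.exp(-(n - i) * x)
    print(i, binom.pmf(i - 1, n - 1, p_below), factor)  # the two columns agree
```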

Since I had trouble grasping the integral expression for the probability $\mathbb{P}(Y_k = X_n)$ given by @Henry and @did, I am writing up my own derivation of it.

Consider first the $n-1$ i.i.d. samples $X_1, \ldots, X_{n-1}$ from the exponential distribution with unit rate, and let $Z_k = X_{n-1:k}$ denote their order statistics. Note that $X_n$ and $Z_k$ are independent random variables.

We now insert $X_n$ into the ordered sequence $Z_1, \ldots, Z_{n-1}$. The event $\{Y_k = X_n\}$ is equivalent to $\{Z_{k-1} < X_n < Z_k\}$, with the conventions $Z_0 = 0$ and $Z_n = +\infty$. Hence $ \begin{eqnarray} \mathbb{P}\left(Y_k = X_n\right) &=& \mathbb{E}\left(\mathbb{P}\left(Z_{k-1} < X_n < Z_k \,|\, Z_{k-1}, Z_k\right)\right) \\ &=& \mathbb{E}\left( \mathbb{P}\left(X_n > Z_{k-1}\,|\,Z_{k-1}\right)\right) - \mathbb{E}\left(\mathbb{P}\left(X_n > Z_k \,|\,Z_k\right) \right) \\ &=&\mathbb{P}\left(X_n > Z_{k-1}\right) - \mathbb{P}\left(X_n > Z_k \right). \end{eqnarray} $ This readily produces the integral (using the pdf of a single order statistic, whose density contributes the extra factor $\mathrm{e}^{-z}$): $ \begin{eqnarray} \mathbb{P}\left(X_n > Z_k\right) &=& \frac{(n-1)!}{(k-1)!(n-1-k)!} \int_0^\infty \left(1-\mathrm{e}^{-z}\right)^{k-1} \mathrm{e}^{-(n-k)z}\, \mathrm{e}^{-z/\theta}\, \mathrm{d}z \\ &=& \frac{\Gamma(n)}{\Gamma\left(n+\frac{1}{\theta}\right)} \cdot \frac{\Gamma\left(n-k+\frac{1}{\theta} \right)}{\Gamma\left(n-k\right)}. \end{eqnarray} $ The other integral is obtained by replacing $k$ with $k-1$. Subtracting and using the recurrence relation for the $\Gamma$ function then yields the result.
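For completeness, both the Gamma-function value of $\mathbb{P}(X_n > Z_k)$ and the telescoping step can be checked by simulation; a minimal sketch (arbitrary $n$, $\theta$, and trial count, assuming numpy and scipy):

```python
# Sketch: check P(X_n > Z_k) against its Gamma-function value, and the
# telescoping step P(Y_k = X_n) = P(X_n > Z_{k-1}) - P(X_n > Z_k).
import numpy as np
from scipy.special import gammaln

def p_exceed(k, n, theta):
    """P(X_n > Z_k); k = 0 gives 1 and k = n gives 0, matching Z_0, Z_n."""
    return np.exp(gammaln(n) - gammaln(n + 1 / theta)
                  + gammaln(n - k + 1 / theta) - gammaln(n - k))

n, theta, trials = 5, 2.0, 200_000
rng = np.random.default_rng(1)
z = np.sort(rng.exponential(1.0, size=(trials, n - 1)), axis=1)  # Z_1..Z_{n-1}
xn = rng.exponential(theta, size=trials)

for k in range(1, n):
    print(k, (xn > z[:, k - 1]).mean(), p_exceed(k, n, theta))

# telescoping recovers the distribution of the rank of X_n
print([p_exceed(k - 1, n, theta) - p_exceed(k, n, theta) for k in range(1, n + 1)])
```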