
I have two questions. (1) How do I determine the distributions of the first (lowest) and the highest order statistics for a sample of random size $N$ taken from the continuous uniform$(0, \theta)$ distribution, where

\begin{equation} \mathrm{P}(N = n) = \frac{1}{n!\,(\mathrm{e} - 1)} \quad \text{for } n = 1, 2, 3, \ldots \end{equation}

(2) In general, without specifying the distribution of the random variables, the CDF of the highest order statistic is $F_{X_{(n)}}(a) = [F_{X}(a)]^n$ and the CDF of the lowest order statistic is $F_{X_{(1)}}(a) = 1 - [1 - F_X(a)]^n$. I understand this derivation, but what do these values say about the distribution?
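As a quick sanity check on the stated pmf (not part of the original question): $\sum_{n\ge1} 1/n! = \mathrm{e}-1$, so the probabilities do sum to 1. A minimal numeric confirmation:

```python
import math

# P(N = n) = 1/(n!(e-1)) for n = 1, 2, 3, ...
# Since sum_{n>=1} 1/n! = e - 1, the probabilities should sum to 1.
total = sum(1 / (math.factorial(n) * (math.e - 1)) for n in range(1, 40))
print(total)
```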

2 Answers


Didier gave much more than the OP requested. My solution below addresses the original question (and is thus somewhat simpler, in my opinion).

If $Z$ denotes the maximum, then $$ {\rm P}(Z \le z) = \sum\limits_{n = 1}^\infty {{\rm P}(Z \le z|N = n){\rm P}(N = n)} = \sum\limits_{n = 1}^\infty {F_{X_{(n)} } (z)\frac{1}{{n!({\rm e} - 1)}}}. $$ Since $F_{X_{(n)} } (z) = [F_X (z)]^n = (\frac{z}{\theta })^n$, $0 \leq z \leq \theta$, we get $$ {\rm P}(Z \le z) = \frac{1}{{{\rm e} - 1}}\sum\limits_{n = 1}^\infty {\frac{{( z/\theta)^n }}{{n!}}} = \frac{1}{{{\rm e} - 1}}({\rm e}^{z/\theta} - 1). $$ Hence $Z$ has density $f_Z$ given by $$ f_Z (z) = \frac{1}{{({\rm e} - 1)\theta }}{\rm e}^{ z/\theta}, \;\; 0 < z < \theta. $$

Similarly, if $Y$ denotes the minimum, then $$ {\rm P}(Y \le y) = \sum\limits_{n = 1}^\infty {{\rm P}(Y \le y|N = n){\rm P}(N = n)} = \sum\limits_{n = 1}^\infty {F_{X_{(1)} } (y)\frac{1}{{n!({\rm e} - 1)}}}. $$ Since $F_{X_{(1)} } (y) = 1 - [1-F_X (y)]^n = 1 - [1 - \frac{y}{\theta }]^n$, $ 0 \leq y \leq \theta$, we get $$ {\rm P}(Y \le y) = \sum\limits_{n = 1}^\infty {\frac{{1 - [1 - y/\theta ]^n }}{{n!({\rm e}-1)}}} = 1 - \sum\limits_{n = 1}^\infty {\frac{{(1 - y/\theta )^n }}{{n!({\rm e}-1)}}} = 1 - \frac{{{\rm e}^{1 - y/\theta } - 1}}{{{\rm e} - 1}}. $$ Hence $Y$ has density $f_Y$ given by $$ f_Y (y) = \frac{1}{{({\rm e} - 1)\theta }}{\rm e}^{ 1 - y/\theta}, \;\; 0 < y < \theta. $$
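The two closed-form CDFs can be checked by Monte Carlo simulation. A minimal sketch (the value of `theta`, the test points, and the sampling helper are illustrative choices, not from the answer); it uses the fact that this pmf is that of a Poisson(1) variable conditioned on being at least 1:

```python
import math
import random

random.seed(0)
theta = 2.0          # arbitrary choice of the uniform upper bound
trials = 200_000

def sample_N():
    # P(N = n) = 1/(n!(e-1)) is Poisson(1) conditioned on N >= 1:
    # reject draws of 0 from a Poisson(1) sampled by counting
    # unit-rate exponential arrivals before time 1.
    while True:
        n, t = 0, random.expovariate(1.0)
        while t < 1.0:
            n += 1
            t += random.expovariate(1.0)
        if n >= 1:
            return n

maxes, mins = [], []
for _ in range(trials):
    xs = [random.uniform(0, theta) for _ in range(sample_N())]
    maxes.append(max(xs))
    mins.append(min(xs))

# Compare empirical CDFs with the derived closed forms at one point each
z = 0.7 * theta
emp_Z = sum(m <= z for m in maxes) / trials
thy_Z = (math.exp(z / theta) - 1) / (math.e - 1)

y = 0.3 * theta
emp_Y = sum(m <= y for m in mins) / trials
thy_Y = 1 - (math.exp(1 - y / theta) - 1) / (math.e - 1)

print(emp_Z, thy_Z)
print(emp_Y, thy_Y)
```

With 200,000 trials the empirical and theoretical values should agree to about two decimal places.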

  • Neat explanation; understood clearly. Thanks. (2011-02-08)
  • @Sunil: Could you explain what in Shai's answer left you more satisfied than the previous answer? Your explanation could include what was not clear in the explanations of the previous answer. Also, what happened to part (2) of your question? (2011-02-09)
  • @Didier: Your answer was correct too, but I failed to understand some minor steps that you skipped. You gave the joint distribution as well, which was very helpful; in conjunction with Shai's answer I could see how you derived your result. I'm not saying yours is wrong, but since I'm a beginner I was expecting a somewhat more detailed answer. If I were allowed to select multiple correct answers I would definitely have selected yours too, which is also why I gave +1's for your comments. Regarding the second part, I didn't get any answers; I posted a comment about it under your answer too. (2011-02-09)
  • @Sunil: You may skip assurances that my solution is *very correct* or not *wrong*; they are not needed. Two more points. First, you do not specify what *minor steps* I *skipped* (that is, other than steps already explicit in your question, such as the formulas for $F_{X_{(1)}}$ and $F_{X_{(n)}}$), so what you mean is still rather mysterious. Second, part (2) of your post seems to ask for results valid for distributions of $X$ more general than uniform, and my answer is phrased so as to be valid (as late as possible) in the general case, yet you say you *didn't get any answers* to your part (2)... Funny. (2011-02-09)
  • For the first part, the mistake in my attempts was that I didn't know how to evaluate $[F_X(z)]^n$; I didn't realize I should use the uniform CDF at that point, and it became clear when Shai pointed out that it equals $(z/\theta)^n$. That is the single point where I understood how everything fit together. For the second part, consider this example: $x_1^2 + x_2^2$, where $x_1, x_2$ are i.i.d. $N(0,1)$, follows a chi-square distribution. So I was wondering whether the densities obtained here should follow any known distribution. (2011-02-09)
  • I highly appreciate your concern; it indeed helps me to think from different perspectives. Please point out if I'm wrong anywhere so I can correct myself. Also, if you can, I would appreciate it if you could edit your answer to add a little more detail. For example, you let $y \to -\infty$; can you explain a bit more why that should happen? (2011-02-09)
  • @Sunil: So what you needed was to know that $F_X(z) = z/\theta$? Well... And I fail to see how the *chi-square* part of your comment is (even vaguely) related to the subject of your post. Anyway, enough for me. (2011-02-09)
  • @Didier: No offense meant in any way. I could not explain in the comments everything in this answer that made me understand, so I pointed out one thing that was helpful. I guess I didn't explain myself clearly enough, but anyway, thanks. (2011-02-09)

The idea is to consider simultaneously the minimum $Y$ and the maximum $Z$ of the sample. For $y\le z$, the event $[y\le Y,Z\le z]$ is equivalent to the whole sample being between $y$ and $z$; hence, for a sample of size $n$, $P(y\le Y,Z\le z)$ would be $P(y\le X\le z)^n$. Here one considers a sample of random size $N$, hence $$ P(y\le Y,Z\le z)=\sum_{n\ge1}P(N=n)P(y\le X\le z)^n=c(\mathrm{e}^{P(y\le X\le z)}-1), $$ with $$ c=1/(\mathrm{e}-1). $$

To get the joint distribution of $(Y,Z)$, one should differentiate $P(y\le Y,Z\le z)$ twice, with respect to $y$ and $z$, yielding $$ P(Y\in\mathrm{d}y,Z\in\mathrm{d}z)=c\mathrm{e}^{P(y\le X\le z)}P(X\in\mathrm{d}y)P(X\in\mathrm{d}z). $$

To get the distribution of $Y$ alone is even simpler: one differentiates $P(y\le Y,Z\le z)$ once with respect to $y$ and lets $z\to+\infty$, hence $$ P(Y\in\mathrm{d}y)=c\mathrm{e}^{P(X\ge y)}P(X\in\mathrm{d}y). $$ Likewise, to get the distribution of $Z$ alone, one differentiates $P(y\le Y,Z\le z)$ once with respect to $z$ and lets $y\to-\infty$, hence $$ P(Z\in\mathrm{d}z)=c\mathrm{e}^{P(X\le z)}P(X\in\mathrm{d}z). $$

In the special case where the sample is uniform on $(0,\theta)$, for $0\le y\le z\le\theta$, the density of the distribution of $(Y,Z)$ at $(y,z)$ is $$ f_{Y,Z}(y,z)=(c/\theta^{2})\mathrm{e}^{(z-y)/\theta}. $$ Finally, the densities of the distributions of $Y$ and $Z$ on $(0,\theta)$ are $$ f_Y(y)=(c\mathrm{e}/\theta)\mathrm{e}^{-y/\theta}, \quad f_Z(z)=(c/\theta)\mathrm{e}^{z/\theta}. $$
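The final marginal densities can be sanity-checked numerically: each should integrate to 1 over $(0,\theta)$. A minimal sketch (the value of `theta` and the helper names are illustrative choices, not from the answer):

```python
import math

theta = 2.0                 # arbitrary choice of the uniform upper bound
c = 1 / (math.e - 1)

def f_Y(y):
    # density of the minimum: (c*e/theta) * exp(-y/theta)
    return (c * math.e / theta) * math.exp(-y / theta)

def f_Z(z):
    # density of the maximum: (c/theta) * exp(z/theta)
    return (c / theta) * math.exp(z / theta)

def integrate(f, a, b, n=100_000):
    # simple trapezoidal rule; accurate enough for these smooth integrands
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

int_Y = integrate(f_Y, 0, theta)
int_Z = integrate(f_Z, 0, theta)
print(int_Y, int_Z)
```

Analytically, $\int_0^\theta f_Y = c\,\mathrm{e}(1-\mathrm{e}^{-1}) = c(\mathrm{e}-1) = 1$ and likewise $\int_0^\theta f_Z = c(\mathrm{e}-1) = 1$, so both printed values should be very close to 1.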

  • What happened to the $n!$ term? (2011-02-08)
  • @Sunil: Together with the term $P(\cdots)^n$ it yields the series expansion of the exponential. (2011-02-08)
  • Oh yes, correct; I'm sorry, I overlooked that. If I'm right, both the first and the highest order statistics follow a Gamma distribution. Am I correct? (2011-02-08)
  • Typo at the end? ($f_Z (z) = c\theta ^{ - 1} e^{\theta ^{ - 1} z}$.) (2011-02-08)
  • @Sunil: They are certainly not Gamma distributed (they are bounded from above by $\theta$). (2011-02-08)
  • So in the question above, asking for the distribution actually means finding the density, is that right? I always thought that if a question asks for a distribution, the answer should be some well-known distribution; that is why I usually attempt such questions by trying to find the MGF or the expected value. Here I was thinking about finding $E[P(X_1 \leq x)]^n$. (2011-02-08)
  • @Shai: you are right, typo corrected, thanks. (2011-02-08)