
Let $\{X_i\}_{i=1}^n$ be a sequence of i.i.d. random variables (i.e. a random sample) with pdf:

$$f_X(x) = e^{-(x-\theta)} \, e^{-e^{-(x-\theta)}} \cdot \mathbf{1}_{x \in \mathbb{R}}$$

The goal is to find the distribution of $T = \sum_{i=1}^n e^{-X_i}$ and also to compute $\textbf{E}(\log T)$ and $\textbf{V}(\log T)$.

Some thoughts:

I think I have found the distribution of $T$ by applying the transformation $Y = e^{-X}$. If I am not wrong, it is quite easy to see that $Y \sim \textrm{Exponential}(e^{\theta})$. Therefore, $T = \sum_{i=1}^n Y_i \sim \textrm{Gamma} (n, 1/e^{\theta})$.
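This chain of transformations can be sanity-checked by simulation. The sketch below uses NumPy (the values of `theta`, `n`, and the sample size are arbitrary illustration choices); note that the pdf above is exactly NumPy's Gumbel distribution with location `theta` and unit scale.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.7, 5, 200_000

# X_i has the Gumbel (extreme value) pdf e^{-(x-theta)} e^{-e^{-(x-theta)}},
# which is numpy's gumbel() with loc=theta, scale=1.
X = rng.gumbel(loc=theta, scale=1.0, size=(reps, n))

# Y = e^{-X} should be Exponential with rate e^theta, i.e. mean e^{-theta}.
Y = np.exp(-X)
assert abs(Y.mean() - np.exp(-theta)) < 0.01

# T = sum of n i.i.d. Y's should be Gamma(n, scale 1/e^theta),
# with mean n e^{-theta} and variance n e^{-2 theta}.
T = Y.sum(axis=1)
assert abs(T.mean() - n * np.exp(-theta)) < 0.05
assert abs(T.var() - n * np.exp(-2 * theta)) < 0.05
```

The assertions use loose Monte Carlo tolerances; with the seeded generator they pass comfortably.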

However, I am unable to find a reasonable way to compute $\textbf{E}(\log T)$ or $\textbf{V}(\log T)$. The formula for the expectation of a function of a random variable leads to a very complicated integral and the only alternative I can think of, which is yet another transformation, is even worse!

  • It might be of use that the distribution followed by $X_i$ is known as the [extreme value distribution](http://mathworld.wolfram.com/ExtremeValueDistribution.html) with unit scale parameter. (2011-09-26)
  • If $n$ is large, you can also get some decent and easy asymptotics by doing a Taylor expansion of $\log(T)$ around $\mathbb{E}(T)$. (2011-09-26)
  • @leonbloy Thanks for your comment! I'll try that, too. It is not what I was looking for, but it is easier to "compute". (2011-09-26)

1 Answer


You correctly determined that $Y \sim \mathrm{Exponential}(\mathrm{e}^\theta)$, and from it the distribution of $T$, so I will concentrate on computing

$$ \mathbb{E}(\log x), \qquad x \sim \mathrm{Gamma}\!\left(n, \tfrac{1}{\beta}\right), $$

where $\beta = \mathrm{e}^{\theta}$ in your setting.

The easiest way is to compute $\mathbb{E}( x^s )$ and then use $\log(x) = \left. \frac{\mathrm{d}}{\mathrm{d} s} x^s \right\vert_{s=0}$ and $\log^2(x) = \left. \frac{\mathrm{d}^2}{\mathrm{d} s^2} x^s \right\vert_{s=0}$, differentiating under the expectation sign.
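The differentiation identity itself can be confirmed symbolically; a small SymPy sketch:

```python
import sympy as sp

x, s = sp.symbols('x s', positive=True)

# d/ds x^s = x^s * log(x); at s = 0 the factor x^s becomes 1
d1 = sp.diff(x**s, s).subs(s, 0)
assert d1 == sp.log(x)

# d^2/ds^2 x^s = x^s * log(x)^2; at s = 0 this leaves log(x)^2
d2 = sp.diff(x**s, s, 2).subs(s, 0)
assert d2 == sp.log(x)**2
```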

$$ \mathbb{E}( x^s ) = \int_0^\infty x^s \cdot \frac{\beta^{n}}{\Gamma(n)} x^{n-1} \exp(-x \beta) \mathrm{d} x = \int_0^\infty \frac{\beta^{n}}{\Gamma(n)} x^{n + s-1} \exp(-x \beta) \mathrm{d} x = \beta^{-s} \frac{\Gamma(n+s)}{\Gamma(n)} $$
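As a numerical sanity check of this moment formula, one can integrate $x^s$ against the Gamma pdf directly and compare with $\beta^{-s}\,\Gamma(n+s)/\Gamma(n)$ (a sketch; the values of `n`, `beta`, `s` are arbitrary illustration choices):

```python
import math
from scipy import integrate, special

n, beta, s = 3.0, 2.0, 0.4   # arbitrary illustration values

# pdf of Gamma(n, 1/beta), i.e. rate parameter beta, as written above
pdf = lambda x: beta**n / special.gamma(n) * x**(n - 1) * math.exp(-beta * x)

# left-hand side: numerical integral of x^s against the pdf
lhs, _ = integrate.quad(lambda x: x**s * pdf(x), 0, math.inf)

# right-hand side: beta^{-s} Gamma(n+s) / Gamma(n)
rhs = beta**(-s) * special.gamma(n + s) / special.gamma(n)
assert abs(lhs - rhs) < 1e-8
```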

Now it is easy to figure out:

Differentiating $\beta^{-s} \frac{\Gamma(n+s)}{\Gamma(n)}$ at $s = 0$ gives $\mathbb{E}(\log T) = -\log \beta + \psi(n)$, and with $\beta = \mathrm{e}^{\theta}$:

$$ \mathbb{E}(\log T) = -\theta + \psi(n) \qquad \mathbb{V}(\log T) = \psi^{(1)}(n) $$ where $\psi(n)$ is the digamma function and $\psi^{(1)}(n)$ is its first derivative (the trigamma function).
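Both closed forms can be verified by direct numerical integration against the Gamma density of $T$ (a sketch; `theta` and `n` are arbitrary illustration values):

```python
import math
from scipy import integrate, special

theta, n = 0.3, 4.0
beta = math.exp(theta)   # T ~ Gamma(n, 1/e^theta), i.e. rate beta = e^theta

pdf = lambda t: beta**n / special.gamma(n) * t**(n - 1) * math.exp(-beta * t)

# E(log T) and E(log^2 T) by numerical integration
m1, _ = integrate.quad(lambda t: math.log(t) * pdf(t), 0, math.inf)
m2, _ = integrate.quad(lambda t: math.log(t)**2 * pdf(t), 0, math.inf)

# closed forms: E(log T) = -theta + psi(n), V(log T) = psi'(n)
assert abs(m1 - (-theta + special.digamma(n))) < 1e-7
assert abs(m2 - m1**2 - special.polygamma(1, n)) < 1e-7
```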

Added: To make it easier to compute the variance, note that $\mathbb{V}(\log T) = \left. \frac{ \mathrm{d}^2}{ \mathrm{d}s^2 } \mu^{-s} \mathbb{E}(x^s)\right\vert_{s=0}$, where $\log \mu$ is the mean of $\log T$. But $\mu^{-s} \mathbb{E}(x^s) = \frac{\Gamma(n+s)}{\Gamma(n)} \exp(-s \psi(n)) = \exp\!\left( \psi(n) s + \frac{1}{2} \psi^\prime(n) s^2 + O(s^3)\right) \exp(-s \psi(n))$, since $\log \Gamma(n+s) - \log \Gamma(n) = \psi(n) s + \frac{1}{2} \psi^\prime(n) s^2 + O(s^3)$. Therefore $ \mu^{-s} \mathbb{E}(x^s) = \exp\!\left(\frac{1}{2} \psi^\prime(n) s^2 + O(s^3)\right) = 1 + \frac{1}{2} \psi^\prime(n) s^2 + O(s^3)$.

  • First of all, I would like to thank you for your answer, but there are some things I don't understand. I don't know how to use $\log(x) = \left. \frac{\mathrm{d}}{\mathrm{d}s} x^s \right\vert_{s=0}$. In fact, I don't even know where it comes from. If it is an identity, I think I haven't seen it before, and if it is a change of variables, I don't know what to do. Again, sorry for the dumb questions, I've had a bad day and I can't seem to get anything! (2011-09-26)
  • @VictorP. Remember $x^s = \exp( s \log(x) )$. Thus $\frac{\mathrm{d}^k}{\mathrm{d} s^k} \exp( s \log(x) ) = (\log x)^k \exp( s \log(x) )$. Now, if you set $s=0$ in the result, the exponential becomes $1$, and you are left with $(\log x)^k$ as claimed. (2011-09-26)
  • Phew! I've been trying to reproduce the exercise on my own this morning, and now I am stuck with the last identity $\mathbb{V}(\log T) = \left. \frac{ \mathrm{d}^2}{ \mathrm{d}s^2 } \mu^{-s} \mathbb{E}(x^s)\right\vert_{s=0}$. After computing $\mathbb{E}(\log T)$, I just applied $\mathbb{V}(\log T) = \mathbb{E}(\log^2 T) - \mathbb{E}(\log T)^2$. I don't know if I got it right, but I couldn't simplify anything. (2011-09-27)
  • @VictorP. Variance is the second central moment of $\log T$, and $\mu^{-s} \mathbb{E}(x^s)$ is the central moment generating function: indeed, $\mu^{-s} \mathbb{E}(x^s) = \mu^{-s} \mathbb{E}(\exp(s \log x))$. Also, $\mu$ should be $\exp( \mathbb{E}(\log T))$ rather than $\mathbb{E}(\log T)$, i.e. $\log \mu = \mathbb{E}(\log T)$. I have corrected the post. (2011-09-27)
  • Now I fully understand your work. When I read $\mathbb{E}(\exp(s \log(x)))$ it all made sense. Silly me ;) Thanks again! (2011-09-27)