
Let $\{X_i\}_{i=1}^n$ be a sequence of i.i.d. random variables (i.e. a random sample) with pdf:

$$ f_X(x) = e^{-(x-\theta)} \, e^{-e^{-(x-\theta)}} \cdot \mathbf{1}_{\{x \in \mathbb{R}\}} $$

The goal is to find the distribution of $T = \sum_{i=1}^n e^{-X_i}$ and also to compute $\textbf{E}(\log T)$ and $\textbf{V}(\log T)$.

Some thoughts:

I think I have found the distribution of $T$ by applying the transformation $Y = e^{-X}$. If I am not wrong, it is quite easy to see that $Y \sim \textrm{Exponential}(e^{\theta})$. Therefore, $T = \sum_{i=1}^n Y_i \sim \textrm{Gamma} (n, 1/e^{\theta})$.
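As a rough sanity check (the density above is the Gumbel density with location $\theta$ and scale $1$, so it is easy to sample; the particular $\theta$, $n$ and number of replications below are arbitrary choices of mine), a quick simulation seems consistent with this:

```python
# Sanity check: Y = exp(-X) ~ Exponential(rate e^theta), so
# T = sum_i Y_i should be Gamma(n, scale = e^{-theta}).
# theta, n and the number of replications are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 1.3, 5, 200_000

# f_X is the Gumbel(location = theta, scale = 1) density, so sample it directly.
X = rng.gumbel(loc=theta, scale=1.0, size=(reps, n))
T = np.exp(-X).sum(axis=1)

# Gamma(n, scale = e^{-theta}) has mean n*e^{-theta} and variance n*e^{-2*theta}.
print("mean of T:", T.mean(), "vs", n * np.exp(-theta))
print("var  of T:", T.var(), "vs", n * np.exp(-2 * theta))
```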

However, I am unable to find a reasonable way to compute $\textbf{E}(\log T)$ or $\textbf{V}(\log T)$. The formula for the expectation of a function of a random variable leads to a very complicated integral and the only alternative I can think of, which is yet another transformation, is even worse!

  • @leonbloy Thanks for your comment! I'll try that, too. It is not what I was looking for, but it is easier to "compute". (2011-09-26)

1 Answer


You correctly determined that $Y \sim \mathrm{Exponential}(\mathrm{e}^\theta)$ and the distribution of $T$, so I will concentrate on computing

$$ \mathbb{E}( \log x ), \qquad x \sim \mathrm{Gamma}\!\left(n, \tfrac{1}{\beta}\right) \text{ with rate } \beta = \mathrm{e}^{\theta}. $$

The easiest way is to compute $\mathbb{E}( x^s )$ and then use $\log(x) = \left. \frac{\mathrm{d}}{\mathrm{d} s} x^s \right\vert_{s=0}$ and $\log^2(x) = \left. \frac{\mathrm{d}^2}{\mathrm{d} s^2} x^s \right\vert_{s=0}$, differentiating under the expectation sign.

$ \mathbb{E}( x^s ) = \int_0^\infty x^s \cdot \frac{\beta^{n}}{\Gamma(n)} x^{n-1} \exp(-x \beta) \mathrm{d} x = \int_0^\infty \frac{\beta^{n}}{\Gamma(n)} x^{n + s-1} \exp(-x \beta) \mathrm{d} x = \beta^{-s} \frac{\Gamma(n+s)}{\Gamma(n)} $
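As a quick numerical cross-check of this formula (a sketch only; the particular values of $n$, $\beta$ and $s$ below are arbitrary):

```python
# Numerical check of E(x^s) = beta^{-s} * Gamma(n+s) / Gamma(n)
# for x ~ Gamma(shape = n, rate = beta); n, beta, s are arbitrary.
import numpy as np
from scipy.integrate import quad
from scipy.special import gammaln

n, beta, s = 3.0, 2.5, 0.7

# Integrand x^s * Gamma(n, rate = beta) density, integrated over (0, inf).
integrand = lambda x: x**s * beta**n / np.exp(gammaln(n)) * x**(n - 1) * np.exp(-x * beta)
numeric, _ = quad(integrand, 0, np.inf)
closed_form = beta**(-s) * np.exp(gammaln(n + s) - gammaln(n))

print(numeric, closed_form)  # should agree to quadrature accuracy
```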

Now it is easy to figure out: differentiating at $s = 0$,

$$ \mathbb{E}(\log T) = \left. \frac{\mathrm{d}}{\mathrm{d} s} \, \beta^{-s} \frac{\Gamma(n+s)}{\Gamma(n)} \right\vert_{s=0} = \psi(n) - \log \beta = -\theta + \psi(n), \qquad \mathbb{V}(\log T) = \psi^{(1)}(n), $$

where $\psi(n)$ is the digamma function, $\psi^{(1)}(n)$ is its first derivative (the trigamma function), and $\beta = \mathrm{e}^{\theta}$.

Added: To make it easier to compute the variance, note that $\mathbb{V}(\log T) = \left. \frac{ \mathrm{d}^2}{ \mathrm{d}s^2 } \mu^{-s} \mathbb{E}(x^s)\right\vert_{s=0}$, where $\log \mu$ is the mean of $\log T$. But $\mu^{-s} \mathbb{E}(x^s) = \frac{\Gamma(n+s)}{\Gamma(n)} \exp(-s \psi(n)) = \exp\!\left( \psi(n) s + \frac{1}{2} \psi^{(1)}(n) s^2 + O(s^3)\right) \exp(-s \psi(n))$. Therefore $ \mu^{-s} \mathbb{E}(x^s) = \exp\!\left( \frac{1}{2} \psi^{(1)}(n) s^2 + O(s^3)\right) = 1 + \frac{1}{2} \psi^{(1)}(n) s^2 + O(s^3)$, and reading off the coefficient of $s^2$ gives $\mathbb{V}(\log T) = \psi^{(1)}(n)$.
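A small simulation, sampling the $X_i$ from the Gumbel density in the question (again with arbitrary choices of $\theta$, $n$ and the replication count), seems to agree with these formulas:

```python
# Check E(log T) = psi(n) - theta and V(log T) = psi'(n) by simulation.
# theta, n and the number of replications are arbitrary.
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(1)
theta, n, reps = 0.7, 4, 200_000

# X ~ Gumbel(location = theta, scale = 1), T = sum_i exp(-X_i).
X = rng.gumbel(loc=theta, scale=1.0, size=(reps, n))
logT = np.log(np.exp(-X).sum(axis=1))

print("E(log T):", logT.mean(), "vs", digamma(n) - theta)
print("V(log T):", logT.var(), "vs", polygamma(1, n))  # trigamma(n)
```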

  • Now I fully understand your work. When I read $\mathbb{E}(\exp(s \log(x)))$ it all made sense. Silly me ;) Thanks again! (2011-09-27)