
Suppose $X_1,\ldots,X_n$ is a random sample from the $\Gamma(k,\lambda)$ distribution where $\lambda$ is unknown and $k$ is a positive integer and known. How can I find $E\left[\frac{\sum_{i=1}^n X_i^2}{(\sum_{i=1}^n X_i)^2}\right] \>?$

  • I think the $\lambda$ will cancel from the numerator and the denominator, so the answer will not depend on $\lambda$. (2012-04-29)

2 Answers


I will assume the $X_i$ are independent.

Using the identity, valid for $a>0$, $ \frac{1}{a^2} = \int_0^\infty t \,\mathrm{e}^{-a t} \,\mathrm{d} t, $ rewrite the expectation as follows:

$ \begin{eqnarray} \mathbb{E}\left( \frac{\sum_{i=1}^n X_i^2}{\left(\sum_{i=1}^n X_i \right)^2} \right) &=& \int_0^\infty t \cdot \mathbb{E}\left(\sum_{i=1}^n X_i^2 \cdot \exp\left(-t \sum_{i=1}^n X_i\right) \right) \mathrm{d} t \\ &=& n \int_0^\infty t \cdot \mathbb{E}\left(X_1^2 \cdot \exp\left(-t \sum_{i=1}^n X_i\right) \right) \mathrm{d} t \\ &=& n \int_0^\infty t \cdot \left(\mathcal{M}_X(-t)\right)^{n-1} \cdot \mathcal{M}^{\prime\prime}_X(-t) \,\mathrm{d} t \\ &=& n \int_0^\infty t \cdot \left(1+\frac{t}{\lambda}\right)^{-k (n-1)} \cdot \frac{k(k+1)}{\lambda^2} \left( 1+ \frac{t}{\lambda}\right)^{-k-2} \mathrm{d} t \\ &\stackrel{t = \lambda u}{=}& n k (k+1) \int_0^\infty \frac{u}{(1+u)^{k n +2}} \,\mathrm{d} u \\ &\stackrel{u = \frac{1-s}{s}}{=}& n k (k+1) \int_0^1 (1-s)\, s^{k n-1} \,\mathrm{d} s = n k (k+1) \left( \frac{1}{n k} - \frac{1}{n k+1} \right) \\ &=& \frac{k+1}{k n +1}, \end{eqnarray} $

where $\mathcal{M}_X(t)$ denotes the moment generating function of the $\Gamma(k,\lambda)$ distribution. The second equality uses the fact that the $X_i$ are identically distributed, and the third uses their independence, which lets the expectation factor into $\mathbb{E}\left(X_1^2 \,\mathrm{e}^{-t X_1}\right) = \mathcal{M}^{\prime\prime}_X(-t)$ times $\mathbb{E}\left(\mathrm{e}^{-t X_j}\right) = \mathcal{M}_X(-t)$ for each of the remaining $n-1$ terms.
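(Editorial aside, not part of the original answer.) If you want a quick sanity check, here is a minimal Monte Carlo sketch in Python with NumPy, with arbitrary parameter values, comparing the sample mean of $\sum_i X_i^2/\left(\sum_i X_i\right)^2$ against the closed form $(k+1)/(kn+1)$; in particular the estimate does not depend on $\lambda$, as the comment under the question predicted.

```python
import numpy as np

rng = np.random.default_rng(0)

def ratio_mean(k, lam, n, reps=200_000):
    """Monte Carlo estimate of E[ sum_i X_i^2 / (sum_i X_i)^2 ]
    for X_1, ..., X_n i.i.d. Gamma(k, rate=lam)."""
    # NumPy's gamma sampler takes shape and *scale*, so scale = 1/lam.
    x = rng.gamma(shape=k, scale=1.0 / lam, size=(reps, n))
    return np.mean((x ** 2).sum(axis=1) / x.sum(axis=1) ** 2)

k, lam, n = 3, 2.5, 5
print(ratio_mean(k, lam, n))      # Monte Carlo estimate, approximately 0.25
print((k + 1) / (k * n + 1))      # closed form (k+1)/(kn+1) = 0.25
```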

  • @Didier Thanks for the feedback and upvote. I have expanded the evaluation of the integral per your suggestion. It's better this way. (2012-04-29)

Sasha gives a very nice answer. Here is an alternative using standard relationships between distributions. I, too, will assume that the $X_i$ are independent.

Write $\newcommand{\Beta}{\mathrm{B}}Y_i = \frac{X_i}{X_i + \sum_{j\neq i} X_j}$.

Then $Y_i \sim \mathrm{Beta}(k,(n-1)k)$, since $X_i \sim \Gamma(k,\lambda)$ and $\sum_{j \neq i} X_j \sim \Gamma((n-1)k,\lambda)$, and these two random variables are independent.
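(Another editorial aside rather than part of the answer.) The Gamma-to-Beta relationship used here can be checked numerically; the sketch below, with arbitrary parameter choices, compares the empirical distribution of $Y_1 = X_1/\sum_{j=1}^n X_j$ against $\mathrm{Beta}(k,(n-1)k)$ via a Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, lam, n, reps = 3, 2.0, 5, 100_000

# X_1, ..., X_n i.i.d. Gamma(k, rate=lam); NumPy's sampler uses shape/scale, so scale = 1/lam.
x = rng.gamma(shape=k, scale=1.0 / lam, size=(reps, n))
y = x[:, 0] / x.sum(axis=1)   # Y_1 = X_1 / (X_1 + sum over j != 1 of X_j)

# Y_1 should follow Beta(k, (n-1)k); note that lambda drops out entirely.
print(stats.kstest(y, stats.beta(k, (n - 1) * k).cdf))
```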

Since $\frac{\sum_{i=1}^n X_i^2}{(\sum_{i=1}^n X_i)^2} = \sum_{i=1}^n Y_i^2$, linearity of expectation and the fact that the $Y_i$ share a common distribution give $ \mathbb E\left(\frac{\sum_{i=1}^n X_i^2}{(\sum_{i=1}^n X_i)^2}\right) = n \mathbb E Y_1^2 \>. $

But, $ \mathbb E Y_1^2 = \frac{1}{\Beta(k,(n-1)k)} \int_0^1 y^2 y^{k-1} (1-y)^{nk-k-1} \,\mathrm dy = \frac{\Beta(k+2,(n-1)k)}{\Beta(k,(n-1)k)} = \frac{k+1}{n(nk + 1)} \>. $

So, $ \mathbb E\left(\frac{\sum_{i=1}^n X_i^2}{(\sum_{i=1}^n X_i)^2}\right) = \frac{k+1}{nk + 1} \>. $
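(Final editorial check, not part of the answer.) The Beta-moment route can also be evaluated with `scipy.special.beta` and compared against the closed form for a few $(k, n)$ pairs:

```python
from scipy.special import beta as B   # Euler Beta function B(a, b)

def via_beta_moments(k, n):
    """n * E[Y_1^2] with Y_1 ~ Beta(k, (n-1)k), computed as n * B(k+2, (n-1)k) / B(k, (n-1)k)."""
    return n * B(k + 2, (n - 1) * k) / B(k, (n - 1) * k)

def closed_form(k, n):
    return (k + 1) / (n * k + 1)

for k, n in [(1, 2), (3, 5), (7, 10)]:
    print((k, n), via_beta_moments(k, n), closed_form(k, n))
```

Both columns agree, e.g. $(k,n)=(3,5)$ gives $0.25$ either way.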

  • Wow! No less than **two** good solutions to the same question... What is going on here? :-) (2012-04-29)