24

If $X\sim \mathrm{Gamma}(a_1,b)$ and $Y \sim \mathrm{Gamma}(a_2,b)$ are independent, I need to prove that $X+Y\sim\mathrm{Gamma}(a_1+a_2,b)$.

I am trying to apply the convolution formula for independent random variables and to multiply the two Gamma densities, but I am stuck.

  • 0
    Hint: After multiplying $f_X(x)$ and $f_Y(z-x)$ and making sure that the limits are correct, you will get an integral for $f_{X+Y}(z)$ that can be transformed into a _Beta_ function, whose value is $B(a_1,a_2) = \frac{\Gamma(a_1)\Gamma(a_2)}{\Gamma(a_1+a_2)}$. 2012-12-03

3 Answers

23

Now that the homework deadline is presumably long past, here is a proof for the case of $b=1$, adapted from an answer of mine on stats.SE, which fleshes out the details of what I said in a comment on the question.

If $X$ and $Y$ are independent continuous random variables, then the probability density function of $Z=X+Y$ is given by the convolution of the probability density functions $f_X(x)$ and $f_Y(y)$ of $X$ and $Y$ respectively:
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x)\,\mathrm dx.$$
But when $X$ and $Y$ are nonnegative random variables, $f_X(x) = 0$ when $x < 0$, and for a positive number $z$, $f_Y(z-x) = 0$ when $x > z$. Consequently, for $z > 0$, the above integral simplifies to
$$\begin{align}
f_{X+Y}(z) &= \int_0^z f_X(x)f_Y(z-x)\,\mathrm dx\\
&=\int_0^z \frac{x^{a_1-1}e^{-x}}{\Gamma(a_1)}\cdot\frac{(z-x)^{a_2-1}e^{-(z-x)}}{\Gamma(a_2)}\,\mathrm dx\\
&= e^{-z}\int_0^z \frac{x^{a_1-1}(z-x)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dx &\scriptstyle{\text{now substitute}~ x = zt~ \text{and think}}\\
&= e^{-z}z^{a_1+a_2-1}\int_0^1 \frac{t^{a_1-1}(1-t)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dt &\scriptstyle{\text{of Beta}(a_1,a_2)~\text{random variables}}\\
&= \frac{e^{-z}z^{a_1+a_2-1}}{\Gamma(a_1+a_2)},
\end{align}$$
where the last step uses $\int_0^1 t^{a_1-1}(1-t)^{a_2-1}\,\mathrm dt = B(a_1,a_2) = \frac{\Gamma(a_1)\Gamma(a_2)}{\Gamma(a_1+a_2)}$. The right-hand side is exactly the $\mathrm{Gamma}(a_1+a_2,1)$ density, so $X+Y\sim\mathrm{Gamma}(a_1+a_2,1)$. The general case follows by rescaling: $X\sim\mathrm{Gamma}(a,b)$ if and only if $bX\sim\mathrm{Gamma}(a,1)$, so applying the $b=1$ result to $bX$ and $bY$ gives $X+Y\sim\mathrm{Gamma}(a_1+a_2,b)$.
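Incidentally, the convolution identity is easy to sanity-check numerically. Here is a minimal sketch (my addition, not part of the original answer, with arbitrary shape parameters) assuming NumPy and SciPy are available; note that `scipy.stats.gamma` takes a shape parameter and a `scale` equal to $1/b$, so the rate $b=1$ case uses the default `scale=1`.

```python
# Numerical check: the convolution of Gamma(a1, 1) and Gamma(a2, 1)
# densities equals the Gamma(a1 + a2, 1) density at a test point z.
import numpy as np
from scipy import stats
from scipy.integrate import quad

a1, a2 = 2.5, 1.7   # arbitrary shape parameters (assumption for the demo)
z = 3.0             # point at which to evaluate the density

# Left side: the convolution integral  f_{X+Y}(z) = ∫_0^z f_X(x) f_Y(z-x) dx
conv, _ = quad(lambda x: stats.gamma.pdf(x, a1) * stats.gamma.pdf(z - x, a2),
               0.0, z)

# Right side: the Gamma(a1 + a2, 1) density evaluated directly at z
direct = stats.gamma.pdf(z, a1 + a2)

print(conv, direct)  # the two values agree to numerical precision
```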

  • 4
    @A_for_Abacus I gave a _hint_ (actually a complete sketch of the answer) about the homework problem 12 minutes after the question was posted by the OP. So, there was no cruelty involved. 2018-03-12
15

You may use an easier method: the moment generating function (or, for discrete variables, the probability generating function). Since $X$ and $Y$ are independent, $$E\left(e^{(X+Y)t}\right)=E\left(e^{Xt}e^{Yt}\right)=E\left(e^{Xt}\right)E\left(e^{Yt}\right),$$ and the right-hand side is again the moment generating function of a Gamma distribution. You can then also read off the mean and variance from the moment generating function.
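To make the conclusion of this argument concrete, here is a small simulation sketch (mine, not the answerer's; the parameter values are arbitrary, and SciPy is assumed) that compares $X+Y$ against the claimed $\mathrm{Gamma}(a_1+a_2,b)$ distribution with a Kolmogorov–Smirnov test:

```python
# Simulation check: X + Y should be indistinguishable from Gamma(a1 + a2, b).
# NumPy/SciPy parameterize Gamma by shape and scale, where scale = 1/rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a1, a2, b = 2.0, 3.5, 1.5   # arbitrary shapes and rate (assumption)
n = 100_000

x = rng.gamma(shape=a1, scale=1/b, size=n)
y = rng.gamma(shape=a2, scale=1/b, size=n)

# KS test of the sample X + Y against the Gamma(a1 + a2, b) CDF;
# a large p-value is consistent with the claimed distribution.
print(stats.kstest(x + y, stats.gamma(a1 + a2, scale=1/b).cdf))
```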

10

It's easier to use moment generating functions to prove this. For $t < \beta$,
$$M(t;\alpha,\beta) = Ee^{tX} = \int_{0}^{+\infty} e^{tx} f(x;\alpha,\beta)\,\mathrm dx = \int_{0}^{+\infty} e^{tx} \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1}e^{-\beta x}\,\mathrm dx = \frac{\beta^\alpha}{\Gamma(\alpha)} \int_{0}^{+\infty} x^{\alpha-1}e^{-(\beta - t) x}\,\mathrm dx = \frac{\beta^\alpha}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\beta - t)^\alpha} = \frac{1}{\left(1- \frac{t}{\beta}\right)^\alpha}.$$
By the independence of $X$ and $Y$, we know $M_{X+Y}(t) = M_{X}(t)M_{Y}(t)$. So if $X \sim \mathrm{Gamma}(\alpha_1,\beta)$ and $Y \sim \mathrm{Gamma}(\alpha_2,\beta)$, then
$$M_{X+Y}(t) = \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_1}} \cdot \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_2}} = \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_1 + \alpha_2}}.$$
The MGF of the sum is still of Gamma form, and since the MGF determines the distribution, we conclude $X + Y \sim \mathrm{Gamma}(\alpha_1 + \alpha_2, \beta)$.
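As a numerical companion to this derivation (my addition; the parameter values are arbitrary), one can check the closed-form MGF against a Monte Carlo estimate of $Ee^{tX}$ for some $t < \beta$:

```python
# Monte Carlo check of the Gamma MGF: for t < beta, the sample mean of
# exp(t * X) should approach (1 - t/beta)^(-alpha).
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, t = 2.0, 3.0, 1.0   # arbitrary; requires t < beta for the MGF
x = rng.gamma(shape=alpha, scale=1/beta, size=1_000_000)

mc = np.exp(t * x).mean()                # Monte Carlo estimate of E[e^{tX}]
closed = (1 - t/beta) ** (-alpha)        # closed-form MGF derived above
print(mc, closed)                        # the two values should be close
```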