
According to Wikipedia, the Beta distribution is related to the Gamma distribution by the following relation:

$\lim_{n\to\infty}n B(k, n) = \Gamma(k, 1)$

Can you point me to a derivation of this fact? Can it be generalized? For example, is there a similar relation that yields something other than the constant $1$ as the Gamma's second parameter? What if we take

$\lim_{\substack{n\to\infty,\ m\to\infty \\ n=mb}} n\,\mathrm B(k, m)$

That is, the two variables go to infinity while maintaining a constant ratio $b$.
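Here is a quick simulation (in Python with NumPy/SciPy; the value of $k$ is arbitrary, and this is only a sanity check, not a proof) that seems consistent with the quoted relation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
k, n, N = 3.0, 10_000, 100_000   # arbitrary shape k, large n, sample size

# n * X_n with X_n ~ Beta(k, n) should be close to Gamma(k, 1) for large n
Yn = n * rng.beta(k, n, size=N)
print(stats.kstest(Yn, "gamma", args=(k,)))   # KS statistic should be small
```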

The reason I'm asking is that I'm trying to figure out how to simplify a hierarchical Bayesian model involving the Beta distribution.

(This is my first post; sorry for the math notation, the MathJax syntax was too daunting, but I'll try to learn.)

  • Robert, I think you are talking about the Beta and Gamma functions, whereas my question concerns the Beta and Gamma _distributions_. (2012-09-06)

3 Answers


There is another way to view the relationship between the Gamma distribution and the Beta distribution, through the Dirichlet distribution. This post (https://math.stackexchange.com/q/190695) explains exactly how they are related without the Dirichlet distribution, but here is a slightly broader view:

Let $Z_1, Z_2, \ldots, Z_n$ be independent random variables such that $Z_i \sim \text{Gamma}(\alpha_i, 1)$, $i = 1, \ldots, n$, where the $\alpha_i > 0$ are the shape parameters. If $Y_j$ is defined as $Y_j = Z_j/\sum_{i=1}^n Z_i$, $j = 1, \ldots, n$, then $(Y_1, \ldots, Y_n)$ is Dirichlet distributed with parameter $(\alpha_1, \ldots, \alpha_n)$. When $n = 2$, the Dirichlet distribution reduces to the Beta distribution, denoted by $\text{Beta}(\alpha_1, \alpha_2)$.
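As a quick empirical check (a minimal sketch in Python with NumPy/SciPy; the shape parameters below are arbitrary), sampling independent Gammas and normalizing should reproduce the Beta marginal:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = np.array([2.0, 5.0])           # arbitrary shapes (alpha_1, alpha_2)
N = 100_000

# Z_i ~ Gamma(alpha_i, 1), independent; Y_1 = Z_1 / (Z_1 + Z_2)
Z = rng.gamma(shape=alpha, scale=1.0, size=(N, 2))
Y1 = Z[:, 0] / Z.sum(axis=1)

# For n = 2 the Dirichlet reduces to a Beta, so Y_1 ~ Beta(alpha_1, alpha_2)
print(stats.kstest(Y1, "beta", args=(alpha[0], alpha[1])))
```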

Here is a link to Ferguson's paper that mentions the above relationship: http://www.cis.upenn.edu/~taskar/courses/cis700-sp08/papers/ferguson.pdf


This concerns the relationship between the Gamma and Beta distributions, as opposed to the Gamma and Beta functions. Let $X \sim \mbox{Gamma}(\alpha, 1)$ and $Y \sim \mbox{Gamma}(\beta, 1)$, where the parametrization is such that $\alpha$ is the shape parameter. Then
$ \frac{X}{X + Y} \sim \mbox{Beta}(\alpha, \beta). $

To prove this, write the joint pdf
$$f_{X, Y}(x, y) = \frac{1}{\Gamma(\alpha) \Gamma(\beta)} x^{\alpha - 1} y^{\beta - 1} e^{-(x + y)}$$
(on $\mathbb R^2_+$) and make the transformation $U = \frac{X}{X + Y}$, $V = X + Y$. The inverse transformation $X = UV$, $Y = V(1 - U)$ has Jacobian equal to $V$, so the joint distribution of $U$ and $V$ has pdf
$$\frac{v}{\Gamma(\alpha)\Gamma(\beta)} (vu)^{\alpha - 1} (v(1 - u))^{\beta - 1} e^{-v} = \frac{1}{\Gamma(\alpha)\Gamma(\beta)} v^{\alpha + \beta - 1} e^{-v}\, u^{\alpha - 1} (1 - u)^{\beta - 1}$$
(on $\mathbb R_+ \times [0, 1]$). Hence $U$ and $V$ are independent (because the pdf factors over $u$ and $v$), with $V \sim \mbox{Gamma}(\alpha + \beta, 1)$ and $U \sim \mbox{Beta}(\alpha, \beta)$, which is apparent from the factors $v^{\alpha + \beta - 1} e^{-v}$ and $u^{\alpha - 1}(1 - u)^{\beta - 1}$ respectively.
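A Monte Carlo check of both conclusions (a sketch in Python; the shapes $\alpha = 2.5$ and $\beta = 4$ are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a, b = 2.5, 4.0                        # arbitrary alpha, beta
N = 100_000

X = rng.gamma(shape=a, scale=1.0, size=N)
Y = rng.gamma(shape=b, scale=1.0, size=N)
U = X / (X + Y)                        # should be Beta(a, b)
V = X + Y                              # should be Gamma(a + b, 1)

print(stats.kstest(U, "beta", args=(a, b)))
print(stats.kstest(V, "gamma", args=(a + b,)))
print(np.corrcoef(U, V)[0, 1])         # near 0, consistent with independence
```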


Fix some $k$ and, for every $n$, let $X_n$ denote a random variable with beta distribution $\mathrm B(k,n)$ and $Y_n = nX_n$. Then, for every $s \geqslant 0$, $\mathrm E(Y_n^s) = n^s \mathrm E(X_n^s)$, and one knows the value of $\mathrm E(X_n^s)$, hence
$$\mathrm E(Y_n^s) = n^s\frac{\mathrm B(k+s,n)}{\mathrm B(k,n)} = n^s\frac{\Gamma(k+s)\Gamma(k+n)}{\Gamma(k+s+n)\Gamma(k)} \longrightarrow \frac{\Gamma(k+s)}{\Gamma(k)},$$
using the asymptotics $\Gamma(k+n)/\Gamma(k+s+n) \sim n^{-s}$ as $n \to \infty$. The limit is $\mathrm E(Z^s)$ for any random variable $Z$ with gamma distribution $\Gamma(k, 1)$; since the gamma distribution is determined by its moments, $Y_n \to Z$ in distribution.
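One can watch the moments converge numerically (a sketch using SciPy's Beta and Gamma special functions; the values of $k$ and $s$ are arbitrary):

```python
from scipy.special import beta, gamma

k, s = 3.0, 1.5                        # arbitrary shape k and moment order s
limit = gamma(k + s) / gamma(k)        # E(Z^s) for Z ~ Gamma(k, 1)

for n in (10, 100, 1_000, 10_000):
    moment = n**s * beta(k + s, n) / beta(k, n)   # E(Y_n^s)
    print(n, moment, limit)
```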

Let $X'_n$ denote a random variable with beta distribution $\mathrm B(k, n/b)$, and $Y'_n = nX'_n$. Then $X'_n$ is distributed like $X_{n/b}$, hence $Y'_n$ is distributed like $bY_{n/b}$, and $Y'_n \to bZ$ in distribution; the limit $bZ$ has a gamma distribution with shape $k$ and scale $b$.
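The same kind of simulation illustrates the scaled limit (a sketch; $k$, $b$, and $n$ below are arbitrary): $nX'_n$ with $X'_n \sim \mathrm B(k, n/b)$ should be close to a Gamma with shape $k$ and scale $b$ for large $n$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k, b = 3.0, 2.0                        # arbitrary shape k and ratio b
n, N = 10_000, 100_000                 # large n; here m = n / b

Yp = n * rng.beta(k, n / b, size=N)    # Y'_n = n * X'_n, X'_n ~ B(k, n/b)

# Compare with b * Z, Z ~ Gamma(k, 1), i.e. Gamma with shape k and scale b
print(stats.kstest(Yp, "gamma", args=(k, 0, b)))   # KS statistic should be small
```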

  • @Sten See? These MSE mods are amazing, aren't they? :-) (2012-09-06)