4

I am trying to calculate the factorial moment of the geometric distribution #2 with parameter $p$. Therefore I set $\Omega = \mathbb{N}_0$ and, using the Pochhammer symbol and setting $q=1-p$, I have

$$E((k)_l)= \sum _{k=0}^{\infty } (k)_l\, p q^k = p^{-l} q^l \cdot l! \sum _{k=0}^{\infty } \left(\frac{(k+l-1)!}{(k-1)! \cdot l!}\cdot p^{l+1} q^{k-1}\right) $$

Now Mathematica tells me that $\sum _{k=0}^{\infty } \left(\frac{(k+l-1)!}{(k-1)! \cdot l!}\cdot p^{l+1} q^{k-1}\right)=1$, but I cannot see why this identity is true. Also, when using

FactorialMoment[GeometricDistribution[p], l]

Mathematica suggests that $E((k)_l)=\left(\frac{q}{p}\right)^l l!$. Thanks in advance for your help.
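
For reference, here is a quick numerical sanity check that the built-in, the closed form $\left(\frac{q}{p}\right)^l l!$, and the direct sum agree (this assumes a Mathematica version that has FactorialMoment; the values $p = 1/3$ and $l = 4$ are just picked for illustration):

With[{p = 1/3, l = 4},
 {FactorialMoment[GeometricDistribution[p], l],
  l! ((1 - p)/p)^l,
  Sum[Product[k - j, {j, 0, l - 1}] p (1 - p)^k, {k, 0, Infinity}]}]
(* Product[k - j, {j, 0, l - 1}] is the falling factorial (k)_l; all three expressions give 384 *)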

2 Answers

5

Let $X$ have a geometric distribution, where $X$ is the number of failures before the first success.

The easiest approach to the factorial moments in this case is to find the factorial moment generating function, which is $$E(t^X).$$ Suppose the probability of success is $p$. We want $$\sum_{n=0}^\infty pq^n t^n,$$ where, as usual, $q=1-p$. So we want $$\sum_{n=0}^\infty p(qt)^n.$$ Sum this infinite geometric series. We get $$\frac{p}{1-qt}.$$ To find the $k$-th factorial moment, find the $k$-th derivative of the factorial moment generating function (with respect to $t$) at $t=1$. In our particular case, finding the $k$-th derivative is easy.
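
For the record, here is how that derivative works out: $$\frac{d^k}{dt^k}\,\frac{p}{1-qt}=\frac{p\,k!\,q^k}{(1-qt)^{k+1}},$$ and setting $t=1$ (where $1-qt=p$) gives $$\frac{p\,k!\,q^k}{p^{k+1}}=k!\left(\frac{q}{p}\right)^k,$$ which is exactly the expression Mathematica returned.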

If by geometric distribution you mean the total number of trials until the first success (so the values are $1$, $2$, and so on), a small modification of the above calculation will give the answer.
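
For completeness, a sketch of that modification: with $X$ the number of trials, $$E(t^X)=\sum_{n=1}^\infty pq^{n-1}t^n=\frac{pt}{1-qt},$$ and differentiating $k$ times and setting $t=1$ gives $$E((X)_k)=\frac{k!\,q^{k-1}}{p^k}\qquad(k\ge 1),$$ which for $k=1$ reduces to the familiar mean $1/p$.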

Addendum: The easiest way to find the sum $$\sum_{k=1}^\infty (k)(k-1)\cdots(k-\ell+1)x^k$$ that was asked about is to express this as $$x^{\ell}\sum_{k=1}^\infty (k)(k-1)\cdots(k-\ell+1)x^{k-\ell}$$ and observe that $$(k)(k-1)\cdots(k-\ell+1)x^{k-\ell}$$ is the $\ell$-th derivative of $x^k$. So the remaining sum is the $\ell$-th derivative of $1+x+x^2+ x^3+\cdots$, that is, of $1/(1-x)$.
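
Carrying this out: the $\ell$-th derivative of $1/(1-x)$ is $\ell!/(1-x)^{\ell+1}$, so the original sum is $$x^{\ell}\cdot\frac{\ell!}{(1-x)^{\ell+1}}.$$ Putting $x=q$ and restoring the factor $p$ from the probabilities gives $$\frac{p\,\ell!\,q^{\ell}}{p^{\ell+1}}=\ell!\left(\frac{q}{p}\right)^{\ell},$$ in agreement with Mathematica's answer.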

  • 0
    So the approach with using the sum doesn't yield any results in this case? I find it non-intuitive but beautiful (:p) that the $k$-th derivative (with respect to which variable?) is already my solution; do you have a reference for this? 2011-05-10
  • 1
    @user3123: I will give the usual sort of reference, namely http://en.wikipedia.org/wiki/Factorial_moment_generating_function. Most probability books have similar information, with calculations. Your approach will work fine too; the summation is not hard, but it does take some work (it helps to have done similar things before). The fact that the factorial moment generating function "works" can be (sort of) shown by differentiating the expression for $E(t^X)$ using the usual rules, crossing one's fingers a little. 2011-05-10
  • 0
    OK, I did this and now got the same result as Mathematica. Thanks! 2011-05-10
4

The factorial moments of an integer-valued random variable $X$ are linked to the successive derivatives of the generating function $g_X$ of $X$, defined by $$ g_X(s)=E(s^X)=\sum_{n=0}^{+\infty}P(X=n)s^n. $$ For every $k\ge0$, the $k$-th derivative is $$ g_X^{(k)}(s)=E((X)_k s^{X-k}), $$ hence the value at $s=1$ yields the factorial moment.

Now, what is $g_X$ for $X$ geometric?
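
(For anyone who wants to check the end result numerically, here is a small Mathematica sketch; it assumes the failures-before-first-success convention of GeometricDistribution[p], and the definition g below is just the generating function above written out explicitly.)

g[s_] := p/(1 - (1 - p) s)  (* generating function of GeometricDistribution[p], with q = 1 - p *)
Table[Simplify[(D[g[s], {s, k}] /. s -> 1) == k! ((1 - p)/p)^k], {k, 0, 5}]
(* returns {True, True, True, True, True, True} *)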

  • 0
    Ah, I see, this is what I need. Thanks, but how did you get the formula for the derivative? 2011-05-10
  • 1
    If you differentiate the series term by term $k$ times, the $s^n$ term becomes $P(X=n)(n)_k s^{n-k}$. The sum of these is by definition the expectation of $(X)_k s^{X-k}$. 2011-05-10
  • 0
    Yes, I got the result now, thanks! 2011-05-10