
Given a random variable $X$, the moment generating function of $X$ is given by: $$\phi(t) = \begin{cases} \ \sum_x e^{tx}p(x) & \text{when } X \text{ is discrete} \\ \ \int e^{tx}f(x) \, dx & \text{when } X \text{ is continuous} \end{cases}$$ In the above, $p(x)$ and $f(x)$ denote the mass and the density functions respectively. The definition says nothing about convergence of the sum or the integral. So, is this just a "formal" definition? In other words, for a given random variable $X$, do we need to check convergence, and justify the interchange of differentiation and expectation, before writing $$\phi^{(n)}(0) = E(X^n)?$$
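To make the convergence issue concrete, here is a small numerical sketch (my own illustration, with the truncation point and step count chosen arbitrarily) using $X \sim \text{Exp}(1)$, whose MGF is $\phi(t) = 1/(1-t)$, finite only for $t < 1$. For $t \ge 1$ the defining integral diverges, which shows up as the truncated integral growing without bound as the upper limit increases; where $\phi$ is finite near $0$, a finite difference recovers $\phi'(0) = E(X) = 1$.

```python
import math

def mgf_exp1(t, upper=50.0, n=200_000):
    """Approximate phi(t) = integral_0^upper of e^{t x} e^{-x} dx
    for X ~ Exp(1), via the trapezoidal rule.  The true MGF is
    1/(1-t), finite only for t < 1; for t >= 1 the truncated
    integral just keeps growing as `upper` increases."""
    f = lambda x: math.exp((t - 1.0) * x)
    h = upper / n
    s = 0.5 * (f(0.0) + f(upper)) + sum(f(i * h) for i in range(1, n))
    return s * h

# Converges for t < 1: phi(0.5) = 1/(1 - 0.5) = 2
print(mgf_exp1(0.5))            # ≈ 2.0

# "Diverges" for t >= 1: the truncated integral grows with the cutoff
print(mgf_exp1(1.5, upper=30))  # already huge
print(mgf_exp1(1.5, upper=50))  # far larger still

# Where phi is finite in a neighbourhood of 0, phi'(0) recovers E[X] = 1
eps = 1e-4
print((mgf_exp1(eps) - mgf_exp1(-eps)) / (2 * eps))  # ≈ 1.0
```

This is only a numerical illustration of why the hypothesis "$\phi$ finite in a neighbourhood of $0$" matters: for the Cauchy distribution, for instance, $\phi(t) = \infty$ for every $t \neq 0$, and no moments can be read off the MGF at all.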

  • The moment generating function is defined for all values of $t$, but may take the value $+\infty$. Alternatively, one can define it only on the set where the expectation $Ee^{tX}$ is finite. For $t=0$ it is always defined and equal to $1$. (2017-01-19)
  • Regarding differentiation: formally, you would need to justify that $\phi$ is $n$ times differentiable at zero. But most of the common distributions satisfy that condition, so often not much attention is paid to it. (2017-01-19)

0 Answers