
My self-study in measure and probability theory has finally brought me to the subject of characteristic functions, and I have not handled these with any rigor in the past, so all of this is somewhat daunting.

Problem: find the characteristic function explicitly in each of the following situations:

$X \sim \text{Poisson}(\lambda)$

$Y \sim \text{Gamma}(\alpha,\beta)$

$Z = U_1+U_2+\dots+U_n$, where the $U_j$ are independent and each $U_j \sim \text{Uniform}(-1,1)$. (Additionally, for $n \ge 2$, can this example be used in Lévy's inversion formula to obtain a completely real-valued integral?)


Work/attempts at solution

For $X$:

$\phi_X(t)=E[e^{itX}]=\sum_{x=0}^{\infty}e^{itx}\,e^{-\lambda}\frac{\lambda^x}{x!}=e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^{it})^x}{x!}=e^{-\lambda}e^{\lambda e^{it}}=e^{\lambda(e^{it}-1)}$

I'm not completely sure about this, but it seems reasonable; a quick numerical sanity check is sketched below.
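As a sanity check (a minimal sketch; the use of numpy, the parameter values, and the sample size are my own choices, not part of the problem), one can compare the empirical value of $E[e^{itX}]$ over Poisson samples against the closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.5, 1.3  # arbitrary test values

# Empirical characteristic function: average of e^{itX} over samples
samples = rng.poisson(lam, size=200_000)
empirical = np.exp(1j * t * samples).mean()

# Closed form: exp(lambda * (e^{it} - 1))
closed_form = np.exp(lam * (np.exp(1j * t) - 1))

print(empirical)
print(closed_form)  # the two should agree to a few decimal places
```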

I am pretty stuck on the gamma distribution and largely clueless about the sum of uniform RVs; my partial setup for both is below.
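For $Y$, assuming the rate parametrization with density $f_Y(y)=\frac{\beta^\alpha}{\Gamma(\alpha)}y^{\alpha-1}e^{-\beta y}$ for $y>0$ (the problem does not say which convention is intended), the setup would be

$\phi_Y(t)=\int_0^{\infty}e^{ity}\,\frac{\beta^\alpha}{\Gamma(\alpha)}y^{\alpha-1}e^{-\beta y}\,dy=\frac{\beta^\alpha}{\Gamma(\alpha)}\int_0^{\infty}y^{\alpha-1}e^{-(\beta-it)y}\,dy,$

and if the usual gamma integral $\int_0^{\infty}y^{\alpha-1}e^{-sy}\,dy=\Gamma(\alpha)s^{-\alpha}$ can be extended to the complex value $s=\beta-it$ (which I suspect needs a contour argument to make rigorous), this would give $\phi_Y(t)=\left(\frac{\beta}{\beta-it}\right)^{\alpha}=\left(1-\frac{it}{\beta}\right)^{-\alpha}$.

For $Z$, a single summand gives

$\phi_{U_j}(t)=\frac12\int_{-1}^{1}e^{itu}\,du=\frac{e^{it}-e^{-it}}{2it}=\frac{\sin t}{t},$

so if independence lets me write the characteristic function of the sum as the product of the individual ones, then $\phi_Z(t)=\left(\frac{\sin t}{t}\right)^{n}$, which is already real-valued; presumably that is what makes the Lévy inversion integral real for $n\ge 2$.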

  • Wikipedia is nice to check your answer. If you search for any distribution, say the Poisson distribution, there is a table on the right with a summary of facts such as the characteristic function (CF). Example: http://en.wikipedia.org/wiki/Poisson_distribution
  • For the sum of uniform random variables, do you know of a property of characteristic functions that will make your life a lot easier? In other words, if $X_1$ and $X_2$ are independent, can you write the characteristic function of $X_1+X_2$ in terms of the individual characteristic functions?
  • @Sam, do you mean $\phi_{U_1+U_2+\dots}(t)=E(e^{it\sum_j U_j})$?
