13

Suppose $X$ is a random variable; when do there exist two random variables $X'$, $X''$, independent and identically distributed, such that $X = X' + X''$?

My natural impulse here is to use Bochner's theorem, but that did not seem to lead anywhere. Specifically, the characteristic function of $X$, which I will call $\phi(t)$, must have the property that we can find a square root of it (i.e., some $\psi(t)$ with $\psi^2 = \phi$) which is positive definite. This is as far as I got, and it's pretty unenlightening: I am not sure when this can and cannot be done. I am hoping there is a better answer that allows one to answer this question merely by glancing at the distribution of $X$.

  • 1
    At the risk of stating a triviality: if $X$ has an infinitely divisible distribution, then it does admit such a decomposition, but the class you seek to characterize is definitely wider. Skimming through Lukacs' book, he says that if $\phi(t)$ is analytic in a strip $-\alpha < \operatorname{Im}(t) < \beta$ and does not vanish within it, it can be decomposed into a product of two characteristic functions also analytic in that strip, as far as I understood.2011-08-19

5 Answers

7

I think your characteristic function approach is the reasonable one. You take the square root of the characteristic function (the one that is $+1$ at 0), take its Fourier transform, and check that this is a positive measure. In the case where $X$ takes only finitely many values, the characteristic function is a finite sum of exponentials with positive coefficients, and the criterion becomes quite manageable: you need the square root to be a finite sum of exponentials with positive coefficients. More general cases can be more difficult.
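For the finite-support case, the criterion can be checked mechanically: on {0, ..., 2m}, a summand must live on {0, ..., m}, so one equates coefficients in the convolution $q * q = p$ and checks nonnegativity. A small sketch in Python (the function name and examples are my own, not from the answer):

```python
import numpy as np

def iid_square_root(p, tol=1e-12):
    """Try to write the pmf p on {0, ..., 2m} as q * q (convolution)
    with q a pmf on {0, ..., m}; return q, or None if impossible.

    On the Fourier side this is the 'square root with positive
    coefficients' criterion for a finite sum of exponentials."""
    p = np.asarray(p, dtype=float)
    if len(p) % 2 == 0 or p[0] <= 0:
        return None                       # need odd length 2m+1 and p[0] > 0
    m = (len(p) - 1) // 2
    q = np.zeros(m + 1)
    q[0] = np.sqrt(p[0])
    for k in range(1, m + 1):             # solve conv(q, q) = p coefficient by coefficient
        s = sum(q[j] * q[k - j] for j in range(1, k))
        q[k] = (p[k] - s) / (2 * q[0])
    # the recursion only used the first m+1 equations; verify the rest too
    if np.any(q < -tol) or not np.allclose(np.convolve(q, q), p):
        return None
    return q

# Binomial(2, 0.3) decomposes as two i.i.d. Bernoulli(0.3):
print(iid_square_root([0.49, 0.42, 0.09]))   # approximately [0.7, 0.3]
# The uniform distribution on {0, 1, 2} is indecomposable:
print(iid_square_root([1/3, 1/3, 1/3]))      # None
```

The uniform case fails because $p_1^2 = 4 p_0 p_2$ would be needed for the probability generating function to be a perfect square, and $(1/3)^2 \neq 4/9$.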

  • 0
    And couldn't $X$ be expressed as an independent sum, but not where the two summands have the same distribution?2011-08-20
6

This Wikipedia article addresses precisely this question, with some nice examples, but doesn't actually answer the question: http://en.wikipedia.org/wiki/Indecomposable_distribution

There is a book titled Characteristic Functions, by Eugene Lukacs, which addresses many questions of this kind.

If you want to address only infinite divisibility, rather than the (possibly more complicated) questions of finite divisibility, then one necessary condition for infinite divisibility is that all of the even-order cumulants must be non-negative. PS: That all of the even-order cumulants must be non-negative is true of compound Poisson distributions, but now I think probably not of infinitely divisible distributions in general.
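To illustrate the necessary condition on a concrete non-example: the uniform distribution on $[0,1]$ has fourth cumulant $\kappa_4 = \mu_4 - 3\mu_2^2 = 1/80 - 3/144 = -1/120 < 0$, so at least it cannot be compound Poisson. A minimal check (my own illustration, using the standard central moments of the uniform distribution):

```python
# Central moments of Uniform(0, 1): mu2 = 1/12, mu4 = 1/80.
mu2, mu4 = 1 / 12, 1 / 80
kappa4 = mu4 - 3 * mu2**2    # fourth cumulant in terms of central moments
print(kappa4)                # -1/120 = -0.008333...: negative, so the test rules this law out
```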

  • 0
    If it does _not_ have finite moments beyond some finite order, nonetheless I think all of its even-order cumulants that do exist will be nonnegative. In which case, the statement still stands.2011-08-20
3

You could try cumulants, which are additive with respect to addition of independent random variables.

Say $Y = X_1 + X_2$, where the $X_i$ are i.i.d.

Thus, if you are given the distribution of $X$, you compute its cumulants, $\kappa^{[x]}_n$, and you have $\kappa^{[y]}_n = 2 \kappa^{[x]}_n$. From this you can (formally, not necessarily easily) recover the distribution of $Y$. (Conversely, to decompose a given sum, halve each of its cumulants and try to invert.)

Of course, this can only be applied when $X$ has finite moments of all orders. And even then, it does not guarantee that the new cumulants are "valid" (i.e., that they correspond to a valid probability distribution). I guess this could be worked out, though.
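As a sanity check of the halving idea in a case where it works: every cumulant of a Poisson($\lambda$) variable equals $\lambda$, so halving all cumulants gives Poisson($\lambda/2$), and convolving two copies of Poisson($\lambda/2$) should recover Poisson($\lambda$). A quick numerical verification (my own sketch):

```python
import numpy as np
from math import exp, factorial

def poisson_pmf(lam, kmax):
    return np.array([exp(-lam) * lam**k / factorial(k) for k in range(kmax + 1)])

lam = 3.0
# Every cumulant of Poisson(lam) equals lam, so halving all cumulants
# gives Poisson(lam/2); two independent copies should sum back to Poisson(lam).
half = poisson_pmf(lam / 2, 40)
recovered = np.convolve(half, half)[:41]
assert np.allclose(recovered, poisson_pmf(lam, 40))
```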

  • 0
    I thought along the same lines, but the question is to characterize when it is possible, not to find such a distribution. Besides, proving that the cumulants so obtained correspond to a valid distribution will reduce to Bochner's theorem again.2011-08-19
1

@Sasha wrote above: "Is there a reference, or an argument, to demonstrate your claim about the necessary condition for infinite divisibility in terms of non-negativity of even-order cumulants?"

So I've taken a little bit of time to remind myself of this argument for a special case. Let $\{N_t : t \geq 0\}$ be a Poisson process, i.e. for each real $0 < t_1 < t_2 < \cdots < t_n$, the random variables $N_{t_1},\ N_{t_2}-N_{t_1}, \ldots,N_{t_n} - N_{t_{n-1}}$ are independent and have Poisson distributions with expected values proportional to (and we may as well assume equal to) the lengths of the intervals $(0,t_1), (t_1,t_2),\dots$.

Now let $Y_t = \sum\limits_{n=1}^{N_t} X_n$ where $X_1,X_2,\dots$ are an i.i.d. sequence of real random variables. This is called a "compound Poisson process".

I claim every even-order cumulant of $Y_t$ is equal to $t$ times the corresponding even-order moment of $X_1$.

To see this, first recall some properties of cumulants: There's such a thing as the joint cumulant of any finite number of random variables: $\operatorname{cum}(A,B,C,\dots)$. If these are just $n$ copies of $A$ then $\operatorname{cum}(A,\dots,A)$ is what is called the $n$th cumulant $\operatorname{cum}_n(A)$ of $A$. The moment $E(ABCD)$ is a function of the cumulants that comes from enumerating the partitions of the set $\{A,B,C,D\}$ (similarly for sets of sizes other than 4, which is a convenient size to use here), thus: $ \begin{align} E(ABCD) & = \operatorname{cum}(A,B,C,D) \\ & {} + \underbrace{\operatorname{cum}(A,B,C)\operatorname{cum}(D) + \cdots\cdots}_{4\text{ terms}} \\ & {} + \underbrace{\operatorname{cum}(A,B)\operatorname{cum}(C,D)+ \cdots\cdots}_{\text{3 terms}} \\ & {} + \underbrace{\operatorname{cum}(A,B)\operatorname{cum}(C)\operatorname{cum}(D) + \cdots\cdots}_{6\text{ terms}} \\ & {} + \operatorname{cum}(A)\operatorname{cum}(B)\operatorname{cum}(C)\operatorname{cum}(D). \end{align} $ (This actually characterizes the cumulants completely if you do the similar thing for sets of sizes other than 4.) Constants come out, e.g. $\operatorname{cum}(3A,B,C,D) = 3\operatorname{cum}(A,B,C,D)$. A conditional cumulant $\operatorname{cum}(A,B,C,D \mid N=n)$ would be a function of $n$, and so $\operatorname{cum}(A,B,C,D \mid N)$ would be a random variable whose value is determined by that of $N$.
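Specializing the displayed formula to $A=B=C=D$ gives $E(A^4) = \operatorname{cum}_4 + 4\operatorname{cum}_3\operatorname{cum}_1 + 3\operatorname{cum}_2^2 + 6\operatorname{cum}_2\operatorname{cum}_1^2 + \operatorname{cum}_1^4$, with the same partition counts 1, 4, 3, 6, 1. A quick check against a Poisson variable, all of whose cumulants equal $\lambda$ (my own sketch):

```python
from math import exp, factorial

lam = 2.0
# Exact fourth moment of Poisson(lam), summing far into the tail:
m4 = sum(k**4 * exp(-lam) * lam**k / factorial(k) for k in range(80))

# Partition formula with A = B = C = D; every cumulant of Poisson(lam) is lam:
k1 = k2 = k3 = k4 = lam
predicted = k4 + 4 * k3 * k1 + 3 * k2**2 + 6 * k2 * k1**2 + k1**4
assert abs(m4 - predicted) < 1e-9
```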

The law of total cumulance says $ \begin{align} \operatorname{cum}(A,B,C,D) & = \operatorname{cum}(\operatorname{cum}(A,B,C,D \mid N)) \\ & {} + \operatorname{cum}(\operatorname{cum}(A,B,C\mid N),\operatorname{cum}(D\mid N)) + \text{3 more terms} \\ & {} + \operatorname{cum}(\operatorname{cum}(A,B \mid N),\operatorname{cum}(C,D\mid N)) + \text{2 more terms} \\ & {} + \operatorname{cum}(\operatorname{cum}(A,B\mid N),\operatorname{cum}(C\mid N),\operatorname{cum}(D\mid N)) + \text{5 more terms} \\ & {} + \operatorname{cum}(\operatorname{cum}(A\mid N),\operatorname{cum}(B\mid N), \operatorname{cum}(C\mid N), \operatorname{cum}(D\mid N)), \end{align} $ and similarly for other sizes than 4.

Now apply this to $\operatorname{cum}_4\left(\sum\limits_{n=1}^{N_t} X_n\right)$.

In the course of this, repeatedly use the fact that $\operatorname{cum}_m\left( \sum\limits_{n=1}^{N_t} X_n \mid N_t \right) = N_t\operatorname{cum}_m(X_1)$. Then repeatedly use multilinearity to pull the constants $\operatorname{cum}_m(X_1)$ out of the joint cumulants: $\operatorname{cum}(N_t\operatorname{cum}_{m_1}(X_1), \ldots, N_t\operatorname{cum}_{m_\ell}(X_1)) = \operatorname{cum}_{m_1}(X_1)\cdots\operatorname{cum}_{m_\ell}(X_1) \cdot \operatorname{cum}_\ell(N_t)$. Finally, repeatedly use the fact that $\operatorname{cum}_\ell(N_t) = t$ (regardless of the value of $\ell$).

You end up with $t$ times the sum of products of cumulants asserted above to add up to the 4th moment of $X_1$. Then apply the same argument to sizes other than 4. Even-order moments of $X_1$ are nonnegative; therefore, even-order cumulants of $Y_t$ are nonnegative.
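A Monte Carlo sanity check of the claim $\operatorname{cum}_m(Y_t) = t\,E(X_1^m)$, here for $m=2$ with unit-rate jumps $X_i \sim \mathrm{Uniform}(0,1)$ and $t=2$, so the variance of $Y_t$ should be near $t\,E(X_1^2) = 2/3$ (my own sketch; the parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
t, n = 2.0, 400_000

# Simulate n draws of Y_t = X_1 + ... + X_{N_t}, with N_t ~ Poisson(t)
# (unit rate) and jumps X_i ~ Uniform(0, 1), so E(X_1^2) = 1/3.
N = rng.poisson(t, size=n)
u = rng.random(N.sum())
owner = np.repeat(np.arange(n), N)            # which draw each jump belongs to
Y = np.bincount(owner, weights=u, minlength=n)

print(Y.var())   # second cumulant: should be near t * E(X_1^2) = 2/3
```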

  • 0
    A small blooper: I wrote: "I claim every even-order cumulant of $Y_t$ is equal to the corresponding even-order moment of $X_1$." But it's actually $t$ times the corresponding even-order moment of $X_1$.2011-08-20
0

There are many specific examples and a few of these (collected from Wikipedia) are listed below:

  • The sum of $n$ independent and identically distributed Gaussian variables with mean $\mu$ and variance $\sigma^2$ is also Gaussian distributed, with mean $n\mu$ and variance $n\sigma^2$ (see, e.g., Wikipedia or Wolfram MathWorld).
  • The sum of $n$ independent and identically distributed Poisson variables with parameter $\lambda$ is also Poisson distributed, with parameter $n\lambda$.
  • The sum of $n$ independent and identically distributed Binomial variables with parameters $m, p$ is also Binomial distributed, with parameters $nm, p$.
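The binomial item is easy to verify by exact convolution of pmfs (my own sketch; the parameters are arbitrary):

```python
import numpy as np
from math import comb

def binom_pmf(m, p):
    return np.array([comb(m, k) * p**k * (1 - p)**(m - k) for k in range(m + 1)])

# Sum of n = 3 i.i.d. Binomial(m = 4, p = 0.25) variables
# should be Binomial(nm = 12, p = 0.25).
m, p, n = 4, 0.25, 3
s = binom_pmf(m, p)
for _ in range(n - 1):
    s = np.convolve(s, binom_pmf(m, p))
assert np.allclose(s, binom_pmf(n * m, p))
```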

Distributions defined as a sum of random variables include the Poisson binomial distribution, the Gamma distribution (in certain cases), and the $\chi^2$ distribution.

  • 0
    Of course, the number of composing random variables is important, and the phrasing of my comment was not precise enough. However, Cramér's and Raikov's theorems provide a unique decomposition of Gaussian or Poisson distributions into $n=2$ i.i.d. summands.2011-08-21