
Let $(\pi_1, \pi_2, \cdots)$ be an infinite sequence of real numbers such that $\forall i\; \pi_i > 0$ and $\sum_i \pi_i = 1$. This can be thought of as a probability distribution over the natural numbers.

Let $(z_1, z_2, \ldots)$ be a sequence of independently and identically distributed Bernoulli random variables such that $P(z_i = 1) = p$ and $P(z_i = 0) = (1-p)$.

What can we say about the distribution of $X = \sum_i \pi_i z_i$?

$X$ is the sum of a random subsequence of $(\pi_i)$ generated by coin tossing.

The first couple of moments are $E[X] = p$ and $E[X^2] = p^2 \sum_{i \neq j} \pi_i \pi_j + p\sum_i \pi_i^2$. It seems one can write the moments of $X$ as polynomials in $p$ whose coefficients are power sums of the $(\pi_i)$.
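As a quick sanity check (my sketch, not part of the original question), the first two moments can be compared against simulation for an assumed geometric example $\pi_i = (1-\alpha)\alpha^{i-1}$, truncated at $N$ terms:

```python
import numpy as np

rng = np.random.default_rng(0)
p, alpha, N, trials = 0.3, 0.5, 60, 200_000

# Assumed example weights: pi_i = (1 - alpha) * alpha**(i-1), truncated at N terms
pi = (1 - alpha) * alpha ** np.arange(N)

# Simulate X = sum_i pi_i z_i with z_i ~ Bernoulli(p)
z = rng.random((trials, N)) < p
X = z @ pi

# Closed-form moments: E[X] = p and, with S1 = sum_i pi_i = 1, S2 = sum_i pi_i^2,
# E[X^2] = p^2 (S1^2 - S2) + p S2
S2 = np.sum(pi ** 2)
print(X.mean(), p)
print((X ** 2).mean(), p ** 2 * (1 - S2) + p * S2)
```

The truncation error is of order $\alpha^N$, which is negligible here next to the Monte Carlo error.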

I'm especially interested in $E[\log(X)]$.

Is there a framework for studying such objects? Any hint/reference is appreciated.

1 Answer


The moment generating function is $E[\exp(tX)] = \prod_i E[\exp(t \pi_i z_i)] = \prod_i \left(p\exp(t \pi_i)+(1-p)\right)$.
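This product form is easy to verify numerically; a minimal sketch, again assuming geometric weights $\pi_i = (1-\alpha)\alpha^{i-1}$ truncated at $N$ terms:

```python
import numpy as np

rng = np.random.default_rng(1)
p, alpha, N, trials, t = 0.3, 0.5, 60, 200_000, 1.7

# Assumed example weights, truncated at N terms
pi = (1 - alpha) * alpha ** np.arange(N)

# Monte Carlo estimate of E[exp(t X)]
z = rng.random((trials, N)) < p
X = z @ pi
mgf_mc = np.exp(t * X).mean()

# Product formula: prod_i (p exp(t pi_i) + 1 - p)
mgf_product = np.prod(p * np.exp(t * pi) + (1 - p))

print(mgf_mc, mgf_product)  # the two estimates should agree closely
```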

If $\pi_i$ goes to $0$ rapidly enough that $\sum_{j > i} \pi_j < \pi_i$ for every $i$, then the set of numbers of the form $\sum_i \pi_i z_i$ is a Cantor set, and $X$ has a singular continuous distribution.
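For instance (my illustration, not part of the answer), with $\pi_i = (1-\alpha)\alpha^{i-1}$ the tail sum is $\sum_{j>i}\pi_j = \alpha^i$, so the condition holds exactly when $\alpha < 1/2$; taking $\alpha = 1/3$ gives $X = 2\sum_i z_i 3^{-i}$, supported on the middle-thirds Cantor set. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, p, N, trials = 1 / 3, 0.5, 40, 100_000

# pi_i = (2/3) * (1/3)**(i-1) = 2 / 3**i, so X is a base-3 number with
# digits in {0, 2}: a point of the middle-thirds Cantor set.
pi = (1 - alpha) * alpha ** np.arange(N)
z = rng.random((trials, N)) < p
X = z @ pi

# No sample can land in the removed middle third (1/3, 2/3):
# z_1 = 0 forces X <= 1/3, while z_1 = 1 forces X >= 2/3.
print(((X > 1 / 3) & (X < 2 / 3)).any())  # False
```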

  • I don't know if there's a closed form for $E[\ln X]$. Let's consider the case $\pi_i = (1-\alpha) \alpha^{i-1}$ where $0 < \alpha < 1$. Then $X$ has the same distribution as $\alpha X + (1-\alpha) B$, where $B$ is Bernoulli($p$) and independent of $X$. Using this I get $E[\ln X] = \dfrac{1-p}{p} \ln \alpha + E[\ln (\alpha X + 1 - \alpha)]$. This can then be approximated using the moments, since $\ln(\alpha x + 1 - \alpha)$ is a continuous function of $x$. – 2012-10-05
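The identity in the comment can also be checked by simulation (a sketch under the same assumed truncation at $N$ terms; here both sides are estimated by Monte Carlo rather than via the moment expansion):

```python
import numpy as np

rng = np.random.default_rng(3)
p, alpha, N, trials = 0.5, 0.5, 200, 200_000

# X = sum_i (1 - alpha) * alpha**(i-1) * z_i, truncated at N terms
pi = (1 - alpha) * alpha ** np.arange(N)
z = rng.random((trials, N)) < p
X = z @ pi

# Check E[ln X] = ((1-p)/p) ln(alpha) + E[ln(alpha X + 1 - alpha)]
lhs = np.log(X).mean()
rhs = (1 - p) / p * np.log(alpha) + np.log(alpha * X + 1 - alpha).mean()
print(lhs, rhs)  # should agree up to Monte Carlo error
```

(The event $X = 0$, i.e. all $z_i = 0$, has probability $(1-p)^N \approx 2^{-200}$ here, so $\ln X$ is effectively safe to evaluate.)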