Let $(\pi_1, \pi_2, \ldots)$ be an infinite sequence of real numbers such that $\pi_i > 0$ for all $i$ and $\sum_i \pi_i = 1$. This can be thought of as a probability distribution over the natural numbers.
Let $(z_1, z_2, \ldots)$ be a sequence of independent and identically distributed Bernoulli random variables with $P(z_i = 1) = p$ and $P(z_i = 0) = 1-p$.
What can we say about the distribution of $X = \sum_i \pi_i z_i$?
$X$ is the sum of a random subsequence of $(\pi_i)$ generated by coin tossing.
The first couple of moments are $E[X] = p$ and $E[X^2] = p^2 \sum_{i \neq j} \pi_i \pi_j + p \sum_i \pi_i^2 = p^2 + p(1-p)\sum_i \pi_i^2$, using $\sum_{i \neq j} \pi_i \pi_j = 1 - \sum_i \pi_i^2$. It seems that every moment of $X$ can be written as a polynomial in $p$ whose coefficients are polynomials in the power sums $\sum_i \pi_i^k$.
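As a sanity check, here is a small Monte Carlo sketch that reproduces both moment formulas. The weights $\pi_i = 2^{-i}$, the truncation level $n$, and the value $p = 0.3$ are my choices for illustration, not part of the question; truncating at $n$ terms changes $X$ by at most $2^{-n}$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 50, 200_000

# Hypothetical weights pi_i = 2^{-i}, truncated at n terms.
pi = 0.5 ** np.arange(1, n + 1)

# Each row is one realization of the coin tosses z_1, ..., z_n.
z = (rng.random((trials, n)) < p).astype(float)
X = z @ pi  # X = sum_i pi_i * z_i, one value per trial

print("E[X]    sim:", X.mean())
print("E[X]    thy:", p)
print("E[X^2]  sim:", (X ** 2).mean())
print("E[X^2]  thy:", p ** 2 + p * (1 - p) * (pi ** 2).sum())
```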
I'm especially interested in $E[\log(X)]$ (for $p > 0$ we have $X > 0$ almost surely, since infinitely many $z_i$ equal $1$, so the logarithm is well defined).
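For what it's worth, a minimal numerical estimate of $E[\log X]$ under the same hypothetical weights $\pi_i = 2^{-i}$; the truncated sum can be exactly $0$ with probability $(1-p)^n$ (all coins zero), so those rare samples are dropped before taking logs.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, trials = 0.3, 60, 200_000

pi = 0.5 ** np.arange(1, n + 1)  # hypothetical weights, as above
X = (rng.random((trials, n)) < p).astype(float) @ pi

# P(truncated X = 0) = (1-p)^n ~ 5e-10 here; drop any such samples
# so that log is defined on every remaining value.
X = X[X > 0]
print("E[log X] sim:", np.log(X).mean())
```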
Is there a framework for studying such objects? Any hints or references are appreciated.