15

Let $X_i$ be an iid sequence of random variables with support in $(0,1)$. I'm looking for references (even just a name) for the following infinite sum random variable:

$$S:=X_1+X_1X_2+X_1X_2X_3+X_1X_2X_3X_4+\cdots.$$

This came up in a waiting-time problem. I can easily calculate the expected value and variance of the sum above, but I'm interested in whether others have studied this in the literature, specifically whether there are asymptotics for the limiting distribution. I presume there are also issues when $P(X>1-\epsilon)$ falls off too slowly as $\epsilon \to 0$.
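
For a concrete sanity check, here is a minimal Monte Carlo sketch under the assumption $X_i\sim\mathrm{Uniform}(0,1)$ (the question leaves the law of $X_i$ open). For this choice $E[S]=\sum_{k\ge 1}2^{-k}=1$, and a short moment computation gives $E[S^2]=3/2$, hence $\operatorname{Var}(S)=1/2$:

```python
import random
import statistics

# Monte Carlo sketch of S = X1 + X1*X2 + X1*X2*X3 + ...
# Assumption: X_i ~ Uniform(0,1); the question leaves the law of X_i open.
def sample_S(rng, tol=1e-12):
    total, prod = 0.0, 1.0
    while prod > tol:           # expected remainder beyond this point is prod * E[S]
        prod *= rng.random()    # multiply in the next factor X_n
        total += prod           # add the partial product X_1 * ... * X_n
    return total

rng = random.Random(0)
draws = [sample_S(rng) for _ in range(100_000)]
print(statistics.fmean(draws))      # ≈ 1.0
print(statistics.pvariance(draws))  # ≈ 0.5
```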

  • 0
    Just did a little bit of searching. If I compute the deterministic version of this series (say, the expectation), I get https://www.wolframalpha.com/input/?i=sum+(product+1%2F2%5Ek,+k%3D1+to+n),+n+%3D+1+to+infinity a theta function, which has some relationship with $q$-series. At the moment I cannot find anything else similar. 2017-01-11
  • 0
    Dumb comment on my part, but maybe there are more results when you take the logarithm of this, although probably not, since the logarithm doesn't distribute over addition. 2017-01-12
  • 1
    Keyword: stochastic (random) perpetuity. 2017-01-15
  • 2
    Let $Y$ be a random variable that is independent of $S$ and has the same law as $X_1$. Then $S$ is equal in law to $Y(1+S)$. This may help you obtain the desired asymptotics. 2017-01-16
  • 0
    I doubt that this helps, but using the above observation by @pre-kidney leads to the recursion $$s_k = \frac{t_k}{1-t_k} \sum_{j=0}^{k-1} {k \choose j } s_j$$ where $s_k,t_k$ are the raw moments of $S$ and $X$, respectively. 2017-01-16
  • 1
    This is probably a good paper to start with: http://link.springer.com/chapter/10.1007/978-3-642-57984-4_6 2017-01-17
  • 0
    What is the distribution of $X_i$, please? 2017-01-18
  • 0
    @NP-hard I guess the idea is to express the distribution of $S$ in terms of the (general) distribution of $X_i$. 2017-01-18
  • 0
    @zhoraster: Thanks for the reference! If you want to make it an answer, I'll award the bounty. 2017-01-19
  • 0
    Link-only answers are discouraged here. Moreover, I'm not sure that this answers the question (it is not very clear what exactly the question is). I'm glad if I was able to help. 2017-01-20
  • 0
    Since $S \sim X(1 + S)$, don't we have $f_S(s) = \int_{0}^{1} \! x^{-1} \, f_X(x) \, f_S(s/x-1) \, dx$ as an integral equation for the p.d.f.? 2017-10-01
  • 0
    https://math.stackexchange.com/questions/2130264/sum-of-random-decreasing-numbers-between-0-and-1-does-it-converge 2018-03-29
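
The moment recursion in the comments can be checked numerically. The sketch below assumes the concrete case $X_i\sim\mathrm{Uniform}(0,1)$, so $t_k = E[X^k] = 1/(k+1)$, and compares $s_k = \frac{t_k}{1-t_k}\sum_{j=0}^{k-1}\binom{k}{j}s_j$ against Monte Carlo estimates of $E[S^k]$:

```python
import math
import random

# Check the moment recursion s_k = t_k/(1-t_k) * sum_{j<k} C(k,j) s_j
# for the assumed concrete case X ~ Uniform(0,1), where t_k = 1/(k+1).
def raw_moments(K):
    s = [1.0]                                   # s_0 = E[S^0] = 1
    for k in range(1, K + 1):
        t_k = 1.0 / (k + 1)
        acc = sum(math.comb(k, j) * s[j] for j in range(k))
        s.append(t_k / (1.0 - t_k) * acc)
    return s

# Monte Carlo estimates of E[S^k] via the truncated series S = X1 + X1*X2 + ...
def mc_moments(K, n_samples=100_000, seed=1):
    rng = random.Random(seed)
    sums = [0.0] * (K + 1)
    for _ in range(n_samples):
        total, prod = 0.0, 1.0
        while prod > 1e-12:                     # truncate once the tail is negligible
            prod *= rng.random()
            total += prod
        for k in range(K + 1):
            sums[k] += total ** k
    return [v / n_samples for v in sums]

print(raw_moments(3))   # [1.0, 1.0, 1.5, 2.833...]
print(mc_moments(3))    # should be close to the recursion values
```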

1 Answer

0

Let $\Sigma_1=\sum_{i=1}^{\infty}\prod_{j=1}^{i}X_j$ and notice that $\Sigma_1=X_1\Bigl(1+\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j\Bigr)=X_1(1+\Sigma_2)$, where $\Sigma_2:=\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j$.

Also note that $\Sigma_1$ and $\Sigma_2$ are identically distributed (as can be seen by shifting the dummy indices).
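
The equality in law can be checked by simulation; the sketch below assumes $X_i\sim\mathrm{Uniform}(0,1)$ and compares empirical quantiles of $\Sigma_1$ drawn directly against quantiles of $X_1(1+\Sigma_2)$ assembled from independent pieces:

```python
import random

# Assumption: X_i ~ Uniform(0,1). Draw Sigma directly (truncated series) and
# as X * (1 + Sigma'), then compare empirical quantiles of the two samples.
def sample_sigma(rng, tol=1e-12):
    total, prod = 0.0, 1.0
    while prod > tol:
        prod *= rng.random()
        total += prod
    return total

rng = random.Random(42)
N = 50_000
direct = sorted(sample_sigma(rng) for _ in range(N))
folded = sorted(rng.random() * (1.0 + sample_sigma(rng)) for _ in range(N))
for q in (0.25, 0.5, 0.75):
    i = int(q * N)
    print(round(direct[i], 3), round(folded[i], 3))  # quantiles should agree
```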

Let us also assume that $\Sigma_i$ has a pdf $f_{\Sigma}(\sigma)$.

Define $Y=1+\Sigma_2$, so that $f_Y(y)=f_{\Sigma}(y-1)$.

Since $\Sigma_2$ does not involve $X_1$, the two are independent.

Finally we can write down:

$f_{\Sigma}(\sigma)=\int_{\mathscr{Y}}{\frac{1}{y}\,f_X(\sigma/y)f_Y(y)\,dy}=\int_{x=0}^{1}{\frac{1}{x}\,f_X(x)f_Y(\sigma/x)\,dx}=\int_{x=0}^{1}{\frac{1}{x}\,f_X(x)f_{\Sigma}(\sigma/x-1)\,dx}$

(The factor $1/x$ is the Jacobian for the density of a product of independent variables.)
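
As a numerical sketch, take $f_X\equiv 1$ on $(0,1)$ (an assumption; any concrete law would do). With the Jacobian factor $x^{-1}$ for the product of independent variables included, the substitution $u=\sigma/x-1$ turns the equation into the delay equation $f(\sigma)=\int_{\max(\sigma-1,0)}^{\infty}\frac{f(u)}{1+u}\,du$, which can be iterated to a fixed point on a truncated grid:

```python
import numpy as np

# Fixed-point iteration for the integral equation with X ~ Uniform(0,1)
# (an assumption: f_X = 1 on (0,1)). The substitution u = sigma/x - 1 gives
#   f(sigma) = integral_{max(sigma-1,0)}^{inf} f(u)/(1+u) du,
# iterated here on a truncated grid [0, L]; the tail beyond L is negligible.
L, n = 15.0, 3001
s = np.linspace(0.0, L, n)
h = s[1] - s[0]

def trapezoid(y):
    return float(h * (y.sum() - 0.5 * (y[0] + y[-1])))

f = np.where(s < 2.0, 0.5, 0.0)      # any starting probability density works
for _ in range(200):
    g = f / (1.0 + s)
    # cumulative trapezoid integral G(u) = integral_0^u g
    G = np.concatenate(([0.0], np.cumsum(0.5 * h * (g[1:] + g[:-1]))))
    f = G[-1] - np.interp(np.clip(s - 1.0, 0.0, L), s, G)
    f /= trapezoid(f)                # renormalize against truncation loss

print(trapezoid(s * f))              # mean ≈ 1
print(trapezoid(s**2 * f))           # second moment ≈ 3/2
```

The fixed point agrees with the moments obtained from the recursion in the comments for this choice of $f_X$.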

For arbitrary $f_X(x)$ this integral equation cannot be solved in closed form, and even the existence of $f_{\Sigma}$ is not guaranteed.

One trivial solution for the pair $(X,\Sigma)$ is $f_X(t)=f_\Sigma(t)=\delta(t)$, i.e. $X=0$ almost surely, which forces $\Sigma=0$.

One trivial choice of $X$ for which $\Sigma$ has no pdf at all is $f_X(t)=\delta(t-1)$: with $X=1$ almost surely, the series diverges.

So if you can tell us $f_X(x)$, we can talk further about the solution; without a concrete distribution, the integral equation above is as far as the analysis goes.