
I need to calculate, or at least find an upper bound for,

$$\Pr\{B_t > x\}$$

where $B_t$ is the product of $t$ independent $\operatorname{Beta}(1,k)$ variables, i.e.

$$B_t = \prod_{i=1}^t B_i, \qquad B_i \stackrel{\text{i.i.d.}}{\sim} \operatorname{Beta}(1, k).$$

I have no idea how to handle this. There are countless combinations of these Beta variables that could fulfill the condition, so it is hard to get the density or cumulative distribution function of $B_t$.

(I happen to know that for uniform variables $-\ln \prod_{i=1}^t U_i \sim \Gamma(t, 1)$, but that seems to be pure luck and does not generalize to other distributions.)
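In case it is useful, here is a small Monte Carlo sketch (Python; the values of $t$, $k$, $x$ below are arbitrary placeholders) that estimates this probability numerically and checks the uniform/gamma fact above:

```python
import numpy as np

# Monte Carlo estimate of P(B_t > x) for B_t a product of t i.i.d. Beta(1, k) variables.
# t, k, x are placeholder values chosen only for illustration.
rng = np.random.default_rng(0)
t, k, x = 5, 3, 1e-4
n = 1_000_000

b_t = rng.beta(1, k, size=(n, t)).prod(axis=1)   # n realizations of B_t
print("MC estimate of P(B_t > x):", (b_t > x).mean())

# Sanity check of the uniform/gamma fact (the k = 1 case):
u = rng.uniform(size=(n, t))
g = -np.log(u.prod(axis=1))                      # should be Gamma(t, 1)
print("mean and variance of -log(prod U_i):", g.mean(), g.var())  # both approximately t
```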

2 Answers


Note that $B_t$ is the product of $t$ independent random variables distributed like $B_1$, and each factor lies in $(0,1)$, so the event $\{B_t\geqslant x\}$ is contained in the intersection of the $t$ independent events $\{B_i\geqslant x\}$. Since $\mathrm P(B_1\geqslant x)=(1-x)^k$ for every $x$ in $(0,1)$, this yields $\mathrm P(B_t\geqslant x)\leqslant \mathrm P(B_1\geqslant x)^t=(1-x)^{kt}$.
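As a rough numerical illustration of this bound (not part of the argument; Python, with arbitrary parameter values), one can compare it against a Monte Carlo estimate:

```python
import numpy as np

# Compare the bound P(B_t >= x) <= (1 - x)^(k t) with a Monte Carlo estimate.
# The values of t, k, x are arbitrary illustration choices.
rng = np.random.default_rng(1)
t, k, x = 4, 2, 0.05
n = 2_000_000

b_t = rng.beta(1, k, size=(n, t)).prod(axis=1)
print("MC estimate of P(B_t >= x):", (b_t >= x).mean())
print("bound (1 - x)^(k t)       :", (1 - x) ** (k * t))
```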

  • I'm afraid the first bound isn't tight enough for my purpose. Can you give some clue as to how you got the second one? (2012-01-24)
  • Your comment is mysterious to me, since upper bounds on $E(B_t)$ should not involve $x$. Anyway, the value of $E(B_t)$ is obvious: $E(B_t)=(k+1)^{-t}$ (see the quick check after these comments). (2012-01-24)
  • Sorry, I made a mistake. I was actually trying to get $\sum_{t=1}^\infty P(B_t \ge x)$; that is not the expectation of $B_t$. (2012-01-24)
  • Please do not delete your comments after someone has answered them. (2012-01-24)
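A quick Monte Carlo check of $E(B_t)=(k+1)^{-t}$ from the comment above (Python; $t$ and $k$ are arbitrary illustration values):

```python
import numpy as np

# Check E(B_t) = (k + 1)^(-t) for B_t a product of t i.i.d. Beta(1, k) variables.
rng = np.random.default_rng(2)
t, k, n = 4, 2, 1_000_000

b_t = rng.beta(1, k, size=(n, t)).prod(axis=1)
print("MC estimate of E(B_t):", b_t.mean())
print("(k + 1)^(-t)         :", (k + 1.0) ** (-t))
```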

Since $X \sim \operatorname{B}(1,k)$ is equal in law to $1-U^{1/k}$, where $U$ is uniform on the unit interval, you are computing $$ \mathbb{P}\left( \prod_{i=1}^t \left(1-U_i^{1/k}\right) > x \right). $$ The case $k=1$ is special, because $1-U \stackrel{d}{=} U$ and $-\log\left(\prod_{i=1}^t U_i\right) \sim \Gamma(t,1)$, as you mention.
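As a sketch (Python; $k$ is arbitrary), the identity in law $1-U^{1/k}\stackrel{d}{=}\operatorname{B}(1,k)$ can be checked empirically, e.g. with a two-sample Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

# Check that 1 - U^(1/k) has the Beta(1, k) distribution; k is arbitrary.
rng = np.random.default_rng(3)
k, n = 3, 200_000

u = rng.uniform(size=n)
x_from_u = 1 - u ** (1 / k)        # construction via the uniform representation
x_beta = rng.beta(1, k, size=n)    # direct Beta(1, k) samples

# A large p-value indicates no detectable difference between the two samples.
print(stats.ks_2samp(x_from_u, x_beta))
```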

Observe that $m_r = \mathbb{E}\left( B_t^r \right) = \mathbb{E}\left( B_1^r \right)^t = \binom{k+r}{r}^{-t}$. This makes it possible to compute the moment generating function of $B_t$ in terms of the generalized hypergeometric function ${}_tF_t$: $$ \mathcal{M}_{B_t}(u) = \mathbb{E}\left( \mathrm{e}^{u B_t} \right) = \sum_{r=0}^\infty \frac{u^r}{r!} m_r = {}_t F_t\left(\left. \begin{array}{c}\underbrace{1,\ldots,1}_{t\text{ times}} \\ \underbrace{k+1,\ldots,k+1}_{t\text{ times}} \end{array}\right| u \right) $$ So in principle one can construct a Chernoff bound. But it is known that moment bounds are tighter.

Assume $0 < x < 1$. For every integer $r \geqslant 0$, Markov's inequality gives $\mathbb{P}(B_t \geqslant x) \leqslant m_r/x^r$. The ratio of consecutive terms, $\frac{m_{r+1}/x^{r+1}}{m_r/x^r} = \frac{1}{x}\left(\frac{r+1}{k+r+1}\right)^t$, crosses $1$ at $r = k\left(\frac{1}{1-x^{1/t}}-1\right)-1$, so the best integer choice of $r$ yields $$ \mathbb{P}(B_t \geqslant x) \leqslant \min_{r\in\mathbb{Z}_{\geqslant 0}} \frac{m_r}{x^r} = \min_{r\in\mathbb{Z}_{\geqslant 0}} \frac{1}{x^r} \binom{k+r}{r}^{-t} = \left. \frac{1}{x^r} \binom{k+r}{r}^{-t} \right|_{r = \max\left\{0,\ \left\lceil k \left(\frac{1}{1-x^{1/t}}-1\right)-1 \right\rceil\right\}} $$
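A sketch of the moment bound in Python (the values of $t$, $k$, $x$ are arbitrary; the closed-form candidate for the minimizing $r$ comes from the expression above and is double-checked by a brute-force search):

```python
import numpy as np
from scipy.special import comb

# Moment bound: P(B_t >= x) <= min over integers r >= 0 of x^(-r) * C(k+r, r)^(-t).
# The values of t, k, x are arbitrary illustration choices.
t, k, x = 5, 3, 0.5

def bound_term(r):
    """Value of the bound for a nonnegative integer r."""
    return x ** (-r) * comb(k + r, r, exact=True) ** (-t)

# Closed-form candidate for the minimizing r (clipped at 0).
r_star = max(0, int(np.ceil(k * (1.0 / (1.0 - x ** (1.0 / t)) - 1.0) - 1.0)))

# Brute-force check over a range of integers around the candidate.
r_best = min(range(r_star + 50), key=bound_term)

print("closed-form r*:", r_star, "bound:", bound_term(r_star))
print("brute-force r :", r_best, "bound:", bound_term(r_best))
```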


  • Oh, I didn't know about the moment bound before. But it seems that, when $x$ is very close to $0$, this bound just gives $P(B_t > x) < 1$. (2012-01-25)