
The question is:

S wants to broadcast messages to A and B until both A and B have received at least $K$ messages. Every time S broadcasts a message, A receives it with probability $p_1$ and B receives it with probability $p_2$. The links S-A and S-B are assumed to be independent. What, then, is the expected number of broadcasts S needs so that both A and B receive at least $K$ messages?

In my derivation, it is computed as follows:

$$
\begin{aligned}
E &= \sum_{x=K}^{\infty} x \cdot p(x)\\
&= \sum_{x=K}^{\infty} x \cdot \Bigl\{ \binom{x-1}{K-1} p_1^{K-1}(1-p_1)^{x-K}\, p_1 \binom{x}{K} p_2^{K}\\
&\qquad + \binom{x-1}{K-1} p_2^{K-1}(1-p_2)^{x-K}\, p_2 \binom{x}{K} p_1^{K}\\
&\qquad - \binom{x-1}{K-1} p_1^{K-1}(1-p_1)^{x-K}\, p_1 \cdot \binom{x-1}{K-1} p_2^{K-1}(1-p_2)^{x-K}\, p_2 \Bigr\}\\
&= \sum_{x=K}^{\infty} \Bigl\{ x \binom{x-1}{K-1} \binom{x}{K} p_1^K p_2^K (1-p_1)^{x-K}\\
&\qquad + x \binom{x-1}{K-1} \binom{x}{K} p_1^K p_2^K (1-p_2)^{x-K}\\
&\qquad - x \binom{x-1}{K-1} \binom{x-1}{K-1} p_1^K p_2^K (1-p_1)^{x-K} (1-p_2)^{x-K} \Bigr\}\\
&= \sum_{x=K}^{\infty} \Bigl\{ K \binom{x}{K}^2 p_1^K p_2^K (1-p_1)^{x-K} + K \binom{x}{K}^2 p_1^K p_2^K (1-p_2)^{x-K}\\
&\qquad - K \binom{x}{K} \binom{x-1}{K-1} p_1^K p_2^K (1-p_1)^{x-K} (1-p_2)^{x-K} \Bigr\}\\
&= K p_1^K p_2^K \sum_{x=K}^{\infty} \Bigl\{ \binom{x}{K}^2 (1-p_1)^{x-K} + \binom{x}{K}^2 (1-p_2)^{x-K} - \binom{x}{K} \binom{x-1}{K-1} (1-p_1)^{x-K} (1-p_2)^{x-K} \Bigr\}
\end{aligned}
$$
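(The last two simplification steps rely on the identity below, which the derivation uses silently; it follows directly from the factorial form of the binomial coefficient.)

$$x \binom{x-1}{K-1} = x \cdot \frac{(x-1)!}{(K-1)!\,(x-K)!} = K \cdot \frac{x!}{K!\,(x-K)!} = K \binom{x}{K}$$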

How can I make this formula simpler? Thanks a lot.

Sorry about the misunderstanding. My goal is to compute the expectation of the minimal number of broadcasts such that both A and B receive at least $K$ messages.
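For reference, here is a quick Monte Carlo sanity check of this expectation (a sketch of my own; the function name `simulateT` and the sample parameters below are illustrative, not part of the derivation):

simulateT[k_, p1_, p2_] := Module[{a = 0, b = 0, n = 0},
  (* keep broadcasting until both A and B hold at least k messages *)
  While[a < k || b < k,
   n++;
   If[RandomReal[] < p1, a++];
   If[RandomReal[] < p2, b++]];
  n]

(* estimate E[T] for k = 1, p1 = 0.7, p2 = 0.5 over 10^5 runs *)
N[Mean[Table[simulateT[1, 0.7, 0.5], {10^5}]]]

For these parameters the estimate should land near $2.25$, matching the $k=1$ closed form in the answer below.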

  • A small technicality: you need the expectation of the _minimal_ number of broadcasts, such that both $A$ and $B$ received at least $k$ messages, right? (2012-01-11)
  • Right, the expectation of the minimal number of broadcasts. (2012-01-12)
  • It seems to me that you want the expected value of the _maximum_ of two independent _negative binomial_ random variables with parameters $p_1$ and $p_2$ respectively. The CDF of the max is easy to write down, but the expected value calculation is messy. Maybe the sum of the series is not expressible more simply. And what do you mean by the "limit of this formula"? (2012-01-12)
  • @DilipSarwate Yes, that is exactly the case. Well spotted! +1. (2012-01-12)

1 Answer


Let $X_n$ denote the state of the system after $n$ broadcasts, encoded as $(k_A,k_B)$, where $k_A$ and $k_B$ are the numbers of messages received by $A$ and $B$ respectively.

The termination condition is reaching the set $$ \Omega = \{ (k_A,k_B) \colon \min(k_A, k_B) = k \} $$

Let $T = \inf\{n \colon X_n \in \Omega, X_{n-1} \not\in \Omega \}$. We are interested in determining $\mathbb{E}(T)$.

The probability that $T$ equals $n$ is computed by conditioning on the transition $X_{n-1} \notin \Omega \to X_n \in \Omega$. Write $q_i = 1 - p_i$, and let $K_A \sim \mathrm{Bin}(n-1, p_1)$ and $K_B \sim \mathrm{Bin}(n-1, p_2)$ denote the numbers of messages received during the first $n-1$ broadcasts. Then
$$
\begin{aligned}
\mathbb{P}(T=n) &= \mathbb{P}(K_A=k-1)\,\mathbb{P}(K_B \geqslant k)\, p_1 + \mathbb{P}(K_B=k-1)\,\mathbb{P}(K_A \geqslant k)\, p_2 + \mathbb{P}(K_A=k-1)\,\mathbb{P}(K_B=k-1)\, p_1 p_2 \\
&= \binom{n-1}{k-1} p_1^{k} q_1^{n-k} \sum_{m=k}^{n-1} \binom{n-1}{m} p_2^m q_2^{n-1-m} \\
&\quad + \binom{n-1}{k-1} p_2^{k} q_2^{n-k} \sum_{m=k}^{n-1} \binom{n-1}{m} p_1^m q_1^{n-1-m} \\
&\quad + \binom{n-1}{k-1}^2 p_1^{k} p_2^{k} q_1^{n-k} q_2^{n-k}
\end{aligned}
$$

I am not able to compute the expected value in closed form; however, Mathematica can find the mean for specific values of $k$:

(* P(T = n): the three Refine terms correspond to the three summands above; *)
(* SurvivalFunction[dist, k - 1] gives P(X > k - 1) = P(X >= k).            *)
pdf[{k_Integer, p1_, p2_}, n_] :=
 FullSimplify@FunctionExpand@PiecewiseExpand[
   (* A reaches k exactly at broadcast n; B already holds at least k *)
   Refine[p1 PDF[BinomialDistribution[n - 1, p1], k - 1]*
      SurvivalFunction[BinomialDistribution[n - 1, p2], k - 1],
     k \[Element] Integers] +
    (* B reaches k exactly at broadcast n; A already holds at least k *)
    Refine[p2 PDF[BinomialDistribution[n - 1, p2], k - 1]*
      SurvivalFunction[BinomialDistribution[n - 1, p1], k - 1],
     k \[Element] Integers] +
    (* A and B both reach k at broadcast n *)
    Refine[p1 p2 PDF[BinomialDistribution[n - 1, p2], k - 1]*
      PDF[BinomialDistribution[n - 1, p1], k - 1],
     k \[Element] Integers]]
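As a quick sanity check (my addition, not part of the original answer), the probabilities should sum to $1$ over $n$; with explicit numeric parameters this can be tested along the lines of:

(* normalization check for the example parameters k = 2, p1 = 7/10, p2 = 1/2; *)
(* the truncated sum should come out numerically close to 1                   *)
NSum[pdf[{2, 7/10, 1/2}, n], {n, 2, 200}]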

The density can be computed in closed form for explicit values of $k$, and Sum can be used to find the mean:

In[61]:= With[{k = 1},
  Sum[n Assuming[n >= k,
     FullSimplify@Refine@FunctionExpand[pdf[{k, p1, p2}, n]]],
   {n, k, \[Infinity]},
   Assumptions -> 0 < p1 < 1 && 0 < p2 < 1]]

Out[61]= (-p1^2 - p1 p2 + p1^2 p2 - p2^2 + p1 p2^2)/(p1 p2 (-p1 - p2 + p1 p2))
  • How is the formula different from the one I derived in the question? It seems like we are using the same idea. (2012-01-12)
  • I am not saying it is simpler. I am saying you can compute the moments exactly for specific values of $k$. (2012-01-12)
  • I see your point. Thanks. I'll try to see if I can figure out the closed form. (2012-01-12)