Suppose the $X_i$ are indicator random variables. There is another random variable $Z$ defined as $Z = \min(c, \sum_i X_i)$, where $c$ is a constant. How do we compute $E[Z]$? I have come up with the following expression, but I am not sure if it's correct: $E[Z] = \min(c, \sum_i Pr(X_i = 1))$.
Expectation of a min function of a sum of indicator random variables
probability
-
There is no simple formula and the one you suggest is incorrect, as can be seen readily by checking some simple cases. – 2012-06-19
-
Can we at least claim that $E[Z] \leq \min(c, \sum_i Pr(X_i = 1))$? – 2012-06-20
-
Yes: $Z=\min(c,S)$ implies $E(Z)\leqslant\min(c,E(S))$. – 2012-06-20
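The comments above can be checked numerically. The following is a minimal simulation sketch, not part of the original thread: the probabilities in `p`, the cap `c`, and the variable names are illustrative choices. It shows $E[Z]$ falling strictly below $\min(c, \sum_i Pr(X_i = 1))$, consistent with $E(Z)\leqslant\min(c,E(S))$ but not with equality.

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.6, 0.7, 0.8])   # hypothetical P(X_i = 1), chosen for illustration
c = 2

# Each row is one draw of (X_1, ..., X_n); Z = min(c, sum_i X_i).
samples = rng.random((200_000, len(p))) < p
Z = np.minimum(c, samples.sum(axis=1))

print("E[Z]  (simulated)        :", Z.mean())      # about 1.76 here
print("min(c, sum_i P(X_i = 1)) :", min(c, p.sum()))  # 2.0
```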
1 Answer
Assuming the $X_i$ are i.i.d. with $p=P(X_i=1)$, the sum $Y=\sum_{i=1}^n X_i$ follows a Binomial $(n,p)$ distribution. Calling $A=A(n,p,c)=P(Y < c)$, $Z$ is a mixture of a Binomial truncated to $\{0,\dots,c-1\}$ (with weight $A$) and a point mass at $c$ (with weight $1-A$); I'm assuming $c$ is an integer.
$$P(Z=z) = \begin{cases} {n \choose z} p^z (1-p)^{n-z} & 0 \le z \le c-1 \\ 1-A & z = c \end{cases}$$
And so
$$E(Z) = \sum_{z=0}^{c-1} z {n \choose z}p^z (1-p)^{n-z} + (1-A) \, c $$
with $$A=\sum_{z=0}^{c-1} {n \choose z}p^z (1-p)^{n-z}$$
I don't think this can be simplified much.
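Under the same i.i.d. assumption, the formula above is straightforward to evaluate numerically. The sketch below does so and cross-checks it against a brute-force evaluation of $E[\min(c, Y)]$; the helper name `expected_Z` and the sample values of $n$, $p$, $c$ are hypothetical, not from the answer.

```python
from math import comb

def expected_Z(n: int, p: float, c: int) -> float:
    """E[min(c, Y)] for Y ~ Binomial(n, p), following the formula above."""
    pmf = [comb(n, z) * p**z * (1 - p)**(n - z) for z in range(n + 1)]
    A = sum(pmf[:c])                           # A = P(Y < c)
    head = sum(z * pmf[z] for z in range(c))   # sum_{z=0}^{c-1} z P(Y = z)
    return head + (1 - A) * c

# Cross-check against a direct evaluation of E[min(c, Y)].
n, p, c = 10, 0.3, 4
direct = sum(min(c, z) * comb(n, z) * p**z * (1 - p)**(n - z) for z in range(n + 1))
print(expected_Z(n, p, c), direct)   # the two values should agree
```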
-
I may not require an exact expression for $E[Z]$. Can we claim that $E[Z] \geq \min(c, \sum_i E[X_i]) = \min(c, \sum_i Pr(X_i = 1))$? – 2012-06-20
-
user: I do not get it; I answered that in the comments. – 2012-06-21