
I am having trouble finding a general way to solve this problem, so any pointers in the right direction would be very useful.

Problem: Let's say that I have $N$ contiguous "bins" that are each $B$ elements wide. If I draw $K$ elements from a known distribution over these $N$ bins (spanning $N \times B$ elements), what is the expected number of unique bins, $Q$, that I will have elements from?

Attempted Solution: I started by thinking about a uniform distribution. If I draw one element $(K=1)$, then $$E[Q] = 1$$ If I draw two elements $(K=2)$, then $$E[Q] = 1\cdot P(\text{same}) + 2\cdot P(\text{different}) = 1\cdot\frac{1}{N} + 2\cdot\frac{N-1}{N}=\frac{2N-1}{N}$$
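A quick sanity check of the $K=2$ uniform case is to enumerate all $N^2$ equally likely ordered pairs of draws and average the number of distinct bins; the value of $N$ below is just an illustrative choice.

```python
def expected_unique_bins_k2(N):
    """Brute-force E[Q] for K = 2 uniform draws over N bins."""
    total = 0
    for i in range(N):
        for j in range(N):
            total += len({i, j})  # number of distinct bins hit by this pair
    return total / (N * N)

N = 10
print(expected_unique_bins_k2(N))  # brute force
print((2 * N - 1) / N)             # closed form (2N - 1)/N
```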

With some labor, I can see how to get a closed-form solution for a uniform distribution. However, I am having trouble finding a general way to solve this for some other distribution (a Gaussian windowed over the $N$ bins, for example).

I know that the expected number of bins increases with $K$ and $N$ and depends on the distribution. Intuitively, it seems like the entropy of the distribution is related to the solution, but I'm not sure if this is the correct way to go about solving the problem.

The quick background is that I want to compute the expected number of burst requests a processor must issue to DRAM in order to gather a number of randomly distributed word addresses.

Thanks!

1 Answer


Use linearity of expectation. For example, in the uniform case, define an indicator $1_i$ for each bin, which takes the value $1$ when the $i$-th bin has been selected and $0$ otherwise.

So $$Q=\sum_{i=1}^{N}1_i$$ is the number of selected bins. Since each of the $K$ independent draws misses bin $i$ with probability $\frac{N-1}{N}$, we get $$E(Q)=\sum_{i=1}^{N}E(1_i)=\sum_{i=1}^{N}P(1_i=1)=\sum_{i=1}^{N}\left(1-\left(\frac{N-1}{N}\right)^K\right)=N \left(1-\left(\frac{N-1}{N}\right)^K\right)$$
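A Monte Carlo sketch of this formula, with $N$, $K$, and the trial count chosen arbitrarily for illustration:

```python
import random

def simulate_unique_bins(N, K, trials=100_000, seed=0):
    """Estimate E[Q] by drawing K uniform bins per trial and counting uniques."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += len({rng.randrange(N) for _ in range(K)})
    return total / trials

N, K = 8, 5
closed_form = N * (1 - ((N - 1) / N) ** K)
print(closed_form)                 # exact E[Q] from the derivation above
print(simulate_unique_bins(N, K))  # should agree to within sampling noise
```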

For $K=1,2$ this gives the same answers as yours...

For other distributions it is enough to compute $P(1_i=1)$: if bin $i$ has probability $p_i$, then $P(1_i=1)=1-(1-p_i)^K$, and the rest of the argument works unchanged, giving $E(Q)=\sum_{i=1}^{N}\left(1-(1-p_i)^K\right)$.
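The general formula can be sketched directly for any set of bin probabilities $p_i$; the Gaussian window below is only an illustrative choice (the draws are assumed independent), checked against a Monte Carlo estimate:

```python
import math
import random

def expected_unique_bins(p, K):
    """E[Q] = sum_i (1 - (1 - p_i)^K) for independent draws with bin probs p."""
    return sum(1 - (1 - pi) ** K for pi in p)

# Example: a Gaussian window over N bins (illustrative parameters).
N, K = 16, 10
weights = [math.exp(-0.5 * ((i - N / 2) / (N / 4)) ** 2) for i in range(N)]
total_w = sum(weights)
p = [w / total_w for w in weights]
print(expected_unique_bins(p, K))

# Monte Carlo check of the same quantity.
rng = random.Random(1)
trials = 50_000
total = sum(len(set(rng.choices(range(N), weights=p, k=K)))
            for _ in range(trials))
print(total / trials)
```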