Suppose you throw balls one-by-one into $b$ buckets, uniformly at random. At what time does the number of balls in some (any) bucket first exceed $s$?
That is, consider the following random process. At each of times $t=1, 2, 3, \dots$,
- Pick up a ball (from some infinite supply of balls that you have).
- Assign it to one of $b$ buckets, uniformly at random, independently of the choices made for previous balls.
For this random process, let $T = T(s,b)$ be the time such that
- At time $T-1$ (after the $(T-1)$th ball was assigned), every bucket held at most $s$ balls.
- At time $T$ (after the $T$th ball was assigned), some bucket holds $s + 1$ balls.
What can we say about $T$? The full distribution of $T(s,b)$ would be ideal, but even its expected value and variance, or just the expected value, would be useful.
Beyond the obvious fact that $T \le bs+1$ (by pigeonhole: once $bs+1$ balls are thrown, some bucket must contain at least $s+1$ of them; therefore $E[T]$ exists), I don't see anything very helpful. The motivation comes from a real-life computer application involving hashing (the numbers of interest are something like $b = 10000$ and $s = 64$).
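In case it helps anyone thinking about this, here is a minimal Monte Carlo sketch of the process as defined above; the function names `simulate_T` and `estimate_moments` are my own, and the sample sizes are illustrative only:

```python
import random

def simulate_T(s, b, rng=random):
    """Run the process once: throw balls uniformly into b buckets and
    return the first time T at which some bucket holds s + 1 balls."""
    counts = [0] * b
    t = 0
    while True:
        t += 1
        i = rng.randrange(b)      # uniform, independent bucket choice
        counts[i] += 1
        if counts[i] == s + 1:    # this bucket just exceeded size s
            return t

def estimate_moments(s, b, trials=2000, seed=0):
    """Monte Carlo estimates of E[T] and Var[T] from repeated runs."""
    rng = random.Random(seed)
    samples = [simulate_T(s, b, rng) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((x - mean) ** 2 for x in samples) / (trials - 1)
    return mean, var
```

Note that $T$ always lies between $s+1$ (all of the first $s+1$ balls land in one bucket) and $bs+1$ (the pigeonhole bound), which gives a quick sanity check on any simulated estimate; the parameters $b = 10000$, $s = 64$ from the question are still cheap enough to simulate directly this way.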