The larger a group of people is, the more probable it is that they have to wait for at least one person before they can finally leave for the next pub. Why?
Assume the probability that one person delays is 3%. Then, for, say, 30 people, it is almost certain -- 90%, or $30 \cdot 3\%$. Right?
Well, of course not. With $3\% =: \alpha$ and $30 =: n$, the correct probability is $1 - (1-\alpha)^n$. We had that at school. But still, for a fixed $\alpha$ there should be some $n \neq 0$ for which the "approximation" holds exactly. (Is that even true?)
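A quick numerical sanity check (using the question's values $\alpha = 3\%$, $n = 30$) shows how far off the linear guess is:

```python
# Compare the naive linear estimate with the exact probability
# that at least one of n independent people delays.
alpha = 0.03   # per-person delay probability (from the question)
n = 30         # group size (from the question)

naive = alpha * n                 # linear "approximation"
exact = 1 - (1 - alpha) ** n      # 1 - P(nobody delays)

print(f"naive: {naive:.3f}")      # 0.900
print(f"exact: {exact:.3f}")      # ~0.599
```

So the naive 90% overstates the true probability by roughly 30 percentage points.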
$ 1 - (1-\alpha)^n = \alpha n $
I believe there is no algebraic solution to this equation. But how does one prove that (and tackle similar problems)?
EDIT: After plotting the difference $1 - (1-\alpha)^n - \alpha n$, it seems clear that the only values of $n$ for which the equation holds are $0$ and $1$. This should be provable using elementary calculus, so I'm closing the question.
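For completeness, here is a sketch of that elementary-calculus argument (my own, assuming a fixed $0 < \alpha < 1$ and treating $n$ as a real variable):

```latex
% Let f(n) := 1 - (1-\alpha)^n - \alpha n for fixed 0 < \alpha < 1. Then
f(0) = 0, \qquad f(1) = 0, \qquad
f''(n) = -\bigl(\ln(1-\alpha)\bigr)^2 (1-\alpha)^n < 0 .
```

Since $f'' < 0$ everywhere, $f$ is strictly concave, and a strictly concave function has at most two zeros. As $f(0) = f(1) = 0$, these are exactly the two solutions, so $n = 0$ and $n = 1$ are the only values where the "approximation" is exact.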