Suppose $A$ and $B$ are playing the following game.
They each possess a fair coin and individually flip it until they get a tail. Whoever has the longer run of consecutive heads wins, and the loser pays the winner an amount that depends on the number of heads the loser flipped. If they toss an equal number of heads before their respective first tails, the game is a tie.
That is, denote the number of heads flipped by $A$ before the first tail by $a$, and the number of heads flipped by $B$ before the first tail by $b$. Then $A$'s monetary payoff from this game is given by
$$ p(a,b) = \begin{cases} 4^b & \text{if } a>b \\ 0 & \text{if } a=b \\ -4^a & \text{if } a<b \end{cases} $$
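The setup above can be sketched in a few lines of Python; the function names `flips_until_tail` and `p` are my own labels for the quantities defined in the post, not anything canonical.

```python
import random

def flips_until_tail(rng=random):
    """Number of consecutive heads before the first tail of a fair coin."""
    n = 0
    while rng.random() < 0.5:  # heads with probability 1/2
        n += 1
    return n

def p(a, b):
    """A's payoff: wins 4^b if a > b, pays 4^a if a < b, ties pay nothing."""
    if a > b:
        return 4 ** b
    if a < b:
        return -(4 ** a)
    return 0

# one play of the game
a, b = flips_until_tail(), flips_until_tail()
print(a, b, p(a, b))
```

Note that a naive Monte Carlo estimate of $\operatorname{E}[p(a,b)]$ from repeated plays is very unstable, since the payoffs $4^b$ grow as fast as the probabilities $2^{-b}$ shrink.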
Straightforward calculations show that
$$ \operatorname{E}[p(a,b)] = \frac{1}{2} $$
How can a zero-sum game have positive expected payoff? In other words, how can both players expect to gain from playing this game when the amount that one player wins is exactly the amount that the other loses?
To see the expected value calculation, note that
$$ \operatorname E\left[4^b \vert a>b, b \right] \cdot \Pr (a>b \vert b) = 4^b \cdot \frac{1}{2^{b+1}} = \frac{1}{2}\cdot 2^b $$ $$ \operatorname E\left[4^a \vert a<b, b \right] \cdot \Pr (a<b \vert b) = \sum_{a=0}^{b-1} 4^a \cdot \frac{1}{2^{a+1}} = \frac{1}{2}\left(2^b - 1\right) $$
This means that $\operatorname E [p(a,b)\vert b] = \frac{1}{2}\cdot 2^b - \frac{1}{2}\left(2^b - 1\right) = \frac{1}{2}$. The conditional expectation is independent of $b$, so it must equal the unconditional expectation.
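The conditional expectation can be checked exactly (no floating point, no sampling) using `fractions.Fraction`, since $\Pr(a = k) = 2^{-(k+1)}$ and $\Pr(a > b) = 2^{-(b+1)}$. This is just the sum from the display above, written out as code:

```python
from fractions import Fraction

def cond_expectation(b):
    """E[p(a,b) | b], computed exactly from Pr(a = k) = (1/2)^(k+1)."""
    half = Fraction(1, 2)
    # losing term: for each a in 0..b-1, A pays 4^a
    lose = sum(4 ** a * half ** (a + 1) for a in range(b))
    # winning term: a > b has probability (1/2)^(b+1), and A wins 4^b
    win = 4 ** b * half ** (b + 1)
    return win - lose

for b in range(8):
    print(b, cond_expectation(b))  # prints 1/2 for every b
```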
I suspect there is some connection to the St. Petersburg paradox, but I'm not certain exactly what the relation is.