Suppose two players are playing a game with an infinitely generous agent who repeatedly gives one coin to one of the two players at random.
The agent gives 1 coin to each player at the start. After that, each new coin goes to a player with probability proportional to the number of coins that player already holds.
So, for example: players A and B start with 1 coin each, so the agent gives the first coin to A or B with probability 1/2 each. Say the coin goes to B. Now B=2, A=1, so the agent gives the next coin to B with probability 2/3 and to A with probability 1/3. This process continues forever.
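The process above can be sketched as a short simulation (this is just an illustrative sketch; the function name `polya_urn` and the seed are my own choices):

```python
import random

def polya_urn(steps, seed=None):
    """Simulate the coin-giving process: start with A = B = 1,
    then give each new coin to a player with probability
    proportional to that player's current coin count."""
    rng = random.Random(seed)
    a, b = 1, 1
    for _ in range(steps):
        # A receives the next coin with probability a / (a + b).
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return a, b

# One run of 10,000 coins; inspect B's share of the total.
a, b = polya_urn(10_000, seed=42)
print(a, b, b / (a + b))
```

Running this repeatedly with different seeds shows the final split varies wildly from run to run, which is exactly what makes the limiting behaviour interesting.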
My question is: as the number of iterations approaches infinity, is one player 'expected' to run away from the other? Or are they expected to switch places infinitely many times?
I'm 95% sure that one player eventually runs away from the other, since the fraction B/(A+B) has the same expected value after one iteration (i.e., it looks like a martingale). But I'm looking for direction on how to prove this rigorously. Not really sure where to start.
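The martingale intuition can be sanity-checked numerically: averaging the final fraction B/(A+B) over many independent runs should stay near its starting value of 1/2, even though individual runs drift far from it (a rough Monte Carlo sketch; run count and seed are arbitrary):

```python
import random

def final_fraction(steps, rng):
    # One run of the process; returns B's final share of all coins.
    a, b = 1, 1
    for _ in range(steps):
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return b / (a + b)

rng = random.Random(0)
runs = 2000
mean_share = sum(final_fraction(200, rng) for _ in range(runs)) / runs
print(mean_share)  # hovers near 0.5, the initial fraction
```

That the mean stays put while individual trajectories spread out is consistent with the fraction converging to a random limit rather than to 1/2.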