
I have bins $S_1,\ldots,S_k$ ($k \geq 3$), each initially containing $N_0(S_i)$ balls ($N_t(S_i)$ denotes the number of balls in $S_i$ at time $t$). A bin may contain a negative number of balls. Now I apply the following rule: at time $t$, choose 3 bins uniformly at random, say $S_{i_1}, S_{i_2}, S_{i_3}$, and set $$N_{t+1}(S_{i_1}) = N_t(S_{i_1}) - 1, \qquad N_{t+1}(S_{i_2}) = N_t(S_{i_2}) + N_{t}(S_{i_3}) + 1, \qquad N_{t+1}(S_{i_3}) = 0.$$ Can anything reasonably be said about the distribution of balls as $t\to \infty$?

Edit: I should also say, the 3 bins are chosen among the $\binom{k}{3}$ 3-sets of bins; they are always distinct.
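The update rule above is easy to simulate directly. Here is a minimal sketch (names and initial state are my own choices, not from the question); note that each step changes the total by $-1 + (N_t(S_{i_3}) + 1) - N_t(S_{i_3}) = 0$, so the total number of balls is conserved:

```python
import random

def step(bins):
    """Apply one update: pick 3 distinct bins uniformly at random."""
    i1, i2, i3 = random.sample(range(len(bins)), 3)
    bins[i1] -= 1
    bins[i2] += bins[i3] + 1
    bins[i3] = 0

bins = [5, -2, 0, 3]          # arbitrary initial state, k = 4
total = sum(bins)
for _ in range(10_000):
    step(bins)
assert sum(bins) == total     # the total number of balls is conserved
```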

  • Clearly the initial state doesn't matter; and yes, this implies that for large $t$, each bin is expected to have its fair share of balls. I was just wondering if it's possible to say more. The motivation is that this is a sanitized version of a chip-firing problem I'm thinking about. (2012-09-04)

1 Answer


Let's consider an easy case: $k=3$, where the total number of balls (which is a conserved quantity) is $1$. At each positive time there will be one bin with $0$ balls (the bin that was just emptied), so the other two will hold $x$ and $1-x$ for some integer $x$. Let $Y_t = \max(x, 1 - x)$.

With probability $1/3$, the bin with $0$ will be chosen as $S_{i_1}$, and $Y_{t+1} = 2$. With probability $1/3$, the bin with $1 - Y_t$ will be chosen as $S_{i_1}$, so $Y_{t+1} = Y_t + 1$. With probability $1/3$, the bin with $Y_t$ will be chosen as $S_{i_1}$, so $Y_{t+1} = Y_t - 1$ unless $Y_t = 1$, in which case $Y_{t+1} = 1$.

The transition matrix for the (infinite) Markov chain $Y_t$ on states $1, 2, 3, \ldots$ looks like this:
$$\pmatrix{ 1/3 & 2/3 & 0 & 0 & 0 & \ldots \cr 1/3 & 1/3 & 1/3 & 0 & 0 & \ldots \cr 0 & 2/3 & 0 & 1/3 & 0 & \ldots \cr 0 & 1/3 & 1/3 & 0 & 1/3 & \ldots \cr 0 & 1/3 & 0 & 1/3 & 0 & \ldots \cr 0 & 1/3 & 0 & 0 & 1/3 & \ldots\cr \ldots & \ldots & \ldots & \ldots & \ldots & \ldots}$$
Equilibrium probabilities $\mu_j$ for this Markov chain must satisfy $\mu_1 = \mu_2/2$, $(5/6) \mu_2 = 1/3 + \mu_3/3$ (the second equation uses the normalization $\sum_j \mu_j = 1$, since every state $j \ge 4$ sends mass $1/3$ to state $2$), and $\mu_j = (\mu_{j-1} + \mu_{j+1})/3$ for $j \ge 3$. The general solution to that recurrence is $\mu_j = A r_1^j + B r_2^j$ where $r_1 = (3 - \sqrt{5})/2$ and $r_2 = (3 + \sqrt{5})/2$, but in order for $\mu_j$ to be summable we must have $B=0$, i.e. $\mu_j = r_1^{j-2} \mu_2$ for $j \ge 2$. That gives us
$$1 = \sum_{j=1}^\infty \mu_j = \frac{1}{2} \mu_2 + \sum_{j \ge 2} r_1^{j-2} \mu_2 = \left(\frac{1}{2} + \frac{2}{\sqrt{5}-1} \right) \mu_2,$$
i.e. $\mu_2 = 2 \sqrt{5} - 4$. So we do get a stationary probability distribution, with $\mu_1 = \sqrt{5} -2$ and $\mu_j = (2 \sqrt{5}-4) \bigl((3 - \sqrt{5})/2\bigr)^{j-2}$ for $j \ge 2$.
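As a quick numerical sanity check of the answer's closed form (a sketch of my own, truncating the infinite chain at a point where the geometric tail is negligible), one can verify that the claimed $\mu_j$ sum to $1$ and satisfy all three balance equations:

```python
from math import sqrt

r1 = (3 - sqrt(5)) / 2
mu = [0.0] * 60                        # mu[j] for j = 1..59; tail is negligible
mu[1] = sqrt(5) - 2
for j in range(2, 60):
    mu[j] = (2 * sqrt(5) - 4) * r1 ** (j - 2)

assert abs(sum(mu) - 1) < 1e-9         # probabilities sum to 1
# balance equations from the answer
assert abs(mu[1] - mu[2] / 2) < 1e-12
assert abs((5 / 6) * mu[2] - (1 / 3 + mu[3] / 3)) < 1e-12
for j in range(3, 58):
    assert abs(mu[j] - (mu[j - 1] + mu[j + 1]) / 3) < 1e-12
```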