Let's consider a set $V$ of $n$ nodes, and let some nodes be colored with one of two possible colors; denote the colors $\alpha$ and $\beta$, with respectively $I>0$ and $K>0$ nodes colored with them. Note that we could have $K+I<n$, so that some nodes remain uncolored.
We're going to generate the set of edges $E$ according to the Erdős–Rényi model: fix $p\in (0,1)$; each possible edge is present independently with probability $p$ (a Bernoulli law with parameter $p$). Now each uncolored node takes the color it sees most often in its neighborhood: it counts how many of its neighbors have the first color and how many have the second, and becomes colored with the more frequent of the two (if the two colors occur equally often, the node stays uncolored).
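To make the one-round process concrete, here is a minimal Monte Carlo sketch (Python/NumPy; the helper name `one_round` and the parameter values are my own placeholders, not part of the question):

```python
import numpy as np

def one_round(n, I, K, p, rng):
    """Build G(n, p), color nodes 0..I-1 with alpha and I..I+K-1 with beta,
    then let every uncolored node adopt the strict-majority color among its
    neighbors (ties or no colored neighbors: it stays uncolored).
    Returns (# newly alpha-colored, # newly beta-colored)."""
    upper = np.triu(rng.random((n, n)) < p, k=1)   # each edge present independently w.p. p
    adj = upper | upper.T                          # symmetric adjacency matrix

    alpha_deg = adj[:, :I].sum(axis=1)        # alpha-colored neighbors of each node
    beta_deg = adj[:, I:I + K].sum(axis=1)    # beta-colored neighbors of each node

    uncolored = np.arange(n) >= I + K
    new_alpha = int(np.sum(uncolored & (alpha_deg > beta_deg)))
    new_beta = int(np.sum(uncolored & (beta_deg > alpha_deg)))
    return new_alpha, new_beta

rng = np.random.default_rng(0)
n, I, K, p = 1000, 20, 80, 3.0 / 1000       # placeholder parameters in the sparse regime
runs = [one_round(n, I, K, p, rng) for _ in range(200)]
print("mean new alpha:", np.mean([r[0] for r in runs]))
print("mean new beta: ", np.mean([r[1] for r in runs]))
```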
Before observing $E$, the probability that an uncolored node $v$ takes color $\alpha$ is $\sum_{i=1}^{I} \Big\{ {{I}\choose{i}}p^i (1-p)^{I-i}\sum_{k=0}^{\min\{ i-1,K\}} {{K}\choose{k}}p^k (1-p)^{K-k}\Big\}$, where $i$ counts the number of $\alpha$-colored nodes in $N(v)$ while $k$ counts the number of $\beta$-colored nodes in $N(v)$. This expression is not very handy, especially if we want to estimate the expected number of newly $\alpha$-colored nodes.
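For a numerical sanity check of this sum, one can evaluate it directly (again Python, this time with SciPy; `p_alpha` and the parameters are my own placeholders) and compare $(n-I-K)$ times the result with the Monte Carlo means from the sketch above:

```python
import numpy as np
from scipy.stats import binom

def p_alpha(I, K, p):
    """P(a fixed uncolored node takes alpha): i ~ Bin(I, p) alpha-neighbors
    and, independently, k ~ Bin(K, p) beta-neighbors, and we need k < i."""
    i = np.arange(1, I + 1)
    return float(np.sum(binom.pmf(i, I, p) * binom.cdf(i - 1, K, p)))

n, I, K, p = 1000, 20, 80, 3.0 / 1000       # same placeholder parameters as above
print("P(node takes alpha):", p_alpha(I, K, p))
print("expected new alpha nodes:", (n - I - K) * p_alpha(I, K, p))
print("expected new beta nodes: ", (n - I - K) * p_alpha(K, I, p))  # by symmetry
```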
I'm looking for useful inequalities and estimates that yield nontrivial upper and lower bounds on the probability above and on the expected number of newly $\alpha$-colored and $\beta$-colored nodes.
In particular, as a first step, I would like to investigate the case $p=\frac{c}{n}$ for an arbitrary constant $c>0$, with $K+I<\frac{n}{\log n}$ and $I=\frac{K}{n^\epsilon}$ for some small $\epsilon>0$.
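To show the kind of estimate I have in mind, here is a crude first-order sketch for that regime (back-of-the-envelope only, so please correct me if it is off). A node takes color $\alpha$ only if it has at least one $\alpha$-colored neighbor, and it certainly does so if it has at least one $\alpha$-colored neighbor and no $\beta$-colored neighbor; since the two neighbor counts are independent,
$$\big(1-(1-p)^{I}\big)(1-p)^{K}\;\le\;\Pr[v\text{ takes }\alpha]\;\le\;1-(1-p)^{I}\;\le\;pI.$$
Bounding the left-hand side from below by $\big(pI-\binom{I}{2}p^{2}\big)(1-pK)\ge pI\big(1-(I+K)p\big)$ and plugging in $p=\frac{c}{n}$ with $K+I<\frac{n}{\log n}$ gives
$$\Pr[v\text{ takes }\alpha]=\frac{cI}{n}\Big(1+O\big(\tfrac{1}{\log n}\big)\Big),$$
so the expected number of newly $\alpha$-colored nodes should be roughly $(n-I-K)\frac{cI}{n}\approx cI$, and similarly $\approx cK$ for $\beta$. Sharper versions of these bounds, or bounds that remain useful outside this very sparse regime, are what I am after.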