While doing research for my thesis, I ran into a paper called "Statistical Models for Co-occurrence Data". In the early pages, while discussing an iterative numerical method (a custom EM method, to be precise), I came across a sum notation that I'm not sure I'm interpreting correctly.
$\hat{p}_{i|\alpha}^{(t)} = \frac1{L\hat{\pi}_\alpha^{(t)}} \sum_{r:i(r)=i} \langle R_{r\alpha} \rangle^{(t)}$
To summarize the environment:
Denote by $R_{r\alpha}$ an indicator variable to represent the unknown class $C_\alpha$ from which the observation $(x_{i(r)}, y_{j(r)}, r) \in S$ was generated.
Here $S = \{(x_{i(r)}, y_{j(r)}, r) : 1 \leq r \leq L \}$.
$X = \{ x_1,\dots,x_N \}$ and $Y = \{ y_1,\dots,y_M \}$ are finite sets of abstract objects.
$\hat{p}_{i|\alpha}^{(t)}$ is the estimate (at the $t$-th step) of the probability of object $x_i \in X$ being chosen after abstract class $C_\alpha$ was chosen.
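To make my current reading concrete, here is a small sketch (the data and variable names are made up for illustration, not from the paper): I'm tentatively reading $\sum_{r:i(r)=i}$ as a sum over exactly those observation indices $r$ whose first component maps to the fixed object $x_i$.

```python
# Toy setup: L observations, each r mapped to some object index i(r).
# All values here are hypothetical, only to illustrate the notation.
L = 5
i_of = [0, 1, 0, 2, 0]          # i(r): which x-object observation r involves
R_alpha = [0.9, 0.2, 0.5, 0.1, 0.7]  # <R_{r,alpha}>^{(t)} for one fixed class alpha

i = 0
# My tentative reading of sum_{r : i(r)=i}: sum <R_{r,alpha}> over
# every observation r whose object index i(r) equals the fixed i.
total = sum(R_alpha[r] for r in range(L) if i_of[r] == i)
print(total)  # sums the weights of observations r = 0, 2, 4
```

Under this reading, the sum for $i = 0$ would pick up the terms for $r = 0, 2, 4$ only.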
As for my question: how am I to interpret the $\sum_{r:i(r)=i}$ part of the equation?