(One direction of) Shannon's channel coding theorem says that for a discrete memoryless channel, all rates below capacity $C$ are achievable. Specifically, for every rate $R < C$, there exists a sequence of $(2^{nR}, n)$ codes with maximum probability of error $\lambda^{(n)} \to 0$.
As I understand it, $n$ counts the number of channel uses. Is this correct? If so, then the number $2^{nR}$ of input messages also grows with $n$. What is the physical interpretation of this increase in the number of messages?
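To make the scaling I'm asking about concrete, here is a quick sketch (the rate value $R = 0.5$ bits per channel use is just an example I picked, not from any particular channel):

```python
def num_messages(R, n):
    """Size 2^{nR} of the message set for a (2^{nR}, n) block code.

    R: rate in bits per channel use (example value, assumed)
    n: block length, i.e. number of channel uses
    """
    return 2 ** (R * n)

# At a fixed rate R, the message count grows exponentially in the
# block length n: each extra channel use multiplies it by 2^R.
for n in (10, 20, 40):
    print(n, num_messages(0.5, n))
```

So a longer code doesn't just add redundancy at a fixed message count; it carries exponentially more distinct messages while keeping the rate (bits per channel use) constant.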