
(One direction of) Shannon's channel coding theorem says that for a discrete memoryless channel, all rates below capacity $C$ are achievable. Specifically, for every rate $R < C$, there exists a sequence of $(2^{nR}, n)$ codes with maximum probability of error $λ^{(n)} \to 0$.

As I understand it, $n$ counts the number of channel uses. Is this correct? If so, then the number $2^{nR}$ of input messages also increases with $n$. What is the physical interpretation of this increase in the number of messages?

2 Answers


You are correct.

The physical interpretation of $2^{nR}$ messages is: "the messages that can be written with $nR$ bits".

Therefore the theorem claims that, using long enough messages, we can get arbitrarily close to error-free transmission, as long as we transmit at a rate below the channel capacity. Here "transmit at rate $R$" translates to "pass $X$ bits using $X/R$ channel uses".

E.g., if $C = 2$, then we can (with arbitrarily low error probability, for long enough messages) expect to transmit a bit with every use ($R = 1$); but we would be real fools to try to pass 3 bits with every use: that can't be done (by the converse direction of the theorem).
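As a concrete sketch (my own illustration, not part of this answer): for the standard binary symmetric channel with crossover probability $p$, the capacity is $C = 1 - H(p)$ bits per use, so the theorem says every rate below that value is achievable:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel
    with crossover probability p, in bits per channel use."""
    return 1.0 - h2(p)

# A noiseless binary channel has capacity 1 bit per use,
# so any rate R < 1 is achievable, while R > 1 is not.
print(bsc_capacity(0.0))   # noiseless: 1 bit per use
print(bsc_capacity(0.11))  # roughly half a bit per use
print(bsc_capacity(0.5))   # a coin-flip channel carries nothing
```

The last line is the extreme case of the "real fools" remark above: no coding scheme can beat the capacity, and for $p = 0.5$ the capacity is zero.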

  • The setting which makes it simplest is this: you have an infinite stream of bits which you have to pass through the channel (think of an international telephone line; after all, Shannon worked for Bell). You can get low error probability for each rate smaller than the channel capacity, but in order to do so you will have to group a stream of bits to be "passed together" (and use error-correcting codes). This grouping is the increase of the message length (or the number of messages). 2011-10-09

I am not absolutely sure, but I think that if you fix the number of messages you want to send (or, equivalently, fix the number of bits required to represent your message set at $k = nR$), then as you increase $n$, $R$ will drop. In a way, you are increasing the redundancy of your message before sending it, so with every use of the channel you are sending less of the actual message.
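A small simulation (my own illustration, with hypothetical parameter choices) of this trade-off, using the crudest possible scheme, a repetition code over a binary symmetric channel: the message bits are fixed, and repeating each bit more times lowers the rate $R = 1/\text{reps}$ while also lowering the error probability. Note that this is exactly the naive trade-off described above; Shannon's theorem says we can do far better and keep $R$ bounded away from zero.

```python
import random

def send_through_bsc(bits, p, rng):
    """Flip each bit independently with probability p (binary symmetric channel)."""
    return [b ^ (rng.random() < p) for b in bits]

def repetition_error_rate(reps, p, k=1000, seed=0):
    """Encode k message bits with a reps-fold repetition code (rate R = 1/reps),
    send over a BSC(p), decode by majority vote, and return the bit error rate."""
    rng = random.Random(seed)
    message = [rng.randint(0, 1) for _ in range(k)]
    errors = 0
    for b in message:
        received = send_through_bsc([b] * reps, p, rng)
        decoded = 1 if sum(received) > reps / 2 else 0
        errors += decoded != b
    return errors / k

# Same k message bits each time; as the block length grows,
# the rate 1/reps drops and the error probability drops with it.
for reps in (1, 3, 9, 27):
    print(f"R = 1/{reps}: error rate = {repetition_error_rate(reps, p=0.2)}")
```

Odd values of `reps` are used so majority voting never ties; `p = 0.2` is an arbitrary choice of channel noise for the demonstration.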

However, Shannon's channel coding theorem talks about the achievability of a rate. Instead of fixing the number of messages, we first fix the rate $R$. All it says is that this rate is achievable as long as $R < C$. Fixing $R$ basically means we are fixing the amount of redundancy we add to each message. Increasing $n$ in this case will enlarge the message set, but we do not really care about that: we are not talking about a specific message set, but rather about whether a rate can be achieved.