Are there simple examples of capacity-achieving block codes for discrete memoryless channels?

The title pretty much says it all, but I am particularly interested in the case where the numbers of input and output symbols are equal and the transition matrix defining the DMC is nondegenerate. I am only interested in constructive/concrete examples, not (e.g.) a pointer to Shannon's channel coding theorem.
information-theory
2 Answers
Well, first of all: no matter how constructive the example, capacity is only achieved asymptotically in the block length. That being said, you can take a look at several families of codes:
- Of course, for long enough block length $n$, a randomly chosen code of any rate below capacity can, with high probability, be used to transmit information with exponentially small probability of decoding error.
- Certain families of codes on sparse graphs achieve capacity. In particular, punctured LDPC codes and punctured nonsystematic IRA codes achieve capacity (though only for the BEC).
- There are claims that a new variant of LDPC codes is capacity-achieving with polynomial decoding complexity.
- Concatenated codes, proposed by Forney, achieve exponentially decreasing error probabilities at all data rates less than capacity (polynomial decoding complexity).
- The new and very popular polar codes are also capacity-achieving, with low encoding and decoding complexity; a small numerical sketch of channel polarization is given below.
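As a rough illustration of the last item, here is a minimal Python sketch of channel polarization for the binary erasure channel BEC($\epsilon$). For the BEC, the Bhattacharyya parameter of a channel equals its erasure probability, and one polarization step maps a parameter $z$ to $2z - z^2$ (the degraded channel) and $z^2$ (the upgraded channel); the fraction of synthetic channels that become nearly noiseless approaches the capacity $1 - \epsilon$. The function name and the threshold for "nearly noiseless" below are illustrative choices of mine, not part of any standard construction or library.

```python
# Channel polarization for the binary erasure channel BEC(eps).
# For BEC(eps) the Bhattacharyya parameter Z equals the erasure
# probability. One polarization step splits a channel with parameter
# z into a "worse" channel (2z - z**2) and a "better" channel (z**2).
# As the number of steps grows, the fraction of synthetic channels
# with Z close to 0 approaches the capacity C = 1 - eps.

def polarize(eps, n_steps):
    zs = [eps]
    for _ in range(n_steps):
        nxt = []
        for z in zs:
            nxt.append(2 * z - z * z)  # degraded ("minus") channel
            nxt.append(z * z)          # upgraded ("plus") channel
        zs = nxt
    return zs

if __name__ == "__main__":
    eps = 0.5  # erasure probability; capacity is 1 - eps = 0.5
    for n in (4, 8, 12, 16):
        zs = polarize(eps, n)
        good = sum(1 for z in zs if z < 1e-3) / len(zs)
        print(f"n = {n:2d}: fraction of nearly noiseless channels = {good:.3f}")
```

Choosing the positions with the smallest Bhattacharyya parameters as information bits and freezing the rest gives a polar code whose rate can be pushed arbitrarily close to capacity while the block error probability under successive-cancellation decoding vanishes.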
I am not sure if this is what you were looking for.
Are you familiar with Arikan's polar codes?
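For concreteness, here is a minimal Python/NumPy sketch of the polar encoding transform $x = u\,F^{\otimes n}$ over GF(2), where $F$ is Arikan's $2 \times 2$ lower-triangular kernel and the block length is $N = 2^n$; the bit-reversal permutation is omitted since it only reorders the outputs. The choice of information positions below is arbitrary and purely illustrative; an actual polar code selects them from the quality of the synthetic channels (e.g. their Bhattacharyya parameters) and decodes with successive cancellation.

```python
import numpy as np

# Polar encoding with the 2x2 kernel F = [[1, 0], [1, 1]]:
# the codeword is x = u * F^{tensor n} over GF(2), for block length
# N = 2**n. Frozen positions in u are fixed to 0; the information
# bits go into the remaining positions (chosen arbitrarily here,
# not by the Bhattacharyya ranking a real polar code would use).

def polar_encode(u):
    x = np.array(u, dtype=np.uint8) % 2
    n = x.size
    assert n & (n - 1) == 0, "block length must be a power of two"
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            # butterfly: first half of each block gets XORed with the second half
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

if __name__ == "__main__":
    # N = 8; pretend positions 3, 5, 6, 7 are the information positions
    u = np.zeros(8, dtype=np.uint8)
    u[[3, 5, 6, 7]] = [1, 0, 1, 1]
    print("codeword:", polar_encode(u))
```

The butterfly structure makes encoding $O(N \log N)$, which is part of why polar codes have such low encoding and decoding complexity.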