
In information theory, how do I calculate the probability of an erroneous transmission? Take, for instance, a binary symmetric channel with bit-error probability $1-d=0.25$, over which we send codewords of length $6$ from a Hamming code able to correct up to $1$ error.

  • I was the one who deleted my comment asking which shortened Hamming code was under consideration. I have posted a detailed answer about how shortened Hamming codes work in an answer to [another question](http://math.stackexchange.com/q/189315/15941) by the OP. I don't think the question is fully answerable without knowing more about the code. (2012-08-31)

1 Answer


We assume independence of bit errors. This is a somewhat dubious assumption, since errors often occur in bursts.

The probability of erroneous interpretation (or inability to decode) of a codeword of length $6$ is the probability that $2$ or more bits are incorrectly transmitted. The probability that $0$ bits are wrong is $(0.75)^6$. The probability that exactly $1$ bit is wrong is $6(0.25)(0.75)^5$. Add these two numbers and subtract the result from $1$.
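The computation above can be sketched numerically; this is a small illustration assuming the independent-bit-error model and the parameters from the question ($n = 6$, per-bit error probability $0.25$), not part of the original answer:

```python
# Probability that a length-6 codeword suffers 2 or more bit errors,
# assuming independent bit errors with probability p = 0.25 per bit
# and a code that corrects at most 1 error.
from math import comb

p = 0.25  # per-bit error probability (1 - d in the question)
n = 6     # codeword length

p0 = (1 - p) ** n                          # no bits in error: (0.75)^6
p1 = comb(n, 1) * p * (1 - p) ** (n - 1)   # exactly one bit in error: 6(0.25)(0.75)^5

p_bad = 1 - (p0 + p1)  # two or more errors
print(p_bad)           # ≈ 0.4661
```

So under these assumptions, roughly $46.6\%$ of codewords are either mis-decoded or undecodable.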

  • Actually, you have found the probability $P(C)$ of correct decoding. For _shortened_ Hamming codes, the complementary probability $1 - P(C)$ is the sum of the error probability $P(E)$ (meaning the output is the wrong codeword) and the _failure_ probability $P(F)$ (meaning the decoder is unable to decode the received word into a valid codeword because a _detectable_ but _undecodable_ error pattern has occurred). My answer to [this question](http://math.stackexchange.com/q/189315/15941) describes how such decoder failures occur, and they do depend on _how_ the shortening was done. (2012-08-31)