
I'm trying to find out whether my communication protocol should be padded with redundant information, in order to let the receiver correct errors (error-correcting code, ECC) without needing a retransmission. In other words, I'm trying to find the point at which it's more efficient to pay a small overhead on each message rather than retransmit a whole message from time to time. If needed, I assume I can ask God to tell me whether the packet has been altered or not.

Let's take an ECC for which, when $t$ extra bits are added, $t/2$ corrections are possible. Each bit has an error rate of $p$, so the probability $P$ that the message gets altered (assuming independent errors) is $P=1-(1-p)^n$, where $n$ is the message length. The probability that at most $\lfloor\frac t2\rfloor$ errors occur is: $\displaystyle P(X\leq\tfrac t2)=\sum_{k=0}^{\lfloor\frac t2\rfloor}{n+t\choose k}p^k(1-p)^{n+t-k}$

I think I have done most of the work, but I'm not sure how to go from there to solve my initial question. I think I will need to solve the inequality $P\leq P(X\leq\frac t2)$, but I'm not sure, and in any case I don't know how to solve it. If it turns out to be infeasible to solve, I can give numerical values to help with the resolution.
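The two probabilities above are easy to evaluate numerically. Here is a small sketch (the function names and the example values $n=1000$, $p=10^{-3}$, $t=16$ are my own, chosen just for illustration):

```python
from math import comb

def prob_altered(n, p):
    """P = 1 - (1-p)^n: probability that an n-bit message
    suffers at least one bit error."""
    return 1 - (1 - p) ** n

def prob_correctable(n, t, p):
    """P(X <= t/2): probability that at most floor(t/2) errors occur
    among the n+t transmitted bits, so the ECC can recover the message."""
    m = n + t
    return sum(comb(m, k) * p**k * (1 - p) ** (m - k)
               for k in range(t // 2 + 1))

# Example: 1000-bit message, bit error rate 1e-3, 16 ECC bits
print(prob_altered(1000, 1e-3))        # ≈ 0.632
print(prob_correctable(1000, 16, 1e-3))
```

With these numbers roughly 63% of unprotected messages are altered, while 16 ECC bits (8 correctable errors) recover almost all of them.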

1 Answer


You know:

a) The probability that the message has been altered,

b) The probability that the message has either not been altered ($k=0$) or that it has been altered and the ECC will be able to fix it ($P(X \leq \frac{t}{2})$).

From b), you can figure out the probability that the message has been altered and the ECC will not be able to fix it.

Therefore, you can figure out the fraction of the time the message would have to be resent given no ECC, let us call it $p_n$ and the fraction of the time it would have to be resent given the ECC, $p_E$.

The expected number of bits sent with no ECC = $n/(1-p_n)$ (we have to take into account that the resent message also might be altered, requiring more than one retransmission, hence the division by $1-p_n$).

The expected number of bits sent with the ECC = $(n+t)/(1-p_E)$.

From this, I think you should be able to get through the rest of the problem.

  • Yes, I'm trying to find what $t$ I need to use in order to have the best throughput possible, on average. In other words, I'm trying to find, for any given message length $n$ and error rate $p$, how much redundancy to add to my message to get the best expected throughput. If the result gave $t=0$, it would mean that no redundancy is actually more efficient, in spite of a fraction $P$ of messages being lost on average due to alteration. So is it better (in terms of number of bits received for every 100 sent) to add redundancy to each packet (and if so, how much), or to let some get corrupted and resend them? (2012-06-14)