The white noise model is often used when considering the problem of errors in a transmitted binary message. This model is based on the following assumptions:
a) there is an equal probability, p, of an error in each bit of the message;
b) errors in different bits of the message are independent.
One way of encoding a binary message so as to make it error detecting is to count the number of 1s in the message and then append an extra bit so that the resulting message (including the additional bit) has even parity, that is, an even number of 1s in it. If a message is received with an odd number of 1s, the receiver then knows that it contains an odd number of errors. The receiver cannot detect an even number of errors with this scheme.
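To make the scheme concrete, here is a small Python sketch of my own (the function names and the example message are purely illustrative, not part of the problem statement):

```python
# Illustrative sketch of even-parity encoding and the receiver's check.

def encode_even_parity(bits):
    """Append a parity bit so the whole message has an even number of 1s."""
    parity = sum(bits) % 2
    return bits + [parity]

def error_detected(received):
    """The receiver flags an error only when the count of 1s is odd."""
    return sum(received) % 2 == 1

msg = [1, 0, 1, 1, 0, 1, 0]
sent = encode_even_parity(msg)           # now has an even number of 1s
corrupted = sent.copy()
corrupted[0] ^= 1                        # flip one bit -> odd parity

print(error_detected(sent))              # False
print(error_detected(corrupted))         # True (odd number of errors)
```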
1) Determine the probability of an undetected error in a binary message comprising n bits (including the parity bit), assuming the white noise model for errors in the message.
2) Determine the probability of an undetected error in a binary message consisting of 8 bits (including the parity bit), assuming the white noise model for errors with p = 1/3 in the message.

---------------------------------------------------
Proposed solution: From (b) above we are told that "errors in different bits of the message are independent", so I believe the problem can be solved using Bernoulli trials. We know that P = 0.5 (where P is the probability of success, i.e. that a bit is error free) and Q = 0.5 (where Q is the probability of an error in a bit).
The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 - p, is

$C(n, k)\,p^{k}q^{n-k}$.
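As a quick numerical sanity check on this formula, here is a short Python sketch of my own (it uses the standard-library `math.comb`, available since Python 3.8; the choice of n = 8, k = 2 and p = 1/3 is just an example drawn from part (2)):

```python
from math import comb

def binom_prob(n, k, p):
    """Probability of exactly k 'successes' in n independent Bernoulli trials."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# Treating an error as a "success" with p = 1/3 per bit, the probability of
# exactly 2 errors in an 8-bit message is:
print(binom_prob(8, 2, 1/3))   # C(8,2) * (1/3)^2 * (2/3)^6
```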
In the case of trying to solve (1), would it be correct to assume we have $C(n, 2)$ potential combinations, or is this wrong? I am unsure whether the parity bit should be included in the analysis, if that makes sense. Could someone please point me in the right direction if I have gone astray here?
Many thanks