A binary communication system is used to send one of two messages:
(i) message A is sent with probability 2/3, and consists of an infinite sequence of zeroes,
(ii) message B is sent with probability 1/3, and consists of an infinite sequence of ones.
The ith received bit is “correct" (i.e., the same as the transmitted bit) with probability 3/4, and is “incorrect" (i.e., a transmitted 0 is received as a 1, and vice versa), with probability 1/4. We assume that conditioned on any specific message sent, the received bits, denoted by Y1,Y2,… are independent.
Is Y2+Y3 independent of Y1? Is Y2-Y3 independent of Y1?
My questions:
1) Am I right that Y2 is dependent on Y1, and that Y3 is dependent on Y2 and thus on Y1?
2) How can I compute the PMF (these are discrete variables, so a joint distribution table rather than a PDF) of Z1 = Y2+Y3 and Z2 = Y2-Y3?
My idea was to first compute P(Y1=0) and P(Y2=1); then P(Y2=0|Y1=0), P(Y2=0|Y1=1), P(Y2=1|Y1=0), P(Y2=1|Y1=1); and then P(Y3=0|Y1=0, Y2=0), ..., P(Y3=1|Y1=1, Y2=1).
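Here is how I would carry out the first steps of that plan in Python (just a sketch: I encode each message by the bit it sends on every position, and use the law of total probability over the message; exact fractions avoid rounding):

```python
from fractions import Fraction as F

# Message A sends all 0s with prior 2/3; message B sends all 1s with prior 1/3.
priors = {0: F(2, 3), 1: F(1, 3)}

def p_bit(received, sent):
    """P(Y_i = received | the message transmits `sent` on every bit)."""
    return F(3, 4) if received == sent else F(1, 4)

def p_y1_y2(a, b):
    """P(Y1 = a, Y2 = b), by conditioning on the message (total probability)."""
    return sum(pr * p_bit(a, s) * p_bit(b, s) for s, pr in priors.items())

p_y1_0 = sum(pr * p_bit(0, s) for s, pr in priors.items())  # P(Y1 = 0) = 7/12
p_y2_0_given_y1_0 = p_y1_y2(0, 0) / p_y1_0                  # Bayes: = 19/28

# By symmetry P(Y2 = 0) also equals 7/12, which differs from 19/28,
# so Y2 is indeed dependent on Y1 (observing Y1 shifts belief about the message).
print(p_y1_0, p_y2_0_given_y1_0)
```

So conditioning on Y1=0 moves P(Y2=0) from 7/12 to 19/28, which confirms my guess in question 1 that the bits are unconditionally dependent.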
But even once I have those values, how will the joint distribution tables for Z1 and Z2 differ?
I'm at a loss: I've read a lot about how to compute the distribution of a sum of two independent variables, and I do know how to check whether two variables are independent given their joint distribution table, but honestly for some reason this task makes me feel like a second-grader.
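Since everything here is finite, my fallback is brute force: enumerate the joint distribution of (Y1, Y2, Y3) by conditioning on the message, then read off the joint tables of (Y1, Z1) and (Y1, Z2) and test independence cell by cell. A sketch (same message encoding as above, exact fractions):

```python
from fractions import Fraction as F
from itertools import product

priors = {0: F(2, 3), 1: F(1, 3)}   # sent bit -> prior probability of that message

def p_bit(received, sent):
    """P(Y_i = received | the message transmits `sent` on every bit)."""
    return F(3, 4) if received == sent else F(1, 4)

# Joint PMF of (Y1, Y2, Y3): sum over messages of prior * product of bit probs.
joint = {}
for sent, prior in priors.items():
    for y in product((0, 1), repeat=3):
        p = prior
        for bit in y:
            p *= p_bit(bit, sent)
        joint[y] = joint.get(y, F(0)) + p

def table(f):
    """Marginal/joint PMF of f(Y1, Y2, Y3), accumulated from the full joint."""
    t = {}
    for y, p in joint.items():
        t[f(y)] = t.get(f(y), F(0)) + p
    return t

pmf_y1 = table(lambda y: y[0])
pmf_z1 = table(lambda y: y[1] + y[2])          # Z1 = Y2 + Y3 in {0, 1, 2}
pmf_z2 = table(lambda y: y[1] - y[2])          # Z2 = Y2 - Y3 in {-1, 0, 1}
joint_y1_z1 = table(lambda y: (y[0], y[1] + y[2]))
joint_y1_z2 = table(lambda y: (y[0], y[1] - y[2]))

# Independence holds iff every joint cell factors into the product of marginals.
indep_z1 = all(joint_y1_z1[(a, b)] == pmf_y1[a] * pmf_z1[b]
               for a in pmf_y1 for b in pmf_z1)
indep_z2 = all(joint_y1_z2[(a, b)] == pmf_y1[a] * pmf_z2[b]
               for a in pmf_y1 for b in pmf_z2)
print(indep_z1, indep_z2)
```

Running this, Z1 = Y2+Y3 comes out dependent on Y1 while Z2 = Y2-Y3 comes out independent of Y1, which matches the intuition that Y2+Y3 carries information about which message was sent but Y2-Y3 has the same conditional distribution under either message.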