I have the following decision problem with 4 hypotheses: $$H_j:\ Y_k = N_k + s_{jk},\quad k=1,2,\ldots,n;\quad j=0,1,2,3,$$ where the signals are $$s_{jk} = E_0\sin\!\left(\omega_c T(k-1) + \left(j+\tfrac{1}{2}\right)\tfrac{\pi}{2}\right).$$ In vector form: $$H_j:\ \underline{Y} = \underline{N} + \underline{s}_j,\quad j=0,1,2,3.$$ How can I find the minimum error probability for equally likely signals in i.i.d. $N(0,\sigma^2)$ noise? (These signals are not orthogonal.) How can I obtain orthonormal signals for solving this problem? Thank you in advance.
How can I get the minimum error probability for this decision problem?
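For concreteness, here is a minimal numerical sketch (the values of $E_0$, $\omega_c$, $T$ and $n$ below are placeholders, since the problem leaves them unspecified) that builds the four signals, shows they are not orthogonal, and orthonormalizes them via Gram–Schmidt. Note that the phases $(j+\tfrac{1}{2})\tfrac{\pi}{2}$ give $\underline{s}_2=-\underline{s}_0$ and $\underline{s}_3=-\underline{s}_1$, so the four signals span only a two-dimensional subspace and Gram–Schmidt yields just two orthonormal functions.

```python
import numpy as np

# Placeholder values -- the problem statement does not fix E0, w_c, T or n.
E0, wc, T, n = 1.0, 2 * np.pi, 0.1, 8

k = np.arange(1, n + 1)
# s_{jk} = E0 * sin(w_c*T*(k-1) + (j + 1/2)*pi/2),  j = 0,1,2,3
S = np.array([E0 * np.sin(wc * T * (k - 1) + (j + 0.5) * np.pi / 2)
              for j in range(4)])

# Gram matrix <s_i, s_j>: nonzero off-diagonal entries show the signals
# are not orthogonal.
print(np.round(S @ S.T, 3))

# Each s_j is a combination of sin(w_c*T*(k-1)) and cos(w_c*T*(k-1)),
# so the four signals span a 2-D subspace; Gram-Schmidt (done here via QR)
# produces only two orthonormal basis vectors.
Q, _ = np.linalg.qr(S.T)
rank = np.linalg.matrix_rank(S)
basis = Q[:, :rank].T                # orthonormal basis vectors as rows
print(rank)                          # 2
print(np.round(basis @ basis.T, 3))  # 2x2 identity
```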
-
Since you are a new user, here are some tips to get you started: show your basic thoughts on the subject or things you have tried, or the parts where you are confused. If an acceptable answer is posted for you, don't forget to accept the answer by clicking the check-mark next to the post so that credit is properly assigned. – 2012-11-23
1 Answer
Conditioned on the $j$-th signal being transmitted, the likelihood function of the observation $(Y_1, \ldots, Y_n)$ is proportional to $\exp\left(-\frac{1}{2\sigma^2}\sum_{k=1}^n (Y_k - s_{jk})^2\right)$. Since the signals are equally likely to be transmitted, the minimum-error-probability decision rule is the same as the maximum-likelihood decision rule, viz.
Choose the hypothesis that has the largest likelihood
which in this instance means deciding that the signal $\underline{s}_j$ that is closest in Euclidean metric to the observation $\underline{Y}$ is the one most likely to have been transmitted. In other words, compute the four sums $$Z_j = \sum_{k=1}^n (s_{jk}-Y_k)^2, \qquad j = 0, 1, 2, 3,$$ and decide that signal $\underline{s}_j$ was transmitted if $Z_j < \min_{i:\, i\neq j} Z_i$.
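Expanding the square gives $Z_j = \|\underline{Y}\|^2 - 2\sum_{k=1}^n Y_k s_{jk} + \|\underline{s}_j\|^2$, so when the four signals have equal energy the rule is equivalent to choosing the largest correlation $\sum_k Y_k s_{jk}$. Since the signals here are not orthogonal, a closed-form error probability takes extra work; one quick sanity check of the minimum-distance rule is a Monte Carlo estimate. A minimal sketch, with placeholder values for $E_0$, $\omega_c$, $T$, $n$ and $\sigma$ (none of which the problem fixes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder values -- E0, w_c, T, n and sigma are not specified in the problem.
E0, wc, T, n, sigma = 1.0, 2 * np.pi, 0.1, 8, 0.5
k = np.arange(1, n + 1)
S = np.array([E0 * np.sin(wc * T * (k - 1) + (j + 0.5) * np.pi / 2)
              for j in range(4)])

trials, errors = 100_000, 0
for _ in range(trials):
    j = rng.integers(4)                        # equally likely hypotheses
    Y = S[j] + rng.normal(0.0, sigma, size=n)  # Y = s_j + N,  N_k i.i.d. N(0, sigma^2)
    Z = np.sum((S - Y) ** 2, axis=1)           # Z_j = sum_k (s_{jk} - Y_k)^2
    if np.argmin(Z) != j:                      # minimum-distance / ML decision
        errors += 1

print("estimated minimum error probability:", errors / trials)
```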
-
@Prof. Sarwate: Thank you for providing your lecture notes on this subject. Is my goal of minimum error probability the conditional error probability given that a specific $\underline{s}_j$ was transmitted (on page 162 of your lecture notes)? – 2012-11-24