
Given two class-conditional normal distributions N(x, y) and N(a, b), where N(μ, σ²) denotes a normal distribution with mean μ and variance σ²:

how do you construct a decision rule that minimizes the probability of error, if the prior probabilities are equal? Can you give an example?

What if the prior probabilities are different, such as
P(Distribution 1) = 0.70
P(Distribution 2) = 0.30 ?

  • Though this is a very mathematical question, you might get a better response at http://stats.stackexchange.com (2011-03-11)

1 Answer


I think what you are looking for is the Bayes optimal classifier: the classifier that minimizes the probability of error (the Bayes error). In other words, we have:

$\text{classify } \textbf{x} \text{ as } \begin{cases} C_0 & \text{if } P(C_0 \mid \textbf{x}) > P(C_1 \mid \textbf{x}) \\ C_1 & \text{otherwise} \end{cases}$
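Since $P(C_i \mid \textbf{x}) \propto P(C_i)\,p(\textbf{x} \mid C_i)$ by Bayes' rule, the rule amounts to comparing prior-weighted likelihoods. Here is a minimal one-dimensional sketch of that comparison (the means, variances, and priors below are made-up illustration values, not from the question): with equal variances $\sigma^2$ and means $\mu_0 < \mu_1$, the decision threshold works out to $\frac{\mu_0+\mu_1}{2} + \frac{\sigma^2 \ln(P(C_0)/P(C_1))}{\mu_1-\mu_0}$, so unequal priors shift the boundary toward the less probable class.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_classify(x, mu0, sigma0, mu1, sigma1, p0=0.5, p1=0.5):
    """Bayes optimal rule: pick the class with the larger posterior.
    P(C_i | x) is proportional to prior * likelihood, so we compare those."""
    score0 = p0 * gaussian_pdf(x, mu0, sigma0)
    score1 = p1 * gaussian_pdf(x, mu1, sigma1)
    return 0 if score0 > score1 else 1

# Example with mu0 = 0, mu1 = 2, shared sigma = 1 (illustrative values).
# Equal priors: the boundary is the midpoint (mu0 + mu1) / 2 = 1.
print(bayes_classify(0.9, 0.0, 1.0, 2.0, 1.0))              # class 0
print(bayes_classify(1.1, 0.0, 1.0, 2.0, 1.0))              # class 1

# Unequal priors P(C0) = 0.7, P(C1) = 0.3: the boundary shifts to
# 1 + ln(0.7/0.3)/2 ≈ 1.42, so x = 1.3 now falls on the C0 side.
print(bayes_classify(1.3, 0.0, 1.0, 2.0, 1.0, p0=0.7, p1=0.3))  # class 0
print(bayes_classify(1.5, 0.0, 1.0, 2.0, 1.0, p0=0.7, p1=0.3))  # class 1
```

The key design point is that the normalizing constant $p(\textbf{x})$ is the same for both classes, so it can be dropped from the comparison.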

  • Do you have an example, to understand it better? (2011-03-11)