
Given random observations $x$ of a random variable $\mathcal{X}$, we have two different distributions under the two hypotheses \begin{align} \mathcal{H}_0&: \mathcal{X}\sim K_1\\ \mathcal{H}_1&: \mathcal{X}\sim K_2 \end{align} The likelihood ratio for one observation is defined as \begin{equation} L(x)=\frac{p(x|\mathcal{H}_1)}{p(x|\mathcal{H}_0)} \end{equation} and the false-alarm and missed-detection probabilities are defined as \begin{align} P_F&=\int_{\{x:\, L(x)>\gamma\}}p(x| \mathcal{H}_0)\,dx\\ P_M&=\int_{\{x:\, L(x)<\gamma\}}p(x| \mathcal{H}_1)\,dx \end{align} where $\gamma$ is the threshold, a real number.
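
As a concrete numerical sketch of these definitions (not part of the question itself), assume a hypothetical Gaussian mean-shift pair $\mathcal{H}_0:\mathcal{X}\sim\mathcal{N}(0,1)$ versus $\mathcal{H}_1:\mathcal{X}\sim\mathcal{N}(1,1)$; the names `mu0`, `mu1`, `gamma` below are illustrative only:

```python
# Minimal sketch, assuming H0: X ~ N(0,1) and H1: X ~ N(1,1).
import numpy as np
from scipy.stats import norm

mu0, mu1, sigma = 0.0, 1.0, 1.0
gamma = 1.0  # threshold on the likelihood ratio L(x)

# For this pair, L(x) is monotone increasing in x, so {L(x) > gamma} = {x > t}.
t = (sigma**2 * np.log(gamma)) / (mu1 - mu0) + (mu0 + mu1) / 2.0

P_F = norm.sf(t, loc=mu0, scale=sigma)   # P(L(x) > gamma | H0)
P_M = norm.cdf(t, loc=mu1, scale=sigma)  # P(L(x) < gamma | H1)
print(f"threshold on x: {t:.3f}, P_F = {P_F:.4f}, P_M = {P_M:.4f}")
```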

Sweeping the threshold $\gamma$ from $-\infty$ to $\infty$, we obtain a set of pairs $(P_F, P_M)$ that traces out a convex function $f$ with $P_M=f(P_F)$.
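
A sketch of that threshold sweep under the same hypothetical Gaussian pair, with a crude numerical check that $f$ is convex (the grid and tolerance are arbitrary choices):

```python
# Trace P_M = f(P_F) by sweeping the threshold, assuming N(0,1) vs N(1,1).
import numpy as np
from scipy.stats import norm

t_grid = np.linspace(-4, 4, 161)            # thresholds on x (L is monotone in x)
P_F = norm.sf(t_grid, loc=0.0, scale=1.0)   # false-alarm probability
P_M = norm.cdf(t_grid, loc=1.0, scale=1.0)  # miss probability

# Sort by increasing P_F and check convexity via non-decreasing slopes.
order = np.argsort(P_F)
slopes = np.diff(P_M[order]) / np.diff(P_F[order])
print("f is convex on this grid:", np.all(np.diff(slopes) >= -1e-6))
```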

I wonder whether it is possible to change $f$ by transforming the observations via $y=g(x)$. I know that it is impossible to lower $P_F+P_M$ at any point of $f$.

Is it possible to lower some $P_F$ with the help of $g$, while allowing some $P_M$ to become larger, compared to the case where we have no $g$?

This is also impossible if $g$ is deterministic, so my question focuses on a random $g$.
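
To illustrate why a deterministic invertible $g$ cannot help, here is a Monte-Carlo sketch, again assuming the hypothetical Gaussian pair: since $L(x)$ is monotone in $x$ there, the LRT amounts to thresholding $x$, and thresholding $y=g(x)$ at $g(t)$ produces exactly the same decisions, hence the same $(P_F, P_M)$ pairs.

```python
# Sketch: a deterministic strictly increasing g leaves the ROC unchanged.
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, 100_000)   # observations under H0
x1 = rng.normal(1.0, 1.0, 100_000)   # observations under H1

g = np.expm1                          # an arbitrary strictly increasing g

for t in np.linspace(-1.0, 2.0, 7):   # thresholds on x
    pf_x, pm_x = (x0 > t).mean(), (x1 <= t).mean()
    pf_y, pm_y = (g(x0) > g(t)).mean(), (g(x1) <= g(t)).mean()
    print(f"t={t:+.2f}  (P_F, P_M) on x: ({pf_x:.4f}, {pm_x:.4f})"
          f"  on y=g(x): ({pf_y:.4f}, {pm_y:.4f})")
```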

Thank you very much for reading this post. Please feel free to edit, since I only have internet access on my mobile, which leads to some unavoidable errors.

  • @Michael Yes, exactly. It can be made monotone under each hypothesis, so that $L(x|\mathcal{H}_0)$ and $L(x|\mathcal{H}_1)$ are used for the integration; one will be monotone increasing and the other monotone decreasing. The integration limits would be as before the edit, but over $x$ instead of $L$. Then I delete the word "monotone". – 2012-07-14

1 Answer


Since the likelihood ratio test $L(x)$ provides the lower bound on the ROC curve, for a given $P_F$ no randomization can provide a $P_M'$ that is lower than the $P_M$ obtained without randomization. Therefore $g$ can only reshape the ROC curve in such a way that $P_M'>P_M$ for some $P_F$. This is of no use, because tests other than $L(x)$ can also achieve such sub-optimal pairs $(P_F, P_M)$.
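
As one hedged illustration of this point (not the answerer's construction, just an example under the same hypothetical Gaussian assumptions as above): if the randomization is, say, $y = x + w$ with independent Gaussian noise $w$, then at every $P_F$ the best achievable $P_M$ only gets larger.

```python
# Sketch: randomizing by adding independent noise only pushes the curve up.
import numpy as np
from scipy.stats import norm

sigma_w = 1.0                                  # std of the added noise w
P_F = np.linspace(0.01, 0.99, 9)

d_x = 1.0                                      # mean separation / std for the LRT on x
d_y = 1.0 / np.sqrt(1.0 + sigma_w**2)          # same quantity for the LRT on y = x + w

P_M_x = norm.cdf(norm.isf(P_F) - d_x)          # best P_M using x
P_M_y = norm.cdf(norm.isf(P_F) - d_y)          # best P_M using the randomized y
print(np.all(P_M_y >= P_M_x))                  # True: randomization never helps
```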