Given a random observation $x$ of a random variable $\mathcal{X}$, we have two different distributions under the two hypotheses \begin{align} \mathcal{H}_0&: \mathcal{X}\sim K_1\\ \mathcal{H}_1&: \mathcal{X}\sim K_2 \end{align} The likelihood ratio for one observation is defined as \begin{equation} L(x)=\frac{p(x\mid\mathcal{H}_1)}{p(x\mid\mathcal{H}_0)} \end{equation} and the false-alarm and miss probabilities are defined as \begin{align} P_F&=\int_{\{x:\, L(x)>\gamma\}}p(x\mid \mathcal{H}_0)\,dx\\ P_M&=\int_{\{x:\, L(x)<\gamma\}}p(x\mid \mathcal{H}_1)\,dx \end{align} where the threshold $\gamma$ is a real number.
Sweeping the threshold from $-\infty$ to $\infty$ traces out a set of pairs $(P_F, P_M)$, which defines a convex function $f$ with $P_M=f(P_F)$.
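For concreteness, here is a small numerical sketch of that threshold sweep. The Gaussian pair $\mathcal{H}_0:\mathcal{X}\sim N(0,1)$ versus $\mathcal{H}_1:\mathcal{X}\sim N(1,1)$ is my own choice of example, not part of the question; for this pair the likelihood-ratio test $L(x)>\gamma$ reduces to a simple threshold $x>t$ on the observation itself.

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Sweep the (equivalent) threshold t on x; each t gives one point (P_F, P_M).
ts = [i / 100.0 for i in range(-400, 401)]
PF = [1.0 - Phi(t) for t in ts]   # false alarm: decide H1 when x > t, under H0
PM = [Phi(t - 1.0) for t in ts]   # miss: decide H0 when x < t, under H1

# The pairs (PF, PM) trace the convex trade-off curve P_M = f(P_F);
# the sum P_F + P_M is minimized at t = 1/2, where it equals 2*Phi(-1/2).
best_sum = min(a + b for a, b in zip(PF, PM))
```

Plotting `PM` against `PF` shows the convex curve; no choice of $t$ pushes the curve itself downward, only moves the operating point along it.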
I wonder whether $f$ can be changed by transforming the observations, $y=g(x)$, and testing on $y$ instead. I know that it is impossible to lower $P_F+P_M$ at any point on $f$.
Is it possible to lower $P_F$ at some points with the help of $g$, while allowing $P_M$ to be larger, compared to the case where we have no $g$?
Since this is also impossible when $g$ is deterministic, my question focuses on random $g$.
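One way to see why a deterministic $g$ cannot help, at least in the invertible case: the Jacobian factors cancel in the likelihood ratio of $y=g(x)$, so the likelihood-ratio test makes exactly the same decisions on $y$ as on $x$, and $(P_F, P_M)$ is unchanged. A minimal Monte Carlo sketch, reusing the Gaussian pair $N(0,1)$ vs $N(1,1)$ and the (hypothetical) invertible transform $g(x)=x^3$:

```python
import math
import random

random.seed(0)

def L_x(x):
    # For N(1,1) vs N(0,1): L(x) = exp(x - 1/2).
    return math.exp(x - 0.5)

def L_y(y):
    # Likelihood ratio of y = x**3: the Jacobian |dg^{-1}/dy| appears in both
    # numerator and denominator and cancels, leaving L_x(g^{-1}(y)).
    x = math.copysign(abs(y) ** (1.0 / 3.0), y)   # real cube root
    return L_x(x)

gamma = 1.0
xs = [random.gauss(0.0, 1.0) for _ in range(10000)]   # samples under H0
decisions_x = [L_x(x) > gamma for x in xs]            # test on raw x
decisions_y = [L_y(x ** 3) > gamma for x in xs]       # test on y = g(x)
# Identical decisions on every sample => identical P_F (and, by the same
# argument with samples drawn under H1, identical P_M).
```

A non-invertible deterministic $g$ can only merge points of the sample space, which degrades (or at best preserves) the curve, which is why the interesting case is a random $g$.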
Thank you very much for reading this post. Please feel free to edit it, since I only have internet access on my mobile, which leads to some unavoidable errors.