Relative entropy between singular measures
7
Usually, to define the relative entropy between two probability measures, one assumes absolute continuity. Is it possible to extend the usual definition to the case where the measures are not absolutely continuous with respect to each other?
probability
entropy
singular-measures
-
+1, Interesting question! FYI, the [Kullback-Leibler divergence](http://en.wikipedia.org/wiki/Kullback-Leibler_divergence) wiki page explicitly mentions the case of discrete random variates. – 2011-11-07
1 Answer
3
Relative entropy between two probability measures $P$ and $Q$ can be defined even if $P$ is not absolutely continuous with respect to $Q$. In any case, $P$ and $Q$ are both absolutely continuous with respect to a common measure $\mu$ (one can take $\mu=\frac{P+Q}{2}$). The relative entropy between $P$ and $Q$ is then defined as $D(P\|Q)=\int p\log\frac{p}{q}\,d\mu,$ where $p=dP/d\mu$ and $q=dQ/d\mu$, with the conventions $0\log\frac{0}{q}=0$ and $p\log\frac{p}{0}=+\infty$ for $p>0$. The value does not depend on the choice of the dominating measure $\mu$; in particular, $D(P\|Q)=+\infty$ whenever $P$ is not absolutely continuous with respect to $Q$.
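To make this concrete on a finite alphabet, here is a minimal numerical sketch (NumPy and the function name `relative_entropy` are my own choices, not canonical). With $\mu=(P+Q)/2$ dominating both measures, the densities $p,q$ are just ratios of point masses and the integral becomes a finite sum:

```python
import numpy as np

def relative_entropy(P, Q):
    """D(P||Q) computed via the common dominating measure mu = (P + Q)/2.

    P, Q: probability vectors on the same finite alphabet.
    Conventions: 0*log(0/q) = 0, and p*log(p/0) = +inf for p > 0.
    """
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    mu = (P + Q) / 2.0                                         # dominating measure
    p = np.divide(P, mu, out=np.zeros_like(P), where=mu > 0)   # p = dP/dmu
    q = np.divide(Q, mu, out=np.zeros_like(Q), where=mu > 0)   # q = dQ/dmu
    if np.any((p > 0) & (q == 0)):   # P not absolutely continuous w.r.t. Q
        return np.inf
    mask = p > 0                     # terms with p = 0 contribute nothing
    return float(np.sum(mu[mask] * p[mask] * np.log(p[mask] / q[mask])))
```

Since $\sum_i \mu_i\, p_i \log(p_i/q_i) = \sum_i P_i \log(P_i/Q_i)$, this reduces to the familiar discrete formula whenever $P\ll Q$.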
-
I think what you have in mind is that $P$ and $Q$ have disjoint supports. In that case $D(P\|Q)=\infty$. But in the case where merely $P(A)=0$ while $Q(A)>0$ for some $A$ (i.e., $Q$ is not absolutely continuous w.r.t. $P$), $D(P\|Q)$ may or may not be $\infty$. If for some $A$ we have $P(A)\neq 0$ but $Q(A)=0$, then we definitely have $D(P\|Q)=\infty$. – 2011-11-11
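The three cases in this comment can be checked with the sketch above (illustrative values only):

```python
relative_entropy([1, 0], [0, 1])                  # disjoint supports -> inf
relative_entropy([0, 0.5, 0.5], [0.2, 0.4, 0.4])  # P << Q but Q not << P -> finite (~0.223)
relative_entropy([0.2, 0.4, 0.4], [0, 0.5, 0.5])  # P not << Q -> inf
```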