Relative entropy between singular measures

Usually, to define the relative entropy between two probability measures, one assumes absolute continuity. Is it possible to extend the usual definition to the case where absolute continuity fails?
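For reference, the standard definition the question presumably starts from (this formulation is a standard one, not taken from the question itself) is

$$
D(P \,\|\, Q) \;=\;
\begin{cases}
\displaystyle\int \log\frac{dP}{dQ}\, dP, & P \ll Q,\\[4pt]
+\infty, & \text{otherwise.}
\end{cases}
$$

Note that the second case is itself already a (trivial) extension: the question is whether anything finer than $+\infty$ can be said when $P$ and $Q$ are not absolutely continuous, in particular when they are mutually singular.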
probability
entropy
singular-measures
+1, interesting question! FYI, the [Kullback-Leibler divergence](http://en.wikipedia.org/wiki/Kullback-Leibler_divergence) wiki page explicitly mentions the case of discrete random variates. – 2011-11-07
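For the discrete case the comment mentions, here is a minimal sketch (function name and the $+\infty$ convention are illustrative, not from the question) of how the usual definition behaves when absolute continuity fails:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q).

    Returns +inf when p is not absolutely continuous w.r.t. q,
    i.e. when p assigns positive mass to a point where q assigns none.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0/qi) = 0 by the usual convention
        if qi == 0.0:
            return math.inf  # absolute continuity fails
        total += pi * math.log(pi / qi)
    return total

# Mutually singular distributions (disjoint supports) give +inf:
print(kl_divergence([1.0, 0.0], [0.0, 1.0]))  # inf
# Absolutely continuous case gives a finite value:
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

The disjoint-support case is exactly the situation the question asks about: the standard definition collapses to $+\infty$, carrying no information about how the two measures differ.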