From Wikipedia:
Let $X$ be a random variable with a probability density function $f$ whose support is a set $\mathbb{X}$. The differential entropy $h(f)$ is defined as
$$ h(f) = -\int_\mathbb{X} f(x)\log f(x)\,dx, \quad \text{i.e.} \quad \mathrm{E}_f(-\log f(X)). $$
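To make this concrete (my own example, not part of the quote): for the uniform density $f(x) = 1/a$ on $[0, a]$,
$$ h(f) = -\int_0^a \frac{1}{a}\log\frac{1}{a}\,dx = \log a, $$
which is negative when $a < 1$, so differential entropy already behaves differently from its discrete counterpart.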
The entropy of a discrete probability measure is defined analogously as $\mathrm{E}_p (-\log p(X))$, where $p$ is the probability mass function, which can be viewed as a density with respect to the counting measure on the discrete sample space.
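For instance, for a fair coin with sample space $\{\mathrm{H}, \mathrm{T}\}$ and $p(\mathrm{H}) = p(\mathrm{T}) = 1/2$,
$$ \mathrm{E}_p(-\log p(X)) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2, $$
and here $p$ is exactly the density of the distribution with respect to the counting measure on $\{\mathrm{H}, \mathrm{T}\}$.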
Am I right that the concept of entropy depends on the underlying measure on the sample space, since the integrand is the density with respect to some underlying measure?
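What makes me suspect this (assuming my computation is right): if $Y = cX$ for a constant $c > 0$, then $f_Y(y) = f_X(y/c)/c$ and
$$ h(f_Y) = h(f_X) + \log c, $$
so differential entropy changes when the coordinate (and hence the Lebesgue measure) is rescaled, while discrete entropy is unaffected by any relabeling of the outcomes.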
If so, what is the underlying measure on the sample space for the differential entropy that Wikipedia refers to in the quote above?
More generally, what structure(s) must be defined on the sample space for it to be meaningful to discuss entropy, beyond $\mathbb{R}^n$ with the Lebesgue measure as the underlying measure?
Thanks and regards!