6

When a random variable $X$ has only one possible outcome $x_0$, the probability density at $x_0$ is infinite while the density everywhere else is zero, so the p.d.f. is exactly a delta function: $f_X(x) = \delta(x-x_0)$.

However, when I try to calculate the entropy of this random variable, a problem arises: how can I evaluate the integral $\int_{-\infty}^{+\infty}{\delta(x-x_0)\log\delta(x-x_0) \, dx}$?

  • 1
    @Strin: I'd write "probability density function" in the title; "distribution function" is usually construed as "cumulative distribution function". (2012-09-04)

4 Answers

4

Consider an absolutely continuous distribution with location parameter $x_0$ and scale parameter $\sigma$. We consider two such distributions: normal $\mathcal{D}_1 = \mathcal{N}\left(x_0, \sigma\right)$, and continuous uniform $\mathcal{D}_2 = \mathcal{U}\left(x_0-\sqrt{3} \sigma, x_0+\sqrt{3} \sigma\right)$. Distributions $\mathcal{D}_1$ and $\mathcal{D}_2$ have equal means and variances.

Carrying out the computation of the Shannon (differential) entropy $H(\mathcal{D}) = \mathbb{E}\left(-\ln(f_\mathcal{D})\right)$ yields the following results:
$$\begin{eqnarray} H\left(\mathcal{D}_1\right) &=& \ln(\sigma)+\frac{\ln(2 \pi)}{2} + \frac{1}{2 \sigma^2} \mathbb{Var}(\mathcal{D}_1) = \ln(\sigma) + \frac{1}{2}\left(1 + \ln(2\pi)\right) \\ H\left(\mathcal{D}_2\right) &=& \mathbb{E}\left(\ln(2 \sqrt{3} \sigma)\right) = \ln(\sigma) + \ln\left(2 \sqrt{3}\right) \end{eqnarray}$$
Additionally, consider a Cauchy distribution $\mathcal{D}_3 = \mathrm{Cauchy}\left(x_0, \sigma\right)$:
$$H\left(\mathcal{D}_3\right) = \ln(\sigma) + \ln(\pi) + \frac{1}{\pi} \int_{-\infty}^\infty \frac{\ln\left(1+x^2\right)}{1+x^2}\,\mathrm{d}x = \ln(\sigma) + \ln(4 \pi)$$
Notice that in the limit $\sigma \to 0^+$ each distribution converges to a degenerate distribution localized at $x_0$, and in each case the entropy diverges to $-\infty$ as $\ln(\sigma)$.
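
As a quick numerical check (an illustration added here, not part of the original computation; scipy.stats is assumed only for convenience), the built-in entropy methods reproduce these closed forms and show the common $\ln(\sigma)$ divergence:

```python
# Numerical check (illustrative): differential entropies of the three
# distributions above, evaluated for a shrinking scale parameter sigma.
import numpy as np
from scipy import stats

x0 = 0.0  # location parameter; it does not affect the entropy

for sigma in (1.0, 0.1, 0.01, 0.001):
    h1 = stats.norm(loc=x0, scale=sigma).entropy()              # N(x0, sigma)
    h2 = stats.uniform(loc=x0 - np.sqrt(3) * sigma,
                       scale=2 * np.sqrt(3) * sigma).entropy()  # U(x0 - sqrt(3) sigma, x0 + sqrt(3) sigma)
    h3 = stats.cauchy(loc=x0, scale=sigma).entropy()            # Cauchy(x0, sigma)
    print(f"sigma={sigma:6.3f}  ln(sigma)={np.log(sigma):8.4f}  "
          f"H1={h1:8.4f}  H2={h2:8.4f}  H3={h3:8.4f}")
# Each entropy equals ln(sigma) plus a distribution-specific constant,
# so all three diverge to -infinity as sigma -> 0+.
```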

Added: Per the OP's request, one can repeat the above argument for an arbitrary continuous random variable.

Let $f_X(x)$ be the pdf of a standardized random variable $X$ with zero mean and unit variance. Consider $Y = \sigma X$, with pdf $f_Y(y) = \frac{1}{\sigma} f_X\!\left(\frac{y}{\sigma}\right)$. Clearly the Shannon entropy of $Y$ is $H_Y = \ln(\sigma) + H_X$, where $H_X$ is independent of $\sigma$. As $\sigma \downarrow 0$, the distribution of $Y$ converges to a degenerate distribution, and the Shannon entropy $H_Y$ tends to $-\infty$ as $\ln(\sigma)$.
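
Writing out the change of variables behind $H_Y = \ln(\sigma) + H_X$ (a short sketch added here for completeness, substituting $x = y/\sigma$ in the second line):
$$\begin{eqnarray} H_Y &=& -\int_{-\infty}^{\infty} \frac{1}{\sigma} f_X\!\left(\tfrac{y}{\sigma}\right)\left[\ln f_X\!\left(\tfrac{y}{\sigma}\right) - \ln(\sigma)\right]\mathrm{d}y \\ &=& -\int_{-\infty}^{\infty} f_X(x)\,\ln f_X(x)\,\mathrm{d}x + \ln(\sigma)\int_{-\infty}^{\infty} f_X(x)\,\mathrm{d}x = H_X + \ln(\sigma). \end{eqnarray}$$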

  • 0
    You showed this fact by computing the entropy for several distributions. Is there any general theorem saying that the entropy of the delta function is $-\infty$? (2012-09-20)
1

Try taking the limit of random variables uniformly distributed on $[x_0-\varepsilon, x_0+\varepsilon]$. What is the entropy of these? What happens as $\varepsilon\to0$?
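
A quick numerical sketch of where this hint leads (my own illustration; scipy.stats is assumed purely for convenience):

```python
# The uniform distribution on [x0 - eps, x0 + eps] has differential entropy
# ln(2*eps), which tends to -infinity as eps -> 0 (illustrative check).
import numpy as np
from scipy import stats

x0 = 0.0
for eps in (1.0, 1e-2, 1e-4, 1e-8):
    h = stats.uniform(loc=x0 - eps, scale=2 * eps).entropy()
    print(f"eps={eps:8.1e}  H={h:10.4f}  ln(2*eps)={np.log(2 * eps):10.4f}")
```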

-3

My intuition says that the entropy of a delta function should be zero.

  1. Since it counts a system's microstates, entropy is always positive.
  2. The delta distribution implies certainty of the value $x_0$, and thus the entropy is $\ln(1) = 0$.