
Smooth classical entropies generalize the standard notions of entropy. Smoothing means a minimization/maximization over all events $\Omega$ such that $p(\Omega)\geq 1-\varepsilon$ for a given $\varepsilon\geq 0$. The smooth min- and max-entropies are defined as:

\begin{equation} H_\infty^\varepsilon (X)= \max_\Omega \min_x(-\log p_{X\Omega}(x)) \end{equation}

\begin{equation} H_0^\varepsilon (X)= \min_\Omega\log \left|\{x:p_{X\Omega}(x)>0\}\right| \end{equation}

where $p_{X\Omega}(x)$ is the probability that the event $\Omega$ takes place and $X$ takes the value $x$.

So my question would be: given a specific discrete distribution, for instance something as simple as a fair coin, could someone explicitly compute its smooth max- and min-entropy?

1 Answer


I will refer you to this great informal introduction, which gives a few finite examples and explains the intuition behind the definitions.

However, $H_\infty^\varepsilon$ is not mentioned there. The effect of smoothing is similar to that on $H_0^\varepsilon$, but in reverse: the smoothing continuously "steals" probability from the most probable states until $\varepsilon$ probability has been stolen. If you have $n$ maximum-probability states and all other states are somewhat less probable, then the maximum probability is reduced by $\varepsilon/n$. So a single small peak of probability gets smoothed off, but the effect of smoothing is negligible in two notable cases: if you have many maximum-probability (or close-to-maximum-probability) states, or if the maximum probability is much greater than $\varepsilon$.
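
To make the stealing picture concrete, here is a minimal Python sketch (the function name `smooth_min_entropy` and the water-filling implementation are my own, not from any reference; entropies are measured in bits, i.e. base-2 logarithms). It lowers the top of the distribution until a total of $\varepsilon$ mass has been removed, then reports $-\log_2$ of the remaining maximum probability:

```python
import numpy as np

def smooth_min_entropy(p, eps):
    """H_inf^eps of a discrete distribution p, in bits (assumes 0 <= eps < 1).

    Water-filling from the top: find the level t such that the mass above t
    sums to eps, i.e. sum_i max(p_i - t, 0) = eps; the smoothed maximum
    probability is then t, and H_inf^eps = -log2(t).
    """
    p = np.sort(np.asarray(p, dtype=float))[::-1]  # descending order
    cum = np.cumsum(p)
    for k in range(1, len(p)):
        # Mass removed if we cut the top k states down to level p[k].
        cost = cum[k - 1] - k * p[k]
        if cost >= eps:
            t = (cum[k - 1] - eps) / k  # exact level within this bracket
            return -np.log2(t)
    # eps is large enough to flatten the whole distribution.
    t = (cum[-1] - eps) / len(p)
    return -np.log2(t)
```

With a single maximal state ($n=1$) the top is lowered at rate 1 in $\varepsilon$ until it hits the second-highest level; after that the ties share the stealing, exactly as described above.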

Unlike $H_0^\varepsilon$, which can be radically different from $H_0$, $H_\infty^\varepsilon$ is continuous in $\varepsilon$ in the following sense: $0<\exp(-H_\infty(X))-\exp(-H_\infty^\varepsilon(X))\le \varepsilon$ for $\varepsilon>0$ (more generally, $\exp(-H_\infty^\varepsilon(X))$ is a decreasing 1-Lipschitz function of $\varepsilon$).
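
A quick numerical check of this monotonicity claim, reusing the `smooth_min_entropy` sketch above (so it is an assumption that that function is in scope; the test distribution is arbitrary):

```python
# Check that 2**(-H_inf^eps) is decreasing and 1-Lipschitz in eps.
p = [0.4, 0.3, 0.2, 0.1]
eps_grid = [i / 100 for i in range(100)]
g = [2 ** (-smooth_min_entropy(p, e)) for e in eps_grid]  # smoothed max prob
steps = [g[i] - g[i + 1] for i in range(len(g) - 1)]
# Each step of 0.01 in eps lowers the max probability by at most 0.01.
assert all(0 <= s <= 0.01 + 1e-12 for s in steps)
```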

Finally, for a fair coin:

\begin{equation} H_\infty^\varepsilon(X)=-\log \frac{1-\varepsilon}{2},\qquad H_0^\varepsilon(X)=\begin{cases}\log 2&\varepsilon<1/2\\0&\varepsilon\ge 1/2\end{cases} \end{equation}

Here $H_\infty^\varepsilon$ steals $\varepsilon/2$ from each state, since both have maximal probability, while $H_0^\varepsilon$ tries to steal all of one minimum-probability state's mass in order to shrink the support; this only succeeds if $\varepsilon\ge 1/2$.
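
These closed forms are easy to verify numerically. Below is a companion sketch for $H_0^\varepsilon$ (again in bits; the greedy deletion of the least probable states is my own implementation of the definition, and it reuses `smooth_min_entropy` from above), plus a check of both fair-coin formulas:

```python
import math

def smooth_max_entropy(p, eps):
    """H_0^eps in bits: log2 of the smallest support size reachable
    by deleting states whose total probability is at most eps."""
    removed, support = 0.0, len(p)
    for q in sorted(p):  # steal from the least probable states first
        if removed + q <= eps:
            removed += q
            support -= 1
        else:
            break
    return math.log2(max(support, 1))

coin = [0.5, 0.5]
for eps in (0.0, 0.25, 0.5, 0.75):
    # H_inf^eps = -log2((1 - eps)/2): eps/2 is stolen from each state.
    assert abs(smooth_min_entropy(coin, eps) - (-math.log2((1 - eps) / 2))) < 1e-9
    # H_0^eps drops from 1 bit to 0 exactly when eps reaches 1/2.
    assert smooth_max_entropy(coin, eps) == (1.0 if eps < 0.5 else 0.0)
```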

Ok, for $\varepsilon>1/2$, that is, once you are allowed to steal more than $1/2$, your fair coin is $\varepsilon$-close to a deterministic distribution with 0 entropy. A little late, but I think I got a good understanding! – 2012-07-04