
We know that $l_i=\log \frac{1}{p_i}$ is the solution to Shannon's source compression problem: $\arg \min_{\{l_i\}} \sum_i p_i l_i$, where the minimization is over all code length assignments $\{l_i\}$ satisfying the Kraft inequality $\sum_i 2^{-l_i}\le 1$.
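
As a sanity check, here is a minimal numerical sketch (assuming Python with NumPy; the distribution `p` is made up for illustration). Any Kraft-tight competitor can be written as $l_i=\log\frac{1}{q_i}$ for some distribution $q$, and its expected length is $H(p)+D(p\|q)\ge H(p)$, so none of them beats $l_i=\log\frac{1}{p_i}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.25, 0.125, 0.125])  # hypothetical source distribution

# Optimal (real-valued) lengths l_i = log2(1/p_i); they satisfy Kraft with equality.
l_opt = np.log2(1.0 / p)
print("expected length at optimum:", np.dot(p, l_opt))  # equals H(p)

# Random Kraft-tight competitors: pick any q on the simplex and set l_i = log2(1/q_i),
# so that sum_i 2^{-l_i} = 1. Their expected length is H(p) + D(p||q) >= H(p).
for _ in range(5):
    q = rng.dirichlet(np.ones_like(p))
    l = np.log2(1.0 / q)
    assert np.dot(p, l) >= np.dot(p, l_opt) - 1e-12
print("no sampled feasible assignment beats l_i = log2(1/p_i)")
```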

Also, $H(p)=\log \frac{1}{p}$ is additive in the following sense: if $E$ and $F$ are independent events with probabilities $p$ and $q$ respectively, then the joint event $E\cap F$ has probability $pq$ and $H(pq)=H(p)+H(q)$.

As far as I know, it is mainly for these two reasons that $H(p)=\log \frac{1}{p}$ is considered a measure of the information contained in a random event $E$ with probability $p>0$.

On the other hand, if we minimize the exponentiated average of the lengths, $\sum_i p_i 2^{t l_i}$ with $t>0$, subject to the same Kraft inequality constraint, the optimal solution is $l_i=\log \frac{1}{p_i'}$, where $p_i'=\frac{p_i^{\alpha}}{\sum_k p_k^{\alpha}}$ and $\alpha=\frac{1}{1+t}$. This is known as Campbell's problem.
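
A sketch of Campbell's solution (same illustrative `p` as above, assuming NumPy): compute the escort distribution $p'$ and check numerically that $l_i=\log\frac{1}{p_i'}$ minimizes the exponentiated average among Kraft-tight length assignments:

```python
import numpy as np

rng = np.random.default_rng(1)
p = np.array([0.5, 0.25, 0.125, 0.125])
t = 1.0                        # exponent in the cost sum_i p_i 2^{t l_i}
alpha = 1.0 / (1.0 + t)

# Escort distribution p'_i = p_i^alpha / sum_k p_k^alpha
p_escort = p**alpha / np.sum(p**alpha)
l_opt = np.log2(1.0 / p_escort)

def cost(l):
    return np.dot(p, 2.0**(t * l))

# Compare against random Kraft-tight competitors l_i = log2(1/q_i).
for _ in range(5):
    q = rng.dirichlet(np.ones_like(p))
    assert cost(np.log2(1.0 / q)) >= cost(l_opt) - 1e-12
print("Campbell-optimal lengths:", l_opt)
```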

Now $H_{\alpha}(p_i)=\log \frac{1}{p_i'}$ is also additive in the sense that $H_{\alpha}(p_i p_j)=H_{\alpha}(p_i)+H_{\alpha}(p_j)$. Moreover, $H_{\alpha}(1)=0$, as in the case of Shannon's measure.
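
A quick numerical check of this additivity (assuming NumPy; the two distributions are illustrative), reading $p_i p_j$ as the probability of a joint outcome of two independent sources:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])   # first source (illustrative)
q = np.array([0.6, 0.3, 0.1])             # second, independent source

def H_alpha(dist, i, alpha):
    """Per-event measure log2(1/p'_i), with p' the escort of `dist`."""
    escort = dist**alpha / np.sum(dist**alpha)
    return np.log2(1.0 / escort[i])

alpha = 0.5
joint = np.outer(p, q).ravel()            # product distribution of (i, j)
i, j = 1, 2
assert np.isclose(H_alpha(joint, i * len(q) + j, alpha),
                  H_alpha(p, i, alpha) + H_alpha(q, j, alpha))
print("additivity holds for the escort-based measure")
```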

Note also that when $\alpha=1$ (i.e., in the limit $t\to 0$), $H_1(p_i)=\log \frac{1}{p_i}$, so we recover Shannon's measure.

My question is: do these reasons suffice to call $H_{\alpha}(p_i)=\log \frac{1}{p_i'}$ a (generalized) measure of information?

I am also not sure whether it makes sense for the measure of information of an event to depend on the probabilities of the other events as well.

  • @Pinocchio: You can try to solve the optimization problem yourself; it's really easy. If not, a good reference is Cover and Thomas's Information Theory book. Feel free to ask if you have any questions. (2015-07-07)

1 Answer


That's exactly the extension known as Rényi entropy, up to a normalization factor of $\frac{1}{1-\alpha}$:

$H_\alpha(X) = \frac{1}{1-\alpha}\log\Bigg(\sum_{i=1}^n p_i^\alpha\Bigg)$
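
For concreteness, a small sketch (assuming Python/NumPy; `p` is illustrative) that computes the Rényi entropy and confirms it tends to the Shannon entropy as $\alpha \to 1$:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (base 2): (1/(1-alpha)) * log2(sum_i p_i^alpha)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):          # alpha -> 1 limit is Shannon entropy
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p**alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
shannon = renyi_entropy(p, 1.0)         # = 1.75 bits for this p
for a in (0.999, 1.001):
    assert abs(renyi_entropy(p, a) - shannon) < 1e-2
print("H_0.5 =", renyi_entropy(p, 0.5), " H_1 =", shannon, " H_2 =", renyi_entropy(p, 2.0))
```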

  • Rényi, Alfréd (1961). "On measures of information and entropy". Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1960. pp. 547–561.
  • //That's exactly the extension known as Rényi entropy, up to a normalization factor of...// Which one do you mean? (2014-11-11)