3

While I was watching a lecture on information theory, I learned that the entropy of an information source is the average amount of information it provides, in bits (or nats, decits, or whatever), which is actually the weighted average of the information contained in all the symbols the source emits (weighted by the probabilities of the individual symbols). I found a striking similarity between this and the concept of the expectation of a random variable, which has a very similar explanation. Am I right about this? I mean, is there any intuitive connection between the two concepts? An explanation of both concepts that takes their similarity into account (if you find any) is also welcome. Thank you.
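For a concrete instance of the weighted average I mean (just an illustrative example): a source with two symbols of probabilities $\tfrac14$ and $\tfrac34$ carries $\log_2 4 = 2$ bits and $\log_2 \tfrac43 \approx 0.415$ bits of information per symbol respectively, so the entropy is $\tfrac14 \cdot 2 + \tfrac34 \cdot 0.415 \approx 0.811$ bits.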

  • 0
    Expectation is a very basic concept in probability theory, and everything is connected to it. (2011-04-03)

1 Answer

5

The entropy is defined using an expectation. If you have a discrete random variable $X$ with probability mass function $P$, then its entropy is $H(X) = \mathbb{E}[-\log P(X)]$. Another connection comes through your description above: for every $n$, one can find a uniquely decodable binary code $C_n$ for $n$-tuples of values of $X$ that minimizes the expected codeword length $L_n$, and the entropy is the limit of $L_n/n$ as $n \to \infty$.
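A minimal sketch of the first identity in Python, assuming a small made-up symbol distribution (the alphabet and probabilities below are illustrative, not from the question):

```python
import math

# Illustrative source alphabet with assumed symbol probabilities.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Entropy as an expectation: H(X) = E[-log2 P(X)] = sum_x P(x) * (-log2 P(x)).
entropy = sum(px * -math.log2(px) for px in p.values())

# Equivalently, the weighted average of each symbol's information content,
# which is the quantity described in the question.
info = {x: -math.log2(px) for x, px in p.items()}
weighted_average = sum(p[x] * info[x] for x in p)

print(entropy)           # 1.75 bits
print(weighted_average)  # same value
```

Both sums are literally the same expression, which is the sense in which entropy is an expectation.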

  • 1
    Optimally coding $n$ copies of $X$ (using a uniquely decodable code) costs (in expectation) about $nH(X)$ bits. Since you can define this cost without mentioning entropy, you get another definition of entropy. (2011-04-04)
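A rough numerical illustration of this comment, assuming an i.i.d. biased-coin source and using Huffman codes as the optimal prefix (hence uniquely decodable) codes; the helper names below (`huffman_expected_length`, `iid_tuple_probs`) are just for this sketch:

```python
import heapq
import itertools
import math

def huffman_expected_length(probs):
    """Expected codeword length (in bits) of an optimal Huffman prefix code,
    given an iterable of outcome probabilities."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    # Identity: the expected length equals the sum of the merged probabilities
    # over all internal nodes of the Huffman tree.
    while len(heap) > 1:
        p1 = heapq.heappop(heap)
        p2 = heapq.heappop(heap)
        total += p1 + p2
        heapq.heappush(heap, p1 + p2)
    return total

def iid_tuple_probs(p, n):
    """Probabilities of all n-tuples of i.i.d. symbols drawn from p."""
    return [math.prod(p[s] for s in tup)
            for tup in itertools.product(p, repeat=n)]

# Assumed example source: a biased coin (illustrative, not from the post).
p = {"H": 0.9, "T": 0.1}
H = sum(px * -math.log2(px) for px in p.values())

for n in (1, 2, 4, 8):
    Ln = huffman_expected_length(iid_tuple_probs(p, n))
    print(n, Ln / n)   # per-symbol cost L_n/n approaches H from above

print("entropy:", H)   # about 0.469 bits
```

For an i.i.d. source the optimal block code satisfies $H(X) \le L_n/n < H(X) + 1/n$, so the printed per-symbol cost converges to the entropy as $n$ grows.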