
Consider a binary message in which $0$ has probability $1/3$ and $1$ has probability $2/3$. What value of $H$ should we assign?

I think you split $1$ into two messages, $1a$ and $1b$, and then use conditional probability. Am I on the right track? So $\left[\text{entropy of} \ (0,1a, 1b) \ \text{message} \right] = \left[ \text{entropy of} \ (0,1) \ \text{message} \right] + \text{something}$?

  • I agree with Emre. There is no reason to split 1 into two messages 1a and 1b (unless you are looking at a problem which explicitly involves such a split, in which case you should specify it properly in your question). – 2011-05-01

1 Answer


Could you define $H$ so others don't have to look it up? Wikipedia gives the Shannon entropy as $H(X)=-\sum_i p(x_i)\log p(x_i).$ You have the $p(x_i)$, so what more do you need? Entropy is additive. For your message, $H = -\tfrac13\log_2\tfrac13 - \tfrac23\log_2\tfrac23 = \log_2 3 - \tfrac23 \approx 0.918$ bits per symbol.
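As a sanity check, here is a minimal Python sketch that applies the entropy formula above directly to the given distribution (the function name `entropy` is just illustrative):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
    Zero-probability outcomes contribute nothing, so they are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Binary source with P(0) = 1/3 and P(1) = 2/3
H = entropy([1/3, 2/3])
print(H)  # about 0.918 bits per symbol
```

Note that the result is less than 1 bit: a biased binary source carries less information per symbol than a fair one, which has exactly $H = 1$ bit.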