
I am trying to prove the conditional entropy of X given Y is greater than or equal to 0.

I am told that the entropy $H(X)$ (according to Boltzmann's H) is equal to $$H(X)=\sum_{i=1}^n -P_i\log_2P_i$$

and that, logically, it is impossible to have a negative entropy, but I am unsure how to proceed with the conditional entropy.

I also believe that the following formula represents the conditional entropy $H(X|Y)$, but I am not 100% sure.

$$\sum_j\sum_i P(x_i,y_j)\,[-\log_2 P(x_i\mid y_j)].$$
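As a quick numeric sanity check of that double sum, here is a short sketch using a small made-up joint distribution (the numbers in `joint` are purely hypothetical, not from the question):

```python
import math

# Hypothetical 2x2 joint distribution P(x_i, y_j); rows index x, columns y.
joint = [[0.25, 0.25],
         [0.40, 0.10]]

p_y = [sum(row[j] for row in joint) for j in range(2)]  # marginal P(y_j)

# H(X|Y) = sum_j sum_i P(x_i, y_j) * (-log2 P(x_i | y_j)),
# with the convention 0 * log2(0) = 0 (skip zero-probability cells).
h_cond = 0.0
for j in range(2):
    for i in range(2):
        p_xy = joint[i][j]
        if p_xy > 0:
            h_cond -= p_xy * math.log2(p_xy / p_y[j])

print(h_cond)  # comes out nonnegative
```

Every term in the sum is a probability times $-\log_2$ of a probability, which is the shape of quantity the question is asking about.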

  • 1
    What is the "probability of Entropy"? – 2017-02-14
  • 0
    Sorry, bad wording; fixed it. – 2017-02-14

1 Answer


Remember that, in this context, $0\cdot\log_2 0=0$.

Now, for all $i$, $P_i\in [0,1]$, so $\log_2 P_i\le 0$, and that means $-P_i \log_2 P_i\ge 0$. As $H$ is a sum of non-negative terms, it must be non-negative too.
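The term-by-term argument is easy to check numerically. A minimal sketch (the example distributions are arbitrary):

```python
import math

def entropy(probs):
    """H = sum_i -p_i * log2(p_i), with the convention 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Each term -p*log2(p) is >= 0 because log2(p) <= 0 for p in (0, 1].
for probs in ([1.0], [0.5, 0.5], [0.9, 0.1], [0.2, 0.3, 0.5]):
    assert entropy(probs) >= 0
```

The degenerate distribution `[1.0]` gives $H = 0$, the minimum, and the uniform `[0.5, 0.5]` gives $H = 1$ bit, the maximum for two outcomes.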

  • 0
    So that makes pretty good sense as to why H(X)≥0, but how can I expand that to prove that H(X|Y)≥0? – 2017-02-14
  • 0
    I'm not sure I recall the definition of $H(X|Y)$ correctly, but I think it's the same idea, as $P(x_i\mid y)$ is also a probability. – 2017-02-14
  • 0
    This is where I'm getting info on H(X|Y) (conditional entropy): https://en.wikipedia.org/wiki/Conditional_entropy – 2017-02-14
  • 0
    $H(X|Y)=\sum_{y} P(y)\, H(X|Y=y)$, so it's a linear combination of nonnegative quantities with nonnegative coefficients. – 2017-02-14
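The decomposition in the last comment can be sketched in a few lines: for each $y$, form the conditional distribution $P(\cdot\mid y)$, take its (nonnegative) entropy, and weight by $P(y)$. The joint table below is hypothetical:

```python
import math

def entropy(probs):
    # Shannon entropy in bits, with 0 * log2(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(x_i, y_j); rows index x, columns y.
joint = [[0.1, 0.3],
         [0.2, 0.4]]

p_y = [sum(row[j] for row in joint) for j in range(2)]  # marginal P(y_j)

# H(X|Y) = sum_y P(y) * H(X | Y = y): a nonnegative combination
# of nonnegative entropies, hence nonnegative.
h_cond = sum(
    p_y[j] * entropy([joint[i][j] / p_y[j] for i in range(2)])
    for j in range(2)
)
assert h_cond >= 0
```

Since each $H(X\mid Y=y)\ge 0$ by the argument in the answer, and each $P(y)\ge 0$, the weighted sum cannot be negative.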