I am trying to prove that the conditional entropy of $X$ given $Y$, $H(X\mid Y)$, is greater than or equal to $0$.
I am told that the entropy $H(X)$ (which has the same form as Boltzmann's $H$) is $$H(X)=-\sum_{i=1}^n P_i\log_2 P_i,$$
and that entropy logically cannot be negative, but I am unsure how to proceed with the conditional case.
I also believe that the following formula gives the conditional entropy in terms of the joint probability $P(x_i, y_j)$, but I am not 100% sure:
$$H(X\mid Y)=\sum_j\sum_i P(x_i, y_j)\left[-\log_2 P(x_i\mid y_j)\right].$$
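As a numerical sanity check, here is a short script that computes this sum for a small made-up joint distribution (the table of joint probabilities is just an example I invented, not from any particular problem). Each term is nonnegative because $0 \le P(x_i\mid y_j)\le 1$ implies $-\log_2 P(x_i\mid y_j)\ge 0$:

```python
import math

# Made-up joint distribution P(x_i, y_j) for illustration;
# rows index x_i, columns index y_j, entries sum to 1.
joint = [
    [0.25, 0.10],
    [0.15, 0.50],
]

# Marginal P(y_j) = sum over i of P(x_i, y_j)
p_y = [sum(row[j] for row in joint) for j in range(len(joint[0]))]

# H(X|Y) = sum_j sum_i P(x_i, y_j) * [-log2 P(x_i | y_j)],
# where P(x_i | y_j) = P(x_i, y_j) / P(y_j).
h_cond = 0.0
for row in joint:
    for j, p_xy in enumerate(row):
        if p_xy > 0:  # terms with zero probability contribute 0
            h_cond += p_xy * -math.log2(p_xy / p_y[j])

print(h_cond)
print(h_cond >= 0)
```

Since every summand is a product of a probability and a nonnegative log term, the total is nonnegative for any joint table, which matches what I expect the proof to show.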