
I was going through binary communication systems when I came across the concept of information gain. I assign probabilities to the transmitted symbols of a binary symmetric channel, i.e., the probability that I transmit a 0 is p(0) and that I transmit a 1 is p(1). The channel I am working with looks like:

[Figure: diagram of the binary symmetric channel]

Since information gain is a concept governing the reduction in uncertainty when going down a branch, I think that we should be able to calculate the information gain from observing the output variable Y if I KNOW WHICH SYMBOL I TRANSMITTED (or GIVEN THAT A SYMBOL HAS BEEN TRANSMITTED), SAY X0. Can anyone explain how I can do this? I am unable to figure it out myself.
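In symbols, what I believe I am asking for (assuming "information gain" here means the usual mutual information between X and Y) is:

$$H(Y \mid X = x_0) = -\sum_{y} p(y \mid x_0)\,\log p(y \mid x_0),$$
$$I(X;Y) = H(Y) - H(Y \mid X) = H(Y) - \sum_{x} p(x)\, H(Y \mid X = x).$$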

  • Do you maybe mean the conditional entropy $H(Y|X=x)$, i.e. the entropy of $Y$ knowing that $X=x$? This would be defined as $H(Y|X=x)=-\sum_y p(y|x) \log p(y|x)$. Or maybe you mean the conditional entropy $H(Y|X)$, i.e. the entropy of $Y$ knowing $X$. This would be $H(Y|X)=-\sum_{x,y}p(x,y)\log p(y|x)$. (A numerical sketch of both quantities follows these comments.) (2017-02-03)
  • Can you please explain exactly why I need to calculate conditional entropy in this case? If you could provide a general explanation relating information gain and entropy for this case, it would be greatly helpful. (2017-02-04)
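As a minimal sketch of the quantities mentioned in the comments, assuming a binary symmetric channel with a hypothetical crossover probability `eps` and input probability `p0 = p(0)` (both values are illustrative, not from the question):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution (0 log 0 treated as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical parameters (assumptions, not from the question):
p0 = 0.6          # p(X = 0), so p(X = 1) = 1 - p0
eps = 0.1         # crossover probability of the binary symmetric channel

p1 = 1 - p0

# Output distribution p(Y): Y = 0 occurs if X = 0 passes through unflipped
# or X = 1 is flipped, and symmetrically for Y = 1.
py0 = p0 * (1 - eps) + p1 * eps
py1 = p1 * (1 - eps) + p0 * eps

# H(Y | X = x0): uncertainty of the output given that X = 0 was sent.
# For a BSC this is the binary entropy of the crossover probability.
h_y_given_x0 = entropy([1 - eps, eps])

# H(Y | X): average of H(Y | X = x) over the input distribution.
h_y_given_x = p0 * entropy([1 - eps, eps]) + p1 * entropy([eps, 1 - eps])

# Information gain about Y from knowing X, i.e. the mutual information I(X; Y).
info_gain = entropy([py0, py1]) - h_y_given_x

print(f"H(Y | X = x0) = {h_y_given_x0:.4f} bits")
print(f"H(Y | X)      = {h_y_given_x:.4f} bits")
print(f"I(X; Y)       = {info_gain:.4f} bits")
```

With `eps = 0` the channel is noiseless, $H(Y|X)=0$, and the gain equals $H(Y)=H(X)$; with `eps = 0.5` the output is independent of the input and the gain drops to zero.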

0 Answers