While reading *Elements of Information Theory* I am stuck on this paragraph on page 35:
I cannot understand why $H(X\mid Y,Z)$ is $0$ (is there any evidence that $Y$ and $Z$ together determine $X$?), and why $P(Z=0)H(X\mid Z=0)$ is also $0$.
Why is the conditional entropy $0$?
probability
self-learning
information-theory
1 Answer
If $Z=X+Y$ and you condition on both $Y$ and $Z$ (roughly speaking, "if you know $Y$ and $Z$"), then there is no uncertainty about $X$ (you "fully know it"): $$ X=Z-Y. $$
For the second one: looking at $H(X\mid Z=0)$, recall that $X,Y\in\{0,1\}$. So if $0=Z=X+Y$, then we must have $X=Y=0$: $X$ is then fully determined by the fact that $Z=0$, and there is no uncertainty there either. (Same thing for $Y$; so $H(X\mid Z=0)=H(Y\mid Z=0)=0$.)
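A quick numerical check can make this concrete. The sketch below assumes (this is not stated explicitly above) that $X$ and $Y$ are independent fair bits, with $Z=X+Y$; it computes $H(X\mid Y,Z)$ and $H(X\mid Z=0)$ directly from the joint pmf and confirms both are $0$:

```python
import math
from collections import defaultdict

# Assumed setup: X, Y independent fair bits, Z = X + Y (ordinary sum, so Z in {0,1,2}).
p = defaultdict(float)
for x in (0, 1):
    for y in (0, 1):
        p[(x, y, x + y)] += 0.25  # each (x, y) pair has probability 1/4

def cond_entropy(p, given_idx):
    """H(X | variables at given_idx) in bits, from a joint pmf over (x, y, z) tuples."""
    # marginal over the conditioning variables
    pg = defaultdict(float)
    for k, v in p.items():
        pg[tuple(k[i] for i in given_idx)] += v
    # H(X|G) = -sum p(x,g) log2 p(x|g)
    h = 0.0
    for k, v in p.items():
        if v > 0:
            g = tuple(k[i] for i in given_idx)
            h -= v * math.log2(v / pg[g])
    return h

h_x_given_yz = cond_entropy(p, given_idx=(1, 2))  # condition on (Y, Z): X = Z - Y is determined

# H(X | Z=0): restrict to the event Z=0 and renormalize; only (0,0,0) survives
pz0 = {k: v for k, v in p.items() if k[2] == 0}
tot = sum(pz0.values())
h_x_given_z0 = -sum((v / tot) * math.log2(v / tot) for v in pz0.values())

print(h_x_given_yz, h_x_given_z0)  # both 0 bits: X is fully determined in each case
```

Replacing `given_idx=(1, 2)` with `given_idx=(2,)` gives $H(X\mid Z)=\tfrac12$ bit here, showing the uncertainty comes entirely from the case $Z=1$, where $X$ could be either $0$ or $1$.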
I'd like to paraphrase your answer to better understand it, and I hope I am right below. For the first one, $X$ is a function of $(Y,Z)$ (namely $X=Z-Y$), so the conditional entropy is $0$; for the second one, $Z=0$ forces $X=Y=0$, so $X$ is fully determined and the conditional entropy is also $0$. – 2017-02-28
For the second, $Z=0$ determines $X$, since $0 \le X \le Z$. – 2017-02-28