I have been reading a bit about conditional entropy, joint entropy, etc., but I came across this: $H(X|Y,Z)$, which seems to denote the entropy of $X$ given $Y$ and $Z$ (although I'm not sure how to describe it). Is it the amount of uncertainty about $X$ given that I know $Y$ and $Z$? In any case, I'd like to know how to calculate it. I thought the expression means the following:
$H(X|Y,Z) = -\sum_{x,y,z} p(x,y,z)\log_{2}p(x|y,z)$
and, assuming that $p(x|y,z)$ means $\displaystyle \frac{p(x,y,z)}{p(y)p(z)}$, I get \begin{align} p(x|y,z)&= \frac{p(x,y,z)}{p(x,y)p(z)}\,\frac{p(x,y)}{p(y)}\\&= \frac{p(x,y,z)}{p(x,y)p(z)}\,p(x|y) \\&= \frac{p(x,y,z)}{p(x,y)p(x,z)}\,\frac{p(x,z)}{p(z)}\,p(x|y)\\&= \frac{p(x,y,z)}{p(x,y)p(x,z)}\,p(x|z)\,p(x|y), \end{align} but that doesn't really help.
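To make concrete what I'm trying to compute, here is a small sketch (in Python/NumPy, assuming the distributions are already available as arrays indexed by $(x,y,z)$; the function name is just mine) of how I would evaluate the sum in the formula above. The part I'm unsure about is precisely how $p(x|y,z)$ should be obtained from the joint distribution:

```python
import numpy as np

def cond_entropy_sum(p_xyz, p_x_given_yz):
    """Evaluate -sum_{x,y,z} p(x,y,z) * log2 p(x|y,z).

    p_xyz        : array of shape (nx, ny, nz) holding the joint p(x,y,z)
    p_x_given_yz : array of the same shape holding p(x|y,z)
                   (this is the piece I don't know how to define from the joint)
    """
    mask = p_xyz > 0  # skip zero-probability terms, using the convention 0*log(0) = 0
    return -np.sum(p_xyz[mask] * np.log2(p_x_given_yz[mask]))
```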
Basically, I'd like to arrive at a nice identity analogous to $H(X|Y)=H(X,Y)-H(Y)$, which holds in the case of two random variables.
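For the two-variable case I can check that identity numerically; below is a minimal sketch (again with NumPy, on a made-up $4\times 3$ alphabet, using $p(x|y)=p(x,y)/p(y)$ as in the manipulations above). What I'm after is the analogous identity, and check, for $H(X|Y,Z)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random joint distribution p(x, y) on a 4 x 3 alphabet.
p_xy = rng.random((4, 3))
p_xy /= p_xy.sum()

p_y = p_xy.sum(axis=0)            # marginal p(y), shape (3,)
p_x_given_y = p_xy / p_y          # p(x|y) = p(x,y) / p(y), broadcast over x

H_xy        = -np.sum(p_xy * np.log2(p_xy))           # H(X,Y)
H_y         = -np.sum(p_y * np.log2(p_y))             # H(Y)
H_x_given_y = -np.sum(p_xy * np.log2(p_x_given_y))    # H(X|Y) via the conditional sum

print(np.isclose(H_x_given_y, H_xy - H_y))            # prints True
```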
Any help?
Thanks