Usually, what I read involves taking a conditional expectation with respect to a single random variable such as $X$ or $Z$, but in the book "The Elements of Statistical Learning" (page 291), given $T=(Z,Z^m)$ where $Z^m$ is the missing data:
we have
$l(Q';Z)=l_0(Q';T)-l_1(Q';Z^m|Z)$
where $l_1$ is based on the conditional density $Pr(Z^m|Z,Q')$.
Taking conditional expectations with respect to the distribution of $T|Z$ governed by parameter $Q$ gives:
$l(Q';Z)=E[l_0(Q';T)|Z,Q]-E[l_1(Q';Z^m|Z)|Z,Q]$
I don't quite get the idea of the distribution of $T|Z$, or how the likelihood function turned into an expectation. Thanks!
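To make the question concrete, here is a small numeric sanity check I wrote (a toy two-component Gaussian mixture of my own, not from the book), with the observed data $Z=x$ and the missing data $Z^m=c$ being the component label. It checks that $l(Q';Z)$ equals the difference of the two conditional expectations, where the averaging over $c$ uses $Pr(c|x,Q)$, i.e. the distribution of $T|Z$ under the "current" parameter $Q$:

```python
import math

# Toy two-component Gaussian mixture with unit variances.
# A parameter is theta = (pi, mu0, mu1): mixing weight of component 1
# and the two component means. (Hypothetical values, for illustration only.)

def norm_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def joint(x, c, theta):
    # p(x, c; theta): the complete-data likelihood l0 is its log
    pi, mu0, mu1 = theta
    return (pi if c == 1 else 1 - pi) * norm_pdf(x, mu1 if c == 1 else mu0)

def marginal(x, theta):
    # p(x; theta): the observed-data likelihood l(theta; Z) is its log
    return joint(x, 0, theta) + joint(x, 1, theta)

def posterior(c, x, theta):
    # p(c | x; theta): the "distribution of T|Z governed by theta"
    return joint(x, c, theta) / marginal(x, theta)

x = 0.7
theta_cur = (0.3, -1.0, 2.0)   # Q  (parameter governing the expectation)
theta_new = (0.6,  0.5, 1.5)   # Q' (parameter at which likelihoods are evaluated)

# Left side: l(Q'; Z)
lhs = math.log(marginal(x, theta_new))

# Right side: E[l0(Q'; T) | Z, Q] - E[l1(Q'; Z^m|Z) | Z, Q],
# the expectation being a weighted sum over c with weights p(c | x; Q)
rhs = sum(posterior(c, x, theta_cur) *
          (math.log(joint(x, c, theta_new)) -
           math.log(posterior(c, x, theta_new)))
          for c in (0, 1))

print(abs(lhs - rhs) < 1e-12)
```

The reason it works (as far as I can tell) is that $l(Q';Z)=l_0(Q';T)-l_1(Q';Z^m|Z)$ holds pointwise for every value of $Z^m$, and the left side does not involve $Z^m$ at all, so averaging both sides over any distribution of $Z^m$ given $Z$ leaves the left side unchanged.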