Your surmise is correct.
Let's say that with probability $1/2$ I pick a biased coin, whose toss gives $Y=1$ with probability $1/3$ and $Y=0$ with probability $2/3$, and with probability $1/2$ I pick an unbiased coin. Let $X$ be $0$ or $1$ according as I pick the biased or the unbiased coin. Then $ \begin{align} E(Y \mid X=0) & = \frac 1 3, \\[4pt] E(Y \mid X=1) & = \frac 1 2, \\[4pt] E(Y \mid X) & = \begin{cases} \frac 1 3 & \text{with probability }1/2, \\[2pt] \frac 1 2 & \text{with probability }1/2. \end{cases} \end{align} $ And similarly for conditional variances.
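One can check these numbers exactly with a few lines of arithmetic; this is just a sketch of the coin example above, and the variable names are illustrative:

```python
from fractions import Fraction as F

p_x = {0: F(1, 2), 1: F(1, 2)}        # P(X=0), P(X=1): which coin was picked
e_given = {0: F(1, 3), 1: F(1, 2)}    # E(Y | X=x): biased coin, then unbiased

# E(Y | X) is a random variable taking the value 1/3 or 1/2, each with
# probability 1/2; averaging it over X gives E(Y).
e_y = sum(p_x[x] * e_given[x] for x in p_x)
print(e_y)  # 5/12
```

So $E(Y) = \frac12\cdot\frac13 + \frac12\cdot\frac12 = \frac{5}{12}$, which is the law of total expectation in action.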
Having done that, one can write things like $ E(E(Y \mid X)) = E(Y) $ (the law of total expectation) and $ E(\operatorname{var}(Y \mid X)) + \operatorname{var}(E(Y \mid X)) = \operatorname{var}(Y) $ (the law of total variance, which splits the variance of $Y$ into an "unexplained" part, $E(\operatorname{var}(Y \mid X))$, and an "explained" part, $\operatorname{var}(E(Y \mid X))$).
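The law of total variance can likewise be verified exactly on the coin example; the sketch below uses fractions rather than floats so the identity holds exactly (names are illustrative):

```python
from fractions import Fraction as F

p_x = {0: F(1, 2), 1: F(1, 2)}      # P(X=x)
p_y1 = {0: F(1, 3), 1: F(1, 2)}     # P(Y=1 | X=x)

# For 0/1-valued Y: E(Y|X=x) = P(Y=1|X=x), var(Y|X=x) = p(1-p) (Bernoulli).
e_cond = p_y1
var_cond = {x: p * (1 - p) for x, p in p_y1.items()}

e_y = sum(p_x[x] * e_cond[x] for x in p_x)                        # 5/12
unexplained = sum(p_x[x] * var_cond[x] for x in p_x)              # E(var(Y|X)) = 17/72
explained = sum(p_x[x] * (e_cond[x] - e_y) ** 2 for x in p_x)     # var(E(Y|X)) = 1/144

var_y = e_y * (1 - e_y)   # Y itself is Bernoulli(5/12), so var(Y) = 35/144
print(unexplained + explained == var_y)  # True
```

Here $\frac{17}{72} + \frac{1}{144} = \frac{35}{144} = \frac{5}{12}\cdot\frac{7}{12}$, as the law of total variance promises.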
In a similar way, one has $ E(\Pr(A \mid X)) = \Pr(A) $ (the law of total probability).