Suppose $\beta$ is a Bernoulli random variable taking values in $\lbrace 0, 1\rbrace$, and $X, Z$ are random variables defined on the same probability space. Is it true or false that
$$ I((1-\beta)Z + \beta X; X) \geq I(Z; X) $$
I don't think that this is true. The conditional mutual information
$$I((1-\beta)Z+\beta X;X|\beta)$$
is at least $I(Z;X)$ when $\beta$ is independent of $(X,Z)$, but the same need not hold for the unconditional mutual information. Consider the following example, where $X$ and $\beta$ are independent Bernoulli-$1/2$ random variables taking values in $\{0,1\}$. Let $Z$ take values in $\{0,1\}$ with $Z=0$ if and only if $X=1$, so that $Z=1-X$. We hence have $I(Z;X)=H(X)=1$ bit. However, $(1-\beta)Z+\beta X$ is independent of $X$, hence $I((1-\beta)Z+\beta X;X)=0$.
The conditional mutual information satisfies the inequality, as mentioned in another answer, but only if $\beta$ is independent of $(X,Z)$:
$$\begin{split}
I(\beta X+(1-\beta)Z;X|\beta)&=P(\beta=0)I(Z;X|\beta=0)+P(\beta=1)H(X|\beta=1)\\
&=P(\beta=0)I(Z;X)+P(\beta=1)H(X)\\
&\geq P(\beta=0)I(Z;X)+P(\beta=1)I(X;Z)=I(X;Z).
\end{split}$$
The second equality needs the independence. Without it the inequality can fail: take $\beta=X$; in this case $I(\beta X+(1-\beta)Z;X|\beta)=0$ while $I(X;Z)$ can be nonzero.
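The failure mode with $\beta=X$ can also be checked numerically. This sketch (helper names are mine) takes $Z=X$ with $X\sim$ Bernoulli-$1/2$, so that $I(X;Z)=1$ bit while the conditional mutual information vanishes:

```python
# Sketch: with beta = X and Z = X, conditioning on beta makes X deterministic,
# so I(Y;X|beta) = 0 even though I(X;Z) = 1 bit.
from collections import Counter
from math import log2

def mi(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = Counter(), Counter()
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def conditional_mi(triples):
    """I(A;B|C) in bits from a dict {(a, b, c): probability}."""
    pc = Counter()
    for (_, _, c), p in triples.items():
        pc[c] += p
    return sum(p_c * mi({(a, b): p / p_c
                         for (a, b, cc), p in triples.items() if cc == c})
               for c, p_c in pc.items())

# X ~ Bernoulli(1/2), beta = X, Z = X, Y = beta*X + (1-beta)*Z.
triples = {}
for x in (0, 1):
    beta = z = x
    y = beta * x + (1 - beta) * z
    triples[(y, x, beta)] = 0.5

i_cond = conditional_mi(triples)           # I(Y;X|beta)
i_xz = mi({(x, x): 0.5 for x in (0, 1)})   # I(X;Z), since Z = X
print(i_cond, i_xz)  # 0.0 1.0
```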
See that if $\beta$ is independent of $(X,Z)$:
$$\begin{split}
I(\beta X+(1-\beta)Z;X)&\leq I(\beta X+(1-\beta)Z;X|\beta)\\
&=P(\beta=0)I(X;Z)+P(\beta=1)H(X)\\
&=I(X;Z)+P(\beta=1)H(X|Z).
\end{split}$$
The first inequality holds because $I(X;\beta)=0$: writing $Y=\beta X+(1-\beta)Z$, we have $I(X;Y|\beta)=I(X;Y,\beta)=I(X;Y)+I(X;\beta|Y)\geq I(X;Y)$.
This means that if $H(X|Z)=0$, i.e., $X$ is a function of $Z$, then the upper bound becomes $I(X;Z)$, and the inequality in the question is reversed: $I(\beta X+(1-\beta)Z;X)\leq I(X;Z)$.
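A concrete instance where the reversal is strict (my own example, not from the answer): take $Z$ uniform on $\{1,2\}$, $X=Z-1$ (so $H(X|Z)=0$), and $\beta\sim$ Bernoulli-$1/2$ independent of $(X,Z)$. The mixture's value $1$ can then come from either $Z=1$ or $X=1$, so information about $X$ is lost:

```python
# Sketch: X = Z - 1 is a function of Z; beta ~ Bernoulli(1/2) independent of (X, Z).
# The overlap of Z's and X's ranges at the value 1 makes I(Y;X) strictly smaller.
from collections import Counter
from itertools import product
from math import log2

def mi(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = Counter(), Counter()
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

joint_yx, joint_zx = Counter(), Counter()
for z, beta in product([1, 2], [0, 1]):  # each combination has probability 1/4
    x = z - 1
    y = (1 - beta) * z + beta * x
    joint_yx[(y, x)] += 0.25
    joint_zx[(z, x)] += 0.25

i_yx, i_zx = mi(joint_yx), mi(joint_zx)
print(i_yx, i_zx)  # 0.5 1.0 : the inequality in the question is strictly reversed
```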
If $X$ and $Z$ are independent, the inequality is satisfied trivially, since then $I(Z;X)=0$.
If $\beta$ is a function of $X$:
$$\begin{split}
I(\beta X+(1-\beta)Z;X)&=I(\beta X+(1-\beta)Z;X,\beta)\geq I(\beta X+(1-\beta)Z;X|\beta)\\
&=P(\beta=0)I(Z;X|\beta=0)+P(\beta=1)H(X|\beta=1)\\
&\geq I(Z;X|\beta).
\end{split}$$
This is a slightly looser lower bound, since in this case $I(X;Z)\geq I(Z;X|\beta)$: with $\beta$ a function of $X$, $I(X;Z)=I(X,\beta;Z)=I(\beta;Z)+I(Z;X|\beta)\geq I(Z;X|\beta)$.
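This case can also be checked on a small example (my own construction): take $X$ uniform on $\{0,1,2\}$, $\beta=\mathbf{1}\{X=2\}$ (a function of $X$), and $Z=X \bmod 2$. The sketch below verifies both the lower bound $I(Y;X)\geq I(Z;X|\beta)$ and the comparison $I(X;Z)\geq I(Z;X|\beta)$:

```python
# Sketch: X uniform on {0,1,2}, beta = 1 if X == 2 else 0 (a function of X),
# Z = X mod 2, Y = beta*X + (1-beta)*Z.
from collections import Counter
from math import log2

def mi(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = Counter(), Counter()
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def conditional_mi(triples):
    """I(A;B|C) in bits from a dict {(a, b, c): probability}."""
    pc = Counter()
    for (_, _, c), p in triples.items():
        pc[c] += p
    return sum(p_c * mi({(a, b): p / p_c
                         for (a, b, cc), p in triples.items() if cc == c})
               for c, p_c in pc.items())

joint_yx, joint_zx, triples = Counter(), Counter(), Counter()
for x in (0, 1, 2):  # X uniform, probability 1/3 each
    beta = 1 if x == 2 else 0
    z = x % 2
    y = beta * x + (1 - beta) * z
    joint_yx[(y, x)] += 1 / 3
    joint_zx[(z, x)] += 1 / 3
    triples[(z, x, beta)] += 1 / 3

i_yx = mi(joint_yx)               # I(Y;X)
i_zx = mi(joint_zx)               # I(Z;X)
i_cond = conditional_mi(triples)  # I(Z;X|beta)
print(i_yx >= i_cond, i_zx >= i_cond)  # True True
```

Here $I(Z;X|\beta)=2/3$ bit (only the $\beta=0$ branch contributes), strictly below $I(X;Z)\approx 0.918$ bit, illustrating why the bound is looser.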