Let $A$ and $B$ denote events such that $P(A|B) = P(B|A)$. Does it follow that
$P(A) = P(B)$? At a cursory glance, an unwary student might argue that
$$P(A|B) = P(B|A) \Rightarrow \frac{P(A\cap B)}{P(B)} = \frac{P(A\cap B)}{P(A)}
\Rightarrow \frac{1}{P(A)} = \frac{1}{P(B)} \Rightarrow P(A) = P(B).$$
But a more wary student would say that if the intersection of $A$ and $B$
is an event of zero probability, then the equality $P(A|B) = P(B|A)$ would hold
with both sides being zero without it being necessarily true that $P(A) = P(B)$.
Thus, if $P(A|B) = P(B|A) > 0$, then $P(A) = P(B)$; but
$P(A|B) = P(B|A) = 0$ need not imply that $P(A) = P(B)$.
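As a concrete sketch of the zero-probability case (my own hypothetical example, not part of the argument above): roll a fair die and take $A = \{1\}$, $B = \{2, 3\}$. Both conditional probabilities are zero, yet $P(A) \neq P(B)$.

```python
from fractions import Fraction

# Fair six-sided die; A and B are disjoint events (hypothetical example).
omega = set(range(1, 7))
A = {1}
B = {2, 3}

def prob(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event & omega), len(omega))

p_A, p_B = prob(A), prob(B)     # 1/6 and 1/3, unequal
p_AB = prob(A & B)              # A and B are disjoint, so this is 0

# P(A|B) = P(A ∩ B)/P(B) and P(B|A) = P(A ∩ B)/P(A) are both 0 here,
# so P(A|B) = P(B|A) holds even though P(A) != P(B).
p_A_given_B = p_AB / p_B
p_B_given_A = p_AB / p_A

print(p_A_given_B, p_B_given_A)  # both 0
print(p_A, p_B)                  # 1/6 and 1/3
```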
Now suppose that $X$ and $Y$ are discrete random variables taking on values
in $\{x_i\}$ and $\{y_j\}$ respectively. If
$$P\{X = x_i\mid Y = y_j\} = P\{Y = y_j\mid X = x_i\} ~~ \text{for all}~ i, j$$
then, since the joint probabilities $P\{X = x_i, Y = y_j\}$ sum to $1$
and hence cannot all be zero, there must be some $i$ and $j$ such that
$P\{X = x_i, Y = y_j\} > 0$. All we can deduce from this is that
$P\{X = x_i\} = P\{Y = y_j\}$ for those particular $i$ and $j$. But your statement
seems to assert that $X$ and $Y$ have identical marginal distributions,
and I cannot immediately see why $x_i$ must necessarily equal $y_j$.
As a trivial example, suppose that
$X$ and $Y$ are Bernoulli random variables such that
$$\begin{align*}
P\{X = 0 \mid Y = 0\} &= P\{Y = 0 \mid X = 0\} = 0\\
P\{X = 0 \mid Y = 1\} &= P\{Y = 1 \mid X = 0\} = 1\\
P\{X = 1 \mid Y = 0\} &= P\{Y = 0 \mid X = 1\} = 1\\
P\{X = 1 \mid Y = 1\} &= P\{Y = 1 \mid X = 1\} = 0
\end{align*}$$
From the two middle equations, we deduce that
$P\{Y=1\} = P\{X=0\}$ and $P\{Y=0\}=P\{X=1\}$, but
it does not follow that $P\{X=1\} = P\{Y=1\}$.
In other words, $X$ and $Y$ could be Bernoulli
random variables with parameters $p$ and $1-p$
respectively where $p \neq \frac{1}{2}$, and thus
have different distributions.
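This counterexample can be checked numerically. The sketch below (my own, with the assumed concrete choice $p = \frac{1}{3}$ and $Y = 1 - X$) verifies that every conditional-probability pair agrees while the marginals differ:

```python
from fractions import Fraction

# Hypothetical instance of the counterexample: X ~ Bernoulli(p), Y = 1 - X.
p = Fraction(1, 3)  # any p != 1/2 works

# Joint distribution of (X, Y): only (0,1) and (1,0) carry positive mass.
joint = {(0, 0): Fraction(0), (0, 1): 1 - p,
         (1, 0): p,           (1, 1): Fraction(0)}

def p_x(x):
    return sum(pr for (xi, _), pr in joint.items() if xi == x)

def p_y(y):
    return sum(pr for (_, yj), pr in joint.items() if yj == y)

def cond_x_given_y(x, y):
    return joint[(x, y)] / p_y(y)

def cond_y_given_x(y, x):
    return joint[(x, y)] / p_x(x)

# P{X = x | Y = y} = P{Y = y | X = x} holds for all four (x, y) pairs ...
for x in (0, 1):
    for y in (0, 1):
        assert cond_x_given_y(x, y) == cond_y_given_x(y, x)

# ... yet the marginals differ: P{X = 1} = p while P{Y = 1} = 1 - p.
print(p_x(1), p_y(1))
```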
Feel free to insert a "$, C$" or "$, Z = z_i$" to the right of $\mid$ everywhere;
the same argument goes through with the extra conditioning.