Let $X$ and $Y$ be random variables and let $k$ be a fixed constant.
For a long time I thought that $P(X+Y=k)=P(X=k-Y)$. But recently I read something like
$$P(X - Y = k) = E_{Y} \Big( P(X-Y = k) \Big) = E_{Y} \Big( P(X = k+Y) \Big) = \sum_{y=0}^{\infty} P(Y=y)P(X = k+y)$$
(presumably assuming $X$ and $Y$ independent and supported on the nonnegative integers), and this makes me wonder whether it is common to treat as random variables probabilities in which a random variable appears on the right-hand side of the equality, like $P(X=k-Y)$.
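For concreteness, here is a minimal numerical sketch of that last identity. It only seems to make sense when $X$ and $Y$ are independent (so that $P(Y=y)P(X=k+y)$ factors) and take values in the nonnegative integers (so that the sum starts at $y=0$); I use independent Poisson variables purely as a hypothetical example:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam_x, lam_y, k = 3.0, 2.0, 1  # hypothetical parameters, chosen for illustration

# Left-hand side: P(X - Y = k), estimated by Monte Carlo.
n = 1_000_000
x = rng.poisson(lam_x, n)
y = rng.poisson(lam_y, n)
lhs = (x - y == k).mean()

# Right-hand side: sum over y of P(Y = y) * P(X = k + y), truncated
# where the Poisson tails are negligible.
ys = np.arange(60)
rhs = np.sum(poisson.pmf(ys, lam_y) * poisson.pmf(k + ys, lam_x))

print(lhs, rhs)  # the two numbers agree up to Monte Carlo error
```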
I have no background in measure-theoretic probability theory, but my naïve understanding suggests an explanation:
$$P(X=k-Y)=P\big(\{\omega:\omega\in X^{-1}(k-Y)\}\big)\neq P\big(\{\omega:\omega\in {(X+Y)}^{-1}(k)\}\big)=P(X+Y=k),$$
where ${}^{-1}$ denotes the preimage of the argument with respect to the function.
In other words, in one case "$P(\cdot)$" is used as the measure of the preimage of $k$ under $X+Y$; in the other it is used as a function of $Y$, and therefore becomes a random variable itself.
The confusion then arises because the equals sign is used informally, in place of the set-theoretic way of writing the probability that a random variable takes some value.
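To illustrate the distinction numerically (same hypothetical independent Poisson setup as above; I also use the standard fact that the sum of independent Poissons is Poisson to get $P(X+Y=k)$ directly):

```python
import numpy as np
from scipy.stats import poisson

lam_x, lam_y, k = 3.0, 2.0, 4  # hypothetical parameters, chosen for illustration

def p(y):
    """The ordinary function y -> P(X = k - y); scipy's pmf is 0 when k - y < 0."""
    return poisson.pmf(k - y, lam_x)

# p(Y) takes a different value for each realization of Y, so it is a
# random variable:
print(p(0), p(1), p(2))

# Its expectation over Y, by contrast, is a single number, and (assuming
# independence) that number is P(X + Y = k):
ys = np.arange(60)
print(np.sum(poisson.pmf(ys, lam_y) * p(ys)))  # E_Y[ P(X = k - Y) ]
print(poisson.pmf(k, lam_x + lam_y))           # P(X + Y = k) directly
```

The last two printed numbers coincide, while the values of $p(Y)$ vary with $Y$, which seems to match the distinction drawn above.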
Right?