I think your question is best understood using two discrete random variables. Suppose you have two random variables $X$ and $Y$ taking values in $\{0, 1, 2, \ldots\}$. Now you are asked to compute the probability of the event $A = \{X > Y\}$.
So,
$$
\begin{eqnarray}
P(A) &=& P(X>Y)\\
\end{eqnarray}
$$
Here both $X$ and $Y$ are random. To compute this probability we need the notion of conditional probability, in the form of the multiplication rule:
$$
P(A \cap B) = P(A|B) \times P(B)
$$
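As a quick sanity check of the multiplication rule, here is a small illustrative example (not from the question itself) using a single fair die, with $B = \{\text{roll} \geq 4\}$ and $A = \{\text{roll is even}\}$:

```python
from fractions import Fraction

# Hypothetical example: one fair six-sided die.
outcomes = range(1, 7)
B = {o for o in outcomes if o >= 4}       # {4, 5, 6}
A = {o for o in outcomes if o % 2 == 0}   # {2, 4, 6}

def p(s):
    """Probability of an event under the uniform distribution on the die."""
    return Fraction(len(s), 6)

p_given = Fraction(len(A & B), len(B))    # P(A|B) = 2/3
# Multiplication rule: P(A ∩ B) = P(A|B) * P(B), i.e. 1/3 = 2/3 * 1/2
assert p(A & B) == p_given * p(B)
```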
Now, coming to the original problem, we first fix the value of one of the random variables, say $Y = y$. Clearly, $y$ can be any value from $0, 1, 2, \ldots$, but $Y$ cannot take these values simultaneously. Now we compute $P(X > y \mid Y = y)$ and $P(Y = y)$.
Hence $P(X > Y,\, Y = y)$ is nothing but $P(X > y \mid Y = y) \times P(Y = y)$. But we have one such event for each possible value of $y$, and these events are mutually exclusive, because the occurrence of any one, say $Y = 1$, prevents the occurrence of the others, i.e. $Y = i$, $i \neq 1$. Therefore, to get the required probability, we sum the probabilities of these mutually exclusive events. Thus finally we get,
$$
P(X > Y) = \sum_{y = 0}^{\infty} \left[P(X > y| Y = y) \times P(Y = y)\right]
$$
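As a concrete (hypothetical) illustration of this sum, suppose $X$ and $Y$ are independent Poisson variables, so that $P(X > y \mid Y = y) = P(X > y)$. The sum can then be evaluated numerically:

```python
import math

def poisson_pmf(lam, n):
    """pmf[k] = P(K = k) for k = 0..n-1, built iteratively to avoid overflow."""
    pmf = [math.exp(-lam)]
    for k in range(1, n):
        pmf.append(pmf[-1] * lam / k)
    return pmf

def prob_x_greater_y(lam_x, lam_y, n=60):
    """P(X > Y) = sum_y P(X > y) P(Y = y), assuming independent Poissons."""
    px, py = poisson_pmf(lam_x, n), poisson_pmf(lam_y, n)
    total = 0.0
    cdf_x = 0.0
    for y in range(n):
        cdf_x += px[y]                    # P(X <= y)
        total += (1.0 - cdf_x) * py[y]    # P(X > y) * P(Y = y)
    return total

p_gt = prob_x_greater_y(2.0, 1.0)
```

A useful check on the computation: $P(X > Y) + P(Y > X) + P(X = Y)$ must equal $1$, since these three events partition the sample space.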
If you are familiar with the basic definition of the expectation of a random variable, then the previous expression is actually an expectation taken over $Y$:
$$
\begin{eqnarray}
P(X > Y) &=& \sum_{y = 0}^{\infty} \left[\text{value} \times \text{corresponding probability}\right]\\
&=& E_Y\left[P(X > Y \mid Y)\right]
\end{eqnarray}
$$
Now, to make this result suitable for continuous random variables, just replace the sum by an integral with respect to $y$ over $0 \leq y < \infty$, and replace $P(Y = y)$ by $f_Y(y)$, the density function of $Y$ at the point $y$:
$$
P(X > Y) = \int_{0}^{\infty} P(X > y \mid Y = y)\, f_Y(y)\, dy
$$
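For a concrete (hypothetical) continuous example, take independent exponentials $X \sim \text{Exp}(\lambda)$ and $Y \sim \text{Exp}(\mu)$. Then $P(X > y \mid Y = y) = e^{-\lambda y}$ and $f_Y(y) = \mu e^{-\mu y}$, and the integral has the closed form $\mu/(\lambda + \mu)$, which we can verify by numerical integration:

```python
import math

def prob_x_greater_y_exp(lam, mu, upper=50.0, steps=200000):
    """P(X > Y) = ∫_0^∞ P(X > y) f_Y(y) dy for independent exponentials,
    approximated by the trapezoid rule on [0, upper]."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        y = i * h
        val = math.exp(-lam * y) * mu * math.exp(-mu * y)
        w = 0.5 if i in (0, steps) else 1.0   # trapezoid endpoint weights
        total += w * val
    return total * h

# Closed form for comparison: mu / (lam + mu)
approx = prob_x_greater_y_exp(1.0, 2.0)
```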