Let $\epsilon_{0}$, $\epsilon_{1}$ and $\epsilon_{2}$ be independent standard normal random variables.
I would like to compute
$$\mathbb{P}\bigl[\,b(p \epsilon_{0} + (1-p)\epsilon_{1}) > \max\{p \epsilon_{0} + (1-p)\epsilon_{2},\; C + a(p \epsilon_{0} + (1-p)\epsilon_{1})\}\,\bigr]$$
for $C, b > 0$, $p \in (0,1)$, and $a \in (0,1)$.
How do I write this probability down in integral form so that I can plug it into a numerical solver? I'm having trouble because $\epsilon_{0}$ appears in every term, so the two comparisons inside the max are not independent.
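For reference, here is the brute-force Monte Carlo sketch I would use to sanity-check whatever integral form comes out; the parameter values `C, a, b, p` below are arbitrary placeholders, not part of the problem statement.

```python
import numpy as np

# Placeholder parameters (arbitrary values just for testing)
C, a, b, p = 1.0, 0.5, 1.5, 0.3

rng = np.random.default_rng(0)
n = 10_000_000

# Three independent standard normals
eps0 = rng.standard_normal(n)
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

# Mixture terms: u is shared by the left-hand side and the second max argument
u = p * eps0 + (1 - p) * eps1
v = p * eps0 + (1 - p) * eps2

# Monte Carlo estimate of P[ b*u > max(v, C + a*u) ]
prob = np.mean(b * u > np.maximum(v, C + a * u))
print(prob)
```

I would like an integral expression whose numerical evaluation matches this estimate.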