Note: this is based on this post at physics stack exchange, but I thought it would be good to cross-post the final question here as well since it's pretty mathematical in nature.
Consider the following inequality:
$$ (z_2 \cos \theta - z_1)^2 + (z_2 \sin \theta)^2 - a^2 \geq 0 $$
where $z_1$, $z_2$, and $\theta$ are independent random variables and $a$ is a positive real number. Both $z_1$ and $z_2$ follow a zero-mean Gaussian distribution,
$$P(z) = \frac{1}{\sqrt{2\sigma^2 \pi}}\exp{\left(-\frac{z^2}{2\sigma^2}\right)} $$
and the value of $\sigma$ is known. $\theta$ has a uniform distribution given by
$$P_{\text{phase}}(\theta) = \frac{1}{2\pi}, \quad \theta\in[0,2\pi]$$
What is the probability that the initial inequality is true?
I feel like there should be some way to evaluate this by integrating $z_1$ and $z_2$ from $-\infty$ to $+\infty$ and $\theta$ from $0$ to $2\pi$, but I can't see how to express the probability that all three variables jointly satisfy the inequality.
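To make the setup concrete, the probability in question can be restated (this is just the definition, not a solution) as a triple integral over the joint density with an indicator function enforcing the inequality:

```latex
P = \int_{0}^{2\pi}\!\frac{d\theta}{2\pi}
    \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty}
    P(z_1)\,P(z_2)\,
    \mathbf{1}\!\left[(z_2\cos\theta - z_1)^2 + (z_2\sin\theta)^2 \geq a^2\right]
    \,dz_1\,dz_2
```

Note that expanding the squares gives $(z_2\cos\theta - z_1)^2 + (z_2\sin\theta)^2 = z_1^2 + z_2^2 - 2 z_1 z_2 \cos\theta$, which may make the region of integration easier to work with.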
I could of course evaluate this numerically with a brute-force Monte Carlo approach, but I feel like there should be some kind of closed-form expression, or at least a lower-dimensional integral, even if that integral ultimately has to be evaluated numerically.
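For reference, here is a minimal sketch of the brute-force Monte Carlo estimate mentioned above (the function name `mc_probability` and the use of NumPy are my own choices, not from the original post):

```python
import numpy as np

def mc_probability(a, sigma, n=1_000_000, seed=0):
    """Monte Carlo estimate of
    P[(z2*cos(theta) - z1)^2 + (z2*sin(theta))^2 >= a^2],
    with z1, z2 ~ N(0, sigma^2) i.i.d. and theta ~ Uniform[0, 2*pi].
    """
    rng = np.random.default_rng(seed)
    z1 = rng.normal(0.0, sigma, n)
    z2 = rng.normal(0.0, sigma, n)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    # Left-hand side; algebraically equal to z1^2 + z2^2 - 2*z1*z2*cos(theta)
    lhs = (z2 * np.cos(theta) - z1) ** 2 + (z2 * np.sin(theta)) ** 2
    # Fraction of samples satisfying the inequality
    return float(np.mean(lhs >= a ** 2))
```

As a sanity check, `a = 0` should give probability exactly 1, since the left-hand side is a sum of squares and hence nonnegative.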