
Suppose I have:

$V_1 = X_1+A$ where $A>0$ is some constant, and $V_2 = X_2+B$ where $B>0$ is some constant.

Furthermore assume that $X_1$ and $X_2$ are independent and distributed with a standard normal distribution $N(0,1)$.

I want to get $P(V_1+V_2 > 1\ \&\ V_1,V_2 \in [0,1] ) $

Here's what I'm thinking:

$V_1+V_2 > 1 \implies X_1+X_2 > 1-A-B \implies X_1 > 1-A-B-X_2$

$V_2 \in [0,1] \implies -A \le X_2 \le 1-A$

So the answer would be a double integral: one from $1-A-B-X_2$ to $\infty$ for $X_1$, and the other from $-A$ to $1-A$ for $X_2$. But I'm still not sure what I'm integrating over, e.g. the pdf of the sum?

1 Answer


Not quite.

You want $0 \le V_1 \le 1$ but also $1-V_2 \lt V_1$.

Since you have $0 \le V_2 \le 1$, you can turn your constraints on $V_1$ into $1-V_2 \lt V_1 \le 1$.

Since $V_1 = X_1+a$ and $V_2 = X_2+b$ (it is conventional to use lower case for constants), these become
$$-b \le X_2 \le 1-b, \qquad 1-a-b-x_2 \lt x_1 \le 1-a,$$
and so your integral becomes
$$\int_{x_2=-b}^{1-b} \int_{x_1=1-a-b-x_2}^{1-a} \phi(x_1)\phi(x_2) \, dx_1 \, dx_2$$
where $\phi(x)$ is the probability density function of a standard normal distribution.
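A quick numerical sanity check of this integral is possible with only the Python standard library: evaluate the inner $x_1$-integral in closed form as $\Phi(1-a)-\Phi(1-a-b-x_2)$, integrate the remaining one-dimensional integrand with the trapezoidal rule, and compare against a Monte Carlo estimate. The values $a=0.3$, $b=0.4$ below are arbitrary illustrative choices, not from the question.

```python
import math
import random

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def prob_integral(a, b, n=10_000):
    # P(V1+V2 > 1, V1 in [0,1], V2 in [0,1]) as the single integral
    #   int_{-b}^{1-b} (Phi(1-a) - Phi(1-a-b-x2)) phi(x2) dx2
    # evaluated with the trapezoidal rule on n subintervals
    lo, hi = -b, 1.0 - b
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x2 = lo + i * h
        f = (Phi(1.0 - a) - Phi(1.0 - a - b - x2)) * phi(x2)
        total += f if 0 < i < n else 0.5 * f
    return total * h

def prob_monte_carlo(a, b, n=200_000, seed=0):
    # direct simulation of V1 = X1 + a, V2 = X2 + b with X1, X2 ~ N(0,1)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        v1 = rng.gauss(0, 1) + a
        v2 = rng.gauss(0, 1) + b
        if 0.0 <= v1 <= 1.0 and 0.0 <= v2 <= 1.0 and v1 + v2 > 1.0:
            hits += 1
    return hits / n

a, b = 0.3, 0.4  # illustrative constants only
print(prob_integral(a, b), prob_monte_carlo(a, b))
```

The two estimates should agree to roughly the Monte Carlo standard error, which is on the order of $10^{-3}$ for this sample size.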

Comment: You should have got $\int_{-b}^{1-b} (\Phi(1-a) - \Phi(1-a-b-x_2)) \phi(x_2)\,dx_2$, or equivalently $\Phi(1-a)(\Phi(1-b)-\Phi(-b)) - \int_{-b}^{1-b} \Phi(1-a-b-x_2) \phi(x_2)\,dx_2$. Perhaps you did. I doubt there is a further simplification. (2012-04-22)
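Since the integrand is linear in $\Phi(1-a)$, the constant term can be pulled out of the integral: $\int_{-b}^{1-b}\Phi(1-a)\phi(x_2)\,dx_2 = \Phi(1-a)(\Phi(1-b)-\Phi(-b))$, leaving the remaining integral with a minus sign. The sketch below checks this rearrangement numerically with a simple trapezoidal rule; the values $a=0.3$, $b=0.4$ are arbitrary illustrative choices.

```python
import math

def Phi(x):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def trapz(f, lo, hi, n=20_000):
    # trapezoidal rule on n subintervals
    h = (hi - lo) / n
    s = 0.5 * (f(lo) + f(hi)) + sum(f(lo + i * h) for i in range(1, n))
    return s * h

a, b = 0.3, 0.4  # illustrative constants only

# form 1: single integral, inner x1-integral already done in closed form
form1 = trapz(lambda x2: (Phi(1 - a) - Phi(1 - a - b - x2)) * phi(x2), -b, 1 - b)

# form 2: constant Phi(1-a) term pulled out, remaining integral subtracted
form2 = Phi(1 - a) * (Phi(1 - b) - Phi(-b)) - trapz(
    lambda x2: Phi(1 - a - b - x2) * phi(x2), -b, 1 - b)

print(form1, form2)
```

Both forms should agree up to quadrature error, which is far below $10^{-6}$ at this resolution.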