Suppose I have:
$V_1 = X_1 + A$, where $A > 0$ is some constant, and $V_2 = X_2 + B$, where $B > 0$ is some constant.
Furthermore, assume that $X_1$ and $X_2$ are independent and each follows a standard normal distribution, $N(0,1)$.
I want to get $P(V_1+V_2 > 1\ \&\ V_1,V_2 \in [0,1] ) $
Here's what I'm thinking:
$V_1+V_2 > 1 \implies X_1+X_2 > 1-A-B \implies X_1 > 1-A-B-X_2$, and $V_2 \in [0,1] \implies -B \le X_2 \le 1-B$.
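Putting these together (and, if I haven't mixed anything up, the analogous constraint $-A \le X_1 \le 1-A$ from $V_1 \in [0,1]$), I think the whole event written in terms of $X_1$ and $X_2$ is
$$
P\left(X_1 + X_2 > 1 - A - B,\;\; -A \le X_1 \le 1 - A,\;\; -B \le X_2 \le 1 - B\right).
$$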
So the answer would be a double integral: the inner one over $X_1$ from $1-A-B-X_2$ to $\infty$, and the outer one over $X_2$ from $-B$ to $1-B$. But I'm still not sure what I'm integrating over, e.g. the pdf of the sum?
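For what it's worth, here is the quick Monte Carlo check I was planning to compare any closed-form answer against. The values $A = 0.3$ and $B = 0.4$ are arbitrary placeholders, since $A$ and $B$ are only assumed to be positive constants:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder constants: A and B are not pinned down in the question,
# these values are only for checking a candidate answer numerically.
A, B = 0.3, 0.4

n = 1_000_000
X1 = rng.standard_normal(n)
X2 = rng.standard_normal(n)
V1 = X1 + A
V2 = X2 + B

# Event of interest: V1 + V2 > 1 and both V1, V2 in [0, 1]
event = (V1 + V2 > 1) & (V1 >= 0) & (V1 <= 1) & (V2 >= 0) & (V2 <= 1)
print(event.mean())  # Monte Carlo estimate of the target probability
```

I figure I can compare whatever double integral I end up setting up against this estimate for the same $A$ and $B$.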