If $X_1, \ldots , X_n$ are independent RVs from Gamma$(1,\beta)$ and $S = \sum_{i=1}^n X_i$, find $P(X_1 > 1 \mid S = s)$.
Attempt: What I know so far is that $S\sim$ Gamma$(n,\beta)$, which is continuous. By definition, $$P(X_1 > 1 \mid S = s) = \frac{P(X_1 > 1, S=s)}{P(S=s)},$$ but since $S$ is a continuous random variable, $P(S=s) = 0$ and I can't compute it this way. So I figured that if I can find the distribution of $X_1 \mid S=s$, then I can compute the probability.
Letting $S = X_1 + Y$, where $Y = \sum_{i=2}^n X_i \sim$ Gamma$(n-1,\beta)$, I can write $$f_{X_1|S}(x_1|s) = \frac{f_{X_1,S}(x_1,s)}{f_S(s)} = \frac{f_{X_1,Y}(x_1, s-x_1)}{f_S(s)}.$$ Since $X_1, \ldots , X_n$ are independent, $X_1$ and $Y = \sum_{i=2}^nX_i$ are independent, so for $0 < x_1 < s$, $$f_{X_1,Y}(x_1, s-x_1) = f_{X_1}(x_1) f_Y(s-x_1)= \frac{1}{\beta}\exp\left(-\frac{x_1}{\beta} \right)\times \frac{1}{\Gamma(n-1)\beta^{n-1}}(s-x_1)^{n-2}\exp\left( -\frac{s-x_1}{\beta}\right)$$ $$=\frac{1}{\Gamma(n-1)\beta^n}(s-x_1)^{n-2}\exp\left(-\frac{s}{\beta} \right).$$
Dividing by the density of $S$ gives
$$\frac{1}{\Gamma(n-1)\beta^n}(s-x_1)^{n-2}\exp\left(-\frac{s}{\beta} \right)\times \Gamma(n)\beta^n s^{1-n} \exp\left(\frac{s}{\beta}\right)$$
which simplifies to
$$(n-1)\frac{(s-x_1)^{n-2}}{s^{n-1}}.$$
But this doesn't seem to reduce to a distribution that I can identify.
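As a quick numerical sanity check on the algebra (a sketch assuming NumPy; the values of $n$ and $s$ are arbitrary illustrative choices), the derived density does integrate to 1 over its support $0 < x_1 < s$:

```python
import numpy as np

# Sanity check: the derived conditional density
#   f(x1 | s) = (n-1) * (s - x1)^(n-2) / s^(n-1),   0 < x1 < s,
# should integrate to 1. (n and s are arbitrary illustrative values.)
n, s = 5, 3.0
x = np.linspace(0.0, s, 200_001)
f = (n - 1) * (s - x) ** (n - 2) / s ** (n - 1)

# Trapezoid rule by hand, to avoid version-specific NumPy helpers.
dx = x[1] - x[0]
total = ((f[:-1] + f[1:]) / 2).sum() * dx
print(total)  # ≈ 1.0
```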
Next I thought to use the fact that $\frac{X_1}{S}$ and $S$ are independent. I think I can do the following $$P(X_1 > 1 | S=s) = P\left(\frac{X_1}{S} > \frac{1}{S} | S = s\right) = P\left(\frac{X_1}{S} > \frac{1}{S}\right) = P(X_1 > 1) = 1 - P(X_1< 1).$$
Then this equals $\exp\left(-1/\beta \right)$, since $X_1 \sim$ Gamma$(1,\beta)$ is exponential with mean $\beta$. So
- Is this a valid way to compute the conditional probability?
- Does this imply that $X_1$ and $S$ are independent?
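To probe the first question empirically, here is a rough Monte Carlo sketch (my own addition, assuming NumPy; the parameter values and the window half-width `eps` are arbitrary choices). It approximates conditioning on $\{S = s\}$ by keeping only the samples whose sum lands in a narrow window around $s$, then prints the simulated estimate of $P(X_1 > 1 \mid S \approx s)$ next to $\exp(-1/\beta)$ and to the value obtained by integrating the derived density above from $1$ to $s$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, s = 5, 2.0, 5.0
eps = 0.1               # half-width of the window approximating {S = s}
trials = 1_000_000

# n independent Gamma(1, beta) draws per trial, i.e. Exponential(mean beta).
x = rng.exponential(scale=beta, size=(trials, n))
total = x.sum(axis=1)

# Keep only the trials whose sum lands near s.
near = np.abs(total - s) < eps
p_sim = (x[near, 0] > 1).mean()

# Value claimed by the X1/S-independence argument: P(X1 > 1).
p_indep = np.exp(-1 / beta)

# Value from integrating the derived density (n-1)(s-x)^(n-2)/s^(n-1)
# over 1 < x < s; the antiderivative gives ((s-1)/s)^(n-1).
p_density = ((s - 1) / s) ** (n - 1)

print(p_sim, p_indep, p_density)
```

Comparing the three printed numbers should show which of the two candidate answers the simulation supports.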