$ \begin{align} 1 & = E(X+Y+Z\mid X+Y+Z=1) \\ \\ & = E(X\mid X+Y+Z=1) + E(Y\mid X+Y+Z=1) + E(Z\mid X+Y+Z=1). \end{align} $ If $X,Y,Z$ are independent and identically distributed, you've got symmetry that justifies the conclusion that the three terms are equal. You can get by with much weaker hypotheses than independence, as long as they justify that symmetry; exchangeability of $X,Y,Z$ is enough.
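Spelling out that last step: the three equal terms sum to $1$, so $ E(X\mid X+Y+Z=1) = E(Y\mid X+Y+Z=1) = E(Z\mid X+Y+Z=1) = \frac13. $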
For the variance I'll just assume $X,Y,Z$ are independent standard normal and leave weaker hypotheses for another occasion. We have this bivariate normal distribution: $ \begin{bmatrix} X \\ X+Y+Z \end{bmatrix} \sim N\left( \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 & 1 \\ 1 & 3 \end{bmatrix} \right), $ and hence $\operatorname{cor}(X,X+Y+Z)=1/\sqrt{3}$.
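In case the entries of that covariance matrix aren't obvious, they come straight from independence and the unit variances: $ \operatorname{cov}(X,\,X+Y+Z) = \operatorname{var}(X) + \operatorname{cov}(X,Y) + \operatorname{cov}(X,Z) = 1 + 0 + 0 = 1, \qquad \operatorname{var}(X+Y+Z) = 1 + 1 + 1 = 3, $ so $ \operatorname{cor}(X,\,X+Y+Z) = \frac{1}{\sqrt1\cdot\sqrt3} = \frac{1}{\sqrt3}. $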
So how do we think about the bivariate normal distribution $ \begin{bmatrix} U \\ V \end{bmatrix} \sim N\left( \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \begin{bmatrix} \sigma^2 & \rho\sigma\tau \\ \rho\sigma\tau & \tau^2 \end{bmatrix} \right) $ where $\sigma,\tau$ are standard deviations and $\rho$ is the correlation?
Conditional on $V$ being a certain number of standard deviations above its mean, $U$ is expected to be $\rho$ times that many standard deviations above its own mean. Thus $ E(U\mid V) = \rho\sigma \frac V\tau = \frac{\rho\sigma\tau}{\tau^2} V. $ By the law of total variance we have $ \begin{align} \sigma^2 = \operatorname{var}(U) & = E(\operatorname{var}(U \mid V)) + \operatorname{var}(E(U\mid V)) \tag{law of total variance} \\ \\ & = E(\operatorname{var}(U \mid V)) + \operatorname{var}\left( \frac{\rho\sigma\tau}{\tau^2} V \right) = E(\operatorname{var}(U \mid V)) + \rho^2\sigma^2. \end{align} $ If you accept "homoscedasticity", i.e. that the conditional variance of $U$ given $V$ does not depend on $V$ (which does in fact hold when $U,V$ are jointly normal), then the conditional variance equals its own expected value, so $\operatorname{var}(U\mid V) = \sigma^2 - \rho^2\sigma^2 = (1-\rho^2)\sigma^2$.
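If you want a quick numerical check of those two formulas, a simulation along the following lines should do it. This is purely illustrative, not part of the argument; the parameter values, the seed, and the NumPy construction of $(U,V)$ are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, tau, rho = 2.0, 3.0, 0.6      # arbitrary illustrative parameters
n = 1_000_000

# Construct (U, V) with var(U) = sigma^2, var(V) = tau^2, cor(U, V) = rho.
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
v = tau * z1
u = sigma * (rho * z1 + np.sqrt(1 - rho**2) * z2)

# E(U | V) = (rho * sigma / tau) V : compare the regression slope of U on V.
slope = np.cov(u, v)[0, 1] / np.var(v)
print(slope, rho * sigma / tau)

# var(U | V) = (1 - rho^2) sigma^2 : compare the variance of the residuals.
residuals = u - slope * v
print(np.var(residuals), (1 - rho**2) * sigma**2)
```

With a million draws, each printed pair should agree to a couple of decimal places.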
Applied to the particular distribution we're considering, this says $ \operatorname{var}(X\mid X+Y+Z) = \left(1 - \frac13\right)\cdot 1 = \frac23. $
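And the same kind of sanity check works for the original problem. Again this is just an illustrative sketch: the slice width `eps` is an arbitrary tolerance used to approximate conditioning on the event $X+Y+Z=1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
x, y, z = rng.standard_normal((3, n))    # i.i.d. standard normals
s = x + y + z

# Approximate conditioning on {X + Y + Z = 1} by a thin slice around 1.
eps = 0.05
slice_x = x[np.abs(s - 1) < eps]

print(slice_x.mean())   # should be close to 1/3
print(slice_x.var())    # should be close to 2/3
```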