Does anyone know how to start this question?
Let random vectors $x,u,v$ have joint Gaussian distribution, and $u,v$ be independent. Show that $E(x|u,v)=E(x|u)+E(x|v)-E(x)$.
Write $x=au+bv+w$, where $a=\operatorname{cov}(x,u)/\operatorname{var}(u)$, $b=\operatorname{cov}(x,v)/\operatorname{var}(v)$, and $w=x-au-bv$. Then $w$ is uncorrelated with both $u$ and $v$, and since $(u,v,w)$ is jointly Gaussian, $w$ is independent of them.
Thus $y=E(x|u,v)$ is $y=au+bv+E(w)$. Since $u$ and $v$ are independent, \begin{align} & E(x|u) = E(y|u) = au+bE(v)+E(w), \\ & E(x|v) = E(y|v) = aE(u)+bv+E(w) \\ & E(x) = E(y) = aE(u)+bE(v)+E(w). \end{align} This yields \begin{align} E(x|u)+E(x|v) & = au+aE(u)+bv+bE(v)+2E(w) \\ & = y+aE(u)+bE(v)+E(w) \end{align} hence $ E(x|u)+E(x|v)=E(x|u,v)+E(x). $
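The identity can be sanity-checked numerically from the decomposition above. In this sketch the coefficients $a,b$ and the means of $u$, $v$, $w$ are arbitrary illustrative choices, not values from the problem:

```python
import numpy as np

# Numeric check of E(x|u,v) = E(x|u) + E(x|v) - E(x), assuming the
# scalar decomposition x = a*u + b*v + w with u, v, w independent.
# All numbers below are arbitrary illustrative assumptions.
a, b = 2.0, -3.0
mu_u, mu_v, mu_w = 1.0, 2.0, 0.5   # E(u), E(v), E(w)

def E_x_given_uv(u, v):
    # Conditioning on both u and v leaves only E(w) unknown.
    return a * u + b * v + mu_w

def E_x_given_u(u):
    # v and w are independent of u, so they are replaced by their means.
    return a * u + b * mu_v + mu_w

def E_x_given_v(v):
    return a * mu_u + b * v + mu_w

E_x = a * mu_u + b * mu_v + mu_w

# The identity should hold pointwise at every observed (u, v).
for u in np.linspace(-2, 2, 5):
    for v in np.linspace(-2, 2, 5):
        lhs = E_x_given_uv(u, v)
        rhs = E_x_given_u(u) + E_x_given_v(v) - E_x
        assert abs(lhs - rhs) < 1e-12
```

Because both sides are affine in $(u,v)$ with identical coefficients, agreement on a grid of points is exactly what the algebra above predicts.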
How far $u$ is above its expected value $\mu_u$, measured in standard deviations, is $\dfrac{u-\mu_u}{\sigma_u}$ where $\sigma_u$ is the standard deviation of $u$.
How many $x$-standard deviations one should "expect" $x$ to be above its own expected value, given the observed value of $u$, is that expression multiplied by the correlation between $x$ and $u$; thus it is $\rho_{x,u}\left(\dfrac{u-\mu_u}{\sigma_u}\right)$. So $ E(x\mid u) = \mu_x + \sigma_x \rho_{x,u}\left(\dfrac{u-\mu_u}{\sigma_u}\right) $ where $\mu_x$ and $\sigma_x$ are the expected value and standard deviation of $x$ respectively. And of course a similar result applies to $v$.
So $ \begin{align} & {} \quad E(x\mid u) + E(x\mid v) - E(x) \\ \\ & = \mu_x + \sigma_x \rho_{x,u}\left(\frac{u-\mu_u}{\sigma_u}\right) + \mu_x + \sigma_x \rho_{x,v}\left(\frac{v-\mu_v}{\sigma_v}\right) - \mu_x \\ \\ & = \mu_x + \sigma_x \rho_{x,u}\left(\frac{u-\mu_u}{\sigma_u}\right) + \sigma_x \rho_{x,v}\left(\frac{v-\mu_v}{\sigma_v}\right) \\ \\ & = \mu_x + \frac{\rho_{x,u}\sigma_x\sigma_u}{\sigma_u^2}\left(u-\mu_u\right) + \frac{\rho_{x,v}\sigma_x\sigma_v}{\sigma_v^2}\left(v-\mu_v\right) \\ \\ & = \mu_x + \frac{\operatorname{cov}(x,u)}{\sigma_u^2}\left(u-\mu_u\right) + \frac{\operatorname{cov}(x,v)}{\sigma_v^2}\left(v-\mu_v\right) \end{align} $
The whole matrix of covariances, i.e. the variance of the random vector $(x,u,v)^\top$, is $ \begin{bmatrix} \sigma_x^2, & \rho_{x,u}\sigma_x\sigma_u, & \rho_{x,v}\sigma_x\sigma_v \\ \rho_{x,u}\sigma_x\sigma_u, & \sigma_u^2, & 0 \\ \rho_{x,v}\sigma_x\sigma_v, & 0,& \sigma_v^2 \end{bmatrix}. $ Via the usual formulas for conditional expectations in a multivariate normal distribution, we have $ E\left(x \mid \begin{bmatrix} u \\ v \end{bmatrix}\right) = \mu_x + \begin{bmatrix} \operatorname{cov}(x,u), & \operatorname{cov}(x,v) \end{bmatrix} \begin{bmatrix} \sigma_u^2 & 0 \\ 0 & \sigma_v^2 \end{bmatrix}^{-1} \begin{bmatrix} u-\mu_u \\ v-\mu_v \end{bmatrix}. $
Since the matrix is diagonal, the inversion is simple: just invert the diagonal elements. Multiply the matrices, and you have the desired result.
If $u$ and $v$ had not been independent, this would have been more complicated.
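The diagonal-inverse computation can be checked numerically. The means, standard deviations, correlations, and observed point below are arbitrary illustrative assumptions:

```python
import numpy as np

# Check that the matrix formula with a diagonal var([u, v]) reproduces
# the two-term sum derived above. All numbers are arbitrary assumptions.
mu_x, mu_u, mu_v = 0.7, 1.0, -0.5
s_x, s_u, s_v = 1.5, 2.0, 0.8          # standard deviations
r_xu, r_xv = 0.3, -0.6                 # correlations of x with u and v

cov_xu = r_xu * s_x * s_u
cov_xv = r_xv * s_x * s_v

row = np.array([cov_xu, cov_xv])       # [cov(x,u), cov(x,v)]
Sigma_uv = np.diag([s_u**2, s_v**2])   # diagonal, since u and v are independent

u, v = 1.8, 0.3                        # an observed point
centered = np.array([u - mu_u, v - mu_v])

matrix_form = mu_x + row @ np.linalg.inv(Sigma_uv) @ centered
sum_form = (mu_x + cov_xu / s_u**2 * (u - mu_u)
                 + cov_xv / s_v**2 * (v - mu_v))
assert np.isclose(matrix_form, sum_form)
```

Inverting the diagonal matrix just inverts each diagonal entry, which is why the matrix product collapses to the two-term sum.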
Later edit: I see I treated these three things as scalar-valued. If they are vectors, not necessarily all with the same number of components, then where I wrote $\dfrac{\operatorname{cov}(x,u)}{\sigma_u^2}$ we need the matrix product $\operatorname{cov}(x,u)(\sigma_u^2)^{-1}$, in that order, where $\sigma_u^2 = \operatorname{var}(u)$ is now a covariance matrix. Otherwise it's the same.
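The vector-valued version can be checked the same way. In this sketch $x \in \mathbb{R}^2$, $u \in \mathbb{R}^3$, $v \in \mathbb{R}^2$; all covariance blocks are randomly generated illustrative assumptions, with $\operatorname{var}(u)$ and $\operatorname{var}(v)$ made positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)

# Vector case: compare the joint block formula against the sum of the
# two single-conditioning corrections. All blocks are arbitrary assumptions.
mu_x = np.array([1.0, -1.0])
mu_u = np.array([0.0, 0.5, 2.0])
mu_v = np.array([1.0, 1.0])

C_xu = rng.normal(size=(2, 3))                         # cov(x, u)
C_xv = rng.normal(size=(2, 2))                         # cov(x, v)
A = rng.normal(size=(3, 3)); S_u = A @ A.T + np.eye(3) # var(u), positive definite
B = rng.normal(size=(2, 2)); S_v = B @ B.T + np.eye(2) # var(v), positive definite

u = np.array([0.3, -0.2, 1.5])                         # an observed point
v = np.array([2.0, 0.1])

# Joint formula: var([u; v]) is block-diagonal because u and v are independent.
C_joint = np.hstack([C_xu, C_xv])                      # cov(x, [u; v])
S_joint = np.block([[S_u, np.zeros((3, 2))],
                    [np.zeros((2, 3)), S_v]])
joint = mu_x + C_joint @ np.linalg.inv(S_joint) @ np.concatenate([u - mu_u, v - mu_v])

# Sum of the two single-conditioning corrections:
summed = (mu_x + C_xu @ np.linalg.inv(S_u) @ (u - mu_u)
               + C_xv @ np.linalg.inv(S_v) @ (v - mu_v))
assert np.allclose(joint, summed)
```

The two agree because the inverse of a block-diagonal matrix is the block-diagonal matrix of the inverses, so the joint correction splits into the $u$-term plus the $v$-term.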