Assume that $V_i$, $V_j$ and $D$ are real-valued, possibly dependent random variables, and let the matrix $H$ be defined by
$ H_{ij} = \mathrm E(V_i V_j) - \mathrm E(\mathrm E(V_i\mid D)\mathrm E(V_j\mid D)) $
My goal is to determine whether $H$ is positive semi-definite: it is the Hessian of a log-likelihood function, and I would like to know whether that function is convex.
If I can show that $\langle V_i, V_j \rangle = H_{ij}$ is an inner product, then $H$ is a Gram matrix and so it is positive semi-definite. (In fact, for positive semi-definiteness it is enough that $\langle \cdot, \cdot \rangle$ is a symmetric, bilinear, non-negative form, since $a^\top H a = \langle \sum_i a_i V_i, \sum_i a_i V_i \rangle \geq 0$; definiteness is not required.)
The first two axioms for an inner product follow directly:
$ \langle V_i, V_j \rangle = \langle V_j, V_i \rangle, \qquad \langle aV_i, V_j \rangle = a \langle V_i, V_j \rangle $
Now, to determine whether the third axiom (non-negativity) holds,
$ \langle V_i, V_i \rangle \geq 0 $
I need to determine whether it is true that
$ \mathrm E(V_i^2) - \mathrm E(\mathrm E(V_i\mid D)^2) \geq 0 $
By the law of total expectation, $\mathrm E(V_i^2) = \mathrm E(\mathrm E(V_i^2\mid D))$, so the left-hand side equals
$ \mathrm E\big(\mathrm E(V_i^2\mid D) - \mathrm E(V_i\mid D)^2\big) = \mathrm E(\mathrm{Var}(V_i\mid D)) $
From the conditional form of the Cauchy–Schwarz (or Jensen's) inequality, $\mathrm E(V_i\mid D)^2 \leq \mathrm E(V_i^2\mid D)$, so $\mathrm{Var}(V_i\mid D) \geq 0$ and therefore $\langle V_i, V_i \rangle \geq 0$ is true.
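As a numerical sanity check (not a proof), here is a small simulation of the scalar identity $\mathrm E(V_i^2) - \mathrm E(\mathrm E(V_i\mid D)^2) = \mathrm E(\mathrm{Var}(V_i\mid D))$; the discrete $D$ and the particular group means are arbitrary assumptions chosen for illustration, and conditional moments are estimated by group averages:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical example: D is discrete, V depends on D plus independent noise.
D = rng.integers(0, 3, size=n)
V = np.array([0.0, 1.5, -2.0])[D] + rng.normal(size=n)

# Estimate E(V | D) and Var(V | D) by the sample mean/variance within each group.
cond_mean = np.empty(n)
cond_var = np.empty(n)
for d in range(3):
    mask = D == d
    cond_mean[mask] = V[mask].mean()
    cond_var[mask] = V[mask].var()

lhs = (V**2).mean() - (cond_mean**2).mean()  # <V_i, V_i>
rhs = cond_var.mean()                        # E(Var(V | D))
print(np.isclose(lhs, rhs), lhs >= 0)        # True True
```

At the sample level the two sides agree exactly (up to floating point), because the group mean used for `cond_mean` is the exact sample mean of each group.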
I am quite far from my comfort zone. Is my reasoning OK?
Update: It would also be interesting to hear about other ways to prove that $H$ is positive semi-definite.
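For what it's worth, the full matrix can also be checked empirically. Below is a hedged Monte Carlo sketch: everything about the joint distribution (a discrete $D$, the loading matrix `A`, Gaussian noise) is an arbitrary assumption for illustration. It estimates $H$ from samples and checks that its smallest eigenvalue is non-negative up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100_000, 4  # samples, number of V components

# Hypothetical dependent construction: D is discrete, each V_i depends on D plus noise.
D = rng.integers(0, 3, size=n)
A = rng.normal(size=(k, 3))             # arbitrary loadings, one column per value of D
V = A[:, D] + rng.normal(size=(k, n))   # V[i] = A[i, D] + eps_i

# Estimate E(V_i | D) by the group mean of V_i over each value of D.
cond_mean = np.empty_like(V)
for d in range(3):
    mask = D == d
    cond_mean[:, mask] = V[:, mask].mean(axis=1, keepdims=True)

# H_ij = E(V_i V_j) - E(E(V_i|D) E(V_j|D)), estimated by sample averages.
H = (V @ V.T) / n - (cond_mean @ cond_mean.T) / n
eigvals = np.linalg.eigvalsh(H)
print(eigvals.min() >= -1e-8)  # True: empirically PSD
```

In this construction the empirical $H$ is exactly a sum of outer products $\frac{1}{n}\sum_t (V_t - m_{D_t})(V_t - m_{D_t})^\top$, so it is positive semi-definite by construction, which matches the claim.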