
Assume that $V_i$, $V_j$, and $D$ are real-valued, mutually dependent random variables, and let the matrix $H$ be defined by

$ H_{ij} = \mathrm E(V_i V_j) - \mathrm E(\mathrm E(V_i\mid D)\mathrm E(V_j\mid D)) $

My goal is to determine whether $H$ is positive semi-definite: it is the Hessian of a log-likelihood function whose convexity I would like to establish.

If I can show that $\langle V_i, V_j \rangle = H_{ij}$ defines an inner product, then $H$ is a Gram matrix and hence positive semi-definite.

The first two axioms for an inner product follow directly:

$ \langle V_i, V_j \rangle = \langle V_j, V_i \rangle, \qquad \langle aV_i, V_j \rangle = a \langle V_i, V_j \rangle $

Now, to determine whether the third axiom holds

$ \langle V_i, V_i \rangle \geq 0 $

I need to determine whether it is true that

$ \mathrm E(V_i^2) - \mathrm E(\mathrm E(V_i\mid D)^2) \geq 0 $

By the conditional form of Jensen's inequality (or the conditional Cauchy–Schwarz inequality), $\mathrm E(V_i\mid D)^2 \leq \mathrm E(V_i^2\mid D)$ almost surely, and taking expectations of both sides gives

$ \mathrm E(\mathrm E(V_i\mid D)^2) \leq \mathrm E(\mathrm E(V_i^2\mid D)) = \mathrm E(V_i^2) $

so $\langle V_i, V_i \rangle \geq 0$ is true. Equivalently, by the law of total variance,

$ \mathrm E(V_i^2) - \mathrm E(\mathrm E(V_i\mid D)^2) = \mathrm E(\operatorname{Var}(V_i\mid D)) \geq 0 $
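As a quick numerical sanity check (not part of the proof), here is a sketch in Python/NumPy that verifies $\mathrm E(V^2) \geq \mathrm E(\mathrm E(V\mid D)^2)$ exactly on a small finite joint distribution; the support points and probabilities below are arbitrary made-up values:

```python
import numpy as np

# Exact check of E[V^2] >= E[(E[V|D])^2] on a small finite joint
# distribution of (V, D); the values below are an arbitrary example.
vals_v = np.array([-1.0, 0.5, 2.0])   # support of V
# joint probabilities p[i, j] = P(V = vals_v[i], D = d_j)
p = np.array([[0.10, 0.15],
              [0.20, 0.25],
              [0.05, 0.25]])
assert np.isclose(p.sum(), 1.0)

e_v2 = vals_v**2 @ p.sum(axis=1)       # E[V^2]
p_d = p.sum(axis=0)                    # marginal P(D = d_j)
e_v_given_d = (vals_v @ p) / p_d       # E[V | D = d_j]
e_cond_sq = (e_v_given_d**2) @ p_d     # E[(E[V|D])^2]

print(e_v2 - e_cond_sq >= 0)  # → True
```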

I am quite far from my comfort zone. Is my reasoning OK?

Update: I would also be interested in hearing about other ways to prove that $H$ is positive semi-definite.


1 Answer


I still can't see anything wrong with the approach in the question.

I figured out an alternative way to prove it:

$ H_{ij} = \mathrm E(V_i V_j) - \mathrm E(\mathrm E(V_i\mid D)\mathrm E(V_j\mid D)) $

Noting that

$ \begin{align} \operatorname{Cov}(V_i,V_j\mid D) & = \mathrm E[(V_i-\mathrm E[V_i\mid D])(V_j-\mathrm E[V_j\mid D])\mid D] \\ & = \mathrm E[V_iV_j\mid D] - \mathrm E[V_i\mid D]\,\mathrm E[V_j\mid D] \end{align} $

and taking expectations of both sides (using the tower property $\mathrm E[\mathrm E[V_iV_j\mid D]] = \mathrm E[V_iV_j]$), we get

$ H_{ij} = \mathrm E[\operatorname{Cov}(V_i,V_j\mid D)] $

Let

$ A_{ij} = \operatorname{Cov}(V_i,V_j\mid D) $

which, being a conditional covariance matrix, is positive semidefinite almost surely:

$ x^TAx \geq 0 \quad \forall x $

Taking expectations,

$ \mathrm E[x^TAx] = x^T\mathrm E[A]\,x = x^THx \geq 0 \quad \forall x $

So $H$ is positive semidefinite.
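For what it's worth, a small simulation sketch of this argument (assuming a discrete $D$ and an arbitrary, made-up dependence of $V = (V_1, V_2)$ on $D$): the sample analogue of $\mathrm E[\operatorname{Cov}(V\mid D)]$ is a probability-weighted average of within-group covariance matrices, each of which is PSD, so the result should be PSD up to floating-point error:

```python
import numpy as np

# Monte Carlo sketch: with D discrete, estimate H = E[Cov(V | D)] as a
# weighted average of within-group sample covariance matrices.
rng = np.random.default_rng(0)
n = 10_000
d = rng.integers(0, 3, size=n)            # discrete D with 3 levels
noise = rng.standard_normal((n, 2))
# V_1, V_2 depend on D through their means and share a noise term
v = np.column_stack([d + noise[:, 0],
                     2.0 * d + noise[:, 0] + noise[:, 1]])

H = np.zeros((2, 2))
for k in np.unique(d):
    grp = v[d == k]                        # observations with D = k
    H += (len(grp) / n) * np.cov(grp, rowvar=False, bias=True)

eigvals = np.linalg.eigvalsh(H)
print(eigvals.min() >= -1e-10)  # → True (PSD up to rounding)
```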
