
From the Wikipedia inner product page: the expected value of the product of two random variables defines an inner product, $\langle X,Y \rangle = \operatorname{E}(X Y)$. How can this be generalized to random vectors?

Or, more generally, for probability measures: let $\mathbb{P}$ be the set of all probability measures defined on $X$, and let $\mathbb{M}$ be the linear span of $\mathbb{P} - \mathbb{P}$. How can an inner product be defined on $\mathbb{M} \times \mathbb{M}$?

I've looked at norms of the form $$\|P - Q\| = \sup_{f} \left| \int f \, dP - \int f \, dQ \right|,$$ but it seems that such a norm does not satisfy the parallelogram law, so the polarization identity $\langle x, y\rangle = \frac{1}{4}\left( \|x + y\|^{2} - \|x - y\|^{2}\right)$ cannot be used. Is it possible to prove this?
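If the supremum is taken over all measurable $f$ with $\|f\|_\infty \le 1$ (an assumption about the function class; this choice makes the norm the total variation norm on $\mathbb{M}$), then point masses at four distinct points $a, b, c, d$ give a concrete failure of the parallelogram law:

$$x = \delta_a - \delta_b, \qquad y = \delta_c - \delta_d \;\implies\; \|x\| = \|y\| = 2, \qquad \|x + y\| = \|x - y\| = 4,$$

so $\|x+y\|^2 + \|x-y\|^2 = 32 \ne 16 = 2\left(\|x\|^2 + \|y\|^2\right)$.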

  • 1
    I'm not sure I understand the second question or how it's a generalization of the first. For random vectors in, say, a Hilbert space, I guess you can take the expectation of their inner product. – 2011-05-03
  • 0
    Let me clarify: in the first case we assume that the inner product is defined as $\langle X, Y \rangle = \int x y \, dP_{XY}$. In the second case I'm more interested in the possibility of defining inner products based on so-called integral probability metrics. – 2011-05-03
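To make the random-vector case from the comments concrete, here is a minimal sketch (an assumption, not from the original post) taking $\langle X, Y \rangle = \operatorname{E}[X^{\mathsf T} Y]$ for random vectors in $\mathbb{R}^n$, estimated by Monte Carlo; the setup $X = Z$, $Y = Z + W$ with $Z, W$ independent standard normal is chosen so the exact value is $\operatorname{E}[Z^{\mathsf T} Z] = n$:

```python
import numpy as np

# Assumed generalization of <X, Y> = E(XY) to random vectors in R^n:
# <X, Y> = E[X^T Y], which is symmetric, bilinear, and <X, X> = E|X|^2 >= 0.
rng = np.random.default_rng(0)
n, samples = 3, 200_000

Z = rng.standard_normal((samples, n))
W = rng.standard_normal((samples, n))
X = Z          # X = Z
Y = Z + W      # Y = Z + W, so E[X^T Y] = E[Z^T Z] + E[Z^T W] = n + 0 = n

# Monte Carlo estimate of E[X^T Y]: average the per-sample dot products.
inner = np.mean(np.sum(X * Y, axis=1))
print(inner)   # close to n = 3
```

This is exactly the first commenter's suggestion specialized to the Hilbert space $\mathbb{R}^n$: take the expectation of the pointwise inner product.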

1 Answer