4

I need to get familiar with probabilities over separable Hilbert spaces. It's a bit difficult for me to think about a probability distribution over a space of functions: I can't imagine how it could be defined, except in terms of finite-dimensional objects.

For example, let's consider the only probability distribution over a function space that I know of: the Gaussian Process. We say that

$f\sim GP(\mu(x),c(x,x'))$

if, for any finite set $X_n=\{x_1, \dots, x_n\}$, $\mathbf{f}=(f(x_1),\dots,f(x_n))$ is a random vector having the multivariate normal distribution $\mathcal{N}(\boldsymbol{\mu},\boldsymbol{\Sigma})$, with $\boldsymbol{\mu}=(\mu(x_1),\dots,\mu(x_n))$ and $\boldsymbol{\Sigma}$ a symmetric positive definite matrix with components $\Sigma_{ij}=c(x_i,x_j)$. Are there other practical examples of probability laws over function spaces? Are they always defined in terms of probability distributions of finite-dimensional vectors/matrices, or is it possible to define them directly in terms of infinite-dimensional objects?

Also, does it make sense to talk about a CDF for infinite-dimensional objects? It seems to me that we can at least talk of a Cumulative Distribution Functional. For example, it could make sense to compute

$P(F(x) < g(x) \quad \forall x \in X)$

where $F(x)$ is my random function and $g(x)$ is any element of the separable Hilbert space, interpreting the above inequality in the natural sense. I.e., we ask what the probability is that $F$ is less than $g$ for all $x\in X$, where $X$ is the domain of the functions in our separable Hilbert space. Then the role played by the CDF in finite-dimensional probability spaces would be taken by the functional $\mathcal{F}(g)=P(F(x) < g(x)\ \forall x\in X)$ (the name recalls linear functionals, but this functional is obviously not linear).
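To make such a functional concrete, here is a minimal Monte Carlo sketch with my own illustrative choices (not from the question): take $F$ to be Brownian motion on $[0,1]$, i.e. the GP with $\mu=0$ and $c(x,x')=\min(x,x')$, and estimate $\mathcal{F}(g)$ for the constant function $g\equiv 1$ on a discretized grid.

```python
import random

def brownian_path(n_steps=200, T=1.0):
    """Sample Brownian motion (the GP with c(x, x') = min(x, x')) on a grid."""
    dt = T / n_steps
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += random.gauss(0.0, dt ** 0.5)
        path.append(w)
    return path

def cdf_functional(g, n_samples=5000, n_steps=200, T=1.0):
    """Monte Carlo estimate of P(F(x) <= g(x) for all grid points x)."""
    dt = T / n_steps
    hits = 0
    for _ in range(n_samples):
        path = brownian_path(n_steps, T)
        if all(w <= g(k * dt) for k, w in enumerate(path)):
            hits += 1
    return hits / n_samples

random.seed(0)
# Reflection principle: P(max_{[0,1]} W <= 1) = 2*Phi(1) - 1 ~ 0.683 for the
# continuous path; the discrete grid biases the estimate slightly upward.
p = cdf_functional(lambda x: 1.0)
print(p)
```

The grid discretization is the usual compromise: the functional is defined on the whole function space, but we can only evaluate the pointwise constraint at finitely many points.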

  • 1
    What about an infinite sequence of i.i.d. random variables $\{X_n\}_{n=1}^{\infty}$ that are uniform over $[0,1]$? Or, if you want random functions, what about $g(t)=\sum_{n=1}^{\infty} \frac{1}{2^n} X_n \sin(2\pi n t)$? The latter may be an example of what Pepe was talking about in his answer.2017-02-24
  • 0
@Michael, ok, the second example made me understand better what Pepe was referring to. Thus your $g(t)$ would actually be a $g(\omega,t)$ - a process, and you're expanding it in a series of orthogonal basis functions. Clever! But I was asking also about the probability distribution of the infinite-dimensional object. If I understand correctly, the joint pdf would be $\lim_{n\to\infty}I_{[0,1]}(x_1)I_{[0,1]}(x_2)\cdots I_{[0,1]}(x_n)$, since they're iid, but is there also an example with non-independent variables?2017-02-24
  • 0
    If you do not want the $\{X_n\}$ to be i.i.d. then define $X_1=X_2$. I give a more detailed response in my answer below.2017-02-24

2 Answers

2

As in my comment above, we can consider a sequence of i.i.d. random variables $\{X_i\}_{i=1}^{\infty}$ uniform over $[0,1]$, and define $g(t) = \sum_{i=1}^{\infty} \frac{1}{2^i}X_i \sin(2\pi i t)$, which is perhaps an example of what Pepe was talking about in his answer.
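The truncation error of this series is easy to control, since $|X_i \sin(2\pi i t)| \leq 1$: stopping after $n$ terms costs at most $\sum_{i>n} 2^{-i} = 2^{-n}$. A minimal Python sketch of drawing one realization of the random function:

```python
import math
import random

def sample_g(n_terms=30):
    """Draw X_1..X_n ~ Uniform[0,1] once and return the truncated random
    function t -> sum_i X_i sin(2 pi i t) / 2^i.

    Since |X_i sin(.)| <= 1, truncating at n_terms incurs error at most
    sum_{i > n_terms} 2^-i = 2^-n_terms.
    """
    xs = [random.uniform(0.0, 1.0) for _ in range(n_terms)]
    def g(t):
        return sum(x * math.sin(2 * math.pi * (i + 1) * t) / 2 ** (i + 1)
                   for i, x in enumerate(xs))
    return g

random.seed(1)
g = sample_g()
# sin(2*pi*i) = 0 for every integer i, so g(1) = 0 up to roundoff
print(abs(g(1.0)) < 1e-9)  # → True
```

Each call to `sample_g` fixes one outcome $\omega$; the returned closure is the corresponding sample path $t \mapsto g(\omega, t)$.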


I have never seen an infinite-dimensional PDF or CDF; I don’t think there is a need for one. The $\{X_i\}$ variables are mutually independent if $$ P[\cap_{i=1}^m \{X_i\leq x_i\}] = \prod_{i=1}^m P[X_i \leq x_i] $$ for all integers $m>0$ and all $(x_1, \dots, x_m) \in \mathbb{R}^m$. Then outcome-sets expressed in terms of $g(t)$ are measurable if they are in the sigma-algebra generated by the $X_i$ variables.

So for example, what is $P[g(1)\leq 4.6]$? It can be shown that $\{g(1)\leq 4.6\}$ holds if and only if for all integers $j>0$ there is an integer $k_j>0$ such that $$\sum_{i=1}^m \frac{1}{2^i}X_i \sin(2 \pi i) \leq 4.6 + \frac{1}{j} \quad, \forall m \geq k_j $$

So, $$ \{g(1) \leq 4.6 \} = \cap_{j=1}^{\infty} \cup_{k=1}^{\infty} \cap_{m=k}^{\infty} \left\{ \sum_{i=1}^m\frac{1}{2^i}X_i\sin(2\pi i) \leq 4.6 + \frac{1}{j}\right\}$$ and so the probability of the complicated event $\{g(1)\leq 4.6\}$ is expressed by intersections and unions of basic events involving a finite number of the $X_i$ variables.
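As a numerical sanity check (a sketch, not part of the argument above): since $\sin(2\pi i)=0$ for every integer $i$, each partial sum at $t=1$ vanishes, so the event $\{g(1)\leq 4.6\}$ should hold on every sampled outcome, with every finite truncation already satisfying the bound.

```python
import math
import random

random.seed(5)
held = 0
for _ in range(1000):
    xs = [random.uniform(0.0, 1.0) for _ in range(40)]
    # Partial sums S_m = sum_{i=1}^m X_i sin(2 pi i) / 2^i, all numerically zero
    partials = [sum(xs[i] * math.sin(2 * math.pi * (i + 1)) / 2 ** (i + 1)
                    for i in range(m)) for m in range(1, 41)]
    if all(s <= 4.6 for s in partials):
        held += 1
print(held)  # → 1000: the event {g(1) <= 4.6} held on every sample
```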

1

The way we often talk about probability distributions on Hilbert spaces is by fixing a countable orthonormal basis and choosing an infinite number of random, square-summable coordinates (so that the norm is finite) to create the random object. Practical applications of this include Stochastic Partial Differential Equations, where a function itself evolves in time (in contrast to SDEs, where we have a random function of time). We can look at the Fourier coefficients as coordinates that themselves satisfy an (infinite) system of stochastic differential equations. Examples of this include the Navier-Stokes equation and Burgers' equation.
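A minimal sketch of the recipe in the first sentence (the decay rate $1/i$ for the coordinates is my own illustrative choice, not canonical): draw coordinates $X_i = Z_i/i$ with $Z_i$ i.i.d. standard normal, so that $\mathbb{E}\sum_i X_i^2 = \sum_i 1/i^2 = \pi^2/6 < \infty$ and the resulting element lies in $\ell^2$ almost surely.

```python
import math
import random

def random_hilbert_element(n_coords=1000):
    """Random coordinates c_i = Z_i / i with respect to a fixed orthonormal
    basis; E sum c_i^2 = sum 1/i^2 = pi^2/6 < inf, so the element is a.s.
    square-summable. (The 1/i decay is an illustrative, hypothetical choice.)
    """
    return [random.gauss(0.0, 1.0) / i for i in range(1, n_coords + 1)]

random.seed(3)
norms_sq = [sum(c * c for c in random_hilbert_element()) for _ in range(300)]
mean_norm_sq = sum(norms_sq) / len(norms_sq)
print(mean_norm_sq, math.pi ** 2 / 6)  # sample mean vs. the theoretical value
```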

A good introduction to this, if a tad abstract, is Giuseppe Da Prato's "An Introduction to Infinite-Dimensional Analysis", Chapter 1.
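To make the SPDE remark concrete, here is a hedged sketch (the equation, noise amplitude, and truncation level are all my own illustrative choices): a spectral Galerkin truncation of a stochastic heat equation on the torus reduces to the finite system of SDEs $dc_k = -k^2 c_k\,dt + \sigma\,dW_k$ for the Fourier coefficients, which we can integrate with Euler–Maruyama.

```python
import random

def stochastic_heat_modes(n_modes=8, n_steps=1000, T=0.5, sigma=0.3):
    """Euler-Maruyama for the truncated spectral system of a stochastic heat
    equation on the torus: dc_k = -k^2 c_k dt + sigma dW_k (an illustrative,
    hypothetical choice of equation and noise amplitude)."""
    dt = T / n_steps
    c = [1.0 / k for k in range(1, n_modes + 1)]  # initial Fourier coefficients
    for _ in range(n_steps):
        for k in range(n_modes):
            lam = (k + 1) ** 2  # eigenvalue of -d^2/dx^2 for mode k+1
            c[k] += -lam * c[k] * dt + sigma * random.gauss(0.0, dt ** 0.5)
    return c

random.seed(4)
modes = stochastic_heat_modes()
print(modes)  # high modes are strongly damped toward small stationary values
```

At each time the state is a random element of the Hilbert space, represented by its (truncated) coordinate vector, which is exactly the finite-dimensional handle the question asks about.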

  • 0
Can you give a specific example of a probability law defined on a separable Hilbert space which is not a Gaussian Process? Also, is there a concept of CDF or something similar for these probabilities?2017-02-23
  • 0
    The other answer says most of what I was going to, but as an other example consider the stochastic PDEs I referred to earlier. For SPDEs on a compact space like a torus you can describe the PDE at time $t$ by its Fourier series coefficients, which now change through time. This can often be expressed as a countable system of SDEs, which can have whatever distribution you like.2017-02-24
  • 0
To answer part of your question about functionals though, if we generate a random object in Hilbert space the way I described, with coordinates $\{X_i\}_{i=0}^\infty$, its norm is $\sqrt{\sum_{i=0}^\infty X_i^2}$, so in some cases you may be able to get good approximations of the size of the norm (especially in cases where $\mathbb{P}[X_i=0]\rightarrow 1$).2017-02-24
  • 0
    you meant in cases where $\mathbb{P}(X_i=0)\to 1$ for $i\to\infty$, right?2017-02-25
  • 0
    Yes, edited now.2017-02-25