
I found a contradiction I couldn't resolve by myself. It concerns "Uniform White Noise".

Let ${x}_{t}$ be an i.i.d. "White Noise" random process:

$ \forall t \in \mathbb{R}, \ {x}_{t} \sim U[-1, \ 1] $

If we go by the PSD definition of "White Noise" (constant over all frequencies), we get:

$ {R}_{xx}( \tau ) = var({x}_{t}) \delta ( \tau ) $
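(For reference: the PSD and the autocorrelation are a Fourier pair, so a constant PSD corresponds exactly to a delta autocorrelation. Writing $ {\sigma}^{2} = var({x}_{t}) $:)

$ {S}_{xx}(f) = \int_{-\infty}^{\infty} {R}_{xx}(\tau) {e}^{-j 2 \pi f \tau} d\tau = \int_{-\infty}^{\infty} {\sigma}^{2} \delta(\tau) {e}^{-j 2 \pi f \tau} d\tau = {\sigma}^{2} \quad \forall f $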

Yet, clearly:

$ E[{x}_{t} {x}_{t + \tau}] \underset{ \tau = 0}{=}E[{x}_{t} {x}_{t}]= \frac{1}{3} $
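This is just the variance of $U[-1, \ 1]$ (zero mean), computed directly from the density:

$ E[{x}_{t}^{2}] = \int_{-1}^{1} \frac{{x}^{2}}{2} dx = \frac{1}{3} $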

Intuitively, a process with bounded variance and bounded values can't be "White Noise".
Please mind that this is a continuous-time random process; we don't have this problem in the discrete case.

What am I missing here? Either there is no such "White Noise" (why?), or there is a good explanation (could someone derive it mathematically?) of how the delta arises in the autocorrelation.

Thanks.

1 Answer


Apparently, saying that $(x_t)$ is a continuous white noise process simply refers to the fact that $(x_t)$ is a continuous-time process, i.e. that $(x_t)$ is indexed by a continuous parameter set. The sample paths of $(x_t)$ are not assumed to be continuous, and in fact may be expected to be discontinuous at every fixed point (almost surely). Indeed, consider a continuous-time Gaussian white noise process. Then $E[x_s x_t]=0$ for $s \neq t$ implies that $x_s$ and $x_t$ are independent, and hence the sample paths must be discontinuous at every point (almost surely). The case of "Uniform White Noise" is essentially the same.
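As a sanity check in the discrete-time case (where the delta is a Kronecker delta and no paradox arises), the sample autocorrelation of i.i.d. $U[-1, \ 1]$ samples is $\approx 1/3$ at lag 0 and $\approx 0$ elsewhere. A minimal sketch with NumPy; the helper `sample_autocorr` is just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.uniform(-1.0, 1.0, size=n)  # i.i.d. U[-1, 1] samples

def sample_autocorr(x, max_lag):
    """Biased sample autocorrelation R_xx[k] = (1/N) * sum_t x[t] * x[t+k]."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

r = sample_autocorr(x, max_lag=5)
print(r)  # roughly [0.333, 0, 0, 0, 0, 0]: variance 1/3 at lag 0, ~0 at nonzero lags
```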

  • I think there is a mistake in this answer. $E[x_s x_t]=0$, $s \neq t$, is a much weaker property than independence. Independence requires that $P(x_t) = P(x_t | x_s)$, doesn't it? (2014-10-28)