
It is given in Papoulis that:

Two random processes X(t) and Y(t) are equal in the MS sense iff \begin{equation} E\{|\mathbf X(t)-\mathbf Y(t)|^2\} = 0 \end{equation} for every t.

It follows that \begin{equation} \mathbf X(t, \xi) = \mathbf Y(t,\xi) \end{equation} with probability 1 where $\xi$ is an outcome.

I have two questions here:

1) What does it mean to say that two random processes are equal with probability 1, and how is that different from the two processes being equal, i.e., $\mathbf X(t,\xi) = \mathbf Y(t,\xi)$, $\forall\xi$?

2) How does the first equation imply the second?

The exact text given in the book:

Two processes X(t) and Y(t) are equal in MS sense iff $ E\{|\mathbf X(t)-\mathbf Y(t)|^2\} = 0 $ for every t. Equality in the MS sense leads to the following conclusion: We denote by $A_t$ the set of outcomes $\xi$ such that $\mathbf X(t,\xi)=\mathbf Y(t,\xi)$ for a specific t, and by $A_\infty$ the set of outcomes $\xi$ such that $\mathbf X(t,\xi)=\mathbf Y(t,\xi)$ for every t. From the above equation it follows that $\mathbf X(t,\xi)-\mathbf Y(t,\xi) = 0$ with probability 1; hence $P(A_t)=P(S)=1$. It does not follow, however, that $P(A_\infty)=1$. In fact, since $A_\infty$ is the intersection of all sets $A_t$ as t ranges over the entire axis, $P(A_\infty)$ might even equal 0.

I'm having difficulty understanding/visualizing the above. Thank you very much for the reply to my previous questions, which helped me get halfway there. Can you give me an example of the following? I hope it will help me understand the conclusion above.

Let two sets $A_{t_1}$ and $A_{t_2}$ be defined as: $A_{t_1} = \{\xi:\mathbf X(t_1,\xi)=\mathbf Y(t_1,\xi)\}$ and $A_{t_2} = \{\xi:\mathbf X(t_2,\xi)=\mathbf Y(t_2,\xi)\}$.

MS equality implies: $P(A_{t_1}) = 1$ and $P(A_{t_2}) = 1$.

Please give an example of $\mathbf X(t,\xi)$ and $\mathbf Y(t,\xi)$ such that $P(A_{t_1}\cap A_{t_2})=0$.

2 Answers


Let us forget about $t$, since it does not matter here. Let $X,Y$ be two random variables. What does that mean? It means that there is a space $\Xi$ of all possible outcomes, and to each outcome $\xi$ there corresponds a value of each random variable, $X(\xi)$ and $Y(\xi)$. There is also a probability measure on $\Xi$, i.e. to each (measurable) subset $A\subseteq \Xi$ we assign a value $\mathsf P(A)\in[0,1]$.

  1. We say that $X = Y$ with probability $1$ if $\mathsf P(\xi:X(\xi)= Y(\xi)) = 1$. This is certainly different from $X(\xi) = Y(\xi)$ for all $\xi\in \Xi$: in the former case there can still be values of $\xi$ where $X$ differs from $Y$, but all these values together have probability $0$: $ \mathsf P(\xi:X(\xi)\neq Y(\xi)) = 0. $

  2. How does $\mathsf E|X-Y|^2 = 0$ imply $\mathsf P(X=Y)=1$? This is a property of the Lebesgue integral, the integral which appears in the definition of the expectation:

    if $f:\Xi\to\mathbb R$ is a non-negative measurable function and $\int\limits_\Xi f\mathrm d\mathsf P = 0$ then $\mathsf P(\xi:f(\xi) \neq 0) = 0$.

    which means that if the integral of a non-negative function is zero, then the function itself is zero except possibly on a negligible set (a set of measure zero).

    Recall that the expectation of a random variable $Z$ is defined as $\int\limits_\Xi Z\,\mathrm d\mathsf P$. In our case, $f(\xi):=|X(\xi) - Y(\xi)|^2$ is a non-negative function whose integral is zero, since $X=Y$ in the mean-square sense.

The dependence on $t$ in this problem is not important, since everything we do is for each fixed $t$. Also, to make everything rigorous, we assume measurability wherever we need it; this is just a technical assumption and shouldn't affect understanding of the idea. But if something is unclear to you, please tell me.
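The Lebesgue-integral argument is easiest to see on a finite outcome space, where the integral is just a weighted sum. Below is a toy sketch of my own (not from the answer): an outcome of probability zero is allowed to carry $X\neq Y$ without disturbing either $\mathsf E|X-Y|^2$ or $\mathsf P(X=Y)$.

```python
# Toy example: E|X - Y|^2 = 0 and P(X = Y) = 1, yet X(xi) != Y(xi)
# for one outcome -- that outcome simply has probability 0.

outcomes = ["a", "b", "c"]
prob = {"a": 0.5, "b": 0.5, "c": 0.0}   # "c" is a null event
X = {"a": 1.0, "b": 2.0, "c": 3.0}
Y = {"a": 1.0, "b": 2.0, "c": 7.0}      # differs from X only on "c"

# E|X - Y|^2 as a sum over the (finite) outcome space
mse = sum(prob[w] * abs(X[w] - Y[w]) ** 2 for w in outcomes)
print(mse)                               # 0.0 -> X = Y in the MS sense

# P(X = Y): total probability of the outcomes where X equals Y
p_equal = sum(prob[w] for w in outcomes if X[w] == Y[w])
print(p_equal)                           # 1.0 -> X = Y with probability 1

# ...and yet X(xi) = Y(xi) fails for xi = "c"
print(X["c"] == Y["c"])                  # False
```

Here the set $\{\xi: X(\xi)\neq Y(\xi)\} = \{c\}$ is exactly the negligible set from the Lebesgue-integral property above.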

  • Thank you very much for the quick answers. I've updated the question. I hope I'll get further answers soon. (2012-05-28)

For your additional question: it's enough to consider $Z(t) = X(t) - Y(t)$. We are interested in finding some stochastic process $Z(t)$ such that, for each $t$, $P[Z(t)=0]=1$, but such that $P[Z(t)=0, \forall t]=0$.

Consider the (deterministic) function $W(t)=0$ for $t\ne 0$, $W(0)=1$. Suppose we construct $Z(t)$ by drawing a random number $a$ (say, uniformly in $[0,1]$) and defining $Z(t) = W(t-a)$.

Then, clearly $P(Z(t)=0)=1$ for each $t$, but the probability that $Z(t)=0$ for all $t$ is zero.

Moved from comment: Regarding the example you are seeking: I don't think you will find it. A finite (or countably infinite) intersection of sets of full measure (probability 1) is again a set of full measure. Only with uncountably infinite intersections can things change. E.g., take, on the unit interval, the sets $A_x=(0,1)\setminus\{x\}$ (the full interval with one point removed). They all have unit measure, $P(A_x)=1$, but their intersection is empty, hence of measure zero: $\bigcap\limits_{x\in (0,1)} A_x = \emptyset$.
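The claim about finite intersections follows from a one-line union bound, which is worth writing out for the two sets in your question:

```latex
% Pass to complements and use subadditivity of P:
P(A_{t_1} \cap A_{t_2})
  = 1 - P(A_{t_1}^c \cup A_{t_2}^c)
  \ge 1 - P(A_{t_1}^c) - P(A_{t_2}^c)
  = 1 - 0 - 0
  = 1.
```

The same bound, $P\bigl(\bigcup_n A_{t_n}^c\bigr) \le \sum_n P(A_{t_n}^c) = 0$, handles countably many sets; it breaks down only for uncountable families, which is exactly why $P(A_\infty)$ can drop to $0$.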

  • @leON: th$a$t's right (2012-05-30)