
I would really appreciate it if you could help with the following question.

Let $W_1,W_2,\ldots$ be the event times in a Poisson process $\{X(t);t\ge0\}$ of rate $\lambda$, and let $f(w)$ be an arbitrary function. Verify that $ \mathbb{E}\left[\sum_{i=1}^{X(t)}f(W_i)\right]=\lambda\int_0^tf(w)\,dw. $

(Originally posted as an image at this link: http://i.stack.imgur.com/3e6mf.png.) I tried conditioning on $X(t) = n$, but I am unsure how to proceed after that. Also, I am not sure how to show this is true, since $f(w)$ is an arbitrary function and I do not understand what that means.

Thanks a bunch for the help! I really appreciate it!

  • @George I didn't know how to put math symbols on this forum, so I attached the image. But thanks for your response and for transcribing it here. (2011-03-18)

2 Answers


Hint: Indeed, condition on $X(t)=n$. Then, note that given $X(t)=n$, the locations of the $n$ points are distributed as $n$ i.i.d. uniform$[0,t]$ rv's. This should lead you easily to the result, noting that ${\rm E}f(U)=\frac{1}{t}\int_0^t {f(w)dw} $, where $U$ is a uniform$[0,t]$ rv.

EDIT: More precisely (cf. cardinal's comment below), given $X(t)=n$, the points $W_1,\ldots,W_n$ are distributed as $n$ order statistics from the uniform distribution on $[0,t]$. However, the sum $f(W_1)+\cdots+f(W_n)$ (given $X(t)=n$) is equal in distribution to the sum $f(U_1)+\cdots+f(U_n)$ where the $U_i$ are i.i.d. uniform$[0,t]$ rv's, and thus the result follows straightforwardly using the hint.
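As a sanity check on the conditional-uniformity fact (my addition, not part of the original answer; it assumes NumPy, and the parameter values are arbitrary), here is a short simulation: it builds Poisson paths from exponential inter-arrival gaps, conditions on $X(t)=2$ by rejection, and checks that the first arrival $W_1$ behaves like the minimum of two uniforms on $[0,t]$, whose mean is $t/3$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_target = 1.0, 1.0, 2  # illustrative parameter choices

first_arrivals = []
for _ in range(200_000):
    # Build one Poisson(lam) path on [0, t] from exponential inter-arrival gaps.
    arrivals = []
    s = rng.exponential(1 / lam)
    while s <= t:
        arrivals.append(s)
        s += rng.exponential(1 / lam)
    # Condition on X(t) = 2 by rejection: keep only paths with exactly 2 points.
    if len(arrivals) == n_target:
        first_arrivals.append(arrivals[0])

# Given X(t) = 2, (W_1, W_2) are the order statistics of 2 uniforms on [0, t],
# so E[W_1 | X(t) = 2] = t / 3.
print(np.mean(first_arrivals))  # ≈ 0.333
```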

EDIT (additional hint, in response to the OP's request): $ \sum\nolimits_{n = 0}^\infty {n{\rm P}(X(t) = n)} = {\rm E}[X(t)] = \,? $ The left-hand side of the identity you are verifying is given by $ \sum\nolimits_{n = 0}^\infty {n{\rm E}[f(U)]\,{\rm P}(X(t) = n)} = {\rm E}[f(U)]\sum\nolimits_{n = 0}^\infty {n{\rm P}(X(t) = n)}, $ where $U$ is a uniform$[0,t]$ random variable.

EDIT (in response to the OP's request): In general, if $Y$ is a random variable with density function $h$, then ${\rm E}[f(Y)] = \int {f(y)h(y)dy} $. A uniform$[0,t]$ random variable, $U$, has constant density function $h(y)=1/t$ for $y \in [0,t]$ (and $0$ otherwise). From this it follows that ${\rm E}[f(U)] = \frac{1}{t}\int_0^t {f(y)dy} $.

As for the summation $\sum\nolimits_{n = 0}^\infty {n{\rm E}[f(U)]\,{\rm P}(X(t) = n)}$, first note that $ {\rm E}\Big[\sum\nolimits_{i = 1}^{X(t)} f(W_i) \Big] = \sum\nolimits_{n = 0}^\infty {\rm E}\Big[\sum\nolimits_{i = 1}^n f(W_i) \,\Big|\, X(t) = n \Big]\,{\rm P}(X(t) = n). $ Now recall that given $X(t)=n$, the sum $f(W_1)+\cdots+f(W_n)$ is equal in distribution to $f(U_1)+\cdots+f(U_n)$, where the $U_i$ are i.i.d. uniform$[0,t]$ rv's. This accounts for the factor $n\,{\rm E}[f(U)]$ in the sum.
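To make this concrete, here is a small Monte Carlo check of the full identity (my addition, assuming NumPy; $f(w)=w^2$, $\lambda=2$, $t=3$ are arbitrary choices for illustration). Since given $X(t)=n$ the sum is distributed as $f(U_1)+\cdots+f(U_n)$ with i.i.d. uniforms, we can simulate directly and compare against $\lambda\int_0^t f(w)\,dw$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t = 2.0, 3.0
f = lambda w: w ** 2  # arbitrary test function; lam * ∫_0^t w^2 dw = 18

trials = 500_000
# X(t) ~ Poisson(lam * t); given X(t) = n, the arrival times are n i.i.d.
# uniform[0, t] rv's (their order does not matter for the sum).
counts = rng.poisson(lam * t, size=trials)
all_points = rng.uniform(0, t, size=counts.sum())  # pool the points of all trials
estimate = f(all_points).sum() / trials  # mean of the per-trial sums

exact = lam * t ** 3 / 3
print(estimate, exact)  # ≈ 18.0, 18.0
```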

Further questions?

  • @icobes: Yes, ${\rm E}[X(t)]=\lambda t$. (2011-03-18)

One way of solving this problem is to rewrite the sum as a Riemann-Stieltjes integral, $ \sum_{i=1}^{X(t)}f(W_i)=\int_0^tf(s)\,dX_s. $ Using the fact that $X(s)$ has expectation $\mathbb{E}[X(s)]=\lambda s$, you can take expectations* of this to get $ \mathbb{E}\left[\sum_{i=1}^{X(t)}f(W_i)\right]=\lambda\int_0^tf(s)ds. $

(*) To be rigorous, you should show that the expectation does indeed commute with the integral in this way. This is a general result that can be used in many situations, and it is a very special case of the fact that stochastic integrals with respect to martingales are martingales ($X(s)-\lambda s$ is in fact a martingale). In the case where $f$ is piecewise constant, the integral reduces to a finite sum of increments, and expectation commutes with summation by linearity. The extension to general locally integrable $f$ is a consequence of the monotone class theorem.
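As a numerical illustration of the piecewise-constant (step function) case, where the integral reduces to a finite sum over increments (a sketch of mine, assuming NumPy; the step function and parameters are arbitrary choices): with $f = \sum_j a_j \mathbf{1}_{(t_{j-1},\,t_j]}$ we have $\int_0^t f\,dX = \sum_j a_j\,(X(t_j)-X(t_{j-1}))$, and the increments are independent Poisson random variables, so linearity of expectation gives $\lambda\int_0^t f(s)\,ds$.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, t = 1.5, 4.0
# Step function f taking value vals[j] on (edges[j], edges[j+1]].
edges = np.array([0.0, 1.0, 2.5, 4.0])
vals = np.array([2.0, -1.0, 0.5])
widths = np.diff(edges)

trials = 400_000
# Increments X(t_j) - X(t_{j-1}) are independent Poisson(lam * width_j).
counts = rng.poisson(lam * widths, size=(trials, len(vals)))
integrals = counts @ vals          # ∫_0^t f dX for each simulated path
estimate = integrals.mean()

exact = lam * (vals * widths).sum()  # lam * ∫_0^t f(s) ds
print(estimate, exact)  # ≈ 1.875, 1.875
```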