
Let $(T_k)_k$ be a countable family of independent exponentially distributed random variables with $T_k \sim E(q_k)$. In the book by Norris (http://www.statslab.cam.ac.uk/~james/Markov/), on p. 72, I found the following identity:

$ \ \ \mathbb{P}(T_k\geq t\text{ and } T_j>T_k\text{ for all } j\neq k) = \int_t^\infty q_k e^{-q_ks} \mathbb{P}(T_j > s \text{ for all } j\neq k) ds, $

which I'm not able to prove rigorously. Any hint is much appreciated.
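For what it's worth, here is a quick Monte Carlo sketch (my own, not from the book) that I use to convince myself the identity is at least numerically plausible; it truncates to a finite family and the rates below are arbitrary. What I am missing is a rigorous argument.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)

# Finite truncation with arbitrary rates; index 0 plays the role of k.
q = np.array([1.0, 2.0, 0.5, 3.0])
t = 0.3
n = 10**6

# Draw T_j ~ E(q_j) (numpy parametrizes the exponential by the scale 1/q_j).
T = rng.exponential(scale=1.0 / q, size=(n, len(q)))

# Left-hand side: P(T_k >= t and T_j > T_k for all j != k).
lhs = np.mean((T[:, 0] >= t) & np.all(T[:, 1:] > T[:, [0]], axis=1))

# Right-hand side: for independent exponentials,
# P(T_j > s for all j != k) = exp(-s * sum_{j != k} q_j).
surv = lambda s: np.exp(-s * q[1:].sum())
rhs, _ = quad(lambda s: q[0] * np.exp(-q[0] * s) * surv(s), t, np.inf)

print(lhs, rhs)  # should agree up to Monte Carlo error
```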

  • **Hint**: For one approach, what is the distribution of the minimum of a collection of independent exponential random variables? (2012-02-24)

2 Answers


Let us rewrite the problem to emphasize the useful assumptions:

Let $X$ and $(Y_i)_i$ denote some random variables. Assume that the family $(Y_i)_i$ is countable and independent of $X$, and that the PDF of $X$ is $f$. Then, for every $z$, $ \mathrm P(X\geqslant z, Y_i\geqslant X\ \mbox{for all}\ i)=\int_z^{+\infty}\mathrm P(Y_i\geqslant x\ \mbox{for all}\ i)\,f(x)\,\mathrm dx. $

The simplest proof might be to introduce $Y=\inf_iY_i$. Then $Y$ is a random variable because the family $(Y_i)_i$ is countable. Call $\mu$ the distribution of $Y$. Since $X$ and $Y$ are independent, one has $ \mathrm P(X\geqslant z, Y_i\geqslant X\ \mbox{for all}\ i)=\mathrm P(Y\geqslant X\geqslant z)=\iint[y\geqslant x\geqslant z]\,f(x)\,\mathrm dx\,\mu(\mathrm dy). $ Furthermore, for every fixed $x$, $ \int[y\geqslant x]\,\mu(\mathrm dy)=\mu([x,+\infty))=\mathrm P(Y\geqslant x)=\mathrm P(Y_i\geqslant x\ \mbox{for all}\ i), $ and the assertion follows.

Application: Choose $X=T_k$, $f(x)=q_k\mathrm e^{-q_kx}$ on $x\geqslant0$, and $(Y_i)_i=(T_j)_{j\ne k}$.
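Aside (not needed for the identity itself, just the standard computation for this choice, assuming $q:=\sum_j q_j<+\infty$): since the $T_j$ are independent, $ \mathrm P(T_j\geqslant s\ \mbox{for all}\ j\ne k)=\mathrm e^{-s\sum_{j\ne k}q_j}, $ hence the right-hand side of the identity evaluates to $ \int_t^{+\infty}q_k\mathrm e^{-q_ks}\,\mathrm e^{-s(q-q_k)}\,\mathrm ds=\frac{q_k}{q}\,\mathrm e^{-qt}. $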

  • In order to obtain the double integral in your answer I proved that, by independence, the distribution of $(X,Y)$ is the product measure of the two distributions. Is this the only/best way to obtain the equality? (2012-02-27)

I didn't notice the "countable family" part, sorry. Maybe this answer is still of some value.

I will assume $k=1$ and a finite family $T_1, \dotsc, T_n$ here for convenience. The joint density $F(t_2, \dotsc, t_n)$ of $T_2, \dotsc, T_n$ is the product of the individual densities:

$ F(t_2, \dotsc, t_n) = \prod_{m=2}^nq_m e^{-q_mt_m}. $

The joint density of all the variables is therefore

$ q_1e^{-q_1t_1}F(t_2, \dotsc, t_n). $

The requested probability is then expressed by the integral

$ \int_{\Omega}q_1e^{-q_1t_1}F(t_2, \dotsc, t_n) dt_1 \cdots dt_n $

where $\Omega$ is the set that matches the conditions for this probability:

$ \Omega = \{ (t_1, \dotsc, t_n) \mid t_1 \geq t \textrm{ and } t_m \geq t_1 \textrm{ for } m = 2, \dotsc, n \} $

Write the integral more explicitly in terms of the coordinates to get

$ \int_t^{\infty} \int_{t_1}^{\infty} \cdots \int_{t_1}^{\infty} q_1e^{-q_1t_1}F(t_2, \dotsc, t_n) dt_n \cdots dt_2 dt_1 = \\ \int_t^{\infty} q_1e^{-q_1t_1} \int_{t_1}^{\infty} \cdots \int_{t_1}^{\infty} F(t_2, \dotsc, t_n) dt_n \cdots dt_2 dt_1 = \\ \int_t^{\infty} q_1e^{-q_1t_1} \mathbb{P}(T_m \geq t_1 \textrm{ for } m=2, \dotsc, n) dt_1 $
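For completeness, the last equality holds because, for fixed $t_1$, the inner integrals factor over the coordinates:

$ \int_{t_1}^{\infty} \cdots \int_{t_1}^{\infty} F(t_2, \dotsc, t_n)\, dt_n \cdots dt_2 = \prod_{m=2}^n \int_{t_1}^{\infty} q_m e^{-q_mt_m}\, dt_m = \prod_{m=2}^n e^{-q_mt_1} = \mathbb{P}(T_m \geq t_1 \textrm{ for } m=2, \dotsc, n). $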

  • @cardinal: Oh right, didn't realize that. (2012-02-24)