
I have an interesting question, but I am confused about one aspect of it and am hoping to get it cleared up. I am also not very confident in my answers, so it is possible I am making some mistakes, big or small.

Suppose that in any 1 hour time period, either a person shows up to wait for a new doctor, or no one shows up, with probabilities $p$ and $(1-p)$ respectively.

Now suppose we check the waiting room after M hours, where $M$ is Poisson distributed with parameter $\lambda$.

I was asked,

What would

$P[S_{M}=k]$ be, for $k=0,1,\dots$, and

$E[M \mid S_{M}=k]$?

I'll explain what I have and where I am confused.

$P[S_{m}=0]=(1-p)^{m}$

$P[S_{m}=1]=\binom{m}{1}(1-p)^{m-1}p$

$\vdots$

$P[S_{m}=k]=\binom{m}{k}(1-p)^{m-k}p^{k}$

Now the issue here is that I am calculating the probability using a fixed $m$, not the random variable $M$ as in the question.

But $P[S_{M}=k]= \sum_{m=k}^{\infty} P[S_{m}=k]\,P[M=m]$; moreover, $M$ is Poisson, so we know its pmf.

So is that as simplified as I could make it?

Essentially, $$P[S_{M}=k]=P[S_{k}=k]P[M=k]+P[S_{k+1}=k]P[M=k+1]+...$$
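As a sanity check, this series can be evaluated numerically by truncating it. A minimal Python sketch (the values of $p$ and $\lambda$ are hypothetical, chosen just for illustration):

```python
import math

p, lam = 0.3, 5.0   # hypothetical parameter values

def p_sm_k(k, n_terms=120):
    """Truncation of P[S_M = k] = sum_{m >= k} C(m, k) p^k (1-p)^(m-k) P[M = m]."""
    total = 0.0
    for m in range(k, k + n_terms):
        binom = math.comb(m, k) * p**k * (1 - p) ** (m - k)
        # Poisson pmf computed in log space to avoid overflow for large m
        pois = math.exp(m * math.log(lam) - lam - math.lgamma(m + 1))
        total += binom * pois
    return total

# The probabilities over all k should sum to (essentially) 1
print(sum(p_sm_k(k) for k in range(60)))  # close to 1.0
```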

For the expected value part I am more lost. So far, I have used Bayes' theorem to get that

$$P[M=m|S_{M}=k]= \frac{P[S_{m}=k]P[M=m]}{P[S_{M}=k]}$$

but this too isn't very neat: it consists of $\binom{m}{k}(1-p)^{m-k}p^{k}\,\lambda^{m}e^{-\lambda}/m!$ all divided by the original expression,

thus my best thought would be that, by the definition of conditional expectation,

$$E[M|S_{M}=k]= k\,P[M=k\mid S_{M}=k]+(k+1)\,P[M=k+1\mid S_{M}=k]+\cdots$$

but again I am not sure this is clean or correct.

I am looking for some guidance on either where I went wrong or how to understand this better.

Also, I believe I could represent $M$ as a sum of the counts of hours in which a person showed up versus did not. For example, could expressions for other such questions be asked, such as $E[M_{0}\mid S_{M}=k]$, where $M=M_{0}+M_{1}$, with $M_{0}$ the total number of hours in which a person showed up and $M_{1}$ the total number of hours in which no one did?

Thanks all in advance.

  • What do $S_M$ and $S_m$ represent? (2017-01-17)
  • $S_{M}$ represents the random sum $X_1+X_2+\cdots$; $S_{m}=X_1+X_2+\cdots+X_m$. (2017-01-17)
  • In short, $S_M$ is a conditionally binomial random variable (given $M$) with parameters $M$ and $p$. (2017-01-17)

1 Answer

2

We have $S_M\mid M~\sim~ \mathcal{Bin}(M, p)$ and $M\sim\mathcal{Pois}(\lambda)$.

That is, $M$ is the count of Poisson arrivals (the hours checked), and $S_M$ is the count of successes among them, where a success is "a person shows up in that hour".

So indeed $$\begin{align}\mathsf P(S_M{=}k) ~&=~ \sum_{m=k}^\infty \mathsf P(S_M{=}k\mid M{=}m)~\mathsf P(M{=}m) \\[1ex] &=~ \sum_{m=k}^\infty\dfrac{m!~p^k(1-p)^{m-k}}{k!(m-k)!}\cdot\dfrac{\lambda^m\mathsf e^{-\lambda}}{m!}\\[1ex] &=~ \dfrac{(\lambda p)^k~e^{-\lambda}}{k!}\sum_{m=k}^\infty\dfrac{\big((1-p)\lambda\big)^{m-k}}{(m-k)!}\\[1ex] &=~ \dfrac{(\lambda p)^k~e^{-\lambda}}{k!}\sum_{n=0}^\infty\dfrac{\big((1-p)\lambda\big)^{n}}{n!}\end{align}$$

Which you may simplify by recalling that $\displaystyle e^x = \sum_{n=0}^\infty \frac{x^n}{n!}$

It should yield that $S_M\sim\mathcal{Pois}(\lambda p)$: successes occur with a Poisson distribution at rate $\lambda p$.
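This thinning result is easy to corroborate by simulation. A rough sketch (the parameter values are hypothetical, and the Poisson sampler is Knuth's multiplication method, since the standard library has none):

```python
import math
import random

random.seed(0)
p, lam = 0.3, 5.0   # hypothetical parameter values

def poisson(rate):
    """Knuth's method: count uniforms drawn until their product drops below e^(-rate)."""
    limit = math.exp(-rate)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

# Draw M ~ Pois(lam), then S_M | M ~ Bin(M, p), many times
samples = []
for _ in range(200_000):
    m = poisson(lam)
    samples.append(sum(random.random() < p for _ in range(m)))

print(sum(samples) / len(samples))   # should be near lam * p = 1.5
```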


And $\mathsf E(M\mid S_M=k) = \sum_{m=k}^\infty m~\mathsf P(M=m\mid S_M=k)$

Here $\mathsf P(M=m\mid S_M=k)$ is the probability of $m-k$ failures, given $k$ successes, among the Poisson arrivals ...

Then $\mathsf E(M\mid S_M=k)$ is the expected total count of successes and failures, given that there are $k$ successes...

> Could expressions for other such questions be asked, such as $E[M_0\mid S_M=k]$, where $M=M_0+M_1$, with $M_0$ the total number of hours in which a person showed up and $M_1$ the total number of hours in which no one did?

Indeed, and furthermore $M_0 = S_M$: the count of hours in which a person showed up.

So $\mathsf E(M\mid S_M=k) = k+\mathsf E(M_1\mid M_0=k)$.

Thus, $\mathsf E(M\mid S_M=k)= k+(1-p)\lambda$. ... why?
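One way to convince yourself of this formula is a quick Monte Carlo estimate of $\mathsf E(M\mid S_M=k)$. A sketch with hypothetical parameter values (the Poisson sampler is Knuth's multiplication method):

```python
import math
import random

random.seed(1)
p, lam, k = 0.3, 5.0, 2   # hypothetical parameter values

def poisson(rate):
    """Knuth's method: count uniforms drawn until their product drops below e^(-rate)."""
    n, prod = 0, 1.0
    limit = math.exp(-rate)
    while True:
        prod *= random.random()
        if prod <= limit:
            return n
        n += 1

# Average M over the draws in which S_M happened to equal k
total_m, hits = 0, 0
for _ in range(200_000):
    m = poisson(lam)
    s = sum(random.random() < p for _ in range(m))
    if s == k:
        total_m += m
        hits += 1

print(total_m / hits)   # should be near k + (1 - p) * lam = 5.5
```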

  • Thanks; I am a bit confused how you factored out the sum like that? (2017-01-17)
  • @Quality It is a series; you can factor out the constant factors, i.e. those depending only on $k$ and $\lambda$. In this case I left in the factors that were powers of $(m-k)$, or the factorial, to set things up for a change of index, $n\gets m-k$. (2017-01-17)
  • Oh, because $k$ is also fixed; that makes sense. This makes more sense now. There is still one thing I am trying to work out; I have updated my post. (2017-01-17)
  • Thanks, but actually I am still having trouble figuring out how you got that form for the expected value, i.e. $k+(1-p)\lambda$; I have tried Bayes' theorem and using the first part, but I don't know how to get it in this form. (2017-01-19)
  • Similarly to the first part, failures occur in a Poisson process at a rate of $(1-p)\lambda$, and this is independent of the number of successes that occur. (2017-01-19)
  • Okay, but when I try to do it out using the pmf definitions I can't seem to get it. (2017-01-19)
  • @Quality Did you try the following? $$\begin{align}\mathsf P(M_0{=}j\mid M_1{=}k) ~&=~\dfrac{\mathsf P(M_0=j, M_1{=}k)}{\mathsf P(M_1{=}k)} \\[1ex]~&=~ \dfrac{\mathsf P(M_0=j, M_0{+}M_1{=}j{+}k)}{\mathsf P(M_1{=}k)} \\[1ex]~&=~ \dfrac{\mathsf P(M_0{=}j\mid M_0{+}M_1{=}j{+}k)\cdot\mathsf P( M_0{+}M_1{=}j{+}k)}{\mathsf P(M_1{=}k)}\end{align}$$ (2017-01-19)
  • I will try that. I was trying to find the expected value using just $M$, before even considering $M_0$ and $M_1$. (2017-01-19)
  • I just keep getting that $P[M=m|S_{M}=k]=e \lambda^{m-k}$. (2017-01-19)
  • I want to do it by using Bayes' theorem, noting that $P[M=m|S_{M}=k]=P[S_{M}=k|M=m]\,P[M=m]$ divided by $P[S_{M}=k]$. (2017-01-19)
  • Yes, that works too, as: $$\begin{align}\mathsf P(M=m\mid S_M=k)~&=~ \dfrac{\dbinom m k p^k (1-p)^{m-k}\cdot \lambda^m e^{-\lambda}/ m!}{(\lambda p)^ke^{-\lambda p}/k!}\\[1ex]~&=~\dfrac{\big((1-p)\lambda\big)^{m-k}e^{-(1-p)\lambda}}{(m-k)!}\quad\Big[m\in \{k,k+1, ....\}\Big] \\[2ex] \mathsf E(M\mid S_M=k)~&=~ \sum_{m=k}^\infty m~\mathsf P(M=m\mid S_M=k) \\[1ex] &=~ \sum_{n=0}^\infty (k+n)\dfrac{\big((1-p)\lambda\big)^ne^{-(1-p)\lambda}}{n!}\end{align}$$ (2017-01-19)
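For completeness, that final series can be checked numerically. A small sketch with hypothetical parameter values:

```python
import math

p, lam, k = 0.3, 5.0, 2      # hypothetical parameter values
mu = (1 - p) * lam           # rate of the leftover (failure) Poisson part

# E(M | S_M = k) = sum_{n>=0} (k + n) * mu^n * e^(-mu) / n!
e_m = sum((k + n) * mu**n * math.exp(-mu) / math.factorial(n) for n in range(80))
print(e_m)   # close to k + (1 - p) * lam = 5.5
```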