
I'd really love your help with this one.

In a survey, the probability that a person refuses to answer is $p$. The survey company calls people until it reaches the first person who refuses.

First I was asked to compute the geometric likelihood function $L(p;x_1=0,x_2=0,\ldots,x_5=1)$ and to draw its graph as a function of $p$, which I did.
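For concreteness, here is a minimal sketch of how such a graph could be produced (my own illustration, assuming Python with numpy/matplotlib; reading the data as four answers followed by one refusal gives $L(p) = (1-p)^4 \cdot p$):

```python
# Sketch only: plot the likelihood L(p) = (1 - p)^4 * p of observing
# four answers (x_1 = ... = x_4 = 0) followed by one refusal (x_5 = 1).
import numpy as np
import matplotlib.pyplot as plt

p = np.linspace(0, 1, 400)
L = (1 - p) ** 4 * p

plt.plot(p, L)
plt.xlabel("p")
plt.ylabel("L(p)")
plt.show()
```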

Now I need to find the relation between $p$ and the expected number of people called until the first refusal ($\mu$), and to use it to find the maximum likelihood estimator of $\mu$. How do I find this relation? I can't remember from my probability classes how to compute this expected value. Can you please remind me or give me a hint?

Thanks a lot!

  • Where are you stuck? (Mind you, this is not a "solve my homework for me" site.) Did you solve the first question? If so, show us. Did you compute the expected value? Did you compute the maximum likelihood? You just need to do the three natural steps and you are done. (2012-12-10)
  • So what's missing...? (2012-12-14)
  • @saz Nothing, thanks a lot. (2012-12-15)

1 Answer


Let $p_n$ be the probability that the first refusal occurs with the $n$-th person, i.e. the first $n-1$ people answer and the $n$-th refuses. Then $p_n$ is given by

$$p_n = \underbrace{(1-p)^{n-1}}_{\text{the first } n-1 \text{ answered}} \cdot p$$
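A quick numerical sanity check (my own illustration, not part of the original answer, assuming Python) that these probabilities sum to $1$:

```python
# Check that p_n = (1 - p)^(n - 1) * p sums to 1 over n = 1, 2, ...
p = 0.2
total = sum((1 - p) ** (n - 1) * p for n in range(1, 500))
print(total)  # ~1.0; the truncated tail (1 - p)^499 is negligible
```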

The expected value $\mu$ is defined as

$$\mu := \sum_{n=1}^\infty n \cdot p_n = \sum_{n=1}^\infty n \cdot (1-p)^{n-1} \cdot p = p \cdot \frac{d}{dp} \left( -\sum_{n=0}^\infty (1-p)^n \right) \\ = -p \cdot \frac{d}{dp} \frac{1}{1-(1-p)} = -p \cdot \left(- \frac{1}{p^2} \right) = \frac{1}{p}$$

where we used the geometric series $\sum_{n=0}^\infty q^n = \frac{1}{1-q}$ for $|q|<1$; note that including the constant $n=0$ term does not change the derivative.
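As a sanity check (my own addition, assuming numpy is available), a Monte Carlo estimate of the mean agrees with $1/p$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.2
x = rng.geometric(p, size=100_000)  # numpy's geometric has support {1, 2, ...}
print(x.mean())  # close to 1 / p = 5.0
```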

Now we can calculate the maximum likelihood estimator of $p$ (the estimator of $\mu$ then follows): let $x_1,\ldots,x_n$ be (independent) observations. Then the likelihood is given by

$$\mathcal{L}_p(x_1,\ldots,x_n) = \prod_{j=1}^n p \cdot (1-p)^{x_j-1} = p^n \cdot (1-p)^{-n} \cdot (1-p)^{\sum_{j=1}^n x_j} \\ \Rightarrow \ell_p(x_1,\ldots,x_n) := \log \mathcal{L}_p(x_1,\ldots,x_n) = n \cdot \log p - n \cdot \log(1-p) + \sum_{j=1}^n x_j \cdot \log(1-p)$$
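In code, the log-likelihood is a one-liner (a sketch of mine, assuming numpy; `x` holds the observations $x_j \geq 1$):

```python
import numpy as np

def log_likelihood(p, x):
    # ell_p(x_1, ..., x_n) = n log p - n log(1 - p) + (sum x_j) log(1 - p)
    x = np.asarray(x)
    n = len(x)
    return n * np.log(p) - n * np.log(1 - p) + x.sum() * np.log(1 - p)
```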

We want to find $\hat{p}$ such that

$$\ell_{\hat{p}}(x_1,\ldots,x_n)=\max_{p \in [0,1]} \ell_p(x_1,\ldots,x_n)$$

Assume that not all $x_j$ are equal to $1$, i.e. $\sum_{j=1}^n x_j > n$ (note that each $x_j \geq 1$ in this parametrization, so $\sum_{j=1}^n x_j = 0$ cannot occur). Then

$$\frac{d}{dp} \ell_p(x_1,\ldots,x_n) = \frac{n}{p} + \frac{1}{1-p} \cdot \left(n- \sum_{j=1}^n x_j \right) \stackrel{!}{=} 0 \\ \Leftrightarrow n(1-p) = p \cdot \left( \sum_{j=1}^n x_j - n \right) \Leftrightarrow \hat{p} = \frac{n}{\sum_{j=1}^n x_j} = \left( \frac{1}{n} \sum_{j=1}^n x_j \right)^{-1}$$

Since $\ell_p \to -\infty$ both as $p \to 0^+$ and as $p \to 1^-$, this unique critical point is indeed the maximizer:

$$\ell_{\hat{p}}(x_1,\ldots,x_n)=\max_{p \in (0,1)} \ell_p(x_1,\ldots,x_n)$$

You still have to check the boundary (i.e. $p=0$ and $p=1$) and the degenerate case $\sum_{j=1}^n x_j = n$ ($\Leftrightarrow \forall j: x_j = 1$, for which the likelihood is maximized at $\hat{p}=1$), but that's straightforward. Finally, since $\mu = \frac{1}{p}$, the invariance of the maximum likelihood estimator gives $$\hat{\mu} = \frac{1}{\hat{p}} = \frac{1}{n} \sum_{j=1}^n x_j,$$ i.e. $\hat{\mu}$ is the sample mean.
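To close the loop, a small numerical check (my own illustration; the data are made up) that the closed-form $\hat{p} = n / \sum_j x_j$ matches a brute-force grid search, and that $\hat{\mu}$ is the sample mean:

```python
import numpy as np

x = np.array([3, 1, 7, 2, 4])   # hypothetical observations, each x_j >= 1
n, s = len(x), x.sum()

p_hat = n / s                   # closed-form MLE of p: 5/17 ~ 0.294
mu_hat = s / n                  # MLE of mu by invariance: the sample mean

grid = np.linspace(1e-3, 1 - 1e-3, 10_000)
ll = n * np.log(grid) - n * np.log(1 - grid) + s * np.log(1 - grid)
print(p_hat, grid[np.argmax(ll)], mu_hat)  # ~0.294, ~0.294, 3.4
```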