
The probability density function of the exponential distribution is defined as

$$ f(x;\lambda)=\begin{cases} \lambda e^{-\lambda x} &\text{if } x \geq 0 \\ 0 & \text{if } x<0 \end{cases} $$

Its likelihood function is

$$ \mathcal{L}(\lambda,x_1,\dots,x_n)=\prod_{i=1}^n f(x_i;\lambda)=\prod_{i=1}^n \lambda e^{-\lambda x_i}=\lambda^ne^{-\lambda\sum_{i=1}^nx_i} $$

To calculate the maximum likelihood estimator I solved the equation

$$ \frac{d\ln\left(\mathcal{L}(\lambda,x_1,\dots,x_n)\right)}{d\lambda}\overset{!}{=}0 $$

for $\lambda$.

$$ \begin{align} \frac{d\ln\left(\mathcal{L}(\lambda,x_1,\dots,x_n)\right)}{d\lambda} &= \frac{d\ln\left(\lambda^ne^{-\lambda\sum_{i=1}^nx_i}\right)}{d\lambda} \\ &= \frac{d\left(n\ln(\lambda)-\lambda\sum_{i=1}^n x_i\right)}{d\lambda} \\ &= \frac{n}{\lambda}-\sum_{i=1}^n x_i \end{align} $$

Finally we get $$\hat\lambda = \frac{n}{\sum\limits_{i=1}^n x_i}$$ (The second derivative is $-\frac{n}{\lambda^2}<0$, so this stationary point is indeed a maximum.)

I hope this is correct so far.
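One quick sanity check of the formula $\hat\lambda = n/\sum_i x_i$ is to simulate data with a known rate and see whether the estimate lands near it (a sketch; the rate $\lambda=2$ and sample size are assumed values, not from the question):

```python
import random

random.seed(0)

def exp_mle(samples):
    """Maximum likelihood estimate of the exponential rate: n / sum(x_i)."""
    return len(samples) / sum(samples)

true_lambda = 2.0  # assumed rate for the simulation
n = 100_000
# random.expovariate(lambd) draws from the exponential distribution with rate lambd
samples = [random.expovariate(true_lambda) for _ in range(n)]

lambda_hat = exp_mle(samples)
print(lambda_hat)  # should be close to true_lambda = 2.0 for large n
```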

Where I am more uncertain is the proof for consistency.

I understand that in this case consistency is equivalent to convergence in probability to $\lambda$. So I have a hunch that something like

$$ \lim_{n\to\infty}\mathbb{P}\left(\mathcal{L}(\lambda,x_1,\dots,x_n)-\lambda\right)=0 $$

will lead me to a solution.

Am I correct this far? If yes, how can I solve this? A hint would be great.


Update:

Using hints by users @Did and @cardinal I will try to show consistency by proving that $\frac{1}{\Lambda_n}\to\frac{1}{\lambda}$ almost surely as $n\to\infty$, where

$$ \Lambda_n=\frac{n}{\sum\limits_{k=1}^nX_k} $$

Since $E(X_1)=\int\limits_0^\infty\lambda xe^{-\lambda x}\,dx=\frac{1}{\lambda}$ and the random variables $X_i$, $i\ge1$, are i.i.d., the strong law of large numbers implies that

$$ P\left(\limsup_{n\to\infty}\left|\frac{1}{\Lambda_n}-\frac{1}{\lambda}\right|=0\right)=P\left(\limsup_{n\to\infty}\left|\frac1n\sum_{k=1}^nX_k-\frac{1}{\lambda}\right|=0\right)=1 $$

holds, that is, $\frac{1}{\Lambda_n}\to\frac{1}{\lambda}$ almost surely. Since $t\mapsto\frac1t$ is continuous at $\frac{1}{\lambda}>0$, this implies $\Lambda_n\to\lambda$ almost surely, hence also in probability, which is consistency.

Is this proof correct?
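To convince myself numerically (a sketch with assumed values $\lambda=2$ and a few sample sizes), the sample mean $\frac1n\sum_{k=1}^n X_k = \frac{1}{\Lambda_n}$ should settle near $\frac1\lambda$ as $n$ grows:

```python
import random

random.seed(1)

true_lambda = 2.0  # assumed rate

def inv_lambda_n(n):
    """1 / Lambda_n, i.e. the sample mean of n draws from Exp(true_lambda)."""
    xs = [random.expovariate(true_lambda) for _ in range(n)]
    return sum(xs) / n

# By the strong law of large numbers this approaches 1/lambda = 0.5.
for n in (10, 1_000, 100_000):
    print(n, inv_lambda_n(n))
```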

2 Answers


The computation of the MLE of $\lambda$ is correct.

The consistency is the fact that, if $(X_n)_{n\geqslant1}$ is an i.i.d. sequence of random variables with exponential distribution of parameter $\lambda$, then $\Lambda_n\to\lambda$ in probability, where $\Lambda_n$ denotes the random variable $$ \Lambda_n=\frac{n}{\sum\limits_{k=1}^nX_k}. $$ Thus, one is asked to prove that, for every positive $\varepsilon$, $\mathrm P(|\Lambda_n-\lambda|\geqslant\varepsilon)\to0$ when $n\to\infty$.

In the case at hand, it might be easier to prove the stronger statement that $\frac1{\Lambda_n}\to\frac1\lambda$ almost surely when $n\to\infty$. Hint: Law of large numbers.
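A quick Monte Carlo sketch of the convergence-in-probability statement above (parameter values are assumed for illustration): estimate $\mathrm P(|\Lambda_n-\lambda|\geqslant\varepsilon)$ by repeated sampling and watch it shrink as $n$ grows.

```python
import random

random.seed(2)

true_lambda, eps, reps = 2.0, 0.2, 1000  # assumed values for the sketch

def exceed_freq(n):
    """Empirical estimate of P(|Lambda_n - lambda| >= eps) over `reps` trials."""
    count = 0
    for _ in range(reps):
        s = sum(random.expovariate(true_lambda) for _ in range(n))
        if abs(n / s - true_lambda) >= eps:
            count += 1
    return count / reps

# The exceedance frequency should tend toward 0 as n grows.
for n in (10, 100, 1_000):
    print(n, exceed_freq(n))
```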

  • (+1) I think your answer and my comment were (nearly) simultaneous. I've deleted mine. (2012-01-23)
  • First, thank you for the explanation. So you mean I have to show that $P\left(\limsup_{n\to\infty}\left|\frac{1}{\Lambda_n}-\frac{1}{\lambda}\right|=0\right)=1$? (Strong law of large numbers.) Correct? (2012-01-23)
  • Weak law, if I'm not mistaken. You need to show convergence in probability, not almost sure convergence. (2012-01-23)
  • Aufwind: Yes, if you know this, you know that $\Lambda_n\to\lambda$ almost surely, hence you know that $\Lambda_n\to\lambda$ in probability, which is what you want. (2012-01-23)
  • @DidierPiau I tried another approach. Would you say that it is sufficient? Thanks for any of your efforts! (2012-01-23)
  • @Did, what if you had a prior on $\lambda$ uniformly distributed on $[0,1]$? (2015-06-16)
  • @Drazick IF new question THEN new post. (2015-06-16)
  • @Did, here you go - http://math.stackexchange.com/questions/1327752. (2015-06-16)

For $\hat\lambda= \frac{n}{\sum_{i=1}^n x_i}$ to be a consistent estimator of $\lambda$, it suffices that, asymptotically, it is

  1. unbiased,
  2. and its variance goes to zero.

Using $E\left\{ x\right\}=\frac{1}{\lambda}$ and $E\left\{ x^2\right\}=\frac{2}{\lambda^2}$ and the fact that the $x_i$ are i.i.d., we have

Condition 1: $\lim_{n\rightarrow \infty} E\{\hat\lambda - \lambda\}=0$

Condition 2: $\lim_{n\rightarrow \infty}E\left\{\left(\hat\lambda - E\{\hat\lambda\}\right)^2\right\}=0 $
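As a numerical check (not part of the original argument; rate and replication counts are assumed): since $\sum_{i=1}^n x_i \sim \mathrm{Gamma}(n,\lambda)$, the exact mean of $\hat\lambda$ is $\frac{n\lambda}{n-1}$, so the bias $\frac{\lambda}{n-1}$ vanishes as $n\to\infty$. A Monte Carlo sketch illustrates Condition 1:

```python
import random

random.seed(3)

true_lambda, reps = 2.0, 3000  # assumed values for the sketch

def mean_mle(n):
    """Monte Carlo estimate of E[lambda_hat] for sample size n."""
    total = 0.0
    for _ in range(reps):
        s = sum(random.expovariate(true_lambda) for _ in range(n))
        total += n / s  # lambda_hat for this replication
    return total / reps

# Since sum(x_i) ~ Gamma(n, lambda), the exact mean is n*lambda/(n-1),
# which approaches lambda as n grows.
for n in (5, 50, 500):
    print(n, mean_mle(n), n * true_lambda / (n - 1))
```

Note that $\hat\lambda$ is biased upward for finite $n$, which is exactly why the "identity" $E(1/Z)=1/E(Z)$ criticized in the comments below must not be used directly.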

  • Funny: three upvotes for an answer based on the "identity" $$E\left(\frac1Z\right)=\frac1{E(Z)},$$ used *twice*. An invitation to modesty, I would say. (2015-06-16)
  • @Did, could you answer the question I linked at http://math.stackexchange.com/questions/1327752/maximum-a-posteriori-map-estimator-of-exponential-random-variable-with-uniform ? I will mark your answer as correct. Thank you. Or you can edit my answer. (2015-06-17)
  • Now the question has been silently cleansed of the outrageous $E(1/Z)=1/E(Z)$ claims. Problem: the resulting post answers nothing. (2015-06-21)