
I've got the density function of the Pareto distribution from Wiki: https://en.wikipedia.org/wiki/Pareto_distribution#Parameter_estimation $$ p(x)=\begin{cases}0,&{\mbox{if }}x<x_{\min},\\ \dfrac{\alpha x_{\min}^{\alpha}}{x^{\alpha+1}},&{\mbox{if }}x\ge x_{\min}.\end{cases} $$ Let me do some simplification, and we get a new density function with two parameters $\alpha$ and $\beta$: $$ f(x)=\begin{cases}0,&{\mbox{if }}x<x_{\min},\\ \dfrac{\beta}{x^{\alpha}},&{\mbox{if }}x\ge x_{\min}.\end{cases} $$ According to maximum likelihood estimation theory, the likelihood function is $$ {\mathcal {L}}(\alpha,\beta \,;\,x_{1},\ldots ,x_{n})=f(x_{1},x_{2},\ldots ,x_{n}\mid \alpha,\beta )=\prod _{i=1}^{n}{f(x_{i}\mid \alpha,\beta )}=\frac{\beta^n}{\left(\prod _{i=1}^{n}{x_i}\right)^{\alpha}}, $$ and we set the derivatives with respect to $\alpha$ and $\beta$ equal to $0$.


For $x>x_{\min}$ we get $$ \frac {\partial \mathcal {L}}{\partial \alpha }=\frac{-\beta^n\sum_{i=1}^{n}{\ln x_i} }{\left(\prod _{i=1}^{n}{x_i}\right)^{\alpha}}=0, $$ $$ \frac {\partial \mathcal {L}}{\partial \beta }=\frac{n\beta^{n-1}}{\left(\prod _{i=1}^{n}{x_i}\right)^{\alpha}}=0. $$ Solving these equations, we get $\beta =0$.

Impossible! What's wrong with it??

Let me change direction and work with the log-likelihood instead: $$ \ln{\mathcal {L}}= \ln\left(\frac{\beta^n}{\left(\prod _{i=1}^{n}{x_i}\right)^{\alpha}} \right)=n\ln\beta+ \sum_{i=1}^{n}{\ln\frac{1}{x_i^{\alpha}}} =n\ln\beta -\alpha \sum_{i=1}^{n}{\ln x_i}, $$ and setting the derivatives to $0$ gives $$ \frac {\partial \ln\mathcal {L}}{\partial \alpha }=-\sum_{i=1}^{n}{\ln x_i}=0, $$ $$ \frac {\partial \ln\mathcal {L}}{\partial \beta }=\frac{n}{\beta}=0. $$ Solving these, we get $\beta =\infty$? Also impossible!

What's wrong with my computation? Please help me, thank you very much!
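(One observation, sketched under the simplified density $f(x)=\beta/x^{\alpha}$ above: $\alpha$ and $\beta$ cannot be varied independently, because $f$ must integrate to $1$. Requiring normalization ties $\beta$ to $\alpha$ and $x_{\min}$:)

$$
\int_{x_{\min}}^{\infty} \frac{\beta}{x^{\alpha}}\,dx
  = \frac{\beta\, x_{\min}^{\,1-\alpha}}{\alpha-1} = 1
  \quad\Longrightarrow\quad
  \beta = (\alpha-1)\,x_{\min}^{\,\alpha-1} \qquad (\alpha>1).
$$

Substituting this constraint into the log-likelihood gives a function of $\alpha$ alone,
$$
\ln\mathcal{L}(\alpha)
  = n\ln(\alpha-1) + n(\alpha-1)\ln x_{\min} - \alpha\sum_{i=1}^{n}\ln x_i,
$$
which does have a genuine stationary point:
$$
\frac{d\ln\mathcal{L}}{d\alpha}
  = \frac{n}{\alpha-1} + n\ln x_{\min} - \sum_{i=1}^{n}\ln x_i = 0
  \quad\Longrightarrow\quad
  \alpha - 1 = \frac{n}{\sum_{i=1}^{n}\left(\ln x_i - \ln x_{\min}\right)}.
$$
Since this $\alpha$ plays the role of $\alpha+1$ in the Wikipedia parametrization, this recovers the usual $\hat\alpha$; and because $\ln\mathcal{L}$ is increasing in $x_{\min}$ subject to $x_{\min}\le\min_i x_i$, the maximum over $x_{\min}$ sits at the boundary $\hat x_{\min}=\min_i x_i$, not at a zero-derivative point.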

  • Hint: This is a case where the maximum point and the zero-derivative point are not the same. Also, the likelihood is slightly incorrect: it should be $\beta^n$ instead of $\beta$. (2017-01-25)
  • Thanks for the hint, I'll check it again. And the $\beta$ problem too; your eyes are sharp! (2017-01-25)
  • Look at Wikipedia on 'Pareto Distribution', where the MLEs $\hat x_{\min}=\min_i X_i$ and $\hat \alpha = n\left[\sum_i(\ln X_i - \ln \hat x_{\min})\right]^{-1}$ are derived (in that order). (2017-01-25)
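The estimators quoted in the last comment can be checked numerically. A minimal sketch (the sample size, seed, and true parameter values are arbitrary choices for illustration; Pareto draws are generated by inverse-CDF sampling):

```python
import math
import random

random.seed(0)
alpha_true, xmin_true, n = 3.0, 2.0, 100_000

# Inverse-CDF sampling: if U ~ Uniform(0,1), then xmin * U**(-1/alpha)
# has the Pareto(alpha, xmin) distribution.
xs = [xmin_true * random.random() ** (-1.0 / alpha_true) for _ in range(n)]

# MLEs from the comment: xmin_hat is the sample minimum (a boundary
# maximum, not a zero of the derivative), and alpha_hat follows from it.
xmin_hat = min(xs)
alpha_hat = n / sum(math.log(x) - math.log(xmin_hat) for x in xs)

print(xmin_hat, alpha_hat)  # both should land close to 2.0 and 3.0
```

With $n$ this large, $\hat x_{\min}$ sits just above the true $x_{\min}$ and $\hat\alpha$ is within a few standard errors ($\approx \alpha/\sqrt{n}$) of the true $\alpha$.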
