
I was going through this article, and it gives the log-likelihood $ LL = \sum_{i=1}^n A_i\log p_i + \sum_{i=1}^n A'_i\log(1-p_i). $

Basically, this is the log-likelihood of a logistic regression, where $p_i$ is the output of the sigmoid function, $A_i$ is the number of entries at $i$ with $y$ value $1$, and $A'_i$ is the number of entries at $i$ with $y$ value $0$.
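For context, I believe this is just the grouped Bernoulli likelihood (my assumption; the article doesn't spell it out): the $A_i + A'_i$ observations at $i$ are independent, each equal to $1$ with probability $p_i$, so

$$L = \prod_{i=1}^n p_i^{A_i}(1-p_i)^{A'_i}, \qquad LL = \log L = \sum_{i=1}^n A_i\log p_i + \sum_{i=1}^n A'_i\log(1-p_i).$$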

Now, the closed-form solution of this is given by

$p_i = \frac{A_i}{A_i+A'_i}$

I don't understand this. Where does the above solution come from?

  • I got the idea from this lecture: http://ocw.mit.edu/courses/mathematics/18-443-statistics-for-applications-fall-2003/lecture-notes/lec4.pdf (2012-06-25)

1 Answer


The derivative of the log-likelihood with respect to $p_i$ is given by $ \frac{A_i}{p_i}-\frac{A_i'}{1-p_i}. $ Setting this equal to zero and solving for $p_i$ yields $ p_i=\frac{A_i}{A_i+A_i'}. $ Of course, you should verify that this is in fact a maximum and not a minimum, which is easily done by looking at the second derivative with respect to $p_i$; the algebra is spelled out below.
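To spell out the algebra: setting the derivative to zero gives

$$\frac{A_i}{p_i}=\frac{A_i'}{1-p_i} \;\Longrightarrow\; A_i(1-p_i)=A_i'p_i \;\Longrightarrow\; A_i=(A_i+A_i')\,p_i \;\Longrightarrow\; p_i=\frac{A_i}{A_i+A_i'}.$$

For the second-order check,

$$\frac{\partial^2 LL}{\partial p_i^2} = -\frac{A_i}{p_i^2}-\frac{A_i'}{(1-p_i)^2},$$

which is strictly negative whenever $A_i+A_i'>0$, so the critical point is indeed a maximum.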

  • And don't forget to check that the function is differentiable on its domain, and that we don't need to check for other (non-critical) global maxima, e.g., at the boundary of the domain. (2012-06-25)
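  • Along those lines, a quick sketch (assuming $A_i>0$ and $A_i'>0$, which the question doesn't state): $LL$ is differentiable on $(0,1)$, and since $A_i\log p_i\to-\infty$ as $p_i\to 0^+$ and $A_i'\log(1-p_i)\to-\infty$ as $p_i\to 1^-$, we have $LL\to-\infty$ at both endpoints, so the interior critical point is the global maximum.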