
Every site I find states the closed-form solution for the parameters $\mu$ and $\Sigma$ directly, for example this one: http://www.notenoughthoughts.net/posts/normal-log-likelihood-gradient.html

The first thing I would do is this:

$$ \max_{\theta} \prod_{i=1}^n p(x_i; \theta) \iff \min_{\theta} -\sum_{i=1}^n \log p(x_i; \theta)$$

Inserting the Gaussian density gives

$$ \min_{\theta} ML(\theta) = \sum_{i=1}^n \left[ -\log\left(\frac{1}{((2\pi)^m |\Sigma|)^{1/2}}\right) - \log\left( \exp\left(-\frac{1}{2} (x_i - \mu)^T \Sigma^{-1} (x_i - \mu)\right)\right)\right] $$

From here on I am stuck. How do I get rid of the logarithm?

Edit:

$$ \min_{\theta} ML(\theta) = \sum_{i=1}^n \left[ \frac{1}{2}\log\left((2\pi)^m |\Sigma|\right) + \frac{1}{2} (x_i - \mu)^T \Sigma^{-1} (x_i - \mu) \right] $$
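A quick numeric sanity check (my own illustration, not part of the original post) that the simplification $\log(\exp(z)) = z$ was applied correctly: in the univariate case $m = 1$, where $\Sigma$ reduces to a scalar variance $\sigma^2$, the unsimplified and simplified negative log-likelihoods should agree. The data and parameter values below are arbitrary examples.

```python
import math

# Hypothetical univariate example (m = 1): mu and sigma2 are arbitrary.
mu, sigma2 = 0.5, 2.0
xs = [0.1, 1.3, -0.7, 2.2]

def nll_direct(xs, mu, sigma2):
    # Unsimplified form: -log(normalizer) - log(exp(quadratic)), term by term.
    total = 0.0
    for x in xs:
        norm = 1.0 / math.sqrt(2 * math.pi * sigma2)
        expo = math.exp(-0.5 * (x - mu) ** 2 / sigma2)
        total += -math.log(norm) - math.log(expo)
    return total

def nll_simplified(xs, mu, sigma2):
    # After log(exp(z)) = z: (1/2) log(2*pi*sigma2) + quadratic term.
    total = 0.0
    for x in xs:
        total += 0.5 * math.log(2 * math.pi * sigma2) \
                 + 0.5 * (x - mu) ** 2 / sigma2
    return total

print(abs(nll_direct(xs, mu, sigma2) - nll_simplified(xs, mu, sigma2)) < 1e-10)
# prints True
```

The two functions differ only in whether the exp/log round trip is carried out explicitly, so their agreement (up to floating-point rounding) confirms the algebra in the edited equation.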

  • What is $\log(\exp(z))$? (2017-02-20)
  • The solution would be $z$. (2017-02-20)

0 Answers