
I am working with the log-likelihood function $ \log \prod\limits_{i=1}^{n} x_i^{y_i} + \log \prod\limits_{i=1}^{n}\left(1-x_i\right)^{n_i-y_i}. $
I want to take the derivative with respect to $x_i$.

How should I do it? Thanks in advance.

  • 0
    If you want to find the fairness/bias/posteriors or parameter MLEs for each of $n$ coins from some data on their sums, then you need the binomial terms ${n_i\choose y_i}$, and I'm guessing you're going to have a harder time than just factoring out the marginals and working out the univariate estimates. Is that in fact your problem? Without those terms, each marginal distribution needs a normalization constant of $\frac{(1-x_i)^{n_i+1}-x_i^{n_i+1}}{1-2x_i}$ and each $Y_i$ is no longer Binomial$(x_i,n_i)$ but rather a truncated Geometric distribution. So may I ask what the original problem is? 2012-05-02

2 Answers

2

$\log \prod\limits_{i=1}^{n} x_i^{y_i} + \log \prod\limits_{i=1}^{n}\left(1-{{x}_{i}}\right)^{n_i-y_i} =\sum_{i=1}^{n} {y_i}\log x_i + \sum\limits_{i=1}^{n} ({n_i-y_i})\log \left(1-{{x}_{i}}\right)$

so the derivative with respect to $x_i$ is $\dfrac{y_i}{x_i} - \dfrac{n_i-y_i}{1-x_i}$. This is zero when $x_i=\frac{y_i}{n_i}$, which has a natural interpretation: the MLE is the observed proportion of successes.
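As a quick sanity check (a minimal sketch with made-up values for $y_i$ and $n_i$), one can confirm numerically that $x_i = y_i/n_i$ maximizes each term of the log-likelihood:

```python
import numpy as np

# Hypothetical data: y_i successes out of n_i trials for each i
y = np.array([3, 7, 5])
n = np.array([10, 12, 8])

def log_lik(x, y, n):
    """Per-term log-likelihood: y_i*log(x_i) + (n_i - y_i)*log(1 - x_i)."""
    return y * np.log(x) + (n - y) * np.log(1 - x)

x_hat = y / n  # candidate maximizer from setting the derivative to zero

# Nudging any x_i away from y_i/n_i should only lower the log-likelihood,
# since each term is strictly concave in x_i on (0, 1)
for eps in (-1e-3, 1e-3):
    assert np.all(log_lik(x_hat + eps, y, n) < log_lik(x_hat, y, n))

print(x_hat)  # the sample proportions y_i / n_i
```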

  • 0
    @J.M.: indeed, thank you. 2012-05-02
0

Does this look right?
$\log \prod\limits_{i=1}^{m}{\binom{n_i}{y_i}\theta_i^{y_i}(1-\theta_i)^{n_i-y_i}}$
$=\log \prod\limits_{i=1}^{m}\theta_i^{y_i}+\log \prod\limits_{i=1}^{m}(1-\theta_i)^{n_i-y_i}$ $\text{Note: I ignored the term }\binom{n_i}{y_i}$ $=\sum\limits_{i=1}^{m}y_i\log \theta_i+\sum\limits_{i=1}^{m}(n_i-y_i)\log (1-\theta_i)$

Calculating first derivative:
$\frac{\partial }{\partial \theta }=\frac{\sum y_i}{\theta_i}-\frac{\sum (n_i-y_i)}{1-\theta_i}=0$
$\frac{\sum y_i}{\theta_i}=\frac{\sum (n_i-y_i)}{1-\theta_i}$
$\sum y_i-\theta_i\sum y_i=\theta_i\sum (n_i-y_i)$
$\sum y_i=\theta_i\left( \sum (n_i-y_i)+\sum y_i \right)$
$\sum y_i=\theta_i\sum n_i$
$\hat{\theta}_i=\frac{\sum y_i}{\sum n_i}$

  • 0
    This is wrong! We are taking the derivative w.r.t. each $\theta_i$. 2012-05-02
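    Concretely, differentiating with respect to a single $\theta_j$ (holding the others fixed) kills every term of the sum except the $j$-th, so the sums over $i$ should not appear:
    $$\frac{\partial \ell}{\partial \theta_j}=\frac{y_j}{\theta_j}-\frac{n_j-y_j}{1-\theta_j}=0 \quad\Longrightarrow\quad \hat{\theta}_j=\frac{y_j}{n_j},$$
    which matches the result in the other answer. The pooled estimate $\frac{\sum y_i}{\sum n_i}$ would only be correct if all the $\theta_i$ were constrained to be equal.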