0

I am working with the log-likelihood function $$ \log \prod\limits_{i=1}^{n} x_i^{y_i} + \log \prod\limits_{i=1}^{n}\left(1-x_i\right)^{n_i-y_i}. $$
I want to take the derivative with respect to $x_i$.

How should I do it? Thanks in advance.

  • 0
You'll need the chain rule and a memory of logarithm identities, e.g. $\log(a^b c^d)=b\log a+d\log c$... (2012-05-02)
  • 0
How do I take the derivative of a product? (2012-05-02)
  • 0
What you actually have is the logarithm of a product, so you can use that identity I mentioned earlier before differentiating... (2012-05-02)
  • 0
Are you saying: $\log \prod\limits_{i=1}^{n}y_i\prod\limits_{i=1}^{n}x_i+\log \prod\limits_{i=1}^{n}(n_i-y_i)\prod\limits_{i=1}^{n}(1-x_i)$ (2012-05-02)
  • 0
$=\log \prod\limits_{i=1}^{n}y_i+\prod\limits_{i=1}^{n}x_i+\log \prod\limits_{i=1}^{n}(n_i-y_i)+\prod\limits_{i=1}^{n}(1-x_i)$ @J.M. (2012-05-02)
  • 0
There is no "product rule" here: if $x_i$ were multiplied by itself, you would use the power rule, and if you had a genuine product, all the $x_j$ with $j \neq i$ would be constants you could ignore when differentiating. But J.M. is saying you don't have a product: after taking the log, you have a sum. (2012-05-02)
  • 0
@Hank Each $x_i$ is associated with a particular $y_i$; can I actually take them apart? (2012-05-02)
  • 0
$\log \prod_{i=1}^{n}x_i^{y_i}=\sum_{i=1}^{n}y_i\log x_i$. If you're not sure what to do after that, you may need to review partial differentiation. (2012-05-02)
  • 0
Thanks. That was actually quite simple; I don't know why I didn't think of it. Must've been under stress. (2012-05-02)
  • 0
If you want to find the fairness/bias/posteriors or parameter MLEs for each of $n$ coins from some data on their sums, then you need the binomial terms $\binom{n_i}{y_i}$, and I'm guessing you're going to have a harder time than just factoring out the marginals and working out the univariate estimates. Is that in fact your problem? Without those terms, each marginal distribution needs a normalization constant of $$\frac{(1-x_i)^{n_i+1}-x_i^{n_i+1}}{1-2x_i}$$ and each $Y_i$ is no longer Binomial$(x_i,n_i)$ but rather a truncated Geometric distribution. So may I ask what the original problem is? (2012-05-02)

2 Answers

2

$$\log \prod\limits_{i=1}^{n} x_i^{y_i} + \log \prod\limits_{i=1}^{n}\left(1-{{x}_{i}}\right)^{n_i-y_i} =\sum_{i=1}^{n} {y_i}\log x_i + \sum\limits_{i=1}^{n} ({n_i-y_i})\log \left(1-{{x}_{i}}\right)$$

so the derivative you are looking for is $\dfrac{y_i}{x_i} - \dfrac{n_i-y_i}{1-x_i}$. This will be zero when $x_i=\frac{y_i}{n_i}$, which has an intuitive interpretation: it is the observed proportion of successes in the $i$-th group.
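A quick numerical sanity check of this stationary point (a minimal Python sketch; the counts below are made-up illustration data, not from the question):

```python
import numpy as np

# Made-up illustration data: y_i successes out of n_i trials per group
n = np.array([10, 20, 15])
y = np.array([4, 11, 6])

def log_likelihood(x):
    # sum_i [ y_i*log(x_i) + (n_i - y_i)*log(1 - x_i) ]
    return np.sum(y * np.log(x) + (n - y) * np.log(1 - x))

x_hat = y / n  # candidate maximizer from setting each partial derivative to zero

# Nudging any single coordinate away from y_i/n_i should lower the value,
# since the log-likelihood is strictly concave in each x_i
for i in range(len(n)):
    for delta in (-1e-3, 1e-3):
        x = x_hat.copy()
        x[i] += delta
        assert log_likelihood(x) < log_likelihood(x_hat)

print(x_hat)  # [0.4, 0.55, 0.4] -- the per-group success proportions
```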

  • 0
There seems to be a stray product on the right-hand side... (2012-05-02)
  • 0
@J.M.: indeed, thank you. (2012-05-02)
0

Does this look right?
$$\log \prod\limits_{i=1}^{m}\binom{n_i}{y_i}\theta_i^{y_i}(1-\theta_i)^{n_i-y_i}$$
$$=\log \prod\limits_{i=1}^{m}\theta_i^{y_i}+\log \prod\limits_{i=1}^{m}(1-\theta_i)^{n_i-y_i} \qquad \text{Note: I ignored the term } \binom{n_i}{y_i}$$
$$=\sum\limits_{i=1}^{m}y_i\log \theta_i+\sum\limits_{i=1}^{m}(n_i-y_i)\log(1-\theta_i)$$

Calculating the first derivative:
$$\frac{\partial }{\partial \theta}=\frac{\sum y_i}{\theta_i}-\frac{\sum (n_i-y_i)}{1-\theta_i}=0$$ $$\frac{\sum y_i}{\theta_i}=\frac{\sum (n_i-y_i)}{1-\theta_i}$$ $$\sum y_i-\theta_i\sum y_i=\theta_i\sum (n_i-y_i)$$ $$\sum y_i=\theta_i\left( \sum (n_i-y_i)+\sum y_i \right)$$ $$\sum y_i=\theta_i\sum n_i$$ $$\hat{\theta}_i=\frac{\sum y_i}{\sum n_i}$$
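To make the distinction concrete (and anticipating the comment below), here is a minimal Python sketch with made-up data: differentiating with respect to each $\theta_i$ separately gives $\hat{\theta}_i = y_i/n_i$, whereas $\sum y_i / \sum n_i$ is the MLE only under the extra assumption that all groups share one common $\theta$.

```python
import numpy as np

# Made-up illustration data: y_i successes out of n_i trials per group
n = np.array([10, 20, 15])
y = np.array([4, 11, 6])

# Setting each partial derivative d/d(theta_i) to zero gives one MLE per group:
theta_per_group = y / n           # array([0.4 , 0.55, 0.4 ])

# The pooled ratio is the MLE only if every group shares a single common theta:
theta_pooled = y.sum() / n.sum()  # 21/45 ~ 0.4667

print(theta_per_group, theta_pooled)
```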

  • 0
This is wrong! We are taking the derivative w.r.t. each $\theta_i$. (2012-05-02)