How to differentiate a part of the normal likelihood function

How do I differentiate $e^{-\frac{1}{2} \sum_{i=1}^n (x_i - \theta)^2}$ with respect to $\theta$? (Without taking the log.)
-
First convince yourself (e.g. with a few examples) that the exponent is a quadratic function of $\theta$, of the form $a \theta^2 + b\theta + c$. Therefore the function you want to differentiate is of the form $e^{a \theta^2 + b\theta + c}$. Then (i) recall from Calculus how to differentiate such a function, (ii) determine $a, \, b, \, c$ from the other information given to you, that is, in terms of the $x_i$. – 2012-10-30
-
The problem actually comes from the chain rule. After differentiating, it will become $\sum_j \left(e^{x_j - \theta} e^{-1/2 \sum_{i \neq j} (x_i - \theta)^2}\right)$, right? How do I simplify it further? – 2012-10-30
-
What I want is to reduce the above derivative to $\sum_{i=1}^n (x_i - \theta) = 0$. With $\ln$ this is obvious: $\frac{d}{d\theta} \ln e^{-\frac{1}{2} \sum (x_i - \theta)^2} = \sum (x_i - \theta)$, which we set to $0$. But how do I find it without $\ln$? – 2012-10-30
-
Oh, I meant the product rule above. – 2012-10-30
1 Answer
Well, if that helps you, using the chain rule:
$$\left(e^{-\frac{1}{2}\sum^{n}_{i=1}(x_{i}-\theta)^2}\right)^{\prime}=e^{-\frac{1}{2}\sum^{n}_{i=1}(x_{i}-\theta)^2}\cdot\left(-\frac{1}{2}\right)\sum^{n}_{i=1}(2\theta-2x_{i})=-e^{-\frac{1}{2}\sum^{n}_{i=1}(x_{i}-\theta)^2}\sum^{n}_{i=1}(\theta-x_{i})$$
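A quick way to sanity-check this formula is to compare it against a central finite difference at an arbitrary point. This is just an illustrative sketch; the data values and the evaluation point below are made up.

```python
import math

def L(theta, xs):
    # L(theta) = exp(-1/2 * sum_i (x_i - theta)^2)
    return math.exp(-0.5 * sum((x - theta) ** 2 for x in xs))

def dL(theta, xs):
    # Chain-rule derivative: -L(theta) * sum_i (theta - x_i)
    return -L(theta, xs) * sum(theta - x for x in xs)

xs = [1.0, 2.0, 4.0]   # made-up sample
theta = 0.7            # arbitrary evaluation point
h = 1e-6
numeric = (L(theta + h, xs) - L(theta - h, xs)) / (2 * h)
print(abs(numeric - dL(theta, xs)) < 1e-8)  # True
```

The analytic derivative and the finite-difference estimate agree to well within the tolerance, as they should.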
EDIT
To respond to your comment - if you want to equate this expression to $0$, then necessarily $\sum^n_{i=1}(\theta-x_{i})=0$, because the exponential factor is never zero for any argument.
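Concretely, $\sum^n_{i=1}(\theta-x_{i})=0$ solves to $\theta=\frac{1}{n}\sum_i x_i$, the sample mean, so the derivative vanishes exactly there. A small numerical check (the sample below is made up):

```python
import math

def dL(theta, xs):
    # Derivative from the answer: -exp(-1/2 * sum (x_i - theta)^2) * sum (theta - x_i)
    expo = math.exp(-0.5 * sum((x - theta) ** 2 for x in xs))
    return -expo * sum(theta - x for x in xs)

xs = [1.0, 2.0, 4.0]        # made-up sample
mean = sum(xs) / len(xs)    # the candidate critical point
print(abs(dL(mean, xs)) < 1e-12)  # True
```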
EDIT 2
To illustrate how the chain rule works, let $f(\theta)=-\frac{1}{2}(x_{i}-\theta)^2$ and $h(\theta)=e^{f(\theta)}$. The derivative of $h$ is thus equal to:
$$h^{\prime}(\theta)=h(\theta)\cdot f^{\prime}(\theta)$$
We know that $f(\theta)=-\frac{1}{2}(x_{i}^{2}-2x_{i}\theta+\theta^{2})=-\frac{x_{i}^{2}}{2}+x_{i}\theta-\frac{\theta^{2}}{2}$, hence $f^{\prime}(\theta)=x_{i}-\theta=-\frac{1}{2}(2\theta-2x_{i})$.
Thus finally:
$$h^{\prime}(\theta)=h(\theta)\cdot f^{\prime}(\theta)=e^{-\frac{1}{2}(x_{i}-\theta)^2}\cdot -\frac{1}{2}(2\theta-2x_{i})=-e^{-\frac{1}{2}(x_{i}-\theta)^2}\cdot(\theta-x_{i})$$
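This single-factor derivative can be checked numerically in the same way (the values of $x_i$ and $\theta$ here are arbitrary):

```python
import math

def h_fun(theta, x):
    # h(theta) = exp(-1/2 * (x - theta)^2)
    return math.exp(-0.5 * (x - theta) ** 2)

def dh(theta, x):
    # h'(theta) = -h(theta) * (theta - x), as derived above
    return -h_fun(theta, x) * (theta - x)

x, theta, eps = 2.0, 0.5, 1e-6
numeric = (h_fun(theta + eps, x) - h_fun(theta - eps, x)) / (2 * eps)
print(abs(numeric - dh(theta, x)) < 1e-8)  # True
```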
-
Why is the middle part of the differentiation true? – 2012-10-30
-
It's called the chain rule: $(f(g(x)))^{\prime}=f^{\prime}(g(x))g^{\prime}(x)$ http://en.wikipedia.org/wiki/Chain_rule Do you see it now? – 2012-10-30
-
And obviously, we have $(x_{i}^{2}-2\theta x_{i}+\theta^{2})^{\prime}=2\theta-2x_{i}$. – 2012-10-30
-
When I compute $(d/d\theta)\, e^{-1/2(x_1 - \theta)^2} \cdot \cdots \cdot e^{-1/2 (x_n - \theta)^2}$, I get $\sum_j e^{x_j - \theta} e^{-1/2 \sum_{i \neq j} (x_i - \theta)^2}$, which is a completely different form, right? Why? – 2012-10-30
-
For every $i$, $(d/d\theta)e^{-1/2(x_{i}-\theta)^2}=e^{-1/2(x_{i}-\theta)^2}\cdot(-1/2)(2\theta-2x_{i})$ for the reason I previously mentioned. However, I see absolutely no point in splitting it into a product of exponential functions - I cannot see how you would find a derivative of that, and for now I have no clue how you obtained your result. If you could please include your intermediate steps, that would be great. – 2012-10-30
-
I just differentiated the first term $e^{-1/2(x_1 - \theta)^2}$, which I thought equals $e^{x_1 - \theta}$, and then multiplied by the other $(n - 1)$ terms $e^{-1/2(x_2 - \theta)^2}$, etc. So the first term of the derivative is $e^{x_1 - \theta} e^{-1/2 \sum_{i=2}^n (x_i - \theta)^2}$, and the same for the other terms. So I get $\sum_j e^{x_j - \theta} e^{-1/2 \sum_{i \neq j} (x_i - \theta)^2}$, using the generalised product rule. – 2012-10-30
-
No, you are not differentiating this correctly. First off - the chain rule, which I mentioned, applies. That is why your $(e^{-1/2(x_{i}-\theta)^2})^\prime$ is incorrect. – 2012-10-30
-
Please see the second edit in my answer. I hope this clarifies it. – 2012-10-30
-
Right, I did differentiate the term $e^{-1/2(x_1 - \theta)^2}$ incorrectly. It should be $e^{-1/2(x_1 - \theta)^2}(x_1 - \theta)$, and now I get the whole derivative: $(x_1 - \theta)e^{-1/2\sum(x_i - \theta)^2} + (x_2 - \theta)e^{-1/2\sum(x_i - \theta)^2} + \cdots + (x_n - \theta)e^{-1/2\sum(x_i - \theta)^2} = e^{-1/2\sum(x_i - \theta)^2}\sum (x_i - \theta)$, which is the same as your answer. Thanks. – 2012-10-30
-
If you don't have any further questions, would you please consider accepting my answer? – 2012-10-30