
Say we have the following regression model: $Y_i = \alpha + \beta(x_i - \mathrm{mean}(x)) + R_i$

where $R_1,\ldots,R_{20} \sim G(0, \sigma)$

If we have $\mu(x) = \alpha + \beta(x - \mathrm{mean}(x))$, how do I go about finding the MLE of $\mu(5)$?

I have a data set with some calculations already done for me, but I'm not sure how to approach this.

  • Usually $N$ or $\mathcal{N}$ is used for the normal or "Gaussian" distribution. (2012-06-28)

2 Answers


https://instruct1.cit.cornell.edu/courses/econ620/reviewm5.pdf

Look at the document above and search for "functional invariance". If the MLE for $\alpha$ is $\hat\alpha$ then the MLE for $\cos\alpha$ is $\cos\hat\alpha$, and so on. So if $\hat\alpha$ and $\hat\beta$ are the respective MLEs of $\alpha$ and $\beta$, then $8\hat\alpha+6\hat\beta$ is the MLE for $8\alpha+6\beta$, etc. That's the sort of function you have here.

Here's another source: http://books.google.com/books?id=5OLlwXg6r9kC&pg=PA487&dq=functional+invariance+of+mle&hl=en&sa=X&ei=Zt3sT5ryFITiqgGQ_KGeAg&ved=0CFsQ6AEwBw#v=onepage&q=functional%20invariance%20of%20mle&f=false

This property of MLEs is quite easy to prove. You don't need calculus; you just need to know definitions of things like "increasing function" and "maximum".
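As a concrete sketch (with made-up data, since the question's data set isn't given): under the Gaussian error model, the MLEs $\hat\alpha$ and $\hat\beta$ coincide with the least-squares estimates, and with the centered predictor $\hat\alpha$ is just the sample mean of $y$. Functional invariance then gives the MLE of $\mu(5)$ by plugging in:

```python
import numpy as np

# Hypothetical data -- the original data set is not provided in the question.
rng = np.random.default_rng(0)
x = np.linspace(1, 10, 20)
y = 2.0 + 0.5 * (x - x.mean()) + rng.normal(0, 1, size=20)

# With the centered predictor, the MLE (= least-squares) estimates are:
#   alpha_hat = mean(y),  beta_hat = sum(xc * y) / sum(xc^2)
xc = x - x.mean()
alpha_hat = y.mean()
beta_hat = (xc * y).sum() / (xc ** 2).sum()

# Functional invariance: the MLE of mu(5) = alpha + beta*(5 - mean(x))
# is the same function evaluated at the MLEs of alpha and beta.
mu5_hat = alpha_hat + beta_hat * (5 - x.mean())
print(alpha_hat, beta_hat, mu5_hat)
```

The key point is that no new optimization is needed for $\mu(5)$: it is a deterministic function of $(\alpha, \beta)$, so its MLE is that function of $(\hat\alpha, \hat\beta)$.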


Michael Hardy is correct. Let $\ell(\theta)$ be the log-likelihood given the observed data. The MLE $\hat\theta$ satisfies $\frac{\partial}{\partial\theta}\,\ell(\theta) = 0$. Now consider a function $g(\theta)$; in your case $\theta = (\alpha, \beta)$ and $g(\alpha, \beta) = \alpha + \beta(x - \mathrm{mean}(x))$. For simplicity, take the one-parameter case with $g$ differentiable and invertible, and write the likelihood in terms of the new parameter $\eta = g(\theta)$. By the chain rule,
$$\frac{d}{d\theta}\,\ell(g(\theta)) = \ell'(g(\theta))\,g'(\theta),$$
so the derivative of the log-likelihood with respect to $\eta = g(\theta)$ vanishes at the same $\theta$ at which the derivative with respect to $\theta$ vanishes. Hence the likelihood in the new parametrization is maximized at $g(\hat\theta)$; that is, the MLE of a differentiable function of $\theta$ is that function evaluated at the MLE of $\theta$.

  • I was just trying to give a simple proof to convey the idea; I did not intend to claim it holds in full generality. (2012-06-28)