
Consider a sample $X_1, \ldots, X_n$ with density function: $$ f(x;\theta) = (\theta+1)x^\theta; \ 0 \leq x \leq 1 $$

I want to find the asymptotic distribution of the method of moments estimator $\hat{\theta}_1$ for $\theta$. I have already calculated: $$ E[X] = \frac{\theta+1}{\theta+2} $$

And got $\hat{\theta}_1$ by solving for $\theta$: $$ \frac{\theta+1}{\theta+2} = \frac{1}{n}\sum_{i=1}^n X_i\\ \hat{\theta}_1 = \frac{2\bar{X} -1}{1-\bar{X}} $$
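As a quick illustration (a sketch, not from the original post; the chosen $\theta$ and sample size are arbitrary), the estimator can be tried on simulated data. Since the CDF is $F(x) = x^{\theta+1}$ on $[0,1]$, inverse-CDF sampling gives $X = U^{1/(\theta+1)}$ for uniform $U$:

```python
import random

# Sketch: apply the method-of-moments estimator to simulated data.
# theta_true and n are arbitrary choices for this demo.
random.seed(1)
theta_true = 2.0
n = 100000

# Inverse-CDF sampling: F(x) = x^(theta+1), so X = U**(1/(theta+1))
xbar = sum(random.random() ** (1 / (theta_true + 1)) for _ in range(n)) / n

theta_hat = (2 * xbar - 1) / (1 - xbar)  # the MoM estimator derived above
print(theta_hat)  # close to theta_true for large n
```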

However, I am not sure how to apply the delta method to find its asymptotic distribution. In particular, I am not seeing clearly what $g$ would be in: $$ \sqrt{n}(g(\hat{\theta}_1) - g(\theta)) \rightarrow_D N(0, (g'(\theta))^2\sigma_\theta^2) $$

Intuitively, I would think this means that the difference between the estimator and the true value behaves like a normal random variable, concentrating around a mean of 0 as the sample size grows, so I don't see why I need a function $g$. Can I let $g(x) = x$ and just use the actual values of $\theta$ and $\hat{\theta}_1$? In general, how should I choose this function $g$? Also, how is $\sigma_\theta^2$ supposed to be calculated? My guess is that it should come from: $$ \begin{align} Var(X)&=\int_{0}^1x^2(\theta+1)x^\theta\,dx - \left(\frac{\theta+1}{\theta+2}\right)^2 = \frac{\theta+1}{\theta+3} - \left(\frac{\theta+1}{\theta+2}\right)^2\\ &= \frac{\theta+1}{(\theta+2)^2(\theta+3)} \end{align} $$ Any guidance is welcome!
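As a sanity check on that variance computation (a sketch; the value $\theta = 1.7$ is an arbitrary test choice, any $\theta > -1$ works), the moments can be approximated by simple midpoint-rule quadrature and compared against the closed form:

```python
# Numeric check that Var(X) = (theta+1) / ((theta+2)^2 (theta+3)).
theta = 1.7  # arbitrary test value (assumption for this demo)

def f(x):
    # the density (theta+1) x^theta on [0, 1]
    return (theta + 1) * x ** theta

# Midpoint-rule quadrature for E[X] and E[X^2]
N = 100000
m1 = m2 = 0.0
for i in range(N):
    x = (i + 0.5) / N
    w = f(x) / N
    m1 += x * w
    m2 += x * x * w

var_numeric = m2 - m1 ** 2
var_formula = (theta + 1) / ((theta + 2) ** 2 * (theta + 3))
print(var_numeric, var_formula)  # the two should agree closely
```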

  • Well, you know the asymptotic distribution of $\bar{X}$, and you want the asymptotic distribution of $g\left(\bar{X}\right)$, where $g(x)=\frac{2x-1}{1-x}$, right? (2017-01-06)
  • Would this mean then that $\hat{\theta}_1 = g(\bar{X}) = \frac{2\bar{X} -1}{1-\bar{X}}$, and therefore $\sqrt{n}\left(\frac{2\bar{X} -1}{1-\bar{X}} - \frac{2\theta - 1}{1-\theta}\right) \rightarrow_D N(0,(g'(\theta))^2\sigma_{\theta}^2)$, with $g'(x) = 1/(1-x)^2$? I still don't see how to calculate the variance... does it refer to $\sigma^2/n$ from the distribution of $\bar{X}$? (2017-01-06)
  • The $\sigma^2_\theta$ refers to the limiting variance of the asymptotically normal variable $\bar X$ that you're taking $g$ of (there's no factor of $n$ because that's taken into account by the $\sqrt{n}$ on the LHS). The Wikipedia article https://en.wikipedia.org/wiki/Delta_method#Univariate_delta_method is reasonably concise and clear, but it unfortunately uses $\theta$ to refer to the mean of the distribution rather than to an estimator. (2017-01-06)

1 Answer


I think the notation's a bit confusing.

By the central limit theorem, $\sqrt{n}(\bar X-E(X))\rightarrow_D N(0,Var(X)).$ The delta method theorem says that then $$\sqrt{n}(g(\bar X)-g(E(X)))\rightarrow_D N\left(0,g'(E(X))^2Var(X)\right).$$

Here, take $$\hat\theta_1 = \frac{2\bar X-1}{1-\bar X} = g(\bar X),$$ so the delta method gives $$ \sqrt{n}(\hat\theta_1-\theta)\rightarrow_D N\left(0,g'(E(X))^2Var(X)\right) $$
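This result can be checked empirically (a Monte Carlo sketch, not part of the original answer; the true value $\theta = 1$ and the sizes are arbitrary choices). Here $g'(E(X)) = (\theta+2)^2$ and $Var(X) = \frac{\theta+1}{(\theta+2)^2(\theta+3)}$, so the limiting variance simplifies to $\frac{(\theta+2)^2(\theta+1)}{\theta+3}$:

```python
import math
import random
import statistics

# Monte Carlo check of the delta-method result.
# theta = 1 is an arbitrary true value chosen for the demo.
theta = 1.0
n, reps = 500, 2000
random.seed(0)

# g'(E[X])^2 * Var(X) simplifies to (theta+2)^2 (theta+1) / (theta+3)
target_var = (theta + 2) ** 2 * (theta + 1) / (theta + 3)

zs = []
for _ in range(reps):
    # inverse-CDF sampling: F(x) = x^(theta+1), so X = U**(1/(theta+1))
    xbar = sum(random.random() ** (1 / (theta + 1)) for _ in range(n)) / n
    theta_hat = (2 * xbar - 1) / (1 - xbar)
    zs.append(math.sqrt(n) * (theta_hat - theta))

print(statistics.mean(zs))      # close to 0
print(statistics.variance(zs))  # close to target_var
```

For $\theta = 1$ the target variance is $\frac{9 \cdot 2}{4} = 4.5$, and the sample variance of the scaled deviations should land near it.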

  • Thanks! I'm still a bit confused, though: as I understand it, I am building an estimator $\hat{\theta}_1$ for $\theta$, not for $E(X)$. The latter is just used to construct $\hat{\theta}_1$ via the method of moments. Given that $g(x) = (2x-1)/(1-x)$, if I evaluate what you propose for the delta method, I'd be computing the difference between my estimator and $g$ applied to the expected value of the RV, which is not an estimator for $\theta$. Am I missing something here? (2017-01-06)
  • Yes, you are building an estimator for $\theta$, but as a function of $\bar X$ (which is an estimator for $E(X)$). Remember, $E(X)$, the expected value of the RV, is a constant: it is just equal to $\frac{\theta +1}{\theta + 2}$. And $g$ applied to $E(X)$ is just $\theta$, so $g(\bar X)-g(E(X)) = \hat\theta_1-\theta$. (2017-01-06)
  • Thanks, I see now. Is it always the case that $g(E(X))$ is equal to the parameter I am trying to estimate, for any $g$? (2017-01-06)
  • @drgxfs To get $g$ you solved $E(X) = \frac{\theta+1}{\theta +2}$ for $\theta$. When you found $\theta = \frac{2E(X) - 1}{1-E(X)}$, you set $g(x) = \frac{2x-1}{1-x}$. Then you applied the method of moments by setting $\hat\theta_1 = g(\bar X)$. (2017-01-06)
  • @drgxfs I'm not sure what you mean by "for any $g$." For this $g$ it's true (because that's how it was constructed by the method of moments). If you're asking about different estimators in general, where $\hat\theta = g(T)$ for some statistic $T$, it won't necessarily be the case that $\theta = g(E(T))$, but in a lot of cases asymptotic normality will still hold, e.g. for maximum likelihood. (2017-01-06)
  • Right. In the end I have arrived at $g'(E(X)) = (\theta+2)^2$, and therefore my final answer is $\sqrt{n}(\hat{\theta}_1 - \theta) \rightarrow_D N\left(0, \frac{(\theta+2)^2(\theta+1)}{\theta+3}\right)$. However, my class notes first state that $\bar{X} \sim N(\mu, \sigma^2/n)$ and then present the solution as $\hat{\theta}_1 = g(\bar{X})\sim N\left(g(\mu), (g'(\mu))^2\frac{\sigma^2}{n}\right) \equiv N\left(\theta, \frac{(\theta+2)^2(\theta+1)}{n(\theta+3)}\right)$. My guess is that these solutions are equivalent: in what I did, I am subtracting the mean on the LHS and therefore the normal distribution has mean zero [...] (2017-01-07)
  • [...] whereas the proposed solution does not normalize for the mean and the variance. Is this correct? If yes, is there any standard convention on how to present the asymptotic distribution of an estimator (i.e. is it usually given with mean 0, etc.)? (2017-01-07)
  • @drgxfs Yes, you're correct. There's no standard way, really. I like the way your class notes did it because it's easier to interpret intuitively, but it's equivalent to the other form, which is mathematically precise given the definition of convergence in distribution. (2017-01-07)