Suppose you have $X_1,\ldots,X_{20}$ i.i.d. with density $f(x|\theta) = 0.1(1+\theta x)$ for $-1 < x < 1$.

Attempt: Isn't the moment estimator of $\theta$ just the first moment, which is the sample mean?
How to find a moment estimator?
- You'll need $0.5$ rather than $0.1$ as your normalizing constant; the integral of the density needs to be $1$. – 2011-11-16
1 Answer
$$ \int_{-1}^1 1+\theta x\;dx = 2, $$ so you need $f(x|\theta) = \frac 12 (1+\theta x)$.
If you mean what I think you mean, I would call it a method-of-moments estimator of $\theta$ rather than a "moment estimator", since the latter term might be mistaken for "estimator of the moment(s)".
The first moment of this distribution is $$ \int_{-1}^1 x f(x \mid \theta) \; dx, $$ which by my reckoning is $\theta/3$. The first moment of the sample is $(X_1+\cdots+X_{20})/20$. You need to equate the first moment of the distribution with the first moment of the sample and then solve for $\theta$.
The method-of-moments estimator of $\theta$ would be equal to the sample mean only if $\theta$ were the population mean.
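The recipe above (equate $\bar X$ to $\theta/3$ and solve, giving $\hat\theta = 3\bar X$) can be sketched in Python; the sample values below are hypothetical, just to illustrate the computation:

```python
def mom_estimate(xs):
    """Method-of-moments estimate: equate the sample mean to E[X] = theta/3,
    so theta_hat = 3 * sample mean."""
    sample_mean = sum(xs) / len(xs)
    return 3 * sample_mean

# hypothetical sample of 20 observations on (-1, 1)
xs = [0.3, -0.7, 0.5, 0.1, -0.2, 0.9, -0.4, 0.6, 0.2, -0.1,
      0.8, -0.3, 0.4, 0.0, 0.7, -0.6, 0.5, 0.1, -0.5, 0.3]
print(mom_estimate(xs))  # 3 times the sample mean
```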
- So $\theta$ would just be $3$ times the sample mean? – 2011-11-17
- In order to show that this estimator is consistent, do you just take the limit as $n$ approaches infinity? – 2011-11-17
- @lord12 It would be $3$ times the sample mean. Consistency does involve a limit as the sample size approaches $\infty$, but just what you take the limit of is something you need to be careful about. You need to show that no matter how small the positive number $\varepsilon$ is, the probability that the estimator differs from $\theta$ by more than $\varepsilon$ approaches $0$ as the sample size approaches $\infty$. – 2011-11-18
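Consistency can also be observed empirically: for a fixed $\theta$, the estimate $3\bar X_n$ should settle near $\theta$ as $n$ grows. A minimal simulation sketch (using rejection sampling to draw from $f(x\mid\theta)=\tfrac12(1+\theta x)$; the seed and the choice $\theta = 0.6$ are arbitrary):

```python
import random

def sample_f(theta, n, rng):
    """Rejection-sample n draws from f(x|theta) = (1 + theta*x)/2 on (-1, 1)."""
    bound = (1 + abs(theta)) / 2  # maximum of the density on (-1, 1)
    xs = []
    while len(xs) < n:
        x = rng.uniform(-1, 1)
        # accept x with probability f(x|theta) / bound
        if rng.uniform(0, bound) <= (1 + theta * x) / 2:
            xs.append(x)
    return xs

rng = random.Random(42)
theta = 0.6
for n in (100, 10_000, 200_000):
    xs = sample_f(theta, n, rng)
    print(n, 3 * sum(xs) / n)  # estimates drift toward theta as n grows
```

This only illustrates convergence in probability; it does not replace the $\varepsilon$ argument described in the comment above.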