I have to find a method-of-moments estimator for $θ$ based on a random sample $X_1,X_2,…,X_n$ from $X ∼ U(0,θ)$ $(θ > 0)$.
My first step was to recognize that $U(0,θ)$ is the continuous uniform distribution on the interval $(0,θ)$, i.e. the special case of $X ∼ U(a,b)$ with $a = 0$ and $b = θ$.
When I have a random sample $X_1,X_2,…,X_n$ from $X$, the method of moments equates the $k$-th population moment with the $k$-th sample moment: $$ E(X^k) = \frac{1}{n}\sum_{i=1}^{n}X_i^k $$ My distribution model has one unknown parameter $θ$, so I set up an equation for the first moment: $$ E(X) = \mu = \frac{1}{n}\sum_{i=1}^{n}X_i = \bar{X_n} $$ The expected value of a uniform distribution $X ∼ U(a,b)$ is $$ E(X) = \frac{a+b}{2} $$ Rearranging the first-moment equation for $U(a,b)$, I want to find $b = θ$ where $a=0$: $$ E(X) = \bar{X_n} = \frac{a+b}{2} \Rightarrow b = θ = 2 \bar{X_n} $$ So the estimator $\hat{θ}$ for $X ∼ U(0,θ)$ should be $2 \bar{X_n}$.
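As a quick sanity check (not part of the assignment), a small simulation can illustrate that $2\bar{X_n}$ recovers $θ$; the parameter value, sample size, and seed below are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0   # true parameter (arbitrary choice)
n = 10_000    # sample size (arbitrary choice)

# Draw a random sample X_1, ..., X_n from U(0, theta)
sample = rng.uniform(0.0, theta, size=n)

# Method-of-moments estimator: theta_hat = 2 * sample mean
theta_hat = 2 * sample.mean()
print(theta_hat)  # should be close to theta = 5.0
```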
Is my solution correct or does anyone have corrections or feedback? Thank you.
Edit:
I also have to answer whether the estimator $2\bar{X_n}$ is (1) unbiased and (2) consistent.
(1) An estimator $\hat{θ_n}$ of a parameter $θ$ based on a random sample of size $n$ is unbiased if $$ E_θ (\hat{θ_n}) = θ $$
Substituting $\hat{θ_n} = 2\bar{X_n}$ into the formula, I get $$ E_θ(2\bar{X_n}) = E\Bigg(2 \cdot \frac{1}{n}\sum_{i=1}^{n}X_i \Bigg) = \frac{2}{n}\sum_{i=1}^{n}E(X_i) = \frac{2}{n} \cdot n \cdot E(X) = 2 \cdot E(X) $$
Rearranging the expected-value formula delivers the same quantity, as we can see: $$ \frac{a+b}{2} = E(X) \Rightarrow b = 2 \cdot E(X) - a \Rightarrow b = 2 \cdot E(X) \quad (\text{since } a = 0) $$
Thus $E_θ (\hat{θ_n}) = θ$ is fulfilled for $\hat{θ_n} = 2\bar{X_n}$, because $θ = b = 2 \cdot E(X)$. Hence $\hat{θ_n}$ is unbiased.
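The unbiasedness claim can also be checked empirically (again just a sanity check, with arbitrary parameter choices): averaging the estimator over many independent samples should approximate $θ$ even at a small sample size.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 50, 20_000  # arbitrary choices

# Compute the estimator 2 * X_bar on many independent samples of size n
samples = rng.uniform(0.0, theta, size=(reps, n))
estimates = 2 * samples.mean(axis=1)

# Unbiasedness: the average of the estimates should approximate theta
print(estimates.mean())  # should be close to theta = 3.0
```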
(2) The estimator $\hat{θ_n}$ is consistent for $θ$ if $$ \hat{θ_n} \xrightarrow{P} θ \quad \text{for} \quad n \rightarrow \infty $$
We can use the weak law of large numbers, which is stated in terms of convergence in probability, defined as follows:
A sequence of random variables ${X_n}$ converges in probability to a random variable $X$ if, for every $\epsilon > 0$, $$ \lim\limits_{n \to \infty}P(|X_n-X| \ge \epsilon) = 0 $$ We set $X_n = \hat{θ_n}$ and $X = θ$, so we want to show that $$ \lim\limits_{n \to \infty}P(|\hat{θ_n}-θ| \ge \epsilon) = 0 $$ By the weak law of large numbers, the sample mean converges in probability to the population mean: $$ \bar{X_n} \xrightarrow{P} E(X) = \frac{θ}{2} $$ that is, for every $\epsilon > 0$, $$ \lim\limits_{n \to \infty}P\big(|\bar{X_n}-E(X)| \ge \epsilon\big) = 0 $$ Multiplying by the constant $2$ preserves convergence in probability, because $$ P(|\hat{θ_n}-θ| \ge \epsilon) = P(|2\bar{X_n}-2 \cdot E(X)| \ge \epsilon) = P\Bigg(|\bar{X_n}-E(X)| \ge \frac{\epsilon}{2}\Bigg) \xrightarrow{n \to \infty} 0 $$ (Note that $\bar{X_n}$ is a random variable, so we cannot simply replace $X_i$ by $E(X_i)$ inside the probability; the law of large numbers is what lets us conclude the limit is $0$.) Therefore $\hat{θ_n} \xrightarrow{P} θ$, and the estimator is consistent.
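Consistency can be illustrated numerically as well (an informal sketch with arbitrary parameter choices, not a proof): as $n$ grows, the estimate $2\bar{X_n}$ should concentrate around $θ$.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 4.0  # arbitrary true parameter

# As n grows, the estimate 2 * X_bar should get closer to theta
for n in (10, 1_000, 100_000):
    sample = rng.uniform(0.0, theta, size=n)
    print(n, 2 * sample.mean())
```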
I am quite unsure if my solution for estimator consistency is right. Can anybody review my approach, please?
