Let $X_1, \dots, X_n$ be i.i.d. random variables with pdf $f(x|\theta) = \theta x^2$, $0 < \theta \leq x < \infty$.
Find the MLE of $\theta$.
Find the MLE of $e^{\theta}$.
Find the method of moments estimator of $\theta$.
Are you sure you have the correct pdf? The integral of the one you wrote diverges to $\infty$, so it cannot be a density. I'm going to assume the pdf is $f(x|\theta) = \theta x^{-2}$ for $x \geq \theta > 0$, which does integrate to $1$.

MLE of $\theta$: the likelihood is $L(\theta) = \theta^{n} \prod_{i=1}^{n} x_{i}^{-2}$, which is valid only when $\theta \leq x_i$ for every $i$, i.e. when $\theta \leq x_{(1)}$. Since $L(\theta)$ is increasing in $\theta$, it is maximised at the largest admissible value, so $\hat{\theta}_{\text{mle}} = X_{(1)}$, the smallest observed $X_i$.

MLE of $e^{\theta}$: by the invariance property of MLEs (applicable here since $\exp(\cdot)$ is a continuous function), the MLE of $e^{\theta}$ is $e^{\hat{\theta}_{\text{mle}}} = e^{X_{(1)}}$.

Method of moments: no method of moments estimator of $\theta$ exists, because $E(X^{k}) = \int_{\theta}^{\infty} \theta x^{k-2}\,dx = \infty$ for all $k \geq 1$, so no population moment is finite to match against a sample moment.
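A quick way to sanity-check the MLE result: the corrected pdf is a Pareto density with shape $1$ and scale $\theta$, whose CDF is $F(x) = 1 - \theta/x$, so inverse-CDF sampling gives $X = \theta/U$ with $U \sim \text{Uniform}(0,1)$. Below is a minimal simulation sketch (the function names `sample_pareto` and `mle` are mine, not from the problem) showing that the sample minimum lands just above the true $\theta$:

```python
import random

# The corrected pdf f(x|theta) = theta * x^(-2), x >= theta, is Pareto
# with shape 1 and scale theta. CDF: F(x) = 1 - theta/x, so the
# inverse-CDF transform is X = theta / U with U ~ Uniform(0, 1).

def sample_pareto(theta, n):
    # 1 - random.random() lies in (0, 1], avoiding division by zero.
    return [theta / (1.0 - random.random()) for _ in range(n)]

def mle(xs):
    # The likelihood theta^n * prod(x_i^-2) is increasing in theta on
    # 0 < theta <= min(x_i), so the MLE is the sample minimum.
    return min(xs)

random.seed(0)
theta = 2.0
xs = sample_pareto(theta, 10_000)
print(mle(xs))  # a value >= 2.0, typically very close to theta
```

Note that $\hat{\theta}_{\text{mle}} = X_{(1)}$ can never undershoot $\theta$: every observation is at least $\theta$, so the estimator is biased upward but converges to $\theta$ as $n$ grows.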