From Wasserman, All of Statistics, Theorem 9.6:
Let $\hat{\theta}_n$ denote the method of moments estimator. Under appropriate conditions on the model, the following statements hold:
- The estimate $\hat{\theta}_n$ exists with probability tending to $1$.
What exactly does this statement mean? Searching has not helped me find a precise definition of "exists with probability tending to $1$".
Additional context: $\hat{\theta}_n$ is the method of moments estimator of $\theta$. For instance, if the distribution has a single parameter $\theta$ and we have an iid sample $X_1, \dots, X_n$ from it, we set $\mathbb{E}[X] = \bar{X}$, where $\bar{X}$ is the arithmetic mean of $X_1, \dots, X_n$. Since $\mathbb{E}[X]$ typically depends on $\theta$, the value of $\theta$ that solves this equation is $\hat{\theta}_n$. With $k$ parameters $\theta_1, \dots, \theta_k$, we instead set $\mathbb{E}[X^j] = \dfrac{1}{n}\sum_{i=1}^{n}X_i^{j}$ for $j = 1, \dots, k$ and solve the resulting system of $k$ equations.
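To make the setup concrete, here is a minimal sketch in Python (my own illustration, not from Wasserman) for an exponential distribution with rate $\lambda$, where $\mathbb{E}[X] = 1/\lambda$, so solving $\mathbb{E}[X] = \bar{X}$ gives $\hat{\lambda}_n = 1/\bar{X}$:

```python
import numpy as np

def mom_exponential_rate(sample):
    """Method of moments estimate of the exponential rate lambda.

    Solving E[X] = 1/lambda = sample mean gives lambda_hat = 1/mean.
    The estimate is undefined when the sample mean is zero, which
    illustrates (in a trivial way) how a moment equation can fail
    to have a solution for a given sample.
    """
    xbar = np.mean(sample)
    if xbar == 0:
        raise ValueError("moment equation has no solution: sample mean is 0")
    return 1.0 / xbar

rng = np.random.default_rng(0)
true_lambda = 2.0
sample = rng.exponential(scale=1.0 / true_lambda, size=100_000)
lam_hat = mom_exponential_rate(sample)
print(lam_hat)  # close to true_lambda for a large sample
```

Here the estimator exists whenever $\bar{X} \neq 0$; in general, though, the system of moment equations need not have a solution for every sample, which is presumably what the theorem's caveat is about.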