
Consider a linear model $ y = Ab+n, $ where $b \in \mathbb{R}^{m}$ is a parameter to be estimated and $n \in \mathbb{R}^{n}$ is a noise vector with mean $\mathbb{E}n = m_{n}$ and covariance matrix $R_{n}$. The Gauss–Markov estimator is an unbiased estimator of the parameter $b$ given by $\beta(y) = B(y-m_{n})$, where $B$ solves the optimisation problem $ \mathbb{E}\|b - B(y-m_{n})\|^2 \to \min\limits_{B}. $ Solving this problem we obtain $ \beta(y) = (A'R_{n}^{-1}A)^{-1}A'R_{n}^{-1}(y-m_{n}). $ I'm looking for properties of this estimator. Does it have the minimum variance in the class of unbiased linear estimators? Is it consistent? And if the answer is negative in the general case, is it perhaps true when $n$ is a Gaussian vector?
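For concreteness, here is a minimal numerical sketch (my own, not part of the question) of computing $\beta(y) = (A'R_{n}^{-1}A)^{-1}A'R_{n}^{-1}(y-m_{n})$ with NumPy; the sizes, the design matrix `A`, the noise mean `m_n`, and the covariance `R_n` are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the question).
k, m = 50, 3                       # number of observations, number of parameters
A = rng.normal(size=(k, m))        # design matrix
b_true = np.array([1.0, -2.0, 0.5])

# Correlated noise with mean m_n and covariance R_n (AR(1)-type covariance here).
m_n = 0.3 * np.ones(k)
R_n = 0.8 ** np.abs(np.subtract.outer(np.arange(k), np.arange(k)))
n = rng.multivariate_normal(m_n, R_n)
y = A @ b_true + n

# Gauss-Markov (generalised least squares) estimator:
# beta(y) = (A' R_n^{-1} A)^{-1} A' R_n^{-1} (y - m_n)
Rinv_A = np.linalg.solve(R_n, A)                       # R_n^{-1} A
beta = np.linalg.solve(A.T @ Rinv_A, Rinv_A.T @ (y - m_n))
print(beta)
```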

  • I think so. And I would think the usual complete sufficient stat argument would do it. (2012-04-24)

1 Answer


Yes. If $\tilde{\beta} = Hy$ is a different unbiased linear estimator of $b$, then the difference $\textrm{COV}(\tilde{\beta}) - \textrm{COV}(\beta)$ is a nonzero positive semidefinite matrix.

Therefore, for every component of $b$ (and, more generally, for every linear combination $c'b$), the variance of an unbiased linear estimate that is not given by $(A'R_{n}^{-1}A)^{-1}A'R_{n}^{-1}(y-m_{n})$ is never lower, and for at least one component it is strictly higher.

The proof is quite simple and follows the Wikipedia entry on the Gauss–Markov theorem:

Write $H = (A' R_n^{-1} A)^{-1} A' R_n^{-1} + D$ for some matrix $D$. Unbiasedness requires:

$E(Hy) = E\left[\left((A' R_n^{-1} A)^{-1} A' R_n^{-1}+D\right)(Ab+n)\right]=\\ = \left((A' R_n^{-1} A)^{-1} A' R_n^{-1}+D\right)Ab= \\ =(A'R_n^{-1} A)^{-1} A' R_n^{-1} Ab+DAb= \\ =(I+DA)b$

Therefore, for $Hy$ to be unbiased the product $DA$ must be $0$ (here we assume $E(n)=0$; otherwise work with $y-m_n$ in place of $y$).

The covariance of the estimate: $\textrm{COV} (Hy)=H \textrm{COV} (y) H'= \left[(A' R_n^{-1} A)^{-1} A' R_n^{-1}+D\right] R_n \left[(A' R_n^{-1} A)^{-1} A' R_n^{-1}+D\right]'= \\ =(A' R_n^{-1} A)^{-1}+(A' R_n^{-1} A)^{-1}A'D'+DA(A' R_n^{-1} A)^{-1}+DR_nD'= \\ =(A' R_n^{-1} A)^{-1}+DR_nD'$

where the last equality holds because $DA=0$ (and hence $A'D'=0$), so the cross terms vanish. The result follows since $DR_nD'$ is positive semidefinite for every $D$ and is nonzero whenever $D \neq 0$ (because $R_n$ is positive definite), so $\textrm{COV}(Hy) - \textrm{COV}(\beta) = DR_nD' \succeq 0$.
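As a quick sanity check (my own sketch, not part of the answer above), one can compare the covariance of the Gauss–Markov/GLS estimator with that of another unbiased linear estimator, e.g. ordinary least squares $H_{\mathrm{OLS}} = (A'A)^{-1}A'$, under correlated noise; the difference of covariances should be positive semidefinite. The sizes and matrices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes and matrices (assumptions for the sketch).
k, m = 30, 4
A = rng.normal(size=(k, m))
R_n = 0.7 ** np.abs(np.subtract.outer(np.arange(k), np.arange(k)))  # noise covariance

# Gauss-Markov (GLS) estimator matrix and an alternative unbiased one (OLS).
Rinv_A = np.linalg.solve(R_n, A)
B_gls = np.linalg.solve(A.T @ Rinv_A, Rinv_A.T)      # (A' R_n^{-1} A)^{-1} A' R_n^{-1}
H_ols = np.linalg.solve(A.T @ A, A.T)                # (A' A)^{-1} A'

# Both are unbiased: B A = H A = I.
assert np.allclose(B_gls @ A, np.eye(m))
assert np.allclose(H_ols @ A, np.eye(m))

# Covariances COV(By) = B R_n B' and COV(Hy) = H R_n H'.
cov_gls = B_gls @ R_n @ B_gls.T
cov_ols = H_ols @ R_n @ H_ols.T

# The difference should be positive semidefinite (all eigenvalues >= 0 up to roundoff).
eigvals = np.linalg.eigvalsh(cov_ols - cov_gls)
print(eigvals.min() >= -1e-10)
```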