
I am trying to understand the weighted least squares estimation method, and I'd really appreciate it if you could shed some light on it. Let me explain my problem briefly:

Consider a linear model in matrix form, $y = \beta x + e$ with $e \sim \mathcal{N}(0, \Sigma)$, where $\Sigma = \operatorname{diag}(\sigma_1^2, \ldots, \sigma_n^2)$. To find an estimate of $x$, the weighted linear least squares estimator gives $$ \hat{x} = (\beta^tW\beta)^{-1} \beta^tW y, $$ where $W$ is the diagonal weight matrix with $w_{ii} = \sigma_i^{-2}$.

Assume that $\beta$ is known (and fixed). How sensitive is the WLS estimator $\hat{x}$ to perturbations of $y$? What is the relationship between the entries of $\hat{x}$ and those of $y$? Are there changes of $y$ that leave $\hat{x}$ unchanged?
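
A minimal numerical sketch of the setup (NumPy; the design matrix and weights here are made-up stand-ins). It illustrates the key structural fact: $\hat{x}$ depends linearly on $y$, since $\hat{x} = My$ with $M = (\beta^tW\beta)^{-1}\beta^tW$ fixed once $\beta$ and $W$ are fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 5 observations, 2 unknown parameters.
beta = rng.standard_normal((5, 2))          # known design matrix (beta in the post)
sigma = np.array([0.5, 1.0, 2.0, 1.0, 0.5]) # per-observation noise std devs
W = np.diag(sigma**-2)                      # weights w_ii = 1 / sigma_i^2

def wls(y):
    """WLS estimate: x_hat = (beta^T W beta)^{-1} beta^T W y."""
    return np.linalg.solve(beta.T @ W @ beta, beta.T @ W @ y)

y = rng.standard_normal(5)
dy = rng.standard_normal(5)

# Linearity in y: estimating on y + dy equals the sum of the two estimates.
lhs = wls(y + dy)
rhs = wls(y) + wls(dy)
print(np.allclose(lhs, rhs))  # True
```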

  • 0
$\hat x$ is linear in $y$... (2017-01-07)
  • 0
@user251257, Your comment is truncated. I am looking for a relationship. (2017-01-07)
  • 1
It is not truncated. The dependence is given explicitly by your formula. Sensitivity of a matrix–vector product is governed by the so-called condition number of the matrix. And of course, if you add a nonzero vector from the null space of $(\beta^tW\beta)^{-1}\beta^tW$ to $y$, you won't change $\hat x$. There is nothing specific to WLS here; your question is only about matrix–vector multiplication. The situation is different if you ask about sensitivity in $\beta$ or $W$. (2017-01-07)
  • 2
If the weight is $\sigma^{-2}$ rather than $\sigma_i^{-2}$, i.e. if the weight does not depend on the index $i$, then this isn't really "weighted". (2017-01-07)
  • 0
@MichaelHardy thank you for pointing this out. You are right. It is $\sigma_i^{-2}$. I corrected it. (2017-01-07)
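
The null-space point raised in the comments can be checked numerically. With $M = (\beta^tW\beta)^{-1}\beta^tW$ a $2\times 5$ matrix, $\ker(M)$ is 3-dimensional, and adding any vector from it to $y$ leaves $\hat x$ unchanged (a sketch with made-up matrices; the null-space basis comes from the SVD).

```python
import numpy as np

rng = np.random.default_rng(1)
beta = rng.standard_normal((5, 2))          # hypothetical known design matrix
sigma = np.array([0.5, 1.0, 2.0, 1.0, 0.5])
W = np.diag(sigma**-2)

# x_hat = M y, with M fixed once beta and W are fixed.
M = np.linalg.solve(beta.T @ W @ beta, beta.T @ W)

# Null space of M via SVD: rows of Vt beyond rank(M) span ker(M).
_, s, Vt = np.linalg.svd(M)
null_vecs = Vt[len(s):]                     # rank(M) = 2, so 3 basis vectors

y = rng.standard_normal(5)
z = null_vecs.T @ rng.standard_normal(3)    # arbitrary element of ker(M)

x1 = M @ y
x2 = M @ (y + z)
print(np.allclose(x1, x2))  # True: the perturbation z is invisible to x_hat

# Worst-case amplification of relative perturbations in y is governed
# by the condition number (largest/smallest singular value) of M.
print(np.linalg.cond(M))
```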

0 Answers