
Let $x$ be a vector and $\omega$ a matrix such that $\hat y=\omega^Tx$ is a solution to a regression problem trying to predict $y$. Let $y=az$, where $a$ is a scalar and $z,y,\hat y$ have the same dimension. I want to calculate the squared error $E[(\hat y-y)^2]$, hence:

$$E[(\hat y-y)^2]=E[(\omega^Tx-az)^2]=E[(\omega^Tx)^2-2az\omega^Tx+a^2z^2]$$

How do I expand $(\omega^Tx)^2$? I have seen someone write $(\omega^Tx)^2=\omega^Txx^T\omega$; why is that correct?

  • If $y$ is a vector, then what is $y^2$ supposed to mean? Should that be $$ E[(\hat y - y)^T(\hat y - y)]? $$ (2017-02-07)
  • By squaring a vector you probably mean taking the inner product, i.e. $(\hat{y} - y)^2 \equiv (\hat{y} - y)\cdot (\hat{y} - y)$. Then $(\omega^T x)^2 = (\omega^T x)\cdot (\omega^T x)$, which is equal to $x^T(\omega\omega^T)x$. (2017-02-07)
  • I am not sure what $(\hat y-y)^2$ means; this is supposed to be a calculation of error. I do know that $\hat y$ is a linear model aimed at solving a regression problem and predicting $y$. (2017-02-07)
  • @havakok Well, until you answer the question of what $y^2$ means for a vector $y$, you can't possibly make sense of $E[(\hat y - y)^2]$. Check your textbook, or your notes! (2017-02-07)
  • Ok, now I am sure it is the inner product $(\hat y-y)\cdot (\hat y-y)$. (2017-02-07)
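The inner-product reading from the comments can be sanity-checked numerically. This is a minimal sketch using NumPy; the dimensions and random data are arbitrary choices, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3                     # x is n-dimensional; omega is n x m, so omega.T @ x is m-dimensional
omega = rng.normal(size=(n, m))
x = rng.normal(size=n)

v = omega.T @ x                 # the vector omega^T x
lhs = v @ v                     # inner product (omega^T x) . (omega^T x)
rhs = x @ omega @ omega.T @ x   # the quadratic form x^T (omega omega^T) x
assert np.isclose(lhs, rhs)
```

The assertion passes because $(\omega^Tx)\cdot(\omega^Tx) = (\omega^Tx)^T(\omega^Tx) = x^T\omega\omega^Tx$.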

1 Answer


Note that for vectors $u,v$ we have $u^Tv = v^Tu = u \cdot v$. We then have $$ (\omega^Tx-az)^2 = (\omega^Tx-az)^T(\omega^Tx-az) = \\ (\omega^Tx)^T(\omega^Tx) - (\omega^Tx)^T(az) - (az)^T(\omega^Tx) + (az)^T(az) =\\ x^T\omega\omega^T x - 2 a \;z^T(\omega^Tx) + a^2\; z^Tz $$ In particular, since $(\omega^Tx)^T = x^T\omega$, we have $(\omega^Tx)^2 = (\omega^Tx)^T(\omega^Tx) = x^T\omega\omega^T x$. Note that $\omega^Txx^T \omega = (\omega^Tx)(\omega^Tx)^T$ is an outer product, i.e. a matrix rather than a scalar. It is notable, however, that $$ \operatorname{Trace}(\omega^Txx^T \omega) = \operatorname{Trace}(x^T\omega\omega^Tx) = x^T\omega\omega^Tx = (\omega^Tx)^2 $$
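The full expansion and the trace identity above can also be verified numerically. A minimal NumPy sketch (dimensions, the scalar $a$, and the random data are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
omega = rng.normal(size=(n, m))
x = rng.normal(size=n)
z = rng.normal(size=m)
a = 2.5

# (omega^T x - a z)^T (omega^T x - a z) expanded term by term
err = omega.T @ x - a * z
lhs = err @ err
rhs = (x @ omega @ omega.T @ x
       - 2 * a * (z @ (omega.T @ x))
       + a**2 * (z @ z))
assert np.isclose(lhs, rhs)

# trace identity: Trace(omega^T x x^T omega) = x^T omega omega^T x
M = omega.T @ np.outer(x, x) @ omega   # the matrix omega^T x x^T omega
assert np.isclose(np.trace(M), x @ omega @ omega.T @ x)
```

Both assertions pass, confirming that while $\omega^Txx^T\omega$ is an $m\times m$ matrix, its trace recovers the scalar $(\omega^Tx)^2$.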