I have two regression models
$$Y=X\beta+\varepsilon,\quad \beta\in\mathbb{R}^k,\qquad Y=Z\alpha+u,\quad \alpha\in\mathbb{R}^m,$$
and it is known that, using the OLS estimates $\hat{\beta},\hat{\alpha}$, the fitted values $\hat{Y}_x,\hat{Y}_z$ are orthogonal. I have to find the estimates and fitted values of
$$Y=X\beta'+Z\alpha'+v.$$
I understand that the answer is $\hat\alpha'=\hat\alpha$, $\hat\beta'=\hat\beta$, $\hat{Y}_{x+z}=\hat{Y}_x+\hat{Y}_z$. It is easy to check this in the case $k=m=1$: the vectors $X$ and $Z$ are then orthogonal themselves, and the result follows. But I face difficulties proving it in general.

So far I have
$$\hat{Y}_x^T\hat{Y}_z=(X\hat\beta)^T Z\hat\alpha=Y^T\Pi_X\Pi_Z Y=0,$$
where $\Pi_X=X(X^TX)^{-1}X^T$ is the projector onto the column space of $X$ (and $\Pi_Z$ is defined analogously). Taking $A=(X\ Z)$ and trying to show that
$$(A^TA)^{-1}A^TY=\begin{pmatrix}\hat{\beta}\\ \hat{\alpha}\end{pmatrix}=\hat{w},$$
I would need to prove $Z^TX\hat{\beta}=0$ and $X^TZ\hat{\alpha}=0$, but how? I was given a hint about linear independence among the columns of $X$ and $Z$, but I can only see that as a requirement for $\hat{w}$ to exist.
Orthogonal fitted values
1 Answer
QUOTE
it is known that, using the OLS estimates $\hat{\beta},\hat{\alpha}$, the fitted values $\hat{Y}_x,\hat{Y}_z$ are orthogonal.
END OF QUOTE
I'm going to hazard a guess as to what you mean here: you're saying that the matrices $X$ and $Z$ are such that, regardless of the value of the vector $Y$, these two vectors of fitted values are orthogonal to each other.
If that's what you mean, then it would follow that every column of $X$ is orthogonal to every column of $Z$. Remember that the vector $\hat Y_x$ of fitted values is the orthogonal projection $\Pi_X Y$ of $Y$ onto the column space of $X$, and likewise $\hat Y_z=\Pi_Z Y$. Now take $Y$ to be any vector $x$ in the column space of $X$: then $\hat Y_x=x$, and orthogonality gives $0=x^T\Pi_Z x=\|\Pi_Z x\|^2$ (using $\Pi_Z^T\Pi_Z=\Pi_Z$), so $\Pi_Z x=0$. Hence every vector in the column space of $X$ is orthogonal to every vector in the column space of $Z$. It follows that $Z^TX=0$ and $X^TZ=0$, because every entry of the matrix product $Z^TX$ is the dot product of a row of $Z^T$ with a column of $X$, i.e. the dot product of a column of $Z$ with a column of $X$.
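From here the result the question asks for is a short computation. A sketch, assuming the columns of $X$ and the columns of $Z$ are each linearly independent (so that $X^TX$ and $Z^TZ$ are invertible): with $A=(X\ Z)$ and $Z^TX=0$,
$$A^TA=\begin{pmatrix}X^TX & X^TZ\\ Z^TX & Z^TZ\end{pmatrix}=\begin{pmatrix}X^TX & 0\\ 0 & Z^TZ\end{pmatrix},\qquad A^TY=\begin{pmatrix}X^TY\\ Z^TY\end{pmatrix},$$
so the normal equations $A^TA\,\hat w=A^TY$ decouple block by block, giving
$$\hat w=(A^TA)^{-1}A^TY=\begin{pmatrix}(X^TX)^{-1}X^TY\\ (Z^TZ)^{-1}Z^TY\end{pmatrix}=\begin{pmatrix}\hat\beta\\ \hat\alpha\end{pmatrix},\qquad \hat Y_{x+z}=X\hat\beta+Z\hat\alpha=\hat Y_x+\hat Y_z.$$
This is also where the hint about linear independence earns its keep: given $X^TZ=0$, if the columns of $X$ are independent and the columns of $Z$ are independent, then all the columns of $A$ are jointly independent (multiply $Xa+Zb=0$ by $X^T$ and $Z^T$ in turn), so $A^TA$ is invertible and $\hat w$ exists.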