
Consider a linear forecasting problem where the shocks $\{\epsilon_i\}_{i=1}^n$ are independently distributed with $\epsilon_i\sim N(0,\sigma_i^2)$ for all $i$. Suppose you want to forecast $\theta = \sum_{i=1}^m a_i \epsilon_i$, with $m<n$, from a vector of signals $\mathbf x$ that are linear combinations of the shocks.

Let $\mathbb E[\theta|\mathbf x]$ denote the optimal forecast, i.e., the forecast that minimizes the mean squared error. I want to show that

$$\frac{\partial\operatorname{Cov}(\mathbb E[\theta|\mathbf x],\theta)}{\partial \sigma_i}<0, \quad\text{for }i\in\{m+1,\dots,n\}.$$

I think this is true because an increase in the variance of $\epsilon_i$ for $i\in\{m+1,\dots,n\}$ makes the signals noisier and hence the forecast less accurate.
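This intuition can at least be checked numerically. The sketch below is a toy instance, not a proof: it assumes a specific target ($\theta=\epsilon_1$, so $m=1$) and a specific signal matrix $B$ (both hypothetical, since the question leaves the signal structure unspecified), and uses the Gaussian projection formula $\mathbb E[\theta|\mathbf x]=\operatorname{Cov}(\theta,\mathbf x)\operatorname{Var}(\mathbf x)^{-1}\mathbf x$, which gives $\operatorname{Cov}(\mathbb E[\theta|\mathbf x],\theta)=c^\top V^{-1}c$ with $c=\operatorname{Cov}(\mathbf x,\theta)$ and $V=\operatorname{Var}(\mathbf x)$.

```python
import numpy as np

# Hypothetical toy instance: n = 3 shocks, target theta = eps_1 (so m = 1),
# and two signals x = B @ eps that mix the shocks. The matrix B is an
# assumption for illustration; the question leaves it unspecified.
def cov_forecast_target(sigmas, B, a):
    """Cov(E[theta|x], theta) for jointly Gaussian shocks.

    With Sigma = diag(sigmas**2), theta = a @ eps, x = B @ eps:
      E[theta|x] = Cov(theta, x) Var(x)^{-1} x,
      Cov(E[theta|x], theta) = c @ V^{-1} @ c,
    where c = B @ Sigma @ a and V = B @ Sigma @ B.T.
    """
    Sigma = np.diag(np.asarray(sigmas, dtype=float) ** 2)
    c = B @ Sigma @ a            # Cov(x, theta)
    V = B @ Sigma @ B.T          # Var(x)
    return c @ np.linalg.solve(V, c)

a = np.array([1.0, 0.0, 0.0])            # theta = eps_1
B = np.array([[1.0, 1.0, 0.0],           # x_1 = eps_1 + eps_2
              [1.0, 0.0, 1.0]])          # x_2 = eps_1 + eps_3

# Raising sigma_2 (a noise shock, i > m) should lower the covariance.
lo = cov_forecast_target([1.0, 0.5, 1.0], B, a)
hi = cov_forecast_target([1.0, 2.0, 1.0], B, a)
print(lo, hi)   # the covariance falls as sigma_2 increases
```

In this instance the covariance is $\sigma_1$-dependent but strictly decreasing in $\sigma_2$ and $\sigma_3$, consistent with the conjectured sign of the derivative.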

By a similar argument it should also be possible to prove that

$$\frac{\partial\operatorname{Var}(\mathbb E[\theta|\mathbf x]|\epsilon_i)}{\partial \sigma_i}<0, \quad\text{for }i\in\{m+1,\dots,n\}.$$
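The second conjecture can be checked numerically in the same hypothetical toy model (again an illustration, not a proof; $B$ and $\theta=\epsilon_1$ are assumptions). Writing $\mathbb E[\theta|\mathbf x]=w^\top\mathbf x$ with $w=V^{-1}c$, and $\mathbf x=B\boldsymbol\epsilon$, the forecast loads on the shocks through $g=B^\top w$, so conditioning on $\epsilon_i$ simply removes shock $i$'s term from the variance:

```python
import numpy as np

# Numerical check of the second conjecture in the same hypothetical toy
# model (theta = eps_1, signals x = B @ eps): Var(E[theta|x] | eps_i)
# should fall as sigma_i rises, for a noise shock i > m.
def var_forecast_given_eps_i(sigmas, B, a, i):
    """Var(E[theta|x] | eps_i).

    E[theta|x] = w @ x with w = Var(x)^{-1} Cov(x, theta), and
    x = B @ eps, so E[theta|x] = (B.T @ w) @ eps. Conditioning on
    eps_i removes shock i's contribution to the variance.
    """
    sig2 = np.asarray(sigmas, dtype=float) ** 2
    Sigma = np.diag(sig2)
    w = np.linalg.solve(B @ Sigma @ B.T, B @ Sigma @ a)
    g = B.T @ w                      # loadings of E[theta|x] on the shocks
    mask = np.ones(len(sig2), dtype=bool)
    mask[i] = False                  # condition on eps_i: drop its term
    return np.sum(g[mask] ** 2 * sig2[mask])

a = np.array([1.0, 0.0, 0.0])            # theta = eps_1
B = np.array([[1.0, 1.0, 0.0],           # x_1 = eps_1 + eps_2
              [1.0, 0.0, 1.0]])          # x_2 = eps_1 + eps_3

# i = 1 is eps_2 (zero-indexed), a noise shock with i > m.
lo = var_forecast_given_eps_i([1.0, 0.5, 1.0], B, a, i=1)
hi = var_forecast_given_eps_i([1.0, 2.0, 1.0], B, a, i=1)
print(lo, hi)   # the conditional variance falls as sigma_2 increases
```

Note that $\sigma_i$ enters twice: directly through the remaining shocks' variances and indirectly through the weights $w$; in this instance the net effect is negative, as conjectured.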

For simple examples of this problem I have been able to prove both claims, but I have not managed to generalize the argument. I suspect these results already exist, so even a reference to where I could find them would be of great help.
