Verify that $ Y_1 = a_1 Z_1 + (a_1 - a_0 )X_0, $ $ Y_2 = a_2 Z_2 + (a_2 - a_1 )Z_1 + (a_2 - a_1 )X_0, $ etc. Following the comments above, if the $Z_i$ are independent of $X_0$, then each $Y_i$ is a linear combination of independent Gaussian random variables, hence Gaussian.
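For concreteness, here is the telescoping computation behind these two identities, assuming (as the formulas above suggest, though it is not restated in this excerpt) that $Z_n = X_n - X_{n-1}$ and $Y_n = a_n X_n - a_{n-1} X_{n-1}$:
$$\begin{aligned}
Y_1 &= a_1 X_1 - a_0 X_0 = a_1 (X_0 + Z_1) - a_0 X_0 = a_1 Z_1 + (a_1 - a_0) X_0, \\
Y_2 &= a_2 X_2 - a_1 X_1 = a_2 (X_0 + Z_1 + Z_2) - a_1 (X_0 + Z_1) = a_2 Z_2 + (a_2 - a_1) Z_1 + (a_2 - a_1) X_0.
\end{aligned}$$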
EDIT: Let's explain why an additional assumption (such as the one we used, namely that the $Z_i$ are independent of $X_0$) is necessary here. Suppose that the following assertion is true:

($\star$) There exist normal random variables $X$ and $Y$, and non-zero constants $a$, $b$, $a'$, $b'$, such that $aX+bY$ is normal but $a'X+b'Y$ is not.

Then, letting $X_0 = -bY$ and $X_n = aX$ for all $n \geq 1$, we have $Z_1 = aX+bY$ and $Z_n = 0$ for all $n \geq 2$. Hence the $X_i$ are normal and the $Z_i$ are independent normal (using the common convention that constants are normal with variance $0$). However, $(a'/a)X_1 - (b'/b)X_0 = a'X + b'Y$, which is not normal. So it only remains to explain why we should expect ($\star$) to be true (though it might be difficult to give an explicit example where it is satisfied). For this purpose, consider the fact that $X$ and $Y$ are jointly normal if and only if EVERY linear combination of $X$ and $Y$ is univariate normal...
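For completeness, here is the arithmetic behind the construction in the edit above, again under the assumption $Z_n = X_n - X_{n-1}$:
$$\begin{aligned}
Z_1 &= X_1 - X_0 = aX - (-bY) = aX + bY, \\
Z_n &= X_n - X_{n-1} = aX - aX = 0 \quad (n \geq 2), \\
\frac{a'}{a} X_1 - \frac{b'}{b} X_0 &= \frac{a'}{a}(aX) - \frac{b'}{b}(-bY) = a'X + b'Y.
\end{aligned}$$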