If I have a relationship as follows:
$Y = aX + \varepsilon$, where $\varepsilon \sim \mathcal{N}(0,\sigma^2)$, so $Y$ is $aX$ plus some Gaussian noise.
The conditional probability distribution of $Y$ given $X = x$, i.e. $P(Y \mid X = x)$, is Gaussian with mean $ax$ and variance $\sigma^2$.
I understand this intuitively: given $X = x$, the expected value of $Y$ should be $ax$, and $Y$ varies around that mean only because of the noise, so its variance is the noise variance $\sigma^2$. Is there a formal proof of this?
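The closest I can get on my own is the sketch below (it assumes the noise $\varepsilon \sim \mathcal{N}(0,\sigma^2)$ is independent of $X$, and $\Phi$ denotes the standard normal CDF):

$$
\begin{aligned}
\mathbb{E}[Y \mid X = x] &= \mathbb{E}[ax + \varepsilon] = ax + \mathbb{E}[\varepsilon] = ax,\\
\operatorname{Var}(Y \mid X = x) &= \operatorname{Var}(ax + \varepsilon) = \operatorname{Var}(\varepsilon) = \sigma^2,\\
P(Y \le y \mid X = x) &= P(ax + \varepsilon \le y) = P(\varepsilon \le y - ax) = \Phi\!\left(\frac{y - ax}{\sigma}\right).
\end{aligned}
$$

Differentiating the last line with respect to $y$ recovers the $\mathcal{N}(ax, \sigma^2)$ density, so the whole conditional distribution, not just its first two moments, would be Gaussian; I am just not sure whether this qualifies as a formal proof.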
Thanks, Aly