
My question is divided into two parts:

1) Let $a>0$, $\eta>0$, and let the prior distribution $P_0$ on $\theta\in\mathbb{R}^n$ have density

$P_0(\theta)=\left(\frac{a\eta}{\pi}\right)^{n/2} e^{-a\eta||\theta||^2_2}$

Is this a Gaussian density? Why?

Answer: this is a properly normalised Gaussian density, and the squared norm $||\theta||^2_2$ appears because $\theta$ is a vector.
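To see this numerically, here is a small sketch (my own, not from the post) checking that $P_0$ coincides with the $N(0,(2a\eta)^{-1}I)$ density as computed by SciPy; the values of $a$, $\eta$, $n$, and $\theta$ are arbitrary test choices:

```python
# Sanity check: (a*eta/pi)^(n/2) * exp(-a*eta*||theta||^2) is exactly the
# N(0, (2*a*eta)^{-1} I) density, i.e. it is already normalised.
import numpy as np
from scipy.stats import multivariate_normal

a, eta, n = 1.5, 0.7, 3          # arbitrary positive constants and dimension
theta = np.array([0.2, -1.0, 0.5])

# Density as written in the question.
p0 = (a * eta / np.pi) ** (n / 2) * np.exp(-a * eta * theta @ theta)

# Reference multivariate normal with mean 0 and covariance (2*a*eta)^{-1} I.
ref = multivariate_normal(mean=np.zeros(n), cov=np.eye(n) / (2 * a * eta)).pdf(theta)

print(np.isclose(p0, ref))  # True
```

Matching the exponents, $e^{-a\eta||\theta||^2}=e^{-||\theta||^2/(2\sigma^2)}$ forces $\sigma^2=1/(2a\eta)$, and the prefactor $(2\pi\sigma^2)^{-n/2}$ then equals $(a\eta/\pi)^{n/2}$.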

2) For the following:

$P_0(\theta)\times e^{-\eta\sum_{t=1}^T(y_t-x^{'}_t\theta)^2}=\left(\frac{a\eta}{\pi}\right)^{n/2}e^{-\eta\theta^{'}\left(aI+\sum_{t=1}^Tx_tx^{'}_t\right)\theta-\eta B+\eta G}$

Why does the identity matrix $I$ appear, and why are the remaining terms (denoted $B$ and $G$) the same as in the expansion of $\sum_{t=1}^{T}(y_t-x^{'}_t\theta)^{2}$?

1 Answer


Answer for (1): yes, it is a Gaussian density for the vector $\theta$. Writing the $N(0,\sigma^2 I)$ density $(2\pi\sigma^2)^{-n/2}e^{-||\theta||_2^2/(2\sigma^2)}$ and setting $\sigma^2=1/(2a\eta)$ recovers exactly $(\frac{a\eta}{\pi})^{n/2}e^{-a\eta||\theta||_2^2}$. So the prior is centered at zero with covariance $(2a\eta)^{-1}I$; it is not the standard normal (its covariance is not $I$), but it is properly normalised.

Answer for (2) is as follows:

$\left(\frac{a\eta}{\pi}\right)^{n/2}e^{-a\eta\theta^{'}\theta}e^{-\eta\sum_{t=1}^T(y_t-x_t^{'}\theta)^2}$, since $||\theta||_2^2=\theta^{'}\theta$ for a column vector $\theta$. Expanding the sum gives $\sum_{t=1}^T y_t^2-2\sum_{t=1}^T y_tx_t^{'}\theta+\theta^{'}\left(\sum_{t=1}^T x_tx_t^{'}\right)\theta$. To collect the two quadratic terms, the scalar $a$ in $a\theta^{'}\theta$ must be written as $\theta^{'}(aI)\theta$ so it can be added to the matrix $\sum_{t=1}^T x_tx_t^{'}$ — since we are dealing with vectors, $I$ plays the role that $1$ plays for scalars. This yields $\theta^{'}(aI+\sum_{t=1}^Tx_tx_t^{'})\theta$ and leaves the remaining terms $B=\sum_{t=1}^T y_t^2$ and $G=2\sum_{t=1}^T y_tx_t^{'}\theta$ unchanged, which is the desired result.
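The bookkeeping above can be verified numerically. The following sketch (my own; the data are random test values) checks that the combined exponent equals the collected quadratic form with $B$ and $G$ as identified above:

```python
# Check: -a*eta*theta'theta - eta*sum_t (y_t - x_t'theta)^2
#      = -eta*theta'(a*I + sum_t x_t x_t')theta - eta*B + eta*G,
# with B = sum_t y_t^2 and G = 2*sum_t y_t x_t'theta.
import numpy as np

rng = np.random.default_rng(0)
a, eta, n, T = 1.5, 0.7, 3, 5
theta = rng.normal(size=n)
X = rng.normal(size=(T, n))        # row t is x_t'
y = rng.normal(size=T)

# Left side: exponent of the prior times the likelihood term.
lhs = -a * eta * theta @ theta - eta * np.sum((y - X @ theta) ** 2)

# Right side: collected quadratic form.
A = a * np.eye(n) + X.T @ X        # a*I + sum_t x_t x_t'
B = y @ y                          # sum_t y_t^2
G = 2 * (X.T @ y) @ theta          # 2 * sum_t y_t x_t' theta
rhs = -eta * (theta @ A @ theta) - eta * B + eta * G

print(np.isclose(lhs, rhs))  # True
```

The scalar $a$ enters the matrix only through `a * np.eye(n)`, which is exactly where the identity matrix in the question comes from.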

Apologies for any inconvenience; in hindsight this was a simple linear algebra exercise. I hope it helps others.